
Discussion on: JavaScript needs competition on the front end. Thoughts?

 
mbtts • Edited

TL;DR: The web does not need a new language, but rather a new runtime that addresses the weaknesses of the DOM. Crucially, targeting this runtime should build upon, not abandon, the best parts of web technologies (simplicity, ease of access).


I don't think JavaScript is really a bottleneck to the performance of most web applications, and modern VMs perform miracles of optimisation. Additionally, the language is getting a lot of attention now, and with ESLint and Prettier I find it expressive, flexible and a pleasure to use.

If all JavaScript were magically rewritten in C++ overnight, I am sceptical it would make much difference, because the real performance bottleneck is the DOM (not a new or original observation).

The DOM was never designed with performance as a consideration, and certainly not with rich web applications in mind. Avoiding excessive repaints and reflows, minimising the number of DOM nodes and so on are still concerns no matter which library or language you use.
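For example (a minimal sketch, assuming a page with some `.item` elements), interleaving style reads and writes forces the browser to recalculate layout on every iteration, whereas batching the reads and writes avoids the repeated reflows:

```js
const items = document.querySelectorAll('.item');

// Anti-pattern: each iteration reads a layout property and then writes a
// style, forcing a synchronous reflow every time around the loop.
items.forEach(el => {
  const h = el.offsetHeight;         // read (forces layout)
  el.style.height = `${h + 10}px`;   // write (invalidates layout)
});

// Better: batch all reads, then all writes, so layout is recalculated once.
const heights = Array.from(items, el => el.offsetHeight); // reads
items.forEach((el, i) => {
  el.style.height = `${heights[i] + 10}px`;               // writes
});
```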

It is also harder for browser vendors to optimise, because they must remain forgiving of all the ambiguous/bad markup and CSS in the world. Compare this to modern JavaScript, where the browser does not tolerate syntax errors.
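A small illustration of that asymmetry, using the standard DOMParser and Function APIs: the HTML parser quietly repairs broken markup, while the JavaScript parser rejects broken code outright:

```js
// Malformed markup: an unclosed <div> and a stray </b>.
// The HTML parser silently repairs it rather than reporting an error.
const doc = new DOMParser().parseFromString(
  '<p>hello <div>world</b>', 'text/html'
);
console.log(doc.body.innerHTML); // a corrected tree, no exception thrown

// Equivalent sloppiness in JavaScript is rejected at parse time.
try {
  new Function('return {;');
} catch (e) {
  console.log(e.name); // "SyntaxError"
}
```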

I don't think it will ever happen, and it would take years, but the biggest performance win would come from standardising a replacement for the DOM (a new web runtime). It is a hazy idea - and I imagine other people have had better thoughts - but compilation seems like the way to go...

  • Development would use familiar web technologies: a markup language and a stylesheet language (based on, and familiar from, HTML/CSS, with optimisations).
  • There would then be a compilation step to convert the markup and styling documents into a standard binary package intended for execution by the (also standardised) web runtime - see the sketch after this list. These could possibly also be transpiled to HTML/CSS for backwards compatibility.
  • HTTP/2 is already a binary protocol... so this would build on top of that, with the browser executing the binary directly rather than parsing HTML and building the DOM/CSSOM.
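Purely as an illustration of the pipeline above - the file names, the `.webpkg` format and the `compileDocument` step are all hypothetical, not real tools; only the Node fs API is real - a build step might look something like this:

```js
import { readFile, writeFile } from 'node:fs/promises';

// Placeholder for the hypothetical markup + styles -> binary compiler.
function compileDocument({ markup, styles }) {
  return Buffer.from(JSON.stringify({ markup, styles })); // stand-in encoding
}

async function build() {
  const markup = await readFile('src/index.markup', 'utf8'); // HTML-like source
  const styles = await readFile('src/main.styles', 'utf8');  // CSS-like source

  // One artefact for the new runtime; the originals stay alongside it
  // so they can still be served for view-source and crawlers.
  await writeFile('dist/app.webpkg', compileDocument({ markup, styles }));
}

build();
```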

The main benefit of this approach is that it would bring the web up to the same level of responsiveness and performance as a native application, by removing the bottlenecks associated with parsing and managing the DOM/CSSOM. A secondary benefit is that it would remove the need for manual performance optimisations like inlining critical CSS, and for hacks like the PRPL (push, render, pre-cache, lazy load) pattern.

In some ways it would be taking us back to the plugin model - but unlike those plugins, it would be an open standard governed by the W3C, with an implementation test suite which could be run by all vendors.

To quote Sean Denny from earlier:

gone are the days of being able to view-source on a Web page and see how it works

This is a big concern, and to address it the source documents must be accessible from the same location as the binary. There are a couple of ways this might be strongly encouraged:

  • Firstly, have the servers (Nginx, Apache HTTP Server, S3) perform the compilation as an implementation detail of hosting - thus requiring them to have access to the source files (see the sketch after this list).
  • Secondly, make crawlers (Google, Bing) require the source files, or mark down sites which don't provide them.
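As a rough sketch of the first option (the file names and `.webpkg` format are hypothetical again; only the Node http/fs APIs are real), the host would serve the compiled package and the original sources from the same location:

```js
import { createServer } from 'node:http';
import { readFile } from 'node:fs/promises';

// Compiled binary and original sources are published side by side.
const routes = {
  '/app.webpkg': ['dist/app.webpkg', 'application/octet-stream'],
  '/app.src/index.markup': ['src/index.markup', 'text/plain'],
  '/app.src/main.styles': ['src/main.styles', 'text/plain'],
};

createServer(async (req, res) => {
  const route = routes[req.url];
  if (!route) {
    res.writeHead(404).end();
    return;
  }
  const [path, type] = route;
  res.writeHead(200, { 'Content-Type': type }).end(await readFile(path));
}).listen(8080);
```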
 
Elliot Derhay • Edited

This idea does sound interesting. I do have a question though: when you do "view source" on the page, I assume it shows the updated HTML source, correct? So would the markup source have to be updated anyway in the browser, would markup viewing not include any updates that have been dynamically done to the page, or would there be no viewing of the markup (in your opinion)?

Also, this does make me want to look into DOM performance. For some reason, JS seems to always be blamed (and I'm also guilty of that). But it does also make sense that markup, which needs to be re-rendered with each update, could actually cause the performance issues.

 
mbtts

I do have a question though: when you do "view source" on the page, I assume it shows the updated HTML source, correct? So would the markup source have to be updated anyway in the browser, would markup viewing not include any updates that have been dynamically done to the page, or would there be no viewing of the markup (in your opinion)?

The idea would be that, upon opening developer tools, the browser would fetch the source documents. I envision this working much like source maps do for LESS/SCSS/transpiled JavaScript - but for markup instead.

The browser would be running native (compiled) code - it wouldn't be parsing the markup, building the DOM and so on. As such there would not be the concept of "generated source" as it exists today, but it would still be possible to map the rendered views back to the original (declarative) source documents.
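For comparison, this is roughly the shape of a JavaScript source map today (simplified, with the encoded `mappings` string elided) - the existing mechanism this idea would mirror for markup:

```js
// A compiled file ends with a pointer such as:
//   //# sourceMappingURL=app.min.js.map
// and dev tools fetch the originals listed in that map.
const sourceMap = {
  version: 3,
  file: 'app.min.js',
  sources: ['src/app.js'],   // original files dev tools fetch and display
  sourcesContent: [],        // optionally, inline copies of those sources
  names: [],
  mappings: '...'            // VLQ-encoded positions back into the sources
};
```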

Also, this does make me want to look into DOM performance. For some reason, JS seems to always be blamed (and I'm also guilty of that). But it does also make sense that markup, which needs to be re-rendered with each update, could actually cause the performance issues.

Definitely try profiling the layout and painting performance. In most cases, perceived slowness is far more likely to be layout related than scripting related.
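For example, the PerformanceObserver API can surface long tasks, layout shifts and paint timings (at the time of writing, the longtask and layout-shift entry types are Chromium-only):

```js
// Log where time actually goes before blaming scripting.
new PerformanceObserver(list => {
  for (const entry of list.getEntries()) {
    console.log(entry.entryType, entry.startTime.toFixed(1), entry);
  }
}).observe({ entryTypes: ['longtask', 'layout-shift', 'paint'] });

// Paint milestones can also be read directly:
console.log(performance.getEntriesByType('paint'));
// -> entries for "first-paint" and "first-contentful-paint"
```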

 
Anton

I've been thinking: what if browsers adopted Elm wholesale as a new JS specification? I think there is a chance, but it's something like 0.01% - or maybe a bit higher.