
The Future of "View Page Source"

Basti Ortiz on July 31, 2020

Ben Sinclair

I think it's a shame, and the specific area where I think it's most a shame is semantics.
A lot of sites have terrible HTML.

Here's an example, not of the actual source (which is minified), but of what the DOM ends up looking like. It's from Reddit, but that's a fairly random choice on my part:

an example of DIV soup from Reddit's source

Nothing is semantic. You can't reliably interpret this with assistive technologies. You can't do anything much with it apart from throw it at a browser and hope it works. And that's all a lot of developers care about, which is the problem - too many people don't care about the HTML as long as the page looks pretty. In fact, they want it to be difficult to interpret because obfuscation thwarts many ad-blockers. It's security through obscurity, to be sure, but it's there as much to hide potential malware as it is to improve performance.

Basti Ortiz

Man, you're right. I haven't even considered the accessibility side of the argument.

I came to write this article from the viewpoint of onboarding people to the Web. Now I see that it extends beyond just that.

Ben Sinclair

Thing is, it's a straightforward swap to make components use semantic elements, but because the result isn't visibly different to the majority of end-users, people don't bother.
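
For instance, here's a rough sketch of the kind of swap I mean (the class names and markup are invented for illustration, not taken from any real site):

    <!-- Before: presentational DIV soup -->
    <div class="c-header">
      <div class="c-nav">
        <div class="c-nav-item" onclick="location.href='/hot'">Hot</div>
        <div class="c-nav-item" onclick="location.href='/new'">New</div>
      </div>
    </div>

    <!-- After: the same UI with semantic elements -->
    <header>
      <nav>
        <a href="/hot">Hot</a>
        <a href="/new">New</a>
      </nav>
    </header>

Screen readers and other assistive technologies can announce the second version as navigation containing links; the first is just anonymous boxes to them.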

Basti Ortiz

It's unfortunate how things like accessibility can be swept under the rug for reasons like "priority" and "ticket triage". After all, accessibility features really only affect those who need them, and they make up only a minority of most user bases. It's no surprise that big feature releases are prioritized over "invisible" changes towards better accessibility.

Matthew Schwartz

I've been a web developer for 15 years. The modern web is kind of a hack.

The web was originally designed to share and link static documents. That's it. And while we've added many things to it, the foundation remains the same. The fact it's grown this much is a testament to the flexibility in its basic design. I'd argue we're pushing its boundaries.

Think about it this way. We're making complex, stateful, dynamic applications on top of text documents. We're programmatically generating these text documents and/or altering their representation in a client application. It breaks very easily and is challenging to optimize.

Before web development I spent 10 years writing traditional client/server applications. Believe me, it had its headaches and drawbacks, but the systems I worked with were built for that purpose, were very efficient, and were relatively simple to debug.

The web wins because it has the best possible distribution model. Everyone already has the necessary client installed. Hopefully we can continue to adapt it more effectively with new standards.

Basti Ortiz

We're making complex, stateful, dynamic applications on top of text documents.

This pretty much sums everything up. In all my time here on DEV, this is probably the most insightful comment I have ever read.

easrng

As a platform, Glitch.com is wonderful as it lets you see the full source, including for the server, and edit it like a Google doc. However, that is only for glitch apps, and like you said, the rest of the web is overwhelmingly minified and bundled. I can only hope ES modules will help change that.

Basti Ortiz

ES modules can help with that, but as long as it is more network-efficient to bundle up code, the minified Web is unfortunately not going away any time soon.

From a network standpoint, the main issue with ES modules is the "linked list" of dependencies. The browser only discovers a module's dependencies once it has fetched and parsed that module and encountered its import statements. This is a problem for deeply nested dependency graphs, hence the popularity of bundlers.
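
Here's a minimal sketch of what I mean (the file names are made up). Each module's imports are only discovered after that module has been downloaded and parsed, so the fetches happen one level at a time:

    // app.js: fetched first; only after parsing it does the browser see chart.js
    import { renderChart } from "./chart.js";
    renderChart();

    // chart.js: fetched second; only after parsing it does the browser see scale.js
    import { linearScale } from "./scale.js";
    export function renderChart() {
      console.log(linearScale(0, 100)(0.5));
    }

    // scale.js: fetched third. Three levels of depth, three sequential round trips.
    export function linearScale(min, max) {
      return (t) => min + (max - min) * t;
    }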

easrng

Actually, I don't think that's how it works. I'm pretty sure it loads and parses the whole thing before it executes. Don't quote me on this.

Basti Ortiz

Well, yes. It does. But once it parses an import statement, then it has to fetch the next level of dependencies... and then the next... and then the next... and so on and so forth just like a linear traversal of a "linked list".

Sure, the browser can fetch and parse in parallel, but at the end of the day, code bundles work around this issue by including all imports in a single file. This removes the need to fetch a "linked list" of nested dependencies (import statements).

easrng

However, for common dependencies like lodash, if you use a CDN like Pika or jspm, you won't have to download them again; they'll already be cached. With bundles, that is simply impossible. Also, as a dev I vastly prefer not needing a build step, so I use ES modules.
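
For example, something along these lines (the exact CDN URL shape is from memory, so treat it as illustrative):

    // No build step: the CDN serves lodash-es as a real ES module,
    // and a copy that's already in the cache doesn't have to ship inside every bundle.
    import { debounce } from "https://jspm.dev/lodash-es";

    window.addEventListener("resize", debounce(() => {
      console.log("resized");
    }, 250));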

Basti Ortiz • Edited

Yes, it would be great if everyone used smart CDNs like Pika, but the reality is otherwise. I do hope for the best, though! 🤞

But I believe you misunderstood me about the network disadvantages of ES modules. ES modules can be represented as a big graph of dependencies.

NOTE: By "dependencies", I mean external libraries, internal application code, userland modules, and other related components alike. It is not limited to NPM modules.

However, the problem with this dependency graph is traversal. When the browser resolves import statements, it is basically traversing only one level of that big dependency graph at a time. This must be repeated until the whole graph has been traversed, where each node (dependency/script/module) requires its own network round trip (assuming the absence of HTTP/2's server push feature).

Even with HTTP caching enabled, this is still the main problem with ES modules. Yes, the network trip has been mitigated, but it would have been more efficient (in terms of CPU cycles) if the browser had just parsed a single bundle instead of recursively traversing an entire dependency graph all over again.

For small sites (with equally small dependency graphs), this would not matter at all. But I would imagine that heavy web apps such as those of Facebook and Spotify would slow down to a crawl if they used ES modules over bundles instead, even with caching enabled.

Caching only goes as far as mitigating network round trips. For large dependency graphs, traversing while parsing syntax can prove to be quite taxing on the CPU. Even more so for mobile devices.

This is why code bundling has become a necessary build step for large applications. Again, smart CDNs can only go as far as mitigating network round trips. It is still more efficient (and battery-friendly) to load a big bundle rather than a deep dependency graph.
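
Roughly speaking, a bundler takes the three illustrative modules from my earlier sketch and flattens them ahead of time into a single file:

    // bundle.js: the whole dependency graph pre-flattened at build time.
    // One fetch, one parse, no import statements left to chase at runtime.
    function linearScale(min, max) {
      return (t) => min + (max - min) * t;
    }
    function renderChart() {
      console.log(linearScale(0, 100)(0.5));
    }
    renderChart();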

easrng

Or, use preload links to preload all your JS, and use await import() as needed, so the browser preloads and caches all your JS but loads only the minimum first.
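
Roughly like this (the paths are made up, and I'm using rel="modulepreload", the module-aware variant of a preload link):

    <!-- Fetched and cached up front, but not executed yet -->
    <link rel="modulepreload" href="/js/app.js">
    <link rel="modulepreload" href="/js/editor.js">

    <script type="module">
      import { init } from "/js/app.js"; // executed immediately
      init();

      // The heavy feature is only evaluated when it's actually needed
      document.querySelector("#edit").addEventListener("click", async () => {
        const { openEditor } = await import("/js/editor.js");
        openEditor();
      });
    </script>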

Basti Ortiz • Edited

I suppose that could work. I can see the appeal behind your method. It's definitely a much better developer experience without the build steps.

Personally, I still wouldn't rely on this behavior if I were to write a large application. Browser support for dynamic imports aside, there just seems to be more runtime overhead with ES modules than if I had moved that work to compile time as a build step.

But then again, this is an unfortunate side effect of the "modern Web", where bundling code is just more network- and CPU-efficient than the more elegant ES modules. 😕

Don't get me wrong, I'd love to see a future where the Web is beautifully intuitive and semantic everywhere, but the reality of the situation just deems it otherwise.

ES modules are great, but the current climate of the modern Web forces me to add a tedious build step because it's a "best practice" for network and parser performance.

So yeah... As much as I want to keep my codebases simple like you do, large applications call for such complexities. ES modules are not exactly the most "sCaLaBLe" solution. I'd love to see the day when I'd be proven wrong, though.

David

The browser application has become a compilation target. You are correct. I worked with JS back when jQuery was simply the best thing since sliced bread. I don't write JavaScript for work anymore, but I do in personal projects, and the ecosystem is dizzying. While I have my complaints, it's progress. I've opened "view source" maybe five times in the last five years because it is no longer approachable. I don't necessarily think this is a bad thing, though. Nobody really intends for source code to be read by the consumer of the end product/service/client.

But let's step back for a moment -- despite this, JS is still many people's introduction to programming from a non-traditional CS background today, providing one of the tightest feedback loops there is -- write some code, refresh the browser, see something. There are more people learning JavaScript, HTML, CSS across the world than perhaps any other programming language. People don't learn JS starting with Node -- it's almost always from the browser first.

"View source" today is not meant for people to pop open and poke around in anymore. Do this on Facebook and a scary warning pops up for those who aren't web devs. There is an argument to be made that perhaps view source shouldn't even be available, because other UI clients, like iOS or Android apps, don't allow their users to do that.

From where I'm standing, JavaScript and browser-based development still attract more than a healthy dose of interest and newcomers, despite the friction and the dizzying array of constructs one has to absorb to get up to speed with modern JS development. Because of the influx of new developers, their voices almost drown out other ideas. SvelteJS is absolutely one of the simplest and fastest UI libraries out there, but people automatically side-step it because the React community is just outsized and smacks down every other opinion with disdain. Because people don't poke at new things with healthy intrigue anymore, only a few voices are heard.

Everything that you've described simply falls by the wayside with newcomers, because it is just "accepted". They know no other way and probably don't even have anything else to compare it to if JS is the first ecosystem they are introduced to. It isn't until this same developer jumps into other languages and platforms and asks, "What's the build tool? What's the transpiler? What's the Webpack here? What's the package.json?" They are usually in for a surprise if they step into something like Ruby, Elixir, Go, or Rust (if ever). The toolchain can be learned in a day, everything is typically bundled into a single utility (save for maybe Go), and things can be blazing fast with just as tight a feedback loop, although not as animated as a browser can be.

But all-in-all, I think Web Assembly is paving a future where what is in the browser can be simple and approachable again -- treating it literally as a view layer. With Rust and WebAssembly, we can have our cake and eat it, too. It can write better JS and semantic HTML than we can: youtube.com/watch?v=ohuTy8MmbLc

Basti Ortiz • Edited

So discounting the "dizzying ecosystem" of JavaScript tools and build systems, would you say that WebAssembly is the right direction for the Web, where the heavy lifting is done by "native" code, while the UI and view layer is managed by simple HTML and JS?

Once the browser support starts coming in, I'd say that's a sustainable future for the Web. Though, I can't help but feel strange about how far it is from its humble origins.

David

I think it's one direction with some steam, but I hesitate to say that it is the right direction.

Well, this idea of “humble origins” that you speak of, while also true in my case, is really a figment of our own imagination. Was this really the intent of its designers, or even a goal? I'm not sure. I never looked into that history...

That said, though, I agree: once we've got all the browser support, it's a sustainable future. I think we can finally get to semantic HTML without shoehorning application state into a document, and we can pop open the source and make sense of it again.

Ben Halpern

This is, at the very least, a really good conversation to bring attention to.

Pacharapol Withayasakpunt • Edited

I think web browsers are moving down the same path (and taking on similar complexities) as mobile dev (iOS, Android), so compilation is inevitable. "View Page Source" / "Inspect Element" is not the goal.

One of the similarities is that clients lean on servers to do the heavy lifting, while servers in turn try to hand as much work as possible back to the clients (as client CPUs become cheaper and more powerful).

Basti Ortiz

I am beginning to think that the future of the Web ultimately aims to democratize application distribution, where we would no longer need the approval of monopolies and app stores to launch apps.

The future of the Web is not to host new applications, but to move native apps towards a more "open" platform, hence the similarities you pointed out with mobile app development.

Either way, that is a very interesting point you brought up. Although we can never be sure if this is the right direction, this is the direction we're taking nonetheless.

Pacharapol Withayasakpunt • Edited

You are right. Compatible across all platforms, with the same standards. Unlike mobile.

Also, not App Store dependent, although currently DNS-dependent...

Basti Ortiz

I guess this just has the consequence that the future of "view page source" is no more.

If the Web will ultimately serve to replace the native mobile platform, then the compiled distribution files should serve no purpose to the end-user either.

There would be no point in maintaining a "beautiful" page source if it would not matter to the end-user. Everything would be for the sake of network efficiency.

If this is the direction of the Web, I would say "view page source" is as good as dead. But at least it promotes platform-independence, right?

Akash Kava

Source code of Web Assembly will not be available, or it will be obfuscated. I still like plain JavaScript and Source Maps.

Pacharapol Withayasakpunt • Edited

Why would you want the clients to access the source maps?

Open sourcing (OSI) is just a movement with its criticisms, but it is indeed probably the direction the world is currently heading.

Akash Kava

Better error messages with debug information help in resolving issues more quickly.

Pacharapol Withayasakpunt

I wonder if it can go that way with Android, iOS or even desktop apps as well.

What about Stack Trace?

WASM / compilation shouldn't exclude the possibility.

Akash Kava

Yes, it is a big pain; a stack trace without line numbers is useless. That's probably the reason people are more interested in React Native and JavaScript-based development. We were using Xamarin for app development, and we created Web Atoms JavaScript for mobile app development on Xamarin.

Benjamin Dupont

It's an immature technology - sadly compromises had to be made in order to get support out the door.

Source maps are already supported.

Basti Ortiz

That's true. I just wanted to point out that the Web has been going in the direction of just "being a compilation target"—WebAssembly being a prime example.

easrng

Well, you will be able to view the WAST (the WebAssembly text format), but it won't look like source code unless you are doing very simple things. It's more readable than assembly, but even minified JS is clearer.

Daniel Ziltener

We're going in a horrible direction. Browsers and the web aren't made for the kinds of applications that are beaten into them. But because a bunch of people are hellbent and stubborn on doing it anyway, well... here we are.

Basti Ortiz

To be fair to the Web, the main issues were brought about by the fact that the Web just happened to become the most accessible platform in the world. Since it reaches the largest audience, it comes as no surprise that people are so "hellbent" and "stubborn" about pushing the Web to its boundaries. Sadly, perhaps it's been pushed a bit too far...

Szabi

I don't think that we are building web pages anymore, at least not the majority of the time. We are building complex applications that execute business logic, deal with asynchronous interactions, and are data-driven in most cases.
Consider these apps similar to what you can find on the App Store or Google Play, but they run in the browser. That said, I do believe that we as web developers can do a better job when it comes to correct usage of HTML5.

Not all elements should be a <div>.

..