You might have read my article "Rethinking Vue Full stack", where I presented a prototype of what I called a "Frackend". Back then, I only had a proof of concept. But first things first: why should you care?
Unless we are one of the unicorns who actually make a living from crowd-funded open source projects, we are likely employed or hired by companies that market or sell something on the internet. And at the end of the day, those companies aren't impressed by our ability to bubble sort or by our ToDo SPAs on Heroku. At least they shouldn't be (that the people hiring us often don't understand this themselves is a rant for another time). Instead, these companies are fighting a battle between two competing principles: organic traffic vs. user experience. If you have ever run Lighthouse, you have enjoyed little insights like
"Walmart noticed a 1% increase in revenue for every saved 100ms"
About a year ago I was working for a client who represented over 3000 smaller businesses and conducted an analytical evaluation of revenue vs. page speed. The outcome of this study: revenue measurably went down as soon as loading times exceeded 1.7 seconds, and
every page load exceeding 3 seconds resulted in a considerable amount of lost revenue.
The truth is that when people browse products, they lose interest quickly when loading times are slow, and with over 90% of traffic being mobile, internet speed itself isn't always optimal in the first place. Think about your own attention span: what is the title of this post, and with what "trick" did I confuse you? (Check the URL to get the full picture.)
I guess I don't have to explain why most companies are highly interested in SEO: if I offer content (e.g. newspapers, blogs, etc.) or products (i.e. online shops), I need to be found. If someone searches for "awesome red sneakers" and not directly for "step-by-step-shoes.com", we want to be in the game. And organic traffic is the best traffic there is: it's free. Now, when I say free, I don't mean that there is no cost related to SEO optimization; after all, it's a complete industry in itself. I mean that organic traffic doesn't correlate to customer acquisition the way advertising does. Let's work through an example to understand why SEO-optimization companies can ask for such outlandish rates:
A given ad campaign consists of customer analysis, assets, and placement. Simplified: I first analyze whom I want to target, then create the marketing material (the actual ad), and then place it somewhere (Facebook ads, YouTube ads, Google AdWords, etc.). This results in an initial investment (whatever it cost to create this setup) and variable costs (whatever every impression, click, or similar costs me). With tracking in place, we can derive a direct correlation that allows us to make statements like "a new customer costs us X US$" (or whatever currency you calculate in). Needless to say, as soon as you stop spending, the traffic subsides. Organic traffic is different: if you manage to gain X additional visitors through organic traffic, whatever revenue that traffic generates keeps flowing until competitors manage to pull it away from you. Additionally, whatever a company spent on this optimization carries no variable cost per visitor.
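To make that correlation concrete, here is a minimal back-of-the-envelope calculation. All numbers are purely illustrative assumptions, not data from any real campaign:

```javascript
// Hypothetical ad-campaign numbers, for illustration only.
const setupCost = 5000;       // one-time: analysis + creative + placement setup
const costPerClick = 0.8;     // variable cost per ad click
const clicks = 20000;         // clicks bought over the campaign
const conversionRate = 0.02;  // 2% of clicks become paying customers

const customers = clicks * conversionRate;            // 400 new customers
const totalSpend = setupCost + clicks * costPerClick; // 5000 + 16000 = 21000
const costPerCustomer = totalSpend / customers;

console.log(costPerCustomer); // → 52.5 US$ per acquired customer
```

Stop paying per click and the visitors stop; with organic traffic, only the one-time part of the equation exists.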
If you now realize there is a reason why WordPress and Shopify are so popular despite being technically unimpressive solutions, you are not alone. However, to optimize something like an Adobe Magento e-commerce shop or similar, you have to invest in knowledge and hardware. And yet: even if you do a really good job, you won't get close to the perceived speed a client-side rendered solution offers. And what good is all that traffic if people bounce?
Thankfully, solutions to this problem are gaining traction. Their approach is relatively simple: let developers work as if they were building a single-page application, while in reality a hybrid solution is rendered. How does this work? Let's assume you visit "my-webshop.com/products/basketball". The server responds with the full HTML for this endpoint, making it indexable by search engines. In a second step, the SPA logic kicks in and gives the user the fast reactivity that is important for a successful experience. As a side effect, the initial load feels faster, because the content is visible before it becomes reactive, which takes care of the slow initial page load SPAs usually suffer from. Most of these solutions even "pre-deliver" asynchronous data. And that makes sense: why would you deliver content that then executes a call back to the very server it came from to ask for additional data? After all, we already know what data that route needs. This speeds up the hydration of our SPA even further. Additionally, smart prefetching based on rendered or visible routing links (depending on the particular technology) combines the start-up savings of dynamic imports with the effect that chunks are already delivered by the time they are needed. So there you go, problem solved! Well, not so fast:
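The "pre-delivered data" part of this pattern can be sketched in a few lines. This is a generic illustration of the technique, not the actual API of vastN3 or any particular framework; `renderPage`, `hydrate`, and `__INITIAL_STATE__` are names I chose for the example:

```javascript
// Server side (sketch): render the route's HTML and embed its data, so the
// client never has to call back to the server for what it already got.
function renderPage(appHtml, preloadedData) {
  return `<!DOCTYPE html>
<html>
<body>
  <div id="app">${appHtml}</div>
  <script>window.__INITIAL_STATE__ = ${JSON.stringify(preloadedData)}</script>
  <script src="/bundle.js"></script>
</body>
</html>`;
}

// Client side (sketch): instead of fetching again, the SPA reuses the
// embedded state when it takes over the server-rendered DOM.
function hydrate(store) {
  if (typeof window !== 'undefined' && window.__INITIAL_STATE__) {
    Object.assign(store, window.__INITIAL_STATE__);
  }
  return store;
}
```

Note that naive `JSON.stringify` embedding needs escaping in real code (a `</script>` inside the data would break the page); the sketch skips that for brevity.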
Not every scenario is covered as easily as one might think. Take iterations, for example. Imagine I have an online shop and want to display all products of a certain category. Depending on the number of results, this can lead to quite a large initial page load if rendered server-side. After all, we first have to get those products from a database before the server can even start rendering the HTML. But if we don't render them on the server (which is currently the recommended approach for this scenario in all major solutions), then from an SEO perspective our products aren't on that page. The pros will probably point to reducing the fetched data fields with GraphQL, then loading additional resources (like further pictures) in a second step and merging the objects after hydration. But as you notice, we are diving into a complex, individual, and work-intensive pool here, and the idea of "you only work on the frontend, the server builds itself" has died. I could bring up additional examples, but let's rather focus on what would need to be solved: we need a declarative way of deciding what data is preloaded and which DOM manipulations run on the client, the server, or both, in order to create a solution that enables even relatively inexperienced developers to rapidly build apps with measurable performance and SEO advantages in a fraction of the time.
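The two-step workaround the pros would suggest looks roughly like this. Everything here is a hypothetical sketch: `fetchMinimal` and `fetchDetails` stand in for whatever GraphQL queries or REST endpoints a real shop would use:

```javascript
// Sketch of the two-phase category load described above.
// Phase 1 runs on the server (small, SEO-relevant fields only);
// phase 2 runs on the client after hydration (heavy fields).
async function loadCategory(fetchMinimal, fetchDetails) {
  // Phase 1: only what search engines and the first paint need.
  const products = await fetchMinimal(); // e.g. [{ id, name, price }]

  // Phase 2: heavy extras such as image galleries, reviews, etc.
  const details = await fetchDetails(products.map(p => p.id));

  // Merge the detail objects into the minimal ones by id.
  const byId = new Map(details.map(d => [d.id, d]));
  return products.map(p => ({ ...p, ...byId.get(p.id) }));
}
```

Workable, but notice how much per-page plumbing this is: exactly the complexity a declarative solution should take off the developer's plate.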
Well, we are working on it. A few days ago we uploaded the vastN3-based version of blua.blue to test-drive the technology in an asset-heavy (and, sorry, not yet perfectly designed) environment. And the results speak for themselves. To get an intuitive feeling for it, try the following:
1. Visit blua.blue and navigate, search, explore
2. Open up the source of a given page and see what is delivered by the server
3. Verify indexing by searching for any article currently published (e.g. google?q=...)
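If you want to automate step 2, a tiny script can fetch a page the way a simple crawler would (no JavaScript execution) and check whether the content is already in the raw response. The function names and the idea of checking via `includes` are my own illustration, and the fetch wrapper assumes Node 18+ with the global `fetch`:

```javascript
// Pure check: is the text present in the raw HTML, i.e. without running JS?
// A purely client-rendered app ships an empty mount point like
// <div id="app"></div>; a hybrid/SSR app ships the visible text itself.
function containsContent(rawHtml, expectedText) {
  return rawHtml.includes(expectedText);
}

// Fetch like a crawler would and apply the check (Node 18+; URL is an example).
async function checkServerDelivered(url, expectedText) {
  const res = await fetch(url, { headers: { Accept: 'text/html' } });
  return containsContent(await res.text(), expectedText);
}
```

Running this against a server-rendered page returns true for text you can see in the browser; against a classic SPA it returns false, which is precisely the SEO problem discussed above.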
What you will find is that the site feels very fast while still covering your client's or employer's SEO needs.
We are not yet ready to offer vastN3 for production projects, but be sure to follow me here or on GitHub to take advantage of what I hope will be the easiest way you have ever developed hybrid applications.