The end of the year is a good time to put past mistakes behind us and take a hopeful peek into the future. So, as an avid JS/TS full-stack development enthusiast, I've plucked up my courage to make some forecasts for the upcoming year. They're probably all quite controversial, so hopefully they can ignite more meaningful discussions, and I'm eager to hear your opinions.
Edge deployment to become mainstream
Whether you've used it or not, you must have repeatedly heard about "running on the edge" in 2022. The web development ecosystem continues to be obsessed with optimizing TTFB, and for framework builders, supporting running on the edge is no longer just a cool thing but a must-have:
- Next.js graduated its "experimental edge" API routes to GA. Its beta "React Server Component" support and new data loading patterns are fully edge-compatible.
- Nuxt.js 3.0 completely rebuilt its server engine and fully supports edge deployment.
- Remix is actively promoting deployment on edge services like Cloudflare workers.
- SvelteKit 1.0, after such a long wait, has finally landed and definitely didn't miss edge support.
In the simplest sense, the edge is a CDN upgraded to run custom code, so that it can serve not only static assets but also dynamic content to clients. The challenge of implementing edge computing is executing code securely, cheaply, and with consistently fast performance.
Today, most web apps are still deployed in traditional hosting environments. There's a good chance that many will try deploying on the edge, and some will eventually settle on it. With today's developer-friendly frameworks and their improving integration with hosting providers, you can often migrate to the edge without much code change and enjoy some benefits right away:
- SSR pages should immediately load faster in many cases
- Edge networks can cache your data to reduce the load of data store
- Content personalization is easier to achieve as decisions can be made right on the edge
- You'll likely get better results with less cost
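To make this concrete, here's a minimal sketch of what an edge function tends to look like. It uses the Web-standard `Request`/`Response` (fetch) API that runtimes like Cloudflare Workers and Vercel Edge Functions share; the route and response shape are made up for illustration.

```typescript
// A minimal edge-style handler built on the Web-standard fetch API
// (Request/Response), the common denominator across edge runtimes.
// The /greet route and its response shape are illustrative only.
export async function handleRequest(req: Request): Promise<Response> {
  const url = new URL(req.url);
  const name = url.searchParams.get("name") ?? "world";
  // Runs close to the user, so the response travels a short network path.
  return new Response(JSON.stringify({ greeting: `Hello, ${name}!` }), {
    headers: { "content-type": "application/json" },
  });
}
```

Each edge runtime wraps a handler like this in its own registration API, but the request-in, response-out core stays the same.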
But the edge is not a silver bullet. One of the biggest gray areas about its future is data fetching. Most databases today are still not globally distributable, and that's unlikely to change soon. Accessing a centrally deployed database from an edge network, if not properly planned, can make performance worse. Check out the following video for a measured test:
That's why Prisma, even while actively developing and advocating its edge-oriented Data Proxy, honestly states that:
As a best practice, we still recommend to generally deploy your database as closely as possible to your API server to minimize latency in response times.
It's definitely something you should weigh before moving to the edge. The mitigation, as Prisma implied, is to confine your edge network to regions close enough to your database. Some frameworks, e.g., Next.js, already support configuring edge regions at the route level. And if you're already using a globally distributed data store, I've got nothing to warn you about, only congratulations 🎉.
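As a sketch of such route-level configuration, here's what pinning an edge route near a database might look like in Next.js 13. The option names below follow the Next.js/Vercel route segment config, and the region ID is an assumed example, so verify against your framework version:

```typescript
// Hypothetical Next.js 13 route segment config: run this route on the edge,
// but only in a region close to the (assumed us-east) database.
// Treat the option names and region ID as a sketch, not gospel.
export const runtime = "edge";
export const preferredRegion = "iad1"; // example Vercel region identifier
```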
Unsettled debates about data loading patterns
While developers were still sorting out CSR, SSR, SSG, ISR, etc., Next.js 13 landed in style by making React Server Components its new default (still in beta, though). This generated a lot of excitement about even better performance for future web apps but has undoubtedly made things even more confusing. Nevertheless, the idea is pretty cool: server components render on the server, and their code stays there. This can dramatically reduce the client bundle size, but at the same time it makes the network boundary blurrier than before. As a result, your app's behavior can be harder to reason about.
Check out my other post for a more thorough explanation of Next.js 13's RSC: "Fun With Next.js 13 Server Components" (ymc9 for ZenStack, Nov 25 '22).
There have been endless debates about the best way of loading data in full-stack web apps, and it'll just continue getting more heated because there's just no single best solution that fits all scenarios:
Do you have slow backend data fetches?
RSC + streaming provides an excellent solution to render and stream UI asynchronously with fine granularity.
Should you care more about bundle size or network traffic after the page is interactive?
RSC reduces bundle size at the cost of more network traffic when components rerender, because it streams UI (a serialized virtual DOM) instead of data. Traditional route-based data loading, like what Remix has been doing, doesn't suffer from this problem.
Do you prefer strict separation of data and components, or would you rather collocate them?
It's more a matter of taste. The latest advice from most frameworks is "fetch, then render" (to avoid render-fetch waterfalls), but Next.js 13 (with RSC and built-in fetch deduplication) offers an interesting opportunity to "fetch as you render" performantly.
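The "fetch, then render" advice can be sketched with plain async functions. The fetchers and delays below are stand-ins for real network calls, not any framework's API:

```typescript
// Simulated data fetchers; delays stand in for network round trips.
const delay = (ms: number) => new Promise((res) => setTimeout(res, ms));

async function fetchUser() {
  await delay(50);
  return { id: 1, name: "Ada" };
}

async function fetchPosts() {
  await delay(50);
  return [{ title: "Hello" }];
}

// Render-fetch waterfall: the second fetch starts only after the first
// finishes, so the total wait is roughly the sum of both (~100ms here).
export async function waterfall() {
  const user = await fetchUser();
  const posts = await fetchPosts();
  return { user, posts };
}

// "Fetch, then render": kick off both fetches up front and await them
// together, so the total wait is roughly the slowest one (~50ms here).
export async function fetchThenRender() {
  const [user, posts] = await Promise.all([fetchUser(), fetchPosts()]);
  return { user, posts };
}
```

The waterfall version is what you get when each component fetches its own data on mount; the parallel version is what route-level loaders and RSC's deduped fetches are designed to encourage.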
At the same time, don't forget that non-React ecosystems, like Vue and Svelte, don't have the notion of server components at all, and their developers are doing just fine. Another interesting event is Remix joining Shopify. Shopify's front-end framework Hydrogen initially bet on RSC, and acquiring Remix effectively means it's changing direction, at least for now. Here's an interesting comment from Vue's creator 😄:
So how should you survive this chaotic world? I suggest evaluating which framework to stick with based on other aspects. A modern framework usually won't fail you badly on its data loading mechanism; they have different preferences and emphases but generally offer good enough solutions for most scenarios. I like Remix's metaphor of "levers":
Frameworks will offer all levers, but it's your responsibility to pull.
The fading of APIs
Modern full-stack frameworks are also quietly making explicit APIs disappear. For example, technically speaking, `getStaticProps` in Next.js is an API, though you never call it explicitly from your client-side code. The same is true for `action` in Remix, as well as the `load` function in SvelteKit. The network communication, both its timing and format, is so well encapsulated that you're barely aware these are APIs. Your client and server code also naturally share type definitions (assuming you use TypeScript, and yes, you should). These features will likely cover all your needs for SSR, SSG, and part of your client-side interactions.
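The pattern shared by `getStaticProps`, `action`, and `load` can be sketched in framework-agnostic TypeScript: a server-only function whose return type flows straight to the client. Everything below (the `Post` type, the stub data, the function names) is made up for illustration:

```typescript
// Framework-agnostic sketch of the loader pattern behind getStaticProps /
// action / load. The Post type and stub data are illustrative only.
type Post = { slug: string; title: string };

// Runs only on the server and is never shipped to the browser;
// a real app would query a database here.
export async function load(): Promise<{ posts: Post[] }> {
  return { posts: [{ slug: "hello-2023", title: "Hello 2023" }] };
}

// The client side infers its props from load's return type, so the
// client/server "API" exists but never has to be written down.
type PageProps = Awaited<ReturnType<typeof load>>;

export function renderList({ posts }: PageProps): string {
  return posts.map((p) => `<li>${p.title}</li>`).join("");
}
```

Change the shape of what `load` returns, and the compiler immediately flags every client-side consumer; no API spec ever needs updating.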
Most frameworks also allow you to create "explicit" API endpoints for more flexible client-side interactions. Your initial instinct will probably be: should I build a RESTful or GraphQL service with it? But give it a second thought: your code already lives in a monorepo, with the client and server-side code sitting tightly together, written in the same language. So why take the trouble of introducing a toolchain and building such a solemn communication contract between them?
tRPC offers a fantastic way of solving the problem with less rigor. It takes full advantage of your setup (monorepo, unified framework, common language) and effectively turns API construction into writing functions and calling them. It's another approach to letting APIs fade away, in a framework-independent way.
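This isn't tRPC's actual API, but the core idea behind it can be sketched in a few lines of plain TypeScript: the server is just a record of functions, and the client's call signatures are derived from it.

```typescript
// Toy sketch of the idea behind tRPC (not its real API): procedures are
// plain functions, and the client is typed against them automatically.
const appRouter = {
  greet: (name: string) => `Hello, ${name}!`,
  add: (a: number, b: number) => a + b,
};

type AppRouter = typeof appRouter;

// A typed "client call": procedure names, argument types, and return types
// all come from the router above, with no schema or codegen in between.
export function call<K extends keyof AppRouter>(
  proc: K,
  ...args: Parameters<AppRouter[K]>
): ReturnType<AppRouter[K]> {
  // A real client would serialize args over HTTP; here we call in-process.
  const fn = appRouter[proc] as (...a: unknown[]) => ReturnType<AppRouter[K]>;
  return fn(...args);
}
```

Mistype a procedure name or pass the wrong argument type, and the compiler catches it at the call site, which is exactly the experience tRPC productizes.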
There are obvious trade-offs: your APIs are now tightly bound to your app (and even a specific framework) and not easily consumable by a third party. But in reality, that's often not much of a concern.
With the growing power of frameworks' data loading features and pushes from wild ideas like tRPC, in 2023 we should see more people get comfortable with being "less serious" about building APIs and instead focus more on building their products.
The popularity of Web IDEs
Advances in tools and frameworks don't just give us more power to solve problems; they also make our day-to-day work snappier. A particular domain worth your attention is browser-based IDEs.
People have been using tools like JSFiddle for quick experiments for years, but those have been limited to client-side code. Full-fledged web IDEs, however, matured fast in 2022. For example, CodeSandbox now provides good support for full-stack frameworks like Next and Nuxt by spinning up remote containers to run the server-side workload, emulating a "local" experience for you. Gitpod adopts similar technology but looks more ambitious in reaching deeper into the development life cycle.
The most exciting of all is StackBlitz. It took the courageous step of implementing a Node.js runtime on top of WebAssembly (called WebContainers), so your backend code can run right inside your browser: no remote containers to spin up, and no data transmitted back and forth across the network. It's a truly local environment, and this approach may well be the most practical route to making Web IDEs mainstream.
It's still an immature domain but seems to be evolving fast, so I do believe 2023 brings a chance of serious adoption, at least for tasks like code review or ad-hoc bug fixing.
In the end
I sincerely thank everyone for staying with me to the end of the year. It's been another rough one for many places in the world: the virus was still spreading, inflation kept running high, and, even worse, wars were still being fought. While the world is full of stupidities you can't control, learning, programming, and sharing ideas have always been an excellent consolation. Advances in technology, especially in OSS development where different people join forces and transcend their self-interest, prove that we're still a hopeful species.
I wish everyone a happy new year and a prosperous life ahead. As Alexandre Dumas wrote, all human wisdom is summed up in two words: wait and hope.
Happy new year 🎉!
P.S. We're building ZenStack — a toolkit for building secure CRUD apps with Next.js + TypeScript. Our goal is to let you save time writing boilerplate code and focus on building what matters — the user experience.