Because I was describing the '90s-style web.
A full page refresh just can't keep up with background requests that render results as they arrive. That implies SOME logic on the client.
This is especially true for remote sites. At best you'll have a speed-of-light delay before retrieving content; in the worst case, network fluctuations add a very annoying delay just to get new content. And yes, your customer will be staring at a WHITE page, because the browser viewport has nothing to render.
This is why I am so assertive.
If you don't like it, fine - learn your way ;)
Modern browsers are smart enough to cache your megabytes once and then transmit pure JSON back and forth, reducing the overall need for bandwidth. Plus it happens behind the scenes, so the customer can enjoy that spinner animation or keep looking at previously loaded content.
Sure, you could do it backwards, the good old-fashioned way of "I click, it reloads". But then, when the server gets slow or even returns a 504, users will complain the site is slow and unreliable. Instead you could've used AJAX (which we've been doing for a looooong while now) and provided helpful, human-readable status messages, without sending your users away or losing their work just because your API isn't reliable.
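To make that concrete, here's a minimal sketch of the AJAX approach described above: instead of a full reload that can strand the user on a blank page, a background `fetch` turns failures into human-readable messages while the current page (and any unsaved work) stays intact. The function name and messages are my own illustration, not anyone's actual API.

```javascript
// Hypothetical helper: load JSON in the background and translate failures
// into friendly status messages instead of a dead white page.
// fetchFn is injectable so the logic can be exercised without a network.
async function loadContent(url, { fetchFn = fetch } = {}) {
  try {
    const res = await fetchFn(url);
    if (!res.ok) {
      // e.g. a 504 from a slow upstream: tell the user, keep their page.
      return { ok: false, message: `Server returned ${res.status}, please try again.` };
    }
    return { ok: true, data: await res.json() };
  } catch (err) {
    // Network fluctuation: the page and any unsaved work survive.
    return { ok: false, message: "Network error, your work is still here." };
  }
}
```

The point isn't the error strings, it's that the failure mode is a message in the page rather than a lost page.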
You can also do both: use server-side templating for the first load, then use JS to augment the website and cache.
Basically what dev.to is doing, and it's pretty fast at it too.
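A rough sketch of that hybrid approach, with illustrative names (the template, routes, and selectors are assumptions, not dev.to's actual code): the server renders full HTML for the first paint, and the client reuses the same template to render JSON updates later.

```javascript
// Hypothetical template shared by server and client.
function renderArticle({ title, body }) {
  return `<article><h1>${title}</h1><p>${body}</p></article>`;
}

// Server side (e.g. in an Express handler): first paint is real HTML,
// so the page works even before any JS loads.
//   res.send(renderArticle(article));

// Client side: later interactions fetch JSON and reuse the template,
// so only data crosses the wire after the initial load.
//   const data = await (await fetch("/api/article/42")).json();
//   document.querySelector("#main").innerHTML = renderArticle(data);
```

Sharing one template between first paint and later updates is what keeps the two rendering paths from drifting apart.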
There's no single way to build a website anymore, and that's great for everyone. Yes, the complexity of frontend programming is escalating, but there's a lot of innovation going on at the same time.