Conceptually I have been thinking about render farms and how this peer-to-peer rendering might apply to the web. 3D artists use render farms to speed up rendering of complex and intensive calculations by spreading the load amongst a literal farm of machines dedicated to the task.
As you visit a website, you and several other visitors are each talking to some sort of server about what you need in order to view the page. None of you are working together to share what you might already have. It's quite a selfish model if you think about it; the poor server is under so much load (assuming no load balancing). Let's think torrents for a moment: once you have those precious bits, you can become a seed and share what you have at the same time as downloading what you need, and this gets faster if there are more peers seeding.
So could the web work this way? Well, I think it's possible, with a combination of the following technologies, to do at least 10% of the above.
- WebRTC
- Headless Chrome
- IndexedDB
- Server Sent Events

🧙 WHIS stack (concept)
Now you might be thinking either: what is all that? Or: that sounds like the test suite at Netflix or GitHub. You are correct in at least feeling a little puzzled.
WebRTC is typically used for video and audio streaming between clients, but it can also send text and buffers. Headless Chrome is found testing the web, powering bots and more. IndexedDB gives us client-side persistence, and SSE provides single-direction real-time communication from server to client (like WebSockets, only in one direction).
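To make the SSE part concrete, here's a rough sketch of the wire format, which is just text: an optional `event:` line, `data:` lines, and a blank line to end the frame. The `WorkerEvent` shape and the `worker-died` event name are made up for illustration, not from any real implementation:

```typescript
// A hypothetical event the server might push to peers over SSE.
interface WorkerEvent {
  event: string;  // made-up name, e.g. "worker-died"
  data: unknown;  // any JSON-serialisable payload
}

// Build the text frame an SSE endpoint would write to the response
// stream (after sending the Content-Type: text/event-stream header).
function formatSSE({ event, data }: WorkerEvent): string {
  return `event: ${event}\ndata: ${JSON.stringify(data)}\n\n`;
}

// Example: telling the peers their Worker Browser has gone away.
const frame = formatSSE({ event: "worker-died", data: { workerId: 7 } });
```

In the browser, an `EventSource` pointed at that endpoint would receive these frames and fire a listener per event name; there is no channel back to the server, which is exactly the one-directional property mentioned above.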
What if, for instance, for every 30 visitors a headless Chrome 'worker' is spawned? This browser's job is to create a WebRTC P2P rendering farm to share data between its 30 owners. It will also persist state in its local IndexedDB for those guests (possibly working around the whole 🍪 and laws situation? Neither client nor server saves the data). The peers will send signals to the headless browser, which we will coin the 'Worker Browser'; these signals will relate to fetching cached copies of the page, fetching UI and state changes, and providing SSR. If needed, the server can communicate with the Worker Browser and its associated peers through Server Sent Events (oh no, the worker died).
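The "one Worker Browser per 30 visitors" idea and the signal kinds above can be sketched as plain logic. Everything here is an assumption invented for the sketch: the `PEERS_PER_WORKER` constant, the `Signal` union, and the `assignWorker` helper:

```typescript
// Hypothetical constant: how many peers one Worker Browser serves.
const PEERS_PER_WORKER = 30;

// The kinds of signal a peer might send to its Worker Browser,
// mirroring the ideas above: cached pages, state changes, SSR.
type Signal =
  | { type: "fetch-cached-page"; url: string }
  | { type: "state-change"; key: string; value: unknown }
  | { type: "ssr-request"; route: string };

// Given a new visitor's index, work out which worker they belong to,
// and whether this visitor lands on a boundary that spawns a new one.
function assignWorker(visitorIndex: number): { worker: number; spawnNew: boolean } {
  const worker = Math.floor(visitorIndex / PEERS_PER_WORKER);
  return { worker, spawnNew: visitorIndex % PEERS_PER_WORKER === 0 };
}
```

So visitor 0 spawns worker 0, visitor 30 spawns worker 1, and everyone in between just joins the existing farm.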
It would be great if requests could be chunked and torrented, but I'm not sure how. Is request interception a thing?
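Request interception is a thing: a Service Worker's `fetch` event can intercept a request and respond with bytes assembled from anywhere, including peers. The torrent-style part can at least be sketched as pure logic, splitting a payload into indexed pieces and reassembling them in any arrival order. The `CHUNK_SIZE` constant and the `Piece` shape are invented for this sketch:

```typescript
// Tiny chunk size to keep the example readable; a real setup
// would use something far larger.
const CHUNK_SIZE = 4;

interface Piece {
  index: number;      // position in the original payload
  bytes: Uint8Array;  // the chunk itself
}

// Split a payload into indexed pieces that peers could seed.
function split(payload: Uint8Array): Piece[] {
  const pieces: Piece[] = [];
  for (let i = 0; i * CHUNK_SIZE < payload.length; i++) {
    pieces.push({ index: i, bytes: payload.slice(i * CHUNK_SIZE, (i + 1) * CHUNK_SIZE) });
  }
  return pieces;
}

// Reassemble pieces that may have arrived out of order from
// different peers, sorting by index before concatenating.
function join(pieces: Piece[]): Uint8Array {
  const sorted = [...pieces].sort((a, b) => a.index - b.index);
  const total = sorted.reduce((n, p) => n + p.bytes.length, 0);
  const out = new Uint8Array(total);
  let offset = 0;
  for (const p of sorted) {
    out.set(p.bytes, offset);
    offset += p.bytes.length;
  }
  return out;
}
```

A Service Worker could then answer the intercepted fetch with `join(pieces)` wrapped in a `Response`, with the pieces themselves arriving over the WebRTC data channels.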
This was a highly speculative post with, I'm sure, a lot of holes 🕳️, so let's chat about it down in the comments.