Abdul


WebSockets and HTTP streaming in .NET πŸ€πŸ“²

I've had a hand in maintaining persistent connections, but I had never really experienced creating one with a data source from scratch until I started playing around with the Twitter live tweet stream API (link here).

Being new to it all, I wondered what some of the ways of keeping a constant connection with a server would be in .NET (currently using C#). This small post will briefly explain WebSockets and HTTP streaming, and round up some of the differences between them.

WebSockets and HTTP

Before starting the explanation, let's go through what the standard HTTP protocol is and the situations it is usually used in. HTTP is the standard way for a client and server to interact with each other, via requests and responses. It is stateless by design, so a sort of handshake is needed for every request initiated.

Client to server communication via HTTP

This leads us to the definition of a WebSocket: unlike HTTP, it is not stateless. The connection it creates can be reused, as it stays open throughout its life of use (pictures taken from this blog link).

Client to server communication via websockets

From the picture we can safely assume that the situations that would warrant a WebSocket are clients that constantly need a single live connection, like live scores or other things that require instant updates. Applying plain HTTP requests to these situations would not only be expensive in terms of the constant request-response processing, it would send me over the flipping edge just knowing that it's happening! πŸƒβ€β™‚οΈπŸ’¨πŸ’¨πŸ’¨ 🏒
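In .NET, opening a client-side WebSocket like this can be done with the built-in `ClientWebSocket` class. Here's a minimal sketch; the `wss://example.com/live-scores` endpoint is a made-up placeholder, and I'm assuming the server pushes UTF-8 text messages:

```csharp
using System;
using System.Net.WebSockets;
using System.Text;
using System.Threading;
using System.Threading.Tasks;

class WebSocketSketch
{
    static async Task Main()
    {
        using var ws = new ClientWebSocket();

        // Placeholder endpoint; swap in a real ws:// or wss:// URL.
        await ws.ConnectAsync(new Uri("wss://example.com/live-scores"), CancellationToken.None);

        var buffer = new byte[4096];

        // The single connection stays open; we just keep receiving messages on it.
        while (ws.State == WebSocketState.Open)
        {
            var result = await ws.ReceiveAsync(new ArraySegment<byte>(buffer), CancellationToken.None);

            if (result.MessageType == WebSocketMessageType.Close)
            {
                await ws.CloseAsync(WebSocketCloseStatus.NormalClosure, null, CancellationToken.None);
                break;
            }

            Console.WriteLine(Encoding.UTF8.GetString(buffer, 0, result.Count));
        }
    }
}
```

Notice there's exactly one `ConnectAsync` (one handshake) and then an unbounded receive loop, which is the whole point compared to repeated HTTP requests.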

HTTP Streaming?

If that is the case, then what could HTTP streaming possibly be if HTTP itself is a request-response based protocol? It's a fair question, and prepare to be a little confused. It is true that the HTTP protocol is based on request (client) and response (server); however, we never said that the response from the server needs to have an end in all cases.

It is essentially a single HTTP request whose response is held open until the server says otherwise. Imagine the client sending an HTTP request and getting back a response with no end in sight. That's essentially what is happening here. To enable this, the server must set its `Transfer-Encoding` header to `chunked`. This lets the client know that the data will arrive in chunks, without knowing how long the content is or when it will end (the server would never want to set a `Content-Length` if the main purpose of its response is to maintain an open stream).
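On the client side in C#, this can be sketched with `HttpClient`. The trick is passing `HttpCompletionOption.ResponseHeadersRead`, so the call returns as soon as the headers arrive instead of trying to buffer a body that may never end. The `https://example.com/stream` URL here is a placeholder, and I'm assuming newline-delimited payloads (as the Twitter stream uses):

```csharp
using System;
using System.IO;
using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;

class HttpStreamSketch
{
    static async Task Main()
    {
        using var client = new HttpClient
        {
            // The default 100-second timeout would kill a long-lived stream.
            Timeout = Timeout.InfiniteTimeSpan
        };

        var request = new HttpRequestMessage(HttpMethod.Get, "https://example.com/stream");

        // ResponseHeadersRead: hand control back once headers arrive,
        // leaving the body open as a stream we can read from indefinitely.
        using var response = await client.SendAsync(request, HttpCompletionOption.ResponseHeadersRead);
        response.EnsureSuccessStatusCode();

        using var stream = await response.Content.ReadAsStreamAsync();
        using var reader = new StreamReader(stream);

        // Each line shows up as the server flushes another chunk.
        while (!reader.EndOfStream)
        {
            var line = await reader.ReadLineAsync();
            if (!string.IsNullOrWhiteSpace(line))
                Console.WriteLine(line);
        }
    }
}
```

One request, one response, and the loop just keeps reading for as long as the server keeps the stream open.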

HTTP streaming example

Side-note for HTTP streams

We may not need to explicitly set the Transfer-Encoding to β€œchunked” on the server, because as long as we don't define a Content-Length in the headers, I think the Transfer-Encoding of the response will automatically be set to chunked (the first picture shows the headers on the request message and the second shows the headers on the response message).
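For completeness, here's a rough sketch of what the server side could look like as an ASP.NET Core minimal API. The endpoint name and payload are made up; the point is that nothing sets a Content-Length, so an HTTP/1.1 response written like this should go out with `Transfer-Encoding: chunked`:

```csharp
using System;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Http;

var builder = WebApplication.CreateBuilder(args);
var app = builder.Build();

app.MapGet("/stream", async context =>
{
    // No Content-Length is set anywhere, so the framework has to
    // fall back to chunked transfer encoding for HTTP/1.1.
    context.Response.ContentType = "application/json";

    // Keep writing until the client disconnects.
    while (!context.RequestAborted.IsCancellationRequested)
    {
        await context.Response.WriteAsync("{\"tick\":true}\n");
        await context.Response.Body.FlushAsync();
        await Task.Delay(TimeSpan.FromSeconds(1), context.RequestAborted);
    }
});

app.Run();
```

Flushing after each write is what actually pushes a chunk down the wire instead of letting it sit in a buffer.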

First pic

Second pic

Which one to use?

I feel that it depends on the situation and the server we are interacting with. In my case with Twitter tweet streams, they mention that they β€œ... deliver Tweet objects in JSON format through a persistent HTTP Streaming connection.” This is essentially what led me to choose the latter option. I tried to see if they had a β€œws://” endpoint lurking somewhere, but no luck. Let me know if I missed something please πŸ™Œ

I like the fact that with HTTP streaming it's just one call: one simple request with not much overhead at all, because the connection happens only once. I'd also never tried it before, so that's all the more reason to see how far I can go with it when trying this Twitter stream API πŸ™ˆ

Thanks for coming through, and stick around for the next blog, where I document the steps to creating an efficient Twitter tweet bot in .NET using HTTP streams!
