Poorshad Shaddel

Originally published at levelup.gitconnected.com

Understand and Implement Long-Polling and Short Polling in Node.js

HTTP polling is a technique we can use to build real-time apps. In this article you will read about what short polling and long polling are, and we will implement examples of both in Node.js.

What is Short Polling?

In short polling, the client sends requests to the server at regular intervals. If the server has new information, it sends it back to the client; otherwise it returns an empty response.

How long this interval should be depends on the use case, but the shorter it becomes, the more pressure you put on the server.
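On the client side, short polling is usually nothing more than a timer that fires a request at a fixed interval. Here is a minimal browser-style sketch, assuming an update endpoint like the /weather/update route we build later in this article (fetch is available in modern browsers and Node 18+); the interval is just a placeholder:

const POLL_INTERVAL_MS = 60 * 1000 // how often we ask the server for updates

async function poll() {
  try {
    const response = await fetch('http://localhost:8000/weather/update')
    const data = await response.json()
    console.log('latest changes:', data)
  } catch (err) {
    // an empty or failed response just means we try again on the next tick
    console.error('polling request failed:', err)
  }
}

setInterval(poll, POLL_INTERVAL_MS)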

What are the use cases of Short Polling?

If you do not need very frequent updates (the interval is measured in minutes rather than seconds, though there is no hard rule for the cutoff), short polling is a simple way to keep your clients up to date. A good example is reporting the temperature of the client's environment: temperature is not something we need to update every second, so refreshing it from the server every one or two minutes is enough.


Short Polling Diagram

Let’s implement a simple weather API with short polling.

Short Polling Example

This is a simple Express application for getting weather data in real time.

We need two routes to implement the latest weather status:

  • /weather : for getting the weather data the first time
  • /weather/update : for getting updates to the weather data.

In the first route we return the whole weathers object:

const app = require('express')()

// current temperature per city (our in-memory data store)
const weathers = {
    "Berlin": 15,
    "Vienna": 17,
    "Paris": 14,
    "Madrid": 25
}

app.get('/weather', (req, res) => {
    res.json({
        weathers
    })
})

To simulate weather changes, we generate one of the random numbers -1, 0, or +1 to indicate a change in a city’s temperature:

Math.ceil(Math.random()*3 - 2) // A random number: -1, 0, +1

To imitate a change in temperature we can use a function like this one:

const changes = {}
function getWeatherChangesFromThirdParty() {

  const berlinChanges = Math.ceil(Math.random()*3 - 2)
  if (berlinChanges !== 0) {
    changes["Berlin"] = weathers["Berlin"] + berlinChanges
    weathers["Berlin"] = changes["Berlin"]
  } else {
    delete changes["Berlin"]
  }

  const viennaChanges = Math.ceil(Math.random()*3 - 2)
  if (viennaChanges !== 0) {
    changes["Vienna"] = weathers["Vienna"] + viennaChanges
    weathers["Vienna"] = changes["Vienna"]
  } else {
    delete changes["Vienna"]
  }

  const parisChanges = Math.ceil(Math.random()*3 - 2)
  if (parisChanges !== 0) {
    changes["Paris"] = weathers["Paris"] + parisChanges
    weathers["Paris"] = changes["Paris"]
  } else {
    delete changes["Paris"]
  }

  const madridChanges = Math.ceil(Math.random()*3 - 2)
  if (madridChanges !== 0) {
    changes["Madrid"] = weathers["Madrid"] + madridChanges
    weathers["Madrid"] = changes["Madrid"]
  } else {
    delete changes["Madrid"]
  }

  return changes
}

We only add a key to the changes object when the change in a city’s temperature is non-zero, so we do not send unneeded data to the client.

We also have to call this function periodically, once a minute. (Imagine that in a real application we would be calling a third party that collects sensor data from all over the world and updates it every minute.)

To summarize, the function does two things:

  • If there is no change in the temperature of a city, it deletes that city from the changes object.
  • If there is a change, it adds the city to the changes object and also updates the original weathers object that holds all the data.

Now we call the function every minute and expose the collected changes through the update route:

setInterval(getWeatherChangesFromThirdParty, 60 * 1000) // update the temperatures every minute

app.get('/weather/update', (req, res) => {
    res.json({
        changes
    })
})

Let’s look at the data I got from sending a series of requests to these routes:

  • Getting the weather data for the first time: all the cities are in the response.
  • Getting the update: no changes, so we received {"changes": {}}.
  • Getting the update: only the Paris weather has changed.
  • Getting the update: still only Paris has changed. (Why did we get the same change again? Because we sent the request too soon: if the data is updated every minute and we send a request every second, we receive the same change 59 times until the data is refreshed.)
  • Getting the update: only Berlin has changed.
  • Getting the update: all four cities had a change in their temperature.


Example of getting weather data
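The raw JSON responses for such a sequence look roughly like this (the temperatures shown here are illustrative; the exact values depend on the random changes):

GET /weather        → {"weathers": {"Berlin": 15, "Vienna": 17, "Paris": 14, "Madrid": 25}}
GET /weather/update → {"changes": {}}
GET /weather/update → {"changes": {"Paris": 15}}
GET /weather/update → {"changes": {"Paris": 15}}
GET /weather/update → {"changes": {"Berlin": 16}}
GET /weather/update → {"changes": {"Berlin": 15, "Vienna": 18, "Paris": 14, "Madrid": 26}}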

So this was a super simple implementation of a short polling app.

What are the problems of short polling?

Generally, when the acceptable latency is low, the required polling frequency can put an unacceptable burden on the server, the network, or both. Online gaming and high-frequency trading in capital markets are two good examples where the status needs to be updated within a very short period of time (for some online games even 100 ms is considered unacceptable latency). There are also some problems that short polling shares with long polling, which you can read about in the last part of this article:

  • Header Overhead
  • Connection Establishment
  • Caching

The whole short polling example

const app = require('express')()

const weathers = {
    "Berlin": 15,
    "Vienna": 17,
    "Paris": 14,
    "Madrid": 25
}

const changes = {}
function getWeatherChangesFromThirdParty() {

  const berlinChanges = Math.ceil(Math.random()*3 - 2)
  if (berlinChanges !== 0) {
    changes["Berlin"] = weathers["Berlin"] + berlinChanges
    weathers["Berlin"] = changes["Berlin"]
  } else {
    delete changes["Berlin"]
  }

  const viennaChanges = Math.ceil(Math.random()*3 - 2)
  if (viennaChanges !== 0) {
    changes["Vienna"] = weathers["Vienna"] + viennaChanges
    weathers["Vienna"] = changes["Vienna"]
  } else {
    delete changes["Vienna"]
  }

  const parisChanges = Math.ceil(Math.random()*3 - 2)
  if (parisChanges !== 0) {
    changes["Paris"] = weathers["Paris"] + parisChanges
    weathers["Paris"] = changes["Paris"]
  } else {
    delete changes["Paris"]
  }

  const madridChanges = Math.ceil(Math.random()*3 - 2)
  if (madridChanges !== 0) {
    changes["Madrid"] = weathers["Madrid"] + madridChanges
    weathers["Madrid"] = changes["Madrid"]
  } else {
    delete changes["Madrid"]
  }

  return changes
}

setInterval(getWeatherChangesFromThirdParty, 60*1000) // update temperature every minute

app.get('/weather', (req, res) => {
    res.json({
        weathers
    })
})


app.get('/weather/update', (req, res) => {
    res.json({
        changes
    })
})

app.listen(8000)

Now let’s get into long polling. The problem: we want the same weather application, but the users want updates every 100 ms, and we do not have enough resources to handle that many requests. For example, if we have 1,000 active users checking our weather page, we need to handle 10,000 requests per second. So the first thing we want to try before adding resources is HTTP long polling, to see whether it can reduce the load.

What is Long Polling?

In HTTP long polling the server keeps the TCP connection open instead of answering immediately. Eventually you either receive a response because something changed, or the request times out after a while; in both cases the client immediately establishes a new connection. In other words, the client effectively always has a live connection to the server.


Long Polling Diagram

What is the reason for this design?

  • Minimize the latency of server-to-client messages : you have an open connection to the server, so you get the new state almost immediately after it changes.
  • Minimize the use of processing/network resources : when there is no change, we are not repeatedly sending requests just to check the state.
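On the client side, long polling is essentially a loop: send a request, wait (possibly for a long time) for the response, process it, and immediately send the next request. A minimal sketch of such a client, assuming the /weather/update endpoint we are about to build on localhost:8000:

async function longPoll() {
  while (true) {
    try {
      // this request stays open until the server has a change to report
      // (or until something along the way times it out)
      const response = await fetch('http://localhost:8000/weather/update')
      const changes = await response.json()
      console.log('weather changed:', changes)
    } catch (err) {
      // on a timeout or network error, wait briefly before reconnecting
      console.error('long poll failed, retrying:', err)
      await new Promise((resolve) => setTimeout(resolve, 1000))
    }
    // no artificial delay on success: re-issue the request right away
  }
}

longPoll()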

Long Polling Example

We want to turn our short polling implementation into a long polling implementation.

The only difference is that the update route should not send a response when there is no change. Instead, we listen for changes, and when something in the weather temperatures has changed we send the updated data to the user. To implement that, we use event emitters:

const Event = require('events') // require('events') returns the EventEmitter class

const weatherChange = new Event()
weatherChange.addListener('new_change', (data) => {
    console.log(data)
})

weatherChange.emit('new_change', 'some data')

Now, whenever we emit a new_change event, our endpoint sends the pending responses. If you want to know more about event emitters, you can check out this article.

Let’s integrate this into our application:

In the first step we create our event emitter:

const weatherChange = new Event()

In the second step we emit an event when our function detects a change, so we need to check whether the changes object is empty or not:

const changes = {}
function getWeatherChangesFromThirdParty() {

  const berlinChanges = Math.ceil(Math.random()*3 - 2)
  if (berlinChanges !== 0) {
    changes["Berlin"] = weathers["Berlin"] + berlinChanges
    weathers["Berlin"] = changes["Berlin"]
  } else {
    delete changes["Berlin"]
  }

  const viennaChanges = Math.ceil(Math.random()*3 - 2)
  if (viennaChanges !== 0) {
    changes["Vienna"] = weathers["Vienna"] + viennaChanges
    weathers["Vienna"] = changes["Vienna"]
  } else {
    delete changes["Vienna"]
  }

  const parisChanges = Math.ceil(Math.random()*3 - 2)
  if (parisChanges !== 0) {
    changes["Paris"] = weathers["Paris"] + parisChanges
    weathers["Paris"] = changes["Paris"]
  } else {
    delete changes["Paris"]
  }

  const madridChanges = Math.ceil(Math.random()*3 - 2)
  if (madridChanges !== 0) {
    changes["Madrid"] = weathers["Madrid"] + madridChanges
    weathers["Madrid"] = changes["Madrid"]
  } else {
    delete changes["Madrid"]
  }

  if (!isEmpty(changes)) {
    weatherChange.emit('new_change', changes)
  }

}

function isEmpty(obj) {
    return Object.keys(obj).length === 0;
}

Let’s change our update route to this one.

By adding a listener, we only send the response when there is a change. Be careful with the listener, though: if you do not remove it after responding, it leads to a memory leak (the listener function would still try to send a response for a previous, already-answered request).

app.get('/weather/update', (req, res) => {
    const responseHandler = (changes) => {
        res.json(changes)
        // remove the listener so it does not fire again for this already-answered request
        weatherChange.removeListener('new_change', responseHandler)
    }
    weatherChange.on('new_change', responseHandler)
})
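The listener above is only cleaned up when a change actually arrives. A slightly more defensive variant (an illustrative sketch, not from the original article) also answers with an empty object after a timeout, so the listener never outlives the request for long and the client simply polls again:

app.get('/weather/update', (req, res) => {
    // if nothing changes within 30 seconds, answer with an empty object
    // so the client can immediately issue the next long-poll request
    const timeout = setTimeout(() => {
        weatherChange.removeListener('new_change', responseHandler)
        res.json({})
    }, 30 * 1000)

    const responseHandler = (changes) => {
        clearTimeout(timeout)
        res.json(changes)
    }

    // once() removes the listener automatically after the first event
    weatherChange.once('new_change', responseHandler)
})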

Let’s see how long polling works. I am going to send a new request whenever I get a response:


Long Polling Example In Action

What are the problems of the long polling design?

  • Header Overhead : every poll response is a complete HTTP message and thus contains a full set of HTTP headers in the message framing. (This problem also exists in short polling.)
  • Maximal Latency : after a response is sent, the server has to wait for the next long poll request before it can send another message to that client.
  • Connection Establishment : in HTTP polling you frequently open TCP/IP connections and then close them again, and that comes at a price. (This problem also exists in short polling.)
  • Timeouts : the default timeout value in Firefox is 90 seconds and in Chrome it is 300 seconds, but most network infrastructures include proxies and servers whose timeouts are not that long, so your request will probably time out sooner than that.
  • Caching : caching mechanisms implemented by intermediate entities can interfere with long poll requests. There is no way for an end client or host to signal to HTTP intermediaries that long polling is in use, so as a best practice caching should be intentionally disabled in the long poll request or response, for example by setting the “Cache-Control” header to “no-cache” (a short sketch follows this list). (This problem also exists in short polling.)
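As an illustration of the last point, in our Express app we could disable caching for both polling routes with a small middleware. This is just a sketch based on the routes defined above:

// register this before the /weather routes so it runs first
// and intermediaries are told not to cache polling responses
app.use('/weather', (req, res, next) => {
    res.set('Cache-Control', 'no-cache')
    next()
})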

Conclusion

From my point of view, the simplest real-time implementation you can build is short polling. As a next step, depending on the business, you can try long polling to reduce the load on the server. In the end, for scaling a real-time application, other solutions like WebSockets make more sense in the long run.

