A cache is temporary storage that keeps a copy of some data (such as responses) so that future requests can be served quickly, or so that accumulated updates can be written back to the main storage when certain conditions are met.
- It is used in CPUs. Suppose two commands operate on the same variable: the first adds 1 to the variable's value and the second subtracts 3 from it. If the variable starts at 7, we want the result to be 5 (7 + 1 = 8; 8 - 3 = 5). But what if both commands read the value at the same time? The first command would read 7 and write back 8, and the second would also read 7 and write back 4, overwriting the first result. This is a blunder known as a lost update: we wanted the second command to operate on the result of the first, but both commands operated on the original value. This blunder would not occur if we buffer the work: the second command is held in the cache until the first command has executed and its result has been stored to the variable. But this simple serialization only works for a single-threaded program. The cache is used differently in multithreaded programs: suppose we have three commands; while the first command executes in the CPU, the second and third commands held in the cache are combined with each other, so their combined result sits in the cache and is then sent to the CPU to be applied to the variable's value.
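The lost-update arithmetic above can be sketched in a few lines. This is a hypothetical simulation, not real CPU behavior: both "commands" snapshot the variable before either writes back, versus running one after the other.

```python
def lost_update_demo():
    """Both commands read the stale value 7 before either writes back."""
    value = 7
    read_a = value          # command 1 reads 7
    read_b = value          # command 2 also reads 7
    value = read_a + 1      # command 1 writes 8
    value = read_b - 3      # command 2 overwrites with 7 - 3 = 4: lost update
    return value            # -> 4, not the intended 5


def serialized_demo():
    """Serialize the commands: the second reads the first's result."""
    value = 7
    value = value + 1       # 7 + 1 = 8
    value = value - 3       # 8 - 3 = 5, the intended result
    return value            # -> 5
```

Running both shows the difference: the unserialized version loses the increment, while the serialized version yields 5.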
- It is used for YouTube view counts, like counts, and comment counts. Suppose a very popular YouTuber posts a video: thousands of people click on it at the same time. If all of those requests hit the YouTube server's main database directly, it would cause the same lost-update blunder described in the first point. That is where database caching is used: the view count is accumulated in small intermediate counters, and the main database is updated from the cache when certain conditions are met.
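The batched-counter idea above can be sketched as a small write-behind cache. All names here are hypothetical, and a plain dict stands in for the main database: views accumulate in memory, and one batched write goes to the database once a threshold of pending increments is reached.

```python
class ViewCountCache:
    """Write-behind counter sketch: increments accumulate in memory
    and are flushed to the 'main database' in a single batched write
    once flush_threshold pending increments have piled up."""

    def __init__(self, database, flush_threshold=100):
        self.database = database          # stand-in for the main database
        self.flush_threshold = flush_threshold
        self.pending = {}                 # video_id -> unflushed increments

    def record_view(self, video_id):
        self.pending[video_id] = self.pending.get(video_id, 0) + 1
        if self.pending[video_id] >= self.flush_threshold:
            self.flush(video_id)

    def flush(self, video_id):
        # One batched database write instead of one write per view.
        self.database[video_id] = (
            self.database.get(video_id, 0) + self.pending.pop(video_id, 0)
        )
```

With a threshold of 3, five recorded views produce one flush of 3 to the database, with 2 still pending until the next flush.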
A cacheable response is an HTTP response that can be cached: it is stored so it can be retrieved and used later. In practice, responses to GET and HEAD requests are cached; POST responses are cacheable only under specific conditions and rarely are in practice. Responses to PUT, DELETE, and PATCH requests are not cached, because these methods commit a change that has to be applied directly to the central database.
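A minimal cacheability check based on the rule above might look like the sketch below. The function name and the string-based `Cache-Control` handling are assumptions for illustration; a real HTTP cache considers many more response directives.

```python
# Methods whose responses are cached in practice (per RFC 9110,
# POST can be cacheable under narrow conditions, but rarely is).
CACHEABLE_METHODS = {"GET", "HEAD"}

def is_cacheable(method, cache_control=""):
    """Rough sketch: a response may be stored if the request method
    allows caching and the server did not send 'no-store'."""
    if method.upper() not in CACHEABLE_METHODS:
        return False
    return "no-store" not in cache_control.lower()
```

So `is_cacheable("GET")` allows storage, while `is_cacheable("DELETE")` or a `no-store` directive forbids it.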
There are two types of cache: private and shared cache.
- Private Cache: a cache that can be accessed by only a single user; it is usually stored on the user's local machine.
- Shared Cache: a cache that can be accessed by multiple users; it is usually stored on a server located near that set of users.
- HTTP responses are cached to make websites more responsive. A previously requested page is stored in the cache, and when the user requests the same page again it is served from the cache. This saves a lot of time, and the main server does not have to handle a request for a page that is already cached.
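The serve-from-cache flow above can be sketched as a tiny TTL cache. The class and parameter names are hypothetical, and a plain function stands in for the network fetch: a stored response is reused while it is still fresh, otherwise the "origin server" is contacted again.

```python
import time

class ResponseCache:
    """TTL cache sketch: serve a stored response if it is still
    fresh, otherwise fetch from the origin and store the result."""

    def __init__(self, fetch, ttl_seconds=60.0):
        self.fetch = fetch                 # stand-in for the origin server
        self.ttl = ttl_seconds
        self.store = {}                    # url -> (body, stored_at)

    def get(self, url):
        entry = self.store.get(url)
        if entry is not None:
            body, stored_at = entry
            if time.monotonic() - stored_at < self.ttl:
                return body                # cache hit: origin not contacted
        body = self.fetch(url)             # cache miss: go to the origin
        self.store[url] = (body, time.monotonic())
        return body
```

Two requests for the same URL within the TTL result in only one call to the origin, which is exactly the saving the bullet describes.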
- Taking the example of YouTube: it caches the view count of a video and updates the main database at a time interval. The viewer's GET request is also cached along with the view count, so if the user wants to watch the video again it is served from the cache, with their view already counted, and no new request has to reach the main server. PUT and DELETE requests are not cached, because they update the database and must be applied directly on the main server. Suppose you want to delete a video from your YouTube channel: that DELETE request cannot be cached, because it is high priority and needs to reach the main server immediately, since you do not want people watching the video at that moment to be able to keep watching it. If the request were cached, the delay before the main databases were updated would cause exactly that blunder.
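The write-path behavior above can be sketched as a cache that lets reads be served from memory but sends deletes straight to the backing store and evicts the cached copy. The class name and the dict standing in for the main database are assumptions for illustration.

```python
class InvalidatingCache:
    """Sketch: reads may be served from the cache, but a delete is
    applied to the main store immediately and the cached copy is
    evicted, so no reader keeps seeing the removed item."""

    def __init__(self, database):
        self.database = database       # stand-in for the main database
        self.cache = {}

    def get(self, key):
        if key not in self.cache and key in self.database:
            self.cache[key] = self.database[key]   # populate on first read
        return self.cache.get(key)

    def delete(self, key):
        self.database.pop(key, None)   # update the main store right away
        self.cache.pop(key, None)      # evict so stale reads are impossible
```

After `delete`, a `get` for the same key returns nothing, mirroring how a deleted video should immediately stop being served.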