DEV Community

Francisco Mendes
Simple and Easy in-memory cache in Golang

We often end up caching our application data in solutions like memcached or Redis. However, not every application has to go that way: in the overwhelming majority of cases (especially small or personal projects), an external data store isn't necessary.

Exactly for this reason, I decided to create an example that shows how to cache data inside our application with a simple approach.

The framework I'm going to use is Fiber. Besides being very intuitive and easy to use, the main reason for choosing it is that if you're a JavaScript/TypeScript developer like me, you'll feel right at home, because Fiber was inspired by Express.

The library that will be used to cache our application data is ttlcache, because it is easy to use and has a very intuitive API.

The idea of today's application is to make a request to the JSONPlaceholder API and get a todo according to the id provided in the route parameters.

Let's code

First let's install the following packages:

go get github.com/gofiber/fiber/v2
go get github.com/ReneKroon/ttlcache/v2

Then let's create a simple API:

package main

import "github.com/gofiber/fiber/v2"

func main() {
    app := fiber.New()

    app.Get("/", func(c *fiber.Ctx) error {
        return c.SendString("It seems to be working šŸ„³")
    })

    app.Listen(":3000")
}

Now if you open a new tab in your browser and visit http://localhost:3000, you should see the message It seems to be working šŸ„³ in the body of the page.

Now with our basic API working, we can make an HTTP request to the JSONPlaceholder API at https://jsonplaceholder.typicode.com/todos/1, and the data we get back is the following:

{
  "userId": 1,
  "id": 1,
  "title": "delectus aut autem",
  "completed": false
}

Now we can create our struct, which we'll name Todo and which contains the following properties:

type Todo struct {
    UserID    int    `json:"userId"`
    ID        int    `json:"id"`
    Title     string `json:"title"`
    Completed bool   `json:"completed"`
}

This way we can make some changes to our endpoint. First we add the id parameter to the route; then we get its value using the c.Params() function.

app.Get("/:id", func(c *fiber.Ctx) error {
    id := c.Params("id")
    // ...
})

Now we make the HTTP request to fetch the todo matching the id passed in the parameters, and return that same todo.

app.Get("/:id", func(c *fiber.Ctx) error {
    id := c.Params("id")
    res, err := http.Get("https://jsonplaceholder.typicode.com/todos/" + id)
    if err != nil {
        return err
    }

    defer res.Body.Close()
    body, err := ioutil.ReadAll(res.Body)
    if err != nil {
        return err
    }

    todo := Todo{}
    parseErr := json.Unmarshal(body, &todo)
    if parseErr != nil {
        return parseErr
    }

    return c.JSON(fiber.Map{"Data": todo})
})

Now if we test our API by making an HTTP request to http://localhost:3000/[id], we get the data of that todo.

However, we are still hitting the JSONPlaceholder API on every request. What we want is to cache the data, so that the next time it comes from the cache and not from JSONPlaceholder.

So we'll import the ttlcache library into our project and create a new instance of ttlcache, which we'll name cache.

package main

import (
    "encoding/json"
    "io/ioutil"
    "net/http"

    "github.com/ReneKroon/ttlcache/v2"
    "github.com/gofiber/fiber/v2"
)

type Todo struct {
    UserID    int    `json:"userId"`
    ID        int    `json:"id"`
    Title     string `json:"title"`
    Completed bool   `json:"completed"`
}

var cache ttlcache.SimpleCache = ttlcache.NewCache()

// ...

Then, in our main function, we specify how long data persists in the cache; in this example I set it to persist for 10 seconds.

func main() {
    app := fiber.New()

    cache.SetTTL(10 * time.Second) // 10 * time.Second is already a time.Duration
    // ...
}

Now inside our endpoint, let's cache the data before returning it in the response body.

For that we will use the cache.Set() function, which receives two arguments: the first is the key, which in this case is the id, and the second is the data, which in this case is the todo.

app.Get("/:id", func(c *fiber.Ctx) error {
    // ...

    cache.Set(id, todo)
    return c.JSON(fiber.Map{"Data": todo})
})

Now everything gets cached, but we're not finished yet, because we still need to create a middleware. This middleware will check whether the key already exists in the cache; if it does, we return the data stored in the cache.

If the key doesn't exist in the cache, we execute the next handler of our route, which in this case performs the HTTP request.

func verifyCache(c *fiber.Ctx) error {
    // ...
}

First we get the value of the id parameter. Then we use the cache.Get() function, which receives a single argument: the key, which in this case is the id.

If the key exists, we return its cached data; otherwise we proceed to the handler that performs the HTTP request, using the c.Next() function.

func verifyCache(c *fiber.Ctx) error {
    id := c.Params("id")
    val, err := cache.Get(id)
    if err == nil {
        // Cache hit: return the stored todo without hitting the API again.
        return c.JSON(fiber.Map{"Cached": val})
    }
    return c.Next()
}

Last but not least, we need to add the middleware to our route, like this:

app.Get("/:id", verifyCache, func(c *fiber.Ctx) error {
    // ...
})

The final code should look like the following:

package main

import (
    "encoding/json"
    "io/ioutil"
    "net/http"
    "time"

    "github.com/ReneKroon/ttlcache/v2"
    "github.com/gofiber/fiber/v2"
)

type Todo struct {
    UserID    int    `json:"userId"`
    ID        int    `json:"id"`
    Title     string `json:"title"`
    Completed bool   `json:"completed"`
}

var cache ttlcache.SimpleCache = ttlcache.NewCache()

func verifyCache(c *fiber.Ctx) error {
    id := c.Params("id")
    val, err := cache.Get(id)
    if err == nil {
        // Cache hit: return the stored todo without hitting the API again.
        return c.JSON(fiber.Map{"Cached": val})
    }
    return c.Next()
}

func main() {
    app := fiber.New()

    cache.SetTTL(10 * time.Second) // 10 * time.Second is already a time.Duration

    app.Get("/:id", verifyCache, func(c *fiber.Ctx) error {
        id := c.Params("id")
        res, err := http.Get("https://jsonplaceholder.typicode.com/todos/" + id)
        if err != nil {
            return err
        }

        defer res.Body.Close()
        body, err := ioutil.ReadAll(res.Body)
        if err != nil {
            return err
        }

        todo := Todo{}
        parseErr := json.Unmarshal(body, &todo)
        if parseErr != nil {
            return parseErr
        }

        cache.Set(id, todo)
        return c.JSON(fiber.Map{"Data": todo})
    })

    app.Listen(":3000")
}

Now, if we test the performance of our app, we notice a big difference. Without the cache, each request took an average of 376ms; after implementing the cache, each request took an average of 2ms.
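A quick way to see this effect for yourself is to time a cold request against a warm one. The sketch below is a self-contained illustration, not the Fiber app itself: it stands up a hypothetical slow upstream with httptest (simulating JSONPlaceholder's latency) and uses a naive single-entry cache to compare the two paths.

```go
package main

import (
	"fmt"
	"io"
	"net/http"
	"net/http/httptest"
	"time"
)

func main() {
	// Hypothetical slow upstream standing in for the JSONPlaceholder API.
	upstream := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		time.Sleep(50 * time.Millisecond) // simulated network latency
		fmt.Fprint(w, `{"id": 1, "title": "delectus aut autem"}`)
	}))
	defer upstream.Close()

	// Naive single-entry cache: the first call hits the upstream,
	// subsequent calls are served from memory.
	var cached []byte
	fetch := func() ([]byte, error) {
		if cached != nil {
			return cached, nil
		}
		res, err := http.Get(upstream.URL)
		if err != nil {
			return nil, err
		}
		defer res.Body.Close()
		body, err := io.ReadAll(res.Body)
		if err != nil {
			return nil, err
		}
		cached = body
		return cached, nil
	}

	start := time.Now()
	if _, err := fetch(); err != nil {
		panic(err)
	}
	cold := time.Since(start)

	start = time.Now()
	if _, err := fetch(); err != nil {
		panic(err)
	}
	warm := time.Since(start)

	fmt.Printf("cold: %v, warm: %v\n", cold, warm)
	if warm >= cold {
		panic("expected the cached request to be faster")
	}
}
```

The cold call pays the simulated 50ms of latency, while the warm call is just a memory read, which mirrors the 376ms vs 2ms gap observed above (your exact numbers will vary with network conditions).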

Conclusion

As always, I hope you found it interesting. If you noticed any errors in this article, please mention them in the comments. šŸ§

Hope you have a great day! šŸ˜Ž
