Spring Boot Pro Tip: Boost Performance with @Cacheable + Java 17

If your Spring Boot app is hitting the database too often and slowing down, you need caching! Using @Cacheable, you can reduce unnecessary queries, speed up responses, and improve scalability.

Let’s explore how to cache efficiently and avoid common pitfalls.

The Problem: Too Many Database Calls
Imagine we have a user service fetching data from the database:

@Service
public class UserService {
    @Autowired private UserRepository userRepository;

    public User getUserById(Long id) {
        return userRepository.findById(id).orElseThrow();
    }
}

Every time getUserById() is called, it hits the database. But do we really need to fetch the same user multiple times?

The Solution: @Cacheable
Spring Boot makes caching easy. Just add @EnableCaching in your main class:

@SpringBootApplication
@EnableCaching
public class MyApplication {
    public static void main(String[] args) {
        SpringApplication.run(MyApplication.class, args);
    }
}

Then, modify your service to cache users:

@Service
public class UserService {
    @Autowired private UserRepository userRepository;

    @Cacheable("users")
    public User getUserById(Long id) {
        return userRepository.findById(id).orElseThrow();
    }
}

Now, when getUserById(id) is called again with the same id, Spring returns the cached result instead of querying the database again!
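
To see this in action, call the method twice with the same id; the second call never reaches the repository. Here's a minimal sketch (the CacheDemo runner and the id 1L are illustrative assumptions):

@Component
public class CacheDemo implements CommandLineRunner {
    @Autowired private UserService userService;

    @Override
    public void run(String... args) {
        userService.getUserById(1L); // first call: hits the database and stores the result in the "users" cache
        userService.getUserById(1L); // second call: served straight from the cache, no SQL is issued
    }
}

By default, Spring uses the method argument (the id) as the cache key, so different ids are cached separately. Also note that caching is applied through a Spring proxy, so a call made from inside UserService itself would bypass the cache.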

Fine-Tuning the Cache
By default (with no cache library on the classpath, Spring Boot falls back to a simple in-memory ConcurrentMap cache), @Cacheable keeps entries forever. To control expiration, plug in a provider such as Caffeine or Redis.
Use Caffeine for In-Memory Caching (Fast & Lightweight)
Add the Caffeine dependency, plus spring-boot-starter-cache for Spring's cache support classes:

<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-cache</artifactId>
</dependency>
<dependency>
    <groupId>com.github.ben-manes.caffeine</groupId>
    <artifactId>caffeine</artifactId>
    <version>3.0.6</version>
</dependency>

Then, configure the cache manager with TTL (Time-To-Live) settings:

@Configuration
public class CacheConfig {
    @Bean
    public CacheManager cacheManager() {
        // Evict entries 10 minutes after they are written and cap the cache size
        CaffeineCacheManager cacheManager = new CaffeineCacheManager("users");
        cacheManager.setCaffeine(Caffeine.newBuilder()
                .expireAfterWrite(Duration.ofMinutes(10))
                .maximumSize(1_000));
        return cacheManager;
    }
}

This ensures cached users expire ten minutes after being loaded, preventing stale data from lingering.

Use Redis for Distributed Caching (Great for Microservices)
For scalability, store cached data in Redis:

Add the Redis dependency:

<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-data-redis</artifactId>
</dependency>

Configure Redis in application.yml (note that on Spring Boot 3.x these properties live under spring.data.redis instead of spring.redis):

spring:
  redis:
    host: localhost
    port: 6379

Use Redis as Cache Manager:

@Configuration
public class RedisConfig {
    @Bean
    public RedisCacheManager cacheManager(RedisConnectionFactory connectionFactory) {
        return RedisCacheManager.builder(connectionFactory).build();
    }
}

Result: The cache is shared across multiple instances, perfect for cloud deployments!
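
Entries in this Redis-backed cache never expire by default. If you also want a TTL here, a minimal sketch of what the bean's body could become (the 10-minute value is just an example):

RedisCacheConfiguration config = RedisCacheConfiguration.defaultCacheConfig()
        .entryTtl(Duration.ofMinutes(10)); // evict entries 10 minutes after they are written

return RedisCacheManager.builder(connectionFactory)
        .cacheDefaults(config)
        .build();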

Common Cache Mistakes to Avoid
Forgetting to update the cache when data changes
Use @CachePut to refresh the cached entry when saving data (and evict it on delete, as shown below):

@CachePut(value = "users", key = "#user.id")
public User saveUser(User user) {
    return userRepository.save(user);
}

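Deleting data has the same problem: if a user is removed from the database but not from the cache, readers keep getting the stale entry. A minimal sketch using @CacheEvict (the deleteUser method is an illustrative assumption):

@CacheEvict(value = "users", key = "#id")
public void deleteUser(Long id) {
    userRepository.deleteById(id);
}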

Caching large objects
Cache only small, frequently read, non-sensitive data (see the sketch after this list)

Not setting an expiration time
Always define a TTL so stale entries eventually drop out of the cache
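
To illustrate the last two points, here is a hedged sketch that caches a small Java 17 record instead of the full entity (UserSummary, the getName() accessor, and the "userSummaries" cache name are illustrative assumptions; the extra cache name must also be registered with the cache manager, e.g. new CaffeineCacheManager("users", "userSummaries")):

record UserSummary(Long id, String name) {}

@Service
public class UserSummaryService {
    @Autowired private UserRepository userRepository;

    // Cache a small, read-mostly projection rather than the full User entity;
    // entries expire according to the TTL configured on the cache manager.
    @Cacheable("userSummaries")
    public UserSummary getUserSummaryById(Long id) {
        User user = userRepository.findById(id).orElseThrow();
        return new UserSummary(user.getId(), user.getName());
    }
}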
