Prerequisites:
- Java 17
- ZooKeeper and Kafka. I'm following this post on how to run them on my Windows 10 machine.
- Visual Studio Code with the Spring Initializr extension
And now, let's get into the cooking steps:
Initialize Spring Boot Application
- Using Visual Studio Code, I pressed Ctrl+Shift+P and chose Spring Initializr: Create Maven Project.
- Choose Spring Boot version 3.0.12. I'm actually migrating my project from version 2.7.17 to 3.0.0.
- Choose **Java** as the project language.
- Type your project's group id. For me, I chose com.tujuhsembilan.
- Type your artifact id. I chose scheduler.
- Choose Jar as the packaging type.
- Choose 17 as the Java version.
- Add these dependencies by searching for them. (If you cannot find them in the search tab, you can add them manually later in pom.xml.)
```xml
<dependencies>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-quartz</artifactId>
    </dependency>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-test</artifactId>
        <scope>test</scope>
    </dependency>
    <dependency>
        <groupId>org.projectlombok</groupId>
        <artifactId>lombok</artifactId>
        <scope>annotationProcessor</scope>
    </dependency>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-data-jpa</artifactId>
    </dependency>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-web</artifactId>
    </dependency>
    <dependency>
        <groupId>org.springframework.kafka</groupId>
        <artifactId>spring-kafka</artifactId>
        <version>3.0.0</version>
    </dependency>
    <dependency>
        <groupId>com.h2database</groupId>
        <artifactId>h2</artifactId>
        <scope>runtime</scope>
    </dependency>
    <dependency>
        <groupId>com.github.javafaker</groupId>
        <artifactId>javafaker</artifactId>
        <version>0.15</version>
    </dependency>
    <dependency>
        <groupId>org.modelmapper</groupId>
        <artifactId>modelmapper</artifactId>
        <version>3.0.0</version>
    </dependency>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-hateoas</artifactId>
    </dependency>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-data-rest</artifactId>
    </dependency>
</dependencies>
```
- After the project is initialized, I create a few packages. I made these packages at the same level as Application.java (SchedulerApplication.java for me), as shown in the layout sketch below:
  * configuration
  * controller
  * job
  * model (with a dto subpackage)
  * repository
  * service
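For reference, here is a rough sketch of the resulting layout (package names match the imports used later in this post):

```
src/main/java/com/tujuhsembilan/scheduler
├── SchedulerApplication.java
├── configuration
├── controller
├── job
├── model
│   └── dto
├── repository
└── service
```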
Create configuration for the application
```java
import org.modelmapper.ModelMapper;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.jpa.repository.config.EnableJpaAuditing;

import lombok.RequiredArgsConstructor;

@Configuration
@RequiredArgsConstructor(onConstructor = @__(@Autowired))
@EnableJpaAuditing
public class ApplicationConfig {

    @Bean
    public ModelMapper modelMapper() {
        return new ModelMapper();
    }
}
```
- The @Configuration annotation lets the application know that there is configuration to add to the application context.
- @RequiredArgsConstructor(onConstructor = @__(@Autowired)) reduces the amount of boilerplate code. It generates a constructor for this configuration class that takes every final field as a parameter, and, as you might already know, the @Autowired annotation is used to automatically inject dependencies (see the small illustration after this list).
- The @EnableJpaAuditing annotation enables JPA's auditing feature.
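As a small illustration (the class here is hypothetical and reuses the imports from ApplicationConfig above), the annotation effectively replaces a hand-written constructor like the commented one below:

```java
// Hypothetical example: with a final field present, Lombok generates a constructor that takes it
// as a parameter and annotates that constructor with @Autowired, so Spring injects the dependency.
@Configuration
@RequiredArgsConstructor(onConstructor = @__(@Autowired))
public class ExampleConfig {

    private final ModelMapper modelMapper;

    // Equivalent hand-written version of what Lombok generates:
    // @Autowired
    // public ExampleConfig(ModelMapper modelMapper) {
    //     this.modelMapper = modelMapper;
    // }
}
```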
Update application.properties
You can find application.properties under the resources folder. It is initially empty, but since we are not going to connect to an actual database, we will work with H2.
```properties
spring.datasource.url=jdbc:h2:mem:testdb
spring.datasource.driverClassName=org.h2.Driver
spring.datasource.username=sa
spring.datasource.password=password
spring.jpa.database-platform=org.hibernate.dialect.H2Dialect
spring.jpa.defer-datasource-initialization=true
spring.h2.console.enabled=true
```
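With spring.h2.console.enabled=true, you can also inspect the in-memory database in a browser at http://localhost:8080/h2-console (assuming the default server port and console path), logging in with the same JDBC URL, username, and password as above.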
Create the model and DTO for random locomotive data
- Create a class named Lokomotif.java inside the model package
```java
import java.io.Serializable;

import jakarta.persistence.Column;
import jakarta.persistence.Entity;
import jakarta.persistence.EntityListeners;
import jakarta.persistence.Id;

import org.springframework.data.jpa.domain.support.AuditingEntityListener;

import lombok.AllArgsConstructor;
import lombok.Builder;
import lombok.Getter;
import lombok.NoArgsConstructor;
import lombok.Setter;

@Getter
@Setter
@Entity
@Builder
@NoArgsConstructor
@AllArgsConstructor
@EntityListeners(AuditingEntityListener.class)
public class Lokomotif implements Serializable {

    @Id
    private String kodeLoko;

    @Column
    private String namaLoko;

    @Column
    private String dimensiLoko;

    @Column
    private String status;

    @Column
    private String createdDate;
}
```
- Create a new class called LokomotifDto.java inside the dto package
```java
import lombok.AllArgsConstructor;
import lombok.Builder;
import lombok.Data;
import lombok.NoArgsConstructor;

@Data
@Builder
@NoArgsConstructor
@AllArgsConstructor
public class LokomotifDto {

    private String kodeLoko;
    private String namaLoko;
    private String dimensiLoko;
    private String status;
    private String createdDate;
}
```
Create a repository to save the generated random data into the H2 database
- Create a new interface inside the repository package called LokomotifRepository.java
```java
import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.stereotype.Repository;

import com.tujuhsembilan.scheduler.model.Lokomotif;

@Repository
public interface LokomotifRepository extends JpaRepository<Lokomotif, String> {
}
```
Create a controller to generate random locomotive data
- Create a new controller class called LokomotifController.java inside the controller package. To save time in terms of generating random data, I use a Java random data generator library called JavaFaker.
```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

import org.modelmapper.ModelMapper;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.hateoas.RepresentationModel;
import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

import com.github.javafaker.Faker;
import com.tujuhsembilan.scheduler.model.Lokomotif;
import com.tujuhsembilan.scheduler.model.dto.LokomotifDto;
import com.tujuhsembilan.scheduler.repository.LokomotifRepository;

import lombok.RequiredArgsConstructor;
import lombok.extern.slf4j.Slf4j;

@Slf4j
@RestController
@RequestMapping("api/v1/lokomotif")
@RequiredArgsConstructor(onConstructor = @__(@Autowired))
public class LokomotifController {

    private final ModelMapper modelMapper;
    private final LokomotifRepository lokomotifRepository;

    Faker faker = new Faker();

    @GetMapping("/create-data-lokomotif")
    public ResponseEntity<?> createLokomotifData() {
        LocalDateTime randomDate = LocalDateTime.now();
        DateTimeFormatter formatter = DateTimeFormatter.ofPattern("yyyyMMdd'T'HHmmss");
        String formattedDateTime = randomDate.format(formatter);

        Lokomotif lokomotif = Lokomotif.builder()
            .kodeLoko(faker.number().digits(5))
            .namaLoko(faker.address().country())
            .dimensiLoko(faker.number().digits(2))
            .status(faker.number().digits(1))
            .createdDate(formattedDateTime)
            .build();

        var savedModel = lokomotifRepository.save(lokomotif);
        var responseBodyDto = modelMapper.map(savedModel, LokomotifDto.class);

        log.info("Generated Random Lokomotif Data: {}", responseBodyDto);

        return ResponseEntity.status(HttpStatus.CREATED).body(RepresentationModel.of(responseBodyDto));
    }
}
```
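Once the application is running, this endpoint can also be hit manually, for example with a GET request to http://localhost:8080/api/v1/lokomotif/create-data-lokomotif (assuming the default server port), which is handy for checking the data generation before the scheduler is wired up.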
Configure the scheduler using Spring Quartz
- Update application.properties:
```properties
spring.datasource.url=jdbc:h2:mem:testdb
spring.datasource.driverClassName=org.h2.Driver
spring.datasource.username=sa
spring.datasource.password=password
spring.jpa.database-platform=org.hibernate.dialect.H2Dialect
spring.jpa.defer-datasource-initialization=true
spring.h2.console.enabled=true
spring.quartz.job-store-type=jdbc
spring.quartz.jdbc.initialize-schema=always
```
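The two new properties tell Quartz to keep its job and trigger state in the configured datasource (here the in-memory H2 database) instead of in memory, and initialize-schema=always makes Spring Boot recreate the Quartz tables on every startup, which is convenient for an in-memory database.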
- In the configuration package, add another class called QuartzConfig.java
```java
import org.quartz.JobDetail;
import org.quartz.Trigger;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.scheduling.quartz.CronTriggerFactoryBean;
import org.springframework.scheduling.quartz.JobDetailFactoryBean;
import org.springframework.scheduling.quartz.SchedulerFactoryBean;
import org.springframework.scheduling.quartz.SpringBeanJobFactory;

import com.tujuhsembilan.scheduler.job.CreateDataJob;

@Configuration
public class QuartzConfig {

    // Job
    @Bean
    public JobDetailFactoryBean jobDetail() {
        JobDetailFactoryBean factory = new JobDetailFactoryBean();
        factory.setJobClass(CreateDataJob.class);
        factory.setDurability(true);
        return factory;
    }

    // Trigger
    @Bean
    public CronTriggerFactoryBean trigger(JobDetail job) {
        CronTriggerFactoryBean factory = new CronTriggerFactoryBean();
        factory.setJobDetail(job);
        factory.setCronExpression("0/10 * * * * ?"); // Every 10 seconds
        factory.setName("randomDataTrigger");
        factory.setGroup("randomDataGroup");
        return factory;
    }

    // Scheduler
    @Bean
    public SchedulerFactoryBean scheduler(Trigger trigger, JobDetail job, SpringBeanJobFactory springBeanJobFactory) {
        SchedulerFactoryBean factory = new SchedulerFactoryBean();
        factory.setJobDetails(job);
        factory.setTriggers(trigger);
        factory.setJobFactory(springBeanJobFactory);
        return factory;
    }

    @Bean
    public SpringBeanJobFactory springBeanJobFactory() {
        return new SpringBeanJobFactory();
    }
}
```
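Note that Quartz cron expressions start with a seconds field (seconds, minutes, hours, day of month, month, day of week), unlike standard Unix cron, so 0/10 * * * * ? fires every 10 seconds; an expression like 0 0/5 * * * ? would fire every 5 minutes instead.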
- Add a new class to the job package called CreateDataJob.java
```java
import org.quartz.DisallowConcurrentExecution;
import org.quartz.Job;
import org.quartz.JobExecutionContext;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.ApplicationContext;
import org.springframework.context.ApplicationContextAware;
import org.springframework.stereotype.Component;

import com.tujuhsembilan.scheduler.controller.LokomotifController;

import lombok.extern.slf4j.Slf4j;

@Slf4j
@Component
@DisallowConcurrentExecution
public class CreateDataJob implements Job, ApplicationContextAware {

    @Autowired
    private LokomotifController lokomotifController;

    @Override
    public void setApplicationContext(ApplicationContext applicationContext) {
        // Resolve the controller bean from the Spring context so Quartz can call it.
        lokomotifController = applicationContext.getBean(LokomotifController.class);
    }

    @Override
    public void execute(JobExecutionContext jobExecutionContext) {
        log.info("Job ** {} ** starting @ {}",
            jobExecutionContext.getJobDetail().getKey().getName(),
            jobExecutionContext.getFireTime());

        lokomotifController.createLokomotifData();

        log.info("Job ** {} ** completed. Next job scheduled @ {}",
            jobExecutionContext.getJobDetail().getKey().getName(),
            jobExecutionContext.getNextFireTime());
    }
}
```
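The @DisallowConcurrentExecution annotation tells Quartz not to start a new execution of this job while the previous run of the same JobDetail is still in progress, so slow runs do not pile up on the 10-second trigger.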
Configure Kafka Producer
- Update application.properties; this will be its final form:
```properties
spring.datasource.url=jdbc:h2:mem:testdb
spring.datasource.driverClassName=org.h2.Driver
spring.datasource.username=sa
spring.datasource.password=password
spring.jpa.database-platform=org.hibernate.dialect.H2Dialect
spring.jpa.defer-datasource-initialization=true
spring.h2.console.enabled=true
spring.quartz.job-store-type=jdbc
spring.quartz.jdbc.initialize-schema=always
spring.kafka.producer.bootstrap-servers=127.0.0.1:9092
spring.kafka.producer.key-serializer=org.apache.kafka.common.serialization.StringSerializer
spring.kafka.producer.value-serializer=org.springframework.kafka.support.serializer.JsonSerializer
```
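The producer points at the local broker on 127.0.0.1:9092, and because the value serializer is JsonSerializer, each Lokomotif object is written to the topic as a JSON payload without any custom serializer code.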
- In the configuration package, add another class called KafkaTopicConfig.java. You have to remember the topic name, since it will be used in the Kafka consumer service.
```java
import org.apache.kafka.clients.admin.NewTopic;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.config.TopicBuilder;

@Configuration
public class KafkaTopicConfig {

    @Bean
    public NewTopic lokomotifDataTopic() {
        return TopicBuilder.name("lokomotifdata").build();
    }
}
```
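TopicBuilder without any extra settings leaves the partition count and replication factor to the broker defaults. If you want to pin them down explicitly, a variant like the following would work (one partition and a replication factor of 1 are just illustrative values for a single local broker):

```java
@Bean
public NewTopic lokomotifDataTopic() {
    // Illustrative values: a single partition and replication factor 1 suit one local broker.
    return TopicBuilder.name("lokomotifdata")
        .partitions(1)
        .replicas(1)
        .build();
}
```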
- In the service package, create a new class called KafkaProducer.java
```java
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.support.KafkaHeaders;
import org.springframework.messaging.Message;
import org.springframework.messaging.support.MessageBuilder;
import org.springframework.stereotype.Service;

import com.tujuhsembilan.scheduler.model.Lokomotif;

import lombok.extern.slf4j.Slf4j;

@Slf4j
@Service
public class KafkaProducer {

    private KafkaTemplate<String, Lokomotif> kafkaTemplate;

    public KafkaProducer(KafkaTemplate<String, Lokomotif> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    public void sendMessage(Lokomotif lokomotifData) {
        log.info(String.format("Message sent: %s ", lokomotifData.toString()));

        Message<Lokomotif> message = MessageBuilder
            .withPayload(lokomotifData)
            .setHeader(KafkaHeaders.TOPIC, "lokomotifdata")
            .build();

        kafkaTemplate.send(message);
    }
}
```
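This service only produces messages; the consuming side lives in a separate service, as mentioned above. As a minimal sketch of what that side could look like (the class name and group id are made up, and it assumes Spring Boot's default StringDeserializer, so the JSON payload arrives as a plain String):

```java
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Service;

import lombok.extern.slf4j.Slf4j;

// Hypothetical listener in the separate consumer service, not part of this scheduler project.
@Slf4j
@Service
public class LokomotifDataListener {

    // The topic name must match the one created in KafkaTopicConfig ("lokomotifdata").
    @KafkaListener(topics = "lokomotifdata", groupId = "lokomotif-consumer")
    public void consume(String message) {
        log.info("Received lokomotif data: {}", message);
    }
}
```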
- Finally, update LokomotifController.java. This is its final form:
```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

import org.modelmapper.ModelMapper;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.hateoas.RepresentationModel;
import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

import com.github.javafaker.Faker;
import com.tujuhsembilan.scheduler.model.Lokomotif;
import com.tujuhsembilan.scheduler.model.dto.LokomotifDto;
import com.tujuhsembilan.scheduler.repository.LokomotifRepository;
import com.tujuhsembilan.scheduler.service.KafkaProducer;

import lombok.RequiredArgsConstructor;
import lombok.extern.slf4j.Slf4j;

@Slf4j
@RestController
@RequestMapping("api/v1/lokomotif")
@RequiredArgsConstructor(onConstructor = @__(@Autowired))
public class LokomotifController {

    private final ModelMapper modelMapper;
    private final LokomotifRepository lokomotifRepository;
    private final KafkaProducer kafkaProducer;

    Faker faker = new Faker();

    @GetMapping("/create-data-lokomotif")
    public ResponseEntity<?> createLokomotifData() {
        LocalDateTime randomDate = LocalDateTime.now();
        DateTimeFormatter formatter = DateTimeFormatter.ofPattern("yyyyMMdd'T'HHmmss");
        String formattedDateTime = randomDate.format(formatter);

        Lokomotif lokomotif = Lokomotif.builder()
            .kodeLoko(faker.number().digits(5))
            .namaLoko(faker.address().country())
            .dimensiLoko(faker.number().digits(2))
            .status(faker.number().digits(1))
            .createdDate(formattedDateTime)
            .build();

        kafkaProducer.sendMessage(lokomotif);

        var savedModel = lokomotifRepository.save(lokomotif);
        var responseBodyDto = modelMapper.map(savedModel, LokomotifDto.class);

        log.info("Generated Random Lokomotif Data: {}", responseBodyDto);

        return ResponseEntity.status(HttpStatus.CREATED).body(RepresentationModel.of(responseBodyDto));
    }
}
```
Running the application
- Make sure ZooKeeper and Kafka are running.
- Since I'm using Visual Studio Code, I can just click the Run button from SchedulerApplication.java.
- The random data cannot be seen end to end yet, but the logs we have are enough to tell whether the application is working properly or not.
I think that's all for this service. The working code can be found in my GitHub repository.