Nikolay Stanchev

Originally published at nsnotes.hashnode.dev

Step-By-Step Tutorial for Building a REST API in Java

Motivation

Having seen many tutorials on how to build REST APIs in Java using various combinations of frameworks and libraries, I decided to build my own API using the software suite that I have the most experience with. In particular, I wanted to use:

  • Maven as the build and dependency management tool
  • Jersey as the framework that provides implementation of the JAX-RS specification
  • Tomcat as the application server
    • in particular, I wanted to run Tomcat in embedded mode so that I would end up with a simple executable jar file
  • Guice as the dependency injection framework

The problem I faced was that I couldn't find any tutorials combining the software choices above, so I had to go through the process of combining the pieces myself. This didn't turn out to be a particularly straightforward task, which is why I decided to document the process on my blog and share it with others who might be facing similar problems.

Project Summary

For the purpose of this tutorial, we are going to build the standard API for managing TODO items - i.e. a CRUD API that supports Creating, Retrieving, Updating and Deleting tasks.

The full API specification can be viewed in the Appendix.

To implement this API, we will use:

  • Java 11 (OpenJDK)
  • Apache Maven v3.8.6
  • Eclipse Jersey v2.35
  • Apache Tomcat v9.0.62
  • Guice v4.2.3

For simplicity, I will avoid using any databases in this tutorial and instead use a pseudo in-memory DB. However, we will see how easy it is to switch from an in-memory testing DB to an actual database when following a clean architecture.

The goal is to end up with an executable jar file generated by Maven that will include the Tomcat application server and our API implementation. We will then dockerize the entire process of generating the file and executing it, and finally run the service as a Docker container.

The following coding steps outline only the most relevant pieces of code; you can find the full code in the GitHub repository. For most steps, we will add unit tests that aren't shown here but are included in the commit for that step. To run the tests at any point in time, you can use mvn clean test.

Coding Steps

Step 1 - Project Setup

As with every Maven project, we need a POM file (the file representing the Project Object Model). We start with a very basic POM which describes the project information and sets the JDK and JRE target versions to 11. This means that the project can use Java 11 language features (but no features from later versions) and will require a JRE version 11 or later to be executed. To avoid registering a domain name for this example project, I am using a group ID that corresponds to my GitHub username where this project will be hosted - com.github.nikist97.

<?xml version="1.0" encoding="UTF-8"?>

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <!-- Project Information -->
    <groupId>com.github.nikist97</groupId>
    <artifactId>TaskManagementService</artifactId>
    <version>1.0-SNAPSHOT</version>
    <packaging>jar</packaging>

    <name>TaskManagementService</name>

    <properties>
        <!-- Maven-related properties used during the build process -->
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
        <maven.compiler.source>11</maven.compiler.source>
        <maven.compiler.target>11</maven.compiler.target>
    </properties>

    <dependencies>
        <!-- This is where we will declare libraries our project depends on -->
    </dependencies>

    <build>
        <plugins>
            <!-- This is where we will declare plugins our project needs for the build process -->
        </plugins>
    </build>
</project>


The full commit for this step can be found here.

Step 2 - Implementing the Business Logic

We start with the most critical piece of software in general, which is our business logic. Ideally, this layer should be agnostic to the notion of any DB technologies or API protocols. Whether we implement an HTTP API using MongoDB on the backend or we use PostgreSQL and implement a command-line tool for interacting with our code, it should not affect the code for our business logic. In other words, the business logic should not depend on the persistence layer (the code interacting with the database) and the API layer (the code that will define the HTTP API endpoints).

The first thing to implement is our main entity class - Task. This class follows the builder pattern and provides argument validation. The required attributes are the task's title and description. The rest of the attributes we can default to sensible values when not explicitly provided:

  • identifier is set to a random UUID
  • createdAt is set to the current date time
  • completed is set to false
public class Task {

    private final String identifier;
    private final String title;
    private final String description;
    private final Instant createdAt;
    private final boolean completed;

    ...

    public static class TaskBuilder {

        ...

        private TaskBuilder(String title, String description) {
            validateArgNotNullOrBlank(title, "title");
            validateArgNotNullOrBlank(description, "description");

            this.title = title;
            this.description = description;
            this.identifier = UUID.randomUUID().toString();
            this.createdAt = Instant.now();
            this.completed = false;
        }

        ...

    }
}


Then, we define the interface we need for interacting with a persistence layer (i.e. a database or another storage mechanism). Notice that this interface belongs to the business layer because, ultimately, it is the business logic that decides what storage functionality we will need. The actual implementation of this interface, though (a MongoDB implementation or an in-memory DB or something else) will belong to the persistence layer, which we will implement in a subsequent step.

public interface TaskManagementRepository {

    void save(Task task);

    List<Task> getAll();

    Optional<Task> get(String taskID);

    void delete(String taskID);
}


Finally, we implement the service class, which has the CRUD logic. The critical piece here is that this class doesn't rely on a concrete implementation of the repository interface - it is agnostic to what DB technology we decide to use later.

public class TaskManagementService {

    private final TaskManagementRepository repository;

    ...

    public Task create(String title, String description) {
        Task task = Task.builder(title, description).build();

        repository.save(task);

        return task;
    }

    public Task update(String taskID, TaskUpdateRequest taskUpdateRequest) {
        Task oldTask = retrieve(taskID);

        Task newTask = oldTask.update(taskUpdateRequest);
        repository.save(newTask);

        return newTask;
    }

    public List<Task> retrieveAll() {
        return repository.getAll();
    }

    public Task retrieve(String taskID) {
        return repository.get(taskID).orElseThrow(() ->
                new TaskNotFoundException("Task with the given identifier cannot be found - " + taskID));
    }

    public void delete(String taskID) {
        repository.delete(taskID);
    }
}


The way this code is written lets us unit test the business logic in isolation by mocking the behavior of the repository interface. To achieve this, we need to add two test dependencies to the POM file:

        ...
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>4.11</version>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.mockito</groupId>
            <artifactId>mockito-core</artifactId>
            <version>3.5.13</version>
            <scope>test</scope>
        </dependency>
        ...

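To illustrate, a unit test for the missing-task case might look like the sketch below. It assumes the TaskManagementService constructor takes the repository as its only argument (the constructor is elided in the snippet above); the class and method names here are illustrative, not taken from the repository.

```java
import java.util.Optional;

import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

// Hypothetical sketch of a service unit test; names are illustrative.
public class TaskManagementServiceMockingExample {

    public static void main(String[] args) {
        // mock the repository so no real storage is involved
        TaskManagementRepository repository = mock(TaskManagementRepository.class);
        when(repository.get("missing-id")).thenReturn(Optional.empty());

        TaskManagementService service = new TaskManagementService(repository);

        try {
            service.retrieve("missing-id");
            throw new AssertionError("expected TaskNotFoundException");
        } catch (TaskNotFoundException expected) {
            // the service maps an empty Optional to a TaskNotFoundException
            System.out.println("missing task reported correctly");
        }
    }
}
```

Because the repository is mocked, this test exercises only the business-logic layer - exactly the isolation the clean layering above is meant to give us.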

The full commit for this step can be found here.

Step 3 - Creating Stub API Endpoints

The next step is to implement the API layer. For this project, we are implementing an HTTP REST API using Jersey. Therefore, we start by adding the dependency in the POM file.

        ...
        <dependency>
            <groupId>org.glassfish.jersey.containers</groupId>
            <artifactId>jersey-container-servlet</artifactId>
            <version>2.35</version>
        </dependency>
        <dependency>
            <groupId>org.glassfish.jersey.inject</groupId>
            <artifactId>jersey-hk2</artifactId>
            <version>2.35</version>
        </dependency>
        ...


The second dependency is needed for Jersey 2.26 and later (see the release notes at https://eclipse-ee4j.github.io/jersey.github.io/release-notes/2.26.html): from that version on, users need to explicitly declare the dependency injection framework for Jersey to use. In this case we go with HK2, which is what was used in previous releases.

Then we implement the resource class, which at this point only has stub methods that all return a status code 200 HTTP response with no response body.

@Path("/tasks")
public class TaskManagementResource {

    @POST
    public Response createTask() {
        return Response.ok().build();
    }

    @GET
    public Response getTasks() {
        return Response.ok().build();
    }

    @PATCH
    @Path("/{taskID}")
    public Response updateTask(@PathParam("taskID") String taskID) {
        return Response.ok().build();
    }

    @GET
    @Path("/{taskID}")
    public Response getTask(@PathParam("taskID") String taskID) {
        return Response.ok().build();
    }

    @DELETE
    @Path("/{taskID}")
    public Response deleteTask(@PathParam("taskID") String taskID) {

        return Response.ok().build();
    }
}


We will also need an application config class to define the base URI for our API and to inform the framework about the task management resource class:

@ApplicationPath("/api")
public class ApplicationConfig extends ResourceConfig {

    public ApplicationConfig() {
        register(TaskManagementResource.class);
    }

}


The full commit for this step can be found here.

Step 4 - Implementing the API Layer

For this project, we will use JSON as the serialization format for HTTP requests and responses.

In order to produce and consume JSON in our API, we need a library responsible for the JSON serialization and deserialization of POJOs. We are going to use Jackson. The dependency we need in order to integrate Jersey with Jackson is given below:

        ...
        <dependency>
            <groupId>org.glassfish.jersey.media</groupId>
            <artifactId>jersey-media-json-jackson</artifactId>
            <version>2.35</version>
        </dependency>
        ...


Then we need to customize the behavior of the JSON object mapper that will be used for serializing and deserializing the request and response POJOs. In this case, we disable ALLOW_COERCION_OF_SCALARS, which means the service won't attempt to coerce strings into numbers or booleans (e.g. {"boolean_field":"true"} will be rejected).

import static com.fasterxml.jackson.databind.MapperFeature.ALLOW_COERCION_OF_SCALARS;

@Provider
public class JsonObjectMapperProvider implements ContextResolver<ObjectMapper> {

    private final ObjectMapper jsonObjectMapper;

    /**
     * Create a custom JSON object mapper provider.
     */
    public JsonObjectMapperProvider() {
        jsonObjectMapper = new ObjectMapper();
        jsonObjectMapper.disable(ALLOW_COERCION_OF_SCALARS);
    }

    @Override
    public ObjectMapper getContext(Class<?> type) {
        return jsonObjectMapper;
    }
}


Once again, we need to make Jersey aware of this provider class:

@ApplicationPath("/api")
public class ApplicationConfig extends ResourceConfig {

    public ApplicationConfig() {
        register(TaskManagementResource.class);
        register(JsonObjectMapperProvider.class);
    }

}


Then we define the request and response POJOs. I will skip the code for these classes, but in summary, we need:

  • TaskCreateRequest - represents the JSON request body sent to the service when creating a new task
  • TaskUpdateRequest - represents the JSON request body sent to the service when updating an existing task
  • TaskResponse - represents the JSON response body sent to the client when retrieving task(s)
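As a rough sketch (the real classes are in the linked commit; the field names here are assumed from the API behavior shown later), TaskCreateRequest could look like this - a plain mutable POJO with a no-argument constructor so that Jackson can deserialize it:

```java
// Hypothetical sketch of TaskCreateRequest; see the linked commit for the real class.
public class TaskCreateRequest {

    private String title;
    private String description;

    // Jackson needs a no-argument constructor (plus setters) to deserialize JSON
    public TaskCreateRequest() {
    }

    public String getTitle() {
        return title;
    }

    public void setTitle(String title) {
        this.title = title;
    }

    public String getDescription() {
        return description;
    }

    public void setDescription(String description) {
        this.description = description;
    }
}
```

TaskUpdateRequest and TaskResponse follow the same shape, with TaskResponse exposing only getters since it is serialized, never deserialized.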

The last part of this step is to replace the stub logic in the resource class with the actual API implementation that relies on the business logic encapsulated in the service class from step 2.

@Path("/tasks")
public class TaskManagementResource {

    private final TaskManagementService service;

    public TaskManagementResource(TaskManagementService service) {
        this.service = service;
    }

    @POST
    @Consumes(MediaType.APPLICATION_JSON)
    public Response createTask(TaskCreateRequest taskCreateRequest) {
        validateArgNotNull(taskCreateRequest, "task-create-request-body");

        Task task = service.create(taskCreateRequest.getTitle(), taskCreateRequest.getDescription());

        String taskID = task.getIdentifier();

        URI taskRelativeURI = URI.create("tasks/" + taskID);
        return Response.created(taskRelativeURI).build();
    }

    @GET
    @Produces(MediaType.APPLICATION_JSON)
    public List<TaskResponse> getTasks() {
        return service.retrieveAll().stream()
                .map(TaskResponse::new)
                .collect(Collectors.toUnmodifiableList());
    }

    @PATCH
    @Path("/{taskID}")
    @Produces(MediaType.APPLICATION_JSON)
    public Response updateTask(@PathParam("taskID") String taskID, TaskUpdateRequest taskUpdateRequest) {
        validateArgNotNull(taskUpdateRequest, "task-update-request-body");

        TaskUpdate update = new TaskUpdate(taskUpdateRequest.getTitle(), taskUpdateRequest.getDescription(),
                taskUpdateRequest.isCompleted());

        service.update(taskID, update);

        return Response.ok().build();
    }

    @GET
    @Path("/{taskID}")
    @Produces(MediaType.APPLICATION_JSON)
    public TaskResponse getTask(@PathParam("taskID") String taskID) {
        Task task = service.retrieve(taskID);
        return new TaskResponse(task);
    }

    @DELETE
    @Path("/{taskID}")
    public Response deleteTask(@PathParam("taskID") String taskID) {
        service.delete(taskID);
        return Response.noContent().build();
    }
}

The full commit for this step can be found here.

Step 5 - Implementing the Storage Mechanism

For simplicity, we are going to write an in-memory implementation of the repository interface rather than relying on a database technology. The implementation stores all tasks in a map, where the key is the task identifier and the value is the task itself - just enough for simple CRUD functionality.

public class InMemoryTaskManagementRepository implements TaskManagementRepository {

    // a ConcurrentHashMap so that concurrent requests can safely share the singleton repository
    private final Map<String, Task> tasks = new ConcurrentHashMap<>();

    @Override
    public void save(Task task) {
        tasks.put(task.getIdentifier(), task);
    }

    @Override
    public List<Task> getAll() {
        return tasks.values().stream()
                .collect(Collectors.toUnmodifiableList());
    }

    @Override
    public Optional<Task> get(String taskID) {
        return Optional.ofNullable(tasks.get(taskID));
    }

    @Override
    public void delete(String taskID) {
        tasks.remove(taskID);
    }

}


The full commit for this step can be found here.

Step 6 - Binding Everything Together

Now that we have all the layers implemented, we need to bind them together with a dependency injection framework - in this case, we will use Guice to achieve that.

We start by adding Guice as a dependency in the POM file:

        <dependency>
            <groupId>com.google.inject</groupId>
            <artifactId>guice</artifactId>
            <version>4.2.3</version>
        </dependency>


Then we create a simple Guice module to bind the in-memory DB implementation to the repository interface. This means that for all classes that depend on the repository interface, Guice will inject the in-memory DB class. We use the Singleton scope because we want all classes that depend on the repository to re-use the same in-memory DB instance.

public class ApplicationModule extends AbstractModule {

    @Override
    public void configure() {
        bind(TaskManagementRepository.class).to(InMemoryTaskManagementRepository.class).in(Singleton.class);
    }

}


Note that if we decide to use an actual database, the code change is as simple as:

  • implementing the wrapper class for the DB we choose - e.g. MongoDBTaskManagementRepository
  • changing the binding above to point to the new implementation of the repository interface
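For example, assuming a hypothetical MongoDBTaskManagementRepository implementing the same interface, the only module change would be the binding target (this is a sketch, not code from the repository):

```java
// Hypothetical: the same module, now pointing at a MongoDB-backed implementation
public class ApplicationModule extends AbstractModule {

    @Override
    public void configure() {
        bind(TaskManagementRepository.class).to(MongoDBTaskManagementRepository.class).in(Singleton.class);
    }
}
```

No code in the business or API layers needs to change - this is the payoff of keeping the repository behind an interface.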

Now that we have the module implemented, we can add the @Inject annotation to every class whose constructor has a dependency that needs to be injected by Guice. These are the TaskManagementResource and TaskManagementService classes. The magic of Guice (and dependency injection in general) is that the module above is enough to build the entire tree of dependencies in our code.

TaskManagementResource depends on TaskManagementService which depends on TaskManagementRepository. Guice knows how to get an instance of the TaskManagementRepository interface so following this chain it also knows how to get an instance of the TaskManagementService and TaskManagementResource classes.
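Concretely, the annotated constructor looks roughly like the sketch below (the constructor shape is taken from step 2; only the annotation is new, and TaskManagementResource gets the same treatment for its service dependency):

```java
import javax.inject.Inject;

// Sketch: the service constructor from step 2, now annotated for Guice
public class TaskManagementService {

    private final TaskManagementRepository repository;

    @Inject
    public TaskManagementService(TaskManagementRepository repository) {
        this.repository = repository;
    }

    // ... CRUD methods from step 2 ...
}
```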

The final piece of work is to make Jersey aware of the Guice injector. Remember that Jersey uses HK2 as its dependency injection framework, so it relies on HK2 to build the TaskManagementResource class. For HK2 to build a TaskManagementResource, it needs to know about Guice's dependency injection container. To connect the two, we use the Guice/HK2 Bridge, which bridges the Guice container (the Injector class) into the HK2 container (the ServiceLocator class).

So we declare a dependency on the Guice/HK2 bridge library:

        ...
        <dependency>
            <groupId>org.glassfish.hk2</groupId>
            <artifactId>guice-bridge</artifactId>
            <version>2.6.1</version>
        </dependency>
        ...


Then we change the ApplicationConfig class to create the bridge between Guice and HK2. Notice that since the ApplicationConfig class is used by Jersey (and thus managed by HK2) we can easily inject the ServiceLocator instance (the HK2 container itself) into it.

        @Inject
        public ApplicationConfig(ServiceLocator serviceLocator) {
            register(TaskManagementResource.class);
            register(JsonObjectMapperProvider.class);

            // bridge the Guice container (Injector) into the HK2 container (ServiceLocator)
            Injector injector = Guice.createInjector(new ApplicationModule());
            GuiceBridge.getGuiceBridge().initializeGuiceBridge(serviceLocator);
            GuiceIntoHK2Bridge guiceBridge = serviceLocator.getService(GuiceIntoHK2Bridge.class);
            guiceBridge.bridgeGuiceInjector(injector);
        }


The full commit for this step can be found here.

Step 7 - Creating the Application Launcher

The final critical step is configuring and starting the application server through a launcher class, which will serve as our main class for the executable jar file we are targeting.

We start with the code for starting an embedded Tomcat server. The dependency we need is:

    ...
    <dependency>
        <groupId>org.apache.tomcat.embed</groupId>
        <artifactId>tomcat-embed-core</artifactId>
        <version>9.0.62</version>
    </dependency>
    ...


Then we need a launcher class. This class is responsible for starting the embedded Tomcat server and registering a servlet container for the resource config we defined earlier (when we registered the resource class).

public class Launcher {

    public static void main(String[] args) throws Exception {
        Tomcat tomcat = new Tomcat();

        // configure server port number
        tomcat.setPort(8080);

        // remove defaulted JSP configs
        tomcat.setAddDefaultWebXmlToWebapp(false);

        // add the web app
        StandardContext ctx = (StandardContext) tomcat.addWebapp("/", new File(".").getAbsolutePath());
        ResourceConfig resourceConfig = new ResourceConfig(ApplicationConfig.class);
        Tomcat.addServlet(ctx, "jersey-container-servlet", new ServletContainer(resourceConfig));
        ctx.addServletMappingDecoded("/*", "jersey-container-servlet");

        // start the server
        tomcat.start();
        System.out.println("Server listening on " + tomcat.getHost().getName() + ":" + tomcat.getConnector().getPort());
        tomcat.getServer().await();
    }
}


If you are using IntelliJ IDEA for this project, you should ideally be able to run the main method of the Launcher class. There is one caveat here - since JDK 9 (which introduced the Java Platform Module System), reflective access is only allowed to publicly exported packages. This means that Guice will fail at runtime, because it uses reflection to access JDK internals. See this StackOverflow post for more information.

The only workaround I have found so far is to add --add-opens java.base/java.lang=ALL-UNNAMED as a JVM option to the run configuration of the main method, as suggested in the StackOverflow post linked above. This allows Guice to continue using reflection as it did in pre-JDK 9 releases.

After applying the workaround above and testing our launcher, we get to generating an executable distribution that can be used to start the service. To achieve this, we need the appassembler Maven plugin, which generates a launcher script for our main class. Note that we still need to pass the --add-opens java.base/java.lang=ALL-UNNAMED JVM argument for the generated launcher to work.

         ...
         <plugins>
            <plugin>
                <groupId>org.codehaus.mojo</groupId>
                <artifactId>appassembler-maven-plugin</artifactId>
                <version>2.0.0</version>
                <configuration>
                    <assembleDirectory>target</assembleDirectory>
                    <extraJvmArguments>--add-opens java.base/java.lang=ALL-UNNAMED</extraJvmArguments>
                    <programs>
                        <program>
                            <mainClass>taskmanagement.Launcher</mainClass>
                            <name>taskmanagement_webapp</name>
                        </program>
                    </programs>
                </configuration>
                <executions>
                    <execution>
                        <phase>package</phase>
                        <goals>
                            <goal>assemble</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
        </plugins>
        ...


With this plugin, we can finally generate an executable file and then use it to start the service:

mvn clean package
./target/bin/taskmanagement_webapp


The full commit for this step can be found here.

Step 8 - Adding Exception Mappers

You might have noticed that so far we have defined two custom exceptions that are thrown when the service receives input data it cannot handle:

  • TaskNotFoundException
  • InvalidTaskDataException

If these exceptions aren't handled properly, the embedded Tomcat server will wrap them in an internal server error (status code 500) response, which is not very user-friendly. As per the API specification we defined in the beginning (see Appendix), we want clients to receive a 404 status code if, for example, they use a task ID that doesn't exist.

To achieve this, we use exception mappers. When we register those mappers, Jersey will use them to transform instances of these exceptions to proper HTTP Response objects.

public class TaskNotFoundExceptionMapper implements ExceptionMapper<TaskNotFoundException> {

    @Override
    public Response toResponse(TaskNotFoundException exception) {
        return Response
                .status(Response.Status.NOT_FOUND)
                .entity(new ExceptionMessage(exception.getMessage()))
                .type(MediaType.APPLICATION_JSON)
                .build();
    }

}


public class InvalidTaskDataExceptionMapper implements ExceptionMapper<InvalidTaskDataException> {

    @Override
    public Response toResponse(InvalidTaskDataException exception) {
        return Response
                .status(Response.Status.BAD_REQUEST)
                .entity(new ExceptionMessage(exception.getMessage()))
                .type(MediaType.APPLICATION_JSON)
                .build();
    }

}


    @Inject
    public ApplicationConfig(ServiceLocator serviceLocator) {
        ...
        register(InvalidTaskDataExceptionMapper.class);
        register(TaskNotFoundExceptionMapper.class);
        ...
    }


Notice the use of a new POJO - ExceptionMessage - which is used to convey the exception message as a JSON response. Now, whenever the business logic throws any of these exceptions, we will get a proper JSON response with the appropriate status code.
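For reference, ExceptionMessage can be as small as the sketch below (the real class is in the linked commit; I am assuming a single message field, which matches the JSON error responses shown in the testing section):

```java
// Hypothetical sketch of ExceptionMessage; Jackson serializes it as {"message": "..."}
public class ExceptionMessage {

    private final String message;

    public ExceptionMessage(String message) {
        this.message = message;
    }

    public String getMessage() {
        return message;
    }
}
```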

The full commit for this step can be found here.

Dockerizing the Application

There are many benefits to using Docker, but given that this article is not about containers, I won't spend time on them. I will only mention that I always prefer to run applications in a Docker container because it makes the build process much more reliable (think application portability, well-defined build behavior, an improved deployment process, etc.).

The Dockerfile for our service is relatively simple and is based on the Maven OpenJDK image. It automates what we did in step 7 - packaging the application and running the generated executable.

FROM maven:3.8.5-openjdk-11-slim
WORKDIR /application

COPY . .

RUN mvn clean package

CMD ["./target/bin/taskmanagement_webapp"]


With this, we can build the container image and start our service as a Docker container. The commands below assume you have the Docker daemon running on your local machine.

docker build --tag task-management-service .
docker run -d -p 127.0.0.1:8080:8080 --name test-task-management-service task-management-service


Now the service should be running in the background and accessible from your local machine on port 8080. To stop or restart it, use:

docker stop test-task-management-service
docker start test-task-management-service


Testing the Service

Now that we have the service running, we can use Curl to send some test requests.

  • creating a few tasks
curl -i -X POST -H "Content-Type:application/json" -d "{\"title\": \"test-title\", \"description\":\"description\"}" "http://localhost:8080/api/tasks" 

HTTP/1.1 201 
Location: http://localhost:8080/api/tasks/d2c4ed20-2538-44e5-bf19-150db9f6d83f
Content-Length: 0
Date: Tue, 28 Jun 2022 07:52:46 GMT

curl -i -X POST -H "Content-Type:application/json" -d "{\"title\": \"test-title\", \"description\":\"description\"}" "http://localhost:8080/api/tasks"

HTTP/1.1 201 
Location: http://localhost:8080/api/tasks/64d85db4-905b-4c62-ba10-13fcb19a2546
Content-Length: 0
Date: Tue, 28 Jun 2022 07:52:47 GMT

  • retrieving a task
curl -i -X GET "http://localhost:8080/api/tasks/64d85db4-905b-4c62-ba10-13fcb19a2546"

HTTP/1.1 200 
Content-Type: application/json
Content-Length: 162
Date: Tue, 28 Jun 2022 07:54:21 GMT

{"identifier":"64d85db4-905b-4c62-ba10-13fcb19a2546","title":"test-title","description":"description","createdAt":"2022-06-28T07:52:47.872859Z","completed":false}

  • retrieving a non-existing task
curl -i -X GET "http://localhost:8080/api/tasks/random-task-id-123"                                                       

HTTP/1.1 404 
Content-Type: application/json
Content-Length: 81
Date: Tue, 28 Jun 2022 09:44:53 GMT

{"message":"Task with the given identifier cannot be found - random-task-id-123"}

  • retrieving all tasks
curl -i -X GET "http://localhost:8080/api/tasks"     

HTTP/1.1 200 
Content-Type: application/json
Content-Length: 490
Date: Tue, 28 Jun 2022 07:55:08 GMT

[{"identifier":"64d85db4-905b-4c62-ba10-13fcb19a2546","title":"test-title","description":"description","createdAt":"2022-06-28T07:52:47.872859Z","completed":false},{"identifier":"d2c4ed20-2538-44e5-bf19-150db9f6d83f","title":"test-title","description":"description","createdAt":"2022-06-28T07:52:46.444179Z","completed":false}]

  • deleting a task
curl -i -X DELETE "http://localhost:8080/api/tasks/64d85db4-905b-4c62-ba10-13fcb19a2546"

HTTP/1.1 204 
Date: Tue, 28 Jun 2022 07:56:55 GMT

  • patching a task
curl -i -X PATCH -H "Content-Type:application/json" -d "{\"completed\": true, \"title\": \"new-title\", \"description\":\"new-description\"}" "http://localhost:8080/api/tasks/d2c4ed20-2538-44e5-bf19-150db9f6d83f"

HTTP/1.1 200 
Content-Length: 0
Date: Tue, 28 Jun 2022 08:00:37 GMT

curl -i -X GET "http://localhost:8080/api/tasks/d2c4ed20-2538-44e5-bf19-150db9f6d83f"   
HTTP/1.1 200 
Content-Type: application/json
Content-Length: 164
Date: Tue, 28 Jun 2022 08:01:07 GMT

{"identifier":"d2c4ed20-2538-44e5-bf19-150db9f6d83f","title":"new-title","description":"new-description","createdAt":"2022-06-28T07:52:46.444179Z","completed":true}

  • patching a task with empty title
curl -i -X PATCH -H "Content-Type:application/json" -d "{\"title\": \"\"}" "http://localhost:8080/api/tasks/d2c4ed20-2538-44e5-bf19-150db9f6d83f"

HTTP/1.1 400 
Content-Type: application/json
Content-Length: 43
Date: Tue, 28 Jun 2022 09:47:09 GMT
Connection: close

{"message":"title cannot be null or blank"}


Future Improvements

What we have built so far is obviously not a production-ready API, but it demonstrates how to get started with the software suite mentioned at the beginning of this article when building a REST API. Here are some future improvements that could be made:

  • using a database for persistent storage
  • adding user authentication and authorization - tasks should be scoped per user rather than being available globally
  • adding logging
  • adding KPI (Key Performance Indicators) metrics - things like the count of total requests, latency, failures count, etc.
  • adding a mapper for unexpected exceptions - we don't want to expose a stack trace if the service encounters an unexpected null pointer exception; instead, we want a JSON response with status code 500
  • adding automated integration tests
  • adding a more verbose response to the patch endpoint - e.g. indicating whether the request resulted in a change or not
  • scanning packages and automatically registering provider and resource classes instead of manually registering them one-by-one
  • adding CORS (Cross-Origin Resource Sharing) support if we intend to call the API from a browser application hosted under a different domain
  • adding SSL support
  • adding rate limiting

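Several of these items fall naturally out of the clean architecture mentioned earlier. For instance, moving from the pseudo in-memory DB to persistent storage only requires a new implementation of the repository abstraction. Below is a minimal sketch of that idea; the names `Task`, `TaskRepository` and `InMemoryTaskRepository` are illustrative and may differ from the tutorial's actual classes:

```java
import java.time.Instant;
import java.util.List;
import java.util.Map;
import java.util.Optional;
import java.util.concurrent.ConcurrentHashMap;

// Illustrative domain object (plain class, since the tutorial targets Java 11).
final class Task {
    final String identifier;
    final String title;
    final String description;
    final Instant createdAt;
    final boolean completed;

    Task(String identifier, String title, String description,
         Instant createdAt, boolean completed) {
        this.identifier = identifier;
        this.title = title;
        this.description = description;
        this.createdAt = createdAt;
        this.completed = completed;
    }
}

// Port: the rest of the application depends only on this interface.
interface TaskRepository {
    Task save(Task task);
    Optional<Task> findById(String id);
    List<Task> findAll();
    void deleteById(String id);
}

// Adapter: the pseudo in-memory DB. A JDBC- or JPA-backed implementation
// would replace this class without touching any of its callers.
final class InMemoryTaskRepository implements TaskRepository {
    private final Map<String, Task> store = new ConcurrentHashMap<>();

    @Override
    public Task save(Task task) {
        store.put(task.identifier, task);
        return task;
    }

    @Override
    public Optional<Task> findById(String id) {
        return Optional.ofNullable(store.get(id));
    }

    @Override
    public List<Task> findAll() {
        return List.copyOf(store.values());
    }

    @Override
    public void deleteById(String id) {
        // DELETE is idempotent: removing an absent id is a no-op,
        // which matches the API's 204 response for unknown tasks.
        store.remove(id);
    }
}
```

With Guice, switching implementations would then be a one-line change in the module binding.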
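Rate limiting can likewise start as a simple in-process token bucket before reaching for a dedicated library or API gateway. A minimal sketch (the class and method names are mine, not from the tutorial; in a Jersey service the `tryAcquire()` check would typically live in a request filter that rejects over-limit calls with 429 Too Many Requests):

```java
// Minimal token bucket: allows `capacity` requests per `refillPeriodMillis`,
// refilling continuously in proportion to elapsed time.
final class TokenBucket {
    private final long capacity;
    private final long refillPeriodMillis;
    private double tokens;
    private long lastRefill;

    TokenBucket(long capacity, long refillPeriodMillis) {
        this.capacity = capacity;
        this.refillPeriodMillis = refillPeriodMillis;
        this.tokens = capacity;               // start full
        this.lastRefill = System.currentTimeMillis();
    }

    // Returns true if the request may proceed, false if it should be rejected.
    synchronized boolean tryAcquire() {
        long now = System.currentTimeMillis();
        double refill = (now - lastRefill) * ((double) capacity / refillPeriodMillis);
        tokens = Math.min(capacity, tokens + refill);
        lastRefill = now;
        if (tokens >= 1.0) {
            tokens -= 1.0;
            return true;
        }
        return false;
    }
}
```

This is a per-instance limiter only; a horizontally scaled deployment would need a shared store (e.g. Redis) or a gateway-level limiter instead.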
If you found this article helpful and would like to see a follow-up on the topics above, please comment or message me letting me know which topic you would like to learn about most.

Appendix

The full API specification, written in the OpenAPI (Swagger 2.0) description format, can be found below. You can paste it into the Swagger Editor to display it in a friendlier, interactive format.

swagger: '2.0'

info:
  description: This is a RESTful task management API specification.
  version: 1.0.0
  title: Task Management API
  license:
    name: Apache 2.0
    url: 'http://www.apache.org/licenses/LICENSE-2.0.html'

host: 'localhost:8080'
basePath: /api

schemes:
  - http

paths:

  /tasks:
    post:
      summary: Create a new task
      operationId: createTask
      consumes:
        - application/json
      parameters:
        - in: body
          name: taskCreateRequest
          description: new task object that needs to be added to the list of tasks
          required: true
          schema:
            $ref: '#/definitions/TaskCreateRequest'
      responses:
        '201':
          description: successfully created new task
        '400':
          description: task create request failed validation
    get:
      summary: Retrieve all existing tasks
      operationId: retrieveTasks
      produces:
        - application/json
      responses:
        '200':
          description: successfully retrieved all tasks
          schema:
            type: array
            items:
              $ref: '#/definitions/TaskResponse'

  '/tasks/{taskID}':
    get:
      summary: Retrieve task
      operationId: retrieveTask
      produces:
        - application/json
      parameters:
        - name: taskID
          in: path
          description: task identifier
          required: true
          type: string
      responses:
        '200':
          description: successfully retrieved task
          schema:
            $ref: '#/definitions/TaskResponse'
        '404':
          description: task not found
    patch:
      summary: Update task
      operationId: updateTask
      consumes:
        - application/json
      parameters:
        - name: taskID
          in: path
          description: task identifier
          required: true
          type: string
        - name: taskUpdateRequest
          in: body
          description: task update request
          required: true
          schema:
            $ref: '#/definitions/TaskUpdateRequest'
      responses:
        '200':
          description: successfully updated task
        '400':
          description: task update request failed validation
        '404':
          description: task not found
    delete:
      summary: Delete task
      operationId: deleteTask
      parameters:
        - name: taskID
          in: path
          description: task identifier
          required: true
          type: string
      responses:
        '204':
          description: >-
            successfully deleted task or task with the given identifier did not
            exist

definitions:
  TaskCreateRequest:
    type: object
    required:
      - title
      - description
    properties:
      title:
        type: string
      description:
        type: string
  TaskUpdateRequest:
    type: object
    properties:
      title:
        type: string
      description:
        type: string
      completed:
        type: boolean
  TaskResponse:
    type: object
    required:
      - identifier
      - title
      - description
      - completed
      - createdAt
    properties:
      identifier:
        type: string
      title:
        type: string
      description:
        type: string
      createdAt:
        type: string
        format: date-time
      completed:
        type: boolean

