Ravula Pranay Kumar Reddy

Different Ways to Send a File as a Response in Spring Boot for a REST API

Recently, I needed to send a huge file as a response for a REST call 📂. Initially, I thought about checking if any existing API calls were returning a file so that I could use the same approach.

However, I realized that mimicking an approach designed for another API might not meet the needs of my API 🤔. As a developer, I decided to explore different ways of sending a response, along with their pros and cons. This blog post is about that exploration ⏳. If you're in a hurry to code, please skip this blog and search on StackOverflow 😜.

The Use Case

Given a List<Entity>, the task is to generate a CSV file dynamically and send it as a response. We’ll fetch the entities from a service method, use Java streams to extract headers and values, and then send the CSV file using different approaches in Spring Boot.
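
All of the snippets below lean on a few helpers that are never shown: Entity.getCSVHeaderString(), entity.mapEntityToCSVString(), and generateCsv(entities). Here is a minimal sketch of what they might look like, assuming a hypothetical Entity with just an id and a name (the field names and the generateCsv method are placeholders for illustration, not part of a real schema):

import java.util.List;
import java.util.stream.Collectors;

// Hypothetical Entity with the CSV helpers used throughout this post
public class Entity {
    private final Long id;
    private final String name;

    public Entity(Long id, String name) {
        this.id = id;
        this.name = name;
    }

    // Header row of the CSV file
    public static String getCSVHeaderString() {
        return "id,name";
    }

    // One comma-separated line per entity
    public String mapEntityToCSVString() {
        return id + "," + name;
    }
}

// In the controller: a stream-based generateCsv(), as referenced in the snippets below
private String generateCsv(List<Entity> entities) {
    return Entity.getCSVHeaderString() + "\n"
            + entities.stream()
                      .map(Entity::mapEntityToCSVString)
                      .collect(Collectors.joining("\n"));
}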

Approach-1: Return byte[]

This approach involves converting the file content into a byte array and sending it in the response. It's straightforward and easy to implement.

@GetMapping("/download")
public ResponseEntity<byte[]> downloadFile() {
    List<Entity> entities = entityService.getEntities();
    String csvContent = generateCsv(entities); // builds one comma-separated line per entity
    byte[] csvBytes = csvContent.getBytes(StandardCharsets.UTF_8);

    HttpHeaders headers = new HttpHeaders();
    headers.setContentType(MediaType.parseMediaType("text/csv"));
    headers.setContentDisposition(ContentDisposition.builder("attachment").filename("file.csv").build());

    return ResponseEntity.ok()
            .headers(headers)
            .body(csvBytes);
}

Pros:

  • Simplicity: Easy to implement and understand.
  • Compatibility: Works with any file type or source, since the entire content is sent as a raw byte array.

Cons:

  • Memory Usage: The entire file is loaded into memory, which might be problematic for very large files.
  • Performance: Can be slow for large files due to memory consumption and potential delays in converting large files to byte arrays.

Approach-2: Return InputStreamResource

This approach involves streaming the file content from an InputStream wrapped in an InputStreamResource. InputStream allows for reading data from a source in a sequential manner, supporting streaming of data in chunks for efficient handling of large files.

@GetMapping("/download")
public ResponseEntity<InputStreamResource> downloadFile() {
    List<Entity> entities = entityService.getEntities();
    ByteArrayInputStream inputStream = new ByteArrayInputStream(generateCsv(entities).getBytes(StandardCharsets.UTF_8));

    HttpHeaders headers = new HttpHeaders();
    headers.setContentType(MediaType.parseMediaType("text/csv"));
    headers.setContentDisposition(ContentDisposition.builder("attachment").filename("file.csv").build());

    // Spring reads from the InputStream while writing the response body and closes
    // it afterwards, so the stream must not be closed here.
    InputStreamResource resource = new InputStreamResource(inputStream);
    return ResponseEntity.ok()
            .headers(headers)
            .body(resource);
}

Pros:

  • Memory Efficiency: Streams data instead of handing Spring a byte[], and is genuinely memory-efficient when the source is a file or other external stream. (Wrapping a ByteArrayInputStream, as above, still keeps the whole payload in memory.)
  • Handling Large Files: Better suited for larger files compared to byte[].

Cons:

  • Complexity: Slightly more complex to implement compared to byte[].
  • Resource Management: The InputStream must stay open until Spring has finished writing the response (Spring closes it afterwards), so it should not be closed in the controller.

Approach-3: Return StreamingResponseBody

This approach involves using StreamingResponseBody to write the file content in chunks, allowing for efficient streaming.

@GetMapping("/download")
public ResponseEntity<StreamingResponseBody> downloadFile() {
    List<Entity> entities = entityService.getEntities();

    StreamingResponseBody stream = outputStream -> {
        // BufferedWriter's write methods throw IOException, so failures
        // while streaming can be caught and logged here.
        try (BufferedWriter writer = new BufferedWriter(new OutputStreamWriter(outputStream, StandardCharsets.UTF_8))) {
            writer.write(Entity.getCSVHeaderString());
            writer.newLine();
            if (entities != null && !entities.isEmpty()) {
                for (Entity entity : entities) {
                    writer.write(entity.mapEntityToCSVString());
                    writer.newLine();
                }
            }
            writer.flush(); // Ensure all buffered data is sent
        } catch (IOException e) {
            logger.error("Error while streaming CSV file: {}", e.getMessage());
        }
    };

    HttpHeaders headers = new HttpHeaders();
    headers.setContentType(MediaType.parseMediaType("text/csv"));
    headers.setContentDisposition(ContentDisposition.builder("attachment").filename("file.csv").build());

    return ResponseEntity.ok()
            .headers(headers)
            .body(stream);
}

Pros:

  • Efficient Streaming: Allows streaming of the response data directly to the client, reducing memory usage.
  • Handling Large Files: Ideal for handling very large files as it doesn't require loading the entire file into memory.

Cons:

  • Complexity: More complex implementation and error handling compared to byte[] or InputStreamResource.
  • Error Handling: By the time an IOException occurs during streaming, the status code and headers have usually already been sent, so they can no longer be changed; see the sketch after this list for one common compromise.
  • Client Handling: Some clients may have trouble with streaming responses if not implemented correctly.
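
A minimal sketch of that compromise is shown below. It assumes the same try-with-resources writer block as in the example above and an SLF4J logger field on the controller: log the failure and rethrow the IOException, which StreamingResponseBody is allowed to propagate, so the servlet container aborts the connection and the client sees a failed download rather than a silently truncated 200 response.

        } catch (IOException e) {
            // The status line and headers are usually already committed at this point,
            // so switching to a 500 is no longer possible. Rethrowing aborts the
            // connection instead of quietly finishing the response.
            logger.error("Error while streaming CSV file", e);
            throw e;
        }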

Approach-4: Use HttpServletResponse

This approach involves directly writing the CSV content to the HttpServletResponse output stream. It provides a straightforward way to stream the file content from your server to the client, without needing to load the entire file into memory.

@GetMapping("/export-csv")
public void handleCSVExport(HttpServletResponse response) {
    response.setContentType("text/csv");
    response.setHeader(HttpHeaders.CONTENT_DISPOSITION, ContentDisposition.builder("attachment").filename("file.csv").build());
    List<Entity> entities = entityService.getEntities();

    try (ServletOutputStream outputStream = response.getOutputStream();
            PrintWriter writer = new PrintWriter(outputStream)) {
        writer.println(Entity.getCSVHeaderString());
        if (entities != null && !entities.isEmpty()) {
            for (Entity entity : entities) {
                writer.println(entity.mapEntityToCSVString());
            }
        }
        response.setStatus(HttpStatus.OK.value());
        response.flushBuffer();
    } catch (IOException e) {
        response.setStatus(HttpStatus.INTERNAL_SERVER_ERROR.value());
    }
}

Pros:

  • Direct Streaming: Directly writes data to the response output stream, minimizing memory usage.
  • Simplicity: Simple to understand and implement.

Cons:

  • Error Handling: Requires careful error handling and flushing of the buffer.
  • Resource Management: Needs to manage output stream closing and response status carefully.

Recommendations:

  1. For Large Files:
    • HttpServletResponse: Recommended if you expect very large files and need straightforward error handling and status code management. This approach streams data directly to the response output and allows you to handle errors and set status codes more effectively.
    • StreamingResponseBody: Also suitable for large files due to its memory efficiency and ability to stream data in chunks. However, it can be complex to handle errors and set status codes, as it operates asynchronously and outside the direct request-response lifecycle.
  2. For Moderate File Sizes:
    • InputStreamResource: Appropriate for files of manageable size where you prefer a balance between memory efficiency and implementation complexity. It streams data using an InputStream, making it more memory-efficient than byte[] but simpler than StreamingResponseBody.
  3. For Small Files:
    • byte[]: Ideal for relatively small files where simplicity and ease of implementation are priorities. While straightforward, this approach may not handle larger files efficiently due to potential memory usage issues.

Estimated time to download 100 rows:

Approach-1 byte[]: 2.36 seconds
Approach-2 InputStreamResource: 1.86 seconds
Approach-3 StreamingResponseBody: 0.84 seconds
Approach-4 HttpServletResponse: 0.62 seconds

Conclusion

Selecting the right approach for sending a file as a response in a Spring Boot REST API is crucial and depends on various factors such as file size, memory usage, and implementation complexity ⚖️.

By understanding the pros and cons of each method—byte[], InputStreamResource, StreamingResponseBody, and HttpServletResponse—you can make an informed decision that aligns with your application's needs and constraints 📊.

Consider the specific requirements of your use case to choose the most suitable approach. Whether you're dealing with small, moderate, or large files, selecting the right strategy will help ensure efficient performance and effective error handling 🚀.

Happy coding! 😊
