This article consists of two parts. Part one introduces the concept of logging as a service; part two shows how to set up Elasticsearch and its tools, integrate them into a REST service, and access logs through Elasticsearch. The sections covered are:
- What is Logging?
- What is a log file?
- Types of logs
- What is Elasticsearch?
- What is ELK stack?
- How to set up Elasticsearch
- How to integrate Elasticsearch into your REST service
- How to access logs in Elasticsearch
- Elasticsearch in relation to REST API
- Benefits of using Elasticsearch
- Advantages of logging
Logging is the process of collecting and processing log files of any type coming from any given source, such as servers, services, applications, or devices.
A log file is a file that contains records of events or processes taking place in a system. It may contain a status, a warning, or any other intended information that explains what is going on in the system.
In various REST service architectures, several types of log files can be implemented, each defined by the kind of information it contains. Below are some common log types.
- General Purpose logs
This type of log file contains general information, such as the port a service is using or the current state of a service.
- Warning logs
These logs capture issues that are not fatal or disruptive but should still be taken into consideration (example: data was saved to the database, but only after multiple attempts).
- Error logs
These contain information about unhandled issues that are fatal or disruptive (example: data passed all validation and was clear to be saved, but the save to the database failed).
- Debug logs
These contain information that helps us debug the logic behind an error or warning. They are usually intended for developers.
- Verbose logs
These logs provide insights into the behavior of the application and are intended for operators and the support team.
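The log types above map loosely onto the levels offered by most logging frameworks. As a minimal sketch using only the JDK's built-in `java.util.logging` (the class name, port, and messages are placeholders for illustration):

```java
import java.util.logging.ConsoleHandler;
import java.util.logging.Level;
import java.util.logging.Logger;

public class LogLevelsDemo {
    private static final Logger LOG = Logger.getLogger(LogLevelsDemo.class.getName());

    public static void main(String[] args) {
        // Lower the threshold so debug-style (FINE) messages are printed too;
        // by default java.util.logging only shows INFO and above.
        ConsoleHandler handler = new ConsoleHandler();
        handler.setLevel(Level.ALL);
        LOG.setUseParentHandlers(false);
        LOG.addHandler(handler);
        LOG.setLevel(Level.ALL);

        LOG.info("Service started on port 8080");          // general-purpose log
        LOG.warning("Saved to DB after 3 retries");        // warning log
        LOG.severe("Failed to save validated data to DB"); // error log
        LOG.fine("Entering saveRecord() with id=42");      // debug/verbose log
    }
}
```

Dedicated frameworks such as Logback or Log4j2 offer finer-grained levels (TRACE, DEBUG) and richer output formats, but the idea is the same: each message carries a severity that downstream tools can filter on.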
Various REST architectures support default logging services out of the box, and these can be configured and accessed without any third-party services. This is suitable when you are interacting with small data sets and managing small to medium systems, but monitoring extensive systems can become an issue. A solution is to add dedicated services to support your logging process. In this article, I will walk you through how to implement logging on a REST API with Elasticsearch, covering deploying, managing, and analyzing logs.
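As an example of such out-of-the-box support, Spring Boot's default logging can be tuned entirely through `application.properties`; the package name and file path below are placeholders:

```properties
# Global log threshold
logging.level.root=INFO
# More detail for your own code (placeholder package name)
logging.level.com.example.myapp=DEBUG
# Also write logs to a file so a shipper such as Logstash can pick them up
logging.file.name=logs/app.log
```

No extra dependencies are needed for this; it becomes insufficient mainly when logs from many services must be searched and monitored in one place, which is where Elasticsearch comes in.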
Elasticsearch is a distributed RESTful search and analytics engine capable of addressing a growing number of use cases. It centrally stores your data for lightning-fast search, fine‑tuned relevancy, and powerful analytics that scale with ease.
Elasticsearch comes with various tools that can be used alongside it to support your needs. Explore the ecosystem to see the other tools Elasticsearch offers.
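To give a taste of that RESTful interface, a log entry can be stored and then searched with plain JSON over HTTP. The index name `app-logs` and the field names here are assumptions for illustration, not part of any fixed schema:

```
POST /app-logs/_doc
{ "level": "ERROR", "service": "orders", "message": "Failed to save order" }

POST /app-logs/_search
{ "query": { "match": { "message": "failed" } } }
```

The first request indexes a document; the second runs a full-text `match` query that returns every log entry whose `message` field contains "failed".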
"ELK" is the acronym for three open source projects Elasticsearch, Logstash, and Kibana.
- Elasticsearch: a distributed, free and open search and analytics engine for all types of data, including textual, numerical, geospatial, structured, and unstructured.
- Logstash: a server-side data processing pipeline that ingests data from multiple sources simultaneously, transforms it, and then sends it to a "stash" like Elasticsearch.
- Kibana: lets users visualize the data in Elasticsearch with charts and graphs.
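To make the pipeline concrete, here is a minimal, hypothetical Logstash configuration that tails a Spring Boot log file and forwards each entry to a local Elasticsearch instance; the file path, log pattern, host, and index name are all assumptions for illustration:

```conf
input {
  file {
    path => "/var/log/myapp/app.log"   # assumed log file location
    start_position => "beginning"
  }
}
filter {
  grok {
    # Parse a simple "LEVEL message" layout; adjust to your actual log pattern
    match => { "message" => "%{LOGLEVEL:level} %{GREEDYDATA:msg}" }
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "app-logs"
  }
}
```

Once this pipeline is running, Kibana can be pointed at the `app-logs` index to chart error rates or filter by log level.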
*Figure: ELK stack workflow with a Spring Boot logging service*
That is all for this chapter's introduction to logging as a service and its tooling. In the next chapter, we will see how to implement Elasticsearch in a REST service to manage and monitor logs.