Today, when we build applications, we often overlook the fact that the code we write, or a tool we use, can behave differently than it should. The reason is that we know the application will emit an error or a warning in an exceptional situation, and that the problem can be traced in log files. As developers, we therefore rely heavily on log files when developing applications. Debugging and observing application behavior are essential parts of every stage of an application's life.
Log management applications greatly reduce the time spent on logs. Features such as filtering and searching within log files save us from getting lost in the logs and free up time to solve the actual problem.
Why is Log Management Necessary?
Log management allows us to observe the behavior of our application.
Log management tools make log records accessible even to people without a software background.
Log management applications are needed to store logs and analyze them as required.
Log management applications allow system administrators to define alerts for failed login attempts or warnings for suspicious processes.
First, let's clone the project from GitHub. I will refer to this project throughout the rest of the article.
The database is run by the db service in docker-compose.yml. Various environment variables are used here; you can edit them as desired in the .env file located at the root of the project. These environment variables are also referenced in the project's application.yml file, where they fall back to default values if no external values are provided.
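As a rough sketch (the image, port, and variable names below are assumptions for illustration, not taken from the project), a db service that reads its credentials from .env with fallback defaults might look like this:

```yaml
# Hypothetical db service sketch in docker-compose.yml.
# ${VAR:-default} substitutes the value from .env, or the default if unset.
services:
  db:
    image: postgres:13
    environment:
      POSTGRES_DB: ${DB_NAME:-appdb}
      POSTGRES_USER: ${DB_USER:-appuser}
      POSTGRES_PASSWORD: ${DB_PASSWORD:-secret}
    ports:
      - "5432:5432"
```

The same `${VAR:-default}` pattern can be used in the Spring application.yml so the application and the container agree on a single source of configuration.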
Elasticsearch runs with the elasticsearch service in docker-compose.yml. From the configuration in the containers/elasticsearch folder in the project root, we can see that it runs as a single-node cluster named elastic-search-cluster.
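A minimal sketch of such a service, assuming a 7.x image and the default port (both assumptions on my part), would set the single-node discovery mode and the cluster name mentioned above:

```yaml
# Hypothetical elasticsearch service sketch; image tag and port are assumptions.
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.17.0
    environment:
      - discovery.type=single-node            # skip cluster formation, run standalone
      - cluster.name=elastic-search-cluster   # cluster name used by the project
    ports:
      - "9200:9200"
```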
Logstash runs with the logstash service in docker-compose.yml. There is also a configuration file in the containers/logstash folder in the project root directory.
The point to note here is the location of the log file in the input section. This file path must match the logging.file setting in the application's application.yml and the corresponding volume mapping in docker-compose.yml, so that Logstash reads from the same file the application writes to.
In the output section, there is the index information. We will use this index name when configuring Kibana in the next step.
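The input and output sections described above can be sketched as follows; the log path and index name here are placeholders I chose for illustration, not the project's actual values:

```
# Hypothetical logstash.conf sketch.
input {
  file {
    path => "/logs/application.log"     # must match the app's logging.file path
    start_position => "beginning"       # read the file from the start on first run
  }
}
output {
  elasticsearch {
    hosts => ["elasticsearch:9200"]     # service name resolves inside the compose network
    index => "app-logs-%{+YYYY.MM.dd}"  # daily index; this name is used later in Kibana
  }
}
```

Note that `hosts` points at the compose service name rather than localhost, since the containers share a network.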
Kibana runs with the kibana service in docker-compose.yml. There is also a configuration file in the containers/kibana folder in the project root directory. Once everything is up, we can observe our logs by going to http://localhost:5601 and creating a new index pattern with the index name specified in the Logstash output configuration.