As I promised in the previous post, here we'll take a look at how to spin up an ELK stack. Let's go! 🚀
Elasticsearch
Straight to the point: open Portainer, go to "Containers" and smash the "Add container" button. Then look at the screenshot below and fill in the values for Name, Image, Ports and Volumes. Maybe change the Restart Policy to "Unless stopped". One mild difference from last time: you need to click "Advanced mode" to be able to pull an image from Elastic's own registry instead of Docker Hub. In the screenshot it says "Simple mode" because I'd already switched to advanced.
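If you'd rather skip the UI, here's roughly the equivalent docker run. Treat it as a sketch: the 7.16.2 tag is my assumption, and discovery.type=single-node is the usual single-host setting (it also comes up in the comments below).
# Sketch: single-node Elasticsearch published on 9200 (tag is an assumption)
docker run -d \
  --name elasticsearch \
  --restart unless-stopped \
  -p 9200:9200 \
  -e "discovery.type=single-node" \
  docker.elastic.co/elasticsearch/elasticsearch:7.16.2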
Then hit the "Deploy the container" button and wait a little.
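Once it's up, you can sanity-check it from the host:
# Should answer with a JSON blob containing the cluster name and version
curl http://127.0.0.1:9200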
Almost nothing new if you've read my first article about dockerizing the stuff you need. You did read that, right? 🤔
Here I'm not creating any volumes, just because. But you most definitely can: just map one to /usr/share/elasticsearch/data.
Kibana
First create a volume, for example kibana_data. Then create the container as on the screenshot below:
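Again, a rough CLI equivalent as a sketch: the tag, the data path and the ELASTISEARCH_HOSTS value are my assumptions (172.17.0.1 is Docker's default bridge gateway, i.e. the host):
# Sketch: Kibana on 5601, pointed at Elasticsearch via the bridge gateway
docker run -d \
  --name kibana \
  --restart unless-stopped \
  -p 5601:5601 \
  -e "ELASTICSEARCH_HOSTS=http://172.17.0.1:9200" \
  -v kibana_data:/usr/share/kibana/data \
  docker.elastic.co/kibana/kibana:7.16.2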
Logstash
Man, am I tired of repeating this 😅 Create a volume (logstash_data) and compare what you're typing with the screenshot:
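And once more the CLI version, as a sketch: the tag is an assumption, and mounting the named volume at /usr/share/logstash is what makes the host paths below line up, since Docker populates a fresh named volume with the image's files on first mount.
# Sketch: Logstash with its whole home directory on the logstash_data volume
docker run -d \
  --name logstash \
  --restart unless-stopped \
  -p 5044:5044 \
  -v logstash_data:/usr/share/logstash \
  docker.elastic.co/logstash/logstash:7.16.2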
Deploy and wait.
Next up: configuration.
sudo nano /var/lib/docker/volumes/logstash_data/_data/config/logstash.yml
I didn't dig too deep into these, but they matter: http.host binds Logstash's internal HTTP API to all interfaces, and the xpack line tells monitoring where Elasticsearch lives (again, 172.17.0.1 is Docker's bridge gateway, i.e. the host):
http.host: "0.0.0.0"
xpack.monitoring.elasticsearch.hosts: [ "http://172.17.0.1:9200" ]
Now, the pipeline. I have zero explanation for why the main and pipeline configs have to live in separate files, but when I was first figuring this out it cost me three hours of swearing, frustration and eye strain. Maybe it's somewhere in the documentation, but it's buried so well I couldn't find it.
sudo nano /var/lib/docker/volumes/logstash_data/_data/pipeline/logstash.conf
I went with the config below since I don't need Beats or whatever it's called. I just want to POST logs to Logstash. You can always configure the inputs however you wish by referencing the official documentation.
input {
  http {
    port => 5044
  }
}
output {
  stdout {
    codec => json
  }
  elasticsearch {
    hosts => ["http://172.17.0.1:9200"]
    index => "logstash-%{+YYYY}"
  }
}
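One gotcha: Logstash only reads these files at startup, so restart the container after editing (via Portainer, or from the shell; the name is whatever you gave it):
docker restart logstash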
Now let's try it! A simple curl -XPUT 'http://127.0.0.1:5044/' -d 'log' will suffice.
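If you want to send something more realistic, POST JSON instead; the field names here are made up for illustration:
# The http input parses the body as JSON when the Content-Type says so
curl -XPOST 'http://127.0.0.1:5044/' \
  -H 'Content-Type: application/json' \
  -d '{"message": "hello from curl", "level": "info"}'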
Now navigate to http://0.0.0.0:5601/app/management. It's under "Management" / "Stack management" in the hamburger menu. Choose "Index Management" under "Data" in the side menu. You should see our "logstash-2021" index. Or whatever year you live in. That means it has data!
Go to http://0.0.0.0:5601/app/discover. It's "Kibana" / "Discover" in the hamburger menu. And what do we see? Right, the one hit we've just sent!
Here you can filter everything as you'd like 🐧
Hope you've learned something new from this article. And now you're a master of local deployment with Docker and a little help from Portainer! Cheers and happy coding!
Top comments (3)
Cannot pass discovery.type=single-node to Elasticsearch on the newest version of Portainer
must be a bug in Portainer :D
Interesting, I rebuilt the container not that long ago and it was OK. Not sure whether I had the latest Portainer version, though.