About a year ago I started working on an ELK (Elasticsearch, Logstash, Kibana) setup for a BI platform. While working on processing the events coming into Logstash, I found it pretty annoying to iterate on filter changes when you have complex regular expressions, structures, or conditions.

Having spent the last few years on Ruby on Rails development, I decided to cover the Logstash filters with RSpec test cases to save time: Logstash ships JRuby inside its packages and uses gems to distribute extensions, so the tooling is already there.

It would also be great to have everything in Docker, so we can skip installing Logstash and the gems on the local machine.

Let's start with the Dockerfile, and then I will extend the project structure step by step.
```dockerfile
FROM logstash:2.4

# Install RSpec-related dependencies
RUN logstash-plugin install --development

# Install production dependencies
RUN logstash-plugin install logstash-filter-prune

ARG ES_PLUGIN=logstash-output-elasticsearch-6.2.4-java.gem
ARG KAFKA_PLUGIN=logstash-input-kafka-7.0.0.gem

COPY gems/${ES_PLUGIN} /tmp/${ES_PLUGIN}
RUN logstash-plugin install /tmp/${ES_PLUGIN}

COPY gems/${KAFKA_PLUGIN} /tmp/${KAFKA_PLUGIN}
RUN logstash-plugin install /tmp/${KAFKA_PLUGIN}
```
The `gems/` directory contains frozen gem versions for the Logstash setup.
To run the test cases quickly, let's define a Makefile:
```makefile
NAME = your_logstash

build:
	docker build -t $(NAME) .
.PHONY: build

clean:
	docker rmi --force $(NAME)
.PHONY: clean

test:
	@docker run --rm -t -i \
		-v `pwd`/../:/app \
		-w /app \
		$(NAME) \
		/bin/bash -c "rspec /app/logstash/spec/$(TEST_CASE)"
.PHONY: test

console:
	@docker run --rm -t -i \
		-v `pwd`/../:/app \
		-w /app \
		$(NAME) \
		/bin/bash
.PHONY: console
```
`make test` runs RSpec inside the container.

`make console` opens an interactive terminal inside the Docker container, in case you want to run everything manually and debug your specs with `binding.pry`.
Now let's start working on `spec/spec_helper.rb`, the entry point for RSpec:
```ruby
require "logstash/devutils/rspec/spec_helper"

require 'rspec'
require 'rspec/expectations'
require 'ostruct'
require 'erb'
require 'yaml'
require 'json'

# Running the grok code outside a Logstash package means
# LOGSTASH_HOME will not be defined, so let's set it here
# before requiring the grok filter.
# (Coming from the original examples for Logstash specs.)
unless LogStash::Environment.const_defined?(:LOGSTASH_HOME)
  LogStash::Environment::LOGSTASH_HOME = File.expand_path("../", __FILE__)
end

module Helpers
  ROOT_PATH = File.dirname(File.expand_path(__FILE__))
  TEMPLATES_PATH = File.join(ROOT_PATH, '..', 'conf.d/')

  def load_fixture(filename, settings = {})
    message = File.read(File.join(ROOT_PATH, 'fixtures', filename))
    settings.merge('message' => message)
  end

  def load_filter(filename, render_vars = {})
    content = File.read(File.join(TEMPLATES_PATH, filename))
    render_vars = OpenStruct.new(render_vars)
    # It's not elegant :) but it's the easiest way I found to handle
    # simple Jinja2 variable replacement.
    template = ERB.new(content.gsub('{{', '<%=').gsub('}}', '%>'))
    template.result(render_vars.instance_eval { binding })
  end
end

require "logstash/filters/grok"
```
In my Logstash filters I use `{{ }}` statements from the Jinja2 Python library, which are replaced with real values on deploy by Ansible.
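The `load_filter` helper handles those placeholders by translating the Jinja2-style syntax into ERB tags and rendering against an `OpenStruct`. A standalone sketch of that mechanism (the config string here is made up for illustration):

```ruby
require 'erb'
require 'ostruct'

# Hypothetical filter snippet using Jinja2-style placeholders.
content = 'if [type] == "{{ type }}" { ... }'

# Translate {{ var }} into <%= var %> so ERB can render it.
template = ERB.new(content.gsub('{{', '<%=').gsub('}}', '%>'))

# OpenStruct responds to `type` via method_missing, so the ERB
# binding can resolve the placeholder name directly.
vars = OpenStruct.new('type' => 'your-source-data')
puts template.result(vars.instance_eval { binding })
# => if [type] == "your-source-data" { ... }
```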
Now we are ready to define an actual spec to test our Logstash filter. Let's assume you want to match a line like `username=<username>`. Define `filter_spec.rb` inside the `spec/filters` folder. Filter specs should not deal with Logstash input and output statements; instead, we read a fixture containing an example of the line we expect to process and then apply the filter.
```ruby
require_relative '../spec_helper'

describe 'elb' do
  extend Helpers

  # Set the config using our filter defined in the `conf.d/01-filter.conf` file.
  config load_filter('01-filter.conf')

  # Put the sample line in spec/fixtures/sample1.txt
  # Example: username=oivoodoo
  sample(load_fixture('sample1.txt', 'type' => 'your-source-data')) do
    insist { subject.get('username') } == 'oivoodoo'
  end
end
```
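As a mental model for what the assertion checks: grok's `WORD` pattern is roughly `\w+` anchored at word boundaries, so the extraction can be sanity-checked with plain Ruby regex outside Logstash (a hypothetical sketch, not part of the spec):

```ruby
# Hypothetical plain-Ruby equivalent of the grok extraction:
# grok's %{WORD:username} behaves roughly like a named \w+ capture.
line = 'username=oivoodoo'
match = line.match(/username=(?<username>\w+)/)
puts match[:username]
# => oivoodoo
```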
And the filter itself, in `conf.d/01-filter.conf`. Note the `username=` prefix in the grok pattern: a bare `%{WORD:username}` would capture the literal word `username` from the sample line rather than the value after the `=`.

```
filter {
  if [type] == "{{ type }}" {
    grok {
      match => {
        "message" => [
          "username=%{WORD:username}"
        ]
      }
    }

    prune {
      blacklist_names => [
        "@version",
        "message"
      ]
    }
  }
}
```
I believe this should be enough to test your filters. I found it pretty useful for saving time, compared to deploying and waiting for new data to arrive. It's also easy to add a `ruby` filter inside the filter block, place a `binding.pry` there, and inspect the `event` object.
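That debugging trick can be sketched as a temporary `ruby` filter dropped into the config (a hypothetical snippet, assuming the `pry` gem is available in the container):

```
filter {
  # Temporary debugging filter: pauses spec execution at this point
  # so you can inspect the `event` object interactively.
  ruby {
    code => "require 'pry'; binding.pry"
  }
}
```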