Installing Logstash
Josh Stroschein
Security Researcher
@jstrosch www.0xevilc0de.com
Overview
We need to get data into Elasticsearch
- Logstash will provide a data processing pipeline
Explore inputs, filters and outputs
Continue our build-out by installing Logstash on a Linux server
Logstash is a Data Processing Pipeline
Ingests data - data can come from a variety of sources
Filters - allows you to normalize, enrich and even exclude data
Forwards - finally, sends data to your favorite “stash”
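As a rough illustration of these three stages, a minimal pipeline configuration might look like the sketch below; the stdin/stdout plugins and the added field are illustrative assumptions, not part of the course build-out:

    # minimal-pipeline.conf - sketch of the three pipeline stages
    input {
      stdin { }                                        # ingest: read events typed on the console
    }
    filter {
      mutate { add_field => { "stage" => "demo" } }    # filter: enrich each event with a field
    }
    output {
      stdout { codec => rubydebug }                    # forward: print the structured event
    }

Each block in the configuration maps directly to one stage of the pipeline.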
Logstash Plugins
There is already a large collection of input, filter, output and codec plugins
- Plugins help to ease the use of Logstash
A popular input plugin is Beats, which receives events from the Beats family of data shippers
- But a significant number of plugins are available for every phase of the pipeline
Plugins are packaged as self-contained gems from RubyGems.org
- The plugin manager script provides the ability to add, update and remove plugins for your deployment (example commands below)
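For example, the plugin manager script that ships with Logstash can be run from the installation directory; the beats input plugin name below is only an example:

    # list the plugins currently installed in this deployment
    bin/logstash-plugin list

    # add, update or remove a specific plugin
    bin/logstash-plugin install logstash-input-beats
    bin/logstash-plugin update logstash-input-beats
    bin/logstash-plugin remove logstash-input-beats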
Filter Example - Grok
The grok filter provides a way to add structure to arbitrary text
- This helps to make the data queryable
Grok works well with log data that is written to be human-readable, such as Apache logs
Example Apache access log entry (client IP, time of request, request line, user agent):
1.2.3.4 - - [15/Jun/2021:08:51:34] "GET / HTTP/1.1" 200 731 "-" "Mozilla/5.0…"
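As a rough sketch, a grok filter for a line like this could use the combined Apache log pattern that ships with the grok filter plugin; the pattern name and the resulting field names are assumptions based on the standard pattern library, not the exact configuration used later in the course:

    filter {
      grok {
        # parse an Apache-style access log line into named fields
        # (client IP, timestamp, request, response code, user agent, ...)
        match => { "message" => "%{COMBINEDAPACHELOG}" }
      }
    }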
Example Web Server Pipeline
Defined in pipeline config files:
Web server logs -> file input plugin -> grok filter plugin -> geoip filter plugin (add geolocation for the client IP address) -> elasticsearch output plugin (ship to Elastic)
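A sketch of what this pipeline configuration could look like; the log path, index name and Elasticsearch URL are assumptions for illustration rather than values from the course, and the geoip source field assumes the legacy (non-ECS) field names produced by the grok pattern:

    # web-server-pipeline.conf - illustrative sketch of the diagrammed pipeline
    input {
      file {
        path => "/var/log/apache2/access.log"    # assumed web server log location
        start_position => "beginning"
      }
    }
    filter {
      grok {
        match => { "message" => "%{COMBINEDAPACHELOG}" }   # structure the raw log line
      }
      geoip {
        source => "clientip"                     # add geolocation for the client IP
      }
    }
    output {
      elasticsearch {
        hosts => ["http://localhost:9200"]       # assumed Elasticsearch endpoint
        index => "weblogs"                       # assumed index name
      }
    }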
Demo
Install Logstash on our Linux server
Ensure all pre-installation requirements are met
Test our Logstash installation by sending data to Elasticsearch
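On a Debian/Ubuntu server, installing from the Elastic APT repository typically looks like the following sketch; the 7.x repository path is an assumption and should match the version of Elasticsearch already installed:

    # add the Elastic signing key and APT repository (7.x shown as an assumed version)
    wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | sudo apt-key add -
    sudo apt-get install apt-transport-https
    echo "deb https://artifacts.elastic.co/packages/7.x/apt stable main" | sudo tee /etc/apt/sources.list.d/elastic-7.x.list

    # install and start Logstash
    sudo apt-get update && sudo apt-get install logstash
    sudo systemctl start logstash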