Installing and Configuring Logstash to Ingest Fortinet Syslogs
Summary
TL;DR: This tutorial video walks through installing and configuring Logstash to ship syslogs from a FortiGate firewall to Elasticsearch. Building on a previous video, the presenter sets up the Logstash input, filter, and output plugins, including parsing and enriching the data. The video also covers troubleshooting steps, such as adjusting firewall settings and correcting certificate permissions, to ensure a smooth data flow into Elasticsearch for indexing and visualization in Kibana.
Takeaways
- 😀 The video is a tutorial on installing and configuring Logstash to send syslogs from a firewall to Elasticsearch.
- 🔧 This is part two of a series; the previous video covered installing Elasticsearch and Kibana and configuring them to work together.
- 💻 Logstash requires Java 11 or 17; the presenter, on Oracle Linux, installs OpenJDK via yum (the command shown is `yum install java-1.8.0-openjdk`).
- 📝 Logstash is introduced as an open-source data collection engine that can enrich data, such as adding GeoIP locations.
- 📑 The presenter guides through the process of setting up Logstash using a `yum` repository and a public signing key.
- 🔌 The video demonstrates configuring the firewall to send syslogs to Logstash, specifying the IP address and port 5144.
- 📡 The Logstash configuration file is detailed, explaining the input, filter, and output plugins necessary for processing the syslog data.
- 📚 The use of the 'grok' filter is highlighted for parsing syslog messages into structured data.
- 🛠 The 'mutate' filter is used to clean up events by removing unneeded metadata fields, such as the raw event.original and the ingest-time @timestamp.
- 🗓️ The 'kv' filter is introduced to parse key-value pairs from the log messages, and a 'date' filter is used to convert log timestamps into a usable format for Elasticsearch.
- 🔒 The video concludes with securing data transmission to Elasticsearch using SSL, including copying the CA certificate and setting appropriate file permissions.
- 📊 Finally, the presenter shows how to view the ingested data in Kibana and mentions future plans to create dashboards for data visualization.
Q & A
What is the purpose of the video?
-The purpose of the video is to demonstrate the installation and configuration of Logstash to send syslogs from a firewall to Elasticsearch, which is a continuation of a previous video where Elasticsearch and Kibana were installed and configured.
What are the prerequisites for installing Logstash as mentioned in the video?
-The prerequisites for installing Logstash include having Java installed, specifically JDK 11 or JDK 17, and the video uses JDK 17.
How does the video demonstrate the installation of Logstash?
-The video demonstrates the installation of Logstash by using the yum repository, downloading the public signing key, creating a repository file, and then executing the 'yum install logstash' command.
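For reference, a minimal sketch of those steps on an RPM-based distro, following the Elastic documentation (the repo definition below targets the 8.x package line; match it to your Elasticsearch version):

```bash
# Import the Elastic public signing key
sudo rpm --import https://artifacts.elastic.co/GPG-KEY-elasticsearch

# Create the repository file
sudo tee /etc/yum.repos.d/logstash.repo <<'EOF'
[logstash-8.x]
name=Elastic repository for 8.x packages
baseurl=https://artifacts.elastic.co/packages/8.x/yum
gpgcheck=1
gpgkey=https://artifacts.elastic.co/GPG-KEY-elasticsearch
enabled=1
autorefresh=1
type=rpm-md
EOF

# Install Logstash
sudo yum install logstash
```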
What port is used for syslog in the video?
-The video uses port 5144. The standard syslog port is 514, but ports below 1024 are privileged and would require running Logstash with elevated permissions.
How does the video handle the firewall configuration for sending syslogs?
-The video shows how to log in to the firewall, configure syslog settings, enable the syslog server, set the IP address of the syslog server to the Logstash server, and specify the port to 5144.
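A sketch of the corresponding FortiGate CLI session, assuming the Logstash server sits at 192.168.1.25 (a placeholder address; substitute your own):

```
config log syslogd setting
    set status enable
    set server "192.168.1.25"   # IP of the Logstash host (placeholder)
    set port 5144               # non-privileged port Logstash listens on
    set format default          # csv, cef, and rfc5424 are also available
end
```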
What are the three main components of Logstash configuration files?
-The three main components of Logstash configuration files are the input plugin, filter plugin, and output plugin.
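Every Logstash pipeline file follows the same skeleton, with events flowing top to bottom through the three sections:

```
input {
  # where events come from (udp, file, beats, ...)
}
filter {
  # per-event transformations (grok, mutate, kv, date, ...)
}
output {
  # where processed events go (elasticsearch, stdout, ...)
}
```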
What is the purpose of the 'grok' filter plugin used in the video?
-The 'grok' filter plugin is used to parse arbitrary text and structure it, allowing the user to define patterns for extracting data from logs.
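Based on the pattern described in the video, the grok stanza looks roughly like this; it strips the leading syslog priority (e.g. `<190>`) and keeps the remainder as the message (the `syslog_pri` field name is illustrative):

```
filter {
  grok {
    # "<190>date=..." -> capture the priority, keep the rest as message
    match => { "message" => "<%{NONNEGINT:syslog_pri}>%{GREEDYDATA:message}" }
    overwrite => ["message"]   # replace the original message with the captured part
  }
}
```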
How does the video address the issue of viewing the data before applying filters?
-The video suggests sending the output to the console (stdout) initially to view the data before applying any filters for further processing.
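In practice that means a temporary stdout output and a foreground test run; a sketch, assuming the default conf.d layout:

```
output {
  stdout { codec => rubydebug }   # pretty-print each event to the console
}
```

```bash
# -f points at the pipeline file; -r reloads it automatically on change
sudo /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/fortigate.conf -r
```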
What is the significance of setting the correct permissions for the certificate in the video?
-Setting the correct permissions for the certificate is crucial for the Logstash service to run without issues, as it ensures that the service can access and use the certificate for secure communication with Elasticsearch.
How does the video demonstrate troubleshooting steps for Logstash?
-The video demonstrates troubleshooting by showing the process of giving the correct permissions to the certificate file, changing the file owner to 'logstash', and then successfully restarting the Logstash service.
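Those troubleshooting steps amount to something like the following, assuming the CA certificate was copied to /etc/logstash/ (the exact destination path is not shown clearly in the video):

```bash
# Let members of the logstash group read the certificate
sudo chmod 660 /etc/logstash/http_ca.crt
sudo chown root:logstash /etc/logstash/http_ca.crt
sudo systemctl restart logstash
```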
What is the final step shown in the video for confirming that data is being sent to Elasticsearch?
-The final step shown in the video is logging into Kibana, navigating to the 'Discover' section, and confirming that the data from the firewall is being displayed in real-time.
Outlines
🔧 Setting Up Logstash for Firewall Syslogs
The video begins with an introduction to the task of installing Logstash and configuring it to receive syslogs from a firewall and send them to Elasticsearch. This is a continuation of a previous tutorial where Elasticsearch and Kibana were set up. The speaker provides a brief overview of their setup and mentions the need for Java 11 or 17 for Logstash. They demonstrate the installation process using the YUM repository and public signing key, and proceed to configure the firewall to send syslogs to the Logstash server on port 5144. The video also covers the basic components of a Logstash configuration file, which includes input, filter, and output plugins.
📡 Configuring Logstash Input and Testing with Stdout
In this section, the speaker delves into configuring the Logstash input plugin, specifically using the UDP input plugin to receive syslog messages from the firewall. They explain the importance of setting the correct host and port, and then proceed to test the setup by directing the output to the standard output (stdout) for initial parsing. The speaker also discusses the necessity of opening the specified port in the firewall settings and demonstrates how to start Logstash with the specified configuration file. The video shows the raw syslog messages being received, setting the stage for further filtering and processing.
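A sketch of that input stanza and the matching firewalld commands (the host address is a placeholder for the Logstash server's own IP):

```
input {
  udp {
    host => "192.168.1.25"   # address Logstash listens on (placeholder)
    port => 5144
  }
}
```

```bash
# Open the UDP port and verify the zone configuration
sudo firewall-cmd --add-port=5144/udp --permanent
sudo firewall-cmd --reload
sudo firewall-cmd --list-all
```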
🌐 Filtering and Parsing Firewall Logs with Logstash
The speaker continues by adding filters to the Logstash configuration. They start with the grok filter to parse the syslog message into a structured format, using a specific pattern to extract the syslog priority and message. The video illustrates how to use the grok debugger and apply the filter to the incoming logs. Subsequently, the mutate filter is introduced to clean up the log data by removing unnecessary fields. The speaker also explains how to use the kv filter to split the log message into individual fields and the subsequent steps to refine the log data further, including removing the original message and adding a new 'log date' field for better timestamp management.
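A sketch of that filter chain; the `date`, `time`, and `log_date` field names follow the video's narration, and FortiGate logs are space-separated key=value pairs:

```
filter {
  kv {
    field_split => " "            # split "key1=v1 key2=v2 ..." into fields
  }
  mutate {
    remove_field => ["message"]   # raw message is redundant once kv has parsed it
    # combine the firewall's own date and time fields for the date filter
    add_field => { "log_date" => "%{date} %{time}" }
  }
}
```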
⏰ Timestamp Conversion and Finalizing Logstash Filters
This part of the video focuses on converting the 'log date' field into a proper timestamp format that Elasticsearch can use. The speaker describes the process of using the date filter plugin to parse the date and time into a structured format. They also discuss the importance of removing redundant fields like 'log date', 'date', and 'time' once the timestamp has been created. Additionally, the speaker converts the 'received bytes' and 'send bytes' fields to integer types to facilitate mathematical operations in Elasticsearch. The video concludes this section by showing a cleaner log data structure ready for output to Elasticsearch.
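A sketch of the date conversion and type casts; the timestamp pattern follows the narration, the timezone is an example, and the byte-field names (`rcvdbyte`/`sentbyte` in FortiGate logs) may differ in your setup:

```
filter {
  date {
    match    => ["log_date", "yyyy-MM-dd HH:mm:ss"]
    timezone => "America/New_York"   # example; set your own
    target   => "@timestamp"
  }
  mutate {
    remove_field => ["log_date", "date", "time"]
    convert => {
      "rcvdbyte" => "integer"   # enables sums/averages in Elasticsearch
      "sentbyte" => "integer"
    }
  }
}
```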
🔗 Sending Parsed Logs to Elasticsearch and Verifying with Kibana
The final segment of the video script covers configuring the Logstash output plugin to send the parsed log data to Elasticsearch. The speaker details the necessary Elasticsearch output plugin settings, including specifying the hosts, index name, user authentication, and enabling SSL with a CA certificate. They demonstrate how to copy the certificate from the Elasticsearch node and set the appropriate file permissions for Logstash to access it. After making these configurations, the speaker shows how to start Logstash as a service and verify that the logs are being ingested into Elasticsearch using Kibana. They also troubleshoot a permissions issue with the certificate and demonstrate the final, successful log data stream into Elasticsearch, which can now be visualized in Kibana.
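A sketch of that output stanza; the host, index name, credentials, and certificate path are all placeholders, and the SSL option names vary between Logstash versions (`ssl`/`cacert` in older releases, `ssl_enabled`/`ssl_certificate_authorities` in newer 8.x ones):

```
output {
  elasticsearch {
    hosts    => ["https://192.168.1.20:9200"]   # Elasticsearch node (placeholder)
    index    => "firewall-%{+yyyy.MM.dd}"       # daily index; the video appends the date
    user     => "elastic"
    password => "changeme"                      # placeholder credentials
    ssl      => true
    cacert   => "/etc/logstash/http_ca.crt"     # CA cert copied from the ES node
  }
  # stdout { codec => rubydebug }               # drop this once data is verified
}
```

The certificate itself is copied off the Elasticsearch node first, e.g.:

```bash
# ES 8.x keeps its CA here by default; the destination path is an assumption
scp /etc/elasticsearch/certs/http_ca.crt user@logstash-host:/etc/logstash/
```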
Keywords
💡Logstash
💡Elasticsearch
💡Kibana
💡Syslog
💡FortiGate
💡Java
💡YUM Repository
💡Configuration File
💡Grok Filter
💡Mutate Filter
💡Date Filter
💡Elasticsearch Output Plugin
Highlights
Introduction to installing Logstash for sending firewall syslogs to Elasticsearch.
Part two of a previous video series on setting up Elasticsearch and Kibana.
Requirement of Java 11 or Java 17 for Logstash installation.
Installation of OpenJDK (java-1.8.0-openjdk) on Oracle Linux.
Downloading and installing Logstash using the YUM repository.
Configuring the FortiGate syslog settings to send logs to Logstash.
Explanation of Logstash components: input, filter, and output plugins.
Setting up Logstash to receive UDP messages from the FortiGate on port 5144.
Initial testing of Logstash configuration with stdout output.
Using the grok filter to parse syslog messages.
Utilizing mutate filter to clean and remove unnecessary fields from logs.
Introduction of the kv filter for parsing key-value pairs in log messages.
Creating a timestamp field from log date and time for Elasticsearch indexing.
Converting received and sent bytes to integer for data analysis.
Configuring the Elasticsearch output plugin for near real-time search and analytics.
Securing Elasticsearch connection with SSL and CA certificate.
Verification of data ingestion in Kibana's Discover feature.
Troubleshooting permissions for the CA certificate for Logstash service.
Final confirmation of successful data streaming from Fortigate to Elasticsearch.
Transcripts
Hello everyone, welcome to my channel. In this video I'm going to be installing Logstash and configuring it so that I can send the syslogs of my firewall to Elasticsearch. This is part two of another video where I installed Elasticsearch and Kibana and configured them; I will post the link to that video in the description. Now we need to send the data to Elasticsearch from my firewall, so I'll show you my setup first.
So last time I installed Elasticsearch and Kibana and got them to talk to each other, and now we need to send FortiGate syslogs to Logstash, do some filtering here, and then send them to Elasticsearch to be indexed. Right here on their website you can look at the Logstash introduction: it's basically an open-source data collection engine, and I'm using Logstash so that I'm able to do things like enriching the data with, for example, GeoIP locations. I'm going to be doing that later on.
If we go to Getting Started with Logstash, it tells us we need Java, and for that you need Java 11 or Java 17 (using JDK 17). So I want OpenJDK; to install it, on Oracle Linux, I run the command yum install java-1.8.0-openjdk. This is my server here; I already ran this command and installed Java, so it should not do anything right now. Okay, nothing to do.
I'm going to go back to the Logstash documentation here: Installing Logstash. I'm going to be using the yum repository, so I'm downloading and installing this public signing key, just copy-pasting this command, and I will now create a file for this repository called logstash.repo. Oh, I have this from before, so basically I just copy-pasted this information here, and all I have to do is run yum install logstash, and it will start installing right now.
While that installs, I'm going to go to my firewall, log in, and see how to set up a syslog. Open the command prompt from here: config log syslogd. With FortiGates you can set up four syslog servers; I'm just going to use the first one, and I have those settings ready: status enable, server (the IP address of the syslog server, in my case the Logstash server), and the port we are sending to, 5144. If we look at the full configuration here, you can see all the other default values; for example, the format is still the default. You can use other formats, CSV or CEF or RFC 5424, but I'm just going to keep the default format.
Okay, let's see where it's at... and it is finished. We're going to jump right away to the configuration file now. So: nano, let's see, /etc/logstash/conf.d; I have firewall.conf in here from before, so this time I'm going to call it fortigate.conf. With Logstash there are three main components in the configuration files: the input plugin, the filter plugin, and the output plugin. Input is where the data is coming from, filter is where you add filters to basically customize how the data looks, and output sends it wherever you want, in our case to Elasticsearch.
So I'm going to start with the input, and I'm going to open the documentation. Here we can see the input plugins; there are many. File, I've worked with this one, and I'm going to go to the UDP input plugin: "read messages as events over the network via UDP". Of the settings here I'm going to use the host and the port. So it's a udp plugin, and the spacing is very important here. Host is where you put the IP of Logstash, basically the IP that Logstash listens on, and then port. I'm just going to do 5144. 514 is the default syslog port, but if you're going to use any port under 1024, the well-known ports, I think you're going to have to give Logstash admin privileges or something like that, so I'm going to use 5144. And this is basically it for the input plugin.
Now, the filter. Before we do any filters I'm just going to look at the data first, and for that we need to add something to the output. Right now we're not going to send to Elasticsearch; we want to parse it first. So to start testing, I'm going to send the output to the shell, basically to this console right here: stdout. And that's all I have to do. Something very important: because we are listening on this port, 5144, we have to enable it. I'm going to say yes here to save fortigate.conf. I opened this port already; firewall-cmd --list-all shows it, so this is important here as well.
To run Logstash we're going to go here: logstash with -f to specify the file, /etc/logstash/conf.d/fortigate.conf. Okay, it takes a few seconds. Now we're seeing all the data coming from the firewall, not parsed, not anything. See this message here? This is what we're going to be working with. We're going to grab this message and start filtering. Now we're going to work on the filters, and the first filter I'm going to use is the grok filter. You go to the filter plugins and... where's the grok filter... grok filter, a quick description here: "parse arbitrary text and structure it". You can open grok debuggers online; I did this before, so I basically copy-pasted this message, whatever you have from here, for example this whole message. This is what we need to filter, or what we need to find the pattern of. So it's basically in this format: percent, open bracket... and we can see here this number, this is the syslog (RFC 5424) priority, and then %{GREEDYDATA}, and we put that in a field called message. So now we've kind of parsed it a little bit, and now we will work with this message.
Okay. Before I run it, I'm going to run this command with -r to reload the configuration automatically, and I'm going to duplicate this session and bring it here. I'm going to open the configuration file on this side, the fortigate config, and now I'm going to be adding the filters; I'll save the file and it will reload on the other side automatically.
So the first one, we said, was the grok filter. I'm going to copy-paste from... okay, the double spacing is important... I'm going to copy-paste from the document here. So, grok: just follow the documentation for how the format of this filter works. Okay: grok, match, and then message, and then you put the pattern in. So message, then the pattern, and then we're going to overwrite the message; this message right here. I'm going to save and see how that looks. So now it's reloading automatically. We won't be able to see much here, but it's better than before. We still have this event.original and the message; I have to catch it.
Okay, we're just going to continue building on the filters. The next one I'm going to do is mutate. I don't want to see all of these: the original event, the log and version fields, and @timestamp, which is the time the event got ingested into Logstash, not the event time that we actually need, which is here in the event itself. So I'm going to remove those fields; we don't need them. Ctrl+X and save. Let's see how it looks now. A little cleaner: now we see only the message, and it doesn't have that syslog priority value here.
Now we're going to grab this message... or rather we're not going to grab it; we're just going to add another filter that will parse each and every field by itself. For that we're going to use a filter plugin called kv, and we're doing a field split on the space. Between every field, see: a field, then its value, then a space, then another field, then its value, then a space. This is what will help us parse those fields. So Ctrl+X, save, yes, and it will reload. Now look at this: we are seeing the fields parsed, but we're still seeing the main message here. So we're going to remove it, because we've parsed it into individual fields. Okay, we're going to do another mutate filter and remove that message.
Now we're going to do something else. We're going to remove this message, the big chunky message, and then we're going to add a filter... or add a field, sorry. We're going to call this field log_date, and we're going to put in the value of date, space, time, because you see those date and time fields in the first line of this message: date, space, time. In Elasticsearch you need that @timestamp; you need something like this, but generated from the field values themselves, to be able to ingest the events and index them in Elasticsearch. So we're going to do that first (this is a step ahead). I'm going to save that, reload, and now we have those clean fields without that big message.
Okay, and now the last filter plugin is the date one. We added this field in the previous step, we called it log_date, and it has the values of date and time. Now we're matching: we're looking at this log_date and we say convert it to this format, year-month-day and then hours:minutes:seconds. We can set the timezone to convert the time to our timezone, and then the target is @timestamp; we need this field to ingest our data into Elasticsearch.
So let's see what happens now. I want to catch it from the top to be able to see, so I'm going to stop it for a bit and look at that new @timestamp field. This @timestamp field is made from this time field and this date field. Now we don't need date, we don't need time, and definitely not this log_date that we created just to build this timestamp. So one more filter at the end (I'm going to keep this running) that says: remove log_date, date, and time. We're also going to convert received bytes and sent bytes to the integer type, because then we'll be able to do mathematical calculations on them; if it's a string, we can't add a string to a string. Now sent bytes and received bytes are integers, we can see @timestamp, and we don't see time and date anymore. This is much cleaner than before; we are able to send this to Elasticsearch.
To do that, we're going to go to the output plugin now, and we're going to say elasticsearch. I'll show you the output plugin in the documentation. So, elasticsearch: we're sending the data to Elasticsearch. This output plugin "provides near real-time search and analytics for all types of data". There are many, many options and settings here, but what I'm going to use is the following.
Okay, I don't want to copy anything wrong. So: hosts, which is the Elasticsearch host, with https, because we're going to use the CA cert. Then the index: we're going to save the events in an index called firewall. You can keep it as just "firewall", but I added today's date. Then user and password (it's a very secure password, I know), then we're enabling SSL and saying this is the CA cert, and this closes the brackets for elasticsearch. Now, should we comment this stdout out? No, we'll keep it so that we can make sure the data is still flowing. We're going to save... yes. But it will not send right away, because we need to copy the cert from the Elasticsearch node, or from the CA. I'm just going to say yes. So it's still outputting to the console here, but not going to Elasticsearch,
because we need to copy that cert. This is my Elasticsearch node: scp, let's see, the elasticsearch certs folder, http_ca.crt, and now we're sending it to the Logstash server. And... let's see... logstash... oh, I did this before: history, grep scp. I'm just going to copy this... no, maybe the one before, this scp. Okay, "host verification failed"... oh, because I have to remove this key from here somehow. Now copy again. Are you sure? Yes. Password. Now the cert is added, and this reloaded.
It's still going. Let's log in to Kibana with my very secure password, and go here: Management, Stack Management, Index Management, and see if we have a firewall index. And we sure do: those are from yesterday and the day before, and this is the new index. It's reloaded, documents: 46, and the size is increasing here. To look at the data we'll go to Data Views and... oh, I already added the data view. So you can create a data view and put a name here; that's what will match the index. Then we can go to Discover, and we can see the data coming in from the firewall. Let me see when the last time was before I broke it... oh, that was yesterday. I'll go back to the last 15 minutes and let's stream the data live, every 10 seconds, let's say. Apply.
And now we can see our data coming in from the firewall. You can look at each document; in Elastic they call it a document, which is the same as an event, the same as a log. Looking at this document: it is a ping... media player? I don't know... destination IP... So now you can look at your data, visualize the data, build dashboards; I'll do another video with some examples of building a dashboard. And this is basically how you configure Logstash to send the data. I'm going to stop this right here... oh, actually, I stopped it, and now the data will stop streaming in.
I need to do another thing here. I don't want to stream to the shell, so I'm going to comment this out, and I'm going to run Logstash as a service: start logstash. Hopefully this does not break... it shouldn't break... status: the logstash service is running. Now, after I started the logstash service, it stopped logging into Kibana. I fixed it, but it stopped logging, and the reason was that I needed to give permissions on the certificate. I'll show you what permissions I gave it. So this is the certificate I copied over, and I gave it 660: chmod 660 on this file. I also gave the logstash user permission: chown, like this, root:logstash, on the same file. And after that I started the service and it worked again.
So there's a lot of troubleshooting while doing this, but hopefully this video was informative and it helped you configure Logstash. You can use it to ingest data from basically anything. And this is my data running and ingesting, coming into Elasticsearch. Thank you for watching, and I will see you in the next one.