ELK: Copy-Paste Quick Installation

Preparation

sudo nano /etc/apt/sources.list.d/elastic-8.x.list

Add the following repository line to it

deb [trusted=yes] https://mirror.yandex.ru/mirrors/elastic/8/ stable main

Importing the keys

sudo curl -s https://artifacts.elastic.co/GPG-KEY-elasticsearch | sudo gpg --no-default-keyring --keyring gnupg-ring:/etc/apt/trusted.gpg.d/elasticsearch-keyring.gpg --import
sudo apt update
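
Optionally, you can make a quick sanity check that apt now sees the package from the new repository:

apt-cache policy elasticsearch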

Installing and configuring Elasticsearch

sudo apt install elasticsearch

At the end of the installation, the password for the built-in "elastic" user will be displayed. Save it.

systemctl daemon-reload
systemctl enable elasticsearch.service
systemctl start elasticsearch.service

Check

curl -k --user elastic:'PASSWORD' https://127.0.0.1:9200
sudo nano /etc/elasticsearch/elasticsearch.yml

We are interested in the network settings here, which control the interface Elasticsearch listens on.
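
A minimal illustration of the kind of lines meant here (the values are an assumption and depend on your setup; leave the defaults if Kibana and Logstash run on the same host):

# hypothetical example values
network.host: 0.0.0.0
http.port: 9200

After editing, restart the service and confirm that port 9200 is listening: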

systemctl restart elasticsearch.service
ss -tulnp | grep 9200

Installing and configuring Kibana

sudo apt install kibana
systemctl daemon-reload
systemctl enable kibana.service
systemctl start kibana.service

We wait a bit (Kibana takes a long time to start) and check

ss -tulnp | grep 5601
sudo nano /etc/kibana/kibana.yml
sudo /usr/share/elasticsearch/bin/elasticsearch-reset-password -u kibana_system

Write down the kibana_system username and the new password that the command displays

sudo nano /etc/kibana/kibana.yml

In the elasticsearch.hosts line, change the address scheme to https

sudo cp -R /etc/elasticsearch/certs /etc/kibana
sudo chown -R root:kibana /etc/kibana/certs
sudo nano /etc/kibana/kibana.yml
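
After these edits, the relevant part of /etc/kibana/kibana.yml should look roughly like this (a sketch: the password is the kibana_system one from the previous step, and server.host set to 0.0.0.0 is an assumption so Kibana is reachable from other machines):

server.host: "0.0.0.0"
elasticsearch.hosts: ["https://localhost:9200"]
elasticsearch.username: "kibana_system"
elasticsearch.password: "PASSWORD"
elasticsearch.ssl.certificateAuthorities: ["/etc/kibana/certs/http_ca.crt"]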
systemctl restart kibana.service

Open a browser and go to the IP address of the server where ELK is installed, port 5601; you will land on the Elastic login page

Enter your elastic login and password, log in, click the blue Add integrations button, and return to the server

Installing and configuring Logstash

sudo apt install logstash
systemctl enable logstash.service
cd /etc/logstash/conf.d && sudo touch input.conf output.conf
sudo nano /etc/logstash/conf.d/input.conf
input {
  beats {
    port => 5044
  }
}
sudo nano /etc/logstash/conf.d/output.conf
output {
  elasticsearch {
    hosts => "https://localhost:9200"
    index => "winsrv-%{+YYYY.MM}"
    user => "elastic"
    password => "PASSWORD"
    cacert => "/etc/logstash/certs/http_ca.crt"
  }
}
sudo cp -R /etc/elasticsearch/certs /etc/logstash
sudo chown -R root:logstash /etc/logstash/certs
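
Optionally, before starting the service, you can test the pipeline files (assuming the default package path /usr/share/logstash; the test exits with an error if the syntax is invalid):

sudo /usr/share/logstash/bin/logstash --config.test_and_exit -f /etc/logstash/conf.d/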
systemctl start logstash.service
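
Logstash also takes a while to start; once it is up, check that the Beats input from input.conf is listening:

ss -tulnp | grep 5044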

Installing and configuring Winlogbeat

Download the Winlogbeat ZIP archive, unzip it, run PowerShell as administrator, and navigate to the winlogbeat directory. The directory should contain folders and files similar to those shown in the screenshot below.

Then we change the script execution policy, install winlogbeat, and, if necessary, unblock the installation script via right-click -> Properties -> Unblock.

> Set-ExecutionPolicy RemoteSigned

> y

> .\install-service-winlogbeat.ps1

As an example, I placed the winlogbeat directory in the root of drive C: and created a folder where the logs will be stored.

Next, open the winlogbeat.yml configuration file, delete everything inside, and paste the following configuration

winlogbeat.event_logs:
  - name: Application
    ignore_older: 72h
  - name: Security
  - name: System
tags: ["winsrv"]
output.logstash:
  hosts: ["192.168.1.80:5044"]
logging.level: info
logging.to_files: true
logging.files:
  path: C:/logs
  name: winlogbeat.log

Attention!!!

In the "hosts" field, enter the IP address of the server where ELK is located. In the "path" field, specify the path where the logs will be stored. Also note that the value in "tags" matches the prefix of the "index" field in output.conf in the logstash directory on the ELK server (winsrv in this example).
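
In this guide the match is just a naming convention, but it becomes useful if several kinds of hosts later ship logs through the same Logstash: the tag lets you route events to different indices. A hypothetical sketch of such an output.conf, not required for this setup:

output {
  if "winsrv" in [tags] {
    elasticsearch {
      hosts => "https://localhost:9200"
      index => "winsrv-%{+YYYY.MM}"
      user => "elastic"
      password => "PASSWORD"
      cacert => "/etc/logstash/certs/http_ca.crt"
    }
  }
}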

After that, we check the configuration

> .\winlogbeat.exe test config -c .\winlogbeat.yml -e

We get at the end: Config OK

Add winlogbeat to your startup list, run it, and check it

> Set-Service winlogbeat -StartupType Automatic

> Start-Service winlogbeat

> Get-Service winlogbeat

Return to the web interface. Click on the three bars at the top left and go to Stack Management -> Data views. Then click on the blue Create data view button

We specify a name, enter the index pattern (matching the index we configured in output.conf, i.e. winsrv-* in this example), and save it.

Then we go to Discover and see that the logs have been sent
