Logstash Tutorial: A complete guide for beginners on how to index data from Logstash into Elasticsearch and Kibana

Thursday, December 26, 2019




Logstash is an open-source, server-side data processing pipeline that ingests data from a multitude of sources simultaneously, transforms it, and then sends it to your favorite "stash".


Logstash


1. Overview of Logstash


Developed by - Elastic NV 


2. What is Logstash?

Logstash is a lightweight, open-source, server-side data processing pipeline that lets you collect data from a variety of sources, transform it on the fly, and send it to your desired destination. It is most often used as a data pipeline for Elasticsearch, an open-source analytics and search engine. Because of its tight integration with Elasticsearch, its powerful log processing capabilities, and over 200 pre-built open-source plugins that help you easily index your data, Logstash is a popular choice for loading data into Elasticsearch.

3. Installation of Logstash

For the installation of Logstash, please visit the link below -


4. Download Dataset to import via Logstash into Elasticsearch and Kibana

Please download the sample dataset from the link below -



5. How to run Logstash?

When you start Logstash, you do so with the intention of loading some data, and every dataset has its own structure: it can have any number of columns and data types. If you want Logstash to handle these correctly, you need to map the file name inside the input block of a configuration file.


Config Settings file

input {
  stdin { }
}

filter { }

output {
  elasticsearch { hosts => ["localhost:9200"] }
  stdout { codec => rubydebug }
}

Run Logstash

After downloading Logstash, open a command prompt and go to the Logstash folder, then run:

bin/logstash -f simple.conf   # simple.conf is your configuration file name
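Before running a full import, it can help to validate the configuration file first. Logstash ships a `--config.test_and_exit` flag that parses the config, reports any syntax errors, and exits without starting the pipeline (this assumes you run it from the Logstash install directory with Logstash already installed):

```
# check the config for syntax errors, then exit without starting the pipeline
bin/logstash -f simple.conf --config.test_and_exit

# on Windows, use the batch wrapper instead
bin\logstash.bat -f simple.conf --config.test_and_exit
```

If the config is valid, Logstash prints a "Configuration OK" message; otherwise it points at the offending line.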

Official Website Link for the Configuration 



List all the column names inside the filter block; in the output block, specify where the data will be indexed. Save the file with a .conf extension.

6. Logstash simple.conf


input {
  file {
    path => "/Users/Atique/Desktop/Projects/myblog/Youtube/youtube_logstash/employee.csv"
    start_position => "beginning"
    # "NUL" disables sincedb tracking on Windows; use "/dev/null" on macOS/Linux
    sincedb_path => "NUL"
  }
}
filter {
  csv {
    separator => ","
    columns => [ "Age","Attrition","BusinessTravel","DailyRate","Department",
      "DistanceFromHome","Education","EducationField","EmployeeCount",
      "EmployeeNumber","EnvironmentSatisfaction","Gender","HourlyRate",
      "JobInvolvement","JobLevel","JobRole","JobSatisfaction","MaritalStatus",
      "MonthlyIncome","MonthlyRate","NumCompaniesWorked","Over18",
      "OverTime","PercentSalaryHike","PerformanceRating",
      "RelationshipSatisfaction","StandardHours","StockOptionLevel",
      "TotalWorkingYears","TrainingTimesLastYear","WorkLifeBalance",
      "YearsAtCompany","YearsInCurrentRole","YearsSinceLastPromotion","YearsWithCurrManager" ]
  }
  # cast the numeric columns from string to integer
  mutate {
    convert => {
      "Age" => "integer"
      "DailyRate" => "integer"
      "DistanceFromHome" => "integer"
      "Education" => "integer"
      "EmployeeCount" => "integer"
      "EmployeeNumber" => "integer"
      "EnvironmentSatisfaction" => "integer"
      "HourlyRate" => "integer"
      "JobInvolvement" => "integer"
      "JobLevel" => "integer"
      "JobSatisfaction" => "integer"
      "MonthlyIncome" => "integer"
      "MonthlyRate" => "integer"
      "NumCompaniesWorked" => "integer"
      "PercentSalaryHike" => "integer"
      "PerformanceRating" => "integer"
      "RelationshipSatisfaction" => "integer"
      "StandardHours" => "integer"
      "StockOptionLevel" => "integer"
      "TotalWorkingYears" => "integer"
      "TrainingTimesLastYear" => "integer"
      "WorkLifeBalance" => "integer"
      "YearsAtCompany" => "integer"
      "YearsInCurrentRole" => "integer"
      "YearsSinceLastPromotion" => "integer"
      "YearsWithCurrManager" => "integer"
    }
  }
}
output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "employee"
    # document_type is deprecated; omit it on Elasticsearch 7 and later
    document_type => "employee_details"
  }
  stdout {}
}
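To get an intuition for what the csv filter and the mutate conversions above produce per event, here is a rough stand-in in Python. It is only an illustration, not how Logstash runs internally; the inline sample data is made up, and the column list is trimmed to the first three fields of the config for brevity:

```python
import csv
import io

# Two made-up rows in the same shape as employee.csv. As in the config,
# there is no header: the csv filter assigns column names positionally.
sample = "41,Yes,Travel_Rarely\n49,No,Travel_Frequently\n"

columns = ["Age", "Attrition", "BusinessTravel"]  # first 3 of the 35 columns
integer_fields = {"Age"}  # fields the mutate { convert } block casts

events = []
for row in csv.reader(io.StringIO(sample)):
    event = dict(zip(columns, row))           # csv filter: positional mapping
    for field in integer_fields & event.keys():
        event[field] = int(event[field])      # mutate convert => "integer"
    events.append(event)

print(events[0])  # {'Age': 41, 'Attrition': 'Yes', 'BusinessTravel': 'Travel_Rarely'}
```

Without the conversion step, every value would stay a string, which is why the config casts the numeric columns before Elasticsearch indexes them.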

Project Link -
git clone https://atique1224@bitbucket.org/atique1224/youtube_logstash_tutorial.git

For a more detailed explanation and a hands-on walkthrough, please watch the video linked below -


