Introduction
A few days back I ran into a simple but painful issue. I use ELK to parse my application logs and build some meaningful views from them. The problem: Logstash was inserting my logs into Elasticsearch with the current timestamp (the time of ingestion) instead of the actual time the log line was generated.
This made it impossible to draw graphs with correct time values in Kibana.
So I dug around and found a way to fix it. A small change in my Logstash configuration replaces Logstash's default timestamp with the actual timestamp from my logs.
Logstash Filter
Add the following snippet to the filter plugin section of your Logstash configuration file. It makes Logstash index logs into Elasticsearch with the actual timestamp from your logs instead of Logstash's own timestamp (the ingestion time).
date {
  locale => "en"
  timezone => "GMT"
  match => [ "timestamp", "yyyy-MM-dd HH:mm:ss +0000" ]
}
In my case, the timezone of my logs was GMT. Replace the pattern "yyyy-MM-dd HH:mm:ss +0000" with the Joda-Time date pattern that matches the actual timestamp format of your logs (note that MM means month and mm means minutes). The date filter also expects the "timestamp" field to already exist on the event, so it must be extracted earlier in the filter chain, for example with grok.
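For context, here is a minimal sketch of a complete filter section that first extracts the timestamp with grok and then parses it with the date filter. The sample log line format, the grok pattern, and the logmessage field name are assumptions for illustration only, not taken from my actual setup; adjust them to whatever your log lines look like.
filter {
  grok {
    # Assumed log line format: "2017-03-01 10:15:30 +0000 Something happened"
    match => { "message" => "^(?<timestamp>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2} \+0000) %{GREEDYDATA:logmessage}$" }
  }
  date {
    locale => "en"
    timezone => "GMT"
    # Overwrites @timestamp (the default target) with the parsed log time
    match => [ "timestamp", "yyyy-MM-dd HH:mm:ss +0000" ]
  }
}
Once @timestamp carries the real log time, the temporary "timestamp" field is no longer needed and can be dropped with a mutate { remove_field => ["timestamp"] } block if you want to keep the indexed documents lean.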