{"id":295,"date":"2017-07-24T13:17:00","date_gmt":"2017-07-24T13:17:00","guid":{"rendered":"https:\/\/opstree.com\/blog\/\/2017\/07\/24\/logstash-timestamp\/"},"modified":"2019-09-18T17:00:50","modified_gmt":"2019-09-18T11:30:50","slug":"logstash-timestamp","status":"publish","type":"post","link":"https:\/\/opstree.com\/blog\/2017\/07\/24\/logstash-timestamp\/","title":{"rendered":"Logstash Timestamp"},"content":{"rendered":"<div dir=\"ltr\" style=\"text-align:left;\">\n<div dir=\"ltr\" style=\"text-align:left;\">\n<h3 style=\"line-height:1.2;margin-bottom:4pt;margin-top:16pt;\"><span style=\"background-color:transparent;color:#434343;font-size:14pt;font-style:normal;font-weight:bold;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;\">Introduction<\/span><\/h3>\n<div dir=\"ltr\"><\/div>\n<div dir=\"ltr\" style=\"line-height:1.2;margin-bottom:0;margin-top:0;\"><span style=\"background-color:transparent;color:black;font-size:11pt;font-style:normal;font-weight:400;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;\">A few days back I ran into a simple but painful issue. I am using ELK to parse my application logs and generate some meaningful views. The problem: Logstash was indexing my logs into Elasticsearch with the current timestamp instead of the actual time of log generation.<\/span><\/div>\n<div dir=\"ltr\" style=\"line-height:1.2;margin-bottom:0;margin-top:0;text-align:justify;\"><span style=\"background-color:transparent;color:black;font-size:11pt;font-style:normal;font-weight:400;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;\">This makes it messy to generate graphs with correct time values in Kibana.<\/span><\/div>\n<div dir=\"ltr\" style=\"line-height:1.2;margin-bottom:0;margin-top:0;text-align:justify;\"><\/div>\n<div dir=\"ltr\" style=\"line-height:1.2;margin-bottom:0;margin-top:0;text-align:justify;\"><span style=\"background-color:transparent;color:black;font-size:11pt;font-style:normal;font-weight:400;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;\">So I dug into it and found a fix: a small change in my Logstash configuration that replaces Logstash&#8217;s default timestamp with the actual timestamp of my logs.<\/span><\/div>\n<h3 style=\"line-height:1.2;margin-bottom:4pt;margin-top:16pt;\"><span style=\"background-color:transparent;color:#434343;font-size:14pt;font-style:normal;font-weight:bold;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;\">Logstash Filter<\/span><\/h3>\n<div dir=\"ltr\"><\/div>\n<div dir=\"ltr\" style=\"line-height:1.2;margin-bottom:0;margin-top:0;text-align:justify;\"><span style=\"font-size:11pt;vertical-align:baseline;white-space:pre-wrap;\">Add the following snippet to the filter plugin section of Logstash&#8217;s configuration file. It makes Logstash index logs into Elasticsearch with the actual timestamp of your logs rather than the time at which Logstash processed them (the current timestamp).<\/span><\/div>\n<div dir=\"ltr\" style=\"line-height:1.2;margin-bottom:0;margin-top:0;text-align:justify;\"><span
style=\"font-size:11pt;vertical-align:baseline;white-space:pre-wrap;\">&nbsp;<\/span><\/div>\n<\/div>\n<pre style=\"background-color:#eeeeee;border:1px dashed #999999;color:black;font-size:12px;line-height:14px;overflow:auto;padding:5px;width:100%;\"><code style=\"color:black;word-wrap:normal;\">date {\n  locale =&gt; \"en\"\n  timezone =&gt; \"GMT\"\n  match =&gt; [ \"timestamp\", \"yyyy-MM-dd HH:mm:ss Z\" ]\n}\n<\/code><\/pre>\n<div dir=\"ltr\" style=\"display:inline !important;line-height:1.38;margin-bottom:0;margin-top:0;\">In my case, the timezone of my logs was GMT. Replace the pattern <span style=\"background-color:transparent;color:#1c4587;font-size:11pt;font-style:normal;font-weight:400;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;\">&#8220;yyyy-MM-dd HH:mm:ss Z&#8221;<\/span><span style=\"background-color:transparent;color:black;font-size:11pt;font-style:normal;font-weight:400;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;\"> with the pattern matching the actual timestamp format of your logs. Note that these are Joda-Time date patterns, not regular expressions: uppercase MM is the month while lowercase mm is the minute, and Z matches a numeric timezone offset such as +0000.<\/span><\/div>\n<h3 style=\"line-height:1.38;margin-bottom:2pt;margin-top:12pt;\"><span style=\"background-color:transparent;color:#434343;font-size:14pt;font-style:normal;text-decoration:none;vertical-align:baseline;white-space:pre-wrap;\">Description<\/span><\/h3>\n<div dir=\"ltr\" style=\"line-height:1.38;margin-bottom:0;margin-top:0;\"><span style=\"font-size:11pt;font-weight:bold;vertical-align:baseline;white-space:pre-wrap;\">Date plugin<\/span><span style=\"font-size:11pt;vertical-align:baseline;white-space:pre-wrap;\"> overrides Logstash&#8217;s own timestamp with the timestamp parsed from your logs. Now you can easily adjust the timezone in Kibana and it will show your logs at the correct time.
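<\/span><\/div>\n<div dir=\"ltr\" style=\"line-height:1.38;margin-bottom:0;margin-top:0;\"><span style=\"font-size:11pt;vertical-align:baseline;white-space:pre-wrap;\">For the date filter to work, the event must already carry a timestamp field; usually a grok filter extracts it from the raw log line first. Here is a minimal sketch of a complete filter section, assuming an illustrative log line of the form &#8220;2017-07-24 13:17:00 +0000 Payment processed&#8221; (the log format, grok pattern and field names are assumptions for illustration, not from my setup):<\/span><\/div>\n<pre style=\"background-color:#eeeeee;border:1px dashed #999999;color:black;font-size:12px;line-height:14px;overflow:auto;padding:5px;width:100%;\"><code style=\"color:black;word-wrap:normal;\">filter {\n  grok {\n    # Capture the leading date, time and offset into the \"timestamp\" field\n    match =&gt; { \"message\" =&gt; \"(?&lt;timestamp&gt;%{TIMESTAMP_ISO8601} %{ISO8601_TIMEZONE}) %{GREEDYDATA:log_message}\" }\n  }\n  date {\n    locale =&gt; \"en\"\n    timezone =&gt; \"GMT\"\n    # MM is the month, mm the minute; Z matches an offset such as +0000\n    match =&gt; [ \"timestamp\", \"yyyy-MM-dd HH:mm:ss Z\" ]\n  }\n}\n<\/code><\/pre>\n<div dir=\"ltr\" style=\"line-height:1.38;margin-bottom:0;margin-top:0;\"><span style=\"font-size:11pt;vertical-align:baseline;white-space:pre-wrap;\">By default the date filter writes the parsed value into the @timestamp field, which is what Kibana sorts and plots on; a different destination field can be chosen with its target option.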
<\/span><\/div>\n<div dir=\"ltr\" style=\"line-height:1.38;margin-bottom:0;margin-top:0;\">(Note: Kibana adjusts UTC time to your browser&#8217;s timezone.)<\/div>\n<\/div>\n","protected":false},"excerpt":{"rendered":"<p>Introduction A few days back I ran into a simple but painful issue. I am using ELK to parse my application logs and generate some meaningful views. The problem: Logstash was indexing my logs into Elasticsearch with the current timestamp instead of the actual time of log generation. This &hellip; <a href=\"https:\/\/opstree.com\/blog\/2017\/07\/24\/logstash-timestamp\/\" class=\"more-link\">Continue reading<span class=\"screen-reader-text\"> &#8220;Logstash Timestamp&#8221;<\/span><\/a><\/p>\n","protected":false},"author":171775670,"featured_media":29900,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_coblocks_attr":"","_coblocks_dimensions":"","_coblocks_responsive_height":"","_coblocks_accordion_ie_support":"","jetpack_post_was_ever_published":false,"_jetpack_newsletter_access":"","_jetpack_dont_email_post_to_subs":false,"_jetpack_newsletter_tier_id":0,"_jetpack_memberships_contains_paywalled_content":false,"_jetpack_memberships_contains_paid_content":false,"footnotes":"","jetpack_publicize_message":"","jetpack_publicize_feature_enabled":true,"jetpack_social_post_already_shared":false,"jetpack_social_options":{"image_generator_settings":{"template":"highway","enabled":false},"version":2}},"categories":[1],"tags":[44070,768739308,676319247,52970,12657,59657565],"jetpack_publicize_connections":[],"jetpack_featured_media_url":"https:\/\/opstree.com\/blog\/wp-content\/uploads\/2025\/11\/DevSecOps-1.jpg","jetpack_likes_enabled":true,"jetpack_sharing_enabled":true,"jetpack_shortlink":"https:\/\/wp.me\/pfDBOm-4L","jetpack-related-posts":[],"_links":{"self":[{"href":"https:\/\/opstree.com\/blog\/wp-json\/wp\/v2\/posts\/295"}],"collection":[{"href":"https:\/\/opstree.com
\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/opstree.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/opstree.com\/blog\/wp-json\/wp\/v2\/users\/171775670"}],"replies":[{"embeddable":true,"href":"https:\/\/opstree.com\/blog\/wp-json\/wp\/v2\/comments?post=295"}],"version-history":[{"count":4,"href":"https:\/\/opstree.com\/blog\/wp-json\/wp\/v2\/posts\/295\/revisions"}],"predecessor-version":[{"id":1097,"href":"https:\/\/opstree.com\/blog\/wp-json\/wp\/v2\/posts\/295\/revisions\/1097"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/opstree.com\/blog\/wp-json\/wp\/v2\/media\/29900"}],"wp:attachment":[{"href":"https:\/\/opstree.com\/blog\/wp-json\/wp\/v2\/media?parent=295"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/opstree.com\/blog\/wp-json\/wp\/v2\/categories?post=295"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/opstree.com\/blog\/wp-json\/wp\/v2\/tags?post=295"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}