This is definitely possible.
Incoming logs are typically tagged with the current time (in UTC) of Nagios Log Server, and are displayed in the GUI according to that tagged time. The exception is when a date is successfully parsed out of the log itself by Logstash's date filter. Do some reading here to see what I mean:
http://www.logstash.net/docs/1.4.3/filters/date
The date filter is especially important for sorting events and for backfilling old data. If you don't get the date correct in your event, then searching for it later will likely return results out of order.
In the absence of this filter, logstash will choose a timestamp based on the first time it sees the event (at input time), if the timestamp is not already set in the event. For example, with file input, the timestamp is set to the time of each read.
You will need to set up the 'date' filter to parse your incoming logs. To do so, your input/filter chain might look something like this (input is first, filter is second):
Code:
syslog {
    type => 'nagiosincominglogs'
    port => 8999
}

if [type] == "nagiosincominglogs" {
    date {
        match => [ "logdate", "MMM dd yyyy HH:mm:ss",
                   "MMM d yyyy HH:mm:ss", "ISO8601" ]
    }
}
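To illustrate (this sample value is made up), an event whose 'logdate' field contains something like:

Code:
logdate => "Jun 04 2015 11:09:47"

would match the first pattern in the list, and the date filter would set the event's @timestamp from that value rather than from the time of ingestion.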
For the above to work, the field holding your log's timestamp must be called 'logdate'.
Your field is very likely going to be named something different. Take a look at some logs that have already been imported, and check which field contains the 'timestamp' of your logs.
[Attachment: screenshot of the Nagios Log Server dashboard showing the fields of an imported log]
Assuming that your timestamp field is called 'timestamp' as mine is, we simply change 'logdate' to 'timestamp' in the filter above. If you do not have a timestamp field at all, you will need to create another filter - likely a grok filter - to parse that field out of the raw message first. Let me know if you need further assistance with this.
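If you do end up needing a grok filter, something along these lines could serve as a starting point. This is only a sketch: it assumes your raw message begins with a syslog-style timestamp, and the 'logdate' and 'log_message' field names are examples to adapt (SYSLOGTIMESTAMP and GREEDYDATA are standard grok patterns).

Code:
if [type] == "nagiosincominglogs" {
    grok {
        # Extract the leading timestamp into 'logdate'; adjust the pattern
        # to match your actual message format.
        match => [ "message", "%{SYSLOGTIMESTAMP:logdate} %{GREEDYDATA:log_message}" ]
    }
}

The date filter shown earlier can then parse the extracted 'logdate' field. Note that a plain syslog timestamp has no year, so you would also adjust the date patterns accordingly (e.g. "MMM dd HH:mm:ss").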