Shuqin Li-Kokko - 1 year ago
Bash Question

How to get all log lines containing a certain string from the last few minutes

I have an application that uses its own log file format. Now I want to get all the log lines that contain a certain string, such as "Fatal error", within a certain period of time. The log format looks like the following:

Thread-28689296: Thu Aug 25 15:18:41 2016 [ info ]: abcd efddf
Thread-28689296: Thu Aug 25 15:19:01 2016 [ info ]: xvbdfdre dfdfd
Thread-28689296: Thu Aug 25 15:19:11 2016 [ info ]: Fatal error
Thread-28689296: Thu Aug 25 15:19:41 2016 [ info ]: dfdfdfd

If "now" is Aug 25 15:19:41 2016, I want to find between 15:19:41 and 15:17:41 those lines that have "Fatal error" in my log file. So the current time should be from date and x minutes ago should be from "date x minutes ago" to find certain error messages from the application log.

If I use the following command line:

awk -v Date="$(date "+%b %d %H:%M:%S %Y")" \
    -v Date2="$(date --date="2 minutes ago" "+%b %d %H:%M:%S %Y")" \
    '$5 > Date2 && $5 < Date' log_file | grep "Fatal error"

the variable "$5" in the condition actually gets the value of minute "17" and "19" in my sample log data but it compares with a date value. So this won't work.

How can I construct a complete timestamp from the date and time fields of each log line so that I can compare it with the current time? I'm not so familiar with shell scripting.
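One way to rebuild a comparable timestamp from those fields (a sketch of mine, not from the original post; it assumes GNU date and the sample log format above, and builds a small demo log under the placeholder name log_file) is to concatenate them into a zero-padded key of the form "YYYY MM DD HH:MM:SS" and compare it as a string:

```shell
# Build a small demo log so the sketch is self-contained.
now=$(date "+%b %d %H:%M:%S %Y")
old=$(date --date="10 minutes ago" "+%b %d %H:%M:%S %Y")
printf 'Thread-1: Thu %s [ info ]: old Fatal error\n' "$old" > log_file
printf 'Thread-1: Thu %s [ info ]: Fatal error\n' "$now" >> log_file

# Rebuild fields $3..$6 (month, day, time, year) into a sortable key of the
# form "YYYY MM DD HH:MM:SS" and keep lines between "2 minutes ago" and now.
recent=$(awk -v from="$(date --date='2 minutes ago' '+%Y %m %d %H:%M:%S')" \
             -v to="$(date '+%Y %m %d %H:%M:%S')" '
    BEGIN { split("Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec", m)
            for (i in m) mon[m[i]] = sprintf("%02d", i) }  # Aug -> 08
    { key = $6 " " mon[$3] " " sprintf("%02d", $4) " " $5  # 2016 08 25 15:19:41
      if (key >= from && key <= to) print }
' log_file | grep "Fatal error")
printf '%s\n' "$recent"
```

Because every component of the key is zero-padded, plain string comparison orders the keys chronologically, so this works with any POSIX awk and does not need gawk's mktime().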

Thanks for your advice and help in advance.

Answer Source

Can you try this:


  # Boundary timestamps: "now" and two minutes ago (GNU date)
  Date="$(date "+%b %d %H:%M:%S %Y")"
  Date2="$(date --date="2 minutes ago" "+%b %d %H:%M:%S %Y")"

  # Line number of the first entry stamped two minutes ago
  first_line=$(grep -n "$Date2" log_file | head -n 1 | awk -F ":" '{print $1}')

  # Line number of the last entry stamped with the current time
  last_line=$(grep -n "$Date" log_file | tail -n 1 | awk -F ":" '{print $1}')

  # Print that line range and keep only the error lines
  sed -n "${first_line},${last_line}p" log_file | grep "Fatal error"
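A caveat worth adding (my note, not part of the original answer): grep only finds a boundary line if some log entry carries that exact timestamp, and at one-second resolution it usually will not, leaving first_line or last_line empty and sed with an invalid range. A defensive sketch with fallbacks, again using a self-built demo log under the placeholder name log_file:

```shell
# Demo log (two entries: one minute ago, and now) so the sketch runs standalone.
now=$(date "+%b %d %H:%M:%S %Y")
min1=$(date --date="1 minute ago" "+%b %d %H:%M:%S %Y")
printf 'Thread-1: Thu %s [ info ]: Fatal error\n' "$min1" > log_file
printf 'Thread-1: Thu %s [ info ]: all good\n' "$now" >> log_file

Date="$now"
Date2="$(date --date="2 minutes ago" "+%b %d %H:%M:%S %Y")"

first_line=$(grep -n "$Date2" log_file | head -n 1 | awk -F ":" '{print $1}')
last_line=$(grep -n "$Date" log_file | tail -n 1 | awk -F ":" '{print $1}')

# Fall back to the first line / last line ($) when a boundary timestamp is
# absent from the log, so sed still receives a valid range.
first_line=${first_line:-1}
last_line=${last_line:-\$}

sed -n "${first_line},${last_line}p" log_file | grep "Fatal error"
```

Falling back to the whole file is deliberately conservative: it trades precision for never crashing, which matters if this runs unattended in a monitoring script.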