log4j2 log extraction by identifier (multiline messages)

This script extracts the log4j2 log entries belonging to a specific thread, identified by a unique identifier.

The log4j2 configuration uses the following pattern:

 <pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} %-5level (%26t) [%C{1}] %msg%n</pattern>

%26t indicates that each log line will contain the thread name padded to a fixed length of 26 characters. To get a unique identifier into the log, you can set the thread name in your Java code as shown:

// get the current thread and give it a unique name
Thread t = Thread.currentThread();
t.setName("Some unique identifier");

However, %msg can span multiple lines, and in that case the continuation lines do not contain the unique identifier,
so simply grepping for the identifier is not enough to extract the entire log.
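A minimal sketch of the problem (the file demo.log and its two sample lines are assumptions for illustration):

```shell
# Build a tiny two-line log: an entry line followed by a continuation line.
printf '%s\n' \
  '2015-07-22 15:24:21.319 DEBUG (NODE1_identifier_5-3) [class_name4] start' \
  'some application message 2 line' > demo.log

# grep returns only the line that contains the identifier;
# the continuation line is lost.
grep 'NODE1_identifier_5-3' demo.log   # prints only the first line
```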

Imagine you have the following log and want to extract only the entries belonging to the transaction identified by
NODE1_identifier_5-3:

2015-07-22 15:24:21.317 DEBUG (NODE1_identifier_5-3) [class_name1] some application message...
2015-07-22 15:24:21.317 DEBUG (NODE1_identifier_5-3) [class_name1] some application message...
2015-07-22 15:24:21.317 DEBUG (NODE1_identifier_5-3) [class_name2] some application message...
2015-07-22 15:24:21.318 INFO (NODE1_identifier_5-3) [class_name3] some application message...
2015-07-22 15:24:21.318 DEBUG (NODE1_identifier_5-3) [class_name1] some application message...
2015-07-22 15:24:21.318 DEBUG (NODE1_identifier_5-3) [class_name1] some application message...
2015-07-22 15:24:21.329 DEBUG (NODE1_identifier_4-1) [class_name1] some application message...
2015-07-22 15:24:21.319 DEBUG (NODE1_identifier_5-3) [class_name1] some application message...
2015-07-22 15:24:21.329 DEBUG (NODE1_identifier_4-1) [class_name1] some application message...
2015-07-22 15:24:21.319 DEBUG (NODE1_identifier_5-3) [class_name4] some application message...
2015-07-22 15:24:21.319 DEBUG (NODE1_identifier_5-3) [class_name4] some application message...
2015-07-22 15:24:21.319 DEBUG (NODE1_identifier_5-3) [class_name4] some application message...
some application message 2 line..................
some application message 3 line..................
some application message 4 line..................
2015-07-22 15:24:21.329 DEBUG (NODE1_identifier_4-1) [class_name1] some application message...
2015-07-22 15:24:21.320 DEBUG (NODE1_identifier_5-3) [class_name5] some application message...
2015-07-22 15:24:21.320 DEBUG (NODE1_identifier_5-3) [class_name6] some application message...
2015-07-22 15:24:21.320 INFO (NODE1_identifier_5-3) [class_name7] some application message...
2015-07-22 15:24:21.320 DEBUG (NODE1_identifier_5-3) [class_name1] some application message...
2015-07-22 15:24:21.321 DEBUG (NODE1_identifier_5-3) [class_name1] some application message...
2015-07-22 15:24:21.329 DEBUG (NODE1_identifier_4-1) [class_name1] some application message...

The following bash script does the job 🙂

#!/bin/bash
found=0
log_path=/home/log/
if [[ $# -ne 2 ]] ; then
 echo ""
 echo "USAGE: ./filter_log.sh <log_file_name> <SEARCH_PATTERN>"
 echo ""
 exit 1
fi
search_pattern=$2
filename_log="$log_path$1"
filtered_log="${log_path}filtered_$1"
echo "log file: $filename_log"
echo "log filtered output is: $filtered_log"
echo "search pattern: $search_pattern"
: > "$filtered_log"
while IFS= read -r line
do
 # does this line contain the identifier?
 if echo "$line" | grep -q "$search_pattern"
 then
  found=1
  echo "$line" >> "$filtered_log"
 elif [ "$found" -gt 0 ]
 then
  # no identifier: either a continuation line of our entry,
  # or the start of another thread's entry (it carries a log level)
  if echo "$line" | grep -q ' TRACE \| DEBUG \| INFO \| WARN \| ERROR '
  then
   found=0
  else
   echo "$line" >> "$filtered_log"
  fi
 fi
# the awk stage trims the input to the span between the first and
# the last occurrence of the search pattern
done < <(awk -v p="$search_pattern" '$0 ~ p && c++ < 1 {a=$0;next} $0 ~ p && c>0 {a=a"\n"$0;f=a;next} c>0 {a=a"\n"$0;next} END {print f}' "$filename_log")
echo "log filtered output is: $filtered_log"
exit 0
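As an alternative, the same multiline extraction can be done in a single awk pass: every line carrying a log level starts a new entry, so we toggle a flag on those lines and print everything while the flag is set. This is only a minimal sketch; sample.log and its contents are assumptions for the demonstration.

```shell
#!/bin/bash
# Build a small sample log with entries from two threads and a
# multiline message belonging to our identifier.
cat > sample.log <<'EOF'
2015-07-22 15:24:21.317 DEBUG (NODE1_identifier_5-3) [class_name1] first message
2015-07-22 15:24:21.329 DEBUG (NODE1_identifier_4-1) [class_name1] other thread
2015-07-22 15:24:21.319 DEBUG (NODE1_identifier_5-3) [class_name4] multiline start
some application message 2 line
some application message 3 line
2015-07-22 15:24:21.329 DEBUG (NODE1_identifier_4-1) [class_name1] other thread
EOF

awk -v id='NODE1_identifier_5-3' '
  / TRACE | DEBUG | INFO | WARN | ERROR / {   # a new log entry starts here
    keep = ($0 ~ id)                          # keep it only if it names our thread
  }
  keep                                        # print entry lines and continuations
' sample.log
```

This prints the two entries of NODE1_identifier_5-3 together with the continuation lines of the multiline message (four lines in total), while the entries of NODE1_identifier_4-1 are skipped.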
