Pulling info from various files

jaluht
Guest
Hi, I'm new to Linux, so please excuse any ignorance.

I have a folder with 562 log files. Each file contains a trace count that increases by 1 up to approximately 22,000. What I need to do is search each file in the folder and pull out the file name and the final trace count. Is there any way to do this without pulling the previous 21,999 trace counts in each file?
 


Rob
Guest
So you need to cat the last line of each one, then print a few parts of it?

Code:
for x in /folder/*.log; do tail -n 1 "$x" | awk '{print $2, $5}'; done

So, if this works ... :)

Breaking it down:

Code:
tail -n 1
will print the last line of a file

Code:
awk '{print $2, $5}'
will take the line and print the 2nd and 5th fields (using space as the separator).
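
For example, on a made-up five-field line, the field picks would look like this:

Code:
echo 'alpha beta gamma delta epsilon' | awk '{print $2, $5}'
# prints: beta epsilon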
 
jaluht
Guest
It's not the last line though, e.g.:

Code:
seg fault
seg fault
trace count 1
...
trace count 22000
seg fault
seg fault
etc
 
Rob
Guest
How about..

Code:
for x in $(grep 'trace count' /folder/*.log|tail -n 1); do cat $x | awk '{print $2, $5}';done
?
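
One thing to watch: when grep is given more than one file, it prefixes each match with its file name, so the command substitution word-splits into fragments rather than file names, and the cat has nothing real to open. The inner pipeline would produce a single line looking something like this (file name and count made up):

Code:
$ grep 'trace count' /folder/*.log | tail -n 1
/folder/file042.log:trace count 21999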
 
Rob
Guest
Or, if the lines are that short and there's no need to awk things out of them...

Code:
grep 'trace count' /folder/*.log| tail -n 1
(maybe)
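
Note that tail -n 1 sees one combined stream here, so this prints a single line for the whole folder (the last match in the last file the glob expands to), not one line per file. Something like (hypothetical):

Code:
/folder/file562.log:trace count 22000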
 
jaluht
Guest
This did nothing? Each file has approximately 160,000 lines.
 
Rob
Guest
My syntax is probably wrong then; try looking into cat, grep, tail, etc.
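
For the record, here is a sketch that should print each file name with its final trace count. It assumes GNU tac and grep (for the -m option), that the folder path is /folder as in the posts above, and that the relevant lines literally read "trace count N". Reading each file backwards means the earlier ~21,999 counts are never scanned:

Code:
# print "<file>: <last trace count line>" for every log in the folder
for f in /folder/*.log; do
    # tac streams the file from the end; grep -m 1 quits at the first match,
    # which is the *last* 'trace count' line in the file
    printf '%s: %s\n' "$f" "$(tac "$f" | grep -m 1 'trace count')"
done

If only the number is wanted, appending | awk '{print $NF}' inside the substitution would trim each line down to the count itself.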
 
