loghack - process and query Apache logs
Regenerate the skiplist for a given chunk.
Parse a raw logfile and split it into hourly chunks.
loghack prep servername/logfile.gz
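The hourly split that prep performs can be pictured with standard tools. This is only an illustrative sketch, not loghack's actual implementation; the sample log lines and /tmp paths are made up.

```shell
# Illustration: bucket common-log-format lines into per-hour files keyed on
# the hour field of the timestamp (loghack's real chunk layout may differ).
printf '%s\n' \
  '1.2.3.4 - - [01/Oct/2007:00:02:11 +0000] "GET / HTTP/1.1" 200 512' \
  '1.2.3.4 - - [01/Oct/2007:01:15:40 +0000] "GET /a HTTP/1.1" 200 99' \
  > /tmp/access.log
# $4 is "[01/Oct/2007:00:02:11"; characters 14-15 are the hour.
awk '{h = substr($4, 14, 2); print > ("/tmp/chunk-" h ".log")}' /tmp/access.log
```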
loghack confirm *
List files in the repository.
loghack list 2008-01-01 thru 2008-01-31 in *
Assemble reports into daily chunks (in the .compiled/ directory).
loghack compile 2007-10-01
Build aggregate reports.
loghack aggregate month $start_date
loghack aggregate week $start_date
loghack tabulate daily 2007-10-01 thru 2007-10-31
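Aggregation is essentially summing per-day tallies into larger buckets (weeks, months). A toy sketch with invented daily counts, not loghack's actual file format:

```shell
# Hypothetical per-day hit counts, one "date count" pair per line.
printf '2007-10-01 10\n2007-10-02 15\n' > /tmp/daily.txt
# Roll the daily counts up into a single total for the period.
awk '{sum += $2} END {print sum}' /tmp/daily.txt
```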
Crunch the prepared data and generate a report for the given chunk(s).
loghack report $server/$chunk.tar.gz
Experimental: count/report unique visitors within a chunk.
Experimental: count/report unique visitors within a day.
Experimental: count/report unique visitors within a month.
Experimental: count/report unique visitors within a month (alternate, memory-hungry algorithm).
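The core of any unique-visitor count is deduplicating client addresses. A minimal sketch with standard tools (not loghack's actual algorithm; the sample chunk is invented):

```shell
# Count distinct client IPs (field 1 of the common log format) in a chunk.
printf '%s\n' \
  '1.2.3.4 - - [01/Oct/2007:00:02:11 +0000] "GET / HTTP/1.1" 200 512' \
  '5.6.7.8 - - [01/Oct/2007:00:03:20 +0000] "GET / HTTP/1.1" 200 512' \
  '1.2.3.4 - - [01/Oct/2007:00:04:02 +0000] "GET /b HTTP/1.1" 200 99' \
  > /tmp/chunk.log
awk '{print $1}' /tmp/chunk.log | sort -u | wc -l
```

The trade-off hinted at by the "memory-hungry" variant is the usual one: an in-memory set (e.g. an awk associative array) is faster but holds every distinct address at once, while sort-based deduplication can spill to disk.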
Create hardlinks with dated names.
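A dated hardlink is plain ln: the same file gains a second, dated name without copying any data. A hypothetical sketch (the file names are illustrative, not loghack's naming scheme):

```shell
# Give an existing report a second, dated name sharing the same inode.
touch /tmp/report.txt
ln -f /tmp/report.txt /tmp/report-2007-10-01.txt
# Both directory entries now refer to one file on disk.
ls -li /tmp/report.txt /tmp/report-2007-10-01.txt
```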
Run the prep, report, compile, and aggregate actions (nice for automatic daily imports).
loghack import $file1 $file2 ...
Count the records in a given chunk (accounting for the skiplist).
Dump the records in a given chunk (accounting for the skiplist).
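One way to picture "accounting for the skiplist": treat the skiplist as a set of record positions to omit when dumping or counting. This is an illustration with made-up files and a made-up skiplist format, not loghack's actual on-disk layout:

```shell
# A chunk of four records, and a skiplist naming line numbers 2 and 4.
printf 'rec1\nrec2\nrec3\nrec4\n' > /tmp/chunk.txt
printf '2\n4\n'                   > /tmp/chunk.skip
# Dump: first pass loads the skiplist, second pass emits unskipped records.
awk 'NR==FNR {skip[$1]; next} !(FNR in skip)' /tmp/chunk.skip /tmp/chunk.txt
# Count: the same filter piped to wc -l.
awk 'NR==FNR {skip[$1]; next} !(FNR in skip)' /tmp/chunk.skip /tmp/chunk.txt | wc -l
```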
Print a date for the first line in a raw logfile.
date=$(loghack date logfile.gz)
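What the date action extracts can be approximated with standard tools: decompress, take the first line, pull out the date portion of the timestamp. A sketch only (the sample log line is invented, and loghack's exact output format may differ):

```shell
# A one-line gzipped Apache log to stand in for logfile.gz.
printf '1.2.3.4 - - [01/Oct/2007:00:02:11 +0000] "GET / HTTP/1.1" 200 512\n' \
  | gzip > /tmp/logfile.gz
# Date portion of the first line's timestamp field.
gzip -dc /tmp/logfile.gz | head -n 1 \
  | sed -E 's#.*\[([0-9]{2}/[A-Za-z]{3}/[0-9]{4}).*#\1#'
```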
To install ApacheLog::Parser, copy and paste the appropriate command into your terminal.
cpanm
cpanm ApacheLog::Parser
CPAN shell
perl -MCPAN -e shell
install ApacheLog::Parser
For more information on module installation, please visit the detailed CPAN module installation guide.