Create a usable log file from the 1and1 log files.
There are several types of log files: access, email,
and ftp. Only the access log file is covered.
The 1and1 log files are located in the logs directory
on the server. There are several access.log.NN.gz files. Each one is
1 week's worth of statistics except the current week, which is in
daily increments. The gz extension means the log is compressed with
gzip (this saves a lot of server space). Before a log can be used it
must be restored (uncompressed) to its original text format; the
per-line layout of that text is defined by the LogFormat entry in the
config file.
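For example, a single downloaded log can be restored locally with the
gzip program, like this (a minimal sketch; the path to gzip.exe and the
log filename are only examples, so adjust them to your setup):
rem decompress one log; -d replaces access.log.29.gz with the plain text access.log.29
d:\dev\awstats\mytools\gzip.exe -d access.log.29.gz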
I rename each weekly log access.log.MM.DD.YY.gz, where MM.DD.YY is the
*last* day covered by the log, i.e. the 7-day period
ENDING on MM.DD.YY. For example, access.log.07.27.08.gz is for
July 21 to July 27.
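To illustrate, if the file downloaded from the server were named
access.log.30.gz (a made-up name for illustration) and it covered that
same week ending Sunday, July 27, 2008, the rename would be:
ren access.log.30.gz access.log.07.27.08.gz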
A summary of all the offline tasks is at the end of the tutorial, which you can refer to once you are set up.
I created a generic config file named awstats.daily.conf in
the cgi-bin directory.
In the LogFile entry I have a path to a generic log called
access.daily.log in the daily directory.
LogFile="LOCALPATH/daily/access.daily.log"
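For reference, a rough sketch of the relevant entries in awstats.daily.conf is shown below; the SiteDomain and DirData values are placeholders (copy whatever your main config already uses), and LogFormat must match the format of your server's log:
# the uncompressed daily log created by the batch file below
LogFile="LOCALPATH/daily/access.daily.log"
# 1 = Apache combined format; use whatever your existing config specifies
LogFormat=1
# placeholder values -- copy these from your main config
SiteDomain="www.example.com"
DirData="LOCALPATH/data"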
To create a log for one day, copy the access log (e.g. access.log.45.6.gz) to the daily directory and run the batch file shown below.
Here's a batch file that will create a daily logfile.
Change LOCALPATH to the awstats directory on your computer. The
batch file can handle spaces in the LOCALPATH directory name.
The batch file:
@echo off
set LOCALPATH=d:\dev\awstats
rem
rem create access.daily.log
rem
cd "%LOCALPATH%\daily"
"%LOCALPATH%\mytools\gzip.exe" -dfvc access.log.*.gz > "%LOCALPATH%\daily\access.daily.log"
To create a log for one week, copy the access log (e.g. access.log.06-01-08.gz) to the weekly directory and run the batch file shown below.
If you want to save the log file, rename it to access.06.01.2008.log and archive it. You may want to compress it (using zip) to save space.
Change the LOCALPATH to fit your environment.
@echo off
set LOCALPATH=d:\dev\awstats
set month=12
set year=2008
rem
rem create access.weekly.log
rem
cd "%LOCALPATH%\weekly"
"%LOCALPATH%\mytools\gzip.exe" -dfvc access.log.*.gz > "%LOCALPATH%\weekly\access.weekly.log"
To create a log for one month, gather up the weekly access logs that
cover the month. For example, June 2008 needs the weekly logs whose
7-day periods fall between June 1 and June 30.
The tool logresolvemerge.pl will combine these logs into
chronological order.
This will get June 1 through June 30th. I name the resulting log
access.month.log.
It doesn't matter if the merged log overlaps into adjacent months,
because only the selected month's database is created; the rest is
ignored.
If you want to save the log file, rename it to
access.06.2008.log and archive it. You may want to compress it
(using zip) to save space.
Change the month, year, and LOCALPATH to fit your environment.
@echo off
set month=11
set year=2008
set LOCALPATH=d:\dev\awstats
rem
del /Q "%LOCALPATH%\monthly\*.*"
rem
rem create combined log from several weekly logs
rem
cd "%LOCALPATH%\merge"
"%LOCALPATH%\mytools\gzip.exe" -dfv access.log.*.gz
perl "%LOCALPATH%\wwwroot\cgi-bin\logresolvemerge.pl" "%LOCALPATH%\merge\access.log.*" > "%LOCALPATH%\monthly\access.month.log"
If you are not using AWStats online you can skip this part.
This technique reads the access log on the server on the fly, so you do not have to create a log file. Since this is resource-intensive, I found it only works for a daily log (i.e. a single file), and if your daily log is large it may not work even on that.
On-the-fly multiple log processing took too many resources and locked out my account for about an hour. Here is an example of what I tried:
LogFile="DOCROOT/awstats/wwwroot/cgi-bin/logresolvemerge.pl DOCROOT/logs/access.log.*.gz |"
I believe the problem is logresolvemerge.pl taking too long to process several logs. For multiple logs I found I had to do the processing offline. If you have access to SSH you may be able to do it on the 1and1 server.
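If you do have SSH access, a rough sketch of the same offline steps run directly on the server might look like this (the scratch directory and the path to logresolvemerge.pl are assumptions; adjust them to your account layout):
# copy the compressed weekly logs into a scratch directory, uncompress them, and merge
mkdir -p ~/merge
cp ~/logs/access.log.*.gz ~/merge
cd ~/merge
gzip -df access.log.*.gz
perl ~/awstats/wwwroot/cgi-bin/logresolvemerge.pl access.log.* > ~/access.month.log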
Reading today's logfile:
LogFile="cat DOCROOT/logs/access.log.??.? |"
This setup reads the only uncompressed log file, which is today's log. cat is a command that writes the specified file to output (like the type command in DOS).
This works well for either one week's or one day's worth of
statistics. Just set the access log filename to the one you want to
view. The example below is for one day, Saturday, July 26, 2008.
When you update, the database for the selected month and year will be
overwritten.
LogFile="gzip -d < DOCROOT/logs/access.log.29.6.gz |"
An example for the week of July 21-27, 2008:
LogFile="gzip -d < DOCROOT/logs/access.log.29.gz |"
This would be an ideal way to go, but I had severe problems with 1and1: the processing time was enormous and the run never finished, so I have had to do this offline.
logresolvemerge.pl is necessary because it will put the access logs in chronological order.
There are a few misleading setup instructions floating around the
Internet. They claim that, in the config file, either
LogFile="gzip -d < DOCROOT/logs/access.log.*.gz |"
OR
LogFile="gunzip -c $(ls -t $HOME/logs/access*.gz | head -1) |"
will decompress the gz files and show them "on the fly". This does NOT work because the files will not be in chronological order. This is why the aforementioned logresolvemerge.pl is used.
Next: Update AWStats Database