Managing SARG
One of my tasks at my new job is to monitor network usage, including internet access. The workplace uses Webmin-managed authentication on the Squid proxy, i.e. users have to give a username and password before they are allowed to access the internet.
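For context, the Squid side of that boils down to a few directives in squid.conf. Something along these lines -- a rough sketch only, since the helper path and password file here are placeholders, not the actual setup:

# Basic proxy authentication against an NCSA-style password file
auth_param basic program /usr/lib/squid/ncsa_auth /etc/squid/passwd
auth_param basic realm Squid proxy
acl authenticated proxy_auth REQUIRED
http_access allow authenticated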
Lately, though, there have been reports of password-sharing and visits to banned (read: pr0n and warez) sites. Monitoring was limited to user authentication, so in this case it wasn't enough. SARG to the rescue.
I configured SARG to generate daily, weekly and monthly reports. I was surprised, though, at the size of the files it generated. Besides, I had no way of displaying the reports: the proxy server had no web server, let alone a browser, and I had no physical access to the machine -- only SSH.
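The reports themselves are kicked off from cron. The entry below is only illustrative -- the schedule and the sarg path are guesses, and sarg reads output_dir and the rest of its settings from sarg.conf:

# Hypothetical crontab entry: generate the daily SARG report
# every night at 01:00
0 1 * * * /usr/bin/sarg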
SARG was configured to output the daily reports to my home directory. From there, I could view the individual HTML files, but after a while this got cumbersome. So I thought: why not write a script to compress the outputs and have the gzip'd files mailed to me?
Hence, this script:
#!/bin/bash
#
# This script compresses daily reports
# generated by SARG.
# Created 28 July 2004 by iandexter[at]gmail[dot]com
# Program paths
TAR=/bin/tar
GZIP=/bin/gzip
# I have to tar the files in their respective
# directories first so the paths
# from the SARG-generated "index.html" are preserved.
echo "Compressing sarg.daily reports..."
# Read the input dates from the command line. The day range
# gets swapped below if it was given in the wrong order.
first_day=$1
last_day=$2
month=$3
year=$4
# This is specific to my configuration: the output
# directories are in the form,
# ddMMYYYY-ddMMYYYY
if [ $first_day -gt $last_day ]
then
tmp=$first_day
first_day=$last_day
last_day=$tmp
fi
# Run through the dates
for i in `seq $first_day 1 $last_day`
do
tar_file=${i}${month}${year}
dir=${tar_file}-${tar_file}
# Create the tar archive first, then gzip the finished file
$TAR -cf ${tar_file}.tar ${dir}/ && $GZIP -9 ${tar_file}.tar
done
echo "Done."
Okay, I know, I know: it's a kludge. But it happens to work for me. Besides, I want to flex my fingers into coding again, even if it's something this UGLY.
The script's pretty rudimentary. It just takes the first day, last day, month, and year as parameters; then it iterates through the given dates and subsequently tars and gzips the output directories. It doesn't have much functionality -- just a demonstration that I haven't completely forgotten how to write shell scripts. Heh.
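So a run over, say, the first through the 28th of July would look like this (hypothetical dates, and assuming the script was saved as compress_sarg.sh -- that name is mine, not anything official):

# Compress the daily reports for 1-28 July 2004
./compress_sarg.sh 1 28 Jul 2004

Each output directory such as 28Jul2004-28Jul2004 then ends up as a 28Jul2004.tar.gz in the current directory, ready for mailing.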
(And, oh: make sure to chmod u+x the script.)
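As for the "have the gzip'd files mailed to me" half of the idea, a small wrapper along these lines could handle it. This is only a sketch: it assumes mutt is available on the box, and the recipient address is a placeholder.

#!/bin/bash
# Mail each gzip'd SARG archive in the current directory.
# Illustrative only: adjust the glob and the recipient.
for archive in *.tar.gz
do
    echo "SARG daily report attached." | \
        mutt -s "SARG report: ${archive}" -a "${archive}" -- admin@example.com
done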