
Debian Package Cacher With apt-cacher

I’m so proud of the worldwide Linux community: they know what people need and what Linux lacks, and they contribute useful packages back to everyone. Thanks to the apt-cacher package, sharing Debian packages on a LAN is easy, seamless, and no longer bandwidth consuming.

I suggest you check the output of the ‘apt-cache show apt-cacher’ command for a brief description of what the package is for.

Before apt-cacher saved my life, I used to move packages from /var/cache/apt/archives into a specific folder, create the index manually, then share the directory across the computers on my network. This of course forced me to edit sources.list on every machine to point to the shared directory. Oh, this was inconvenient: each time I needed a specific package, I had to download it to the cache and reindex, which is very time consuming, then run apt-get update on each computer that needed it.

When apt-cacher is installed on a server, it works like the Squid proxy (apt-cache show squid): it intercepts apt requests, downloads packages or indexes on behalf of the requesting client, and keeps a copy cached on the server for later requests. All a client computer needs to do to make this relationship work is put this in /etc/apt/apt.conf:

Acquire::http::Proxy "http://192.168.1.1:3142";

Yes, that’s all. You don’t need to change sources.list. In this case, 192.168.1.1 is the server.
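If you prefer to keep apt.conf untouched, the same line can live in its own snippet under /etc/apt/apt.conf.d/; the file name below is just a convention of mine:

# /etc/apt/apt.conf.d/01proxy
Acquire::http::Proxy "http://192.168.1.1:3142";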

Now, let’s see how to set up the server:
1) Install the package
sudo apt-get install apt-cacher

2) Configure the service to start automatically. Edit /etc/default/apt-cacher and make sure it contains this line:

AUTOSTART=1
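If you like one-liners, this sed should flip the flag for you (a small sketch; it assumes the default file already ships with an AUTOSTART line to overwrite):

sudo sed -i 's/^AUTOSTART=.*/AUTOSTART=1/' /etc/default/apt-cacher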

3) Optional: tweak the main configuration file /etc/apt-cacher/apt-cacher.conf

# To generate reports daily
generate_reports=1
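While you are in there, you may also want to restrict who can use your cache. The allowed_hosts option does this in the apt-cacher versions I have used; the netmask below assumes your LAN is 192.168.1.0/24, so adjust it:

# Only answer requests from the local network
allowed_hosts=192.168.1.0/24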

4) Start the service and verify that it is running
/etc/init.d/apt-cacher restart

Open http://192.168.1.1:3142 in your browser.
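If you would rather stay on the command line, a quick header request tells you whether the daemon is listening (any HTTP response at all means the port is up):

curl -I http://192.168.1.1:3142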

5) Let apt-cacher cache your already-downloaded packages.

/usr/share/apt-cacher/apt-cacher-import.pl /var/cache/apt/archives
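If you ran that import as root, it is worth checking that the imported files are owned by the user apt-cacher runs as. On my Debian box that is www-data and the cache lives under /var/cache/apt-cacher, but verify both on your system before running this:

sudo chown -R www-data:www-data /var/cache/apt-cacher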

6) Want to see how seamlessly it works?
Watch its access log:
tail -f /var/log/apt-cacher/access.log

On a client computer with the modified apt.conf, run apt-get update or apt-get install. You’ll see new entries in the log, and if your request has already been downloaded, it will show HIT, just like Squid.
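By the way, if you only want a one-off test from a client without touching apt.conf at all, apt accepts the same setting on the command line:

sudo apt-get -o Acquire::http::Proxy="http://192.168.1.1:3142" update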

Happy apt-get!


SquidGuard: An Efficient Web Filter

Long time no see. It’s been a long time since my last post. This time it’s about a Squid utility: filtering porn sites efficiently with SquidGuard. I feel that filtering with Squid’s acl feature is not an efficient way to block sites; SquidGuard uses a database (Berkeley DB, if I’m not mistaken) to make filtering fast.

Please follow the steps below to implement SquidGuard with Squid.

1) Make sure that Squid is already running perfectly, apart from not yet filtering porn sites.

2) Install SquidGuard

apt-get install squidguard

3) What should we define here?
Uncomment (remove the leftmost #) the following lines in the SquidGuard configuration file /etc/squid/squidGuard.conf:

#dest adult {
#       domainlist      adult/domains
#       urllist         adult/urls
#       expressionlist  adult/expressions
#       redirect        http://admin.foo.bar.no/cgi-bin/squidGuard.cgi?clientaddr=%a+clientname=%n+clientident=%i+srcclass=%s+targetclass=%t+url=%u
#}

Those lines define a group of URLs to redirect. Change the redirect target to fit your taste (e.g. redirect http://porn-is-sin.com). Later we’ll see what domainlist, urllist and the rest are.

4) Put the group name into the default acl. Edit the snippet found in the SquidGuard configuration file so it looks like the following lines:

default {
  pass     local !adult none
}

If some sites are falsely judged to be porn sites, we can make exceptions by creating another group, say whitelist, so that we have the following line instead of the previous one:

pass     local whitelist !adult none
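For that line to work, whitelist needs its own dest block and list file, in the same style as the adult group above. A minimal sketch, with the path following the recommended layout from step 6:

dest whitelist {
        domainlist      whitelist/domains
}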

5) Periodically download a free blacklist of URLs (you can use crontab; a rough sketch follows): ftp://ftp.univ-tlse1.fr/pub/reseau/cache/squidguard_contrib/blacklists.tar.gz
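Here is how I would automate that with cron. It assumes the tarball unpacks into a top-level blacklists/ directory and that your lists live under /var/lib/squidguard/db, so check both before trusting it:

#!/bin/sh
# Illustrative weekly job, e.g. saved as /etc/cron.weekly/squidguard-blacklists
cd /tmp || exit 1
wget -q ftp://ftp.univ-tlse1.fr/pub/reseau/cache/squidguard_contrib/blacklists.tar.gz
# Unpack on top of the existing list directories (assumed layout, see step 6)
tar -xzf blacklists.tar.gz -C /var/lib/squidguard/db --strip-components=1
rm -f blacklists.tar.gz
squidGuard -C all                        # rebuild the .db files (see step 7)
chown -R proxy /var/lib/squidguard/db/*
/etc/init.d/squid reload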

6) It’s better to give each automatic update its own group name, e.g. adult, cracks, etc. Please remember that the recommended structure for the redirected URLs and domains looks like this:

/var/lib/squidguard/db
                      /adult
                              /domains
                              /expressions
                              /urls

– domains contains:
pornoabis.com
babeeheaven.com

– urls contains:
nicesites.com/relevance/search/pornhidden
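
– expressions contains regular expressions matched against the whole URL, one per line, e.g. (a made-up pattern):
(porn|xxx)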

7) SquidGuard works efficiently with its database, so you’d better convert the three list files into their db-friendly versions by running these commands:

squidGuard -C all
chown -R proxy /var/lib/squidguard/db/*

When the process is done, you’ll find non-empty files with the .db extension, and you’ll see the following lines (more or less) in /var/log/squid/squidGuard.log:

2007-10-02 11:10:06 [10498] db update done
2007-10-02 11:10:06 [10498] squidGuard stopped (1191298206.833)

8) Test first
You need to test your configuration standalone, outside of Squid, first. For example, if the client that wants to access a site is at 192.168.1.113, you should run:

echo "http://www.pornsite.com 192.168.1.113/ - - GET" | squidGuard -c /etc/squid/squidGuard.conf -d

If it’s redirected, then you’ll find the redirect URL in the output. For example:

2007-10-02 11:32:59 [10574] squidGuard ready for requests (1191299579.991)
http://porn-is-sin.com 192.168.1.111/- - -
2007-10-02 11:32:59 [10574] squidGuard stopped (1191299579.996)

Otherwise, you’ll only get the start and stop lines:

2007-10-02 11:33:22 [10576] squidGuard ready for requests (1191299602.507)
2007-10-02 11:33:22 [10576] squidGuard stopped (1191299602.509)

9) It’s time to fight against crime.
Put the following line at the end of /etc/squid/squid.conf:

redirect_program /usr/bin/squidGuard -c /etc/squid/squidGuard.conf
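If SquidGuard ever becomes a bottleneck, Squid’s redirect_children directive controls how many redirector processes it spawns (Squid’s default is 5; the value below is only an example):

redirect_children 10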

Don’t forget to reload the Squid configuration:

/etc/init.d/squid reload
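To confirm the whole chain, request a blacklisted site through the proxy from a client. The site is the same dummy one from step 8, and 3128 is Squid’s default port, so change it if yours differs:

curl -x http://192.168.1.1:3128 -I http://www.pornsite.com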

Have a nice day! 🙂