Squid cache
Squid is a popular open source (GPL) proxy server and web cache. It has a variety of uses, from speeding up a web server by caching repeated requests, to caching web, DNS, and other network lookups for a group of people sharing network resources. It is primarily designed to run on Unix-like systems.
Squid has been in development for many years and is considered very complete and robust. It supports many protocols, although it is primarily used for HTTP and FTP. Some support is available for TLS, SSL, and HTTPS [1] (http://www.squid-cache.org/Doc/FAQ/FAQ-1.html#ss1.12).
Web proxy
Caching is a way to store requested Internet objects (i.e., data available via the HTTP, FTP, and Gopher protocols) on a system closer to the requesting site. Web browsers can then use the local Squid cache as a proxy HTTP server, reducing both access time and bandwidth consumption. This is often used by Internet service providers to increase speed for their customers, and by LANs that share an Internet connection. Because Squid is also a proxy (i.e. it acts as a client on behalf of the real client), it provides some anonymity and security.
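As a minimal sketch, a caching proxy for such a LAN needs only a few squid.conf directives; the port, cache directory, and network range below are assumed example values, not defaults that suit every installation:

    # Listen on the usual Squid port for explicitly configured proxy clients
    http_port 3128
    # Store up to 1024 MB of cached objects on disk (assumed path and size)
    cache_dir ufs /var/spool/squid 1024 16 256
    # Allow only clients from the local network (assumed address range)
    acl our_lan src 192.168.0.0/24
    http_access allow our_lan
    http_access deny all

Clients then point their browser's proxy setting at the Squid host on the chosen port.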
A client program (e.g. a browser) either has to explicitly specify the proxy server it wants to use (typical for ISP customers), or it can use a proxy without any extra configuration: "transparent caching", in which case all outgoing HTTP requests are intercepted by Squid and all responses are cached. The latter is typically a corporate set-up (all clients are on the same LAN).
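In a transparent set-up the interception itself is done by the firewall (for example, by redirecting port 80 traffic to Squid's port), while squid.conf tells Squid how to handle the intercepted requests. The following is a sketch of the directives commonly used with Squid 2.5, assuming such a firewall redirect is already in place:

    # Accept intercepted requests and reconstruct the URL from the Host header
    httpd_accel_host virtual
    httpd_accel_port 80
    httpd_accel_with_proxy on
    httpd_accel_uses_host_header on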
Squid has some features that can further help anonymize connections, such as disabling or changing specific header fields in a client's HTTP requests. See the documentation for header_access and header_replace for further details.
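For example, the following squid.conf lines (syntax as in Squid 2.5; the replacement string is an arbitrary example) strip the Referer header and substitute a generic User-Agent in all client requests:

    # Do not forward the Referer header at all
    header_access Referer deny all
    # Deny the original User-Agent; header_replace supplies the value sent instead
    header_access User-Agent deny all
    header_replace User-Agent Mozilla/4.0 (compatible; anonymized)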
Reverse proxy
The above set-up - caching the contents of an unlimited number of web servers for a limited number of clients - is the classical one. Another set-up is 'reverse proxy (http://squid.visolve.com/squid/reverseproxy.htm)' or 'webserver acceleration' (using the httpd_accel_host directive). In this set-up, the cache serves an unlimited number of clients for a limited number of web servers, or just one.
Suppose slow.example.com is a 'real' web server, and www.example.com is a Squid cache server that 'accelerates' it. The first time a page is requested from www.example.com, the cache server fetches it from slow.example.com, but for the next hour, day, or year (depending on cache configuration) every subsequent request is served from the stored copy by the accelerator. The result is less traffic to the source server, which means less CPU usage, less memory usage, and less bandwidth.
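With Squid 2.5, this accelerator set-up corresponds roughly to the following squid.conf directives, using the hostnames from the example above:

    # Listen on the normal HTTP port so clients reach www.example.com directly
    http_port 80
    # Forward cache misses to the real web server
    httpd_accel_host slow.example.com
    httpd_accel_port 80
    httpd_accel_single_host on
    # Act only as an accelerator, not as a general proxy
    httpd_accel_with_proxy off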
It is possible for a single Squid server to serve both as a normal and a reverse proxy simultaneously.
Compatibility
Squid can run on the following operating systems:
- Linux
- FreeBSD
- OpenBSD
- NetBSD
- BSDI
- Mac OS X
- OSF and Digital Unix
- IRIX
- SunOS/Solaris
- NeXTStep
- SCO Unix
- AIX
- HP-UX
Recent versions of Squid will also compile and run on Windows NT with the Cygwin / GnuWin32 packages.
Wikimedia servers use Squid to cache frequently requested pages and reduce load on the main database and web servers.
As of February 2005, the current stable version is 2.5; there is also a 3.0 version in development.
External links
Information
- Squid Cache (http://www.squid-cache.org/) - official project homepage
- Squid + PF (http://www.benzedrine.cx/transquid.html) - Transparent proxying with Squid and PF.
- Logfile Analysis (http://www.squid-cache.org/Scripts/) - Squid-Cache list of logfile analyzers
- ViSolve Squid Support (http://squid.visolve.com/squid/): manual, configuration tips, ...
Add-ons
- Squidguard (http://www.squidguard.org) - A flexible plugin for advanced filtering.
- DansGuardian (http://dansguardian.org) - Smart filtering, can be used together with Squid.
- Calamaris (http://cord.de/tools/squid/calamaris/Welcome.html.en) - Squid logfile report
- Squeezer2 (http://www.rraz.net/squeezer2/) - Squid logfile report
Check cache behaviour
- web-caching.com (http://www.web-caching.com/cgi-web-caching/cacheability.py): check page cacheability
- analyze.forret.com: analyze HTTP headers and compare to Squid policy