Hi Folks,

Is there anyone out there with Squid knowledge, specifically using Squid as an accelerated caching web server? I am having .. understanding .. issues.

I have set up Squid in the following manner, to act as a caching accelerator in front of our web server. We serve a lot of mashup pages that take a long time to put together but, once constructed, never change. Serving from a cached page should make delivery really quick after the first request.

1. Put Squid in front of the web server and have it answer incoming requests on port 80.
2. If Squid has a copy of a requested page, send the copy.
3. If not, request it from the real origin server (Apache on port 81), send the page, and cache it at the same time.

Conundrum
---------

The basic setup seems to be fine, but I run into issues with the fact that some of my content has varying expiry times. For example, we take XML feeds from a supplier and output content derived from the feeds. However, the feeds come in bursts. This means that for several days there is no change, and the cache should serve the copy it has. Then, when the feed comes, it updates the pages on an every-30-second basis. I still want those pages cached, with 30-second expiry times, as we will serve them many times in that 30-second gap. Then, once the feed stops, the last page should be cached until the feed starts again... at some undetermined time in the future.

Most of what I have read says to use ACLs to control what should and should not be cached, but not much explains how to do the above. Has anyone out there hit this before? Or if there is a better solution than Squid, I am all ears. I have no special attachment to Squid; it was chosen through familiarity with its name as much as any other reason.

Thanks for any help,
Richard
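For reference, the three-step accelerator setup described above can be sketched as a squid.conf fragment. This is a minimal sketch assuming Squid 2.6 or later syntax; the hostname `www.example.com` and the loopback origin address are placeholders for whatever the real site and Apache instance are.

```
# Accelerator (reverse proxy) sketch -- Squid 2.6+ directives.
# Squid answers on port 80; cache misses go to Apache on port 81.
http_port 80 accel defaultsite=www.example.com

# The origin server; "originserver" marks it as the real web server,
# "no-query" disables ICP queries to it.
cache_peer 127.0.0.1 parent 81 0 no-query originserver name=origin

# Only accelerate requests for our own domain.
acl our_site dstdomain www.example.com
http_access allow our_site
http_access deny all
cache_peer_access origin allow our_site
cache_peer_access origin deny all
```

With this in place, step 2 and step 3 (serve from cache, else fetch from Apache and cache) happen automatically for any response Squid considers cacheable.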
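One approach to the varying-expiry conundrum is to solve it at the origin rather than in Squid: have Apache (or the app behind it) emit a `Cache-Control: max-age=...` header that reflects feed activity, since Squid honors per-response freshness headers. The sketch below is a hypothetical helper, not anything from the original setup; the burst window, header values, and function name are all assumptions.

```python
from datetime import datetime, timedelta

# Assumed policy values -- tune to the real feed behavior.
BURST_WINDOW = timedelta(minutes=5)  # feed counts as "active" this long after an update
ACTIVE_MAX_AGE = 30                  # seconds to cache while the feed is bursting
IDLE_MAX_AGE = 86400                 # seconds to cache once the feed goes quiet

def cache_control_header(last_update, now=None):
    """Return a Cache-Control value based on how recently the feed updated.

    During a burst, pages expire every 30 seconds, so Squid refetches at
    most twice a minute while still absorbing the request load in between.
    Once the feed goes quiet, pages are cached for a day at a time; after
    that, Squid can revalidate cheaply (If-Modified-Since / 304).
    """
    now = now or datetime.utcnow()
    if now - last_update <= BURST_WINDOW:
        return "public, max-age=%d" % ACTIVE_MAX_AGE
    return "public, max-age=%d" % IDLE_MAX_AGE
```

The "cached until the feed starts again at some undetermined time" case cannot be expressed exactly in an expiry time, since the future is unknown; the long-max-age-plus-revalidation pattern above is the usual compromise, and a 304 response from Apache costs far less than rebuilding a mashup page.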
_______________________________________________ Discuss mailing list [hidden email] http://lists.blu.org/mailman/listinfo/discuss
BLU is a member of BostonUserGroups.
We also thank MIT for the use of their facilities.