How to deal with slow HTTP clients?

    February 5, 2007

Every now and then a bunch of really slow HTTP clients decide to suck down pages off my web site. This is bad because when enough of them do this, it dramatically lowers the number of free Apache processes available to handle requests from the rest of the world.
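The pressure on the worker pool is really just Little’s law: average workers busy equals arrival rate times hold time. A quick sketch with made-up numbers (nothing here is measured from my site) shows why a trickle of slow clients hurts far more than a flood of fast ones:

```python
# Back-of-the-envelope sketch of why slow clients exhaust the worker pool.
# All numbers are hypothetical, not measurements.

def workers_consumed(arrival_rate, hold_time_secs):
    """Little's law: average connections in flight = arrival rate * hold time."""
    return arrival_rate * hold_time_secs

# A fast client holds a worker for ~0.2 s; even 10 req/s costs only 2 workers.
fast = workers_consumed(10, 0.2)    # 2.0 workers busy on average

# A client that dribbles the response out over 5 minutes is another story:
# half a request per second pins 150 workers -- a typical Apache prefork
# MaxClients -- leaving nothing for everyone else.
slow = workers_consumed(0.5, 300)   # 150.0 workers busy on average

print(fast, slow)
```

Same math either way; it’s the hold time that kills you.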

I don’t know if it’s some lame DDoS attack or just really slow clients.

In years gone by, I know that lingerd was a solution to this problem. But there doesn’t appear to be much activity around it these days. In fact, the lack of a lingerd package in Debian (there is only an old unofficial package) suggests that there are better methods.

I’ve been using mod_limitipconn to partly deal with the problem, but I need to keep that number high enough that it doesn’t penalize normal browsers. That makes it a sort of half-assed solution.
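For the curious, the relevant knob is mod_limitipconn’s MaxConnPerIP directive; something like the following caps simultaneous connections per client IP (the limit shown is illustrative, not what I actually run):

```apache
<IfModule mod_limitipconn.c>
    <Location />
        # Allow at most this many simultaneous connections from one IP.
        # Too low and browsers doing parallel fetches get 503s;
        # too high and it doesn't stop the slow clients.
        MaxConnPerIP 10
    </Location>
</IfModule>
```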

It occurs to me that I could put Squid in front of Apache, but that seems a little heavyweight. Or maybe my impressions of Squid are skewed.
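If I did go that route, my understanding is that a Squid 2.6-era accelerator setup would look roughly like this, with Squid absorbing the slow readers and Apache moved to a back-end port (hostnames and ports below are placeholders):

```squid
# squid.conf sketch: Squid listens on port 80 as an accelerator
# and buffers responses, so slow clients tie up Squid, not Apache.
http_port 80 accel defaultsite=www.example.com

# Apache relocated to 127.0.0.1:8080 as the origin server.
cache_peer 127.0.0.1 parent 8080 0 no-query originserver
```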

Anyway, I’m looking for ideas or pointers to the obvious thing I’ve missed.



Jeremy Zawodny is the author of the popular Jeremy Zawodny’s blog. Jeremy is part of the Yahoo search team and frequently posts in the Yahoo! Search blog as well.
