On Wed, 29 Sep 2004 08:14:42 +0900, Patrick May <patrick / hexane.org> wrote:
> A tarpit would be easier to implement than a captcha.  In the usemod
> settings, use NetAddr::IP to check whether the REMOTE_ADDR from the
> environment falls within a known spammer netblock.  If it does, point
> the pages database at a copy.  Nightly / weekly / whatever, dump the
> latest pages directory on top of the tarpit.
> 
> There goes one of my points for my presentation :-)
> 
> The main resource in fighting spammers is time.  You want to waste
> their time and let them think that things are working.
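
For reference, the check Patrick describes translated into Ruby terms
(usemod itself is Perl and uses NetAddr::IP; the netblocks and paths
below are placeholders I made up, not anyone's real configuration):

require 'ipaddr'

# Stand-in list of "known spammer" netblocks; these are documentation
# ranges, not real spammer addresses.
SPAMMER_NETS = [IPAddr.new("203.0.113.0/24"), IPAddr.new("198.51.100.0/24")]

def spammer?(remote_addr)
  SPAMMER_NETS.any? { |net| net.include?(IPAddr.new(remote_addr)) }
end

# Spammers silently get a throwaway copy of the pages directory; everyone
# else gets the live one.  A cron job would periodically overwrite the
# tarpit copy with the real pages.
pages_dir = if spammer?(ENV.fetch('REMOTE_ADDR', '0.0.0.0'))
              'data/pages-tarpit'
            else
              'data/pages'
            end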

I'm approaching it, again, from a slightly different perspective. My
goal is to make the wiki look like an entirely read-only website to
robots, and to return a 403 to known bad crawlers. I don't yet have IP
banning, but I do have robot exclusion.
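
Very roughly, the crawler handling looks like this (the bad-agent
patterns and the render_* helpers are stand-ins I'm making up here,
not the actual code):

require 'cgi'

# Stand-in list of crawlers that get refused outright.
BAD_AGENTS = [/EmailSiphon/i, /EmailCollector/i, /WebCopier/i]

# Hypothetical helpers; the real page rendering obviously lives elsewhere.
def render_read_only(cgi)   # page HTML with the edit links stripped
  "<html><body>read-only view</body></html>"
end

def render_editable(cgi)    # the normal editable wiki page
  "<html><body>editable view</body></html>"
end

cgi   = CGI.new
agent = cgi.user_agent.to_s

if BAD_AGENTS.any? { |pattern| pattern =~ agent }
  # Known bad crawler: bare 403, nothing else.
  cgi.out("status" => "FORBIDDEN", "type" => "text/plain") { "403 Forbidden" }
elsif agent =~ /bot|crawler|spider/i
  # Ordinary robots see the wiki as a plain read-only site.
  cgi.out("type" => "text/html") { render_read_only(cgi) }
else
  cgi.out("type" => "text/html") { render_editable(cgi) }
end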

-austin
-- 
Austin Ziegler * halostatue / gmail.com
               * Alternate: austin / halostatue.ca
: as of this email, I have [ 6 ] Gmail invitations