On Mon, Oct 24, 2011 at 8:26 AM, Ken Irwin <[log in to unmask]> wrote:

> One that occurs to me to try, and I have no idea if this would match well with actual bot behavior: at the time the form loads, include at hidden field with id=[unixtimestamp]. When the form is submitted, ignore any forms that took less than (10? 15? 20 seconds?) to fill out on the assumption that bots probably do it way faster - or possibly way slower? Do they save them up for later? Should I add an upper bound? Is this just a really dumb idea?

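The timestamp idea above can work, with one caveat: a bot can trivially fake a bare unix timestamp, so it helps to sign it server-side. A minimal sketch (the secret value and the 10-second lower bound are just placeholders to tune):

```python
import hmac, hashlib, time

SECRET = b"change-me"  # server-side secret; placeholder value

def make_token(now=None):
    """Embed this in the hidden field when the form is rendered."""
    ts = str(int(now if now is not None else time.time()))
    sig = hmac.new(SECRET, ts.encode(), hashlib.sha256).hexdigest()
    return ts + ":" + sig

def check_token(token, min_seconds=10, max_seconds=3600, now=None):
    """Reject submissions filled out implausibly fast, implausibly
    slowly (bots that 'save them up for later'), or tampered with."""
    try:
        ts, sig = token.split(":")
    except ValueError:
        return False
    expect = hmac.new(SECRET, ts.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expect):
        return False  # client altered the timestamp
    elapsed = (now if now is not None else time.time()) - int(ts)
    return min_seconds <= elapsed <= max_seconds
```

The upper bound doubles as the answer to the "do they save them up for later?" question: stale tokens fail too.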
Other things that can work: XSRF protection (many bots won't maintain
cookies and sessions), and a text input field, hidden by CSS, that must
remain blank (or keep its initial value unchanged).
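Both of those checks are a few lines server-side. A rough sketch, assuming a dict-like server session and form; the field name "website" is just a hypothetical honeypot name (bots love to fill in anything that looks like a URL field):

```python
import secrets

def issue_csrf_token(session):
    """Store a one-time token in the server-side session and return it
    for embedding in a hidden form field."""
    token = secrets.token_hex(16)
    session["csrf_token"] = token
    return token

def form_passes_bot_checks(session, form):
    """Reject posts that lack the session token (bot kept no cookies)
    or that filled in the CSS-hidden honeypot field."""
    expected = session.pop("csrf_token", None)
    if expected is None or form.get("csrf_token") != expected:
        return False
    if form.get("website", ""):  # honeypot: humans never see this field
        return False
    return True
```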

If spam protection is more important than user-bulletproof-ness, don't
give different feedback in the event that someone fails one of these
"normal humans will never ever fail this" cases. Certainly don't
return a different HTTP status code for a failure.
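In other words, the only thing that should differ is the side effect, not the response. A toy illustration of that shape:

```python
def handle_submission(passed_checks, save_message, render_thanks):
    """Return the identical 'thanks' page and status whether or not the
    bot checks passed; only the saving side effect differs, so a bot
    can't learn which check it tripped by diffing responses."""
    if passed_checks:
        save_message()
    return 200, render_thanks()
```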

All bets are off if your form is being worked over by a human, but
these will foil the vast majority of crawlerbots.