Here's a method that's by no means foolproof but is practically zero cost (you may already be using a version of it). Disclaimer -- I have not actually tested this to any extent:
Include a text input field in your form that needs to be blank for the form to validate in the back end. Keep the field hidden with CSS (or z-indexed behind another element, size set to zero, etc). Users will never see it, so their forms will validate; I doubt that most spambots are sophisticated enough to check whether a form field is hidden or obfuscated before filling it in. Then silently reject submissions with that field filled.
I am not sure whether this would cause any problems with tab navigation, screen readers or other assistive technologies, but you may be able to do something to sidestep those issues. On the other hand, CAPTCHA brings its own host of accessibility problems.
One other disadvantage is that this might be hard to implement in a CMS-based form plugin. But if you're coding forms the old-fashioned way, it's worth a shot.
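For what it's worth, the back-end check is tiny. A minimal sketch in Python (the same logic is a one-liner in PHP); the field name "website" is an arbitrary choice for illustration, not anything standard:

```python
def is_spam(form_data):
    """Return True if the hidden honeypot field was filled in.

    "website" is an arbitrary honeypot field name. Real users never see
    the field (it is hidden with CSS), so any non-empty value suggests a
    bot filled out every field it found.
    """
    return bool(form_data.get("website", "").strip())
```

Then the form handler just silently discards (or flags) any submission where `is_spam()` comes back true, before doing anything else with the data.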
>>> "Parker, Anson (adp6j)" <[log in to unmask]> 10/24/2011 9:36 AM >>>
Mollom is pretty decent...
http://mollom.com works with a lot of CMSes.
It is commercial, with 100 free positives per day. It can require a CAPTCHA,
but it tries to avoid one with a crowd-sourced algorithmic approach.
On 10/24/11 9:26 AM, "Ken Irwin" <[log in to unmask]> wrote:
>Some of our online forms (contact, archives request, etc.) have been
>getting a bunch of spam lately. I have heretofore avoided using any of
>those obnoxious Captcha things and would rather not start now. (I
>personally loathe them and they keep getting harder, which tells me that
>the spambots are probably better at them than we are...)
>Does anyone have some good/easy/free/less-stressful spam-inhibiting ideas?
>One that occurs to me to try, and I have no idea if this would match well
>with actual bot behavior: at the time the form loads, include a hidden
>field with id=[unixtimestamp]. When the form is submitted, ignore any
>forms that took less than (10? 15? 20 seconds?) to fill out on the
>assumption that bots probably do it way faster - or possibly way slower?
>Do they save them up for later? Should I add an upper bound? Is this just
>a really dumb idea?
>If I try that one, I would start not by eliminating the bad results but
>by marking them as spam and seeing how effective it is.
>Other ideas? (PHP-friendly answers would be easiest for me to implement,
>but others may work too.)
>What works for you?
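The timestamp idea quoted above could be sketched roughly like this (Python for brevity; the thresholds are the guesses from the original post, not tested values, and note that a bot could forge the hidden timestamp unless you sign it server-side):

```python
import time

MIN_SECONDS = 10        # reject forms filled out faster than a human could
MAX_SECONDS = 60 * 60   # optional upper bound, in case bots save forms for later

def looks_like_bot(form_started_at, now=None):
    """form_started_at is the Unix timestamp embedded in a hidden field
    when the form was rendered; compare it against submission time."""
    now = time.time() if now is None else now
    elapsed = now - form_started_at
    return elapsed < MIN_SECONDS or elapsed > MAX_SECONDS
```

As the original post suggests, it would be prudent to start by only flagging submissions this check catches, and confirm the thresholds against real traffic before rejecting anything outright.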