I strongly suggest moving the wiki to a new URL and then disallowing it in robots.txt.
You can keep it open and link to it as much as you want, but if
spammers can't search for boilerplate wiki phrases in Google, they'll
never find it. Of course, nobody else will be able to find it
either...
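A minimal robots.txt for that, assuming you move the wiki to some
unadvertised path (the /hidden-wiki/ here is made up), is just:

    # hypothetical path; use whatever unadvertised URL the wiki moves to
    User-agent: *
    Disallow: /hidden-wiki/

Well-behaved crawlers honor the Disallow, so the boilerplate phrases
never land in anyone's search index in the first place.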
That worked for us for two years. Recently we opened it up because we
wanted people to find pages. So far no spam...
Tim
On Tue, Oct 28, 2008 at 10:29 AM, Jay Luker <[log in to unmask]> wrote:
> Our wiki page got clobbered by a spambot because I never got around to
> password protecting it.
>
> So I've slapped an HTTP basic auth challenge in front of it for the
> time being. The user/pass is code4lib/code4lib. Hopefully that will be
> enough to thwart the bots for a little while. Maybe we can migrate to
> Drupal and hook into the main site's authentication using Birkin's
> nice hack.
>
> Also, I don't have a backup (oy, this is more embarrassing by the minute).
>
> http://ne.code4lib.org/wiki/
>
> --jay
>
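Assuming Apache, which Jay's message doesn't actually say, the basic auth
challenge he describes is just a few directives plus a password file, roughly:

    # hypothetical .htaccess for the wiki directory; paths are made up
    AuthType Basic
    AuthName "code4lib wiki"
    AuthUserFile /path/to/.htpasswd
    Require valid-user

    # create the password file with the user/pass from Jay's message:
    #   htpasswd -c /path/to/.htpasswd code4lib

It won't stop anyone who reads this list, but it's enough to stop dumb
bots, which is all that's needed for the time being.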
--
Check out my library at http://www.librarything.com/profile/timspalding