Andy, I think there are three issues here:
1. Should the GPO put in place, at least at the moment, some throttling for
user agents behaving like dicks?
2. Should III (and others), when acting as a user agent, be such a dick?
3. How do I know if I'm being a dick?
The answers folks are offering, I think, are (1) Yes, (2) No, and (3) It's
hard to know, but you should always check robots.txt, and you should always
throttle yourself to a reasonable level unless you know the target can take
the abuse.
For the majority of the web, for the majority of the time, basic courtesy
and the gentleperson's agreement ensconced in robots.txt work fine -- most
folks who write user agents don't want to be dicks. When this informality
doesn't work, as you point out, there are solutions you can implement at
some edge of your network. Of course, at that point the requests are already
flooding through to *somewhere*, so getting things stopped as close to the
point of origin as possible is key.
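For what it's worth, the user-agent side of that courtesy isn't much code.
Here's a rough Python sketch of a link checker that honors robots.txt and
throttles itself -- the user-agent string, delay, and URL handling are all
invented for illustration, and it's obviously not what III actually runs:

import time
import urllib.error
import urllib.request
import urllib.robotparser

USER_AGENT = "ExampleLinkChecker/0.1"   # made-up identifier
DEFAULT_DELAY = 5   # seconds between requests to one host (made up)

def polite_check(base_url, paths):
    """Check paths on one host, honoring robots.txt and pacing requests."""
    rp = urllib.robotparser.RobotFileParser()
    rp.set_url(base_url.rstrip("/") + "/robots.txt")
    rp.read()

    # Respect a Crawl-delay directive if the site publishes one.
    delay = rp.crawl_delay(USER_AGENT) or DEFAULT_DELAY

    for path in paths:
        url = base_url.rstrip("/") + path
        if not rp.can_fetch(USER_AGENT, url):
            print("skipped (disallowed by robots.txt):", url)
            continue
        req = urllib.request.Request(url, method="HEAD",
                                     headers={"User-Agent": USER_AGENT})
        try:
            with urllib.request.urlopen(req, timeout=30) as resp:
                print(resp.status, url)
        except urllib.error.HTTPError as e:
            print(e.code, url)
        time.sleep(delay)   # don't hammer the target server

Two lines -- the can_fetch() check and the sleep() -- are the whole
difference between a good citizen and the behavior Thomas describes below.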
On Wed, Sep 2, 2009 at 11:26 AM, Houghton,Andrew <[log in to unmask]> wrote:
> > From: Code for Libraries [mailto:[log in to unmask]] On Behalf Of
> > Thomas Dowling
> > Sent: Wednesday, September 02, 2009 10:25 AM
> > To: [log in to unmask]
> > Subject: Re: [CODE4LIB] FW: PURL Server Update 2
> >
> > The III crawler has been a pain for years and Innovative has shown no
> > interest in cleaning it up. It not only ignores robots.txt, but it hits
> > target servers just as fast and hard as it can. If you have a lot of
> > links that a lot of III catalogs check, its behavior is indistinguishable
> > from a DOS attack. (I know because our journals server often used to
> > crash about 2:00am on the first of the month...)
>
> I see that I didn't fully make the connection to the point I was
> making... which is that there are hardware solutions to these
> issues rather than using robots.txt or sitemap.xml. If a user
> agent is a problem, then network folks should change the router
> to ignore the user agent or reduce the number of requests it is
> allowed to make to the server.
>
> In the case you point to, with III hitting the server as fast as it
> can, looking to the network like a DOS attack, and crashing the server,
> 1) the router hasn't been set up to impose throttling limits on user
> agents, and 2) the server probably isn't part of a load-balanced server
> farm.
>
> In the case of GPO, they mentioned, or at least implied, that they
> were having contention issues with user agents hitting the server
> while they were trying to restore the data. This contention could be
> mitigated by imposing lower throttling limits on user agents in the
> router until the data is restored, and then raising the limits back to
> whatever their prescribed SLA (service level agreement) allows.
>
> You really don't need to have a document on the server to tell
> user agents what to do. You can and should impose a network
> policy on user agents, which is a far better solution in my opinion.
>
>
> Andy.
>
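And for the record, the per-user-agent limit Andy describes doesn't have to
live in the router; the same policy can be approximated in an application or
proxy tier. Here's a toy sliding-window throttle in Python, with the window
and limit invented purely for illustration:

import time
from collections import defaultdict

WINDOW = 60.0   # seconds (invented)
LIMIT = 10      # max requests per user agent per window (invented)

_hits = defaultdict(list)   # user agent -> timestamps of recent requests

def allow(user_agent, now=None):
    """Return True if this user agent is still under its request limit."""
    now = time.time() if now is None else now
    recent = [t for t in _hits[user_agent] if now - t < WINDOW]
    if len(recent) >= LIMIT:
        _hits[user_agent] = recent
        return False   # over the limit: drop or 503 the request
    recent.append(now)
    _hits[user_agent] = recent
    return True

During something like GPO's restore you'd just shrink LIMIT and raise it
again afterward -- which is exactly the knob Andy is describing, wherever it
happens to live.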
--
Bill Dueber
Library Systems Programmer
University of Michigan Library