Another tool that might help is Integrity. It crawls any website address you
give it and reports the broken URL links it finds.

Hope this helps.

Best,

Nida Islam
Graduate School of Library and Information Studies
University of Rhode Island
[log in to unmask]
(401) 391-3526

On Jul 7, 2017 1:32 PM, "Dan Scott" <[log in to unmask]> wrote:

> You might want to investigate the E-Resource Access Checker script [1]
> Kristina Spurgin wrote and documented in the 2014 code4lib article "Getting
> What We Paid for: a Script to Verify Full Access to E-Resources" [2] -
> sounds like it would achieve your goals (and beyond).
>
> 1. https://github.com/UNC-Libraries/Ebook-Access-Checker
> 2. http://journal.code4lib.org/articles/9684
>
> On Fri, Jul 7, 2017 at 12:19 PM, Ken Irwin <[log in to unmask]> wrote:
>
> > Hi folks,
> >
> > I'm looking for a tool to do automated unit-style testing on URLs. I want
> > to know about HTTP errors, but also to be able to test for things like
> > "does this page include particular text" and "does this page fall within
> > the expected length parameters". I have a few dozen URLs that I want to
> > test periodically in our dev process for a project that's been having
> > some unexpected effects on a variety of the library's online tools.
> >
> > By preference, a web-based or PHP-based tool would be ideal, but I'm open
> > to options less familiar to me as well. I could probably write something
> > with cURL, but maybe somebody's already got an elegant solution.
> >
> > Any ideas?
> >
> > Thanks
> > Ken
> >
>
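For reference, a minimal sketch of the cURL-based approach Ken mentions,
written in PHP since that is the stack he prefers. The URLs, expected
strings, and length bounds below are placeholders rather than anything from
the thread:

<?php
// Sketch: periodically check a list of URLs for HTTP errors, expected
// page text, and acceptable body length, using PHP's cURL extension.

$checks = [
    [
        'url'        => 'https://example.org/catalog',  // placeholder URL
        'expect'     => 'Library Catalog',               // text the page should contain
        'min_length' => 1000,                            // acceptable body size in bytes
        'max_length' => 200000,
    ],
    // ... add the remaining URLs here
];

foreach ($checks as $check) {
    $ch = curl_init($check['url']);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);  // capture the body instead of printing it
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);  // follow redirects (e.g. proxy bounces)
    curl_setopt($ch, CURLOPT_TIMEOUT, 30);

    $body   = curl_exec($ch);
    $status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    $error  = curl_error($ch);
    curl_close($ch);

    $failures = [];
    if ($body === false) {
        $failures[] = 'request failed: ' . $error;
    } else {
        if ($status >= 400) {
            $failures[] = "HTTP $status";
        }
        if (strpos($body, $check['expect']) === false) {
            $failures[] = "missing expected text '{$check['expect']}'";
        }
        $len = strlen($body);
        if ($len < $check['min_length'] || $len > $check['max_length']) {
            $failures[] = "unexpected length ($len bytes)";
        }
    }

    echo $check['url'] . ': ' .
         (empty($failures) ? 'OK' : 'FAIL (' . implode('; ', $failures) . ')') . "\n";
}

Run from the command line (e.g. php url_checks.php, filename assumed) or
from a scheduled job to get the periodic checks Ken describes.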