How are you (and your institution) assessing 3rd party vendors (databases, digital asset management systems) for accessibility?
Our institution will be formally including accessibility as part of our criteria for determining which
3rd party vendors (e-learning resources, databases, digital asset management systems) to purchase.
Do you run any automated or user testing on 3rd party vendors?
We're trying to determine a realistic (given our staff capacity) set of criteria for assessing the accessibility of current and prospective 3rd party vendors.
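To give a sense of the scale of automated check we could realistically run ourselves, here is a minimal sketch in Python (the function names are my own, and this covers only one WCAG criterion, missing alt text on images; a real audit would use a full tool like axe or WAVE):

```python
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    """Collects <img> tags that lack an alt attribute (WCAG 1.1.1)."""
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        attr_dict = dict(attrs)
        if tag == "img" and "alt" not in attr_dict:
            # Record the src so staff can locate the offending image.
            self.missing_alt.append(attr_dict.get("src", "<no src>"))

def audit_html(html: str) -> list:
    """Return the src values of all images missing alt text."""
    checker = MissingAltChecker()
    checker.feed(html)
    return checker.missing_alt
```

Even a handful of spot checks like this on a vendor's live pages can surface mismatches with what the vendor claims.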
I've found that VPATs can be complex, and in my experience they are not always an accurate representation of the vendors'
actual sites/resources when I've run automated tests on their pages. Additionally, if I were to use them, I'd need
to create a scoring system (perhaps based on the values in the "supporting features" column), which I've considered.
Because what vendors state on a VPAT and what's true in reality may not match, I'm leaning toward not using VPATs at all.
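For what it's worth, the scoring system I have in mind could be as simple as mapping the standard VPAT conformance levels to points and reporting a percentage. A sketch (the point values here are my own invention, not part of the VPAT template):

```python
# Hypothetical point values for the standard VPAT conformance levels.
CONFORMANCE_POINTS = {
    "Supports": 2,
    "Partially Supports": 1,
    "Does Not Support": 0,
    "Not Applicable": None,  # excluded from scoring
}

def vpat_score(entries):
    """Return a 0-100 score from a list of conformance-level strings.

    "Not Applicable" rows (and unrecognized labels) are skipped so they
    neither help nor hurt the vendor's score.
    """
    points = [CONFORMANCE_POINTS[e] for e in entries
              if CONFORMANCE_POINTS.get(e) is not None]
    if not points:
        return 0.0
    return 100.0 * sum(points) / (2 * len(points))
```

A cutoff (say, 80%) could then serve as a quick first filter before any hands-on testing.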
Have you ever found a discrepancy between a vendor's VPAT and your own testing, and if so, did you request that they fix the compliance issue?
For those not familiar with VPATs, check out
Given the aforementioned issues with VPATs, I'm considering adapting a criteria checklist provided by ASCLA (I've uploaded it to my library's website -
yes, I'm aware that it's a PDF)
into a score-based card. Thoughts?
Thanks in advance for your attention and insight.
Cleveland Public Library