I appreciate and understand this perception; it is certainly not uncommon.
Comparing this scenario to Microsoft isn't really comparing apples to apples. They clearly have the resources to pull off that level of testing, and that's because the markets into which the products are sold are radically different.
I would say there are several key differences in the ILS market that need to be taken into consideration.

First, consider price vs. cost. I've long claimed, and still maintain, that the maintenance rates libraries pay for ILS systems are way out of proportion with what is expected for that money. Product complexity, as I mentioned in my original post, has grown enormously, while the market has come to perceive the products as commodities. Purchase prices have therefore dropped dramatically, and with them the maintenance paid (which is always a percentage of the purchase price). Second, maintenance rates are still 12%-15%/year -- the same rate as when I first entered this industry a few decades ago. In addition, there are external factors: the number of browsers, databases, operating systems, etc. that ILS vendors are expected to support is not small, nor is the frequency with which new releases of them appear. Most ILS customers call a day or two after such a release asking if they can use it -- no small burden for the vendor. Many vendors used to sell hardware to help subsidize maintenance and even product development, but that revenue has also dried up for the ILS vendor.

So, if you plot all these data points, there is an inevitable collision of goals -- and no one really wins. "The Cathedral and the Bazaar" makes some interesting observations on why this is the case, and I've posted about that on my blog previously. But the bottom line is that I think ILS companies are guilty of trying to work within library budgets -- tying maintenance rates to fixed annual increases, etc. -- and when you couple this with a market growth rate that is pretty near zero (see Marshall Breeding's annual surveys: $500-600M/year, year after year after year), the market model doesn't work. Something has to give, and service is exactly what has been sacrificed (right or wrong).
So, is there a solution? Well, I would point out that after a long career on the proprietary side of the tracks, I moved to the OSS side because I deeply believe, based on my experience, that it offers valid and important solutions to exactly these kinds of problems.
CARE Affiliates, Inc.
On Thursday, November 08, 2007, at 08:38AM, "Jonathan Rochkind" <[log in to unmask]> wrote:
>Carl Grant wrote:
>> a. You've got to accept responsibility for helping to test
>> software. There can be 1000's of pathways through code. We know you
>> want bug-free code, but the developer/vendor can't
>> test them all by
>> themselves or you'd never actually get the code!
>I don't know about this. There is a thing called "Quality Assurance".
>Mass market software makers like Microsoft spend quite a bit of effort
>on QA procedures to try and assure a basic level of _working_ before a
>product is released. In our market, we very, very seldom get this.
>There are techniques and methodologies that other software companies
>have developed to try to assure quality without "never getting the
>code". What is it about our particular industry that leads to us not
>being able to expect this?
>On the other hand, yes, customers should be willing to be beta testers.
>But the software we are often given to 'beta test' (or even as _release
>quality software_) is sometimes at a level that wouldn't even be called
>'alpha' in other industries. Other times final release software has
>serious flaws in it that keep the software from doing what it's
>advertised to do. Customers should not have to themselves perform as
>unpaid testers for the vendor to achieve a basic level of quality.
>I guess the question is in what is that 'basic' level of quality. I
>guess vendors think they are currently delivering it, but customers
>don't think they are currently getting it. (I guess the various
>constraints that keep us from _no longer buying_ the software even if we
>think we aren't getting that basic level of quality are the answer to why
>we aren't able to expect that basic level of testing before software is
>released... Many of us are trying to work on those constraints.)
>> b. If you're paying a commercial vendor to support/maintain,
>> understand that costs should go up to compensate them for supporting
>> that increasing complexity.
>> 5. Try to standardize practices, **where possible**, between like
>> institutions. Use development resources for great ideas, not just
>> to support local idiosyncrasies...
>> 6. Understand if you're trying to please everyone, it means lowest
>> common denominator. If you're trying to lead and develop new ideas,
>> somebody is going to be upset. It's not the developer's/vendor's
>> responsibility to decide which of these apply to your institution or
>> what to do about it when it happens. Decide up front: are you
>> following, or are you leading?
>> Carl Grant
>> CARE Affiliates, Inc.
>> E: [log in to unmask]
>> M: 540-529-7885
>> O: 540-552-2912
>> 866-340-9580 x 801 (Toll-Free)
>> Website: www.care-affiliates.com
>> Adium: carl_r_grant
>> Skype: carl_grant
>> On Nov 6, 2007, at 1:33 PM, Roy Tennant wrote:
>>> On 11/6/07 10:27 AM, "Jonathan Gorman" <[log in to unmask]> wrote:
>>>> How about an equivalent list from the vendor/software developer's perspective?
>>>> I think that would help balance the picture, but perhaps that's
>>>> already in
>>>> your plans ;).
>>> Funny you should ask... I had originally intended to do this, but
>>> then I was wondering if it would start to be redundant -- that is,
>>> would a number of points simply be restated from the vendor's
>>> viewpoint? But if there are points to make from that perspective, it
>>> would be worthwhile to include them.
>>> This is an area where I consider myself even more ignorant than
>>> usual, so if those of you who work on that side of the fence would
>>> like to chime in with relevant manifesto points from the perspective
>>> of developers and vendors, I'm all ears. Thanks,
>Digital Services Software Engineer
>The Sheridan Libraries
>Johns Hopkins University
>rochkind (at) jhu.edu