On Fri, Mar 5, 2010 at 3:14 PM, Houghton, Andrew <[log in to unmask]> wrote:
> As you point out JSON streaming doesn't work with all clients and I am
> hesitant to build on anything that all clients cannot accept. I think part
> of the issue here is proper API design. Sending tens of megabytes back to a
> client and expecting them to process it seems like a poor API design
> regardless of whether they can stream it or not. It might make more sense
> to have a server API send back 10 of our MARC-JSON records in a JSON
> collection and have the client request an additional batch of records for
> the result set. In addition, if I remember correctly, JSON streaming or
> other streaming methods keep the connection to the server open which is not
> a good thing to do to maintain server throughput.
>
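For concreteness, I take it the paged collection you're describing would come
back as something like this. The envelope fields are invented for
illustration, and the record body is a dummy; none of this is in the spec:

    {
      "offset": 0,
      "limit": 10,
      "total": 4321,
      "records": [
        { "leader": "01234cam a2200301 a 4500", "fields": [] }
      ]
    }

with the client walking "offset" forward until it has the whole result set.
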
I guess my concern here is that the specification, as you're describing it,
closes off potential uses. That seems fine if, for example, your primary
concern is javascript-in-the-browser, and paginated browser requests are all
you're worried about right now.
That's not the whole universe of uses, though. People are going to want to
dump these things into a file to read later -- no possibility for pagination
in that situation. Others may, in fact, want to stream a few thousand
records down the pipe at once, but without a streaming parser that can't
happen if it's all one big array.
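The usual way around that is one record per line (newline-delimited JSON), so
a client can get by with any ordinary parser. Here's a rough sketch in
TypeScript/Node; the filename and the one-record-per-line layout are my
assumptions, not anything the spec defines:

    import * as fs from "fs";
    import * as readline from "readline";

    // Read a (hypothetical) file of newline-delimited MARC-JSON records
    // without ever holding the whole result set in memory.
    const rl = readline.createInterface({
      input: fs.createReadStream("records.ndjson"),  // made-up filename
      crlfDelay: Infinity,
    });

    rl.on("line", (line) => {
      if (line.trim().length === 0) return;  // tolerate blank lines
      const record = JSON.parse(line);       // one record at a time
      // ...hand the record off to whatever is consuming the stream
    });

Try the same thing against one big top-level array and JSON.parse has to see
the entire response before it can return anything.
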
I worry that as specified, the *only* use will be, "Pull these down a thin
pipe, and if you want to keep them for later, or want a bunch of them, you
have to deal with marc-xml." Part of my motivation is to *not* have to use
marc-xml, but in this case I'd just be trading one technology I don't like
(marc-xml) for two technologies, one of which I don't like (that'd be
marc-xml again).

I really do understand the desire to make this parallel to marc-xml, but
there's a seam between the two technologies that makes that a problematic
approach.

--
Bill Dueber
Library Systems Programmer
University of Michigan Library