This is the mail archive of the xsl-list@mulberrytech.com mailing list.



RE: Large XML Files


> > Does anyone know of an XSLT processor which will not read in all of
> > the XML input at once? I read somewhere in the archives that Saxon
> > has the ability to read in only a subtree at a time.

Saxon's <saxon:preview> element basically allows you to transform a sub-tree
as soon as it has been read, and then discard it from memory.
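For reference, a stylesheet using that extension looks roughly like the sketch below. This is from memory of the Saxon 6.x documentation, so check it against the release you have; the element names `log`, `rec`, and `msg` are invented for illustration.

```xml
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
    xmlns:saxon="http://icl.com/saxon"
    extension-element-prefixes="saxon">

  <!-- Process each <rec> as soon as it has been fully parsed,
       then discard its subtree from memory. -->
  <saxon:preview mode="preview" elements="rec"/>

  <xsl:template match="rec" mode="preview">
    <line><xsl:value-of select="msg"/></line>
  </xsl:template>

  <xsl:template match="/">
    <out><xsl:apply-templates/></out>
  </xsl:template>

</xsl:stylesheet>
```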

> > Are there any other XSLT processors that can do this?

Not directly, but what you can do is write a SAX filter application that
sits between the XML parser and the XSLT processor, so that the filter
effectively breaks up the large document into lots of small ones and
transforms each small document as soon as it has been read.
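A minimal sketch of such a filter in Java, using SAX2 and JAXP, is below. The record element name (`rec`) and the use of an identity transform are assumptions for illustration; in a real application you would pass your stylesheet to `newTransformerHandler`, and you would probably track element depth if records can nest.

```java
import java.io.StringReader;
import java.io.StringWriter;
import java.util.ArrayList;
import java.util.List;
import javax.xml.parsers.SAXParserFactory;
import javax.xml.transform.OutputKeys;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.sax.SAXTransformerFactory;
import javax.xml.transform.sax.TransformerHandler;
import javax.xml.transform.stream.StreamResult;
import org.xml.sax.Attributes;
import org.xml.sax.InputSource;
import org.xml.sax.SAXException;
import org.xml.sax.XMLReader;
import org.xml.sax.helpers.XMLFilterImpl;

public class SplitFilter extends XMLFilterImpl {
    private final String recordName;     // element that delimits each small document
    private TransformerHandler handler;  // receives the events of the current record
    private StringWriter out;
    final List<String> results = new ArrayList<String>();

    SplitFilter(String recordName) { this.recordName = recordName; }

    @Override
    public void startElement(String uri, String local, String qName, Attributes atts)
            throws SAXException {
        if (handler == null && qName.equals(recordName)) {
            try {
                SAXTransformerFactory f =
                        (SAXTransformerFactory) TransformerFactory.newInstance();
                // Identity transform here; a stylesheet Source could be passed instead.
                handler = f.newTransformerHandler();
                handler.getTransformer()
                       .setOutputProperty(OutputKeys.OMIT_XML_DECLARATION, "yes");
                out = new StringWriter();
                handler.setResult(new StreamResult(out));
                handler.startDocument();  // each record becomes its own small document
            } catch (Exception e) { throw new SAXException(e); }
        }
        if (handler != null) handler.startElement(uri, local, qName, atts);
    }

    @Override
    public void characters(char[] ch, int start, int len) throws SAXException {
        if (handler != null) handler.characters(ch, start, len);
    }

    @Override
    public void endElement(String uri, String local, String qName) throws SAXException {
        if (handler != null) {
            handler.endElement(uri, local, qName);
            if (qName.equals(recordName)) {
                handler.endDocument();
                results.add(out.toString());
                handler = null;  // discard: the record never joins one big tree
            }
        }
    }

    // Parse xml, transforming each recordName element independently.
    public static List<String> split(String xml, String recordName) throws Exception {
        SplitFilter filter = new SplitFilter(recordName);
        XMLReader reader =
                SAXParserFactory.newInstance().newSAXParser().getXMLReader();
        filter.setParent(reader);
        filter.parse(new InputSource(new StringReader(xml)));
        return filter.results;
    }

    public static void main(String[] args) throws Exception {
        String xml = "<log><rec><msg>a</msg></rec><rec><msg>b</msg></rec></log>";
        for (String r : split(xml, "rec")) System.out.println(r);
    }
}
```

Because only the current record's events are buffered, memory use is bounded by the size of one record rather than the whole input document.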

> > The W3C Candidate Recommendation called XML Fragment Interchange at
> > <http://www.w3.org/TR/xml-fragment> addresses this issue.

I don't think that proposal is relevant (or at least, I haven't understood
its relevance!).

Mike Kay


 XSL-List info and archive:  http://www.mulberrytech.com/xsl/xsl-list

