This is the mail archive of the xsl-list@mulberrytech.com mailing list.
RE: Large XML Files
- From: "Michael Kay" <michael dot h dot kay at ntlworld dot com>
- To: <xsl-list at lists dot mulberrytech dot com>
- Date: Mon, 7 Jan 2002 16:05:40 -0000
- Subject: RE: [xsl] Large XML Files
- Reply-to: xsl-list at lists dot mulberrytech dot com
> Does anyone know of an XSLT processor which will not read in all of the
> XML input at once? I read somewhere in the archives that Saxon has the
> ability to read in only a subtree at a time.
Saxon's <saxon:preview> element basically allows you to transform a sub-tree
as soon as it has been read, and then discard it from memory.
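For readers browsing the archive, a rough sketch of what this looked like in Saxon 6 (the attribute names and the `record` element are from memory and should be checked against the Saxon 6 documentation; treat this as an illustration, not a reference):

```xml
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
    xmlns:saxon="http://icl.com/saxon"
    extension-element-prefixes="saxon">

  <!-- Hypothetical sketch: ask Saxon to process each <record> in the
       named mode as soon as its subtree is built, then discard it. -->
  <saxon:preview mode="preview" elements="record"/>

  <xsl:template match="record" mode="preview">
    <!-- only this one subtree is in memory at this point -->
    <line><xsl:value-of select="@id"/></line>
  </xsl:template>

</xsl:stylesheet>
```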
> Are there any other XSLT processors that can do this?
Not directly, but what you can do is write a SAX filter application that
sits between the XML parser and the XSLT processor, so that the filter
effectively breaks up the large document into lots of small ones and
transforms each small document as soon as it has been read.
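A minimal sketch of that filter idea in Java, using only the JDK's SAX and TrAX APIs. The `rec` element name and the identity transform are placeholders; a real application would compile its own stylesheet with `newTransformerHandler(Source)` and watch for its own record element:

```java
import java.io.StringReader;
import java.io.StringWriter;
import java.util.ArrayList;
import java.util.List;
import javax.xml.parsers.SAXParserFactory;
import javax.xml.transform.OutputKeys;
import javax.xml.transform.TransformerConfigurationException;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.sax.SAXTransformerFactory;
import javax.xml.transform.sax.TransformerHandler;
import javax.xml.transform.stream.StreamResult;
import org.xml.sax.Attributes;
import org.xml.sax.InputSource;
import org.xml.sax.SAXException;
import org.xml.sax.helpers.XMLFilterImpl;

/** SAX filter that treats each child of the document element as a
 *  small stand-alone document and transforms it immediately, so only
 *  one subtree is ever held in memory. */
public class SubtreeFilter extends XMLFilterImpl {
    private final SAXTransformerFactory stf =
        (SAXTransformerFactory) TransformerFactory.newInstance();
    private final List<String> results = new ArrayList<>();
    private TransformerHandler handler; // non-null only inside a subtree
    private StringWriter out;
    private int depth = 0;

    @Override public void startElement(String uri, String local,
            String qName, Attributes atts) throws SAXException {
        depth++;
        if (depth == 2) { // child of the document element: open a mini-document
            try {
                handler = stf.newTransformerHandler(); // identity transform as a stand-in
                handler.getTransformer()
                       .setOutputProperty(OutputKeys.OMIT_XML_DECLARATION, "yes");
                out = new StringWriter();
                handler.setResult(new StreamResult(out));
                handler.startDocument();
            } catch (TransformerConfigurationException e) {
                throw new SAXException(e);
            }
        }
        if (handler != null) handler.startElement(uri, local, qName, atts);
    }

    @Override public void characters(char[] ch, int start, int len)
            throws SAXException {
        if (handler != null) handler.characters(ch, start, len);
    }

    @Override public void endElement(String uri, String local, String qName)
            throws SAXException {
        if (handler != null) handler.endElement(uri, local, qName);
        if (depth == 2) { // subtree complete: finish it and let it be collected
            handler.endDocument();
            results.add(out.toString());
            handler = null;
        }
        depth--;
    }

    /** Parse the XML and return one transformed string per subtree. */
    public static List<String> transformEach(String xml) throws Exception {
        SAXParserFactory spf = SAXParserFactory.newInstance();
        spf.setNamespaceAware(true);
        SubtreeFilter f = new SubtreeFilter();
        f.setParent(spf.newSAXParser().getXMLReader());
        f.parse(new InputSource(new StringReader(xml)));
        return f.results;
    }

    public static void main(String[] args) throws Exception {
        System.out.println(transformEach(
            "<log><rec id='1'>a</rec><rec id='2'>b</rec></log>"));
    }
}
```

The trick is the `startDocument`/`endDocument` pair wrapped around each subtree: to the transformer, every record looks like a complete little document, so nothing outside it is ever built in memory.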
> The W3C Candidate Recommendation called XML Fragment Interchange at
> <http://www.w3.org/TR/xml-fragment> addresses this issue.
I don't think that proposal is relevant (or at least, I haven't understood
its relevance!)
Mike Kay
XSL-List info and archive: http://www.mulberrytech.com/xsl/xsl-list