This is the mail archive of the
xsl-list at lists dot mulberrytech dot com mailing list.
RE: transform optimization for a schema-constrained domain
- To: <xsl-list at lists dot mulberrytech dot com>
- Subject: RE: [xsl] transform optimization for a schema-constrained domain
- From: "Michael Kay" <mhkay at iclway dot co dot uk>
- Date: Thu, 26 Jul 2001 11:20:08 +0100
- Reply-To: xsl-list at lists dot mulberrytech dot com
> > Are there any XSLT processors that can use a schema for the
> > input domain to improve performance?
>
> It would provide for some optimizations. The most obvious example is
> that the processor could use a table which tells whether elements
> may have certain descendants, to optimize tree scanning for expressions
> involving //.
This is true. My current view, though, is that if you only do a single
transformation on a document, it's probably faster to collect this
information from the document instance as it's being loaded than to (a)
collect it from the schema and (b) validate the document against the schema.
It only becomes an optimisation when the work of (a) and (b) can be
amortized over a large number of transformations, which is more likely to be
the case in an XML database scenario than with most XSLT scenarios.
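The table the earlier poster describes can be sketched in a few lines. This is a minimal illustration only, not how any real processor works; the element names and the table contents are invented, and the table could equally be derived from a schema or collected while the document is loaded:

```python
import xml.etree.ElementTree as ET

# Invented table: element name -> names that may occur among its
# descendants (as a schema, or a first pass over the instance,
# might yield it).
DESCENDANTS = {
    "order": {"item", "price", "qty"},
    "customer": {"name", "address"},
}

def find_descendants(elem, target, table):
    """Collect elements named `target` below `elem`, skipping any
    subtree whose root, per the table, cannot contain `target`."""
    results = []
    for child in elem:
        if child.tag == target:
            results.append(child)
        # Prune: recurse only if the table allows `target` below child.
        if target in table.get(child.tag, set()):
            results.extend(find_descendants(child, target, table))
    return results

doc = ET.fromstring(
    "<root><order><item>a</item><item>b</item></order>"
    "<customer><name>c</name></customer></root>"
)
print([e.text for e in find_descendants(doc, "item", DESCENDANTS)])
# -> ['a', 'b']; the <customer> subtree is never scanned
```

Evaluating `//item` this way touches only the subtrees that can contain an `item`, which is exactly the saving being discussed.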
> This leads to my last point: a DTD/Schema-aware XSLT processor could
> warn me of misspellings and incompatibilities between the XML structure
> and the XPath expressions in the XSLT.
This kind of static type checking is very much part of the philosophy of
XQuery. It's difficult to know where to stop, though. Suppose you have a
query designed to check for an input condition that shouldn't exist (e.g. an
A appearing as a descendant of a B, or an attribute whose string-value is
empty). Then the schema is enhanced so that the condition can't exist. Is
the query now in error?
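As a toy illustration of the dilemma (the names A and B and both schema tables are invented): a checker armed with the same kind of schema-derived descendant table could decide, statically, that a path such as B//A can never select anything once the schema is tightened.

```python
def path_can_match(table, context, target):
    """True if `target` may appear among the descendants of `context`,
    according to a schema-derived descendant table."""
    return target in table.get(context, set())

# Original schema: A may occur inside B, so a sanity check like
# "exists(B//A)" is meaningful.
schema_v1 = {"B": {"A", "C"}}

# Enhanced schema: A can no longer occur inside B, so the same
# check is statically always false.
schema_v2 = {"B": {"C"}}

print(path_can_match(schema_v1, "B", "A"))  # True
print(path_can_match(schema_v2, "B", "A"))  # False
```

Whether the checker should then reject the query outright, warn, or stay silent is precisely the open question.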
As the XSL WG and the XQuery WG are working together on the development of
XPath 2.0, such debates are a very hot topic!
XSL-List info and archive: http://www.mulberrytech.com/xsl/xsl-list