Execute Multiple Unique xQuery Against One File Optimization Saxon EE 9.8
Added by Ken Beitel almost 6 years ago
Hello Michael,
We are running 100 to 500 unique XQueries against the same file and are looking to optimize performance when calling the Saxon EE 9.8 processor from Java.
We've tested file sizes from roughly 14 KB to 1.4 MB and are seeing response times of roughly 10 to 20 milliseconds per query, which results in a total processing time of roughly 1 to 14 seconds per file. Per your recommendation in another post, these numbers aim to represent pure evaluation time, excluding Java startup, data transmission and other overhead.
I believe our developer's code may currently be reloading the document every time a new XQuery is executed.
I did quite a bit of searching and wasn't able to find an answer - basically we would like to optimize the running of multiple unique queries against the same file using Saxon EE.
Our thought is that loading the XML document once and then running multiple XQueries against the loaded document would help. Do you have any design patterns or Java sample code that would help achieve this?
We are very open to other performance optimization suggestions you may have on this topic. Thanks very much!
Ken
Replies (4)
RE: Execute Multiple Unique xQuery Against One File Optimization Saxon EE 9.8 - Added by Michael Kay almost 6 years ago
It's certainly true that parsing and document building time will often exceed query execution time (especially for simple queries) and so you should structure the application so it only builds the document once.
This is pretty straightforward with s9api.
To build the document:
Processor p = new Processor(...);
DocumentBuilder b = p.newDocumentBuilder();
XdmNode doc = b.build(...);
and to execute a query:
XQueryCompiler c = p.newXQueryCompiler();
XQueryExecutable e = c.compile(query);   // query: the XQuery source as a String
XQueryEvaluator eval = e.load();
eval.setContextItem(doc);
XdmValue result = eval.evaluate();
The parsing and tree building costs are incurred by the call on build(), and this only needs to be done once.
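Pulling that together, the whole pattern might look roughly like this (a minimal sketch only; the class name, file argument and query list are illustrative, not code from a real application):

import java.io.File;
import java.util.List;
import net.sf.saxon.s9api.*;

public class MultiQueryExample {
    public static void run(File xmlFile, List<String> queries) throws SaxonApiException {
        Processor p = new Processor(true);        // true = enable licensed (EE) features
        DocumentBuilder b = p.newDocumentBuilder();
        XdmNode doc = b.build(xmlFile);           // parse and build the tree once

        XQueryCompiler c = p.newXQueryCompiler();
        for (String q : queries) {                // run every unique query against the same tree
            XQueryEvaluator eval = c.compile(q).load();
            eval.setContextItem(doc);
            XdmValue result = eval.evaluate();
            // ... process result ...
        }
    }
}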
RE: Execute Multiple Unique xQuery Against One File Optimization Saxon EE 9.8 - Added by Ken Beitel almost 6 years ago
Thank you for the fast response, Michael. I checked our developer's code and it looks like the document is currently being built every time an XQuery is run. The bindDocument statement builds the document, right?
// loop through 100 to 500 unique queries for one XML file
//...
XQPreparedExpression xqExpression = xqConn.prepareExpression(mQuery);
xqExpression.bindDocument(XQConstants.CONTEXT_ITEM, xmlStringMessageToQuery, null, null); // this statement runs for each query and re-parses the XML string every time
XQResultSequence xqResultSequence = xqExpression.executeQuery();
// continue looping for next unique query
We'll switch to the method you've recommended and will post the updated query times over the next few days, as soon as the change has been made.
Thanks again! Ken
RE: Execute Multiple Unique xQuery Against One File Optimization Saxon EE 9.8 - Added by Michael Kay almost 6 years ago
Yes, I think that this is quite hard to achieve with the XQJ interface - in fact I don't think it can be achieved with "pure" XQJ; you have to dive into Saxon interfaces somewhere. If you want to keep most of the code using XQJ, then the version of XQExpression.bindDocument() that takes a Source object as the second argument should do the trick: Saxon's NodeInfo implements Source, so you can supply a NodeInfo for this argument. You can get a suitable NodeInfo using XQDataFactory.createItemFromDocument(), casting the resulting XQItem to SaxonXQItem, calling SaxonXQItem.getSaxonItem(), and casting the resulting Item to NodeInfo (or directly to Source).
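In code, that would look roughly like the sketch below (illustrative only; the class name, query list and connection setup are assumptions, and it assumes Saxon's XQJ classes in the com.saxonica.xqj package as shipped with Saxon-EE 9.8, but the createItemFromDocument() and bindDocument() calls are the ones described above):

import java.util.List;
import javax.xml.xquery.*;
import com.saxonica.xqj.SaxonXQDataSource;
import com.saxonica.xqj.SaxonXQItem;
import net.sf.saxon.om.NodeInfo;

public class XqjMultiQuery {
    public static void run(String xmlStringMessageToQuery, List<String> queries) throws XQException {
        XQConnection xqConn = new SaxonXQDataSource().getConnection();

        // Parse the XML string once and extract the underlying Saxon node
        XQItem docItem = xqConn.createItemFromDocument(xmlStringMessageToQuery, null, null);
        NodeInfo docNode = (NodeInfo) ((SaxonXQItem) docItem).getSaxonItem();

        for (String mQuery : queries) {
            XQPreparedExpression xqExpression = xqConn.prepareExpression(mQuery);
            xqExpression.bindDocument(XQConstants.CONTEXT_ITEM, docNode, null); // NodeInfo implements Source, so the document is not re-parsed
            XQResultSequence xqResultSequence = xqExpression.executeQuery();
            // ... consume xqResultSequence ...
            xqExpression.close();
        }
        xqConn.close();
    }
}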
But I don't think you would regret converting the whole application to use s9api rather than XQJ - it's a much better fit with Saxon functionality and architecture.
RE: Execute Multiple Unique xQuery Against One File Optimization Saxon EE 9.8 - Added by Ken Beitel almost 6 years ago
Thanks - that's most helpful and we'll implement the s9api approach.