I have large XML documents, around 9 MB, containing domain-specific data, and XSLT templates for them. If I cut a document down to about 500 KB, everything works quickly; I tried it through the browser, through PHP, and with xsltproc (on an Ubuntu server), and it all works. But with the original document everything hangs: the browser freezes, and xsltproc runs so long that I gave up waiting for it to finish.

Are there ways to speed up the process?

Ultimately, I need this to work in PHP.
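
For reference, this is roughly the PHP path I am using (a minimal sketch, assuming the stock XSL extension, which is built on libxslt; large.xml and transform.xsl are placeholder names):

```php
<?php
// Minimal sketch: run an XSLT transform with PHP's XSL extension.
// File names are placeholders for the real 9 MB document and template.
$xml = new DOMDocument();
$xml->load('large.xml');

$xsl = new DOMDocument();
$xsl->load('transform.xsl');

$proc = new XSLTProcessor();
$proc->importStylesheet($xsl);

// transformToXML() returns the result as a string, or false on failure.
echo $proc->transformToXML($xml);
```

Note that PHP's XSL extension and the xsltproc command-line tool are both built on libxslt, so if xsltproc hangs on the document, the PHP route will behave the same way.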

  • For such volumes it is better to do the transformation with an external module written in, say, C++. Also, consider doing the processing without going through the XML library itself: the msxml_xx.dll series is rather slow at these volumes. - nick_n_a
  • You can try starting the conversion in a separate process and, say, once a minute output a space and call flush(), while showing some kind of "please wait" animation. When the result is ready, send the page along with a JS script that removes the animation (see the sketch after this list). - nick_n_a
  • Animations and waiting screens are clear enough. But isn't the specialized xsltproc utility written in C? Maybe there is some better tool. Even a smaller file, up to 3.5 MB, hangs it completely, and I cannot wait for the end; that is unacceptable. I could understand waiting a minute at most, but I waited about 30 minutes and then killed it... - Alex Lizenberg
  • Maybe it is in C, but apparently it is not designed for such volumes. It seems to me the specification itself is so complex that handling it efficiently is difficult (or impossible). If GC and variant types are involved, it is definitely meant only for small volumes; for special cases you can get by. My experience with XSLT conversions has been that converters do not cope well with large volumes. - nick_n_a
  • Well, I have read on forums that people feed files of 100-500 MB into this same utility, xsltproc, and it processes them. So apparently it is the structure of my documents that is very complex... - Alex Lizenberg
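
To make the keep-alive suggestion above concrete, here is a minimal sketch, assuming xsltproc is installed on the server; the file names, the done.flag sentinel file, and the 5-second poll interval are all placeholders:

```php
<?php
// Sketch of the "background process + keep-alive" idea from the comments.
// The subshell runs the transform, then creates done.flag on completion;
// redirecting all output lets exec() return immediately.
exec('(xsltproc transform.xsl large.xml > result.xml; touch done.flag)'
   . ' > /dev/null 2>&1 &');

echo 'Processing, please wait';
while (!file_exists('done.flag')) {
    echo ' ';         // keep-alive byte so the connection is not dropped
    flush();          // output buffering may also need to be disabled
    sleep(5);         // hypothetical poll interval
    clearstatcache(); // make file_exists() re-check the disk
}
readfile('result.xml'); // send the finished document to the client
```

The sentinel file avoids the race of polling result.xml itself, which exists but is incomplete while xsltproc is still writing it.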
