What is the cost of parsing a large XML file using PHP on every page request?
I would like to implement custom tags in HTML.
<?xml version="1.0"?>
<html>
    <head>
        <title>The Title</title>
    </head>
    <body>
        <textbox name="txtUsername" />
    </body>
</html>
After I load this XML file in PHP, I search for the custom tags using XPath and manipulate or replace them.
Is this very costly or is it acceptable? What about applying this to a large-scale web site?
In the past I also used XSLT for large sites, and it didn't seem to slow things down. This is somewhat similar to XSLT, but done manually.
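For concreteness, this is roughly what happens on each request. It is only a minimal sketch, assuming PHP's DOMDocument and DOMXPath; the file name and the replacement markup are just examples:

<?php
// Load the XML template (hypothetical file name).
$doc = new DOMDocument();
$doc->load('page.xml');

$xpath = new DOMXPath($doc);

// Find every custom <textbox> tag anywhere in the document.
foreach ($xpath->query('//textbox') as $tag) {
    // Build the real HTML element that should replace it.
    $input = $doc->createElement('input');
    $input->setAttribute('type', 'text');
    $input->setAttribute('name', $tag->getAttribute('name'));

    // Swap the custom tag for the generated element.
    $tag->parentNode->replaceChild($input, $tag);
}

echo $doc->saveHTML();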
I would guess it's pretty costly, but the best way to find out is to test it yourself and measure the peak memory usage and the time it takes to run the script.
You might be able to cache some intermediate state so that the heavy XML parsing doesn't have to be done every time. For example, you could replace the custom tags with actual PHP code, the way Smarty does, and then include that generated/cached PHP file instead.
The cached file could look like the code in Soulmerge's answer.
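A minimal sketch of that caching idea, assuming a hypothetical page.xml template and a cache/ directory; the expensive DOM/XPath work only runs when the source file changes:

<?php
$source   = 'page.xml';          // hypothetical template file
$compiled = 'cache/page.php';    // hypothetical compiled output

// Recompile only when the cache is missing or older than the source.
if (!file_exists($compiled) || filemtime($compiled) < filemtime($source)) {
    $doc = new DOMDocument();
    $doc->load($source);

    // ... run the XPath transformation here and turn the result into
    // plain PHP/HTML, e.g. the kind of code shown in Soulmerge's answer ...
    $php = $doc->saveHTML();

    file_put_contents($compiled, $php);
}

// Every subsequent request just includes the pre-generated file.
include $compiled;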
Is this very costly or is it acceptable?
Don't guess. Measure.
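For example, wrapping the transformation in PHP's built-in timers gives you real numbers for your own template sizes (a minimal sketch):

<?php
$start = microtime(true);

// ... load the XML, run the XPath queries, replace the tags ...

$elapsed = microtime(true) - $start;
$peak    = memory_get_peak_usage(true);

// Log the cost of one request so you can compare template sizes.
error_log(sprintf('render: %.4f s, peak memory: %d bytes', $elapsed, $peak));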