I'm using Python's suds library, which tries to fetch xml.xsd over the network. Unfortunately, the W3C server is hammered by programs like mine and usually cannot serve the document.
How can I intercept suds' URL fetching so that it always uses a local copy of this file, without first having to successfully download it into a long-lived cache?
The problem with fetching xml.xsd has to do with the "http://www.w3.org/XML/1998/namespace" namespace, which is required for most WSDLs. This namespace is mapped by default to http://www.w3.org/2001/xml.xsd.
You may override the location binding for this namespace to point to a local file:
from suds.xsd.sxbasic import Import
file_url = 'file://<path to xml.xsd>'
Import.bind('http://www.w3.org/XML/1998/namespace', file_url)
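A note on the URL itself: it must be an absolute, properly escaped file:// URL, and the binding has to be in place before the Client is constructed. The stdlib pathlib module can build the URL portably; the path below is a hypothetical example.

```python
from pathlib import Path

# Hypothetical location of your saved copy of xml.xsd -- adjust as needed
xsd_path = Path('/tmp/schemas/xml.xsd')

# as_uri() requires an absolute path and handles percent-escaping
file_url = xsd_path.as_uri()
print(file_url)  # file:///tmp/schemas/xml.xsd
```

You would then pass file_url to Import.bind() as shown above.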
The suds library has a class suds.store.DocumentStore that holds bundled XML in a uri -> text dictionary. It can be patched like so:
import suds.store
suds.store.DocumentStore.store['www.w3.org/2001/xml.xsd'] = \
    open('xml.xsd', 'r').read()
Unfortunately this doesn't work because DocumentStore only honors requests for the suds:// protocol. One monkey patch later and you're in business.
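The patch itself can be small. Since the exact way DocumentStore.open() parses the protocol varies across suds versions, the following is a self-contained sketch that mimics the class with a simplified stand-in (the split-on-"://" behaviour and the sample store contents are assumptions); against a real install you would patch suds.store.DocumentStore.open in the same way.

```python
# Simplified stand-in for suds.store.DocumentStore: serves documents from a
# class-level dict, but only honors the suds:// protocol (the problem above).
class DocumentStore:
    store = {'www.w3.org/2001/xml.xsd': b'<schema/>'}

    def open(self, url):
        protocol, _, location = url.partition('://')
        if protocol != 'suds':
            return None                # refuses plain http:// requests
        return self.store.get(location)

# Monkey patch: try the store regardless of protocol, then fall back to the
# original behaviour for anything not bundled.
_original_open = DocumentStore.open

def _open_any_protocol(self, url):
    _, _, location = url.partition('://')
    return self.store.get(location) or _original_open(self, url)

DocumentStore.open = _open_any_protocol

# An http:// request for xml.xsd is now served from the local store
print(DocumentStore().open('http://www.w3.org/2001/xml.xsd'))
```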
It would also be possible to override the Cache() instance passed to your suds Client(), but the cache deals with numeric ids based on Python's hash() and does not get the URLs of its contents.
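To see why the cache route is awkward: suds hands the cache only an opaque numeric id, so a replacement cache has no way to tell which request corresponds to xml.xsd. The stand-in below illustrates the limitation (the method names mirror the general get/put shape of suds.cache.Cache, but are assumptions, not its exact signatures).

```python
class LocalOnlyCache:
    """Stand-in illustrating the problem: get() receives an opaque id."""

    def __init__(self):
        self.documents = {}  # id -> bytes, keyed by hash-derived ids

    def get(self, id):
        # id is something like str(abs(hash(url))); the URL itself is gone,
        # so 'http://www.w3.org/2001/xml.xsd' cannot be special-cased here.
        return self.documents.get(id)

    def put(self, id, content):
        self.documents[id] = content

cache = LocalOnlyCache()
cache.put('12345', b'<schema/>')
print(cache.get('12345'))   # works, but only if we already know the id
```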