But is the content made readily available in its original markup (wiki markup, Markdown, whatever), or would you have to scrape everything from the website? Wiki content is normally stored in a database, and presumably that database isn't just open for the world to query directly.
Many wikis have open APIs (MediaWiki instances typically do). Not sure about MDN's custom setup, but scraping is always an alternative - or even working off one of the HTML dumps available for offline use (although losing history and the original format sucks).
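For MediaWiki instances, the standard Action API can hand back the raw wikitext directly, no scraping needed. A minimal sketch (assuming a stock MediaWiki endpoint like Wikipedia's; the page title is just an example) that builds the query URL for the latest revision's source:

```python
from urllib.parse import urlencode

def wikitext_url(api_base: str, title: str) -> str:
    """Build a MediaWiki Action API URL that returns the latest
    revision's raw wikitext for the given page title."""
    params = {
        "action": "query",        # query module
        "titles": title,          # page(s) to look up
        "prop": "revisions",      # we want revision data
        "rvprop": "content",      # specifically the content (wikitext)
        "rvslots": "main",        # main content slot
        "format": "json",
    }
    return api_base + "?" + urlencode(params)

url = wikitext_url("https://en.wikipedia.org/w/api.php",
                   "Python (programming language)")
print(url)
```

Fetching that URL (with any HTTP client) returns JSON containing the original markup, and the same `revisions` module can page back through history, so the edit trail isn't lost the way it is with HTML dumps.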
In theory, since you can edit the original-format content from the website, you could scrape it from there. I'm just wondering if this is something anybody is thinking about or working on.