A significant problem with this is trust. You can't trust websites to reliably or accurately index their own sites, due to both incompetence and malice. I don't think there's any way around the malicious component. Formal or informal standards might take care of the competence factor, especially if the feature were built into common publishing platforms.
XML sitemaps are a microcosm of putting the indexing onus on websites instead of the search engines - they are basically ignored by search engines because they have been abused and are not a useful signal. If pages aren't important enough to be linked to throughout your website, then they aren't interpreted as being important enough to return to users. The optimistic case is that sitemaps/indices send signals parallel to what the search engine already sees, in which case they are redundant. The pessimistic case is that they send signals orthogonal to the content served to users, in which case the website is either being deceitful or incompetent. Either way, the search engine won't want to use the sitemap/index as a signal, since it provides either no value or negative value.
The code for doing the indexing (at least by default) could be built right into the web server, so it'd just be a matter of enabling an option in Apache or the like.
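To be clear, no such option ships with Apache today, so this is hypothetical. But the default behavior wouldn't have to be complicated - something like the sketch below, which walks the document root and emits a sitemap-style index. The paths and site URL are made-up placeholders; a real module would hook into the server's content layer rather than the filesystem.

    # Rough sketch of the kind of index generation a server module could do
    # automatically. DOCROOT and BASE_URL are assumed placeholders.
    import os
    from datetime import datetime, timezone
    from xml.sax.saxutils import escape

    DOCROOT = "/var/www/html"           # assumed document root
    BASE_URL = "https://example.com"    # assumed site URL

    def build_index(docroot: str, base_url: str) -> str:
        entries = []
        for dirpath, _dirnames, filenames in os.walk(docroot):
            for name in filenames:
                if not name.endswith((".html", ".htm")):
                    continue
                path = os.path.join(dirpath, name)
                rel = os.path.relpath(path, docroot)
                lastmod = datetime.fromtimestamp(
                    os.path.getmtime(path), tz=timezone.utc
                ).strftime("%Y-%m-%d")
                url = f"{base_url}/{rel.replace(os.sep, '/')}"
                entries.append(
                    f"  <url><loc>{escape(url)}</loc>"
                    f"<lastmod>{lastmod}</lastmod></url>"
                )
        return (
            '<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            + "\n".join(entries)
            + "\n</urlset>\n"
        )

    if __name__ == "__main__":
        print(build_index(DOCROOT, BASE_URL))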
It would be pretty easy to verify whether or not the index is accurate by fetching a small random sample of pages on the site, and then penalizing or excluding (or falling back to a de novo crawl of) sites that don't provide a legit index.
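Roughly like this sketch. It assumes, purely for illustration, that the site's self-reported index maps each URL to a SHA-256 hash of the page body; a real index format would carry richer data, but the sampling idea is the same.

    # Spot-check a self-reported index: fetch a random sample of the URLs it
    # claims and compare actual content hashes against the claimed ones.
    import hashlib
    import random
    import urllib.request

    def spot_check(index: dict[str, str], sample_size: int = 10,
                   tolerance: float = 0.9) -> bool:
        """Return True if the sampled pages mostly match what the index claims."""
        urls = random.sample(sorted(index), min(sample_size, len(index)))
        if not urls:
            return False
        matches = 0
        for url in urls:
            try:
                with urllib.request.urlopen(url, timeout=10) as resp:
                    body = resp.read()
            except OSError:
                continue  # unreachable pages count as mismatches
            if hashlib.sha256(body).hexdigest() == index[url]:
                matches += 1
        return matches / len(urls) >= tolerance

    # A site failing the check could be penalized, excluded, or re-crawled
    # from scratch instead of trusting its index.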
Honestly I'd much rather have a bunch of dice rolls on incompetence than the current centralized, single point of control over the entire index.
Google has been purging large swaths of data from the indexes and they won't say how or why or exactly what criteria they are using. It's difficult to imagine a worse solution for the web than this current model.