I'm not too familiar with HTTP headers and the like, but I suspect there ought to be somewhere (heck, it could be in the page source) to pass a "permanence" tag. Providers or other intermediaries that perform any form of caching could read these tags and cache the page permanently (i.e., never request it again). Still, what if someone did in fact change the original page, and it was later requested for the first time from a different server? Hmm. I've been reading some of
Lawrence Lessig's stuff, which reminded me once again that the Internet was designed around edge intelligence rather than centralized intelligence. So maybe this sort of thing would require architectural changes so substantial that they'd alter the structure of the 'net beyond recognition.
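As it happens, HTTP's caching machinery has since grown something fairly close to that "permanence" tag. A server can mark a response as effectively never-changing with `Cache-Control` directives (the `immutable` extension is standardized in RFC 8246), so caches won't bother revalidating it. A sketch of what such a response might look like:

```http
HTTP/1.1 200 OK
Content-Type: text/html
Cache-Control: public, max-age=31536000, immutable
```

Here `max-age=31536000` tells caches the response is fresh for a year, and `immutable` promises that the body won't change for that lifetime, so they can skip conditional revalidation requests entirely. Note that it's still not true permanence: caches are free to evict entries whenever they like, and the "what if the page changes anyway" problem above remains, which is why this is framed as a freshness promise with an expiry rather than a forever flag.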