Via BBGM, I hear of WebCite, an on-demand Wayback Machine for web content cited within academic publications. It's important that links to web content in academic publications keep resolving to their intended content over time, but how valuable is that guarantee, and whose responsibility is it?
If the citing author feels it's important, they should make a local copy; they have the same right to make one as a repository does. If the cited author feels the link is important, they should take steps to keep their content accessible. If neither of these things happens, it raises the question of whether the value of the potentially inaccessible content is greater than the cost of a high-availability mirror of the web whose funding will come from as-yet-unspecified publishing partners.
These things aside, there are some important technical flaws with the project:
Of course, it's much easier to find flaws in a solution than to come up with one in the first place. Still, it seems to me that a DOI-like system of semantic permalinks, each of which always points to a piece of content wherever it moves around the web, would work better, lead to a more complete index, and be much cheaper to run as well. I know they chose archiving over redirecting because they wanted to link to the version of the page on the day it was cited, and that's a good idea. But if having a copy of the page as it was is important, the author needs to make a local copy rather than hope some third party will do it for them.
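To make the redirect idea concrete, here is a minimal sketch of how such a DOI-like permalink service could work. Everything here is hypothetical (the class, the identifiers, the URLs are all made up for illustration): the service stores only a mapping from a stable identifier to the content's current location, and when content moves, only the mapping is updated. No archival copy is ever kept.

```python
# Hypothetical sketch of a DOI-like permalink resolver. A stable identifier
# maps to the content's current URL; moving content means updating the
# mapping, not re-archiving the page.

class PermalinkResolver:
    def __init__(self):
        self._locations = {}  # identifier -> current URL

    def register(self, identifier, url):
        """Mint a permalink for content currently living at `url`."""
        self._locations[identifier] = url

    def update(self, identifier, new_url):
        """Point an existing permalink at the content's new home."""
        if identifier not in self._locations:
            raise KeyError(f"unknown permalink: {identifier}")
        self._locations[identifier] = new_url

    def resolve(self, identifier):
        """Return the current URL (a real service would answer
        with an HTTP redirect to this address)."""
        return self._locations[identifier]


resolver = PermalinkResolver()
resolver.register("10.9999/example.2005.01", "http://old-host.example/paper")
# The cited page moves; the permalink itself never changes.
resolver.update("10.9999/example.2005.01", "http://new-host.example/paper")
print(resolver.resolve("10.9999/example.2005.01"))
```

The trade-off the sketch makes visible is the one described above: the resolver always finds the content's current home, but it cannot show you the page as it looked on the day it was cited; that job falls to whoever keeps a local copy.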
Jon Udell likes it, but I'm feeling like it needs a little work.