The Internet Is Failing The Website Preservation Test

I blogged a few months back about the growing problem of dead links (known as link rot) on research websites. Unsurprisingly, this problem is not confined to academic websites. An article by Rob Miller in TechCrunch, called "The Internet Is Failing The Website Preservation Test," makes the case for a digital Library of Congress to preserve and protect all of the content on the internet to counter this growing problem. Preserving the integrity of the web for posterity cannot, he (and others) argue, be left to content publishers or to non-profits like the Internet Archive's Wayback Machine or the British Library's UK Web Archive. The web is such a vital information source to us all now that it must be preserved for future historians.
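To make link rot concrete: the sketch below (not from Miller's article) checks whether a URL still resolves and, if it doesn't, queries the Internet Archive's public Wayback Machine availability API for the closest archived snapshot. The API endpoint is real and documented at archive.org; the example URL and the function names are illustrative choices, not anything the article prescribes.

```python
# A minimal sketch of link-rot detection and mitigation, assuming only the
# Python standard library. Dead links are reported along with an archived
# copy from the Wayback Machine, if one exists.
import json
import urllib.error
import urllib.parse
import urllib.request


def is_alive(url, timeout=10.0):
    """Return True if the URL still responds with a non-error status."""
    try:
        req = urllib.request.Request(url, method="HEAD")
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status < 400
    except (urllib.error.URLError, ValueError):
        # HTTPError (404, 500, ...) is a subclass of URLError, so a dead
        # or unreachable link lands here.
        return False


def wayback_snapshot(url, timeout=10.0):
    """Return the closest archived copy of `url`, or None if unarchived."""
    api = ("https://archive.org/wayback/available?url="
           + urllib.parse.quote(url, safe=""))
    try:
        with urllib.request.urlopen(api, timeout=timeout) as resp:
            data = json.load(resp)
    except urllib.error.URLError:
        return None
    closest = data.get("archived_snapshots", {}).get("closest", {})
    return closest.get("url") if closest.get("available") else None


if __name__ == "__main__":
    # Hypothetical link list; in practice this would come from a crawl
    # of a site's outbound references.
    for link in ["http://universal-machine.blogspot.com/"]:
        if is_alive(link):
            print("alive:", link)
        else:
            print("dead:", link, "-> archived copy:", wayback_snapshot(link))
```

A script like this only patches over the problem after the fact, which is exactly the article's point: systematic preservation needs an institution, not per-site repair.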

from The Universal Machine http://universal-machine.blogspot.com/2015/09/the-internet-is-failing-website.html
