One useful and powerful tool is the retroactive analysis and mining of historic data. There are enormous libraries that until now were maintained only through expensive, explicit effort. Most of the film mankind has shot, printed, and used is now in landfills; what remains will survive only through digitization, functional metadata for authentication and sourcing, and network archiving. Once it is made available, much of it will be used and seen. Historic events can also be reconsidered through this retroactive lens: we can literally watch stories flow across the nation, headline by headline, like the sun sweeping the continent. Information has always moved in waves, and through most of history those waves were far slower, and thus easier to read. Even regional TV has value both within its own markets (nostalgia) and beyond them (camp and Americana stock footage).
Persistent memory is the concept of an ever-growing, always searchable database of things across time as well as across space and contemporary networks. The benefit of persistence is that it lets us track change. This is best exemplified in wikis, which are open to everyone yet difficult to corrupt, because any edit can be rolled back to an earlier version so easily. The real power of persistent memory is seen in the kind of search that identified the 9/11 terrorists: retroactive data mining, connecting the dots among web caches and phone records, and cross-checking numbers back through time to establish relationships. Google Maps and MSN satellite imagery are great, but they would be greater still if we could look at an area across time as well. Sam Raimi's Century Project considers photographs of cities growing and morphing over time; such images will be as useful as they are beautiful. Any such system will yield results, as every TV weather roof-cam demonstrates.
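The wiki property described above, that persistence makes corruption cheap to undo, can be sketched in a few lines. This is a hypothetical illustration, not the storage model of any particular wiki engine: every write is appended rather than overwritten, so any page can be read as of any version, and "rollback" is just re-publishing an earlier version.

```python
from collections import defaultdict

class VersionedStore:
    """Minimal sketch of persistent memory: an append-only record store
    where nothing is ever deleted, so the full history stays searchable."""

    def __init__(self):
        self._history = defaultdict(list)  # key -> list of (version, value)

    def write(self, key, value):
        versions = self._history[key]
        versions.append((len(versions), value))

    def read(self, key, version=None):
        """Read the latest value, or the value as of any past version."""
        versions = self._history[key]
        if not versions:
            raise KeyError(key)
        if version is None:
            return versions[-1][1]
        return versions[version][1]

    def rollback(self, key, version):
        """Repair vandalism by re-appending an earlier version as the newest;
        the vandalized copy remains in the history, available for study."""
        self.write(key, self.read(key, version))

store = VersionedStore()
store.write("article", "original text")
store.write("article", "vandalized text")
store.rollback("article", 0)
print(store.read("article"))  # -> original text
```

The design choice is the essay's point in miniature: because the bad edit is kept rather than erased, the database grows, the repair is trivial, and the act of corruption itself becomes part of the searchable record.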
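The retroactive data-mining described above, establishing relationships by cross-checking records back through time, can also be sketched. The records, numbers, and dates below are invented for illustration; the point is that a persistent archive lets you ask "who was connected to whom, as of any past date?" by rebuilding the relationship graph up to a cutoff and searching it.

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical call records: (caller, callee, timestamp). Invented data.
records = [
    ("555-0101", "555-0202", datetime(2001, 3, 1)),
    ("555-0202", "555-0303", datetime(2001, 5, 12)),
    ("555-0101", "555-0404", datetime(2001, 8, 30)),
]

def build_graph(records, until):
    """Retroactive view: keep only links established before `until`,
    so the same archive answers the question as of any past date."""
    graph = defaultdict(set)
    for a, b, ts in records:
        if ts <= until:
            graph[a].add(b)
            graph[b].add(a)
    return graph

def connected(graph, start, goal):
    """Connect the dots: are two numbers linked through any chain of calls?"""
    seen, frontier = {start}, [start]
    while frontier:
        node = frontier.pop()
        if node == goal:
            return True
        for nxt in graph[node] - seen:
            seen.add(nxt)
            frontier.append(nxt)
    return False

g = build_graph(records, until=datetime(2001, 6, 1))
print(connected(g, "555-0101", "555-0303"))  # True: linked via two hops
print(connected(g, "555-0101", "555-0404"))  # False: that link came later
```

Because every record is kept with its timestamp, the same data answers different questions at different cutoffs, which is exactly what an ordinary database that only stores current state cannot do.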
The value of all these applications lies in quantity over quality. A database must be open to searching in order to create the products that give the database itself its value.