Low-value content is destroying the usefulness of intranets and public websites. It needs to be stored separately.

Andrew Leung is a computer science researcher at the University of California. His team analyzed a large data/content environment over a three-month period. Their findings included the following:

* More than 90 percent of the files were never accessed.
* Of the files that were accessed, 65 percent were opened only once.
* Most of the rest were opened five or fewer times.
* About a dozen files were opened 100,000 times or more.

The study also found that the ratio of content/data being accessed to new content/data being published was about 2:1. In previous studies, this ratio was 4:1 or higher. In other words, the rate at which we publish new material continues to grow relative to the rate at which we access what has already been published.

Recently, I've been testing the quality of a search engine for a commercial organization's public website. This organization sells a wide range of products, and its customers' search behavior reflects this. However, when I search for some of the company's most popular products, the results are full of links to the press archive and other old, out-of-date content. Some of the content is misleading or simply wrong, describing, for example, a feature of a product that has long since been replaced. Poor-quality, low-grade, minor-interest content is choking the usefulness of the search engine.

I find this happening again and again. It happens on intranets, many of which have become dumping grounds for low-quality content. One reason is that a great many organizations have no clear strategy for managing their content/data. Because there is no other place to put "stuff," many people simply store it on the intranet, which of course bulges and bulges.
Governments have a particularly severe problem when it comes to managing content. A key reason is that the Freedom of Information Act suffers from the law of unintended consequences. Some government people are piling everything they can find onto their websites so that they can say they've made it available to the public. You may not be able to find it, but it is there somewhere.

Most data and content that we create is next to useless. Nobody will ever be interested in looking at it again. Another chunk has only fractional demand. Yet we need to store most of this low-level stuff somewhere, just in case of some unforeseen event.

There is a saying: What do you get when you cross a fox with a chicken? A fox. When you manage low-level content and high-quality content on the same website, the low-level content smothers and eats up the high-quality content. We must therefore manage them separately. We need a website for the low-level stuff, but our primary website should be reserved for the high-quality content that people actually need today. In every environment, there is a small set of content with disproportionate demand and value. As our content/data universe explodes, it has never been more important to manage this precious content separately from low-demand data.

About the Author

Gerry McGovern, a content management author and consultant, has spoken, written and consulted extensively on writing for the web and web content management issues since 1994.