An Article by Gerry McGovern
Large websites often struggle to develop an efficient, cost-
effective publishing model. Centralized publishing ensures
consistent quality in what is published, but it is often slow and
frustrating. Decentralized publishing is faster and often more
cost-effective, but can result in inconsistent quality unless
rigorous publishing standards are adhered to.
There is a definite trend towards the centralization of
information architecture: metadata and classification design,
navigation, search, layout and graphic design. Because the Web
is inherently a navigational space, readers like a consistent
architecture. Of course, adhering to standard templates is
also more cost-effective, faster to implement, and easier to
maintain.
There is less consensus on what sort of publishing process to
implement. The centralized model tends to work well where the
organization does not have extensive publishing expertise. Many
organizations simply do not have a history of publishing.
Sometimes, staff don't have the skills or the interest in
publishing content, whether to the intranet or public website.
Often, such organizations don't have all that much to publish
on an ongoing basis. Thus, a centralized model can work well,
where a small central team coordinates and publishes content
for the entire organization.
One area of particular debate is in relation to metadata. Some
believe that trained librarians (or similar professionals)
should be responsible for the input of metadata. I believe
that metadata needs to be as decentralized as possible. In
fact, it needs to become core to the writing process.
(But that's for another issue.)
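To make the idea concrete, here is a minimal sketch (the field names and controlled vocabulary are hypothetical, not drawn from any particular system) of what "metadata as part of the writing process" can look like: the author supplies subject and keywords at writing time, and the publishing system validates them against the organization's classification, rather than queueing the piece for a central librarian.

```python
# Hypothetical sketch: decentralized metadata entry with central standards.
# Authors attach metadata while writing; the system enforces the
# controlled classification automatically at submission time.

CONTROLLED_SUBJECTS = {"finance", "hr", "marketing"}  # assumed vocabulary

def submit_article(title, body, subject, keywords):
    """Accept a submission only if its metadata is complete and on-vocabulary."""
    if not title or not body:
        raise ValueError("title and body are required")
    if subject not in CONTROLLED_SUBJECTS:
        raise ValueError(f"unknown subject: {subject}")
    if not keywords:
        raise ValueError("at least one keyword is required")
    return {"title": title, "body": body,
            "subject": subject, "keywords": list(keywords)}

record = submit_article("Q3 results", "Revenue grew...", "finance", ["revenue"])
```

The design point is that the standard (the controlled vocabulary) stays central while the work of applying it is pushed out to the writer.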
Centralized publishing is, by definition, restrictive. The
publishing workflow tends to be slower and more convoluted.
Sometimes, centralized publishing can be simply too
restrictive. "Since the centralized group has tried to take
over, it's all going South in a hand basket," Gail Bennett,
who works for a major corporation, told me in an email.
"They want full control for editing which makes no sense,"
Gail continues. "Instead of me changing one report link which
takes about 30 seconds, I'm supposed to gather and send all
the info up the road to sit in a huge priority waiting list.
By the time they get to our edits weeks later, they're out of date."
Such a situation is by no means uncommon. Where content needs
to be published quickly, the central team often becomes a
bottleneck. However, it's not simply a matter of allowing a
publish-at-will approach. Allowing everyone to publish
whatever they want, whenever they want, is a recipe for chaos.
So, if you feel you need to decentralize, you must put in
place proper structures. These include:
* A style guide: This should cover style and tone, usage,
glossary and references. Too many organizations create style
guides and then never adhere to them. This is counterproductive.
* A publishing policy: You need to establish policies that
deal with scheduling and commissioning, metadata and
classification, editing and removal of old content,
promotion, legal issues, etc.
* Publishing measurables: If you're going to allow people to
publish, they must be measured on what they publish. It's
as simple as that.
* Training and evangelism: Quality publishing is a complex
task requiring skill and experience. Few organizations have
reached a level of sufficient competence. To upgrade
skills, extensive training programs will be required.
Before such training is initiated, staff need to be
evangelized on the importance of getting publishing right.
Gerry McGovern has spoken, written and consulted extensively on web content management issues since 1994.
This article was republished with permission.