Getting agreement on a decision like a big web presence change can be tough. Actually, scratch that. The problem is slightly different: getting people to think they agree can be simple, but having them understand what they are agreeing to (including the implications) is more of a challenge.

This disparity, thinking there is agreement when the consensus is actually ill-informed, often leads to problems late in the web presence change. Problems that surface late in the game are especially damaging, because they can be difficult to react to and can leave long-lasting animosity or distrust between teams.

As we previously covered, there are four questions to answer early (we’re talking long before the RFP) when significantly changing a web presence. And since the implications of decisions are often the sticking point, visualizing the impact of possible web presence transformations can help clarify what is actually changing.

Shock Into Conversation

That said, one technique for getting people on the same page stands out: estimating the effort of transforming content. Some form of content transformation is usually part of any big change to a digital presence, whether changing platforms or staying in place.

Do an estimate as early as possible, with the explicit goal of driving discussion. The first estimate will probably shock everyone, and many of the content-related issues are concrete enough for most stakeholders to engage with.

But we’re not talking about just any estimate. Teams have far more control over content transformations than they realize, and in particular they have the control knobs of weight (how much stuff), distance (how far the content moves from the way the current web presence works), and quality (what quality level the content will reach, and how consistent it will be). The estimate, even a quick early one, should map out the assumptions behind these control knobs, in particular the weight and quality assumptions.
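
To make those control knobs concrete, here is a minimal sketch (in Python) of how an early, back-of-the-envelope estimate might combine them. The content types, counts and per-item hours are hypothetical, chosen purely to illustrate how weight and quality drive the total; they are not figures from any real project.

```python
# Back-of-the-envelope content transformation estimate.
# All content types, counts and per-item hours are hypothetical,
# purely to show how weight, distance and quality drive the total.

# Weight: how much stuff, broken down by content type.
inventory = {
    "news_article": 4000,
    "product_page": 600,
    "landing_page": 120,
}

# Distance and quality, collapsed into hands-on hours per item:
# "migrate" = move mostly as-is, "improve" = rework copy, metadata
# and images to the new standard.
hours_per_item = {
    "news_article": {"migrate": 0.1, "improve": 1.5},
    "product_page": {"migrate": 0.25, "improve": 3.0},
    "landing_page": {"migrate": 0.5, "improve": 6.0},
}

def estimate(plan):
    """Total hours for a plan mapping content type -> target quality."""
    return sum(
        inventory[ctype] * hours_per_item[ctype][quality]
        for ctype, quality in plan.items()
    )

# Scenario A: improve everything (the implicit assumption in many plans).
print("Improve everything:", estimate({
    "news_article": "improve",
    "product_page": "improve",
    "landing_page": "improve",
}), "hours")

# Scenario B: migrate the bulk as-is, improve only the high-value types.
print("Improve selectively:", estimate({
    "news_article": "migrate",
    "product_page": "improve",
    "landing_page": "improve",
}), "hours")
```

Even with made-up numbers, putting two scenarios like these side by side usually produces the shock, and the discussion, that the early estimate is meant to trigger.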

One reason to anchor the estimate on quality assumptions is that it is exceedingly easy to reduce planning to a single dimension: automated versus manual change.

Dancing Between Quality and Effort

But the first question isn’t whether or not to automate. A more productive early question is ‘What quality level are we attempting to attain, and what are the impacts of attaining it?’ Then we dance between quality and effort as the planning discussion continues through the project. For instance, if the initial estimate is out of whack with the budget, the team can discuss what content to delete in order to make room for the hands-on time required to improve the most important content.
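
Continuing the hypothetical sketch above, the same arithmetic can support that discussion. If, say, the budget only covers around 3,500 hours, one option is to delete most of the low-value content and spend the freed-up time improving what remains (all numbers here are still illustrative assumptions):

```python
# Continuing the hypothetical estimate above: trade volume for quality.
budget_hours = 3500

# Delete roughly 90 percent of news articles, hand-improve the rest
# plus the product and landing pages.
kept_articles = int(4000 * 0.1)    # 400 articles survive the review
triage = 4000 * 0.05               # ~0.05 h per article to decide keep/delete
improve = (
    kept_articles * 1.5            # improve the surviving articles
    + 600 * 3.0                    # improve product pages
    + 120 * 6.0                    # improve landing pages
)
total = triage + improve
print(f"Delete-and-improve plan: {total:.0f} hours "
      f"({'within' if total <= budget_hours else 'over'} budget)")
```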

One of the goals in this whole process is to differentiate between what’s easy and what’s important. It may mean that we spend 80 percent of our time on 1 percent of the existing content, deleting a large amount of the rest. But if we start with the automate-or-not question, we may end up dumping all the stuff that’s on the current site — in roughly the same shape — onto the new site, which doesn’t meet the goals of our big changes.

Discerning importance isn’t just figuring out what content is important. It’s also figuring out what changes we need to make to that content. For example, if better flow of content across the site is a goal, then the content needs meaningful metadata. That may turn out to be difficult, but if it’s the entire reason for the web changes, then we need to plan for it.

By estimating early, we may discover that better metadata is all we can afford (and not, for example, improving all of the images). Without that early estimate, we could end up blindly doing what’s easy (in some situations, improving images may be relatively easy) rather than what is important (in our example here, better metadata).

Reframe the Problem

The estimate-shock-repeat process also gives room to reframe problems. For example, let’s say a key part of the vision requires better metadata. Initially we may decide this means creating a very detailed taxonomy. But if the cost of this is too high, reframing the problem to concentrate on a narrower, easier-to-apply taxonomy could get us 80 percent of the way towards the goal at far lower cost.
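
As a hypothetical illustration of that reframing (the taxonomy sizes and per-item tagging times below are made up for the example):

```python
# Hypothetical comparison: detailed taxonomy versus a narrow one.
items_to_tag = 1000

options = {
    "detailed": {"terms": 500, "minutes_per_item": 12},  # needs trained editors
    "narrow": {"terms": 15, "minutes_per_item": 2},      # quick, consistent picks
}

for name, plan in options.items():
    hours = items_to_tag * plan["minutes_per_item"] / 60
    print(f"{name}: {plan['terms']} terms, roughly {hours:.0f} hours of tagging")
```

If the narrow taxonomy still supports the content flows that matter, the cheaper option may be the better use of the budget.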

Once again, it’s about refocussing the discussion on the issues that matter — optimizing resources towards goals — not automation-or-not or better-metadata-or-not.

To repeat, the estimate should clearly articulate the quality and quantity assumptions backing it. In addition, the estimate should:

  • Cover all of the content improvements everyone assumed would take place
  • Include the key unspoken content changes required to achieve the goals
  • Consider the possible steps of handling content during a transformation to help uncover needs
  • Center on rules that could be re-used on an ongoing basis to keep content quality high
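
On that last point, reusable rules can be as simple as automated checks run over the content inventory, both during the transformation and afterwards. A minimal sketch, assuming content items are exported as plain dictionaries (the field names here are hypothetical):

```python
# Hypothetical reusable content-quality rules. The item structure and
# field names are assumptions for illustration.

def check_item(item):
    """Return a list of quality problems for one content item."""
    problems = []
    if not item.get("description"):
        problems.append("missing description metadata")
    if not item.get("topics"):
        problems.append("no topic tags (blocks content flow across the site)")
    for image in item.get("images", []):
        if not image.get("alt"):
            problems.append(f"image {image.get('src', '?')} is missing alt text")
    return problems

# Example: run the rules over an exported content inventory.
items = [
    {"title": "Annual report", "description": "", "topics": [], "images": []},
]
for item in items:
    for problem in check_item(item):
        print(f"{item['title']}: {problem}")
```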

By estimating the effort of transforming content, we can take a step towards getting key stakeholders on the same page early in the process, and we can iterate in order to hone the effort-to-quality balance.

Title image by msim1238 (Creative Commons Attribution 2.0 Generic License).