Baseball season is here. Now that we live in the world of "Big Data" and "data-driven organizations," many in the analytics field think about Billy Beane and the Moneyball effect, and how it brought data into the realm of "cool." When I read Moneyball, it struck a chord and inspired me to write a book about analytics.

On the other hand, I'm influenced by my roots in journalism, and I always feel like there needs to be balance … so I'm a bit uncomfortable with the religious fervor that seems to drive the current love affair with data. It seems like there are too many high-level discussions that involve the gathering of data but not enough consideration of how to use it.

Baseball Defies a Data-Alone Approach

So, with all this as a backdrop, I was just reading an article in the New York Times about how strikeouts seem to be going up among hitters. It offers lots of good data and charts, and lots of non-quantifiable reasons why this might be happening: stronger pitchers, more pitchers used during games, weaker hitters, swinging for the fences vs. contact hitting, and so forth.

It's one of those subjects that makes for a good discussion over a beer. I was recently in Havana and found that, at any time of day, a hundred or so people gather to talk baseball.

Yes, what's great about baseball is that for as much data as there is, you never really know the answer … the human factor plays a huge role in why players and teams are successful or not. There are reasons for results that data won't explain and that won't lend themselves to a clear “right” answer.

Baseball is fun because the stats do not predict the future.

Predictions vs. Perspective

Steve Jobs didn't like market research and didn't have much use for stats. He believed that he knew better than anyone else what the market wanted. Sometimes he was right and sometimes he wasn't. He was the ultimate HIPPO (Highest Paid Person in the Office). However, he was also the ultimate usability tester. He would see his ideas presented in hundreds of ways before settling on the designs that would go to market.

He would have his products engineered to fit the design because he was sure the designs would address customer needs.

Clearly both Beane's and Jobs's approaches have their merits, but neither guarantees success 100 percent of the time. Beane focuses on prediction; Jobs focused on his perspective and then refined it through testing.

Where is your organization on the Beane to Jobs scale?

Or perhaps it's better to ask, how many Beane to Jobs scales are there in your organization?

When talking about the rise of new metrics and data in baseball, Washington Nationals General Manager Mike Rizzo says in a March 28 Washington Post article by Adam Kilgore, "It's not old school or new school. It doesn't slight the grinder, old scout in the field. But it's a tool. It would be like scouting without a radar gun. Why would you do it when a radar gun is available?"

The article goes on to note:

The Nationals' analytics department sees its mission as gathering as much information as possible and organizing it in a way decision makers can easily digest, and stats are only part of the package. (They) sift through scouting notes, videos, public research and media reports.

No “religious” fervor here. Just a balanced approach between data and observations, with the objective of presenting analysis in a way that decision makers understand.

Matching the Data to the Need

In last month's article, I pointed out that a lot of what influences common practice around digital analytics is how easy it is to crank out reports from the analytics software and distribute them throughout the organization. While this doesn't foster adoption of analytics, getting data into as many hands as possible, as quickly as possible, is still seen as the goal.

There is even more of this "understanding gap" when presenting data to senior-level management. The analyst's love for the numbers and a vague feeling of responsibility to deliver reports replace the need to establish agreement on what should be reported and presented.

I was speaking with an organization that generates a significant amount of content funded by donors. The donors don't have many requirements for the metrics. They want to know that the funded articles are read, yet they don't have any benchmarks or targets; they just want to see the numbers trend upward. It isn't a very high performance bar, but in this case it is enough to continue funding.

One metric would fulfill this requirement: page views. However, there is a lot more one could do to get more insight … segment by visitor, frequency, domain, acquisition type and so forth. We're going to provide these additional metrics. The organization can still give the funders the required numbers, and we'll show it how the segmented data helps focus content development, create more targeted grant proposals and build readership around the target audience.
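To make that distinction concrete, here is a minimal sketch in Python/pandas of what "one number for the funders, segments for the content team" might look like. The file name and column names (page_views, acquisition_type, visit_date and so on) are illustrative assumptions, not the organization's actual data model or any particular analytics platform's export format.

```python
import pandas as pd

# Hypothetical export from the analytics tool: one row per visit to a funded article.
# Column names are assumptions for illustration only.
visits = pd.read_csv("funded_article_visits.csv")
# expected columns: article_id, page_views, acquisition_type, visit_date

# The single number the funders asked for: total page views, trended by month.
monthly_views = (
    visits
    .assign(month=pd.to_datetime(visits["visit_date"]).dt.to_period("M"))
    .groupby("month")["page_views"]
    .sum()
)

# The richer view for the content team: the same page views, segmented by how
# readers arrived, so editors can see which channels actually drive readership.
views_by_acquisition = (
    visits
    .groupby(["article_id", "acquisition_type"])["page_views"]
    .sum()
    .unstack(fill_value=0)
)

print(monthly_views)
print(views_by_acquisition.head())
```

The point of the sketch is that both tables come from the same data set; only the aggregation and the audience change.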

The point is that you can have more than one audience for the same data set … and each of these audiences will have different objectives and require different presentation and interpretation of the data. We aren’t talking about a "right vs. wrong" approach … it’s really a question of customizing your analysis and communication based on who’s using it.

You may have one set of decision makers that prefers to see things like Steve Jobs, and another like Billy Beane. One group may need to feel like their perspective is leading the conclusions; the other may need to feel like the data is leading the conclusions. The key to adoption and using data to inform is not in shoving data down people's throats … it is in giving them what they need and explaining to them how to use it. Some will be more ready for this than others.

So, think about where your decision makers are on the Beane/Jobs scale … and then, rather than thinking of more data to push, think about how to bring interpretation, education and training into a dialogue that draws on the strengths of both methods to improve your digital channel.

Image courtesy of David Lee (Shutterstock)

Editor's Note: To read more of Phil's thoughts on the big data trends, see his How to Untangle the Data Deluge