In Web Analytics Ranking World, Another Choice Emerges

Who's got the best web analytics software?

Forrester Research gave its answer in the Wave for web analytics it released in May. But Forrester's not the only one ranking analytics. Or ranking anything, as we just learned this week with Gleanster Research.

The latest to flex its ranking muscles is TrustRadius, an Austin, Texas-based forum for professionals to share "candid insights about business software."

And TrustRadius feels it has 230 good reasons for being a trusted source for analytics rankings -- the 230 reviews by analytics software users that ultimately led to the crowdsourcing firm's TrustMaps™ for Digital Analytics software.

"It's performed by market segment -- company size -- others are one size fits all," Vinay Bhagat, CEO of TrustRadius, told CMSWire when asked what makes his company's ranking style unique. "It's based upon real user information vs. a survey of vendors, a typical analyst approach. Analysts generally do a very poor job of collecting real user feedback. Their surveys are long feature checklists, and are often driven by historical relationships with vendors."

Transparency Examined

So we have here two sides of the rankings spectrum, it seems: crowdsourced versus analyst. We've explored each, and the latter made big news last month. The major analysts in the space, of course, are Forrester Research and Gartner Research, which produce rankings in Waves and Magic Quadrants, respectively.

Bhagat did not call out Forrester or Gartner specifically in his previous comments, but he did here. "It's transparent," Bhagat told CMSWire about his team's methodology. "Nobody understands just how Waves and Quadrants are calculated. We are 100 percent transparent in our methodology and the number of data points from real customers."

Asked about transparency in its Wave, and in direct response to Bhagat's comment, Forrester's David Truog, vice president of research quality, told CMSWire that the Wave methodology is transparent.

"For each Wave, after we develop our criteria for evaluating product vendors or service providers in a particular category, we solicit feedback from them about these criteria," Truog said. "We then adjust the criteria based on any feedback we believe justifies a change. Next, we collect data about what they offer in each of the areas covered by the criteria. We then assign scores for every criterion, to each offering. The scores are based on scales we develop that reflect Forrester's opinion about and analysis of what matters most in the category for each criterion."

In this process, Truog said, some providers disagree with Forrester's responses to their feedback about the criteria, and providers sometimes disagree with each other's feedback, too. Some also disagree with Forrester's scales.

"However, these two areas of potential disagreement do not amount to a lack of transparency," Truog said. "They reflect differences of opinion about the definition of the category and about what businesses need from providers' offerings in the category."

Reached by CMSWire to discuss Gartner's transparency in its Magic Quadrant, Gartner spokesperson Andrew Spender cited the firm's research methodologies.

TrustRadius' Analytics Leaders

The "who's got the best rankings model" debate aside, TrustRadius released today its analytics winners. 

(Infographic: Top Rated Digital Analytics Software)

The best digital analytics software tools for enterprises, based on user ratings and market segment adoption, are:

  • Adobe Analytics: the most widely used paid tool amongst enterprises, according to TrustRadius
  • Google Analytics Premium
  • Google Analytics (free)  

Strong performers are:

  • AT Internet
  • comScore Digital Analytix
  • Piwik

The best digital analytics software tools for mid-size companies are:

  • Adobe Analytics
  • Google Analytics (free)  

Strong performers with high user satisfaction but lower mid-size segment adoption are:

  • GoSquared
  • AT Internet
  • KISSmetrics  

The best digital analytics products for small businesses are:

  • StatCounter
  • Piwik
  • Google Analytics (free)  

According to TrustRadius' report, all the digital analytics leaders in the small business segment are either totally free or offer a widely adopted free version, as small businesses tend to use free or low-cost tools that are often easy to implement and use.

Strong performers in the small business segment with high user satisfaction but lower small business segment adoption are:

  • GoSquared
  • Woopra

What Rocks in Analytics

Asked generally what leaders are doing well in analytics, Bhagat told CMSWire they are "aggressively moving to support tracking of individuals across devices, as well as integrating data from various sources beyond the website."

"They are making their products highly usable for adoption across an organization," he added. "They are innovating quickly."

Forrester saw analytics excellence this way in its Wave in May: Adobe, AT Internet, IBM and Webtrends lead the pack with strong web analytics offerings and strategies for the enterprise. Google is a "strong performer" that offers a premium product gaining support for its easy-to-use features. SAS Institute is a "contender," offering a customer-intelligence-based alternative.

Back to Rankings Debate

So what of this crowdsourcing method when it comes to vendor rankings?

We asked Bhagat a question we often hear: What do you say to people who say the crowdsourcing model is not trustworthy because vendors are influencing users to generate positive reviews?

"That's a fair critique for many sites," the TrustRadius CEO said. "TrustRadius is different because of our approach to proactively sourcing reviews from product users (85-90 percent), and secondly because of the rigor of our quality control processes."

Anatomy of a Ranking

Bhagat said his organization's rankings are derived from two metrics: the average likelihood-to-recommend rating from authenticated user reviews on TrustRadius within a given segment (small business, mid-sized companies and enterprises, defined by employee counts), measured against website adoption data.

"We use the top 10k websites as a proxy for enterprise adoption, top 100k for mid-market and the entire Internet for SMB," he added.

Leaders score above the median for both average user ratings and adoption. Strong performers score above the median for average user ratings but below the median for adoption. The analysis is performed by market segment.
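To make that median-split logic concrete, here is a minimal sketch in Python of how such a classification could work. The vendor names, ratings and adoption counts are hypothetical illustrations, not TrustRadius data, and the real TrustMaps calculation is run separately for each market segment and its adoption proxy.

```python
from statistics import median

# Hypothetical data for one market segment: each vendor has an average
# likelihood-to-recommend rating from user reviews and an adoption count
# within that segment's website proxy (e.g. the top 10k sites for enterprise).
vendors = {
    "Vendor A": {"rating": 8.7, "adoption": 1200},
    "Vendor B": {"rating": 9.1, "adoption": 300},
    "Vendor C": {"rating": 7.4, "adoption": 2500},
    "Vendor D": {"rating": 6.9, "adoption": 150},
}

rating_median = median(v["rating"] for v in vendors.values())
adoption_median = median(v["adoption"] for v in vendors.values())

for name, v in vendors.items():
    if v["rating"] > rating_median and v["adoption"] > adoption_median:
        tier = "Leader"            # above the median on both ratings and adoption
    elif v["rating"] > rating_median:
        tier = "Strong performer"  # well rated, but below-median adoption
    else:
        tier = "Other"
    print(f"{name}: {tier}")
```

In this toy example, Vendor A lands above both medians and is labeled a leader, while Vendor B's high rating but below-median adoption makes it a strong performer.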

"Between 85-90 percent of reviews on our site are garnered through our outreach to people that we identify as users of specific products," Bhagat said. "They are either existing members of our community, who have profiled themselves, or people who have identified themselves as product users in other ways, e.g. on their LinkedIn profile. The rest come from vendor referrals. We screen people out who are resellers, or have not used a product in the last six months."

Reviewer Authenticity?

A frequent question about these crowdsourcers concerns the actual users: How do we know they don't have an agenda?

Every reviewer needs to authenticate via a LinkedIn profile, Bhagat said, and TrustRadius checks every profile for validity and to ensure it's not a vendor reviewing itself or a competitor.

Here are some examples of reviews on TrustRadius.

"We also have an algorithm that computes a validity score for a LinkedIn profile," he added. "Every review is checked by a researcher before we publish it. We reject 3-5 percent of reviews which don't meet our standards; just seem like marketing pitches or vents without substance to back them up. We also validated our findings by interviewing market leading analytics experts."

Title image by Syda Productions / Shutterstock.