Businesses entering a procurement process for an enterprise search application face a steep learning curve PHOTO: Jonathan Pendleton

The procurement process for an enterprise search application often involves an imbalance in information: the prospective customer typically has no experience selecting such software, while the vendors (in most cases) have several hundred such selections behind them. This is a classic case of information asymmetry. The Request for Information (RFI) process can go far toward closing this gap, provided it is handled well.

Over the last few months I have been assisting two global clients select a replacement for their existing enterprise search application. The project teams had no prior experience of search procurement, but in both cases had carried out a significant amount of work to specify business and user requirements. They set this out in a Request for Information, along with around 20 very specific questions split between functional and implementation requirements. Both clients were willing to consider open source and commercial solutions, which is still quite unusual. As an outcome of the RFI, I read submissions from 11 different vendors, only two of them common to both clients.

In spite of the thoroughness of both RFIs, even the most well-crafted submissions failed to meet customer expectations in certain areas. By highlighting these areas, I hope to help others adapt their RFI process accordingly.

Pitfalls of the RFI Process

Feature Overkill

The functional requirements in the RFI were, in effect, mandatory requirements. Far too many vendors used this as an excuse to list out everything in the toy box on the basis that they had a toy for every requirement. Vendors consistently failed not only to address the requirement itself, but also to read the introduction to the RFI and appreciate why the requirement was business-critical.

The customer was looking for a response along the lines of, “we see what you are trying to achieve — this is how we could not only meet the requirement, but offer additional value.”

Comparative Scoring

When selecting a vendor, there's a risk of getting too deep into the process based on a numeric score. This is especially the case with search, where the solution is a development platform and not a product. Scoring does, however, provide a way for the project team to diligently read and understand the responses to each question.

The vendors seemed oblivious to this requirement, offering vague statements about product features and integration requirements that were impossible to score on a comparative basis, despite the clarity of the questions.
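The comparative scoring the questions were designed to support can be sketched in a few lines. This is a minimal illustration, not a tool either client used: the question names, weights and 0-to-3 scale are all hypothetical.

```python
# Hypothetical per-question weights reflecting business criticality.
weights = {"query relevance": 3, "security trimming": 3, "connector coverage": 2}

# Hypothetical scores awarded by the project team
# (0 = requirement not addressed, 3 = fully addressed with added value).
scores = {
    "Vendor A": {"query relevance": 3, "security trimming": 1, "connector coverage": 2},
    "Vendor B": {"query relevance": 2, "security trimming": 3, "connector coverage": 3},
}

def weighted_total(vendor_scores):
    """Sum each question's score multiplied by its weight."""
    return sum(weights[q] * s for q, s in vendor_scores.items())

# Rank vendors by weighted total, highest first.
for vendor, vs in sorted(scores.items(), key=lambda kv: -weighted_total(kv[1])):
    print(vendor, weighted_total(vs))
```

The point of the exercise is not the final number but the discipline it imposes: a vague, unscoreable answer shows up immediately as a gap in the grid.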

Risk Analysis

Both clients were undertaking quite innovative search implementations. With innovation comes risk. In both cases we listed out some of the risks we were aware of, and asked the vendors to review and add to them based on all the implementations they had undertaken. 

The question was a test to see how well they had understood the project objectives. Only two of the 11 vendors offered a response that showed a good understanding of the project.

Team Resources

Another question asked vendors about the skills the customer would need to support the implementation and ongoing management of the search application. In most cases the response just covered IT-related skills and gave no indication of the optimum search support team. 

As with so many of the responses to other questions, the answer often started with "It all depends," even when the RFI provided enough detail to support a substantive response.

Client Case Studies

The RFI specifically asked for recent implementations (which could be anonymized) from the last 12 to 18 months that demonstrated how the vendor had addressed similar requirements on other projects. The ultimate objective was to meet with the organizations in question and learn more about the customer/vendor partnership, but that would be in the future.

At the RFI stage, my clients just wanted to know the vendor had recent experience in some of the more unusual elements of the project. The results were not at all helpful. I was surprised to find none of the vendors had a user group, even on a virtual basis.

Pricing

In the end money is going to have to change hands, not just for the software licenses but for professional fees from integration partners and for the search support team. My clients were looking for a Rough Order of Magnitude (ROM) number, knowing the final cost might fall anywhere from 50 percent below to 50 percent above that figure. It is no more than an informal starting point, but it allows the project team to gain authorization from the project sponsor to at least proceed with discussions.
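The arithmetic behind the ROM band is simple, and worth making explicit: a figure quoted at this stage only bounds a wide planning range. A brief sketch, with a purely illustrative $400,000 figure:

```python
# ROM band: final cost may fall anywhere from 50 percent below
# to 50 percent above the quoted figure. The figure itself is
# illustrative, not taken from either client's project.
rom = 400_000
low, high = rom * 0.5, rom * 1.5
print(f"ROM ${rom:,}: plan for ${low:,.0f} to ${high:,.0f}")
```

Even at that precision, the sponsor can decide whether the conversation is worth continuing.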

The initial responses to this question were at best unintelligible, even though the RFI disclosed content volumes and query levels. The crucial question the RFI needed to resolve was whether the functionality offered fell within the initial budget; pricing is always subject to due diligence and contract negotiation. In the end, it took several weeks and multiple emails to pin down an ROM figure.

The Choice Becomes Clear

I’m certainly not going to disclose the best or the worst. Some of the responses were very good and created a feeling within both the project teams that these were vendors they could work with. 

Enterprise search implementation is always challenging, and an important objective of the RFI was to assess whether a vendor just wanted to sell and move on, or wanted to become a longer-term partner committed to bringing high-quality search to the customer.