As I wind down the last few days of my Forrester career before I retire, I am very conscious of the fact that I am leaving behind two customer experience (CX) analysts who are in various phases of creating a Forrester Wave™. I won’t be here to see them through the process, so I wrote down my best advice for getting through it with high quality and a minimum of drama.

It then occurred to me that I had never done anything parallel for my friends in the solutions provider community. And I have, in fact, come to know and like many of you, even though I am going to refrain from naming names, because I can only imagine the conspiracy theories that would generate.

In the spirit of providing a public service, however, I want to leave behind my personal (not official Forrester) top 10 tips for maximizing your performance in evaluative research. Here is the TL;DR:

  1. Provide evidence.
  2. Avoid irrelevance.
  3. Be honest.
  4. Be cordial.
  5. Don’t ignore the guidance.
  6. Try to hit the dates.
  7. Don’t assign the B, C, or D team.
  8. Pick your shots.
  9. Don’t play lawyer.
  10. Don’t try to lower the scores of your competitors.

And here are the details:

Provide evidence. More than anything else, the analysts I’ve worked with want to get their ratings right. Getting them right requires giving products under evaluation every point they earn and none of the points they don’t earn. So how do analysts know whether or not a product has earned a point? They can’t just take vendor assertions on faith unless they also take all their competitors’ assertions on faith, an approach that would punish honesty.

The answer is that analysts look for (and reward) evidence. So if you assert, for example, that it’s easy to build a dashboard with your product, provide evidence by building a dashboard in front of the analyst. Seeing is believing. Or if you assert that clients get fabulous ROI from one of your service offerings, provide evidence by connecting the analyst with a client who will go on record as saying that they got fabulous ROI from that offering. And remember that if you’re uncertain as to what type of evidence will prove your point, just ask the analyst, who will always be happy to tell you.

Avoid irrelevance. It’s an incredible waste of everyone’s time when vendors try to sway analysts with anything other than evidence. “The market doesn’t agree with you!” is a splendid example of an irrelevant argument (that I’ve heard more than once). It usually translates as “People are buying our product, so it must be good.” But that’s just conflating marketing and sales success with product quality, which is a thunderously bad idea: Analysts hear “buyer’s remorse” often, usually when a client purchased something that did not live up to what they were promised during the sales process.

Other irrelevant arguments include telling analysts how passionate you are about your product (they take that as given), how concerned you are about the analyst’s reputation (thanks for that), and how well you performed in a different Wave that evaluated either a different product or the same product but in a different category. Inside baseball: If your product appeared in some other Wave, then the analyst already read that other Wave, spoke with the analyst who created the other Wave, and factored in what he or she learned that’s relevant to this Wave.

Be honest. I’m going to say this as kindly as possible: Analysts are sometimes told things that are not true. If you or anyone at your firm is ever tempted to cross the line from spin to sin, keep in mind that we almost always find out. How? Usually from your clients and prospects.

A former analyst who worked for me delivered on average 100 inquiry calls with our clients per quarter. Most of the calls were about the products and services she covered. Those calls generated hundreds if not thousands of data points about solutions providers. What’s more, she was often asked to review responses to RFIs and RFPs. One time, she had just finished evaluating a product where the vendor told her that a particularly important feature was in the current, shipping version of that product. A couple of weeks later, she was assisting one of our clients with an RFP process and heard the same vendor tell the prospect that the feature was on their roadmap for a future release. Oops.

Be cordial. “I really like them — they insult my intelligence and integrity,” said no analyst ever. “I’ll recommend them to my favorite clients so those clients can experience this same level of unprofessionalism” is another sentiment that I’ve never heard voiced in 24 years on the job. Seriously, when someone who is thinking about buying your product asks an analyst how you are to work with, do you really want the only basis for their answer to be that time someone from your firm pounded the table and swore at them?

The good news is that there is a solid upside to being polite and professional. I’ve personally put one particular vendor in consideration a number of times because I know for a fact they care a lot about the experience they provide to their clients and are always willing to have a polite discussion about where they have opportunities to improve. It’s a service to all parties when analysts connect people like that with potential buyers.

Don’t ignore the guidance. One of the best things about Forrester Wave evaluations is that they all follow the same high-quality, consistent methodology. And there is a team here whose sole purpose is to make sure analysts understand and apply that methodology without exception. Similarly, when analysts set rules like limiting the length of responses, it’s in service to cutting through the noise and getting at the signal. Despite that, we see solutions providers that drop a manual into a response field and expect analysts to find the answer for themselves — which they try to do, but do you really want to take the chance that they’ll miss or misinterpret it? You shouldn’t. Instead, take the time to provide just the answer, and everyone will be lots happier.

The worst thing a vendor can ignore, IMO, is the scenario that the analyst provides for their demo. That scenario is designed to expose the capabilities that the analyst has deemed most valuable for end users. So if your competitors follow it (most do) and you don’t, the analyst will not see important capabilities and be forced to assume that you don’t provide them. Then there will be a scramble, either at the end of the demo or during the initial fact check, to prove that those capabilities do in fact exist. Please don’t do that to yourself. Or the analysts.

Try to hit the dates. It’s common for vendors to go dark at critical phases of an evaluation process. That definitely puts the vendor at a disadvantage. Here’s why: Behind the scenes at Forrester, scores of people spend a ridiculous number of hours planning how many evaluative reports we will do per year, how long each will take to produce, and when they will publish. We take into account factors such as production capacity and when people (including people at solutions providers and at their clients) are likely to be away on vacation. We then go out of our way to tell everyone involved in the evaluation what our deadlines will be for each phase, well in advance. There’s close to zero fat in the schedule. So if vendors blow us off when a response is due, they deprive themselves of time and attention that analysts would otherwise spend on their response.

Don’t assign the B, C, or D team. Analysts know that vendors are busy and sincerely appreciate the effort you all put into participating in a Wave. In turn, vendors should know that participants who put in the effort to thoughtfully and expertly answer questions, prepare demos, and provide reference clients are far more likely to get every point they deserve compared to those that assign whoever happens to be available to provide whatever happens to be available. Unfortunately, it’s common for analysts to get a response to our questionnaire and think, “I know they’re better than this! What happened?” But analysts can’t cook the books and rate the vendor on what they think versus what the evidence says (see above). So please, involve the right people at your firm and put your best foot forward.

Pick your shots. There is a point in the Wave process where analysts send out their ratings to vendors and then the vendors respond. Analysts expect that, in their response, vendors will argue for higher ratings on some criteria. That’s great — it shows engagement. If the argument comes with evidence, it’s common for scores to go up — remember, analysts want to give you every point you earn. You know what doesn’t produce good results? A response that asks for a 5 out of 5 on every criterion. You know why? Because it screams, “I am not taking this seriously — I am just playing a game.” So please do yourself a favor and only dispute scores that you have rational reasons to dispute.

Don’t play lawyer. Sometimes a vendor seems to think that they’re in a trial, not an evaluation. When that happens, they twist words in an attempt to … what? Score debating points, maybe? I’m never sure. For example, one time I had a dozen vendors in a Wave I was editing. One of them was confused by one of the questions my analyst asked and provided a totally irrelevant response. Not wanting to penalize that vendor, I explained the question using other words so they could provide a better answer. In an attempt to let them save face, I framed this as, “I’m sorry if we inadvertently confused you.” I was trying not to rub it in that 11 of their peers understood something that they failed to grasp. But later, in an appeal of their rating, the vendor said, “Harley admits that the questions were confusing!” All I could do was sigh. Needless to say, that didn’t move their score.

Don’t try to lower the scores of your competitors. Imagine if analysts took every negative thing your competitors said about you as fact and rated you accordingly. Fortunately, that doesn’t happen, because analysts are smart. And yet, with every Wave, one or more vendors try to tell the analyst that we rated one or more of their competitors too high. “Everyone knows that they … ” and “I’m just trying to save you from a bad call … ” are both phrases that I have heard too many times to count. Please, please, please don’t undermine your own credibility by trying to tell us all the horrible secrets that you alone have somehow uncovered about your rivals. At best, it’s a waste of time: I have never heard of any analyst lowering a vendor’s score because of something another vendor told them. It just does not happen. So instead, please use your valuable face time to tell us good things about your own product or service. Raising your score is possible. Lowering your rivals’ scores is not.

I hope this has been helpful, and I wish you all the best possible success in all your future Wave evaluations!