In Napoleon Hill’s story “Three Feet from Gold”, a gold prospector stumbles upon a shiny ore, but as he digs deeper, the vein of gold mysteriously disappears. The man drills on desperately, but eventually gives up. Later, a mining engineer re-examines the site and finds that since the poor man hadn’t grasped the concept of fault lines, he had stopped just three feet from striking it rich!
Managing an effective sales force is a lot like gold mining in this respect. Persistence is key, but understanding your environment well enough to know exactly where your reps stand in relation to “gold,” or sales deals, is a game-changer. That takes experience and insights from data.
Mining existing sales information can yield great insights. Even for small sales forces, the amount of data accumulated over a year can help answer:
1. Which products sell better than others
2. Who the most effective sellers are, and what they have in common
3. Which types of customers are interested in which products
Unfortunately, many companies have not even attempted the basics – not because they don’t have the will or skills, but because analysis is constantly hindered by the ubiquity of “dirty data,” just like fault lines making the vein of gold ore disappear.
Data for sales analytics is often gathered without proper quality assurance, resulting in duplicate customer records (e.g., “Cisco Systems” vs. “Cisco”), incorrect units (e.g., $15 instead of $15,000), simple misspellings, and incomplete information. AGI estimates that 15% of CRM entries contain errors. No wonder sales executives often (and understandably) lack confidence in their data, much less in the insights garnered from it.
There are many ways to address these data issues. Master Data Management (MDM) is an increasingly popular enterprise-wide approach. When time is pressing and a systematic data audit is infeasible, a more practical strategy combines the right amount of duplicate checking, missing-data imputation (see below), and outlier removal.
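As an illustration, the three techniques just mentioned can be sketched in a few lines of pandas. The toy CRM records, the name-normalization rule, and the outlier threshold below are all assumptions chosen for demonstration, not AGI’s actual procedure:

```python
import pandas as pd

# Toy CRM extract with the kinds of "dirty data" described above:
# near-duplicate account names, a missing revenue value, and an
# outlier caused by a unit error ($15 entered instead of $15,000).
# All names and figures are illustrative, not client data.
sales = pd.DataFrame({
    "account": ["Cisco Systems", "Cisco", "Acme Corp", "Acme Corp", "Globex"],
    "industry": ["Tech", "Tech", "Manufacturing", "Manufacturing", "Tech"],
    "revenue": [150_000.0, 150_000.0, 80_000.0, None, 15.0],
})

# 1. Duplicate check: normalize names, then drop exact duplicates.
#    (Real projects often need fuzzy matching on top of this.)
sales["account_key"] = (sales["account"]
                        .str.lower()
                        .str.replace(r"\s+(systems|corp|inc)$", "", regex=True)
                        .str.strip())
deduped = sales.drop_duplicates(subset=["account_key", "revenue"])

# 2. Outlier removal: flag values far outside the typical range
#    (here, anything below 1% of the median looks like a unit error).
median_rev = deduped["revenue"].median()
cleaned = deduped[~(deduped["revenue"] < 0.01 * median_rev)].copy()

# 3. Missing-data imputation: fill the remaining gaps with the
#    average revenue of each account's industry.
cleaned["revenue"] = (cleaned.groupby("industry")["revenue"]
                      .transform(lambda s: s.fillna(s.mean())))
```

The order matters: removing duplicates and outliers first keeps the unit-error row from dragging down the industry averages used for imputation.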
Take a recent project for a software company as an example. AGI ran into data issues when modeling customer spending potential and share of wallet. The client’s data was incomplete and did not map out parent-child relationships among accounts consistently, so we rebuilt the entire account hierarchy by leveraging an external database. Even then, data was still missing for 20% of customers. That’s when we used “missing data imputation” to carefully fill gaps in the data with the respective industry averages. In the end, we validated our assumptions by comparing the model’s output to the actual total market size. This work helped our client formulate the right coverage strategy for the right accounts, with an annual revenue upside exceeding $190M!
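A minimal sketch of that imputation-and-validation step, assuming made-up account names, spend figures, and an external market-size benchmark (none of which come from the actual project):

```python
# Fill each account's unknown spend with its industry's average,
# then sanity-check the modeled total against an external
# market-size estimate. All figures here are illustrative.
accounts = [
    {"name": "A1", "industry": "Banking", "spend": 1.2},   # $M, known
    {"name": "A2", "industry": "Banking", "spend": 0.8},
    {"name": "A3", "industry": "Banking", "spend": None},  # missing
    {"name": "A4", "industry": "Retail",  "spend": 0.5},
    {"name": "A5", "industry": "Retail",  "spend": None},
]

# Compute each industry's average from the accounts we do know.
totals = {}
for a in accounts:
    if a["spend"] is not None:
        t = totals.setdefault(a["industry"], [0.0, 0])
        t[0] += a["spend"]
        t[1] += 1
industry_avg = {ind: s / n for ind, (s, n) in totals.items()}

# Impute: fill each gap with the respective industry average.
for a in accounts:
    if a["spend"] is None:
        a["spend"] = industry_avg[a["industry"]]

# Validate: the modeled total should land near the external
# benchmark; a large gap means the assumptions need revisiting.
modeled_total = sum(a["spend"] for a in accounts)
market_size_estimate = 4.2  # $M, assumed external benchmark
gap = abs(modeled_total - market_size_estimate) / market_size_estimate
assert gap < 0.10, "model total deviates too far from market size"
```

The validation step is what makes imputation defensible: filled-in values are only trustworthy if the resulting totals reconcile with an independent outside estimate.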
The project also demonstrated that, with the right mix of data-preparation and validation techniques, even imperfect sales data can yield actionable insights that drive sales results.
About the Authors
Ian Zhao is the manager for AGI’s Sales Benchmarking Practice. Manish Jindal is a consultant at AGI’s San Francisco Office.
Originally published by: Ian Zhao