“Data-driven” sounds impressive. It makes strategy sound objective, scientific, and precise. However, being “data-driven” frequently masks another issue: people use numbers to justify their existing beliefs.
Data doesn’t make decisions; people do. And when the wrong data, or selective data, is used to guide major launches, the results can be catastrophic.
I once saw a team justify an ambitious price using a “willingness-to-pay” survey of just 45 clinicians. Half of them admitted they didn’t understand the reimbursement policy. The result looked convincing in a spreadsheet but fell apart the moment payers saw it.
Another team used a forecast model that assumed every eligible patient would start therapy within six months of approval. Beautiful graphs. Completely unrealistic. They forgot about system delays, staffing shortages, and local budget cycles. The forecast missed by 60%.
The problem isn’t data itself — it’s how it’s chosen, interpreted, and challenged. Too often, teams fall into three traps:
- Confirmation bias: collecting data that supports what they already want to do.
- Oversimplification: smoothing messy realities into elegant but inaccurate models.
- Blind faith: believing numbers without asking how they were produced.
The solution is to treat data as a compass, not a dictator. It should guide judgement, not replace it. Combine quantitative evidence with qualitative insights from payers, clinicians, and patients. When the data looks too neat, test it harder. When something doesn’t fit the pattern, explore it — it may be the signal you most need to understand.
Being data-informed is smarter than being data-driven. The former allows for judgement and course correction; the latter can send you confidently into an iceberg while insisting the spreadsheet says “clear water ahead.”
Data is powerful only when paired with humility. The companies that use it best are the ones that keep asking: what might we be missing?
Key Takeaways
- Treat data as guidance, not gospel.
- Challenge “perfect” numbers — they often hide imperfect assumptions.
- Balance quantitative evidence with lived experience.
- Train teams to spot bias and question the source.
- Remember: data should inform decisions, not excuse them.