The Myth That 'More Metrics' Means 'More Value'
Cramming 50 metrics into a report isn't transparency; it's a liability. More numbers just mean more questions you can't answer. Here is how to cut the noise.
The Thud Factor is Not a KPI
I remember the days of physical reporting. We used to judge the quality of a report by the “Thud Factor”—the sound it made when you dropped the printed binder onto the boardroom table. A loud THUD meant you had done a lot of work.
We have moved to digital, but the mindset hasn’t changed. We just send 50-slide PDFs or spreadsheets with 40 columns instead. We are hiding behind volume.
I reviewed a monthly report recently that had 63 different metrics on the first page. 63! It had “Bounce Rate,” “Exit Rate,” “Time on Page,” “Pages per Session,” and “Scroll Depth.”
The client looked at this wall of numbers. Their eyes slid straight past the 62 green numbers and locked onto the one tiny red number in the corner: “Server Latency.”
“Why is the server latency up by 0.2%?” they asked.
The account manager didn’t know. The meeting derailed. We spent 45 minutes talking about server latency, which didn’t matter, instead of the record sales, which did.
The Clutter: Noise is a Liability
When you include every metric available in the API, you are not being transparent. You are creating a risk surface.
Every number you put on a page is a potential question you need to be ready to answer. If you have 50 metrics, you need 50 explanations. If you only have 5, you can be bulletproof on all of them.
More metrics mean:
- More Cognitive Load: The client has to work to find the signal.
- More Anxiety: “Why is that one red?”
- More Awkward Questions: Questions about irrelevant details that make you look unprepared.
It is dangerous. You are handing the client a shotgun and hoping they don’t shoot your foot.
The Clarity: The Rule of Five
I enforce a strict rule now: The Decision Mapping Protocol.
We do not report a number unless it maps to a decision. If a metric changes, what do we do?
- Impressions go up? We don’t do anything. (Vanity metric. Bin it.)
- Cost Per Acquisition goes up? We cut the budget. (Decision metric. Keep it.)
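The protocol above boils down to one filter. Here is a minimal sketch of it in Python; the metric names, values, and the `decision` field are all hypothetical, made up purely to illustrate the rule:

```python
# Illustrative sketch of the Decision Mapping Protocol.
# A metric earns its place on the page only if it maps to a decision.

report = {
    "impressions": {"value": 1_200_000, "decision": None},           # vanity metric: no action mapped
    "cpa":         {"value": 42.50, "decision": "cut budget if > 40"},   # decision metric
    "roas":        {"value": 3.1,   "decision": "scale winners if > 3"}, # decision metric
}

def decision_metrics(metrics):
    """Keep only metrics that map to a concrete decision; bin the rest."""
    return {name: m for name, m in metrics.items() if m["decision"]}

slim_report = decision_metrics(report)
print(sorted(slim_report))  # impressions is binned; cpa and roas survive
```

The point of the sketch is that the filter is a yes/no question, not a judgment call made slide by slide: if the `decision` field would be empty, the metric never reaches the report.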
We stripped that 63-metric report down to five key indicators.
- Spend (Are we on budget?)
- Revenue (Are we making money?)
- ROAS (Is it efficient?)
- Top Product (What is selling?)
- CAC (Is it sustainable?)
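Those five indicators are cheap to derive from the raw numbers you already have. This is a hedged sketch, not the actual report logic; the figures and field names below are invented for illustration, using the standard definitions ROAS = revenue / spend and CAC = spend / new customers:

```python
# Minimal sketch: deriving the five decision metrics from raw totals.
# All numbers and field names here are hypothetical.

raw = {
    "spend": 10_000.0,
    "revenue": 42_000.0,
    "new_customers": 280,
    "sales_by_product": {"Widget A": 25_000.0, "Widget B": 17_000.0},
}

summary = {
    "Spend":       raw["spend"],                                # are we on budget?
    "Revenue":     raw["revenue"],                              # are we making money?
    "ROAS":        raw["revenue"] / raw["spend"],               # is it efficient?
    "Top Product": max(raw["sales_by_product"],
                       key=raw["sales_by_product"].get),        # what is selling?
    "CAC":         raw["spend"] / raw["new_customers"],         # is it sustainable?
}

for name, value in summary.items():
    print(f"{name}: {value}")
```

Five lines of arithmetic, five answers a client can act on; everything else in the data lake stays in the lake.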
[TO EDITOR: Diagram showing a funnel. Top of funnel is wide, labelled “Data Lake (Everything)”. Middle is a filter labelled “The ‘So What?’ Test”. Bottom is 5 distinct icons/drops labelled “The Decision Metrics”.]
When we presented the new, slim version, I was nervous. I thought they would ask where the “Scroll Depth” went.
They didn’t. They looked at the revenue. They nodded. They signed off on the next month’s strategy in ten minutes.
It was brilliant. We stopped trying to prove how much data we had, and started proving how much we understood their business.
If the font size is 8, you are hiding something. If the font size is 24, you are telling the truth. Be brave enough to delete the noise.
FAQs
My client specifically asked for 'all the data'.
They are lying to themselves. Give them the summary, and put 'all the data' in an appendix file named 'Raw_Data_Do_Not_Open'. They will never look at it.
How do I choose which metrics to kill?
Use the 'So What?' test. If a metric goes up by 10% and you wouldn't change your strategy, delete it. It is noise.
Won't it look like we did less work?
It takes more work to write a short poem than a long rant. Smart clients know this. Dumb clients aren't worth the headache.