The reporting gap nobody talks about
Talk to enough business owners who have hired marketing agencies, and a pattern emerges. The most common complaint is not bad performance — it's that the reports “were full of metrics that meant nothing to my bottom line.”
Impressions, clicks, CTR, quality score, impression share. These are metrics the agency cares about. They are levers the media buyer pulls. They are not what the client hired you to deliver.
Your client hired you to generate leads, book appointments, fill units, get phone calls. When your report leads with click-through rate instead of “here is how many leads came in and what each one cost,” you are speaking a language your client does not speak. And they will eventually find an agency that translates.
What clients actually want (in their words)
We have talked to dozens of business owners who work with paid media agencies. Across industries — HVAC, property management, dental, legal — the wish list is remarkably consistent. Here is what they say they want from reports:
1. Did we get leads?
Not conversions. Not form submissions. Leads. Phone calls, booked appointments, contact form fills — in language they use, not yours. “You got 23 leads this month” beats “23 conversion actions were recorded” every time.
2. Is this good or bad?
Numbers without context are meaningless. “Cost per lead was $42” means nothing to a client who does not know if that is good. “Cost per lead was $42, down from $58 last month and well under our $65 target” — now they know.
3. What are you doing about it?
Clients want to know the agency is actively managing their campaigns, not just watching numbers. “We noticed the 2BR ad group CTR dropped, so we are testing new copy this week” shows you are paying attention. A chart of declining CTR with no commentary shows you are not.
4. What should I expect next?
Forward-looking context makes clients feel like they have a partner, not a vendor. “Based on current conversion rates, expect 6-8 tours from this batch” is more valuable than any backward-looking metric.
Notice what is missing from that list: impressions, quality score, impression share, audience segments, device breakdown. Those are your tools. They belong in your internal analysis, not in the client-facing report.
The before and after
The difference between a report clients ignore and one they actually read comes down to translation — taking your data and framing it in their language, with their goals as the lens. Here are three examples:
Before: Generic report section
“Google Ads delivered 2,450 impressions and 186 clicks this month, resulting in a CTR of 7.6%. There were 23 conversions at a cost per conversion of $42.17. Total spend was $970.00. Month-over-month, conversions increased by 15% while CPC decreased by 8%.”
After: Client-focused version
“Strong month. Your Google Ads campaigns generated 23 leads at $42 each — that is down from $58 last month and well under our $65 target. The spring campaign restructure is working. Based on your typical conversion rates, you should see 6-8 booked appointments from this batch. One thing I am watching: the branded search campaign is taking a larger share of budget than I would like, so I am capping it this week to push more toward the service-area campaigns.”
Before: Bad month, generic framing
“Conversions decreased 22% month-over-month, from 28 to 22. Cost per conversion increased from $38 to $51. Click-through rate remained stable at 6.8%. The account experienced reduced conversion volume across all campaigns.”
After: Bad month, with context and next steps
“Leads dipped from 28 to 22 this month, and cost per lead went up to $51. Not great, but expected — we launched a new campaign structure on the 12th and the algorithm is still learning. This typically takes 2-3 weeks to stabilize. Traffic quality looks good (CTR is holding), so the volume should recover. I am keeping a close eye on it and will adjust bids if we do not see improvement by mid-month. No action needed on your end.”
Before: Great month, buried in metrics
“The account achieved 45 conversions this period, representing a 50% increase over the prior period. Impression share increased to 78%. Average CPC was $3.42, down 12% MoM. ROAS improved to 4.2x. Quality scores across primary keywords average 8.2/10.”
After: Great month, framed as a win
“Best month since we started working together. 45 leads, up from 30 last month. Cost per lead dropped to $34 — our lowest yet. The new keyword structure and landing pages are clicking. At this pace, you are on track for 50+ leads next month. My recommendation: now is the time to increase budget by 15-20%. The campaigns are performing well enough that more spend should translate directly to more leads at a similar cost.”
Same data in both versions. But the client-focused version answers the four questions every client actually has: did we get leads, is this good or bad, what are you doing about it, and what should I expect next.
How business rules bridge the gap
Writing client-focused reports manually is entirely possible — many great agencies do it. The problem is scale. When you have 15 clients, each with different goals, different terminology preferences, and different anxiety triggers, maintaining that level of personalization takes real time.
This is where business rules become powerful. Instead of rewriting the same contextual framing every month, you define the rules once:
Example policy rules
- “When cost per lead exceeds $65, flag it as a concern and include what you are doing to fix it.”
- “Do not flag conversion dips in the first 60 days of a new campaign — explain that the learning period is expected.”
- “Always translate conversions into the client's language: tours for multifamily, calls for HVAC, consultations for legal.”
- “Include a forward-looking estimate using the client's historical conversion rates.”
- “CPC increases during peak season (March-May for HVAC, Jan-March for multifamily) are expected — frame them accordingly.”
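To make the idea concrete, here is one way rules like these could be encoded as data rather than rewritten prose each month. This is a minimal illustrative sketch, not Nooma's actual implementation — every name, threshold, and metric key below is a hypothetical assumption:

```python
# Hypothetical sketch: policy rules as (condition, commentary) pairs.
# Names, metric keys, and thresholds are illustrative, not a real product API.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    applies: Callable[[dict], bool]  # when does this rule fire?
    note: Callable[[dict], str]      # what commentary does it add?

RULES = [
    # Flag cost per lead above the client's target.
    Rule(lambda m: m["cost_per_lead"] > m["cpl_target"],
         lambda m: (f"Cost per lead (${m['cost_per_lead']:.0f}) is above the "
                    f"${m['cpl_target']:.0f} target; here is what we are doing about it.")),
    # Suppress dip alarms during a new campaign's learning period.
    Rule(lambda m: m["leads_mom_change"] < 0 and m["campaign_age_days"] <= 60,
         lambda m: ("Lead volume dipped, which is expected while the new "
                    "campaign structure is still learning.")),
    # Always translate "conversions" into the client's own vocabulary.
    Rule(lambda m: True,
         lambda m: f"You got {m['leads']} {m['lead_word']} this month."),
]

def report_notes(metrics: dict) -> list[str]:
    """Apply every matching rule and collect its commentary lines."""
    return [rule.note(metrics) for rule in RULES if rule.applies(metrics)]

# Example: the "bad month, new campaign" scenario from earlier in the article.
metrics = {
    "leads": 22, "lead_word": "calls", "cost_per_lead": 51,
    "cpl_target": 65, "leads_mom_change": -6, "campaign_age_days": 18,
}
for line in report_notes(metrics):
    print(line)
```

Because the $51 cost per lead is still under the $65 target, the concern rule stays quiet; the learning-period rule fires instead, which is exactly the framing the "bad month" example above used. The point of a structure like this is that the account manager's judgment is written down once and applied consistently.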
When those rules are applied automatically to every report, the output is not just a data summary. It is a narrative that reflects how the agency actually thinks about performance — the same way a great account manager would write it, but consistent across every client, every time.
The retention connection
Reports are not just about transparency. They are your primary retention tool. A client who understands what you are doing, why you are doing it, and what to expect next is a client who renews.
A client who receives a monthly PDF full of charts they do not understand is a client who wonders what they are paying for — even when the campaigns are performing well.
The irony is that many agencies have excellent campaign performance but average retention because their reporting does not communicate the value of what they are doing. The work is good. The story about the work is not.
The bottom line
Your clients do not want more data. They want fewer numbers and more meaning. They want to know what happened, whether it is good, what you are doing about it, and what comes next. That is it.
The agencies that figure this out — whether through disciplined manual writing or through tools that encode their expertise into every report — are the ones that keep clients for years instead of months.
Nooma was built around this principle: every report should answer the questions your client actually has, in the language they actually use, with the context only your agency can provide.