
Search Terms Report Google Ads: Audit & Hidden Gems

Practical guide to auditing the Google Ads search terms report: catch wasted spend, build negative lists, and surface hidden gems every week.

Lionel Fenestraz · 22 April 2026 · 17 min read · Updated: April 2026
Google Ads dashboard showing the search terms report with cost, CTR, and conversion filters applied

The Google Ads search terms report is the literal list of what your users typed before seeing (and sometimes clicking) your ad. It isn’t your keyword. It’s the real query. And according to Optmyzr’s match types study across 2,637 accounts (2023), 85.65% of high-spend accounts had better CTR with exact match than with broad, and 72.52% had better ROAS. Translation: running broad match without auditing your search terms lets low-intent traffic through. It’s exactly the pattern I see when I step into a new account nobody has touched in six months.

This guide skips the basics. It assumes you’ve already read the complete Google Ads guide for ecommerce and you understand keyword match types. Here you’ll find the exact workflow I run every Monday: how to get to the report in the 2026 UI, the 5 signals I prioritize (spend without conversions, surprise conversions, intent drift, competitor brands, high CTR with no sale), how to build negative lists that don’t rot after a month, and what changes with AI Max for Search. Thirty minutes a week. Real savings.

In 30 seconds:

  • The search terms report shows the real query, not your keyword. They’re different things, and mixing them up costs money.
  • The classic WordStream SMB PPC study estimated small businesses waste around 25% of PPC spend on inefficient clicks. The study is from 2013, but the pattern still shows up in 2026 audits.
  • Google has partially hidden terms since September 2020 for privacy reasons: a later analysis by Seer Interactive estimated 28% of spend ends up with no query detail.
  • 5 signals I hunt: spend with no conversion, unexpected conversions, intent drift, competitor brands, high CTR with no sale.
  • Recommended workflow: 30 minutes weekly to review, 90 minutes monthly to consolidate shared lists.

What is the search terms report and what does it reveal?

The search terms report is the full log of real queries that triggered an impression of your ad over the selected period. It doesn’t show your keyword list; it shows what the user typed. Per Google Ads Help (2025), every row in the report can be tied back to the keyword that triggered the bid, the ad group, and the match type applied.

Keyword vs search term: not the same thing

Your keyword is the bet. The search term is the actual event. If your broad keyword is running shoes, Google can serve your ad on queries like trail running shoes men, women's running plus size, or even cheap nike sneakers. Those are three different search terms that shared the same keyword. Mixing the two concepts is the mistake I run into most often during audits.

The 3-level funnel nobody explains clearly

The real sequence inside a Search auction works like this: (1) a user types a query, (2) Google decides which keyword in your account triggers the bid, (3) the ad tied to that keyword shows up. The search terms report exposes levels 1 and 2 together. That’s why it’s the only reliable source to tell whether your match type is capturing the intent you thought, or something else entirely.

In 80% of the audits I run on accounts with a monthly budget between $1,500 and $8,000, the client has never once filtered the search terms report by “cost > 0 AND conversions = 0”. It’s literally one button. And it’s the button that returns the most money.

Since September 2020, the search terms report only includes queries with a minimum search volume. You never see 100% of spend; the later analysis by Seer Interactive estimated 28% of spend on average is left without query detail.

How do you access the report in Google Ads 2026?

The current path in the Google Ads UI (April 2026) is: Campaigns → Insights and reports → Search terms. From there you can filter by campaign, ad group, date, and match type. Per the official Google Ads documentation (2026), the report updates with roughly a 12-hour lag for converting terms and up to 72 hours for low-frequency queries.

Filters I actually use (and the ones I ignore)

Don’t stare at every column. I lock in four: cost, clicks, conversions, and CTR. From there I apply three base filters.

  • Cost greater than $10 and conversions equal to zero: negative candidates.
  • Conversions greater than 1 and match type other than exact: promotion candidates.
  • CTR above 8% and conversion rate below 1%: landing page or match issue.

What do I ignore? The “added/excluded” column on the first pass. It creates noise and breaks my flow.
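The three base filters can be expressed as a small triage script over an exported CSV. This is a minimal sketch in plain Python, not a definitive tool: the column names (search_term, match_type, impressions, clicks, cost, conversions) assume you've cleaned up the export headers, and the sample rows are invented for illustration.

```python
import csv
import io

# Thresholds from the three base filters; adjust to your account's economics.
NEGATIVE_MIN_COST = 10.0  # cost > $10 with zero conversions -> negative candidate
PROMOTE_MIN_CONV = 1      # conversions > 1 on a non-exact match -> promotion candidate
HIGH_CTR = 0.08           # CTR above 8%...
LOW_CONV_RATE = 0.01      # ...with conv. rate below 1% -> landing/match issue

def triage(rows):
    """Bucket search-term rows into the three weekly filters."""
    negatives, promotions, landing_issues = [], [], []
    for r in rows:
        cost = float(r["cost"])
        clicks = int(r["clicks"])
        conversions = float(r["conversions"])
        impressions = int(r["impressions"])
        ctr = clicks / impressions if impressions else 0.0
        conv_rate = conversions / clicks if clicks else 0.0
        if cost > NEGATIVE_MIN_COST and conversions == 0:
            negatives.append(r["search_term"])
        if conversions > PROMOTE_MIN_CONV and r["match_type"].upper() != "EXACT":
            promotions.append(r["search_term"])
        if ctr > HIGH_CTR and conv_rate < LOW_CONV_RATE:
            landing_issues.append(r["search_term"])
    return negatives, promotions, landing_issues

# Made-up sample standing in for a real export.
SAMPLE = """search_term,match_type,impressions,clicks,cost,conversions
cheap nike sneakers,BROAD,900,40,22.50,0
trail running shoes men,PHRASE,1200,60,48.00,3
running shoes review,BROAD,500,45,9.80,0
"""

negatives, promotions, landing_issues = triage(csv.DictReader(io.StringIO(SAMPLE)))
```

In the sample, each row lands in exactly one bucket, but on real data a term can trip more than one filter; when that happens, the landing-page check comes first.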

Bulk export without losing data

For accounts with more than 20,000 monthly clicks, the UI falls short. Export to Google Sheets using “Download → .csv” and build a pivot table sorted by cost, descending. Most PPC managers I know process the search term report outside the native UI (Sheets, Optmyzr, Adalysis) because filtering and n-gram analysis are more flexible there. The Google Ads interface works for a first pass; for serious weekly work, export.

On large accounts, I use Google Ads Editor to download in batches and then a simple Google Sheets script that flags new queries compared to the prior week. The weekly diff is where the surprises show up.
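The weekly diff is just a set difference between two exports. Here is the same idea as the Sheets script, sketched in plain Python over two CSV exports; the file contents and the search_term column name are assumptions for illustration.

```python
import csv
import io

def term_set(csv_text):
    """Set of search terms in one export (column name is an assumption)."""
    return {row["search_term"] for row in csv.DictReader(io.StringIO(csv_text))}

# Two made-up weekly exports standing in for real downloads.
LAST_WEEK = "search_term,cost\nrunning shoes,50.00\ntrail shoes,12.40\n"
THIS_WEEK = "search_term,cost\nrunning shoes,55.10\nkids mattress,9.30\n"

# Queries that appear this week but not last week: the surprises.
new_queries = sorted(term_set(THIS_WEEK) - term_set(LAST_WEEK))
```

On a real account you would read the two files from disk instead of inline strings; the logic is identical.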

What 5 signals should you look for every week?

Reviewing the report without a mental framework wastes time. Adalysis automates 100+ audit checks including n-gram analysis (up to 3 words) precisely because the human eye loses track at scale. A fixed mental checklist does the same by hand: it forces you to always look at the same five things. The signals I use are ordered by financial impact, from highest immediate return to highest medium-term potential.

Spend on terms without conversions

Filter: cost > $20 AND conversions = 0 over the last 30 days. Queries in here are candidates for an exact negative. Careful: before you block, check whether the landing page was down, whether there was a seasonal spike, or whether the term sits at a second intent layer (research, not purchase). If the pattern repeats for two weeks, it’s a negative.

Conversions on terms you didn’t plan for

Did a phrase-match term convert three times on $40 of spend? Promote it to exact inside a dedicated ad group. Use the reverse filter conversions >= 2 AND match_type != EXACT to surface them. These are the gems. You give Google a cleaner signal and protect your CPA when competition ramps up.

Mismatch between match type and real term (intent drift)

Here you’re hunting semantic drift. An exact keyword [memory foam mattress queen] shouldn’t fire on a search term kids mattress, but with Google’s 2026 close variants it happens. When it does, the negative goes to the whole campaign, not just the ad group. Per the official Google Ads blog (2023), close variants cover rephrasings, synonyms, and queries with the same intent, which in practice is pretty flexible.

Competitor brands

Seeing nike, adidas, or a direct competitor’s name in the report isn’t automatically bad. It’s a strategic call. If you convert on those terms, somebody is comparison shopping. If you’re bleeding money with no conversion, negative. Bidding on competitor brands typically costs 2-4x more than defending your own brand: Quality Score is lower (landing doesn’t match brand intent), and competition is higher (the brand owner will always bid more aggressively).

High CTR with few conversions

Clear sign of a mismatch between ad copy and landing page. The user clicked because the ad promised something the page doesn’t deliver. Check: does the price match? Is the product in stock? Does the landing headline echo the query? If not, you’ve got a post-click experience problem before you’ve got a keyword problem.

Share of high-spend accounts where exact match beats broad (Optmyzr, 2,637-account study, 2023; manual bidding and Smart Bidding aggregated):

  • CTR: 85.65%
  • CPA: 76.03%
  • ROAS: 72.52%
  • Conv. rate: 64.81%
In most high-spend accounts, exact match beats broad across all four key metrics. The search terms report is the lever for promoting queries from broad to exact when the pattern justifies it. Source: Optmyzr, 2023.

How do you build systematic negative keyword lists?

A negative list isn’t a dead file, it’s a living product. Search Engine Journal, citing Moz data, documents that accounts with systematic negative keyword management report savings of 5% to 40% on PPC spend, with concrete cases of ~32% CPA drops when combined with Quality Score optimization. The return is there, but only if you structure it.

Shared lists vs local negatives

Shared negative keyword lists are your base layer: terms irrelevant to the whole business (free, jobs, pdf, used if you sell new). They apply to multiple campaigns at once. Campaign or ad-group negatives are surgical: they block specific contexts.

  • Shared list EXCLUDE_GENERAL: terms irrelevant to the whole account.
  • Shared list EXCLUDE_B2B or EXCLUDE_B2C: depending on your campaign mix.
  • Campaign negative: to protect intent (for example, blocking rental inside a sales campaign).
  • Ad-group negative: to separate products inside the same semantic group.

Weekly + monthly + quarterly cadence

Each frequency has a different goal. The weekly check is short and tactical: about 20 minutes to look at queries with $20+ spend and zero conversions over the last 7 days, add the obvious ones to negatives, and move on. The monthly session is slower (90 min usually does it): consolidate the last 4 weeks of negatives, spot repeating patterns, update shared lists with what you learned. And the quarterly audit, three hours, unhurried, does something different: it reviews how shared lists intersect with campaigns, because over time a misplaced negative can start cutting good traffic without anyone noticing.

What’s worked for me on accounts with 50+ campaigns: keep a master negatives doc categorized by reason (informational, geographic, competitor, product we don’t sell). When someone new joins the team, they understand the why behind each block. Without that context, six months later somebody deletes a negative blindly and you’re back to the original problem.

Worth underlining: those 5% to 40% savings only hold as long as the cycle keeps running. Drop the weekly and monthly reviews, and CPA drifts back to baseline.

How do you spot “hidden gems” (the untapped winners)?

Gems are search terms with high conversion rate but low impressions compared to the rest of the group. In my experience auditing ecommerce accounts, a material portion of Search revenue always comes from queries that were never explicitly added as keywords. Optmyzr’s Keyword Lasso feature exists precisely to surface those queries and help promote them. Surfacing those gems is, literally, finding money in your pocket.

The filter that surfaces them

conversions >= 2 AND impressions < group_average AND conv_rate > 2x group_average. Export, sort by conv_rate descending, and look at the top 20 rows. Almost always, there are 2-3 terms there that deserve their own ad group. The rest are small correlations that don’t justify a restructure.
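The gem filter is easy to run over an exported ad group. A minimal sketch in plain Python, assuming one list of rows per ad group with invented sample data; field names mirror the export columns, not any official API.

```python
def find_gems(rows):
    """Flag terms with conversions >= 2, below-average impressions,
    and a conversion rate more than 2x the group average."""
    n = len(rows)
    avg_impressions = sum(r["impressions"] for r in rows) / n
    rates = [r["conversions"] / r["clicks"] if r["clicks"] else 0.0 for r in rows]
    avg_rate = sum(rates) / n
    return [
        r["search_term"]
        for r, rate in zip(rows, rates)
        if r["conversions"] >= 2
        and r["impressions"] < avg_impressions
        and rate > 2 * avg_rate
    ]

# Made-up rows for one ad group.
GROUP = [
    {"search_term": "memory foam mattress queen", "impressions": 5000, "clicks": 200, "conversions": 4},
    {"search_term": "best mattress deals", "impressions": 4000, "clicks": 150, "conversions": 1},
    {"search_term": "memory foam mattress queen firm", "impressions": 600, "clicks": 30, "conversions": 3},
]
gems = find_gems(GROUP)
```

In the sample, only the low-impression, high-conversion-rate term qualifies; the head term converts too, but at an unremarkable rate.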

Promoting to exact or phrase match

When you spot a gem, don’t leave it buried inside a broad. Add it as exact inside a dedicated group and write specific ad copy. This has two effects: (1) you control CPC with a dedicated bid, and (2) Quality Score climbs because relevance across query, keyword, ad, and landing page is maxed out. Adalysis’s 16,825-campaign study confirms exact match consistently produces the lowest CPA among match types, though Adalysis warns the “always promote query to exact” rule is now only 97% accurate: in specific contexts, broad with good negatives can win.

When to create a new group vs add to the existing one

Practical rule I use: if the gem generates more than 10 sustained monthly conversions over 2 months, new group. If it’s 3-5 sporadic conversions, add it to the existing group as exact. Creating new groups has a cost (ad copy, extensions, review), so don’t overdo it.
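The rule above can be written down as a tiny decision helper. This is my reading of the thresholds (sustained means both of the last two months above 10; sporadic means at least 3 total), sketched in plain Python; the function name and return labels are illustrative.

```python
def promotion_action(monthly_conversions):
    """Decide what to do with a gem, given its monthly conversion
    counts with the most recent month last."""
    last_two = monthly_conversions[-2:]
    # Sustained: more than 10 conversions in each of the last two months.
    if len(last_two) == 2 and all(m > 10 for m in last_two):
        return "new ad group"
    # Sporadic but real: a few conversions across the last two months.
    if sum(last_two) >= 3:
        return "add as exact to existing group"
    return "keep watching"
```

The point of encoding it is consistency: everyone on the account applies the same threshold instead of restructuring on gut feel.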

On a fashion ecommerce account I worked with in Q4 2025, isolating 7 gems out of broad match into exact-match groups lifted overall Search channel ROAS by 34% in 6 weeks, with no budget change. CPA dropped from $28 to $19 in the affected groups.

What changes with AI Max for Search and broad match in 2026?

Google launched AI Max for Search in May 2025 with a concrete promise: AI Max campaigns see up to 14% more conversions at similar CPA or ROAS (up to 27% for exact/phrase-reliant campaigns). But there’s fine print: an independent ALM Corp study of 250+ retail campaigns found +13% revenue but also +16% CPA. AI Max isn’t free: it grows volume, in many cases at the cost of per-conversion efficiency.

What changes (and what doesn’t)

You still see the report. It’s still the central tool. But a new column, “Search terms matched by AI Max”, groups queries you wouldn’t tie to one of your keywords with traditional logic. Per the Google Ads documentation (2026), these terms sit alongside keyword-matched ones and filter the same way.

Why the report isn’t “100% visible”

Since September 2020, Google hides low-volume queries for privacy reasons. That hasn’t changed. The Seer Interactive analysis after the change estimated 28% of spend was left without query detail. The “Insights report” partially fills the gap by showing thematic categories, but it doesn’t replace query-level detail.

How to adapt the audit with broad + AI Max taking over

Three concrete adjustments.

  • Increase negatives frequency: from weekly to twice a week during the first 30 days after enabling AI Max.
  • Treat AI Max like another broad: same detection rules, more volume to review.
  • Use the Insights report to catch whole themes that escape the query-by-query detail.

A repeatable process beats one brilliant audit a year. Optmyzr’s official PPC audit guide recommends a mixed cadence: light weekly health checks to catch recent drift, and deeper monthly audits for strategic analysis. It’s the same logic consultants who live inside accounts year-round apply. Consistency wins.

What I actually do every Monday in 30 minutes

I open the search terms report filtered to the last 7 days and sort by cost descending. Queries above $20 with zero conversions get marked right there; the obvious ones go straight to negatives that Monday. Then I invert the filter: conversions >= 2 on any non-exact match type. I don’t touch those yet, I note them as promotion candidates and re-evaluate on Friday with the full week’s data (some disappear, some firm up). If AI Max is active on the account, I also scan the “Search terms matched by AI Max” column for weird patterns. Finally, a short log in Google Sheets: what I changed, when, and why. Without that log, in three months nobody remembers why a specific negative sits where it does.

Frequency: what to do when

Frequency | Duration | Goal
Weekly (Monday) | 30 min | Block recent inefficient spend
Monthly (first Monday) | 90 min | Consolidate lists, promote gems
Quarterly | 3 hours | Audit list structure and matches

The mistake I keep seeing

Confusing “add negatives” with “optimize”. Blocking queries is half the job. The other half is acting on the gems. If you only block, your account keeps shrinking without growing. In my experience with accounts spending $1,500 to $15,000/month, the healthy balance is 60% of the time on blocking and 40% on controlled expansion via exact match.

Want an audit of your search terms report?

If you spend $1,500+/month on Google Ads and you haven't reviewed your search terms in the last 4 weeks, there are savings sitting there. I'll walk you through the method live on your account.

Book a session

30-min session · No commitment

FAQ

How often should I review the search terms report?

Weekly for accounts above $1,500/mo spend. Yes, weekly. Optmyzr’s guide recommends lightweight weekly checks combined with deep monthly audits. For accounts under $500/mo, every two weeks is enough because new query volume doesn’t justify more frequency.

Can I just trust AI Max and stop reviewing manually?

No. AI Max optimizes matching but doesn’t know your margin, your stock, or your competitive strategy. Google’s own documentation indicates that activating the full AI Max feature set (search term matching + text customization + URL expansion) yields roughly 7% more conversions than search term matching alone, but ALM Corp’s independent study showed CPA rises by 16% in retail when there’s no human oversight. Autopilot doesn’t replace judgment.

What do I do if a competitor brand term converts well?

You keep it, but keep it tight. Bid lower than on your own brand, watch CPC monthly, and use specific ad copy that avoids direct comparison (legal risk). Bidding on competitor brand typically costs 2-4x more than defending your own brand on CPC; if your conversion rate offsets it, keep going. If not, exact negative.

Do broad match negatives block close variants?

Yes, since 2019 Google Ads expanded broad negatives to cover close variants (rephrasings, plurals, misspellings). But exact and phrase negatives still only block the exact form or closed phrase. Per the Google Ads documentation (2025), broad negatives are the most defensive option to prevent unexpected spend.

Does the report show every search term?

No. Since September 2020 Google hides low-volume queries for privacy reasons. The Seer Interactive analysis estimated 28% of spend was left without query detail after that change. In smaller accounts the hidden share can be higher. It’s a structural limit of the system.

Conclusion

Auditing the search terms report isn’t glamorous, but it’s where accounts get won. Pretty agency reports don’t replace 30 minutes a week with the right filters applied. If you take only three things from this guide, let them be these: every Monday, filter by cost > $20 and conversions = 0. Every month, promote one or two gems to exact match. And treat AI Max like one more broad, not like an oracle.

The method takes less time than people think. A typical ecommerce account spending $3,000/month needs 30 minutes weekly plus 90 minutes monthly. That’s it. If that ratio doesn’t produce a 10-20% CPA improvement in 60 days, check the top-level structure: the problem is probably in your base keyword selection, not in your search terms.

Want to look at it together on your account? The link above books 30 minutes with no commitment. Bring the account, we leave with 3 concrete actions prioritized by impact.


Lionel Fenestraz has managed Google Ads campaigns for ecommerce and B2B since 2018. This article is part of the Google Ads optimization series at lionelz.com.

Lionel Fenestraz — Freelance PPC & CRO Consultant · Google Partner · CXL Certified · Google Ads Search Certified
7+ years managing Google Ads and Meta Ads for vacation rental, B2B and ecommerce. Trilingual ES/EN/FR.
