Why your website doesn't appear on Google: complete guide
90% of pages get zero organic traffic. A technical guide to identifying what's blocking your site's rankings — and why it matters more when you run Google Ads.

You have a website. It’s been live for months. You’ve written articles, added service pages, maybe invested in design. When you search Google for what you offer, your site is nowhere. It’s not bad luck. There’s always a technical reason, and in most cases it’s identifiable and fixable. The problem is Google doesn’t notify you when something breaks. It simply moves on. An Ahrefs study analysed over a billion web pages and found that 90.63% receive zero traffic from Google - not ranking badly, not ranking at all. If you’re in that group, this guide explains why and how to get out.
The connection to paid advertising matters too. A page that can’t rank organically almost certainly has a landing page experience problem that also hurts your Google Ads Quality Score - meaning you’re paying more per click on a page that converts poorly. Fixing one fixes both.
Key Takeaways
- 90.63% of web pages receive zero organic traffic from Google - the cause is almost always a crawling, indexing, or relevance problem (Ahrefs, 2024)
- The first organic result receives approximately 27% of clicks; position 10 receives less than 3% (Backlinko, 2024)
- Going from a Quality Score of 5 to 8 in Google Ads can reduce CPC by 30-40% - and landing page experience is one of the three Quality Score components (Google Ads Help, 2024)
- Core Web Vitals became an official Google ranking factor in 2021; field data (real user experience) is what affects rankings, not lab scores (Google Search Central, 2024)
How Does Google Actually Work Before Ranking a Page?
Before ranking your site, Google runs three separate processes - crawling, indexing, and ranking. Failing at any one leaves you out of results entirely. According to Google Search Central, the vast majority of websites that don’t appear in results have a problem at one of these three stages, not a content quality problem. Knowing which stage is failing completely changes the solution.
Crawling: Googlebot moves across the web following links. When it reaches your site, it reads page content and follows internal links. If it can’t get in - because a file is blocking it, your server isn’t responding, or no links point to your site - those pages never move forward.
Indexing: Once crawled, Google decides whether to add a page to its index, the massive database from which it serves results. Not everything that gets crawled gets indexed. Thin content, duplicate pages, and certain technical signals can cause Google to discard a crawled page entirely.
Ranking: Only indexed pages compete for positions. Hundreds of signals come into play: content relevance, domain authority, technical performance, user experience. The first organic result receives approximately 27% of clicks. Position ten receives less than 3% (Backlinko, 2024). The difference between position 1 and position 10 isn’t cosmetic - it’s economic.
A crawling problem isn’t fixed by publishing more content. A content problem isn’t fixed by improving page speed. Diagnosis first.
Is Google Able to Crawl Your Website?
This is the most fundamental blocker: Google tries to get in and can't. Everything you built behind that block is invisible. According to Google Search Central, misconfigured robots.txt files are among the most common technical issues Google encounters during site crawls - and they're often introduced accidentally during migrations or development.
Misconfigured robots.txt. A line like Disallow: / blocks Googlebot from accessing the entire site. It happens most often after CMS migrations or when a developer activates crawl blocking during development and forgets to deactivate it before going live. Check yours at yourdomain.com/robots.txt.
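A minimal illustration of the difference (the domain and paths are placeholders): the first group blocks every crawler from the entire site, while the second blocks only a private area and leaves the rest crawlable.

```text
# BAD - blocks Googlebot (and every other crawler) from the whole site
User-agent: *
Disallow: /

# OK - blocks only a private area, leaves everything else crawlable
User-agent: *
Disallow: /admin/
Sitemap: https://www.example.com/sitemap.xml
```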
Globally applied noindex tag. Some CMSs or SEO plugins add the noindex tag globally by mistake, especially in staging environments that end up in production. If you use an SEO plugin, verify the “Discourage search engines from indexing this site” option is off.
Crawl budget exhausted. Google assigns each site an approximate crawl budget. Sites with thousands of low-quality URLs (pagination pages, ecommerce filters, duplicate URL parameters) burn that budget on pages that don’t matter, leaving important pages uncrawled. This is primarily a problem for large sites.
Inaccessible or slow server. If your server responds with 5xx errors or takes too long, Googlebot gives up and tries later. Cheap hosting with inconsistent response times costs you rankings in ways your hosting dashboard won’t show.
To check: in Google Search Console, Settings > Crawl stats shows how often Googlebot visits and whether its requests succeed. URL Inspection shows whether Google can access any specific page.
Why Aren’t Your Pages Getting Indexed?
Crawling and indexing are separate steps. Google can crawl your page and still decide not to add it to the index. When this happens, the URL exists on your server but not in search results. According to Google Search Central, duplicate content without canonical management is one of the most common indexing problems in ecommerce and CMS-based sites.
Duplicate content without canonical management. If your site has URLs with and without www, with and without trailing slash, with parameters like ?ref=email, or HTTP and HTTPS versions serving the same content, Google has to choose which to index. Without correctly configured canonicals, it might pick the wrong version - or split authority between variants and rank none well. The <link rel="canonical"> tag is the standard solution.
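As a sketch, with example.com standing in for your domain, every duplicate variant should declare the one version you want indexed:

```html
<!-- Placed in the <head> of every variant of the page, e.g.
     http://example.com/shoes, https://example.com/shoes?ref=email -->
<link rel="canonical" href="https://www.example.com/shoes/" />
```

The same canonical URL goes on the canonical page itself (a self-referencing canonical), so parameterised copies all consolidate to it.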
Thin content. A page with 150 generic words about a service that Google has already indexed thousands of times gives little reason to index yours too. The threshold isn’t a magic word count. It’s whether the page answers something useful that users wouldn’t find equally well elsewhere.
Orphan pages. A page with no links from other pages on your site is very unlikely to be discovered by Google. The crawler follows links - if there’s no path to a page, it won’t reach it. A sitemap helps, but doesn’t substitute for a good internal linking structure.
Soft 404s. Pages that return HTTP 200 (OK) but display “not found” content. It happens when an ecommerce product is deleted and the URL returns an empty page instead of a proper 404 or redirect. Google detects these and excludes them from the index.
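A quick way to screen for soft-404 candidates is to flag URLs that return HTTP 200 but whose body reads like an error page. A minimal sketch - the marker phrases are assumptions you would adapt to your own site's templates:

```python
def looks_like_soft_404(status_code: int, body: str) -> bool:
    """Flag pages that answer 200 OK but whose content reads like an
    error page - the pattern Google treats as a soft 404."""
    if status_code != 200:
        # A real 404/410 (or a redirect) is already the correct behaviour.
        return False
    markers = ("not found", "no longer available", "page does not exist",
               "0 results")
    text = body.lower()
    return any(marker in text for marker in markers)
```

Pair it with any HTTP client: fetch each URL in your sitemap and report the ones this flags, then give those URLs a proper 404/410 or a redirect.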
The fastest check: search site:yourdomain.com/your-url on Google. If it doesn’t appear, it’s not indexed. Search Console also shows the specific reason.
Does Your Content Match What Users Actually Want?
This is the quietest failure mode. The site is crawled, indexed, and technically clean - and still doesn't rank. The cause is that the content doesn't match what Google knows users want for those specific searches. According to Backlinko's analysis of 11.8 million search results, content that matches search intent ranks significantly higher than content that's technically better but misaligned with user intent.
Google classifies search intent into four types: informational (want to learn something), navigational (want to go to a specific site), transactional (want to buy), and commercial investigation (comparing before buying). Each type has content formats Google prefers for that intent.
A concrete example: “what is a CRM” returns explanatory articles and definitions. “Best CRM for small business” returns comparisons and lists. “Buy HubSpot licence” returns product pages and commercial landing pages. Same general topic, three completely different intents, three completely different content types required.
The most common mistake is creating blog content (informational intent) for keywords with transactional intent - or creating service pages (transactional) for searches where users want to learn, not buy. Check manually: search your target keywords in incognito mode and look at the top ten results. If they’re blog posts and you have a service page, you have an intent mismatch.
A related issue: using industry terminology when your customers don’t. An HVAC company optimising for “residential HVAC system installation” loses users who search “air conditioning installation quote”. Keyword research isn’t about finding the most technical terms - it’s about finding the words used by people who have the problem you solve.
I audited an ecommerce site selling professional cleaning products. Their category pages were optimised for technical product names that only procurement managers used. Their actual buyers - facilities managers at SMEs - searched completely different terms. Fixing the keyword targeting on three category pages increased organic traffic to those pages by 340% over four months. The products hadn’t changed. The pages hadn’t changed. Only the keyword alignment had.
What Technical On-Page Issues Block Rankings?
Assuming the page is indexed and content is relevant, a set of on-page elements directly affects how Google understands and ranks that page. According to Moz’s on-page SEO guide, title tags and heading structure remain the most directly controllable on-page ranking factors available.
Title tag. The HTML title is the most important on-page signal. It should include the primary keyword, be between 50 and 60 characters, and accurately describe the page’s content. Duplicate titles across pages, overly generic titles (“Home”, “Services”), or titles Google truncates in results are common problems.
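Those title checks are easy to automate across a site crawl. A minimal sketch - the 50-60 character band is the common guideline stated above, not a hard limit Google enforces:

```python
GENERIC_TITLES = {"home", "services", "about", "untitled", "new page"}

def title_issues(title: str) -> list[str]:
    """Return a list of common title-tag problems; empty if none found."""
    issues = []
    stripped = title.strip()
    if len(stripped) < 50:
        issues.append("under 50 characters - may be too thin or generic")
    elif len(stripped) > 60:
        issues.append("over 60 characters - Google may truncate it")
    if stripped.lower() in GENERIC_TITLES:
        issues.append("generic title that describes nothing")
    return issues
```

Run it over every `<title>` your crawler extracts and fix the flagged pages first; duplicate detection is then a matter of grouping identical titles.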
Heading structure. One H1 per page, including the primary keyword. H2s should cover main subtopics. Skipping levels (going from H1 to H3) or using headings purely for visual size makes it harder for Google to understand page structure.
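A well-formed hierarchy for a service page might look like this (the headings themselves are illustrative):

```html
<h1>Boiler Installation in Manchester</h1>  <!-- one H1, primary keyword -->
<h2>What the installation includes</h2>
<h2>Pricing and timelines</h2>
<h3>Combi boilers</h3>                      <!-- H3 nested under its H2 -->
<h3>System boilers</h3>
<h2>Frequently asked questions</h2>
```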
Internal linking. Internal links transfer authority between pages and help Google understand the site’s hierarchy. The most important pages should receive more internal links from other relevant pages. The anchor text of those links is an additional signal about the destination page’s content.
Schema markup. Structured data in JSON-LD format tells Google explicitly what type of content a page contains - article, product, service, local business, FAQ. Well-implemented Schema doesn’t guarantee positions, but it helps Google understand content and can generate rich snippets that increase CTR.
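A sketch of a JSON-LD block for a local business - every value here is a placeholder to replace with your own details, and the result is worth validating with Google's Rich Results Test:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Plumbing Ltd",
  "url": "https://www.example.com/",
  "telephone": "+44 161 000 0000",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "1 Example Street",
    "addressLocality": "Manchester",
    "postalCode": "M1 1AA",
    "addressCountry": "GB"
  }
}
</script>
```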
Mobile-first indexing. Since 2019, Google has used the mobile version of your site as the primary reference for indexing and ranking. If your site isn't optimised for mobile, your rankings suffer even if the desktop version is flawless.
Do Your Core Web Vitals Meet Google’s Thresholds?
In 2021, Google made performance an official ranking factor through Core Web Vitals. Three metrics form part of the page experience signals Google uses to rank pages. According to Google Search Central, field data from real users browsing your site is what affects rankings - not lab scores from PageSpeed Insights tests.
LCP (Largest Contentful Paint): How long it takes for the main element of the page to appear. Good: below 2.5 seconds. Needs improvement: 2.5 to 4 seconds. Poor: above 4 seconds. The most common causes are uncompressed images, missing preload on the main resource, and slow servers.
INP (Interaction to Next Paint): Measures page responsiveness to user interactions. It replaced FID in March 2024. Good: below 200ms. Needs improvement: 200-500ms. Poor: above 500ms. High INP usually indicates excessive JavaScript blocking the main thread.
CLS (Cumulative Layout Shift): Measures how much content moves while the page loads. Good: below 0.1. Needs improvement: 0.1-0.25. Poor: above 0.25. The most common causes are images without defined dimensions, dynamically injected ads, and web fonts that change text size on load.
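The three sets of thresholds above can be encoded in a few lines, which is handy when bucketing field-data exports. A sketch using the published thresholds:

```python
# (good_ceiling, poor_floor) per metric, per Google's published thresholds:
# values at or below the first bound are "good", above the second are "poor".
THRESHOLDS = {
    "LCP": (2.5, 4.0),    # seconds
    "INP": (200, 500),    # milliseconds
    "CLS": (0.10, 0.25),  # unitless layout-shift score
}

def classify(metric: str, value: float) -> str:
    """Bucket a field-data value into Google's three Core Web Vitals bands."""
    good, poor = THRESHOLDS[metric]
    if value <= good:
        return "good"
    if value <= poor:
        return "needs improvement"
    return "poor"
```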
Your Core Web Vitals in field data (CrUX) are the ones that affect rankings, not lab scores. A PageSpeed Insights test can come out green in lab mode and red in field data if your real audience has slower connections or more modest devices.
What Are E-E-A-T Signals and Why Do They Matter?
E-E-A-T (Experience, Expertise, Authoritativeness, Trustworthiness) is the framework Google uses to evaluate site and content quality. It’s not a single technical ranking factor - it’s a qualitative assessment that manifests in how the algorithm treats your content over time. According to Google’s Search Quality Evaluator Guidelines, E-E-A-T signals have become more important across all competitive niches, not just YMYL sectors (health, finance, legal).
Real experience signals: Content written by someone with direct experience with the topic carries more weight than theory-based content. An article about setting up a Shopify store written by someone who has actually done it - with real screenshots, real data, and first-hand observations - ranks better than one written from a distance.
Author attribution: Pages with an identified author, a bio, a LinkedIn profile, or publications in relevant media are easier for Google to evaluate than anonymous content. It’s a positive signal especially in sectors where credibility matters.
Quality backlinks: External links from relevant and authoritative sites are the strongest authority signal available. It’s not about quantity - it’s about the relevance and authority of the linking domains. A link from a recognised industry publication is worth more than a hundred generic directories.
Consistency and updates: Sites that publish consistently and update existing content when information changes have a stronger E-E-A-T profile than those that publish in bursts and go quiet. Visible publication and update dates are transparency signals Google values.
Most businesses think about E-E-A-T as a content problem. It’s actually a visibility problem. Even the best-written content struggles if the author has no verifiable online presence and the domain has zero external links. The fastest E-E-A-T move available to most small businesses isn’t writing better articles - it’s getting a feature or mention in one credible industry publication and making sure the author bio links to a real LinkedIn profile.
What Is Keyword Cannibalisation and How Do You Fix It?
Keyword cannibalisation happens when multiple pages on your site compete for the same searches. Google has to choose which one to rank - and it usually chooses badly, or distributes the signal between both so neither reaches as high as it could. According to Semrush, cannibalisation is one of the most common technical SEO issues found in mid-size ecommerce and content sites, and one of the most frequently overlooked.
Signs you have active cannibalisation: in Search Console, when you filter by a keyword and see Google showing different URLs across different periods. In rankings, you have pages floating between positions 8 and 15 for searches where your content deserves to be higher, but the authority is split across multiple URLs.
The solutions depend on the case: consolidating similar content into one comprehensive page, redirecting secondary versions to the primary, or using canonicals to tell Google which version to rank. There’s no universal answer - what works for an ecommerce site may not be right for a content site.
How New Is the Site - and Does That Explain the Problem?
If you’ve been live for less than six months, part of the problem may simply be time. There’s strong anecdotal evidence of what SEOs call the “Google Sandbox” - an initial period in which new sites don’t rank well regardless of content quality, while Google builds enough trust in the domain. It’s not an officially confirmed mechanism, but the pattern is consistent across many reported cases.
What you can do during that period isn’t wait - it’s build the correct foundation. Submit your sitemap to Search Console. Configure canonicals without errors. Build a coherent internal linking structure. Create content that answers real searches with clear intent. Acquire legitimate backlinks from relevant sites - industry directories, media mentions, and collaborations with other sites in your sector.
A new site that starts with good technical architecture and intent-focused content exits this initial period faster than one that publishes random content and waits for Google to discover it.
Do You Have an Active Google Penalty?
This is the least common cause but the most serious when it occurs. Google penalises sites in two ways, and the diagnosis is different for each.
Manual penalties. Applied by a human Google reviewer when practices violate guidelines. They appear in Search Console under Manual Actions, with a description of the reason and scope. The most common causes are link buying, artificial link schemes, content created without user value, cloaking, and doorway pages. Resolving them requires fixing the underlying problem and submitting a formal reconsideration request. The process can take weeks.
Algorithmic penalties. No notification. Your traffic drops sharply - sometimes 50% in 48 hours - coinciding with a Google algorithm update. If you aren’t monitoring historical traffic in Search Console, you might not even know when it happened. Cross-reference the drop date with Google’s update history: Google Search Central and Search Engine Land both maintain update calendars.
The common pattern in recent algorithmic penalties: content mass-generated with AI without genuine oversight, thin content across many site pages, or aggressive link building practices from previous years that updated algorithms now detect.
Why Does SEO Improvement Also Improve Your Google Ads Campaigns?
People treat SEO and Google Ads as separate channels with separate budgets and separate objectives. In practice, improving one directly improves the other in measurable ways. According to Google Ads Help, landing page experience is one of the three components of Quality Score - alongside ad relevance and expected CTR. A landing page optimised for SEO has, by definition, better user experience. Going from a Quality Score of 5 to 8 can reduce your CPC by 30-40%.
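The arithmetic is worth making concrete. A sketch using the 30-40% reduction cited above - the click volume and baseline CPC are invented for illustration:

```python
monthly_clicks = 2_000
cpc_at_qs5 = 1.50   # baseline cost per click at Quality Score 5 (illustrative)
reduction = 0.35    # midpoint of the 30-40% range cited above

cpc_at_qs8 = cpc_at_qs5 * (1 - reduction)
monthly_saving = monthly_clicks * (cpc_at_qs5 - cpc_at_qs8)

print(f"CPC at QS 8: {cpc_at_qs8:.2f}")
print(f"Monthly saving: {monthly_saving:.2f}")
```

On these assumptions the same click volume costs roughly a third less each month - budget that can fund the landing page work itself.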
SERP dominance for brand searches. When a brand appears in both a paid and organic position on the same results page, combined CTR can be 25% higher than either position alone (Google, 2024). The user sees your brand twice in seconds - that builds trust and increases click probability.
SEO data informs Google Ads strategy. Search Console shows exactly which searches generate organic impressions and clicks. Those are keywords with proven intent - real people searching those terms and finding your content relevant. They’re ideal candidates for Search campaigns, especially for high-value searches where your organic position is 4 or 5 and an ad can capture the traffic your organic result doesn’t.
Better remarketing audiences from organic traffic. Users who arrive through organic search are high-quality remarketing audiences: they searched for something specific, chose to click your result, and visited your content. When you subsequently target them with remarketing in Google Ads, conversion rates are significantly higher than with cold audiences.
Lower blended CPA over time. When SEO runs in parallel with paid, some traffic arrives for free. That reduces the blended cost per acquisition across all channels. It’s not an argument for abandoning paid - it’s an argument for making paid more efficient.
How Do You Diagnose Your Real SEO Problem?
The correct diagnosis requires data, not assumptions. The most common mistake is applying the wrong solution for months because the underlying problem was never properly identified.
Step 1: Google Search Console. Free, official, and first. Go to Page indexing to see how many pages are indexed, how many excluded, and why. Go to Performance to see which searches generate impressions and clicks. Use URL Inspection to analyse any URL individually. Check Manual actions for penalties.
Step 2: Core Web Vitals. In Search Console, Experience > Core Web Vitals shows how many URLs have LCP, INP, and CLS in good, needs improvement, or poor status, using real field data. PageSpeed Insights gives the breakdown URL by URL.
Step 3: Basic technical audit. Check yourdomain.com/robots.txt manually. Use the Search Console coverage report to detect 404 errors, soft 404s, and blocked pages. Verify your sitemap is submitted and error-free.
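The robots.txt part of step 3 can be scripted. A deliberately small sketch, not a full RFC 9309 parser - it only catches the blanket `Disallow: /` case described earlier:

```python
def blocks_all_crawlers(robots_txt: str, agent: str = "googlebot") -> bool:
    """True if a robots.txt group for '*' or `agent` disallows the whole site."""
    group_agents: list[str] = []
    in_rules = False
    for raw_line in robots_txt.splitlines():
        line = raw_line.split("#", 1)[0].strip()  # drop comments
        if not line:
            continue
        field, _, value = line.partition(":")
        field, value = field.strip().lower(), value.strip().lower()
        if field == "user-agent":
            if in_rules:  # a user-agent line after rules starts a new group
                group_agents, in_rules = [], False
            group_agents.append(value)
        elif field == "disallow":
            in_rules = True
            if value == "/" and any(a in ("*", agent) for a in group_agents):
                return True
    return False
```

Feed it the body of yourdomain.com/robots.txt from any HTTP client; a `True` result means the most fundamental blocker from earlier in this guide is active.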
Step 4: Intent analysis. For each key page, manually search the primary keyword on Google in incognito mode. Look at what type of content ranks in the top results. Compare the format, depth, and angle with what you have.
Step 5: Backlink profile. Tools like Ahrefs or Semrush show which domains link to your site. Zero external backlinks after years of publishing explains part of the authority problem. Backlinks from low-quality or irrelevant sites can be counterproductive.
A complete diagnosis requires cross-referencing all this data with the business context. A Semrush report might tell you there are 47 pages with duplicate meta descriptions. It won’t tell you which ones matter, what to do first, or why they’re duplicated. The difference is in the interpretation.
If you run Google Ads alongside your organic efforts, the SEO health of your landing pages directly affects campaign costs. Fixing visibility problems improves both channels simultaneously. For a look at how your GA4 setup might also be masking the real picture, the Google Analytics 4 audit guide covers what to check.
Frequently Asked Questions
How long does it take for a new site to appear on Google?
Google typically discovers and indexes a new site within a few days to a few weeks, but ranking for competitive keywords takes significantly longer. According to Ahrefs, the average top-10 page is over 2 years old. New sites often experience an initial indexing period of 3-6 months before rankings stabilise. Submitting your sitemap to Search Console, building internal links, and acquiring a few external links from relevant sources accelerates this process.
My site was ranking and then disappeared. What happened?
A sudden ranking drop almost always corresponds to a Google algorithm update or an accidental technical change (like a robots.txt update, a noindex tag being applied, or a redirect being misconfigured after a site migration). First: check Search Console for manual actions. Second: cross-reference the date of the drop with Google’s update history. Third: check robots.txt and the coverage report for recently excluded pages.
Does publishing more content help rankings?
Only if the content targets real search intent with a specific audience need. According to Backlinko, content depth and thoroughness matter more than volume. Publishing 20 thin articles that don’t answer specific questions is less effective than publishing 5 comprehensive pieces that cover a topic better than anything currently ranking. More content without intent alignment can actually create cannibalisation problems.
How important are backlinks for a local business?
Very important, but the type matters more than the quantity. According to Moz, a single link from a respected local news outlet or industry directory can have more impact than dozens of low-quality directory submissions. For local businesses, the highest-value backlinks typically come from local news coverage, industry associations, supplier partner pages, and local business directories with genuine editorial standards.
Can I fix these issues myself or do I need an SEO specialist?
Many technical issues are self-diagnosable and fixable with Search Console alone - crawl blocks, indexing errors, missing canonicals, and sitemap problems. Content intent alignment and E-E-A-T improvements are also owner-manageable. Where specialist knowledge pays off is in diagnosing complex cannibalisation, interpreting algorithmic penalty patterns, and building a backlink strategy. Start with Search Console. Most of what you need to find is there for free.
Sources
- Ahrefs - Zero Traffic Pages Study
- Backlinko - Google CTR Statistics
- Backlinko - Google Ranking Factors
- Google Search Central - How Search Works
- Google Search Central - Core Web Vitals
- Google Search Central - Canonicalisation
- Google Search Quality Evaluator Guidelines
- Google Ads Help - Quality Score
- Moz - On-Page SEO Guide
- Semrush - Keyword Cannibalisation