Google isn't showing your pages in search
If Google can't find or index your pages, it's almost always one of three things: a robots.txt block, a noindex tag, or a crawl error. You can check for all three in under two minutes.
Check these in order
1. Go to Google Search Console → Coverage. Any 'Excluded' or 'Error' URLs? That's your answer.
2. View the page source and search for noindex.
3. Visit yourdomain.com/robots.txt → does it say Disallow: /?
Common causes & fixes
If robots.txt has Disallow: /, Google can't crawl anything. Change it to Disallow: (blank) to allow everything.
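For reference, here is a minimal sketch of the blocking rule and the fixed version (yourdomain.com stands in for your site; the comments are annotations, not part of a typical robots.txt):

```
# Blocks all crawlers from the entire site:
User-agent: *
Disallow: /

# An empty Disallow allows crawling of everything:
User-agent: *
Disallow:
```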
Search for <meta name="robots" content="noindex"> in your HTML. Remove it, redeploy, and request reindexing in Search Console.
Brand new sites take 2–4 weeks to appear in search. Submit your sitemap in Google Search Console to speed it up.
Go to Search Console → Sitemaps → add /sitemap.xml. If you don't have a sitemap, generate one.
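If you need to create one by hand, a minimal sitemap.xml follows the sitemaps.org format and looks roughly like this (the URL is a placeholder; list one <url> entry per page):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://yourdomain.com/</loc>
  </url>
</urlset>
```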
Quick checks from terminal:
# Check robots.txt
curl https://yourdomain.com/robots.txt

# Check for noindex in page source
curl -s https://yourdomain.com | grep -i noindex
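One more thing worth checking: noindex can also be delivered as an X-Robots-Tag HTTP response header rather than a meta tag, so a page can be blocked even when its HTML looks clean. A quick sketch of that check (yourdomain.com is again a placeholder):

```shell
# -I fetches only the response headers; no output means the header isn't set
curl -sI https://yourdomain.com | grep -i x-robots-tag
```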