FreeSEOTools.io
Technical SEO · 9 min read

Google Search Console: How to Actually Use It for SEO

Google Search Console is free and gives you data straight from Google. Here's how to use the Performance report, Coverage report, Core Web Vitals, URL Inspection, and more.

FreeSEOTools Team
SEO Research
Google Search Console · GSC · SEO Tools · Click-Through Rate · Index Coverage

Google Search Console is one of the most underused free tools in SEO. It gives you direct data from Google about how your site is performing — clicks, impressions, indexing status, manual actions. No third-party estimates, no sampling. This is what Google actually sees and measures.

Most people set it up, check rankings once in a while, and ignore everything else. Here's how to actually use it.

Setting Up and Verifying

If you haven't set up Search Console, do it now. Verify via DNS record (most reliable), HTML file upload, or HTML meta tag. If you're on Vercel or a similar platform, DNS verification or the meta tag are easiest.

Add both https://www.yourdomain.com and https://yourdomain.com as separate properties, or use a Domain property that covers all subdomains and protocols in one. The Domain property is better for most sites.

The Performance Report

This is the one everyone uses. It shows your search traffic data: clicks, impressions, CTR (click-through rate), and average position.

The four metrics:

  • Clicks — How many times someone clicked your link in search results
  • Impressions — How many times your URL appeared in search results, even if the user never scrolled far enough to see it
  • CTR — Clicks divided by Impressions. Low CTR on high-impression queries = opportunity to improve your title and meta description
  • Average Position — Your average ranking position. Positions 8-20 with decent impression volume are candidates for content improvement

How I Actually Use Performance Data

Finding CTR improvement opportunities: Sort by Impressions descending. Find pages with 500+ impressions per month and CTR below 3%. Those are pages where better titles and meta descriptions can move the needle without any new content.
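This filter is easy to script against a Performance report export (CSV download or API rows). A minimal sketch — the 500-impression and 3% thresholds come from above, but the row format here is an assumption about how you've exported the data:

```python
# Find CTR improvement opportunities in exported Performance data.
# Assumed row shape: {"page": ..., "clicks": ..., "impressions": ...}

def ctr_opportunities(rows, min_impressions=500, max_ctr=0.03):
    """Return high-impression, low-CTR pages, biggest opportunities first."""
    hits = []
    for row in rows:
        if row["impressions"] < min_impressions:
            continue
        ctr = row["clicks"] / row["impressions"]
        if ctr < max_ctr:
            hits.append({**row, "ctr": round(ctr, 4)})
    return sorted(hits, key=lambda r: r["impressions"], reverse=True)

rows = [
    {"page": "/guide", "clicks": 10, "impressions": 1200},  # 0.83% CTR -> flagged
    {"page": "/blog", "clicks": 90, "impressions": 1000},   # 9% CTR -> fine
    {"page": "/tiny", "clicks": 1, "impressions": 50},      # too little volume
]
opportunities = ctr_opportunities(rows)  # only /guide survives the filter
```

Every page this returns is a candidate for a title and meta description rewrite before you touch the content itself.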

Finding content refresh candidates: Filter queries by average position and focus on the 8-20 range. These are pages on the edge of page 1 that need a push to get over the line. Usually that means updating the content, improving on-page optimization, or building a few more internal links.
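The same export can drive the refresh list. A sketch under the same assumed row format; the position bounds are from the text, while the minimum-impressions cutoff is my own addition to keep zero-volume queries out of the list:

```python
# Flag content-refresh candidates: queries ranking just off page 1.
# Assumed row shape: {"query": ..., "position": ..., "impressions": ...}
# min_impressions is an assumed noise filter, not a Google threshold.

def refresh_candidates(rows, lo=8.0, hi=20.0, min_impressions=100):
    return [
        r for r in rows
        if lo <= r["position"] <= hi and r["impressions"] >= min_impressions
    ]

rows = [
    {"query": "seo tools", "position": 11.2, "impressions": 800},  # candidate
    {"query": "free seo", "position": 3.1, "impressions": 2000},   # already page 1
    {"query": "niche term", "position": 15.0, "impressions": 20},  # too little volume
]
candidates = refresh_candidates(rows)
```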

Comparing date ranges: Use the date comparison feature to spot traffic changes. If a page dropped 40% in impressions over 90 days, something changed. It could be an algorithm update, a competitor improved, or you changed something on the page.
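The comparison itself is just a percent change, which is worth scripting if you check many pages. A small sketch — the numbers below recreate the 40% drop described above:

```python
# Percent change in impressions between two date ranges.
# Negative means a drop; None means there's no baseline to compare.

def impression_change(current, previous):
    if previous == 0:
        return None
    return (current - previous) / previous * 100

change = impression_change(current=600, previous=1000)
# change is -40.0: the kind of drop that warrants investigating
# algorithm updates, competitors, and your own recent page changes.
```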

URL Inspection Tool

The URL Inspection tool is what I check whenever a page isn't appearing in search results. Enter any URL and it tells you:

  • Whether Google has indexed the page
  • The last crawl date
  • The canonical URL Google is using (may differ from what you set)
  • Any crawl or indexing issues
  • The rendered version of the page (what Google actually sees after running JavaScript)

The rendered view is especially valuable for JavaScript-heavy sites. If your content is invisible in the rendered view, Google can't see it either.

After fixing an issue, use "Request Indexing" to ask Google to recrawl the URL. It doesn't guarantee immediate indexing, but it does prioritize it.
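These same checks can be automated through the URL Inspection API, part of the Search Console API. A hedged sketch: the request body fields are the documented ones, but the URLs are placeholders and the actual call (commented out) requires OAuth credentials and the google-api-python-client package:

```python
# Build a request body for the Search Console URL Inspection API.
# Only the body construction runs here; the live call needs credentials.

def build_inspection_request(page_url, property_url):
    return {
        "inspectionUrl": page_url,   # the page you want Google's view of
        "siteUrl": property_url,     # your verified Search Console property
    }

body = build_inspection_request(
    "https://yourdomain.com/some-page",   # placeholder URL
    "https://yourdomain.com/",            # placeholder property
)

# With credentials in place, the call looks roughly like:
#   service = build("searchconsole", "v1", credentials=creds)
#   result = service.urlInspection().index().inspect(body=body).execute()
# result["inspectionResult"]["indexStatusResult"] carries the verdict,
# last crawl time, and the Google-selected canonical.
```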

Index Coverage Report (Now Called "Indexing")

The Indexing section (Pages report in the newer interface) shows the status of all URLs Google has discovered on your site. The status categories:

  • Indexed — Good. These pages are in Google's index.
  • Not indexed — Crawled, currently not indexed — Google visited the page but decided not to index it. Often: thin content, duplicate content, or content that doesn't provide unique value.
  • Not indexed — Discovered, currently not indexed — Google knows the URL exists but hasn't crawled it yet. Could be crawl budget issues or a new site.
  • Not indexed — Excluded by noindex tag — Intentionally excluded. Make sure these are actually pages you meant to noindex.
  • Not indexed — Duplicate without canonical — Google found two pages with the same content and no canonical tag. Fix with canonical tags.

The "Crawled, currently not indexed" bucket is the one to watch carefully. Pages sitting here for weeks usually need content improvements.
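Watching that bucket over time is easier with a quick tally of the Pages report export. A sketch — the status strings match the report labels, but the row format is an assumption about your export:

```python
# Summarize a Pages (indexing) report export by status, so the
# "Crawled - currently not indexed" bucket is easy to track week over week.
from collections import Counter

def status_summary(rows):
    """Count URLs per indexing status. Assumed row shape: {"url", "status"}."""
    return Counter(r["status"] for r in rows)

rows = [
    {"url": "/a", "status": "Indexed"},
    {"url": "/b", "status": "Crawled - currently not indexed"},
    {"url": "/c", "status": "Crawled - currently not indexed"},
]
summary = status_summary(rows)
# If the "Crawled - currently not indexed" count grows across weekly
# snapshots, that's the content-quality signal described above.
```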

Core Web Vitals Report

Under "Experience," the Core Web Vitals report shows real user data (CrUX) for your site, broken down by mobile and desktop, with pages grouped as Good, Needs Improvement, or Poor.

This is the data Google actually uses for rankings — not your Lighthouse score. I've seen sites with 90+ Lighthouse scores have failing field CWV because of third-party scripts that only load in real browser sessions.

Click into any status to see which specific URLs are affected and which metric is failing. Fix the highest-traffic pages first.

Sitemaps

Under Indexing, submit your XML sitemap URL. Search Console shows how many URLs were submitted and how many are indexed. A large gap (100 submitted, 40 indexed) is a signal worth investigating.
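A quick ratio check makes that gap concrete. A sketch using the 100-submitted / 40-indexed example from above; the 60% threshold is an arbitrary starting point, not a Google rule:

```python
# Flag a sitemap whose indexed count lags far behind submissions.
# min_ratio is an assumed heuristic threshold, not a Google rule.

def coverage_gap(submitted, indexed, min_ratio=0.6):
    ratio = indexed / submitted if submitted else 0.0
    return {"ratio": round(ratio, 2), "investigate": ratio < min_ratio}

report = coverage_gap(submitted=100, indexed=40)
# A 0.4 ratio falls below the threshold and gets flagged for review.
```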

Resubmit your sitemap after major content additions or when you've fixed indexing issues. It doesn't force reindexing but it helps Google discover new or updated URLs faster.

Manual Actions

Under "Security & Manual Actions," check for manual penalties applied by Google's quality review team. Manual actions mean a human reviewer decided your site violates Google's spam policies. They're less common than algorithm-based drops but important to check.

If there's a manual action, fix the issue described and submit a reconsideration request. This can take 2-4 weeks for Google to review.

Links Report

The Links report shows your external and internal link data as Google sees it: top linked pages, top linking sites, and top anchor text. Use it to:

  • Check if important pages have enough internal links
  • Verify that link-building efforts are being credited by Google
  • Spot any unnatural anchor text patterns that could signal manipulation
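The anchor-text check in particular lends itself to a script over the Links report export. A sketch — the 30% dominance threshold is my own heuristic, not a documented limit:

```python
# Check the Links report anchor-text export for a single dominant anchor.
# max_share is an assumed heuristic; tune it for your site.
from collections import Counter

def dominant_anchor(anchors, max_share=0.30):
    """Return (anchor, share) if one anchor exceeds max_share, else None."""
    counts = Counter(anchors)
    anchor, n = counts.most_common(1)[0]
    share = n / len(anchors)
    return (anchor, round(share, 2)) if share > max_share else None

anchors = ["best seo tool"] * 6 + ["brand name"] * 2 + ["click here"] * 2
flagged = dominant_anchor(anchors)  # ("best seo tool", 0.6)
```

A flagged exact-match anchor dominating your profile is the kind of pattern worth diluting with branded and natural anchors.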

Setting Up Alerts

Search Console emails you about a few critical events like manual actions, but it has no native alerts for gradual changes. Set up uptime monitoring for your site separately, and check Search Console manually at least weekly. Performance data also lags by a couple of days, so problems can compound before you notice them.

Make Search Console part of your weekly SEO routine. Ten minutes a week looking at Performance and Indexing data will catch most significant issues before they become serious.

