Google Search Console: complete guide
How to use Google’s free tool to monitor and improve your organic rankings
Google Search Console (GSC) is the free tool Google offers site owners to monitor their presence in search results. It provides first-party data on how Google sees your site: which keywords generate impressions, which pages are indexed and which technical issues affect crawling.
Unlike third-party tools that estimate data, Search Console shows real data straight from Google. It’s an essential resource for any SEO professional and should be the first tool you check when auditing or monitoring a site.
Setup and verification
To start using GSC, you need to verify ownership of your site. Google offers several methods: a DNS record (recommended for full-domain verification), an HTML file uploaded to the server, a meta tag in the <head>, or an existing Google Analytics or Google Tag Manager account.
Domain-level verification (Domain property) is the most complete because it covers all variants: www, non-www, HTTP and HTTPS. Once verified, submit your XML sitemap and allow a few days for data to start populating.
- Use DNS verification to cover all domain variants
- Submit your XML sitemap immediately after verifying
- Add users with different access levels (owner, full, restricted)
- Connect GSC with Google Analytics for cross-referenced data
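For reference, the two most common verification methods look like this. These are a sketch of the record formats only; the token value is a placeholder that GSC generates for your specific property:

```
# DNS method (Domain property): add a TXT record at the root of the domain
example.com.  IN  TXT  "google-site-verification=YOUR_TOKEN_HERE"

# HTML tag method (URL-prefix property): place inside <head> on the home page
<meta name="google-site-verification" content="YOUR_TOKEN_HERE" />
```

The DNS record must stay in place permanently: Google re-checks it periodically, and removing it can revoke verification.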
Performance report: clicks, impressions, CTR and position
The performance report is the most consulted panel in GSC. It displays four key metrics: total clicks (visits from search), impressions (how often your pages appear in results), average CTR (percentage of impressions that generate a click) and average position.
You can filter this data by query, page, country, device, search type and date range. This lets you identify keywords generating many impressions but few clicks (an opportunity to improve titles and meta descriptions), pages losing positions or untapped markets with potential.
- Filter by query to discover keywords already generating impressions
- Identify pages with high CTR to replicate what works
- Spot keywords with many impressions and low CTR: improve their titles and meta descriptions
- Compare time periods to identify growth or decline trends
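The "many impressions, low CTR" check from the list above can be sketched in a few lines of Python. The rows mimic the shape of the performance report's query data; the sample numbers and both thresholds are illustrative assumptions, not recommendations:

```python
# Sample rows shaped like the performance report's query data.
# All numbers are illustrative, not from a real property.
rows = [
    {"query": "gsc guide",         "clicks": 120, "impressions": 1500, "position": 3.2},
    {"query": "verify domain dns", "clicks": 4,   "impressions": 2200, "position": 8.7},
    {"query": "sitemap xml",       "clicks": 90,  "impressions": 1000, "position": 2.1},
]

MIN_IMPRESSIONS = 1000   # "many impressions" threshold (assumption)
MAX_CTR = 0.01           # "low CTR" threshold: under 1% (assumption)

def low_ctr_opportunities(rows):
    """Return queries with high impressions but low CTR, i.e. candidates
    for improved titles and meta descriptions."""
    out = []
    for r in rows:
        ctr = r["clicks"] / r["impressions"]
        if r["impressions"] >= MIN_IMPRESSIONS and ctr <= MAX_CTR:
            out.append({**r, "ctr": round(ctr, 4)})
    # Biggest impression counts first: largest untapped opportunity on top
    return sorted(out, key=lambda r: r["impressions"], reverse=True)

print(low_ctr_opportunities(rows))
```

With these sample numbers only "verify domain dns" qualifies: it ranks on page one but converts almost none of its 2,200 impressions into clicks.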
Page indexing report
The page indexing report shows the status of every URL Google has discovered: indexed, excluded, with an error or with a warning. It’s the first place to check if you notice a traffic drop or if new pages aren’t appearing in search.
Pay special attention to URLs marked as “Crawled – currently not indexed,” “Duplicate without user-selected canonical” and “Excluded by noindex tag.” These statuses can reveal thin-content problems, incorrect canonicals or accidental noindex rules.
Core Web Vitals in Search Console
GSC includes a dedicated Core Web Vitals report that groups your URLs by status: good (green), needs improvement (amber) or poor (red). The data comes from the Chrome User Experience Report (CrUX), meaning real users visiting your site.
Unlike lab tools such as Lighthouse that measure under controlled conditions, CrUX data reflects the real experience of your users on their own devices and connections. Prioritise fixing URLs marked as “poor” because they have the greatest impact on ranking.
- Monitor LCP, INP and CLS on the mobile tab (the one Google uses for indexing)
- Prioritise fixes on high-traffic pages
- After implementing improvements, use the “Validate Fix” button so Google re-evaluates
- Complement with PageSpeed Insights for URL-specific diagnostics
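As a sketch, the report's status buckets can be reproduced with Google's published good/poor thresholds for each metric, applied to the 75th-percentile values CrUX reports. The function names are mine, and the worst-metric rule reflects how the report appears to group URLs (one poor metric puts the whole URL in the red bucket):

```python
# Published good / poor thresholds for the three Core Web Vitals,
# applied to the 75th-percentile value reported by CrUX.
THRESHOLDS = {
    "LCP": (2.5, 4.0),    # seconds
    "INP": (200, 500),    # milliseconds
    "CLS": (0.1, 0.25),   # unitless layout-shift score
}

def classify(metric, p75):
    """Map a metric's 75th-percentile value to a GSC-style status."""
    good, poor = THRESHOLDS[metric]
    if p75 <= good:
        return "good"
    if p75 <= poor:
        return "needs improvement"
    return "poor"

def url_status(metrics):
    """A URL takes the status of its worst metric."""
    order = ["good", "needs improvement", "poor"]
    statuses = [classify(m, v) for m, v in metrics.items()]
    return max(statuses, key=order.index)

print(url_status({"LCP": 2.1, "INP": 350, "CLS": 0.05}))  # → needs improvement
```

Here LCP and CLS are green, but the 350 ms INP drags the URL into the amber bucket, which is why fixing the worst metric first moves URLs between buckets fastest.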
Manual actions and security
Manual actions are penalties applied by a human Google reviewer for violating quality guidelines: link spam, low-quality auto-generated content, cloaking, keyword stuffing or hacked content.
The security section shows whether Google has detected that your site has been compromised: malware, phishing, spam content injections or harmful downloads. Both sections should be empty; if they’re not, immediate action is required.
- Check periodically for active manual actions
- If you receive a manual action, fix the issue and submit a reconsideration request
- Set up email alerts in GSC to receive notifications about problems
- Security issues require urgent attention: they affect ranking and user trust
Advanced features and API
GSC offers additional features many users overlook: the URL Inspection tool lets you see exactly how Google renders a page, whether it is indexed and whether it detects any issues. The Removals tool lets you request the temporary removal of a URL from search results.
The Search Console API allows you to extract performance data programmatically, which is useful for custom dashboards, automated reports and large-scale analysis with tools like Google Sheets, Looker Studio or Python scripts.
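A minimal extraction sketch, assuming the `google-api-python-client` package and OAuth credentials with access to the property. The date window, dimensions and row limit are illustrative defaults; the live call is kept inside its own function so the request-building and flattening helpers can be reused (or tested) offline:

```python
from datetime import date, timedelta

def build_query_body(days=28, dimensions=("query",), row_limit=1000):
    """Request body for the Search Analytics query endpoint.
    The date window, dimensions and row limit are illustrative defaults."""
    end = date.today()
    start = end - timedelta(days=days)
    return {
        "startDate": start.isoformat(),
        "endDate": end.isoformat(),
        "dimensions": list(dimensions),
        "rowLimit": row_limit,
    }

def flatten_rows(response, dimensions=("query",)):
    """Flatten the API's row format ({"keys": [...], "clicks": ...})
    into one dict per row, ready for a CSV, Google Sheet or DataFrame."""
    flat = []
    for row in response.get("rows", []):
        record = dict(zip(dimensions, row["keys"]))
        record.update({k: row[k] for k in ("clicks", "impressions", "ctr", "position")})
        flat.append(record)
    return flat

def fetch_report(site_url, credentials):
    """Live call: needs google-api-python-client and credentials with
    Search Console access, so this function won't run as-is."""
    from googleapiclient.discovery import build  # third-party dependency
    service = build("searchconsole", "v1", credentials=credentials)
    response = service.searchanalytics().query(
        siteUrl=site_url, body=build_query_body()
    ).execute()
    return flatten_rows(response)
```

Note that the API returns sampled, filtered data in the same way the web interface does; it removes the interface's export limits, not its privacy thresholds.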
Key Takeaways
- Google Search Console is the most reliable SEO data source because it comes straight from Google
- The performance report reveals keyword opportunities and CTR improvements
- The page indexing report detects indexation issues before they cost you traffic
- Core Web Vitals in GSC reflect your users’ real experience
- Manual actions and security alerts require immediate attention
Need help interpreting Search Console?
We analyse your Search Console data and turn insights into a prioritised SEO action plan. No strings attached.