How to Connect Google Search Console to VS Code (Step-by-Step API Setup)
Umair Akhter
14 MIN READ
Most developers check Google Search Console in a separate browser tab, copy data into a spreadsheet, and then switch back to their editor to act on it. That’s three context switches for a task that could happen in one place. If you’re already writing code in VS Code — and 75.9% of developers are, according to the 2025 Stack Overflow Developer Survey — there’s a better way.
This guide walks you through the complete setup: a Google Cloud project, three APIs, one config file, and you’ll have live GSC data (search queries, clicks, impressions, CTR, position, and Core Web Vitals) appearing directly inside VS Code. The whole process takes about 15 minutes.
Key Takeaways
- Two access tiers: Tier 0 (API key, no GSC needed) gives you PageSpeed and Core Web Vitals; Tier 1 (service account) unlocks live search query data from GSC.
- Sites that pass all Core Web Vitals thresholds rank an average of 2.3 positions higher (DebugBear, 2025) — so having CWV data in your editor pays off.
- The setup is free. You’re using Google’s own APIs, not a third-party tool.
What Is the Google Search Console API, and Why Bring It Into VS Code?
The Google Search Console API gives programmatic access to the same data you see in the GSC dashboard: search queries, click-through rates, average positions, and URL-level indexation status. Paired with the PageSpeed Insights API and the Chrome UX Report (CrUX) API, you get a complete picture of how Google sees your site and how real users experience it.
Why does this matter? Because Core Web Vitals are a confirmed Google ranking factor. Sites passing all three CWV thresholds (LCP, INP, CLS) rank 2.3 positions higher on average (DebugBear, 2025). Vodafone’s 31% improvement in LCP alone led to 8% more sales. That’s the kind of data worth watching closely — and watching it in your editor means you catch regressions the moment you push a change, not three days later when you happen to log into the GSC dashboard.
VS Code is where 75.9% of developers already spend their working hours (Stack Overflow Developer Survey, 2025). Bringing GSC data there isn’t just a convenience — it removes the friction that causes most developers to ignore SEO signals until something breaks.
What You’ll Need Before You Start
Before touching Google Cloud, make sure you have:
- A Google account with at least one property verified in Google Search Console.
- VS Code installed, with the Claude Code extension or CLI set up.
- About 15 minutes — the steps look long, but most are one-click actions.
Two tiers, one setup. The same process creates two levels of API access:
| Tier | What you need | What you get |
|---|---|---|
| Tier 0 | API key only | PageSpeed scores, Core Web Vitals (CrUX field data) |
| Tier 1 | API key + service account | Everything above, plus GSC queries, clicks, impressions, CTR, position |
You can stop at Tier 0 if you only want performance data. But Tier 1 is worth the extra five minutes — search query data is where the real insights live.
Steps 1–2: Create a Google Cloud Project and Enable the APIs
You need a Google Cloud project to generate credentials. It doesn’t cost anything for the API volumes a typical site generates.
Go to console.cloud.google.com, click New Project, and name it something you’ll recognise (for example, Claude SEO). Once the project is created, navigate to APIs & Services → Library and enable these three APIs one by one:
- Google Search Console API — powers query and click data.
- PageSpeed Insights API — powers Lighthouse scores and field CWV data.
- Chrome UX Report API — powers historical CWV trends.
Each one has a search bar. Search the name, click the result, click Enable. That’s it.
Why three? Each API is scoped to a different data source. PageSpeed and CrUX return public performance data — no site ownership required. The Search Console API returns private data tied to your verified property, which is why it needs the service account in Step 4.
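To make the split concrete, here's a minimal stdlib sketch of the REST endpoints behind the three APIs, plus a helper that builds a Tier 0 PageSpeed request URL. The endpoint paths follow Google's public API references; the helper itself is illustrative, not part of any tool in this guide.

```python
from urllib.parse import urlencode

# REST endpoints behind the three APIs enabled above.
# PageSpeed and CrUX accept a plain API key (Tier 0);
# Search Console needs signed service-account credentials (Tier 1).
ENDPOINTS = {
    "pagespeed": "https://www.googleapis.com/pagespeedonline/v5/runPagespeed",
    "crux": "https://chromeuxreport.googleapis.com/v1/records:queryRecord",
    "gsc": "https://searchconsole.googleapis.com/webmasters/v3/sites/{site}/searchAnalytics/query",
}

def pagespeed_url(page_url: str, api_key: str, strategy: str = "mobile") -> str:
    """Build a GET URL for a PageSpeed Insights run (Tier 0: key only)."""
    query = urlencode({"url": page_url, "key": api_key, "strategy": strategy})
    return f"{ENDPOINTS['pagespeed']}?{query}"
```

Note the asymmetry: the first two are simple keyed requests to public data, while the GSC endpoint is scoped to a specific site — which is exactly why it needs the extra authentication set up in Steps 4–5.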
Step 3: Generate Your API Key (Tier 0)
With the APIs enabled, go to Credentials → Create Credentials → API Key. Google generates a key immediately. Copy it — you’ll paste it into the config file in Step 6.
One important step: restrict the key so it can only call the APIs you’ve enabled. Click on the key, scroll to API restrictions, select Restrict key, and choose:
- PageSpeed Insights API
- Chrome UX Report API
- Knowledge Graph Search API (optional, for entity data)
Leave Application restrictions set to None for local development. If you deploy this to a server later, restrict by IP.
What does Tier 0 give you right now? Run this command inside VS Code:
/seo google pagespeed https://yoursite.com
You’ll get Lighthouse performance scores and field Core Web Vitals from real Chrome users — no GSC access needed. That’s immediately useful.
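If you want to consume the raw API response yourself, the headline numbers live in predictable fields. A sketch of extracting them — the field names (`lighthouseResult`, `loadingExperience.metrics`) follow the PageSpeed Insights v5 response format, but the sample dict below is synthetic, not real data:

```python
# Minimal sketch: pull the headline numbers out of a PageSpeed
# Insights v5 response dict (lab score + field CWV percentiles).

def summarize_psi(response: dict) -> dict:
    perf = response["lighthouseResult"]["categories"]["performance"]["score"]
    field = response.get("loadingExperience", {}).get("metrics", {})
    return {
        "performance": round(perf * 100),  # Lighthouse reports 0.0-1.0
        "lcp_ms": field.get("LARGEST_CONTENTFUL_PAINT_MS", {}).get("percentile"),
        "inp_ms": field.get("INTERACTION_TO_NEXT_PAINT", {}).get("percentile"),
    }

# Synthetic response for illustration.
sample = {
    "lighthouseResult": {"categories": {"performance": {"score": 0.92}}},
    "loadingExperience": {"metrics": {
        "LARGEST_CONTENTFUL_PAINT_MS": {"percentile": 2100},
        "INTERACTION_TO_NEXT_PAINT": {"percentile": 180},
    }},
}
print(summarize_psi(sample))  # {'performance': 92, 'lcp_ms': 2100, 'inp_ms': 180}
```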
Step 4: Create a Service Account (Tier 1)
A service account is a non-human Google identity — think of it as a bot account that can authenticate with APIs on your behalf. GSC requires this because it’s serving your private search data.
Navigate to IAM & Admin → Service Accounts → Create Service Account. Give it a name (for example, claude-seo). On the permissions step, skip the role assignment — the service account doesn’t need any IAM roles for this use case. Click through to Done.
Once created, click on the service account, go to the Keys tab, click Add Key → Create New Key → JSON, and download the file. Rename it service_account.json and save it here:
Windows: C:\Users\<yourname>\.config\claude-seo\service_account.json
Mac/Linux: ~/.config/claude-seo/service_account.json
Keep this file out of version control. Add .config/claude-seo/ to your .gitignore. The JSON file contains a private key — if it ends up in a public repo, anyone can impersonate your service account.
Step 5: Add the Service Account to Google Search Console
This is the step most tutorials skip, and it’s why people get 403 errors.
The service account you just created has an email address — something like claude-seo@your-project-id.iam.gserviceaccount.com. You can find it on the service account’s detail page in Google Cloud Console.
Open Google Search Console, go to Settings → Users and Permissions → Add User, paste that email address, and set the permission level to Full. Save.
That’s it. You’ve just given your local tooling the same access level as a human GSC user. The Search Console API will now accept requests signed with your service account credentials.
According to the Google Search Console API documentation, service account access is the recommended authentication method for automated tools. It doesn’t expire like OAuth tokens do, and it doesn’t require interactive browser login.
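If you'd rather grab that email address from the key file you downloaded than click through Cloud Console, it's the client_email field of the JSON. A stdlib sketch — the dict below is a synthetic stand-in for a real key file, with the private key elided:

```python
import json

# The downloaded key file is plain JSON; the address you add under
# GSC Settings -> Users and Permissions is its "client_email" field.

def service_account_email(key_json: str) -> str:
    return json.loads(key_json)["client_email"]

# Synthetic key for illustration (a real file also carries the full
# private key and token URIs).
fake_key = json.dumps({
    "type": "service_account",
    "project_id": "your-project-id",
    "client_email": "claude-seo@your-project-id.iam.gserviceaccount.com",
})
print(service_account_email(fake_key))
```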
Step 6: Create the Config File
Create a file named google-api.json and save it next to your service account file:
C:\Users\<yourname>\.config\claude-seo\google-api.json
The content:
{
  "service_account_path": "C:/Users/<yourname>/.config/claude-seo/service_account.json",
  "api_key": "AIzaSy...",
  "default_property": "sc-domain:yoursite.com"
}
Three fields explained:
- service_account_path — absolute path to the JSON key you downloaded in Step 4. Use forward slashes even on Windows.
- api_key — the key from Step 3.
- default_property — your GSC property. Use sc-domain:yoursite.com if you verified via DNS (domain property). Use https://yoursite.com/ if you verified via HTML file or meta tag (URL prefix property). The format must match exactly what appears in your GSC property selector.
Not sure which format your property uses? Log into GSC and check the URL in your browser. If it shows sc-domain%3Ayoursite.com, use the sc-domain: prefix.
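That %3A is just URL encoding for the colon, so decoding the browser URL fragment gives you the exact string default_property expects. A one-liner sketch:

```python
from urllib.parse import unquote

# GSC's UI URL-encodes the property id ("sc-domain%3Ayoursite.com").
# Decoding it yields the exact value to put in default_property.

def property_from_gsc_url(encoded: str) -> str:
    return unquote(encoded)

print(property_from_gsc_url("sc-domain%3Ayoursite.com"))  # sc-domain:yoursite.com
```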
Step 7: Run Your First Commands in VS Code
Open your VS Code terminal or Claude chat panel and run:
Tier 0 — no GSC needed:
/seo google pagespeed https://yoursite.com
Returns: Lighthouse performance score, LCP, INP, CLS, FCP, TTFB — both lab data (Lighthouse) and field data from real Chrome users.
Tier 1 — live GSC data:
/seo google gsc sc-domain:yoursite.com
Returns: top queries by clicks, impressions, CTR, average position — the same table you see in GSC’s Search Results report, live in your terminal.
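Under the hood, a "top queries" report is a POST to the Search Analytics endpoint with a small JSON body. A sketch of that body — the field names (startDate, endDate, dimensions, rowLimit) follow the Search Console API; the helper and its defaults are illustrative:

```python
from datetime import date, timedelta

# Build the request body for a "top queries over the last N days"
# Search Analytics query. Dates are ISO-8601 strings.

def top_queries_body(days: int = 28, row_limit: int = 25) -> dict:
    end = date.today()
    start = end - timedelta(days=days)
    return {
        "startDate": start.isoformat(),
        "endDate": end.isoformat(),
        "dimensions": ["query"],
        "rowLimit": row_limit,
    }

body = top_queries_body()
```

Swapping "query" for "page" in dimensions gives the per-URL view instead of the per-query view — the same toggle the GSC dashboard exposes as tabs.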
If both commands return data, you’re fully set up. If either one errors out, the next section covers the most common failure modes.
Troubleshooting: Common Errors and Fixes
Most setup issues fall into one of three categories. Here’s what causes each and how to fix it.
403 Forbidden on GSC commands
The service account email hasn’t been added to Google Search Console, or it was added with the wrong permission level. Go back to Step 5, confirm the service account email matches exactly what’s in your JSON key file, and ensure the permission is set to Full (not Restricted).
“API not enabled” error
One of the three APIs wasn’t enabled in Step 2. Return to APIs & Services → Library, search for the specific API named in the error, and enable it. Changes take about 30 seconds to propagate.
Wrong property format
If your command uses sc-domain:yoursite.com but your GSC property is https://yoursite.com/, you’ll get an empty result or an error. Check your default_property in google-api.json and match it exactly to your GSC property URL.
Path errors on Windows
In google-api.json, use forward slashes in the service_account_path value, even on Windows. Backslashes in JSON strings need escaping (\\), which causes hard-to-spot bugs. Stick with C:/Users/....
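You can see the failure mode directly with Python's json module — forward slashes pass through untouched, doubled backslashes work, and single backslashes are rejected as invalid escapes:

```python
import json

# Forward slashes need no escaping in JSON.
good = json.loads('{"service_account_path": "C:/Users/me/.config/claude-seo/service_account.json"}')
print(good["service_account_path"])

# Backslashes work only when doubled in the JSON text.
also_ok = json.loads('{"service_account_path": "C:\\\\Users\\\\me\\\\key.json"}')
print(also_ok["service_account_path"])  # C:\Users\me\key.json

# Single backslashes produce invalid escapes like \U and fail to parse.
try:
    json.loads('{"p": "C:\\Users\\me"}')
except json.JSONDecodeError as e:
    print("single backslashes fail:", e)
```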
A common pattern: most setup failures happen in Step 5, not Steps 1–4. Adding the service account to GSC is a GSC-side action — it’s easy to complete all the Cloud Console steps and forget this one because it lives in a completely different product.
Frequently Asked Questions
Is this setup free?
Yes. The Google Search Console API, PageSpeed Insights API, and CrUX API are all free. Google imposes rate limits (25,000 requests per day for PageSpeed, for example), but no typical development workflow comes close to those limits. There are no charges from Google for normal usage.
What’s the difference between Tier 0 and Tier 1?
Tier 0 uses only an API key and returns public performance data — Lighthouse scores and Core Web Vitals from the Chrome UX Report. Anyone can query this data for any public URL. Tier 1 adds a service account linked to your GSC property, which returns private data: your site’s actual search queries, clicks, impressions, CTR, and average position. Tier 1 requires you to have a verified property in Google Search Console.
Can I use this setup for multiple sites?
Yes. Add each site as a separate verified property in Google Search Console, then add the same service account email as a user on each one. In your commands, specify the property you want: /seo google gsc sc-domain:site2.com. You can also update default_property in google-api.json to switch defaults, or pass the property inline.
Does this work on Mac and Linux?
Yes. The config file path uses ~/.config/claude-seo/ on Mac and Linux. The setup steps are identical — the only difference is the file path format. Use "service_account_path": "/Users/yourname/.config/claude-seo/service_account.json" in google-api.json.
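If you're scripting against the config yourself, pathlib resolves the right directory on every platform from one expression — Path.home() covers C:\Users\<name> on Windows and /Users/<name> or /home/<name> on macOS and Linux. A sketch (the claude-seo directory name is the one used throughout this guide):

```python
from pathlib import Path

# Cross-platform path to the config file used in Step 6.
CONFIG_DIR = Path.home() / ".config" / "claude-seo"
CONFIG_FILE = CONFIG_DIR / "google-api.json"
print(CONFIG_FILE)
```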
Conclusion
Seven steps. One config file. Live GSC data in VS Code.
The setup gives you two things that most SEO workflows don’t: speed and proximity. Speed because PageSpeed and CWV data appear immediately, without tab-switching. Proximity because search data sits next to the code that affects it — you see a query’s average position drop and you’re already in the file that controls that page.
The SEO automation market is growing fast — from $1.99 billion in 2024 to an estimated $4.97 billion by 2033 (SEO.com, 2025). Most of that growth is tools doing what you just did manually: pulling structured data out of Google’s APIs and making it actionable. You’ve now done that for free, in 15 minutes, from your own editor.
Next step: run /seo google gsc sc-domain:yoursite.com and see which queries you’re ranking for between positions 8 and 15. Those are your quickest wins — pages already visible to Google that need a content or technical nudge to reach page one.
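Filtering for that position band is a one-liner once you have the API rows. A sketch — the row shape (keys, clicks, position) follows the Search Analytics response format, but the sample rows below are invented for illustration:

```python
# Filter Search Analytics rows to the "striking distance" band
# (average position 8-15): visible to Google, short of page one.

def striking_distance(rows, low=8.0, high=15.0):
    return [r for r in rows if low <= r["position"] <= high]

# Synthetic rows for illustration.
rows = [
    {"keys": ["vs code seo"], "clicks": 40, "position": 4.2},
    {"keys": ["gsc api setup"], "clicks": 12, "position": 9.8},
    {"keys": ["crux api"], "clicks": 3, "position": 14.1},
]
hits = striking_distance(rows)
print([r["keys"][0] for r in hits])  # ['gsc api setup', 'crux api']
```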