Google Search Console MCP server for SEOs
April 2026 (v0.3.0): Coming to the Cursor Marketplace — one-click install with bundled SEO skills. Also: token storage moved to the user config dir (survives uvx upgrades), all data tools now return structured JSON, and 39 new unit tests. See the Changelog for details.
A Model Context Protocol (MCP) server that connects Google Search Console (GSC) to AI assistants, allowing you to analyze your SEO data through natural language conversations. Works with Claude, Cursor, Codex, Gemini CLI, Antigravity, and any other MCP-compatible client. This integration gives you access to property information, search analytics, URL inspection, and sitemap management—all through simple chat.
Cursor Marketplace
One-click install available — search for mcp-search-console in the Cursor Marketplace.
After installing, configure your credentials (see Getting Started below) then use the bundled skills directly in Cursor Agent chat:
| Skill | How to invoke | What it does |
|---|---|---|
| seo-weekly-report | "Run the SEO weekly report for example.com" | Full 28-day performance summary with period-over-period comparison and top queries |
| cannibalization-check | "Check for keyword cannibalization on example.com" | Finds queries where multiple pages compete; recommends which to keep |
| indexing-audit | "Audit indexing for my top pages" | Batch-inspects top 20 pages and returns a prioritized fix list |
| content-opportunities | "Find content opportunities for example.com" | Surfaces position-11-20 queries with high impressions and low CTR |
Required environment variables (set in Cursor MCP settings after install)
| Variable | Required | Description |
|---|---|---|
| GSC_OAUTH_CLIENT_SECRETS_FILE | One of these two | Path to your OAuth client_secrets.json |
| GSC_CREDENTIALS_PATH | One of these two | Path to your service account credentials JSON |
| GSC_DATA_STATE | Optional | all (default, matches GSC dashboard) or final (2–3 day lag) |
| GSC_ALLOW_DESTRUCTIVE | Optional | Set to true to enable add/delete site and delete sitemap tools |
First-time authentication (OAuth users only)
After installing, ask your AI assistant: "Authenticate my Google Search Console" — it will run the reauthenticate tool which opens a browser window once to authorize access. Subsequent uses are token-based and require no interaction.
What Can This Tool Do For SEO Professionals?
-
Property Management
- See all your GSC properties in one place
- Get verification details and basic site information
- Add new properties to your account
- Remove properties from your account
-
Search Analytics & Reporting
- Discover which search queries bring visitors to your site
- Track impressions, clicks, and click-through rates
- Analyze performance trends over time
- Compare different time periods to spot changes
- Visualize your data with charts and graphs created by Claude
-
URL Inspection & Indexing
- Check if specific pages have indexing problems
- See when Google last crawled your pages
- Inspect multiple URLs at once to identify patterns
- Get actionable insights on how to improve indexing
-
Sitemap Management
- View all your sitemaps and their status
- Submit new sitemaps directly through Claude
- Check for errors or warnings in your sitemaps
- Monitor sitemap processing status
Available Tools
Here's what you can ask your AI assistant to do once you've set up this integration:
| What You Can Ask For | What It Does | What You'll Need to Provide |
|---------------------------------|-------------------------------------------------------------|----------------------------------------------------------------|
| list_properties | Shows all your GSC properties | Nothing - just ask! |
| get_site_details | Shows details about a specific site | Your website URL |
| add_site | Adds a new site to your GSC properties | Your website URL |
| delete_site | Removes a site from your GSC properties | Your website URL |
| get_search_analytics | Shows top queries and pages with metrics | Your website URL, time period, and optional row_limit (default 20, max 500) |
| get_performance_overview | Gives a summary of site performance | Your website URL and time period |
| check_indexing_issues | Checks if pages have indexing problems | Your website URL and list of pages to check |
| inspect_url_enhanced | Detailed inspection of a specific URL | Your website URL and the page to inspect |
| get_sitemaps | Lists all sitemaps for your site | Your website URL |
| submit_sitemap | Submits a new sitemap to Google | Your website URL and sitemap URL |
For a complete list of all 20 available tools and their detailed descriptions, ask your AI assistant to "list tools" after setup.
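Under the hood, a data tool like get_search_analytics translates your request into a Search Console API query. Below is a minimal sketch of the request body such a call might assemble (field names follow the public Search Analytics API; the helper itself is illustrative, not the server's actual code):

```python
from datetime import date, timedelta

def build_search_analytics_body(days: int = 28, row_limit: int = 20) -> dict:
    """Assemble a Search Analytics query body for the last `days` days.

    Illustrative sketch only; field names follow the public GSC API.
    """
    end = date.today()
    start = end - timedelta(days=days)
    return {
        "startDate": start.isoformat(),   # API expects YYYY-MM-DD strings
        "endDate": end.isoformat(),
        "dimensions": ["query", "page"],
        "rowLimit": min(row_limit, 500),  # the server caps row_limit at 500
        "dataState": "all",               # fresh data, matching the GSC dashboard
    }
```

Asking for "top 1000 queries" would therefore still return at most 500 rows per request, per the documented cap.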
Getting Started (No Coding Experience Required!)
1. Set Up Google Search Console API Access
Before using this tool, you'll need to create API credentials that allow your AI assistant to access your GSC data:
Authentication Options
The tool supports two authentication methods:
1. OAuth Authentication (Recommended)

This method lets you authenticate with your own Google account, which is often more convenient than using a service account; the tool gets access to the same resources you normally do. (To skip OAuth entirely and use only service account authentication, set GSC_SKIP_OAUTH to "true", "1", or "yes".)
- Go to the Google Cloud Console and create a Google Cloud account if you don't have one
- Create a new project or select an existing one
- Enable the Search Console API for your project
- Add the scope https://www.googleapis.com/auth/webmasters to your project
- Go to the "Credentials" page
- Click "Create Credentials" and select "OAuth client ID"
- Configure the OAuth consent screen
- For application type, select "Desktop app"
- Give your OAuth client a name and click "Create"
- Download the client secrets JSON file (it will be named something like client_secrets.json)
- Place this file in the same directory as the script, or set the GSC_OAUTH_CLIENT_SECRETS_FILE environment variable to point to its location
When you run the tool for the first time with OAuth authentication, it will open a browser window asking you to sign in to your Google account and authorize the application. After authorization, the tool will save the token for future use.
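Conceptually, the token caching works like the stdlib-only sketch below. This is a simplification: the real server uses Google's auth libraries, and (per the changelog) stores the token in the platform user config directory. `get_token` and its arguments are hypothetical names for illustration.

```python
import json
from pathlib import Path

def get_token(token_file: Path, run_browser_flow) -> dict:
    """Return the cached OAuth token, running the browser flow only on first use."""
    if token_file.exists():
        return json.loads(token_file.read_text())   # reuse: no interaction needed
    token = run_browser_flow()                      # first run: opens the browser once
    token_file.parent.mkdir(parents=True, exist_ok=True)
    token_file.write_text(json.dumps(token))        # cache for future runs
    return token
```

This is why only the very first run requires a browser window; every later run finds the cached token and proceeds silently.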
2. Service Account Authentication

This method uses a service account, which is useful for automated scripts or when you don't want to use your personal Google account. It requires adding the service account as a user in Google Search Console.

Setup Instructions:
- Go to the Google Cloud Console and create a Google Cloud account if you don't have one
- Create a new project or select an existing one
- Enable the Search Console API for your project
- Go to the "Credentials" page
- Click "Create Credentials" and select "Service Account"
- Fill in the service account details and click "Create"
- Click on the newly created service account
- Go to the "Keys" tab and click "Add Key" > "Create new key"
- Select JSON format and click "Create"
- Download the key file and save it as service_account_credentials.json in the same directory as the script, or set the GSC_CREDENTIALS_PATH environment variable to point to its location
- Add your service account email address to the appropriate Search Console properties
🎬 Watch this beginner-friendly tutorial on YouTube:
Click the image above to watch the step-by-step video tutorial
2. Install Required Software
You'll need to install these tools on your computer:
- Python (version 3.11 or newer) - This runs the MCP server
- An MCP-compatible AI client — Claude Desktop, Cursor, Codex CLI, Gemini CLI, or Antigravity are all supported
Make sure Python is properly installed and available in your system path before proceeding.
3. Install the MCP Server
Option A — uvx (simplest, no clone needed)
If you have uv installed, you can skip cloning entirely. Use this config directly in step 5:
{
"mcpServers": {
"gscServer": {
"command": "uvx",
"args": ["mcp-search-console"],
"env": {
"GSC_CREDENTIALS_PATH": "/FULL/PATH/TO/service_account_credentials.json",
"GSC_SKIP_OAUTH": "true"
}
}
}
}
uvx installs the server in an isolated environment automatically and keeps it up to date. No virtual environment management needed. Skip to Step 5 if using this option.
Option B — Clone manually (more control)
Download this tool to your computer. The easiest way is:
- Click the green "Code" button at the top of this page
- Select "Download ZIP"
- Unzip the downloaded file to a location you can easily find (like your Documents folder)
Alternatively, if you're familiar with Git:
git clone https://github.com/AminForou/mcp-gsc.git
4. Install Required Components (Option B only)
Open your computer's Terminal (Mac) or Command Prompt (Windows):
-
Navigate to the folder where you unzipped the files:
# Example (replace with your actual path):
cd ~/Documents/mcp-gsc-main
-
Create a virtual environment (this keeps the project dependencies isolated):
# Using uv (recommended):
uv venv .venv

# If uv is not installed, install it first:
pip install uv
# Then create the virtual environment:
uv venv .venv

# OR using standard Python:
python -m venv .venv

Note: If you get a "pip not found" error when trying to install uv, see the "If you get 'pip not found' error" section below.
-
Activate the virtual environment:
# On Mac/Linux:
source .venv/bin/activate

# On Windows:
.venv\Scripts\activate
-
Install the required dependencies:
# Using uv:
uv pip install -r requirements.txt

# OR using standard pip:
pip install -r requirements.txt

If you get a "pip not found" error:

# First ensure pip is installed and updated:
python3 -m ensurepip --upgrade
python3 -m pip install --upgrade pip

# Then try installing the requirements again:
python3 -m pip install -r requirements.txt

# Or to install uv:
python3 -m pip install uv
When you see (.venv) at the beginning of your command prompt, it means the virtual environment is active and the dependencies will be installed there without affecting your system Python installation.
5. Connect Your AI Client to Google Search Console
The configuration below uses Claude Desktop as an example. For other clients (Cursor, Codex, Gemini CLI, Antigravity), the JSON structure is the same — check your client's documentation for where the config file lives.
- Download and install Claude Desktop if you haven't already
- Make sure you have your Google credentials file saved somewhere on your computer
- Open your computer's Terminal (Mac) or Command Prompt (Windows) and type:
# For Mac users:
nano ~/Library/Application\ Support/Claude/claude_desktop_config.json
# For Windows users:
notepad %APPDATA%\Claude\claude_desktop_config.json
- Add the following configuration text (this tells your AI client how to connect to GSC):
OAuth authentication (using your own account)
{
"mcpServers": {
"gscServer": {
"command": "/FULL/PATH/TO/mcp-gsc-main/.venv/bin/python",
"args": ["/FULL/PATH/TO/mcp-gsc-main/gsc_server.py"],
"env": {
"GSC_OAUTH_CLIENT_SECRETS_FILE": "/FULL/PATH/TO/client_secrets.json",
"GSC_DATA_STATE": "all"
}
}
}
}
Service account authentication
{
"mcpServers": {
"gscServer": {
"command": "/FULL/PATH/TO/mcp-gsc-main/.venv/bin/python",
"args": ["/FULL/PATH/TO/mcp-gsc-main/gsc_server.py"],
"env": {
"GSC_CREDENTIALS_PATH": "/FULL/PATH/TO/service_account_credentials.json",
"GSC_SKIP_OAUTH": "true",
"GSC_DATA_STATE": "all"
}
}
}
}
Environment Variables Reference
| Variable | Required | Default | Description |
|---|---|---|---|
| GSC_OAUTH_CLIENT_SECRETS_FILE | OAuth only | client_secrets.json (same folder) | Path to your OAuth client secrets JSON file |
| GSC_CREDENTIALS_PATH | Service account only | service_account_credentials.json (same folder) | Path to your service account JSON key file |
| GSC_SKIP_OAUTH | No | false | Set to "true" to force service account auth and skip OAuth |
| GSC_DATA_STATE | No | "all" | "all" returns fresh data matching the GSC dashboard. "final" returns only confirmed data (2–3 day lag). |
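As a rough sketch of how these variables interact, the selection logic implied by the table can be expressed like this (illustrative helpers only, not the server's actual code):

```python
def choose_auth(env: dict) -> str:
    """Pick the auth path implied by the variables above (illustrative)."""
    if env.get("GSC_SKIP_OAUTH", "false").lower() in ("true", "1", "yes"):
        return "service_account"
    return "oauth"   # default: OAuth with your own Google account

def resolve_data_state(env: dict) -> str:
    """Default to fresh data unless GSC_DATA_STATE=final is set."""
    return "final" if env.get("GSC_DATA_STATE", "all").lower() == "final" else "all"
```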
Important: Replace all paths with the actual locations on your computer:
- The first path should point to the Python executable inside your virtual environment
- The second path should point to the gsc_server.py file inside the folder you unzipped
- The third path should point to your Google service account credentials JSON file
Examples:
- Mac:
  - Python path: /Users/yourname/Documents/mcp-gsc/.venv/bin/python
  - Script path: /Users/yourname/Documents/mcp-gsc/gsc_server.py
- Windows:
  - Python path: C:\\Users\\yourname\\Documents\\mcp-gsc\\.venv\\Scripts\\python.exe
  - Script path: C:\\Users\\yourname\\Documents\\mcp-gsc\\gsc_server.py
-
Save the file:
- Mac: Press Ctrl+O, then Enter, then Ctrl+X to exit
- Windows: Click File > Save, then close Notepad
-
Restart your AI client
-
When it opens, you should now see GSC tools available in the tools section
6. Start Analyzing Your SEO Data!
Now you can ask your AI assistant questions about your GSC data! It can not only retrieve the data but also analyze it, explain trends, and create visualizations to help you understand your SEO performance better.
Here are some powerful prompts you can use with each tool:
| Tool Name | Sample Prompt |
|---------------------------------|--------------------------------------------------------------------------------------------------|
| list_properties | "List all my GSC properties and tell me which ones have the most pages indexed." |
| get_site_details | "Analyze the verification status of mywebsite.com and explain what the ownership details mean." |
| add_site | "Add my new website https://mywebsite.com to Search Console and verify its status." |
| delete_site | "Remove the old test site https://test.mywebsite.com from Search Console." |
| get_search_analytics | "Show me the top 20 search queries for mywebsite.com in the last 30 days, highlight any with CTR below 2%, and suggest title improvements." |
| get_performance_overview | "Create a visual performance overview of mywebsite.com for the last 28 days, identify any unusual drops or spikes, and explain possible causes." |
| check_indexing_issues | "Check these important pages for indexing issues and prioritize which ones need immediate attention: mywebsite.com/product, mywebsite.com/services, mywebsite.com/about" |
| inspect_url_enhanced | "Do a comprehensive inspection of mywebsite.com/landing-page and give me actionable recommendations to improve its indexing status." |
| batch_url_inspection | "Inspect my top 5 product pages, identify common crawling or indexing patterns, and suggest technical SEO improvements." |
| get_sitemaps | "List all sitemaps for mywebsite.com, identify any with errors, and recommend next steps." |
| list_sitemaps_enhanced | "Analyze all my sitemaps for mywebsite.com, focusing on error patterns, and create a prioritized action plan." |
| submit_sitemap | "Submit my new product sitemap at https://mywebsite.com/product-sitemap.xml and explain how long it typically takes for Google to process it." |
| get_sitemap_details | "Check the status of my main sitemap at mywebsite.com/sitemap.xml and explain what the warnings mean for my SEO." |
| get_search_by_page_query | "What search terms are driving traffic to my blog post at mywebsite.com/blog/post-title? Identify opportunities to optimize for related keywords." |
| compare_search_periods | "Compare my site's performance between January and February. What queries improved the most, which declined, and what might explain these changes?" |
| get_advanced_search_analytics | "Analyze queries with high impressions but positions below 10, filtered to mobile traffic in the US only. Use filters with country=usa and device=MOBILE." |
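For get_advanced_search_analytics, the filters parameter corresponds to the API's dimensionFilterGroups structure, where filters in one group are combined with AND logic. A sketch of the request fragment that the mobile-USA example above implies (field names follow the public Search Analytics API; build_filter_group is a hypothetical helper):

```python
def build_filter_group(filters: list) -> dict:
    """Wrap filter objects in the API's dimensionFilterGroups envelope (AND logic)."""
    return {"dimensionFilterGroups": [{"groupType": "and", "filters": filters}]}

# Mobile traffic in the US only: both conditions must hold.
body = build_filter_group([
    {"dimension": "country", "operator": "equals", "expression": "usa"},
    {"dimension": "device",  "operator": "equals", "expression": "MOBILE"},
])
```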
You can also ask your AI assistant to combine multiple tools and analyze the results. For example:
-
"Find my top 20 landing pages by traffic, check their indexing status, and create a report highlighting any pages with both high traffic and indexing issues."
-
"Analyze my site's performance trend over the last 90 days, identify my fastest-growing queries, and check if the corresponding landing pages have any technical issues."
-
"Compare my desktop vs. mobile search performance, visualize the differences with charts, and recommend specific pages that need mobile optimization based on performance gaps."
-
"Identify queries where I'm ranking on page 2 (positions 11-20) that have high impressions but low CTR, then inspect the corresponding URLs and suggest title and meta description improvements."
Your AI assistant will use the GSC tools to fetch the data, present it in an easy-to-understand format, create visualizations when helpful, and provide actionable insights based on the results.
Data Visualization Capabilities
Your AI assistant can help you visualize your GSC data in various ways:
- Trend Charts: See how metrics change over time
- Comparison Graphs: Compare different time periods or dimensions
- Performance Distributions: Understand how your content performs across positions
- Correlation Analysis: Identify relationships between different metrics
- Heatmaps: Visualize complex datasets with color-coded representations
Simply ask your AI assistant to "visualize" or "create a chart" when analyzing your data, and it will generate appropriate visualizations to help you understand the information better.
Troubleshooting
Python Command Not Found
On macOS, the default Python command is often python3 rather than python, which can cause issues with some applications including Node.js integrations.
If you encounter errors related to Python not being found, you can create an alias:
-
Create a Python alias (one-time setup):
# For macOS users:
sudo ln -s $(which python3) /usr/local/bin/python

# If that doesn't work, try finding your Python installation:
sudo ln -s /Library/Frameworks/Python.framework/Versions/3.11/bin/python3 /usr/local/bin/python
-
Verify the alias works:
python --version
This creates a symbolic link so that when applications call python, they'll actually use your python3 installation.
AI Client Configuration Issues
If you're having trouble connecting:
- Make sure all file paths in your configuration are correct and use the full path
- Check that your service account has access to your GSC properties
- Restart your AI client after making any changes
- Look for error messages in the response when you try to use a tool
- Ensure your virtual environment is activated when running the server manually
Other Unexpected Issues
If you encounter any other unexpected issues during installation or usage:
- Copy the exact error message you're receiving
- Use ChatGPT or Claude and explain your problem in detail, including:
- What you were trying to do
- The exact error message
- Your operating system
- Any steps you've already tried
- AI assistants can often help diagnose and resolve technical issues by suggesting specific solutions for your situation
Remember that most issues have been encountered by others before, and there's usually a straightforward solution available.
Safety: Destructive Operations
By default, the tools that can permanently modify your GSC account (add_site, delete_site, delete_sitemap) are disabled. If you ask the AI to "clean things up" or "remove old properties", it will explain the safety restriction instead of deleting data.
To enable these tools, set the GSC_ALLOW_DESTRUCTIVE environment variable:
# In your MCP client config (Claude Desktop, Cursor, etc.)
GSC_ALLOW_DESTRUCTIVE=true
If you never use add/delete operations, you don't need to do anything — your existing setup works exactly as before.
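The guard behaves roughly like this sketch (illustrative only; DESTRUCTIVE_TOOLS and guard_destructive are hypothetical names, not the server's actual implementation):

```python
DESTRUCTIVE_TOOLS = {"add_site", "delete_site", "delete_sitemap"}

def guard_destructive(tool_name: str, env: dict) -> None:
    """Refuse destructive tools unless GSC_ALLOW_DESTRUCTIVE=true is set."""
    if tool_name in DESTRUCTIVE_TOOLS and env.get("GSC_ALLOW_DESTRUCTIVE", "").lower() != "true":
        raise PermissionError(
            f"{tool_name} is disabled by default; set GSC_ALLOW_DESTRUCTIVE=true to enable it."
        )
```

Read-only tools pass through untouched, which is why existing setups keep working without any config change.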
Remote Deployment & Docker (Advanced)
The standard setup above runs the server locally on your machine. This section is only for users who want to run it on a remote server, in a container, or share it with a team — existing local users don't need any of this.
HTTP Transport
By default the server communicates over stdio (standard input/output), which only works locally. To run it as a network server, set the MCP_TRANSPORT environment variable:
MCP_TRANSPORT=sse MCP_HOST=0.0.0.0 MCP_PORT=3001 python gsc_server.py
Your MCP client then connects to http://your-server:3001/sse instead of launching the process locally.
| Variable | Default | Description |
|---|---|---|
| MCP_TRANSPORT | stdio | Set to sse for network/remote use |
| MCP_HOST | 127.0.0.1 | Host to bind (use 0.0.0.0 for all interfaces) |
| MCP_PORT | 3001 | Port to bind |
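The defaults in the table resolve roughly as in this sketch (resolve_transport is a hypothetical helper for illustration, not the server's actual code):

```python
def resolve_transport(env: dict) -> tuple:
    """Apply the transport defaults from the table above (illustrative)."""
    return (
        env.get("MCP_TRANSPORT", "stdio"),   # stdio unless explicitly set to sse
        env.get("MCP_HOST", "127.0.0.1"),    # localhost-only unless overridden
        int(env.get("MCP_PORT", "3001")),
    )
```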
Docker
A Dockerfile is included in the repo. Build and run:
# Build the image
docker build -t mcp-gsc .
# Run locally (stdio mode — for testing)
docker run -v /path/to/client_secrets.json:/app/client_secrets.json mcp-gsc
# Run as a network server (SSE mode — for remote use)
docker run \
-e MCP_TRANSPORT=sse \
-e MCP_HOST=0.0.0.0 \
-e MCP_PORT=3001 \
-e GSC_CREDENTIALS_PATH=/app/credentials.json \
-v /path/to/credentials.json:/app/credentials.json \
-p 3001:3001 \
mcp-gsc
Cloud Platforms
The Docker image works on any container platform. Set MCP_TRANSPORT=sse, MCP_HOST=0.0.0.0, and inject credentials via environment variables or mounted secrets:
- Railway — connect your repo, set env vars in the dashboard
- Render — deploy as a Web Service, set env vars under Environment
- Fly.io — fly deploy, set secrets with fly secrets set
Related Tools
If you work with Google Search Console regularly, you may also find these tools useful:
Advanced GSC Visualizer — A Chrome extension (14,000+ users) that brings powerful charts, annotations, and one-click API access directly inside Google Search Console. Features include:
- Interactive charts with trendlines, moving averages, and Google algorithm update overlays
- One-click export of up to 25,000 rows from the GSC API — no coding required
- Keyword cannibalization detection
- Crawl stats visualizations
- AI assistant for querying your GSC data directly in the browser
Built by the same author. Install from the Chrome Web Store →
Contributing
Found a bug or have an idea for improvement? We welcome your input! Open an issue or submit a pull request on GitHub.
License
This project is licensed under the MIT License. See the LICENSE file for details.
Changelog
[0.3.0] — April 2026
- Cursor Marketplace plugin — added .cursor-plugin/plugin.json, mcp.json, and 4 bundled SEO skills (seo-weekly-report, cannibalization-check, indexing-audit, content-opportunities)
- Stable token storage — OAuth token now stored in the platform user config dir (~/Library/Application Support/mcp-gsc/ on macOS, ~/.config/mcp-gsc/ on Linux) instead of the package directory; survives uvx upgrades. Existing tokens are silently migrated on first run.
- Structured JSON output — all 13 data-returning tools now return structured JSON (json.dumps) instead of pipe-separated text, improving AI reasoning accuracy
- Dependency fix — added platformdirs>=4.0.0; removed deprecated oauth2client from requirements.txt
- MCP safety — fixed stdout pollution (print() → logging.warning()) that could corrupt the stdio MCP protocol; replaced a silent browser-flow hang in MCP context with a clear RuntimeError directing users to reauthenticate
- Test suite — 39 unit tests covering auth, all 13 data tools, safety guards, stdout cleanliness, and token migration (zero real credentials required)
[0.2.2] — April 2026
Added
- Safety mode for destructive tools: add_site, delete_site, and delete_sitemap are now disabled by default. Set GSC_ALLOW_DESTRUCTIVE=true to enable them. This prevents accidental deletion of GSC properties through vague AI instructions.
- HTTP/SSE transport: Set MCP_TRANSPORT=sse (plus MCP_HOST and MCP_PORT) to run the server as a network service instead of a local process. Enables Docker, cloud, and team deployments.
- Dockerfile: Official container image using the uv base image. Includes .dockerignore to prevent credential files from being baked into images.
- CLAUDE.md: Project context file for AI coding assistants — covers auth, env vars, and how to add new tools.
Fixed
- Sitemap warning status: get_sitemaps now correctly shows "Has warnings" when a sitemap has warnings but no errors. Previously, warnings were silently ignored in the status field. (Thanks @nloadholtes!)
Improved
- PyPI package: pyproject.toml now correctly declares gsc_server.py as the installable module. pip install mcp-gsc and uvx mcp-gsc now produce a working installation. (Thanks @jjeejj!)
[0.2.1] — March 2026
Added
- Reauthenticate tool: New reauthenticate tool lets you switch Google accounts by deleting the saved OAuth token and triggering a fresh browser login. Ask your AI assistant: "switch to a different Google account". (Thanks @fterenzani!)
Fixed
- Sitemap TypeError crash: get_sitemaps and list_sitemaps_enhanced crashed with TypeError when a sitemap had errors or warnings, because the GSC API returns those counts as strings. Added int() casts before comparison. (Thanks @mcprobert!)
- File cache warning: Suppressed the file_cache is only supported with oauth2client<4.0.0 warning that caused crashes on MCP hosts that treat any stderr output as fatal (e.g. GitHub Copilot CLI).
- Domain property 404 errors: All tools now return a clear, actionable message when a 404 occurs, explaining the exact format required and the service account permission requirements for sc-domain: properties.
Improved
- Multi-client support: README now explicitly lists Claude, Cursor, Codex, Gemini CLI, and Antigravity as supported clients, with setup guidance for each.
- site_url guidance: All 15 tool docstrings now explain how to get the exact property URL from list_properties and how domain properties relate to subdomain filtering.
[0.2.0] — March 2026
Added
- Data freshness: All search analytics queries now use dataState: "all" by default, returning data that matches the GSC dashboard instead of finalized-only data (which lags 2–3 days). Configurable via the GSC_DATA_STATE environment variable ("all" or "final").
- Flexible row limits: get_search_analytics and get_search_by_page_query now accept an optional row_limit parameter (default 20, max 500). Claude will automatically choose an appropriate value based on your request — use higher values for comprehensive analysis, lower values for quick overviews.
- Multi-dimension filtering: get_advanced_search_analytics now accepts a filters parameter — a JSON array of filter objects for AND logic across multiple dimensions simultaneously (e.g., country = USA and device = mobile). The existing single-filter parameters (filter_dimension, filter_operator, filter_expression) remain fully supported.
[0.1.0] — Initial release
- 19 tools covering property management, search analytics, URL inspection, and sitemap management
- OAuth and service account authentication
- Batch URL inspection (up to 10 URLs)
- Period comparison tool