---
title: "Build a Claude SEO Agent with Google Search Console MCP Integration"
slug: "google-search-console-claude-mcp-integration"
published: "2025-12-22"
updated: "2026-01-03"
categories:
  - "AI"
tags:
  - "Claude SEO agent"
  - "Google Search Console API integration"
  - "MCP server"
  - "SEO diagnostics"
  - "URL inspection"
  - "Claude MCP"
  - "GSC API"
  - "AI SEO tools"
  - "Search Console automation"
  - "Claude AI"
  - "Next.js OAuth"
  - "TypeScript"
  - "SEO automation"
llm-intent: "how-to"
audience-level: "intermediate"
llm-purpose: "Build a Claude SEO agent that reads live Google Search Console data via MCP: diagnostic tools, URL inspection API, real-time ranking analysis, and automated SEO workflows."
llm-prereqs:
  - "Google Search Console API"
  - "Claude"
  - "Model Context Protocol (MCP)"
  - "Next.js"
  - "TypeScript"
  - "Zod"
  - "Redis"
  - "OAuth"
  - "Node.js"
  - "@modelcontextprotocol/sdk"
---

**Summary Triples**
- (Build a Claude SEO Agent with Google Search Console MCP Integration, expresses-intent, how-to)
- (Build a Claude SEO Agent with Google Search Console MCP Integration, covers-topic, Claude SEO agent)
- (Build a Claude SEO Agent with Google Search Console MCP Integration, provides-guidance-for, Build a Claude SEO agent that reads live Google Search Console data via MCP: diagnostic tools, URL inspection API, real-time ranking analysis, and automated SEO workflows.)

### {GOAL}
Build a Claude SEO agent that reads live Google Search Console data via MCP: diagnostic tools, URL inspection API, real-time ranking analysis, and automated SEO workflows.

### {PREREQS}
- Google Search Console API
- Claude
- Model Context Protocol (MCP)
- Next.js
- TypeScript
- Zod
- Redis
- OAuth
- Node.js
- @modelcontextprotocol/sdk

### {STEPS}
1. Confirm prerequisites and OAuth setup
2. Build the GSC service layer
3. Implement URL inspection function
4. Define MCP server and tools
5. Wire analytics and date-range logic
6. Test workflows and verify outputs

<!-- llm:goal="Build a Claude SEO agent that reads live Google Search Console data via MCP: diagnostic tools, URL inspection API, real-time ranking analysis, and automated SEO workflows." -->
<!-- llm:prereq="Google Search Console API" -->
<!-- llm:prereq="Claude" -->
<!-- llm:prereq="Model Context Protocol (MCP)" -->
<!-- llm:prereq="Next.js" -->
<!-- llm:prereq="TypeScript" -->
<!-- llm:prereq="Zod" -->
<!-- llm:prereq="Redis" -->
<!-- llm:prereq="OAuth" -->
<!-- llm:prereq="Node.js" -->
<!-- llm:prereq="@modelcontextprotocol/sdk" -->

# Build a Claude SEO Agent with Google Search Console MCP Integration
> Build a Claude SEO agent that reads live Google Search Console data via MCP: diagnostic tools, URL inspection API, real-time ranking analysis, and automated SEO workflows.
Matija Žiberna · 2025-12-22

## Build Your Own Claude SEO Agent: Google Search Console Automation

Imagine asking Claude: *"Why isn't this page ranking?"* and getting an answer backed by live Google Search Console data instead of a guess. This guide shows you exactly how to build a Claude SEO agent that connects directly to your GSC account via MCP, giving your AI assistant the ability to run URL inspections, pull analytics, and diagnose ranking problems in real time.

Instead of exporting CSVs or toggling between tabs, you'll have an AI assistant that can answer questions like:
- *"Which of my pages have indexation errors?"*
- *"What's my average position for queries where I rank 5-10?"*
- *"Why isn't this URL showing up in Google?"*

The Model Context Protocol (MCP) is the bridge that makes this possible. 

> **Prerequisite:** This guide assumes you have already set up the OAuth infrastructure we built in [Persist Google OAuth Refresh Tokens](/blog/persist-google-oauth-refresh-tokens-nextjs-redis). We'll be importing the `getValidAccessToken` helper from that project to handle the token refresh handshake with Google.

## The Strategy: Beyond Simple Performance Metrics

Standard SEO tools give you dashboards, but dashboards require *you* to do the work of finding the pattern. When you give an LLM like Claude access to GSC via an MCP server, you're giving it the ability to perform **diagnostics**.

The core of this integration is a service layer that communicates with the Google Search Console API. While performance metrics are great, the real "killer feature" is the URL Inspection tool. It allows the AI to answer the most frustrating question in SEO: *"Why isn't this page ranking?"*

### 1. The Service Layer (`src/lib/google/search-console.ts`)

We need a service module to handle the API communication. Note how we import our existing auth helper to handle the tokens.

```typescript
// File: src/lib/google/search-console.ts
import { getValidAccessToken } from '@/lib/auth/google'; // From our previous article
import { z } from 'zod';

const BASE_URL = 'https://searchconsole.googleapis.com/v1';

// 1. The "Smart Resolver" - Handles the annoying sc-domain: prefix
export async function resolveSiteUrl(userUrl: string): Promise<string> {
  const token = await getValidAccessToken();

  // Fetch all verified sites
  const response = await fetch(`${BASE_URL}/sites`, {
    headers: { 'Authorization': `Bearer ${token}` }
  });
  if (!response.ok) throw new Error(`Failed to list sites (${response.status})`);
  const data = await response.json();

  // Normalize the input so "https://example.com/" matches both
  // "https://example.com/" and "sc-domain:example.com" properties
  const normalized = userUrl.replace(/^https?:\/\//, '').replace(/\/+$/, '');

  const site = data.siteEntry?.find((s: { siteUrl: string }) => {
    const property = s.siteUrl
      .replace('sc-domain:', '')
      .replace(/^https?:\/\//, '')
      .replace(/\/+$/, '');
    return property === normalized || property.includes(normalized) || normalized.includes(property);
  });

  if (!site) throw new Error(`Could not find a verified property for ${userUrl}`);
  return site.siteUrl;
}

// 2. The Inspector - Checks Index Status
export async function inspectUrl(siteUrl: string, inspectionUrl: string): Promise<any> {
    const accessToken = await getValidAccessToken();
    const resolvedSiteUrl = await resolveSiteUrl(siteUrl);

    const response = await fetch(`${BASE_URL}/urlInspection/index:inspect`, {
        method: 'POST',
        headers: {
            'Authorization': `Bearer ${accessToken}`,
            'Content-Type': 'application/json'
        },
        body: JSON.stringify({
            inspectionUrl: inspectionUrl,
            siteUrl: resolvedSiteUrl,
            languageCode: 'en-US'
        })
    });

    if (!response.ok) throw new Error(`URL Inspection failed (${response.status})`);
    return await response.json();
}

// 3. The Analytics Engine - Flexible Querying
export async function querySearchAnalytics(
  siteUrl: string, 
  startDate: string, 
  endDate: string, 
  dimensions: string[]
) {
    const accessToken = await getValidAccessToken();
    const resolvedSite = await resolveSiteUrl(siteUrl);
    
    const response = await fetch(`${BASE_URL}/sites/${encodeURIComponent(resolvedSite)}/searchAnalytics/query`, {
         method: 'POST',
         headers: {
             'Authorization': `Bearer ${accessToken}`,
             'Content-Type': 'application/json'
         },
         body: JSON.stringify({ startDate, endDate, dimensions })
    });

    if (!response.ok) throw new Error(`Search Analytics query failed (${response.status})`);
    return await response.json();
}

```

This code snippet allows the AI to peek behind the curtain. It returns detailed information about whether a page is indexed, the last time Google crawled it, and if it's considered mobile-friendly.
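To make that information digestible for Claude's answers, the raw inspection response can be condensed before it goes back through the tool. A minimal sketch of a summarizer; the field paths under `inspectionResult.indexStatusResult` follow the URL Inspection API response shape, but treat the exact names as an assumption to verify against a live response:

```typescript
// Sketch: condense a URL Inspection response into a one-line diagnosis.
// Field paths (inspectionResult.indexStatusResult.*) are assumptions based
// on the URL Inspection API response shape; verify against a live response.
interface IndexStatusResult {
  verdict?: string;        // e.g. 'PASS', 'NEUTRAL', 'FAIL'
  coverageState?: string;  // e.g. 'Submitted and indexed', 'Crawled - currently not indexed'
  lastCrawlTime?: string;  // timestamp of Google's last crawl
}

interface InspectionResponse {
  inspectionResult?: { indexStatusResult?: IndexStatusResult };
}

export function summarizeInspection(response: InspectionResponse): string {
  const status = response.inspectionResult?.indexStatusResult;
  if (!status) return 'No index status returned for this URL.';

  const crawled = status.lastCrawlTime
    ? `last crawled ${status.lastCrawlTime}`
    : 'never crawled';

  return status.verdict === 'PASS'
    ? `Indexed (${status.coverageState}), ${crawled}.`
    : `Not indexed: ${status.coverageState ?? 'unknown state'}, ${crawled}.`;
}
```

Returning a compact string like this through the tool's `content` array keeps Claude's context small while preserving the diagnosis.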

## Designing the Marketing Tools

When building MCP tools for marketing teams, you have to think about flexibility. Marketing performance isn't just a single number; it's a mix of dates, devices, and search terms.

We will implement three specific categories of tools: **Core Data**, **Deep Dive Analytics**, and **Diagnostics**.

### 2. The MCP Server Definition (`src/app/api/mcp/[transport]/route.ts`)

```typescript
import { McpServer } from '@modelcontextprotocol/sdk/server/mcp.js';
import { z } from 'zod';
import { getValidAccessToken } from '@/lib/auth/google'; // From our previous article
import { inspectUrl, querySearchAnalytics, resolveSiteUrl } from '@/lib/google/search-console';

const server = new McpServer({
  name: 'Google Search Console Agent',
  version: '1.0.0'
});

// Tool 1: Core Data - List Sites
server.tool(
    'get_search_console_sites',
    'List all verified GSC properties to see what we have access to.',
    {},
    async () => {
        const token = await getValidAccessToken();
        const res = await fetch('https://searchconsole.googleapis.com/v1/sites', {
            headers: { 'Authorization': `Bearer ${token}` }
        });
        const data = await res.json();
        const sites: string[] = data.siteEntry?.map((s: { siteUrl: string }) => s.siteUrl) ?? [];
        return { content: [{ type: 'text', text: sites.length ? sites.join('\n') : 'No verified sites found.' }] };
    }
);

// Tool 2: Deep Dive - Flexible Analytics
server.tool(
    'get_search_console_analytics',
    'Advanced tool for flexible Search Console analytics. Supports filtering by page, query, etc.',
    {
        siteUrl: z.string().describe("The website URL (e.g., buildwithmatija.com)"),
        days: z.number().optional().default(30),
        dimensions: z.array(z.enum(['date', 'query', 'page', 'country', 'device'])).optional(),
    },
    async ({ siteUrl, days, dimensions }) => {
        const endDate = new Date().toISOString().split('T')[0];
        const startDate = new Date(Date.now() - (days * 24 * 60 * 60 * 1000)).toISOString().split('T')[0];
        
        const results = await querySearchAnalytics(siteUrl, startDate, endDate, dimensions || ['date']);
        
        return { content: [{ type: 'text', text: JSON.stringify(results) }] };
    }
);

// Tool 3: Diagnostics - Index Inspector
server.tool(
    'inspect_url',
    'Checks if a specific URL is indexed by Google.',
    {
        siteUrl: z.string(),
        pageUrl: z.string()
    },
    async ({ siteUrl, pageUrl }) => {
        const result = await inspectUrl(siteUrl, pageUrl);
        return { content: [{ type: 'text', text: JSON.stringify(result) }] };
    }
);

// Export the server handler (Next.js specific)
// export const POST = ...

```
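For a sense of what crosses the wire when Claude calls one of these tools: the MCP transport delivers a JSON-RPC 2.0 `tools/call` request to your route. A minimal sketch of that envelope; the method and params shape follow the MCP specification, while the tool name and arguments are the examples from this article:

```typescript
// Sketch: the JSON-RPC 2.0 envelope an MCP client sends to invoke a tool.
// Method name and params shape follow the MCP "tools/call" request;
// the tool name and arguments below are illustrative.
interface ToolCallRequest {
  jsonrpc: '2.0';
  id: number;
  method: 'tools/call';
  params: { name: string; arguments: Record<string, unknown> };
}

export function buildToolCall(
  id: number,
  name: string,
  args: Record<string, unknown>
): ToolCallRequest {
  return { jsonrpc: '2.0', id, method: 'tools/call', params: { name, arguments: args } };
}

// Example: the request Claude would send to run a URL inspection.
const request = buildToolCall(1, 'inspect_url', {
  siteUrl: 'buildwithmatija.com',
  pageUrl: 'https://www.buildwithmatija.com/blog/payload-nextjs',
});
```

Seeing this shape makes it easier to smoke-test the route with a plain HTTP client before wiring up Claude.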

## The Business Case: 3 Powerful Workflows

Once deployed, this setup enables three distinct workflows that transform how you interact with SEO data.

### 1. Core Data Access

**The Goal:** A high-level health check.
**The Prompt:** *"Which sites do I have access to, and how is traffic looking for buildwithmatija.com?"*

The AI uses `get_search_console_sites` to list your verified properties (e.g., your 38 sites) and `get_search_console_summary` (a thin wrapper around the analytics tool that you can add in the same file) to pull high-level clicks, impressions, and CTR.

### 2. "Deep Dive" Analytics

**The Goal:** Finding specific opportunities or leaks.
**The Prompt:** *"Show me daily clicks for the URL 'https://www.buildwithmatija.com/blog/payload-nextjs' over the last 30 days."*

The AI leans on the flexibility of `get_search_console_analytics`: it calculates the date range automatically and sets `dimensions: ["date"]`.
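The service layer above queries whole-site data; scoping to a single page, as in this prompt, is done with `dimensionFilterGroups` in the Search Analytics request body. A minimal sketch of building such a body; the filter shape follows the Search Analytics API, while the dates and URL are illustrative:

```typescript
// Sketch: build a Search Analytics query body scoped to one page.
// dimensionFilterGroups is the Search Analytics API's filtering mechanism;
// the dates and page URL passed in are illustrative.
interface AnalyticsQueryBody {
  startDate: string;
  endDate: string;
  dimensions: string[];
  dimensionFilterGroups?: Array<{
    filters: Array<{ dimension: string; operator: string; expression: string }>;
  }>;
}

export function buildPageQuery(
  startDate: string,
  endDate: string,
  pageUrl: string
): AnalyticsQueryBody {
  return {
    startDate,
    endDate,
    dimensions: ['date'],
    dimensionFilterGroups: [
      { filters: [{ dimension: 'page', operator: 'equals', expression: pageUrl }] },
    ],
  };
}
```

Passing a body like this instead of the bare `{ startDate, endDate, dimensions }` object is the natural next extension of `querySearchAnalytics`.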

**Example Output:**
The AI can render this data into clean markdown tables for you:

> ⏺ **Here are the daily clicks for your Payload article:**
> **Summary:** This page is performing better! It has 1 total click with activity only on the last 2 days.
> | Date | Clicks | Impressions | CTR | Position |
> | --- | --- | --- | --- | --- |
> | 2025-12-18 | 0 | 23 | 0% | 6.65 |
> | 2025-12-19 | 1 | 28 | 3.57% | 7.0 |

### 3. Diagnostics & Robustness (The "Gap" Fixes)

**The Goal:** Solving the "Why isn't it ranking?" mystery.
**The Prompt:** *"My page X isn't ranking. Can you check if it's indexed?"*

This is where the **Smart Resolver** we built earlier shines. The AI:

1. Uses `resolveSiteUrl` behind the scenes to handle the confusing `sc-domain:` prefix.
2. Calls `inspect_url` to check the live Google Index.
3. Reports back if the page is excluded due to a "Soft 404" or "Crawled - currently not indexed" status.

## Complete Your SEO Agent Build

You now have a fully functional Claude SEO agent capable of:
- **Real-time diagnostics** — Check index status, crawl status, and mobile compatibility
- **Flexible analytics** — Query any dimension (date, query, page, device, country)
- **Intelligent insights** — Claude analyzes GSC data and provides actionable recommendations

**Related MCP & SEO Tools:**
- **[Build a Production MCP Server](/blog/build-mcp-server-nextjs)** — Foundational MCP setup
- **[OAuth for MCP Server](/blog/oauth-mcp-server-claude)** — Secure your MCP endpoints
- **[Persist Google OAuth Refresh Tokens](/blog/persist-google-oauth-refresh-tokens-nextjs-redis)** — The auth pattern this guide relies on
- **[Custom JSON-RPC MCP Implementation](/blog/custom-mcp-server-nextjs-json-rpc)** — Alternative approach without mcp-handler

**Next Steps:**
1. Deploy your MCP server to Vercel
2. Configure OAuth in Claude Web or Claude CLI
3. Start by asking Claude simple questions about your site's performance
4. Expand with additional tools (Search Analytics, Sitemap analysis, ranking tracking)

**Want to extend this?** Consider adding tools for:
- Fetching your XML sitemap and checking coverage
- Tracking keyword rankings over time
- Finding content opportunities (queries where you rank 11-20)
- Bulk URL inspection for site migrations
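
The content-opportunity idea above reduces to filtering analytics rows by average position. A minimal sketch, assuming rows in the shape the Search Analytics query endpoint returns (`keys`, `clicks`, `impressions`, `position`):

```typescript
// Sketch: find "striking distance" queries (average position 11-20),
// sorted by impressions so the biggest opportunities surface first.
// The row shape mirrors the Search Analytics query response.
interface AnalyticsRow {
  keys: string[];        // e.g. ['my target query'] when dimensions = ['query']
  clicks: number;
  impressions: number;
  position: number;      // average position over the queried period
}

export function findStrikingDistance(rows: AnalyticsRow[]): AnalyticsRow[] {
  return rows
    .filter((row) => row.position >= 11 && row.position <= 20)
    .sort((a, b) => b.impressions - a.impressions);
}
```

Exposed as another MCP tool over `querySearchAnalytics` with `dimensions: ['query']`, this would let Claude answer "what should I optimize next?" directly.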
