New Feature · AI Integration

Ayzeo MCP Server: Use Ayzeo Inside Claude, Cursor & Gemini

Ayzeo Team · April 20, 2026 · 9 min read

Ayzeo now speaks the Model Context Protocol. Your AI assistant can read your GSC opportunities scored by ACOS, cross-reference GA AI-referrer traffic, inspect citation history per model, and generate new content pages — all from inside the chat you already use.

[Image: Ayzeo MCP server connecting the Ayzeo bear logo to OpenAI, Anthropic, Google Gemini, and other AI clients]
Connect any MCP-capable AI client to your Ayzeo project and let your assistant drive the full opportunity-to-draft loop.

Table of Contents
1. Why MCP, and Why Now
2. What You Get in v1
3. The End-to-End ACOS Workflow
4. The 14 Tools at a Glance
5. Which AI Clients Actually Work
6. Connecting in Three Steps
7. Security, Scope & Rate Limits
8. What's Next
9. Frequently Asked Questions

1. Why MCP, and Why Now

For the last year, most AI assistant workflows have looked the same: you pull up a dashboard, copy numbers into chat, paste them back, and ask the assistant to reason about them. It works — but the friction is real. Every time you want to check something new, you context-switch again.

The Model Context Protocol (MCP) is the open standard that lets AI assistants call external tools directly. Instead of copy-pasting data, the assistant queries the data. Anthropic introduced the protocol, OpenAI and Google have since added support, and the ecosystem of compatible clients is growing fast: Claude Desktop, Claude Code, Cursor, Windsurf, Gemini CLI, the OpenAI Responses API.

With today's launch, Ayzeo joins that ecosystem. A single endpoint — https://ayzeo.com/mcp/v1 — exposes your full GEO data layer to any MCP-capable client. The same ACOS-scored opportunities, per-model prompt runs, GA referrer data, and content-page generation that live inside the Ayzeo dashboard are now callable from whichever AI assistant you prefer.

"The point is not to build a chatbot inside Ayzeo. The point is to let the chatbot you already use treat Ayzeo as a first-class data source."

2. What You Get in v1

Fourteen tools covering the full GEO workflow, accessible over the standard MCP Streamable HTTP transport, authenticated with the same ProjectApiToken that powers the WordPress API. Included on all Pro and Enterprise plans at no extra cost.

  • AI visibility tools — list all saved prompts with their latest runs per AI model, including the full answer text, brand sentiment, cited URLs, and mentioned competitor brands
  • GSC tools — top-N high-ACOS opportunities with the full 9-dimension score breakdown, plus raw query data for custom filtering
  • GA tools — GA4 traffic segmented by AI referrer source (ChatGPT, Claude, Perplexity, Gemini, Copilot, You.com, Phind, Kagi) with top landing pages
  • Content tools — list existing content pages and, when the assistant picks one, generate a new AI-optimised page for a high-opportunity query
  • Website analysis — fetch the latest SEO/AI-readiness audit or trigger a fresh one

3. The End-to-End ACOS Workflow

The clearest demonstration is a single question: "Based on our ACOS, what's the next best content page to create?" Here is what the assistant actually does:

  1. Calls get_gsc_opportunities with limit: 20, excludeBrand: true. Ayzeo returns the top-20 queries ranked by ACOS, each with the full per-dimension breakdown — striking distance, CTR gap, momentum, content gap, and the rest — plus the weights used, so the assistant can reproduce the composite math.
  2. Calls get_ga_ai_traffic to see which landing pages already receive AI-referrer sessions. An opportunity for a query whose topPage is already converting from ChatGPT is worth more than one where no AI referrer has ever landed.
  3. Cross-references hasDedicatedContentPage flags on each opportunity. If a query already has a dedicated page, the assistant recommends enhancement instead of creation.
  4. Picks the single best candidate and walks through the reasoning: "Query X scored 87 — position 8, zero clicks (CTR gap), rising impressions (momentum 75), landing on the homepage instead of a dedicated page (content gap 90). No existing content page."
  5. If you agree, it calls create_content_page_from_query. A few minutes later, the new page appears in your Ayzeo dashboard with SEO elements, answer capsules, and structured data suggestions ready to review.

This is the loop the dashboard alone cannot run. The dashboard gives you the data; MCP lets your assistant act on the data.
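The assistant's side of that loop is just sequential tool calls plus a ranking decision. A minimal Python sketch of steps 1–4 — the tool names are Ayzeo's, but the `call_tool` helper and the response field names (`acos`, `topPage`, `rows`, `landingPage`) are illustrative assumptions, not a real client API:

```python
# Hypothetical sketch of the assistant's reasoning for
# "what's the next best content page?". `call_tool` stands in for
# whatever MCP client machinery dispatches the request; field names
# are illustrative, not Ayzeo's actual response schema.

def pick_next_content_page(call_tool):
    # Step 1: top ACOS opportunities, brand queries excluded.
    opps = call_tool("get_gsc_opportunities", {"limit": 20, "excludeBrand": True})

    # Step 2: landing pages that already receive AI-referrer sessions.
    ai_traffic = call_tool("get_ga_ai_traffic", {})
    ai_pages = {row["landingPage"] for row in ai_traffic["rows"]}

    # Step 3: skip queries that already have a dedicated page; prefer
    # queries whose top page already sees AI-referrer traffic, then
    # rank by composite ACOS.
    candidates = [o for o in opps if not o["hasDedicatedContentPage"]]
    candidates.sort(
        key=lambda o: (o["topPage"] in ai_pages, o["acos"]), reverse=True
    )

    # Step 4: hand the best candidate to the human for approval before
    # the assistant calls create_content_page_from_query (step 5).
    return candidates[0] if candidates else None
```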


4. The 14 Tools at a Glance

Project & analysis

  • get_project_info — project metadata
  • get_analysis_results — latest advanced website analysis
  • run_website_analysis — trigger a fresh site audit
  • get_project_stats — aggregated counts by AI model

AI visibility

  • list_prompts — saved prompts with latest per-model runs
  • run_prompt — re-run a saved prompt for before/after comparison
  • run_ai_visibility_check — ad-hoc AI search for any query
  • get_citation_history — recent citation results
  • get_brand_mentions — aggregated brand mentions with sentiment

GSC & GA data

  • get_gsc_opportunities — top-N ACOS-scored queries with dimension breakdown
  • get_gsc_queries — paginated raw GSC data
  • get_ga_ai_traffic — GA4 traffic by AI referrer source

Content

  • get_content_pages — already-generated pages
  • create_content_page_from_query — generate a new page for a query

Your assistant discovers all 14 automatically via MCP's tools/list handshake.
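The handshake itself is plain JSON-RPC over HTTP. Here is a sketch of the tools/list request a client POSTs to the endpoint — the URL and bearer header are the real ones from this post; the code around them is an illustrative stand-in for what your MCP client does for you after the initial `initialize` exchange:

```python
import json

AYZEO_MCP_URL = "https://ayzeo.com/mcp/v1"

def tools_list_request(token: str) -> tuple[dict, bytes]:
    """Build the headers and JSON-RPC body for an MCP tools/list call."""
    headers = {
        "Content-Type": "application/json",
        # Streamable HTTP clients accept both plain JSON and SSE replies.
        "Accept": "application/json, text/event-stream",
        "Authorization": f"Bearer {token}",
    }
    payload = {"jsonrpc": "2.0", "id": 1, "method": "tools/list"}
    return headers, json.dumps(payload).encode()
```

POST that body to the endpoint and the response enumerates all 14 tools with their input schemas.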


5. Which AI Clients Actually Work

The MCP ecosystem is uneven right now, so consider this a practical honesty check. Every client below has been verified end-to-end against the live Ayzeo MCP endpoint:

  • Claude Code CLI — fastest path for engineers. One claude mcp add --transport http command with an Authorization: Bearer header and you’re connected.
  • Cursor — native remote HTTP MCP support via ~/.cursor/mcp.json (url + headers). Settings → Tools & MCP shows a green dot when the server connects.
  • Windsurf — same JSON shape as Cursor; config lives at ~/.codeium/windsurf/mcp_config.json.
  • Claude Desktop — Claude Desktop’s Custom Connector UI (Settings → Connectors, Pro/Max/Team plans) only supports OAuth client id/secret auth, not bearer-header auth. Since Ayzeo uses bearer tokens, the supported path is the mcp-remote stdio bridge — which works on every plan. Edit claude_desktop_config.json, add an entry with command: "npx" + args: ["-y", "mcp-remote", <url>, "--header", "Authorization: Bearer <token>"], restart Claude Desktop fully.
  • Gemini CLI — Google’s command-line client has native remote MCP. Use gemini mcp add --transport http or edit ~/.gemini/settings.json with httpUrl + headers.
  • OpenAI Responses API — for programmatic agents, pass Ayzeo as a hosted MCP tool in the /v1/responses call with type: "mcp", server_url, authorization: "Bearer <token>". The OpenAI Agents SDK wraps the same pattern.
  • ChatGPT.com (end-user app) — currently ships ready-made Connectors for specific partners and restricts custom remote MCP to a “Developer mode” beta. For a third-party tool server like Ayzeo, the reliable path today is the Responses API above. We’ll publish a native setup once OpenAI opens general-purpose MCP connectors to end users.
  • Gemini.com — no MCP client on the web yet. The CLI is the current Google path.
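For the Responses API route, the hosted-MCP tool entry looks roughly like this. A sketch of the request body only, using the fields named above (type, server_url, authorization); the `server_label` and model name are illustrative assumptions, so verify against OpenAI's current Responses API reference before relying on it:

```python
# Sketch of a /v1/responses request body that attaches Ayzeo as a
# hosted MCP tool. Field names follow the shape described above;
# treat `server_label` and the model as illustrative placeholders.
def ayzeo_responses_body(token: str, question: str) -> dict:
    return {
        "model": "gpt-4.1",
        "input": question,
        "tools": [
            {
                "type": "mcp",
                "server_label": "ayzeo",
                "server_url": "https://ayzeo.com/mcp/v1",
                "authorization": f"Bearer {token}",
            }
        ],
    }
```

The OpenAI Agents SDK wraps this same tool definition, so the dict above is the shape to check either way.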

6. Connecting in Three Steps

The flow is identical no matter which client you pick:

  1. Open Project Settings → API Tokens in Ayzeo. Create a token, name it after the client you'll paste it into (claude-desktop, cursor-laptop). Copy it immediately — you won't see it again.
  2. Scroll to the Connect to AI (MCP) card on the same page. You'll see the endpoint URL and a ready-to-paste Claude Desktop config snippet. The full tool list is in a collapsible section.
  3. Paste the URL and token into your client's MCP config (exact snippets for each client are in the help center guide). Restart the client. Ask your first question.
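For Claude Desktop specifically, the mcp-remote bridge entry from step 2 looks like this in claude_desktop_config.json (the server name ayzeo is arbitrary; replace <token> with the token you created in step 1):

```json
{
  "mcpServers": {
    "ayzeo": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-remote",
        "https://ayzeo.com/mcp/v1",
        "--header",
        "Authorization: Bearer <token>"
      ]
    }
  }
}
```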

A good first question: "Using Ayzeo, show me my top 5 ACOS opportunities and explain why the number-one query scored so high." If the assistant comes back with a ranked list and a per-dimension justification, the whole stack is working.


7. Security, Scope & Rate Limits

A few practical details that matter once you start using MCP at volume:

  • Per-project scope. Each token is bound to a single Ayzeo project. A compromised token exposes one project's data, never account-wide.
  • Same rate limit as the WordPress API. 100 requests per 5 minutes per token. Aggressive agent loops can burn this fast — if you're chaining dozens of tool calls, split across multiple tokens or throttle the agent.
  • Pro-plan gate enforced per request. Downgrade a subscription and MCP stops working on the next call — no cached access.
  • Stateless server. Each request opens a fresh MCP session and closes it when the response is sent. SSE streaming still works within a single request; we just don’t persist any cross-request context between tool calls. Good for security, simpler auth model.
  • Content generation counts against the daily quota. create_content_page_from_query uses your project's 5-per-day content-page budget, the same cap as the web UI and WordPress API. The assistant can't accidentally burn through unlimited AI credits.
  • Structured errors for missing integrations. Call a GSC-dependent tool before connecting Search Console and you get back { error: "gsc_not_connected", setupUrl: "..." }. The assistant reads this and tells the user how to fix it.
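If an agent loop does run hot against the 100-requests-per-5-minutes limit, a small client-side throttle is enough to stay under it. A generic sliding-window sketch, sized for Ayzeo's limit (the numbers are from above; everything else is ordinary rate-limiting code, not an Ayzeo API):

```python
import time
from collections import deque

class WindowThrottle:
    """Block until the next call fits inside a sliding rate-limit window.

    Defaults match Ayzeo's 100 requests per 5 minutes per token.
    """

    def __init__(self, max_calls: int = 100, window_s: float = 300.0):
        self.max_calls = max_calls
        self.window_s = window_s
        self.calls: deque = deque()  # monotonic timestamps of recent calls

    def wait(self) -> None:
        now = time.monotonic()
        # Drop timestamps that have aged out of the window.
        while self.calls and now - self.calls[0] >= self.window_s:
            self.calls.popleft()
        if len(self.calls) >= self.max_calls:
            # Sleep until the oldest call leaves the window.
            time.sleep(self.window_s - (now - self.calls[0]))
        self.calls.append(time.monotonic())
```

Call `throttle.wait()` before each MCP tool call and the agent self-paces instead of tripping the limiter.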

8. What's Next

A few items already queued for follow-up releases based on early dogfooding:

  • Sampling and elicitation — MCP lets the server ask the client to run an LLM completion mid-tool-call (sampling) or prompt the user for extra input (elicitation). This would make ACOS reasoning richer without round-tripping large text payloads.
  • Organisation-wide tokens — v1 is per-project. You can work around this today by registering the same MCP server multiple times in your AI client with distinct names (ayzeo-brand-a, ayzeo-brand-b…) — one entry per project token. A single organisation-scoped token for agencies with dozens of client projects is still on the roadmap.
  • Write-back for GSC/GA — v1 is read-only for those sources. There's no plan to change that for GSC, but GA annotation on content-page publish would be useful.
  • Native ChatGPT support — once OpenAI's ChatGPT.com supports general-purpose MCP servers in Connectors, we'll validate and publish setup instructions.

9. Frequently Asked Questions

Q: What is MCP and why does Ayzeo support it?
A: MCP (Model Context Protocol) is the open standard Anthropic, OpenAI, and Google are converging on for connecting AI assistants to external tools and data sources. Ayzeo's MCP server puts your full GEO data layer — AI visibility scores, GSC opportunities, GA AI-referrer traffic, brand mentions, content pages — directly into any MCP-capable AI client. Instead of context-switching to the Ayzeo dashboard, you ask your assistant questions in natural language and it calls the right tools to answer.
Q: Which AI clients can I use?
A: Anything that speaks remote MCP over Streamable HTTP with a custom Authorization header. Verified: Claude Code CLI, Cursor, Windsurf, Claude Desktop (via the mcp-remote stdio bridge — its native Custom Connector UI only supports OAuth client id/secret, not bearer tokens, so mcp-remote is the Ayzeo path), Gemini CLI, and the OpenAI Responses API / Agents SDK for programmatic agents. ChatGPT.com end-user app currently restricts custom MCP servers to a Developer-mode beta, so the reliable ChatGPT path today is the Responses API.
Q: How does authentication work?
A: Create a project API token in Project Settings → API Tokens, then paste the URL and token into your AI client's MCP config. Each token is scoped to a single Ayzeo project and can be revoked at any time. Rate limits are shared with the WordPress API: 100 requests per 5 minutes per token.
Q: What does the AI agent actually see?
A: Fourteen tools in v1: project info, latest analysis, citation history, per-model prompt runs with full answer text and sentiment, brand mentions with sentiment aggregation, GSC opportunities scored by ACOS with the full 9-dimension breakdown, raw GSC queries, GA4 AI-referrer traffic with top landing pages, and one-tool content-page generation. The assistant discovers these via the standard MCP tools/list handshake.
Q: Can the assistant generate content pages?
A: Yes. The create_content_page_from_query tool triggers Ayzeo's content generation pipeline for a high-ACOS query and returns the new page id plus generated SEO elements. It counts against the same 5-per-day content-page quota that the web UI and WordPress API use. In practice, an assistant will chain get_gsc_opportunities → get_ga_ai_traffic → create_content_page_from_query to close the full gap-to-draft loop inside the chat.
Q: What happens if GSC or GA is not connected?
A: GSC/GA-dependent tools return a structured error with codes like gsc_not_connected, gsc_sync_pending, ga_not_connected, or ga_sync_pending — each includes a setupUrl pointing at the Ayzeo settings page that finishes the setup. Your AI assistant reads this and tells you exactly how to fix it instead of failing silently.
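Agent code running outside a chat client can handle these errors the same way the assistant does. A sketch — the error codes and setupUrl field are the ones described above, while the function and its messages are illustrative:

```python
# Turn an Ayzeo structured integration error into a user-facing hint.
# Error codes and setupUrl follow the shapes described above; the
# hint wording is illustrative.
_SETUP_HINTS = {
    "gsc_not_connected": "Search Console is not connected yet",
    "gsc_sync_pending": "Search Console data is still syncing",
    "ga_not_connected": "Google Analytics is not connected yet",
    "ga_sync_pending": "Google Analytics data is still syncing",
}

def explain_tool_error(result: dict):
    """Return a human-readable fix-it hint, or None if there is no error."""
    code = result.get("error")
    if code is None:
        return None
    hint = _SETUP_HINTS.get(code, f"Ayzeo returned error '{code}'")
    url = result.get("setupUrl")
    return f"{hint}. Finish setup at {url}" if url else hint
```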
Q: Which plans include the MCP server?
A: All Pro and Enterprise plans. Free and Starter plans do not have MCP access. The same ProjectApiToken powers both the WordPress API and MCP, so existing Pro customers can try it without any new setup other than generating a token.
Q: Can I use the MCP server across multiple Ayzeo projects in the same AI client?
A: Yes. Each token is scoped to a single project, but every MCP client (Claude Code, Cursor, Claude Desktop, Gemini CLI, …) lets you register multiple MCP servers in its config. Create one token per project, add each as its own entry with a distinct name — ayzeo-brand-a, ayzeo-brand-b, ayzeo-agency-client-3, and so on — and your assistant can reach all of them side by side. Every tool call is clearly scoped to one named server, and you can revoke any single project’s token without touching the others.
Q: Is my data safe?
A: Tokens are per-project so a compromised token exposes only that one project — never account-wide data. The server is stateless (no session caching), all traffic runs over HTTPS, and the existing apiLimiter prevents runaway usage. No credentials from the AI client ever reach Ayzeo beyond the bearer token.
Q: How do I test that it's working?
A: The fastest sanity check is the MCP Inspector (npx @modelcontextprotocol/inspector) pointed at your endpoint with the bearer header. It lets you list and call every tool visually without needing an AI client. The end-to-end test is to ask your assistant "Based on our ACOS, what is the next best content page?" — if it returns a ranked list with per-dimension score breakdowns, the whole stack is working.

Try It With Your Own Project

The MCP server is live on all Pro and Enterprise plans as of today. If you already have a Pro subscription, create a project API token and you can be connected to Claude Desktop or Cursor in under two minutes. If you're on Starter or Free, this is one of the clearest reasons yet to upgrade — the ACOS-driven content loop simply does not exist anywhere else.

Bring Ayzeo Into Your AI Assistant

All 14 tools included on Pro and Enterprise plans. Two-minute setup for Claude Desktop, Cursor, Gemini CLI, and friends.

Key Takeaways

  • MCP is the open standard for connecting AI assistants to tools — Anthropic, OpenAI, and Google all support it
  • 14 tools cover the full GEO workflow from data query to content-page creation
  • ACOS lives in the chat now — ask "what's the next best content page?" and get a reasoned answer inline
  • Pro/Enterprise only, uses the existing ProjectApiToken, rate-limited the same as the WordPress API
  • Best clients today: Claude Code, Cursor, Claude Desktop (via the mcp-remote bridge), Gemini CLI
