Friday, December 12, 2025
Agent Berlin MCP Server is Live


We've launched an MCP (Model Context Protocol) server for Agent Berlin, the first MCP server from any AEO/SEO platform.
Now you can access your SEO and AEO analytics directly from AI assistants like ChatGPT, Claude, and Cursor.
What Makes This Different
- The first MCP server with semantic search across your pages, competitor pages, and keywords
- Your GSC, GA4, and LLM traffic data
- All in one place, accessible via natural language
What Can You Do?
- get_analytics — Traffic data, domain authority, LLM traffic breakdown (ChatGPT, Claude, Perplexity), visibility trends
- get_brand_profile — Retrieve your brand context, competitors, industries, personas
- update_brand_profile — Update brand settings directly from your AI assistant
- search_pages — Semantic search across your pages or competitor pages
- search_keywords — Find keywords by semantic query
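If you'd rather script these tools than go through a chat client, the official MCP Python SDK can talk to the same endpoint. The sketch below is a minimal, unofficial example: it assumes you already have an OAuth access token, and the argument names (domain, query) are illustrative guesses rather than the documented tool schema.
```python
# Minimal sketch using the MCP Python SDK ("mcp" on PyPI).
# Assumptions: ACCESS_TOKEN is a bearer token you already obtained via OAuth;
# the argument names ("domain", "query") are illustrative, not a documented schema.
import asyncio

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client

SERVER_URL = "https://backend.agentberlin.ai/mcp"
ACCESS_TOKEN = "..."  # obtained via the OAuth 2.0 + PKCE flow

async def main() -> None:
    headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}
    async with streamablehttp_client(SERVER_URL, headers=headers) as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Discover what the server exposes (should include the tools listed above).
            tools = await session.list_tools()
            print([t.name for t in tools.tools])

            # Call get_analytics; the "domain" argument name is an assumption.
            result = await session.call_tool("get_analytics", {"domain": "example.com"})
            print(result.content)

            # Semantic search across pages; "query" is likewise an assumption.
            pages = await session.call_tool("search_pages", {"query": "pricing comparison"})
            print(pages.content)

asyncio.run(main())
```
In practice, the MCP-aware clients listed below handle the connection and authentication for you; a script like this is only useful if you want to pull the same data into your own tooling.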
Connect in 2 Minutes
Choose your platform:
- ChatGPT
- Claude Desktop
- Cursor
- Claude Code
- Or any MCP client
ChatGPT
Steps
- Enable Developer Mode: Go to Settings → Apps & Connectors → Advanced settings → Enable Developer Mode
- Create a Connector: Go to Settings → Connectors → Create (use https://backend.agentberlin.ai/mcp as the MCP URL)
- Use in Chat: Start a new conversation → Click + → More → Select Agent Berlin
- Try it:
Show me my traffic analytics for example.com
Other MCP Clients
Any MCP-compatible client (Claude Desktop, Cursor, Claude Code, etc.) can connect using the same URL:
https://backend.agentberlin.ai/mcp
Authentication uses OAuth 2.0 with PKCE; on first connection you'll be redirected to sign in with your Agent Berlin account.
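Your MCP client drives that OAuth flow for you, so there is nothing to implement. For the curious, the PKCE part comes down to the client generating a one-time code verifier and sending only its SHA-256 hash (the code challenge) with the authorization request. The sketch below shows that step in isolation; it's generic RFC 7636 logic and doesn't reference any Agent Berlin-specific endpoints.
```python
# Illustration of the PKCE piece of OAuth 2.0 (RFC 7636, S256 method).
# Generic example only; your MCP client generates these values automatically.
import base64
import hashlib
import secrets

def b64url(data: bytes) -> str:
    # Base64url without padding, as the PKCE spec requires.
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

# 1. The client invents a random secret (the code verifier) and keeps it private.
code_verifier = b64url(secrets.token_bytes(32))

# 2. Only the hash (the code challenge) is sent with the authorization request.
code_challenge = b64url(hashlib.sha256(code_verifier.encode()).digest())

print(f"code_challenge={code_challenge}&code_challenge_method=S256")

# 3. After you sign in and are redirected back with an authorization code,
#    the client proves it started the flow by sending the original
#    code_verifier when exchanging that code for an access token.
```
The upshot: even if the authorization code were intercepted mid-redirect, it can't be redeemed without the verifier, which never leaves your client.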
Questions?
- Check your dashboard for your project setup and data
- Reach out at support@agentberlin.ai
