
MCP Server Documentation


Model Context Protocol Server for AI Agents

Transport: Streamable HTTP (remote) or stdio (local) · Runtime: Python 3.12+

105 Tools · 141 Tests · 19 Modules · 4 Phases

What is MCP?

The Model Context Protocol enables AI agents (Claude, etc.) to interact with IATO autonomously — crawling sites, building sitemaps, auditing SEO, and managing content without human intervention.

The MCP server translates 105 tool calls into IATO REST API requests, handling auth, retries, and error recovery automatically.

Quickstart

⚡ Connect in 30 Seconds (Recommended)

No installation or config files needed. Connect directly from Claude Desktop.

1. Open Claude Desktop Settings

Go to Settings → Connectors (or Settings → MCP Servers).

2. Add Custom Connector

Click "Add custom connector" and enter:

Name: IATO
URL: https://iato.ai/mcp

3. Authorize

Click "Add" → a browser window will open → log in with your IATO account → click "Authorize". That's it! All 105 MCP tools will appear in Claude Desktop.

Don't have an account? Sign up at iato.ai — free trial includes full MCP access.

Alternative: Connect with API Key (config file)

If you prefer to use an API key directly, add to ~/Library/Application Support/Claude/claude_desktop_config.json (macOS path):

{
  "mcpServers": {
    "iato": {
      "type": "streamable-http",
      "url": "https://iato.ai/mcp",
      "headers": {
        "Authorization": "Bearer iato_your_api_key"
      }
    }
  }
}

Generate an API key at Developers → API Keys in your IATO dashboard.

Advanced: Local Connection (self-hosted / development)

For local development, install and run the MCP server via stdio transport:

# Install
python3 -m venv mcp_venv
source mcp_venv/bin/activate
pip install -r requirements-mcp.txt

Add to claude_desktop_config.json:

{
  "mcpServers": {
    "iato": {
      "command": "/path/to/crawler/mcp_venv/bin/python",
      "args": ["-m", "mcp_server.server"],
      "cwd": "/path/to/crawler",
      "env": {
        "IATO_BASE_URL": "https://iato.ai",
        "IATO_API_KEY": "your-api-key"
      }
    }
  }
}

Tip: Use the full path to the venv's Python binary — not just python.

Architecture

┌─────────────┐  Streamable HTTP   ┌──────────────────┐              ┌─────────────┐
│  AI Agent   │ ◄────────────────► │ MCP Server (105) │ ◄──────────► │  IATO API   │
│  (Claude)   │  iato.ai/mcp       │  Python / httpx  │   REST/JSON  │  FastAPI    │
└─────────────┘  (or stdio local)  └──────────────────┘              └──────┬──────┘
                                                                           │
                                                                    ┌──────▼──────┐
                                                                    │    MySQL    │
                                                                    └─────────────┘

Response Format

Every tool returns a consistent JSON envelope:

{
  "success": true,
  "data": { ... },
  "summary": "Created workspace 'My Site' (ID: 42)",
  "next_actions": [
    { "tool": "start_crawl", "args": {...}, "description": "Begin crawling" }
  ]
}

Error responses include recovery_actions with specific tool calls — agents never hit dead ends.
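An agent can consume this envelope mechanically. A minimal sketch (the helper name and orchestration logic are illustrative, not part of the server):

```python
def next_step(envelope: dict):
    """Pick the next tool call from an IATO response envelope.

    On success, follow next_actions; on failure, fall back to
    recovery_actions (assumed to share the same action shape).
    """
    key = "next_actions" if envelope.get("success") else "recovery_actions"
    actions = envelope.get(key, [])
    return actions[0] if actions else None

resp = {
    "success": True,
    "data": {"id": 42},
    "summary": "Created workspace 'My Site' (ID: 42)",
    "next_actions": [
        {"tool": "start_crawl", "args": {"workspace_id": 42},
         "description": "Begin crawling"}
    ],
}
print(next_step(resp)["tool"])  # start_crawl
```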

Tool Reference

All 105 tools, organized by module. Method badges: GET POST PUT DEL CLIENT (CLIENT marks tools handled client-side rather than by a single REST endpoint).

Workspaces (5 tools)

Tool               Method   Endpoint               Description
list_workspaces    GET      /api/workspaces        List all workspaces with optional stats
get_workspace      GET      /api/workspaces/{id}   Get details with crawls and sitemaps
create_workspace   POST     /api/workspaces        Create a new workspace
update_workspace   PUT      /api/workspaces/{id}   Update name or description
delete_workspace   DEL      /api/workspaces/{id}   Delete (optionally with contents)

Crawls (5 tools)

Tool               Method   Endpoint                      Description
start_crawl        POST     /api/crawl/start              Start crawl with config
get_crawl_status   GET      /api/crawl/jobs/{id}          Check progress, pages found, ETA
list_crawls        GET      /api/crawl/jobs               List crawls filtered by workspace or status
stop_crawl         POST     /api/crawl/jobs/{id}/cancel   Stop a running crawl
delete_crawl       DEL      /api/crawl/jobs/{id}          Delete crawl and associated data

Sitemaps (3 tools)

Tool             Method   Endpoint             Description
list_sitemaps    GET      /api/sitemaps        List all visual sitemaps
get_sitemap      GET      /api/sitemaps/{id}   Get sitemap with full node tree
create_sitemap   POST     /api/sitemaps        Create from completed crawl

Sitemap Nodes (7 tools)

Tool                     Method   Description
create_sitemap_node      POST     Add a page or section node
update_sitemap_node      PUT      Update title, status, type, notes
delete_sitemap_node      DEL      Remove a node
move_sitemap_node        PUT      Move to new parent or position
duplicate_sitemap_node   POST     Copy with optional children
bulk_update_nodes        PUT      Update status/type on multiple nodes
bulk_delete_nodes        DEL      Delete multiple nodes at once

Content (3) · Changes (2) · Export (2)

Tool               Method   Description
get_page_content   GET      Full page data — HTML, metadata, headers
search_pages       GET      Full-text search across crawled pages
get_screenshot     GET      Page screenshot (requires JS rendering)

Changes
detect_changes     POST     Compare sitemap to latest crawl
apply_changes      POST     Accept changes, add new pages as nodes

Export
export_sitemap     GET      Export as JSON, CSV, standalone HTML, or redirect map (CSV/JSON).
                            Formats: json, csv, html, redirects, redirects_json.
                            Use redirect_code param (301/302) for redirect formats.
export_crawl       GET      Export as JSON, CSV, or XML sitemap

Taxonomy (10 tools)

Tool              Method   Description
get_taxonomy      GET      Get all categories and tags
create_category   POST     Create a category
update_category   PUT      Rename a category
delete_category   DEL      Delete, clearing it from nodes
assign_category   PUT      Assign category to nodes (bulk)
create_tag        POST     Create a colored tag
update_tag        PUT      Update label or color
delete_tag        DEL      Delete from all nodes
assign_tags       PUT      Assign tags to nodes (bulk)
bulk_auto_tag     CLIENT   Auto-tag by URL patterns with dry run

Navigation (4) · Schedules (7) · Recrawl (2)

Tool                   Method   Description
get_menus              GET      List detected navigation menus
get_menu_items         GET      Get items in a specific menu
assign_to_menu         PUT      Add page to navigation menu
find_orphan_pages      GET      Find pages not in any menu

Schedules
list_schedules         GET      List all scheduled crawls
create_schedule        POST     Create daily/weekly/monthly schedule
get_schedule           GET      Details and next run time
delete_schedule        DEL      Delete a schedule
toggle_schedule        POST     Enable or disable
run_schedule_now       POST     Trigger immediate run
get_schedule_history   GET      View past runs with results

Recrawl
start_recrawl          POST     Re-crawl for version comparison
compare_crawls         POST     Diff two crawls — added/removed/changed

Analytics (5) · SEO Audit (5) · AI Suggestions (4)

Tool                       Method   Description
get_workspace_stats        GET      Aggregate stats — crawls, sitemaps, pages
get_crawl_analytics        GET      Pages by status, performance, structure
get_content_metrics        GET      Word counts, images, content metrics
get_trends                 GET      Multi-crawl trend data
get_low_performing_pages   GET      Slowest, largest, thinnest pages

SEO Audit
run_seo_audit              GET      Run full SEO audit with scores
get_seo_issues             GET      Issues by severity and category
get_seo_score              GET      Score breakdown by category
get_audit_history          GET      Per-page audit results
fix_seo_issue              POST     Apply fix for title/meta/alt issues

AI Suggestions
generate_suggestions       CLIENT   Analyze issues, generate prioritized suggestions
prioritize_suggestions     CLIENT   Re-rank by impact, effort, or quick wins
apply_suggestion           POST     Apply auto-fixable suggestions
dismiss_suggestion         CLIENT   Dismiss with optional reason

Webhooks (7 tools)

Events: crawl.started, crawl.completed, crawl.failed, changes.detected, and more.

Tool                  Method   Description
list_webhooks         GET      List all webhooks
create_webhook        POST     Create with event subscriptions
get_webhook           GET      Details and delivery stats
update_webhook        PUT      Update URL, events, or status
delete_webhook        DEL      Delete a webhook
test_webhook          POST     Send test payload
get_webhook_history   GET      Recent delivery attempts
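A receiving endpoint only needs to parse the JSON body and acknowledge quickly. A minimal stdlib sketch; note that the event and data field names here are assumptions about the payload shape, not a documented schema:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def route_event(payload: dict) -> str:
    """Decide what to do with an incoming webhook payload.
    The 'event' and 'data' keys are assumed, not documented."""
    event = payload.get("event", "")
    if event == "crawl.completed":
        return f"crawl {payload.get('data', {}).get('crawl_id')} finished"
    if event == "changes.detected":
        return "site changed - consider apply_changes"
    return f"ignored: {event}"

class WebhookHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length) or b"{}")
        print(route_event(payload))
        self.send_response(204)  # acknowledge fast; do heavy work elsewhere
        self.end_headers()

# To run locally: HTTPServer(("", 8080), WebhookHandler).serve_forever()
```

Returning quickly (here with 204) matters because delivery stats and retries are tracked per attempt; slow handlers can show up as failed deliveries in get_webhook_history.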

Agent Protocol (5 tools)

Tool                     Method   Description
discover_agents          CLIENT   Find other IATO instances
get_agent_capabilities   CLIENT   View tools and rate limits
delegate_task            POST     Send tool call to another agent
get_delegated_result     CLIENT   Retrieve async task results
share_resource           CLIENT   Share crawl/sitemap/report

Extraction Rules (6 tools)

Create and manage CSS, XPath, or regex extraction rules to pull structured data from crawled pages. Rules can be tested against live URLs before saving, then attached to crawls or schedules via extraction_rule_ids.

Tool                     Method   Description
list_extraction_rules    GET      List all extraction rules
create_extraction_rule   POST     Create rule (css/xpath/regex; target: text/html/attribute/count)
get_extraction_rule      GET      Get rule details by ID
update_extraction_rule   PUT      Update rule selector, target, or options
delete_extraction_rule   DEL      Delete a rule
test_extraction_rule     POST     Test rule against a live URL without saving

Workflow Examples

Crawl & Map a Website

create_workspace(name="Client Site")
start_crawl(url="https://example.com", max_pages=500)
get_crawl_status(crawl_id="abc123")   # poll until complete
create_sitemap(name="Site Map", job_id="abc123")
export_sitemap(sitemap_id=1, format="json")
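The "poll until complete" step above can be made concrete. A sketch of a polling helper, where get_status stands in for the get_crawl_status tool call and the terminal status values ("completed", "failed") are assumptions:

```python
import time

def wait_for_crawl(get_status, crawl_id, interval=5.0, timeout=600.0):
    """Poll until the crawl reaches a terminal state or the deadline passes."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        job = get_status(crawl_id)
        if job["status"] in ("completed", "failed"):
            return job
        time.sleep(interval)  # avoid hammering the API between checks
    raise TimeoutError(f"crawl {crawl_id} did not finish within {timeout}s")
```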

SEO Improvement Pipeline

run_seo_audit(crawl_id="abc123")
get_seo_issues(crawl_id="abc123", severity="error")
generate_suggestions(crawl_id="abc123", focus_areas=["seo"])
apply_suggestion(crawl_id="abc123", suggestion_id="sug_1")
get_seo_score(crawl_id="abc123")

Automated Monitoring

create_schedule(name="Weekly", url="https://example.com", frequency="weekly")
create_webhook(name="Slack", url="https://hooks.slack.com/...", events=["crawl.completed", "changes.detected"])
compare_crawls(crawl_id_old="abc123", crawl_id_new="def456")

Custom Data Extraction

test_extraction_rule(url="https://example.com", rule_type="css", selector="h1")
create_extraction_rule(name="Page Title", rule_type="css", selector="h1", target="text")
create_extraction_rule(name="Price", rule_type="css", selector=".price", target="text", match_all=True)
start_crawl(url="https://example.com", extraction_rule_ids="1,2")
get_extracted_data(crawl_id="abc123")

Site Migration — Redirect Map

start_crawl(url="https://example.com", max_pages=1000)
create_sitemap(name="Migration Plan", job_id="abc123")
# Restructure sitemap, mark pages as redirect with destination URLs
export_sitemap(sitemap_id=1, format="redirects", redirect_code=301)
# Or get structured JSON:
export_sitemap(sitemap_id=1, format="redirects_json")

WordPress Integration

Use IATO together with WordPress.com's MCP server to crawl, audit, and fix your WordPress site — all from Claude Desktop.

Setup

Add both connectors in Claude Desktop (Settings → Connectors → Add):

Connector   URL                                                Purpose
IATO        https://iato.ai/mcp                                Crawl, audit, analyze SEO & content
WordPress   https://public-api.wordpress.com/wpcom/v2/mcp/v1   Edit posts, update metadata, manage content

How It Works

IATO MCP (Analyze)          Claude (Orchestrate)          WordPress MCP (Edit)
       |                           |                              |
  Crawl site ──────────────> Identify issues ──────────────> Fix titles
  SEO audit ───────────────> Prioritize fixes ─────────────> Update meta
  Content gaps ────────────> Generate content ─────────────> Edit posts
  Broken links ────────────> Map to pages ─────────────────> Fix links

WordPress-Ready Tools

IATO includes 3 tools that produce WordPress-ready output with post slugs:

Tool                         Description
get_wordpress_seo_fixes      SEO issues with current/suggested values mapped to WordPress post slugs
get_wordpress_content_gaps   Thin content pages with word counts, missing elements, and recommendations
get_wordpress_broken_links   Broken links mapped to source posts with anchor text and fix suggestions

Example Prompts

"Crawl my-site.com and fix any missing meta descriptions in WordPress"

"Run an SEO audit on my WordPress site and update the page titles that are too long"

"Find broken links on my site and fix them in WordPress"

"Identify thin content pages and expand them in the WordPress editor"

Claude chains tools from both servers automatically. Just describe what you want and it picks the right tools.

Configuration

Variable              Default           Description
IATO_BASE_URL         https://iato.ai   IATO API base URL
IATO_API_KEY          (required)        API key for authentication
MCP_REQUEST_TIMEOUT   30.0              HTTP timeout in seconds
MCP_MAX_RETRIES       2                 Retry count on transient failures
MCP_RETRY_DELAY       1.0               Seconds between retries
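The retry settings read as a standard retry loop: one initial attempt plus MCP_MAX_RETRIES retries, sleeping MCP_RETRY_DELAY seconds between attempts. A sketch of the assumed semantics (the real server retries only transient HTTP failures, which this simplified helper does not distinguish):

```python
import time

def with_retries(call, max_retries=2, retry_delay=1.0):
    """Run call(); on failure, retry up to max_retries times,
    waiting retry_delay seconds between attempts."""
    for attempt in range(max_retries + 1):
        try:
            return call()
        except Exception:
            if attempt == max_retries:
                raise  # retries exhausted; surface the last error
            time.sleep(retry_delay)
```

With the defaults above, a request is attempted at most three times (1 initial + 2 retries) with one second between attempts.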