Automate YouTube SEO across many videos

Three MCPs chained: rewrite titles in Sush's voice, push updates to YouTube, verify Cloudflare cache. Scaffolded recipe — measurements pending Sush's verified runs.

DRAFT RISK: MEDIUM
BEFORE: Several hours per month, manual
AFTER: Single-digit minutes
COST / RUN: Roughly low single-digit cents per video (token usage)
COMPONENTS: playwright-mcp youtube-mcp cloudflare-mcp

Why this recipe exists

Every month I have a batch of YouTube videos that need title and description tuning based on real performance data. Doing it manually is mostly mechanical work: open YouTube Studio, find underperforming titles, read context, draft a better one, save, repeat.

Chaining three MCPs collapses most of that to a single review pass.

SUSH VERDICT NEEDED — DRAFT RECIPE. This is a structural template. Real measured before/after, real costs, and verified API behavior come from Sush’s own runs. Do not treat the numbers above as benchmarks.

The architecture

                 ┌──────────────┐
                 │   Claude     │  ← orchestrator
                 │  (the brain) │
                 └──────┬───────┘

        ┌───────────────┼───────────────┐
        │               │               │
┌───────▼─────┐  ┌──────▼──────┐  ┌────▼──────────┐
│ youtube-mcp │  │ playwright- │  │  cloudflare-  │
│             │  │     mcp     │  │      mcp      │
└─────────────┘  └─────────────┘  └───────────────┘

Three MCPs:

  • youtube-mcp — fetches video metadata, CTR data, posts updates
  • playwright-mcp — for the steps the YouTube API doesn’t expose (some metadata is web-only)
  • cloudflare-mcp — purges blog cache where embedded videos are referenced

The flow

  1. Pull video list with CTR / impressions. This needs a YouTube Analytics-capable MCP, not just a Data API metadata wrapper. CTR and impression data come from the YouTube Analytics API; titles and descriptions are updated via the YouTube Data API. Both need OAuth.
  2. Filter to candidates. Claude looks at the data and picks videos with declining CTR over the past 30 days.
  3. Generate new titles. For each candidate, Claude generates 3 alternative titles based on the current title, transcript snippet, and reference titles from peers.
  4. Pick the winner. Claude scores each alternative against my brand voice rules and picks one.
  5. Show the diff. Wait for approval. A human review pass before any update goes live. Non-negotiable.
  6. Push to YouTube. youtube-mcp → videos.update (~50 quota units per call).
  7. Update blog references. If the video is embedded in a blog post, also update the post title.
  8. Purge Cloudflare cache. cloudflare-mcp → “purge cache for /blog/* and /videos/*”.
  9. Generate diff report. Markdown report of all changes.
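The filtering and diff logic in steps 2 and 5 is simple enough to pin down precisely. A minimal sketch (the `Video` dataclass, field names, and sample numbers are all illustrative, not the MCP's real schema):

```python
from dataclasses import dataclass

@dataclass
class Video:
    id: str
    title: str
    ctr_now: float      # CTR over the last 30 days (%)
    ctr_prev: float     # CTR over the 30 days before that (%)

def declining(videos, threshold=0.15):
    """Step 2: keep videos whose CTR dropped more than `threshold` (15%)."""
    return [v for v in videos
            if v.ctr_prev > 0 and (v.ctr_prev - v.ctr_now) / v.ctr_prev > threshold]

def diff_line(video, new_title):
    """Step 5: one line of the approval diff."""
    return f"{video.id}: {video.title!r} -> {new_title!r}"

videos = [
    Video("a1", "My old title", ctr_now=3.1, ctr_prev=4.2),  # -26%: candidate
    Video("b2", "Still fine",   ctr_now=4.0, ctr_prev=4.1),  # -2%: skipped
]
print([v.id for v in declining(videos)])  # ['a1']
```

In practice Claude does this reasoning itself from the MCP's data; the point is that the threshold is relative decline, not absolute CTR.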

Step-by-step build

Prerequisites

  • Claude Pro account (or any model that supports MCP tool calling)
  • youtube-mcp installed and OAuth-connected to your channel
  • playwright-mcp installed
  • cloudflare-mcp installed with API token in scope Cache:Purge only
  • A Markdown file with your brand voice rules (mine is ~/voice-rules.md)

The orchestration prompt

Save this as ~/recipes/youtube-seo.md:

You are a YouTube SEO specialist for my channel.

For each of my last 30 videos:
1. Use youtube-mcp:list_videos to fetch them with CTR + impression data.
2. Filter to videos where CTR has declined >15% over the past 30 days.
3. For each, generate 3 alternative titles using the transcript snippet and these voice rules:
   [paste content of ~/voice-rules.md]
4. Score each alternative and pick the winner.
5. Show me a diff of (current title) → (new title) for all candidates.
6. STOP and wait for my approval before pushing changes.
7. After my approval: use youtube-mcp:update_video for each.
8. Use cloudflare-mcp:purge_cache for /blog/* and /videos/*.
9. Output a markdown report summarising changes.
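The `[paste content of ~/voice-rules.md]` placeholder on step 3 is easy to forget. A small pre-processing step can inline the rules before the prompt is sent — a sketch, assuming you keep the recipe and rules as plain text files (the `PLACEHOLDER` string and `build_prompt` helper are my own, not part of any MCP):

```python
PLACEHOLDER = "[paste content of ~/voice-rules.md]"

def build_prompt(recipe_text: str, rules_text: str) -> str:
    """Inline the voice rules so the model sees them verbatim,
    instead of a placeholder it might silently paraphrase around."""
    return recipe_text.replace(PLACEHOLDER, rules_text)

recipe = "3. ...using the transcript snippet and these voice rules:\n   " + PLACEHOLDER
print(build_prompt(recipe, "Short, concrete, no hype."))
```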

Run it

How you actually run this depends on your host:

  • Claude Desktop: paste the prompt into a chat where the three MCP servers are already connected.
  • Claude Code / CLI: invoke from a script if your CLI is configured with the MCPs.
  • Custom runner: call the model API + MCP clients from your own automation.
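For the CLI route, a one-liner like this can kick off a non-interactive run (illustrative — assumes the `claude` CLI is installed with the three MCP servers already configured, and uses `-p` to run a single prompt and print the result):

```shell
# Hypothetical invocation — adjust the path to your setup.
claude -p "$(cat ~/recipes/youtube-seo.md)"
```

Note that in non-interactive mode the “STOP and wait for my approval” step needs rethinking — split the recipe into a report-only run and a separate push run instead.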

The catch

Things to know before running this for real:

  1. YouTube API quota is the real limit. videos.update costs ~50 quota units per call, and a default project has 10,000 units/day — a theoretical max of ~200 updates/day if you spend the entire quota on metadata updates, and less in practice once you also call list, search, or analytics endpoints. Watch the quota dashboard.
  2. Title language drift. Claude sometimes shortens titles to fit YouTube’s 100-char limit, dropping emojis I had used deliberately. Add to the prompt: “Preserve emojis from the original title unless removing them clearly improves the title.”
  3. Cache purge scope and timing. Purging by prefix (“/blog/*”) is an Enterprise-only Cloudflare feature; on the free tier you purge individual URLs or everything. Propagation usually takes seconds, but don’t refresh blog posts immediately after.
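The quota arithmetic from point 1 is worth keeping in front of you — a trivial sketch (the per-method costs are approximate; check the quota dashboard for your project's actual numbers):

```python
DAILY_QUOTA = 10_000   # default YouTube Data API daily quota (units)
UPDATE_COST = 50       # approximate cost of one videos.update call

# Theoretical ceiling: every unit spent on updates, nothing else.
max_updates_per_day = DAILY_QUOTA // UPDATE_COST
print(max_updates_per_day)  # 200
```

Any list, search, or analytics call in the same run eats into that ceiling, so budget well below it.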

Cost breakdown (notional — Sush to verify)

  • Claude tokens — dominated by input (transcripts can be large).
  • youtube-mcp / playwright-mcp / cloudflare-mcp — free, local.
  • Specific per-run cost depends on context size and number of candidates. Sush will replace these with measured values.

My take

Pending Sush’s review pass — leaving this section to be written by him after the next real run.

The review step is non-negotiable. Most title suggestions are usable, but the failures are exactly the ones I want to catch before publishing.

Variants

  • Replace youtube-mcp with x-mcp to optimise post text on Twitter/X (the closest equivalent of a title)
  • Drop cloudflare-mcp if you don’t have a blog with embedded videos
  • Add slack-mcp to post the diff report to a channel for team review
  • Add validate-mcp before push to enforce “title contains keyword X”
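The validate-mcp variant can be approximated with a few lines of plain Python if you don't want another server in the chain — a sketch of the kind of pre-push checks I mean (`validate_title` is my own hypothetical helper, not an existing MCP tool):

```python
def validate_title(title: str, required_keyword: str, max_len: int = 100) -> list[str]:
    """Return a list of problems; empty list means the title may be pushed."""
    problems = []
    if required_keyword.lower() not in title.lower():
        problems.append(f"missing keyword {required_keyword!r}")
    if len(title) > max_len:
        problems.append(f"over YouTube's {max_len}-char limit")
    return problems

print(validate_title("Ship faster with MCP chaining", "MCP"))  # []
```

Run it over every candidate in the diff before step 6, and refuse to push anything with a non-empty result.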

Source

This recipe is open source under CC BY 4.0. View the source on GitHub.