Production Pipeline

Let's Vibe Studio

Publishing Episode 2 took 3 hours of manual work after the recording. The transcript existed—but everything downstream was done by hand.

So we built a pipeline. Drop a transcript, get everything. 60 seconds.

“The transcript is the edit. Text-based editing is structured data an AI can operate on.”

Every downstream artifact—show notes, social posts, chapters—is a transformation of the transcript.
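That premise can be made concrete: a transcript line is structured data. A minimal sketch, assuming a simple `[mm:ss] Speaker: text` line format (the shape is illustrative, not the pipeline's actual parser):

```typescript
interface Utterance {
  start: number;   // seconds from episode start
  speaker: string;
  text: string;
}

// Parse one "[mm:ss] Speaker: text" line into structured data.
function parseLine(line: string): Utterance | null {
  const m = line.match(/^\[(\d+):(\d\d)\]\s+([^:]+):\s+(.*)$/);
  if (!m) return null;
  return {
    start: Number(m[1]) * 60 + Number(m[2]),
    speaker: m[3],
    text: m[4],
  };
}
```

Once every line carries a timestamp, a speaker, and text, chapters, quotes, and cut lists are all just filters and maps over the same array.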

What happens when you drop a transcript

9 Claude API calls. Steps 2-8 run in parallel.

Transcript (any format)

1. Parse: Riverside JSON, timestamped text, or raw transcript
2. Title: Episode title from conversation arc (parallel)
3. Show Notes: Narrative summary, not bullet points (parallel)
4. Chapters: 8-14 timestamped chapter markers (parallel)
5. Topics: 4-7 discovery tags for SEO (parallel)
6. Description: 2-3 sentences for Spotify / Apple (parallel)
7. Links: Every tool, book, and resource mentioned (parallel)
8. Quotes: 5-8 shareable moments for social cards (parallel)
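The fan-out above is one parse followed by concurrent calls. A sketch of that orchestration, with each step stubbed as a plain async function (the real pipeline's function names are not shown here):

```typescript
type Step = (transcript: string) => Promise<string>;

// Steps 2-8 all read the same parsed transcript, so they can run concurrently.
async function runSteps(
  transcript: string,
  steps: Record<string, Step>,
): Promise<Record<string, string>> {
  const results = await Promise.all(
    Object.entries(steps).map(
      async ([name, step]) => [name, await step(transcript)] as const,
    ),
  );
  return Object.fromEntries(results);
}
```

Each step would wrap one Claude API call; swapping `Promise.all` for `Promise.allSettled` would let one failed step surface without sinking the other six.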

Then it generates

Twitter: Announcement + 4-6 tweet thread
Farcaster: Community-focused announcement
LinkedIn: Professional framing, 1-2 paragraphs
YouTube: Full description with chapters baked in
Website: episodes.ts auto-updated + validated
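The website step is the only one that mutates the repo: it appends an entry to episodes.ts and validates it before writing. A sketch of what that validation pass might check (the `Episode` shape here is an assumption, not the site's actual type):

```typescript
interface Episode {
  number: number;
  title: string;
  guest: string;
  date: string; // e.g. "Feb 12, 2026"
  chapters: { time: string; label: string }[];
}

// Reject a malformed entry before it ever touches episodes.ts.
function validateEpisode(ep: Episode): string[] {
  const errors: string[] = [];
  if (!Number.isInteger(ep.number) || ep.number < 1) errors.push("bad episode number");
  if (!ep.title.trim()) errors.push("empty title");
  if (!ep.guest.trim()) errors.push("empty guest");
  if (ep.chapters.length < 1) errors.push("no chapters");
  return errors;
}
```

Validating before the write is what makes "auto-updated" safe: a bad generation fails loudly instead of shipping a broken episode entry.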

One command

# Full pipeline
npx tsx src/pipeline/post-recording.ts \
  --transcript ../transcripts/ep3.txt \
  --episode 3 \
  --guest "Simon Willison" \
  --guest-handle "@simonw" \
  --date "Feb 12, 2026"

# Or just show notes
npm run pipeline:notes -- --transcript ep3.txt

# Dry run (generate but don't touch the website)
npx tsx src/pipeline/post-recording.ts \
  --transcript ep3.txt --episode 3 --dry-run

What you get

Four files in output/, plus the website auto-updated.

ep3-pipeline-output.json: Complete structured output. Title, show notes, chapters, topics, links, quotes, all social content.
ep3-youtube-description.txt: Ready to paste. Chapters, links, subscribe CTA all formatted.
ep3-tweets.md: Announcement, thread, and quote cards. Copy, paste, post.
ep3-social-calendar.md: Day-by-day content plan. Release day through day 6.
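YouTube builds its chapter list from timestamp lines in the description, starting at 0:00. A sketch of that formatting step, assuming chapter starts are stored in seconds (the helper names are illustrative):

```typescript
// Format seconds as YouTube's m:ss / h:mm:ss timestamp.
function toTimestamp(seconds: number): string {
  const h = Math.floor(seconds / 3600);
  const m = Math.floor((seconds % 3600) / 60);
  const s = Math.floor(seconds % 60);
  return h > 0
    ? `${h}:${String(m).padStart(2, "0")}:${String(s).padStart(2, "0")}`
    : `${m}:${String(s).padStart(2, "0")}`;
}

// One "0:00 Intro"-style line per chapter, ready to paste into a description.
function chapterLines(chapters: { start: number; title: string }[]): string {
  return chapters.map((c) => `${toTimestamp(c.start)} ${c.title}`).join("\n");
}
```

For example, `chapterLines([{ start: 0, title: "Intro" }, { start: 135, title: "The edit" }])` yields `"0:00 Intro\n2:15 The edit"`.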

Roadmap

Built

Phase 1: Content Pipeline

Transcript in, everything out. Show notes, chapters, social posts, YouTube description, website update. One command.

Next

Phase 2: Media Processing

Local ffmpeg on the Mac Studio. Audio normalization, clip extraction, multi-aspect-ratio export. The EditPlan is a JSON cut list — human-reviewable, AI-generated, version-controllable.

Future

Phase 3: Platform APIs

Direct upload to YouTube with chapters, Twitter with video, auto-deploy to Vercel. The last manual steps disappear.

The EditPlan concept

Phase 2 introduces the EditPlan—a JSON intermediate between AI analysis and CPU-intensive media work. Claude reads the transcript, proposes cuts and segments. You review. ffmpeg executes.

// Human-reviewable. AI-generated. Version-controllable.
{
  "segments": [
    { "action": "keep", "start": 0, "end": 180 },
    { "action": "cut",  "start": 180, "end": 195,
      "reason": "dead air + throat clear" },
    { "action": "keep", "start": 195, "end": 2400 }
  ]
}
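One way Phase 2 could execute such a plan: turn the keep segments into an ffmpeg `select`/`aselect` expression built from `between(t,start,end)` terms. A sketch under that assumption (the expression is constructed as a string here, not run through ffmpeg):

```typescript
interface Segment {
  action: "keep" | "cut";
  start: number;
  end: number;
  reason?: string;
}
interface EditPlan {
  segments: Segment[];
}

// Build an ffmpeg select-filter expression that keeps only "keep" segments.
function keepExpression(plan: EditPlan): string {
  return plan.segments
    .filter((s) => s.action === "keep")
    .map((s) => `between(t,${s.start},${s.end})`)
    .join("+");
}

// Total runtime after the cuts are applied, in seconds.
function keptDuration(plan: EditPlan): number {
  return plan.segments
    .filter((s) => s.action === "keep")
    .reduce((sum, s) => sum + (s.end - s.start), 0);
}
```

For the plan above this yields `between(t,0,180)+between(t,195,2400)` and a 2385-second runtime, so the review step can sanity-check the final length before any media work starts.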

The target

Before: 3 hours of manual post-production
Phase 1: 60 seconds + 15 min review
Phase 3: Record. Review. One button. Published everywhere.

Built with Claude Code. Transcript is the source of truth.

Last updated Feb 6, 2026