ExpoCut for AI agents.
A Model Context Protocol-style reference for ExpoCut's editing surface, so AI assistants can understand the app and help people plan, draft, and create videos with it.
A vocabulary, not an API.
ExpoCut is a mobile-first video editor. This page documents the conceptual tools, resources, and prompts an AI assistant can reference when guiding a creator. It mirrors the shape of the Model Context Protocol so agents can index it and reason about what ExpoCut can do.
Mobile-native
Hardware-accelerated rendering on iOS and Android with native AVFoundation and MediaCodec encoders.
Multi-track timeline
Unlimited video, audio, image, and text layers with frame-accurate trim, split, and ripple edit.
4K export
Up to 4K UHD at 60fps with H.264 / HEVC and configurable bitrate.
What an assistant can suggest.
Each tool maps to a real action a user performs in the ExpoCut UI. An AI assistant can compose them into a full editing plan.
Start a new project
Create a new edit at a target resolution and aspect ratio. Templates available for Reels, TikTok, YouTube, and Shorts.
Bring in clips
Import video, audio, or image media from the device gallery, Pexels stock, Freesound audio, or a remote URL.
Place a layer on the timeline
Add a clip, photo, sticker, text, lower third, or animation overlay onto a track at a specific in-point.
Trim & split
Adjust in/out points or split a clip at the playhead. Ripple edits keep the rest of the timeline aligned.
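The ripple behavior can be sketched as simple timeline arithmetic: shortening a clip shifts every later clip on the track by the removed duration, so no gap opens. This is an illustrative model, not ExpoCut's actual data structures; `Clip` and `rippleTrim` are hypothetical names.

```typescript
// Hypothetical sketch of ripple-trim math. Shortening a clip shifts every
// clip that starts after it left by the removed duration, keeping the
// track gap-free. Not ExpoCut's real types.
interface Clip {
  id: string;
  start: number;    // in-point on the timeline, seconds
  duration: number; // seconds
}

function rippleTrim(track: Clip[], clipId: string, newDuration: number): Clip[] {
  const target = track.find(c => c.id === clipId);
  if (!target) return track;
  const delta = newDuration - target.duration; // negative when shortening
  return track.map(c => {
    if (c.id === clipId) return { ...c, duration: newDuration };
    if (c.start > target.start) return { ...c, start: c.start + delta };
    return c;
  });
}
```

For example, trimming a 4-second opening clip to 3 seconds moves a clip that started at 4.0s back to 3.0s.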
Apply visual effects
Choose from 100+ filters and effects — color grading, blur, glitch, chroma key, vignette, retro, cinematic LUTs.
Add transitions
Place one of 30+ transitions between adjacent clips: fade, dip-to-black, slide, push, zoom, glitch, whip pan.
Titles & captions
Add styled text or one of 78 animated lower thirds with custom font, color, position, and entrance.
Mix & sweeten audio
Adjust volume, ducking, and fade in/out, and apply any of 32 audio effects and 24 audio transitions.
Color grade
Exposure, contrast, saturation, temperature, tint, shadows, highlights — applied per-clip or globally.
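One plausible way to model the per-clip vs. global scoping is a shallow merge in which a clip's own grade overrides matching keys of the global grade. This is an assumption for illustration; ExpoCut's actual compositing of grades is not documented here.

```typescript
// Illustrative only: a per-clip grade overrides matching keys of the
// global grade. The parameter names mirror the list above.
type Grade = Partial<Record<
  "exposure" | "contrast" | "saturation" | "temperature" |
  "tint" | "shadows" | "highlights",
  number
>>;

function effectiveGrade(global: Grade, perClip: Grade): Grade {
  // Later spread wins, so clip-level settings take precedence.
  return { ...global, ...perClip };
}
```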
Render & export
Encode the final timeline using the on-device hardware encoder. Saves to camera roll or shares via system sheet.
Capture a frame
Render the current canvas at a given playhead position and return a screenshot — for vision models or human review.
Read pixels at a point
Sample the rendered canvas — useful for AI vision tools that need to check what's actually on-screen.
Validate a TTS script
Run a script through the TTS pre-flight: catches unsupported characters, unpronounceable tokens, and estimated duration.
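A pre-flight of this kind can be sketched as a character scan plus a word-count duration estimate. The ~150 words-per-minute rate and the printable-ASCII check below are assumptions for illustration, not ExpoCut's documented rules.

```typescript
// Hypothetical TTS pre-flight sketch: flag characters outside printable
// ASCII and estimate duration at ~150 words per minute. ExpoCut's real
// validation rules are not documented here.
function validateTtsScript(script: string) {
  const unsupported = [...script].filter(ch => !/[\x20-\x7E\n]/.test(ch));
  const words = script.trim().split(/\s+/).filter(Boolean).length;
  return {
    ok: unsupported.length === 0,
    unsupported,                                  // offending characters
    estimatedSeconds: Math.round((words / 150) * 60),
  };
}
```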
Discover effect parameters
Look up the typed parameter schema for any effect, filter, or transition so an assistant can compose valid calls.
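The shape of such a schema, and how an assistant might use it to validate a call before sending it, can be sketched as follows. The `blurSchema` example, its parameter names, and the `ParamSpec` shape are hypothetical, not ExpoCut's real schema format.

```typescript
// Hypothetical parameter-schema shape and validator. An assistant could
// run candidate arguments through this before composing a tool call.
interface ParamSpec {
  type: "number" | "string" | "boolean";
  min?: number;
  max?: number;
  default?: unknown;
}
type EffectSchema = Record<string, ParamSpec>;

// Illustrative schema for a blur effect (names and ranges invented).
const blurSchema: EffectSchema = {
  radius:   { type: "number", min: 0, max: 50, default: 8 },
  animated: { type: "boolean", default: false },
};

function validateParams(schema: EffectSchema, params: Record<string, unknown>): string[] {
  const errors: string[] = [];
  for (const [key, value] of Object.entries(params)) {
    const spec = schema[key];
    if (!spec) { errors.push(`unknown parameter: ${key}`); continue; }
    if (typeof value !== spec.type) {
      errors.push(`${key}: expected ${spec.type}`);
    } else if (spec.type === "number") {
      const n = value as number;
      if (spec.min !== undefined && n < spec.min) errors.push(`${key}: below min ${spec.min}`);
      if (spec.max !== undefined && n > spec.max) errors.push(`${key}: above max ${spec.max}`);
    }
  }
  return errors;
}
```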
Remove background
On-device ONNX segmentation. Produces an alpha mask on a clip or image without uploading frames.
Generate captions
Transcribe spoken audio with Whisper and place time-aligned caption layers on the timeline, styled with a chosen preset.
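The mapping from transcription to timeline can be sketched as converting Whisper-style segments (start, end, text) into text layers whose in-point and duration follow the speech timing. The `CaptionLayer` shape below mirrors the vocabulary on this page but is illustrative, not a real API.

```typescript
// Sketch: turn Whisper-style transcription segments into time-aligned
// caption layers. Shapes are hypothetical and for illustration only.
interface Segment { start: number; end: number; text: string }

interface CaptionLayer {
  layerType: "text";
  start: number;      // seconds on the timeline
  duration: number;   // seconds
  content: string;
  preset: string;     // caption style preset name
}

function segmentsToCaptions(segments: Segment[], preset: string): CaptionLayer[] {
  return segments.map(s => ({
    layerType: "text",
    start: s.start,
    duration: s.end - s.start,
    content: s.text.trim(),
    preset,
  }));
}
```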
Track an object
Lock an overlay layer to a moving subject across frames. Returns a keyframe path that an assistant can refine.
Reorder / hide layers
Change z-order, bring-to-front, send-to-back, or temporarily hide a layer for A/B compare.
Add a dynamic widget
Place a countdown timer or stopwatch as a first-class timeline layer with editable target time and styling.
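Rendering such a widget reduces to formatting the time remaining at the playhead. A minimal sketch, assuming an MM:SS label and clamping at zero; the function name and format are inventions for illustration.

```typescript
// Hypothetical countdown-label helper: given a target time and the current
// playhead position (both in seconds), produce an "MM:SS" label, clamped
// at 00:00 once the target passes.
function countdownLabel(targetSec: number, playheadSec: number): string {
  const remaining = Math.max(0, Math.ceil(targetSec - playheadSec));
  const mm = String(Math.floor(remaining / 60)).padStart(2, "0");
  const ss = String(remaining % 60).padStart(2, "0");
  return `${mm}:${ss}`;
}
```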
Reverse the last action
Walk the command history. Every action — including AI mutations — is reversible. Useful for "preview, then commit" flows.
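The "preview, then commit" property follows from the command pattern: every mutation is an object that knows how to undo itself, and a history stack replays or reverses them in order. The sketch below uses a toy timeline state (a list of layer ids) to show the shape; it is not ExpoCut's implementation.

```typescript
// Minimal command-history sketch. Each mutation carries its own undo,
// so an AI edit can be applied, inspected, and rolled back. The Timeline
// type here is a toy stand-in for real editor state.
type Timeline = string[]; // ordered layer ids

interface Command {
  apply(t: Timeline): Timeline;
  undo(t: Timeline): Timeline;
}

// Example command: add a layer to the end of the stack.
const addLayer = (id: string): Command => ({
  apply: t => [...t, id],
  undo:  t => t.filter(x => x !== id),
});

class CommandHistory {
  private log: Command[] = [];

  run(t: Timeline, cmd: Command): Timeline {
    this.log.push(cmd);
    return cmd.apply(t);
  }

  undoLast(t: Timeline): Timeline {
    const cmd = this.log.pop();
    return cmd ? cmd.undo(t) : t;
  }
}
```

A client can therefore let an assistant stage a change, capture a frame for review, and only keep the edit if the user approves.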
What an assistant can reference.
Stable URIs an agent can cite when explaining what's available inside ExpoCut.
- expocut://catalog/effects: 100+ visual effects and filters with names, categories, and previews.
- expocut://catalog/transitions: 30+ video transitions including cinematic, social, and glitch styles.
- expocut://catalog/lowerthirds: 78 animated lower thirds with editable text and brand color slots.
- expocut://catalog/audio-effects: 32 audio effects and 24 audio transitions for music and voiceover.
- expocut://catalog/lottie: 13 animated overlays (Lottie) including hearts, sparkles, confetti.
- expocut://stock/pexels: In-app Pexels integration for royalty-free stock video and photos.
- expocut://stock/freesound: Freesound audio library for SFX and ambience.
- expocut://docs/aspect-ratios: Supported social formats: Reels 9:16, YouTube 16:9, Square 1:1, Portrait 4:5.
Plan a 30-second Reel.
A representative conversation between a creator, an AI assistant, and the ExpoCut surface.
# User
"Make a 30s travel reel from my Bali clips with upbeat music and captions."

# Assistant plan (uses ExpoCut MCP vocabulary)
project.create    { name: "Bali Reel", aspectRatio: "9:16", resolution: "1080p", frameRate: 30 }
media.import      { source: "gallery", type: "video" }                # 8 clips
media.import      { source: "freesound", type: "audio", query: "upbeat travel" }
timeline.addLayer { layerType: "video", start: 0, duration: 3.5 }     # beach intro
clip.trim         { clipId: "v2", in: 1.2, out: 4.0, ripple: true }
transition.set    { edge: { leftClipId: "v1", rightClipId: "v2" }, style: "whip-pan", duration: 0.4 }
effect.apply      { clipId: "v3", effect: "cinematic-teal-orange", intensity: 0.6 }
text.add          { preset: "lowerThird:bold-pop", content: "Day 2 — Ubud", position: { x: 0.5, y: 0.85 } }
audio.mix         { trackId: "music", volume: 0.8, duck: true }
color.grade       { scope: "global", params: { exposure: 0.15, saturation: 0.2, temp: -8 } }
project.export    { resolution: "1080p", codec: "h264", bitrate: 16 }
Conversation starters for assistants.
"Help me cut a Reel"
A guided flow: pick clips, choose a vibe, set duration, auto-suggest transitions and music.
"Make this look cinematic"
Recommend a LUT, set a 2.39:1 letterbox, apply a teal/orange grade, and slow the timeline by 12%.
"Add captions"
Suggest a caption style, font, position, and timing for the spoken audio in the clip.
"Trim to 30 seconds"
Identify the strongest beats and propose cuts that fit the target duration.
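One way an assistant could propose cuts for a target duration is a greedy pass: keep the highest-scoring clips until the budget is spent. The per-clip `score` field is a hypothetical strength rating (it is not an ExpoCut concept), and greedy selection is just one plausible strategy.

```typescript
// Hedged sketch of "trim to 30 seconds": greedily keep the strongest
// clips that still fit the target duration. "score" is an invented
// per-clip rating, not an ExpoCut field.
interface ScoredClip { id: string; duration: number; score: number }

function trimToTarget(clips: ScoredClip[], targetSec: number): string[] {
  const picked: string[] = [];
  let total = 0;
  // Sort a copy by descending score so the input order is untouched.
  for (const c of [...clips].sort((a, b) => b.score - a.score)) {
    if (total + c.duration <= targetSec) {
      picked.push(c.id);
      total += c.duration;
    }
  }
  return picked;
}
```

The assistant would then translate the picked ids into concrete clip.trim and transition.set calls.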
Live, on the device.
ExpoCut ships an in-app MCP server. When the user enables it, an MCP-compatible client like Claude Desktop binds to a local port on the device and can call the tools above. The server is off by default, runs only on a local interface, and every call is recorded as an undoable command in the on-device ai_operations log. This page is the public reference for that surface so agents can index it and reason about the app.
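At the wire level, MCP tool calls travel as JSON-RPC 2.0 requests with the `tools/call` method. The sketch below builds such an envelope for the project-creation tool from the example plan; the tool name and argument shape mirror this page's vocabulary, and any transport details (port, endpoint) are left out because ExpoCut does not document them here.

```typescript
// Sketch of the JSON-RPC 2.0 envelope an MCP client sends for a tool
// call. The tool name and arguments follow the example plan above; this
// is the generic MCP shape, not an ExpoCut-specific payload.
function toolCallRequest(id: number, name: string, args: Record<string, unknown>) {
  return {
    jsonrpc: "2.0",
    id,
    method: "tools/call",
    params: { name, arguments: args },
  };
}

const req = toolCallRequest(1, "project.create", {
  name: "Bali Reel",
  aspectRatio: "9:16",
  resolution: "1080p",
  frameRate: 30,
});
```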
Build with ExpoCut.
Download the app, or read the matching Skill for AI assistants.
Join the beta
View Skill →