Route mid-session work to the right spawned model without changing the fixed main session. Use for coding, architecture, math, algorithms, web development, b...
Use this skill to classify work, dispatch it to the best spawned model, and return the result to the main session.
This skill assumes the runtime can spawn work on these model families:
- `openai-codex/*`
- `opencode-go/*`
- `anthropic/*` only when `--force-claude` is present

If the host is self-managed, configure whatever provider credentials your runtime expects before using this skill. Typical setups use environment variables for OpenAI, Opencode, and optionally Anthropic, but the exact variable names depend on the host/runtime wrapper. Do not hardcode secrets into the skill package.
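As one illustration of the self-managed setup, a startup check could verify the expected variables before any dispatch. This is a minimal sketch: the variable names below are assumptions, since the real names depend on your host/runtime wrapper.

```python
import os

# Assumed variable names -- substitute whatever your host/runtime wrapper expects.
REQUIRED_VARS = ["OPENAI_API_KEY", "OPENCODE_API_KEY"]
OPTIONAL_VARS = ["ANTHROPIC_API_KEY"]  # only needed for --force-claude flows

def missing_credentials(env=None):
    """Return the required provider variables that are unset or empty."""
    env = os.environ if env is None else env
    return [name for name in REQUIRED_VARS if not env.get(name)]
```

Failing fast on `missing_credentials()` at skill load keeps secrets out of the package itself while still catching misconfigured hosts early.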
- `--force-claude` is the only supported flag.
- `--use-claude`, `--force-opus`, and `--no-opus` are not supported.
- Treat `--force-claude` as prompt-scoped, not session-scoped.
- Use long cache retention for `openai-codex/*` routes.
- Use short cache retention for `opencode-go/*` routes.
- Keep `--force-claude` limited to tradeoff proposal generation.

Choose exactly one primary route:
- `openai-codex/gpt-5.4`
- `opencode-go/glm-5`
- `opencode-go/minimax-m2.5`
- `opencode-go/kimi-k2.5`
- `openai-codex/gpt-5.3-codex-spark`

If a request spans categories, route by the highest-risk deliverable.
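The single-route choice above can be sketched as a lookup plus a highest-risk tiebreak. Only the model names come from this skill; the category names and per-category assignments below are illustrative assumptions.

```python
# Illustrative category-to-model map; the category assignments are assumptions,
# not part of the skill text.
PRIMARY_ROUTES = {
    "coding": "openai-codex/gpt-5.4",
    "architecture": "opencode-go/glm-5",
    "math": "opencode-go/minimax-m2.5",
    "algorithms": "opencode-go/kimi-k2.5",
    "web": "openai-codex/gpt-5.3-codex-spark",
}

def choose_route(categories, risk):
    """Pick exactly one primary route; for multi-category requests,
    route by the highest-risk deliverable."""
    top = max(categories, key=lambda c: risk.get(c, 0))
    return PRIMARY_ROUTES[top]
```

For example, a request tagged both `coding` and `math` routes to the math model when math carries the higher risk score.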
Trigger tradeoff mode when the user asks to:
Generate proposals in parallel with:
- `opencode-go/glm-5`
- `openai-codex/gpt-5.3-codex`

Judge both proposals with:
- `openai-codex/gpt-5.4`

When `--force-claude` is present, use Claude only for proposal generation:
- `anthropic/claude-sonnet-4-6`
- `anthropic/claude-opus-4-6`

Keep the judge on:
- `openai-codex/gpt-5.4`

Do not use Claude anywhere else in the flow.
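The tradeoff flow above can be sketched as follows. `spawn(model, prompt)` is a stand-in for the host's actual dispatch call, which this section does not specify.

```python
from concurrent.futures import ThreadPoolExecutor

DEFAULT_PROPOSERS = ["opencode-go/glm-5", "openai-codex/gpt-5.3-codex"]
CLAUDE_PROPOSERS = ["anthropic/claude-sonnet-4-6", "anthropic/claude-opus-4-6"]
JUDGE = "openai-codex/gpt-5.4"

def run_tradeoff(prompt, spawn, force_claude=False):
    """Generate proposals in parallel, then judge them with the fixed judge.
    Claude models are used only for proposal generation, never as the judge."""
    proposers = CLAUDE_PROPOSERS if force_claude else DEFAULT_PROPOSERS
    with ThreadPoolExecutor(max_workers=len(proposers)) as pool:
        proposals = list(pool.map(lambda m: spawn(m, prompt), proposers))
    return spawn(JUDGE, "\n---\n".join(proposals))
```

Keeping the judge constant regardless of `--force-claude` is what enforces the "Claude only for proposal generation" rule structurally rather than by convention.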
Use the build pipeline when the prompt starts with one of these prefixes:
- `buildq:` → quick pipeline
- `build:` → standard pipeline
- `buildx:` → strict pipeline

Quick pipeline (`buildq:`): use for smaller-scoped coding work.
Steps:
1. `openai-codex/gpt-5.4`
2. `openai-codex/gpt-5.4`
3. `opencode-go/glm-5`
4. `openai-codex/gpt-5.3-codex`
5. `opencode-go/glm-5`

Standard pipeline (`build:`): use for the normal serious coding workflow.
Steps:
1. `openai-codex/gpt-5.4`
2. `opencode-go/glm-5`
3. `openai-codex/gpt-5.4`
4. `openai-codex/gpt-5.3-codex-spark`
5. `openai-codex/gpt-5.4`
6. `opencode-go/glm-5`
7. `openai-codex/gpt-5.3-codex`
8. `opencode-go/glm-5`
9. `openai-codex/gpt-5.4`
10. `opencode-go/glm-5`

Strict pipeline (`buildx:`): use for stricter, higher-value delivery.
Steps:
1. `openai-codex/gpt-5.4`
2. `opencode-go/glm-5`
3. `openai-codex/gpt-5.4`
4. `openai-codex/gpt-5.3-codex-spark`
5. `openai-codex/gpt-5.4`
6. `opencode-go/glm-5`
7. `openai-codex/gpt-5.3-codex`
8. `opencode-go/glm-5`
9. `openai-codex/gpt-5.4`
10. `opencode-go/glm-5`
11. `opencode-go/kimi-k2.5`
12. `opencode-go/glm-5`

When a judge-plan step runs, require these sections:
For `buildx:`, also include:
The simplify step must:
The simplify step must not:
If the user shifts into a sustained new domain, summarize the active state with `openai-codex/gpt-5.3-codex-spark`, and suggest a fresh session only when repeated cross-domain dispatching would be wasteful.
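The prefix table and the three step lists above can be captured as plain routing data. This is a sketch under one caveat: the per-step descriptions are not in this section, so only the prefix mapping and the model order are preserved.

```python
QUICK = [
    "openai-codex/gpt-5.4", "openai-codex/gpt-5.4", "opencode-go/glm-5",
    "openai-codex/gpt-5.3-codex", "opencode-go/glm-5",
]
STANDARD = [
    "openai-codex/gpt-5.4", "opencode-go/glm-5", "openai-codex/gpt-5.4",
    "openai-codex/gpt-5.3-codex-spark", "openai-codex/gpt-5.4",
    "opencode-go/glm-5", "openai-codex/gpt-5.3-codex", "opencode-go/glm-5",
    "openai-codex/gpt-5.4", "opencode-go/glm-5",
]
# The strict pipeline shares the standard steps, then appends two more.
STRICT = STANDARD + ["opencode-go/kimi-k2.5", "opencode-go/glm-5"]

PIPELINES = {"buildq:": QUICK, "build:": STANDARD, "buildx:": STRICT}

def detect_pipeline(prompt):
    """Return (steps, body) for a recognized build prefix, else (None, prompt).
    Longer prefixes are checked first so 'buildq:' is not read as 'build:'."""
    for prefix in sorted(PIPELINES, key=len, reverse=True):
        if prompt.startswith(prefix):
            return PIPELINES[prefix], prompt[len(prefix):].lstrip()
    return None, prompt
```

Keeping the step lists as data makes the pipeline shapes testable without touching dispatch logic, which is what the shipping checklist below relies on.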
`dispatcher.py` is the routing source of truth. It must:
- Produce `RoutePlan` objects
- Support the `buildq:`, `build:`, and `buildx:` pipeline modes

Before shipping updates:
- Run `tests/test_dispatcher.py`.
- Exercise all three prefixes: `buildq:`, `build:`, and `buildx:`.
- Confirm Claude models are dispatched only when `--force-claude` is present.
- Confirm `openai-codex` routes use long retention.
- Confirm `opencode-go` routes use short retention.
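A minimal sketch of what the retention checks in that checklist could exercise. `RoutePlan` and `plan_for` are hypothetical shapes, since the real `dispatcher.py` interface is not shown in this section.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class RoutePlan:
    """Hypothetical RoutePlan shape; the real fields live in dispatcher.py."""
    model: str
    mode: str = "standard"         # "quick" | "standard" | "strict"
    cache_retention: str = "short"

def plan_for(model, mode="standard"):
    # Rule from this skill: openai-codex/* routes get long cache retention,
    # opencode-go/* routes get short cache retention.
    retention = "long" if model.startswith("openai-codex/") else "short"
    return RoutePlan(model, mode, retention)

def test_retention_rules():
    assert plan_for("openai-codex/gpt-5.4").cache_retention == "long"
    assert plan_for("opencode-go/glm-5").cache_retention == "short"

test_retention_rules()
```

Encoding the retention rule in one helper means a single test pins it for every route, instead of re-asserting it per pipeline step.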