## CLI Flags

All flags can be passed to `deno run -A jsr:@cdaringe/ralphmania`.

| Flag | Short | Type | Description |
| --- | --- | --- | --- |
| `--agent` | `-a` | string | AI agent to use (default: `claude`) |
| `--iterations` | `-i` | number | Maximum number of agent iterations to run (required) |
| `--plugin` | `-p` | string | Path to a plugin file exporting a `RalphPlugin` default |
| `--reset-worktrees` | | boolean | Remove and re-create all worker git worktrees on start |
| `--gui` | | boolean | Open the live GUI in the browser while the orchestrator runs |
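Putting the flags together, a typical invocation might look like the following (the iteration count and plugin path here are illustrative, not defaults):

```shell
# Run up to 10 iterations with the default claude agent,
# loading a local plugin file and opening the live GUI
deno run -A jsr:@cdaringe/ralphmania \
  --iterations 10 \
  --plugin ./my-plugin.ts \
  --gui
```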

## Plugin Hooks

Plugins export a named `plugin` object conforming to the `Plugin` type. All hooks are optional and async-friendly.

| Hook | Called when | Can modify |
| --- | --- | --- |
| `onConfigResolved` | Once, before the loop starts | `coder`, `verifier`, `escalated`, `iterations`, `level`, `parallel`, `gui`, `guiPort`, `resetWorktrees`, `specFile`, `progressFile` |
| `onModelSelected` | Each iteration, after model resolution | `ModelSelection` (model, provider) |
| `onPromptBuilt` | Each iteration, after prompt construction | prompt string |
| `onSessionConfigBuilt` | Each iteration, after session config assembly | `AgentSessionConfig` (provider, model, workingDir) |
| `onIterationEnd` | After the agent subprocess exits | observe only |
| `onValidationComplete` | After the validation script runs | `ValidationResult` (pass/fail/messages) |
| `onRectify` | After validation fails post-merge | `RectifyAction` (agent, skip, abort) |
| `onLoopEnd` | Once, after the loop exits, regardless of outcome | observe only |

### Example plugin

```ts
import type { Plugin } from "jsr:@cdaringe/ralphmania";

export const plugin: Plugin = {
  onPromptBuilt({ prompt }) {
    // Append extra instructions to every agent prompt
    return prompt + "\nAlways use TypeScript strict mode.";
  },
  onLoopEnd({ iterationNum, log }) {
    log({ tags: ["info"], message: `Loop ended after ${iterationNum} iterations` });
  },
};
```
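The example above covers only `onPromptBuilt` and `onLoopEnd`. The sketch below combines `onValidationComplete` and `onRectify` to cap retries; note that the payload shape (`result.pass`) and the string values for `RectifyAction` are assumptions for illustration — consult the `Plugin` type in `jsr:@cdaringe/ralphmania` for the authoritative signatures.

```typescript
// Hypothetical retry-capping plugin: after three consecutive validation
// failures, ask the orchestrator to skip rather than re-run the agent.
let consecutiveFailures = 0;

export const plugin = {
  onValidationComplete({ result }: { result: { pass: boolean } }) {
    // Reset the counter on success; count consecutive failures otherwise
    consecutiveFailures = result.pass ? 0 : consecutiveFailures + 1;
  },
  onRectify(): "agent" | "skip" {
    return consecutiveFailures >= 3 ? "skip" : "agent";
  },
};
```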

## Local models

For Ollama, set `customModel` via `onSessionConfigBuilt`. Ralphmania queries Ollama's `/api/show` endpoint to discover the model's context window and capabilities:

```ts
export const plugin: Plugin = {
  onSessionConfigBuilt: ({ config }) => ({
    ...config,
    customModel: { kind: "ollama", model: "gemma4:e2b" },
  }),
};
```

For other OpenAI-compatible servers (llama.cpp, vLLM, LM Studio), provide `contextWindow` explicitly, since those servers expose no discovery API:

```ts
export const plugin: Plugin = {
  onSessionConfigBuilt: ({ config }) => ({
    ...config,
    customModel: {
      kind: "openai-compatible",
      baseUrl: "http://localhost:8080/v1",
      model: "my-model",
      contextWindow: 8192,
    },
  }),
};
```

## Progress Statuses

Each scenario in `progress.md` carries one of five statuses:

| Status | Meaning |
| --- | --- |
| `WIP` | Work in progress; the agent has not yet submitted |
| `WORK_COMPLETE` | Agent submitted a result, awaiting validation by the orchestrator |
| `VERIFIED` | Validation passed; the scenario is done |
| `NEEDS_REWORK` | Validation failed; the scenario will be retried in the next iteration |
| `OBSOLETE` | The scenario was removed from `specification.md` |
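The lifecycle above can be sketched as a pair of transition helpers. This is a hypothetical illustration of the flow the table describes — the orchestrator's real state machine lives inside ralphmania:

```typescript
type Status = "WIP" | "WORK_COMPLETE" | "VERIFIED" | "NEEDS_REWORK" | "OBSOLETE";

// After the orchestrator validates a WORK_COMPLETE scenario,
// it is either done or queued for rework.
function afterValidation(pass: boolean): Status {
  return pass ? "VERIFIED" : "NEEDS_REWORK";
}

// A NEEDS_REWORK scenario is picked up again on the next iteration
// (modeled here as a return to WIP); other statuses are unchanged.
function nextIteration(status: Status): Status {
  return status === "NEEDS_REWORK" ? "WIP" : status;
}
```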

## Environment Variables

| Variable | Purpose |
| --- | --- |
| `ANTHROPIC_API_KEY` | Required when using the `claude` agent |
| `RALPH_LOG` | Set to `debug` for verbose orchestrator logging |
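Both variables can be set inline for a single run (the API key value is a placeholder):

```shell
# Verbose orchestrator logging plus the key the claude agent requires
export ANTHROPIC_API_KEY="<your-api-key>"
RALPH_LOG=debug deno run -A jsr:@cdaringe/ralphmania --iterations 5
```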