CLI

Command-line interface reference for Tasked.

tasked-server serve

Start the HTTP server. The server runs the DAG execution engine, exposes the REST API, and serves Prometheus metrics.

| Flag | Default | Description |
|---|---|---|
| `--data-dir` | `tasked-data` | Data directory for SQLite databases |
| `--port` | `8080` | Port to listen on |
| `--host` | `0.0.0.0` | Host to bind to |
| `--auth-mode <mode>` | `none` | Authentication mode: `none` or `api-key` |
| `--api-key <key>` | - | API key for `api-key` auth mode (also reads the `TASKED_API_KEY` env var) |
| `--metrics-push-url <url>` | - | Push Prometheus metrics to this URL every 30s |

# Start with defaults
tasked-server serve

# Custom port and data directory
tasked-server serve --port 3000 --data-dir /var/lib/tasked

# Bind to localhost only
tasked-server serve --host 127.0.0.1 --port 8080

# Enable API key authentication
tasked-server serve --auth-mode api-key --api-key my-secret-key

# Or use the environment variable
TASKED_API_KEY=my-secret-key tasked-server serve --auth-mode api-key

# Push metrics to a remote endpoint
tasked-server serve --metrics-push-url https://metrics.example.com/push

tasked-server run

Execute a flow from a JSON file and exit. This is the quickest way to run a workflow without starting the server. Also available as tasked run via the Homebrew install.

| Argument / Flag | Default | Description |
|---|---|---|
| `file` | required | Path to a flow definition JSON file |
| `--queue` | `default` | Queue to submit the flow to (created if it doesn't exist) |
| `--db` | `:memory:` | SQLite database path (use `:memory:` for in-memory) |
| `--auto-approve` | - | Skip all approval tasks without prompting (useful for CI/CD) |
| `--output <file>` / `-o` | - | Write task outputs as JSON on completion; use `-` for stdout |

Exit codes: 0 on success (all tasks succeeded), 1 on failure (any task failed or flow was cancelled).

# Run a flow file
tasked run flow.json

# Run with a named queue
tasked-server run flow.json --queue builds

# Persist results to a database
tasked-server run flow.json --db results.db

# Auto-approve all approval tasks
tasked run flow.json --auto-approve

# Write outputs to a file
tasked run flow.json --output results.json

# Write outputs to stdout
tasked run flow.json -o -

tasked-server mcp

Start an MCP (Model Context Protocol) server on stdio. This allows AI agents like Claude to use Tasked as a tool for managing DAG workflows via JSON-RPC 2.0.

| Flag | Default | Description |
|---|---|---|
| `--data-dir` | `tasked-data` | Data directory for SQLite databases |

# Start MCP server
tasked-server mcp --data-dir tasked-data

See the MCP Server guide for integration details and tool reference.

Environment Variables

Log verbosity is controlled via the RUST_LOG environment variable. Tasked uses the tracing EnvFilter syntax.

| Mode | Default Level | Notes |
|---|---|---|
| `serve` | `tasked_server=info,tasked=info,tower_http=info` | Verbose by default for request tracing |
| `run` | `warn` | Quiet by default for clean CLI output |
| `mcp` | `warn` | Logs go to stderr so stdout stays clean for JSON-RPC |

# Enable debug logging
RUST_LOG=debug tasked-server serve

# Trace-level logging for the engine only
RUST_LOG=tasked=trace tasked-server serve

# Debug a run command
RUST_LOG=debug tasked run flow.json