Run workflows,
not infrastructure

A single-binary DAG task execution engine. Durable workflows with SQLite storage, zero external dependencies, and native AI agent support.

9.8MB binary · Zero dependencies · MIT License · Built with Rust
tasked
$ tasked run pipeline.json
Flow f_7k2m submitted (4 tasks)
[build] running...
[build] succeeded 12.4s
[test-unit] running...
[test-integ] running...
[test-unit] succeeded 4.2s
[test-integ] succeeded 8.1s
[deploy] running...
[deploy] succeeded 3.8s
Flow complete 4/4 tasks succeeded (24.3s)

Everything you need.
Nothing you don't.

No Redis. No Postgres. No Kubernetes. Just a binary and a task definition.


DAG Workflows

Define task dependencies as a directed acyclic graph. Tasked resolves execution order, runs independent tasks in parallel, and automatically skips downstream tasks when a dependency fails.
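For example, a fan-out where two tasks both depend on one parent needs nothing beyond "depends_on" (same fields as the full example further down the page; task IDs and commands here are illustrative):

```json
{
  "tasks": [
    { "id": "fetch", "executor": "shell", "config": { "command": "make fetch" } },
    { "id": "lint",  "executor": "shell", "config": { "command": "make lint" },
      "depends_on": ["fetch"] },
    { "id": "test",  "executor": "shell", "config": { "command": "make test" },
      "depends_on": ["fetch"] }
  ]
}
```

"lint" and "test" start in parallel as soon as "fetch" succeeds.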

Durable Execution

Every state change is persisted to SQLite. Crash mid-flow? Restart the binary and it picks up exactly where it left off. Configurable retries with exponential backoff.
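Retries are declared per task with the "retries" field, as in the deploy task from the example later on this page (the count shown is illustrative):

```json
{
  "id": "deploy",
  "executor": "http",
  "config": { "url": "https://deploy.example.com/api", "method": "POST" },
  "retries": 3
}
```

On failure, the task is re-run with exponential backoff between attempts, and each attempt's state change lands in SQLite before the next one starts.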

Zero Infrastructure

One binary. One SQLite file. No Redis, no Postgres, no message broker. Download, run, done. Works in air-gapped environments.

Agent-Native (MCP)

First-class MCP server mode with Tasks protocol support. Agents get async task handles — submit a workflow, continue working, poll for results. No blocking, no callbacks.
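A typical agent session, sketched with the tool names from the use-case section below — the argument shapes shown here are assumptions, not the exact MCP schema:

```json
{ "tool": "tasked_submit_flow", "arguments": { "flow": { "tasks": [] } } }
{ "tool": "tasked_flow_status", "arguments": { "flow_id": "f_7k2m" } }
```

The first call returns a flow handle immediately; the agent polls the second whenever convenient.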

Concurrent Dispatch

Per-queue concurrency limits and token-bucket rate limiting. Run 50 API calls at 10/sec with max 5 concurrent — configured in one line.
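A sketch of that configuration — "rate_limit" with "per_second" appears elsewhere on this page, while the "max_concurrent" field name and its queue-level placement are assumptions:

```json
{
  "queue": "api-calls",
  "rate_limit": { "per_second": 10 },
  "max_concurrent": 5
}
```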

Production Ready

Prometheus metrics, structured logging, health endpoints, and API key authentication. Everything you need to run in production without bolting on extras.

Define. Run. Done.

flow.json
{
  "tasks": [
    {
      "id": "build",
      "executor": "shell",
      "config": { "command": "cargo build --release" }
    },
    {
      "id": "test-unit",
      "executor": "shell",
      "config": { "command": "cargo test --lib" },
      "depends_on": ["build"]
    },
    {
      "id": "test-integ",
      "executor": "shell",
      "config": { "command": "cargo test --test '*'" },
      "depends_on": ["build"]
    },
    {
      "id": "deploy",
      "executor": "http",
      "config": {
        "url": "https://deploy.example.com/api",
        "method": "POST"
      },
      "depends_on": ["test-unit", "test-integ"],
      "retries": 3
    }
  ]
}
terminal
$ tasked run flow.json --queue ci
 Flow f_7k2m submitted to queue "ci"
 Resolving DAG: 4 tasks, 4 edges

 [build]      succeeded  12.4s
 [test-unit]  succeeded   4.2s
 [test-integ] succeeded   8.1s
 [deploy]     succeeded   3.8s

 Flow f_7k2m complete 4/4 succeeded (24.3s)

Three steps. Zero configuration.

01

Define

Write a JSON flow definition. Declare tasks, executors, dependencies, retry policies, and timeouts.

{ "tasks": [ { ..., "depends_on": [...] } ] }
02

Submit

Run from the CLI, submit via REST API, or let an AI agent submit through MCP. Tasked validates the DAG and starts execution immediately.

$ tasked run flow.json
03

Monitor

Poll the API, stream events, or let your agent check back later. Every state transition is persisted — nothing gets lost.

GET /api/v1/flows/{id}
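The status endpoint returns the persisted state of the flow and each task. A hypothetical response body (field names are illustrative, not the documented schema):

```json
{
  "id": "f_7k2m",
  "status": "running",
  "tasks": [
    { "id": "build",     "status": "succeeded", "duration_ms": 12400 },
    { "id": "test-unit", "status": "running" },
    { "id": "deploy",    "status": "pending" }
  ]
}
```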

How Tasked compares

Feature                    Tasked    Temporal      Inngest        Hatchet       Dagu
Single binary
Zero infrastructure
DAG workflows
Durable execution
MCP server mode
Rate limiting
Self-hosted / air-gapped
Storage                    SQLite    Postgres+ES   Cloud          PG+RabbitMQ   YAML files
Binary size                9.8MB     ~200MB+       N/A            ~100MB+       ~20MB
License                    MIT       MIT           Source-avail   MIT           AGPL-3.0

Built for real work

CI/CD

Build, Test, Deploy

Independent test suites run in parallel. Failed steps retry automatically. No YAML circus.

tasked run ci-pipeline.json
API

Batch API Calls

Process hundreds of API calls with built-in rate limiting and concurrency control.

"rate_limit": { "per_second": 10 }
DATA

Data Processing

Chain extraction, transformation, and loading steps. Each step persists its output. Resume from any failure point.
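A linear ETL chain in the same flow format shown above — the script names are placeholders:

```json
{
  "tasks": [
    { "id": "extract",   "executor": "shell",
      "config": { "command": "python extract.py" } },
    { "id": "transform", "executor": "shell",
      "config": { "command": "python transform.py" },
      "depends_on": ["extract"] },
    { "id": "load",      "executor": "shell",
      "config": { "command": "python load.py" },
      "depends_on": ["transform"] }
  ]
}
```

If "load" fails, rerunning the flow resumes there: the earlier steps' state is already persisted.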

tasked run etl-pipeline.json
AGENT

Agent Workflows

Give your AI agent a tool for long-running work. Submit a pipeline, continue coding, check back when done.

tasked_submit_flow → tasked_flow_status

Up and running in 30 seconds

quickstart
# 1. Install via Homebrew
$ brew install bradleydwyer/tap/tasked

# 2. Define a workflow
$ cat > flow.json << 'EOF'
{
  "tasks": [
    { "id": "hello", "executor": "shell",
      "config": { "command": "echo 'Hello from Tasked!'" } },
    { "id": "world", "executor": "shell",
      "config": { "command": "echo 'DAG execution works!'" },
      "depends_on": ["hello"] }
  ]
}
EOF

# 3. Run it
$ tasked run flow.json
 Flow complete 2/2 tasks succeeded
# 1. Build from source
$ git clone https://github.com/bradleydwyer/tasked
$ cd tasked && cargo build --release

# 2. Define a workflow
$ cat > flow.json << 'EOF'
{
  "tasks": [
    { "id": "hello", "executor": "shell",
      "config": { "command": "echo 'Hello from Tasked!'" } },
    { "id": "world", "executor": "shell",
      "config": { "command": "echo 'DAG execution works!'" },
      "depends_on": ["hello"] }
  ]
}
EOF

# 3. Run it
$ ./target/release/tasked-server run flow.json
 Flow complete 2/2 tasks succeeded