A single-binary DAG task execution engine. Durable workflows with SQLite storage, zero external dependencies, and native AI agent support.
No Redis. No Postgres. No Kubernetes. Just a binary and a task definition.
Define task dependencies as a directed acyclic graph. Tasked resolves execution order, runs independent tasks in parallel, and cascades failures automatically.
Every state change is persisted to SQLite. Crash mid-flow? Restart the binary and it picks up exactly where it left off. Configurable retries with exponential backoff.
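Retry behavior is declared per task in the flow definition. A minimal sketch of a task with a retry policy — the `retries` field appears in the flow examples on this page, while the `backoff` object and its field names (`initial_ms`, `multiplier`, `max_ms`) are illustrative assumptions, not confirmed configuration keys:

```json
{
  "id": "flaky-fetch",
  "executor": "http",
  "config": { "url": "https://api.example.com/status", "method": "GET" },
  "retries": 3,
  "backoff": { "initial_ms": 500, "multiplier": 2, "max_ms": 8000 }
}
```

With these assumed settings, retry attempts would be spaced roughly 500ms, 1s, and 2s apart, with the delay capped at 8s.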
One binary. One SQLite file. No Redis, no Postgres, no message broker. Download, run, done. Works in air-gapped environments.
First-class MCP server mode with Tasks protocol support. Agents get async task handles — submit a workflow, continue working, poll for results. No blocking, no callbacks.
Per-queue concurrency limits and token-bucket rate limiting. Run 50 API calls at 10/sec with max 5 concurrent — configured in one line.
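A sketch of what that one-line queue configuration might look like — `rate_limit.per_second` appears elsewhere on this page, while the `queue` and `concurrency` key names are assumptions for illustration:

```json
{ "queue": "api-calls", "concurrency": 5, "rate_limit": { "per_second": 10 } }
```

This would allow at most 5 tasks in flight at once, admitted at no more than 10 per second.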
Prometheus metrics, structured logging, health endpoints, and API key authentication. Everything you need to run in production without bolting on extras.
{
  "tasks": [
    {
      "id": "build",
      "executor": "shell",
      "config": { "command": "cargo build --release" }
    },
    {
      "id": "test-unit",
      "executor": "shell",
      "config": { "command": "cargo test --lib" },
      "depends_on": ["build"]
    },
    {
      "id": "test-integ",
      "executor": "shell",
      "config": { "command": "cargo test --test '*'" },
      "depends_on": ["build"]
    },
    {
      "id": "deploy",
      "executor": "http",
      "config": {
        "url": "https://deploy.example.com/api",
        "method": "POST"
      },
      "depends_on": ["test-unit", "test-integ"],
      "retries": 3
    }
  ]
}
$ tasked run flow.json --queue ci
▸ Flow f_7k2m submitted to queue "ci"
▸ Resolving DAG: 4 tasks, 4 edges
✓ [build] succeeded 12.4s
✓ [test-unit] succeeded 4.2s
✓ [test-integ] succeeded 8.1s
✓ [deploy] succeeded 3.8s
✓ Flow f_7k2m complete 4/4 succeeded (24.3s)
Write a JSON flow definition. Declare tasks, executors, dependencies, retry policies, and timeouts.
Run from the CLI, submit via REST API, or let an AI agent submit through MCP. Tasked validates the DAG and starts execution immediately.
Poll the API, stream events, or let your agent check back later. Every state transition is persisted — nothing gets lost.
| Feature | Tasked | Temporal | Inngest | Hatchet | Dagu |
|---|---|---|---|---|---|
| Single binary | ✓ | ✕ | ✕ | ✕ | ✓ |
| Zero infrastructure | ✓ | ✕ | ✕ | ✕ | ✓ |
| DAG workflows | ✓ | ✓ | ✓ | ✓ | ✓ |
| Durable execution | ✓ | ✓ | ✓ | ✓ | ✕ |
| MCP server mode | ✓ | ✕ | ✕ | ✕ | ✕ |
| Rate limiting | ✓ | ✕ | ✓ | ✓ | ✕ |
| Self-hosted / air-gapped | ✓ | ✓ | ✕ | ✓ | ✓ |
| Storage | SQLite | Postgres + Elasticsearch | Cloud | Postgres + RabbitMQ | YAML files |
| Binary size | 9.8MB | ~200MB+ | N/A | ~100MB+ | ~20MB |
| License | MIT | MIT | Source-avail | MIT | AGPL-3.0 |
Independent test suites run in parallel. Failed steps retry automatically. No YAML circus.
tasked run ci-pipeline.json
Process hundreds of API calls with built-in rate limiting and concurrency control.
"rate_limit": { "per_second": 10 }
Chain extraction, transformation, and loading steps. Each step persists its output. Resume from any failure point.
tasked run etl-pipeline.json
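Using the flow schema shown in the CI example above (`id`, `executor`, `config`, `depends_on`, `retries`), a three-stage ETL flow might look like the following sketch — the script names and paths are placeholders:

```json
{
  "tasks": [
    { "id": "extract", "executor": "shell",
      "config": { "command": "python extract.py > /tmp/raw.json" } },
    { "id": "transform", "executor": "shell",
      "config": { "command": "python transform.py < /tmp/raw.json > /tmp/clean.json" },
      "depends_on": ["extract"] },
    { "id": "load", "executor": "shell",
      "config": { "command": "python load.py < /tmp/clean.json" },
      "depends_on": ["transform"], "retries": 3 }
  ]
}
```

Because each step writes its output before the next step starts, a crash during `load` resumes at `load` without re-running `extract` or `transform`.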
Give your AI agent a tool for long-running work. Submit a pipeline, continue coding, check back when done.
tasked_submit_flow → tasked_flow_status
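Over MCP, tool invocations use the standard `tools/call` request. A sketch of how an agent might submit a flow — the `tasked_submit_flow` tool name comes from this page, but the argument shape (`flow`, `queue`) is an assumption:

```json
{
  "jsonrpc": "2.0",
  "id": 7,
  "method": "tools/call",
  "params": {
    "name": "tasked_submit_flow",
    "arguments": {
      "queue": "agent-work",
      "flow": {
        "tasks": [
          { "id": "report", "executor": "shell",
            "config": { "command": "make report" } }
        ]
      }
    }
  }
}
```

The result would carry a flow handle the agent can later pass to `tasked_flow_status`.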
# 1. Install via Homebrew
$ brew install bradleydwyer/tap/tasked
# 2. Define a workflow
$ cat > flow.json << 'EOF'
{
  "tasks": [
    { "id": "hello", "executor": "shell",
      "config": { "command": "echo 'Hello from Tasked!'" } },
    { "id": "world", "executor": "shell",
      "config": { "command": "echo 'DAG execution works!'" },
      "depends_on": ["hello"] }
  ]
}
EOF
# 3. Run it
$ tasked run flow.json
✓ Flow complete 2/2 tasks succeeded
# 1. Build from source
$ git clone https://github.com/bradleydwyer/tasked
$ cd tasked && cargo build --release
# 2. Define a workflow
$ cat > flow.json << 'EOF'
{
  "tasks": [
    { "id": "hello", "executor": "shell",
      "config": { "command": "echo 'Hello from Tasked!'" } },
    { "id": "world", "executor": "shell",
      "config": { "command": "echo 'DAG execution works!'" },
      "depends_on": ["hello"] }
  ]
}
EOF
# 3. Run it
$ ./target/release/tasked run flow.json
✓ Flow complete 2/2 tasks succeeded