The Tasked HTTP API lets you manage queues, submit DAG workflows, monitor flow progress, and acknowledge callback tasks.
Health check and Prometheus metrics
Create, list, get, and delete execution queues
Submit DAGs, monitor progress, cancel flows, stream events (SSE)
Acknowledge callback task completion
Create, list, update, and delete cron schedules
Upload, download, and list shared files between tasks
Request/response types and enums
Error codes and response format
Health and observability endpoints.
Returns server health status.
{ "status": "ok" }
Returns metrics in Prometheus text exposition format.
Queues define execution policies: concurrency limits, retry counts, backoff strategies, and rate limiting. All flows are submitted to a queue.
Creates a new execution queue. All config fields are optional and default to sensible values.
| Field | Type | Description |
|---|---|---|
| id (required) | string | Unique queue identifier |
| config | QueueConfig | Execution policy (see schema) |
curl -X POST http://localhost:8080/api/v1/queues \
  -H "Content-Type: application/json" \
  -d '{
    "id": "ci",
    "config": {
      "concurrency": 5,
      "max_retries": 2,
      "rate_limit": { "max_burst": 10, "per_second": 5.0 }
    }
  }'
{
"id": "ci",
"config": {
"concurrency": 5,
"max_retries": 2,
"timeout_secs": 300,
"backoff": { "Exponential": { "initial_delay_ms": 1000 } },
"rate_limit": { "max_burst": 10, "per_second": 5.0 }
},
"created_at": "2024-01-15T10:00:00Z",
"updated_at": "2024-01-15T10:00:00Z"
}
Returns all queues.
[
{
"id": "ci",
"config": { "concurrency": 5, ... },
"created_at": "2024-01-15T10:00:00Z",
"updated_at": "2024-01-15T10:00:00Z"
}
]
| Param | Type | Description |
|---|---|---|
| qid (required) | string | Queue ID |
Deletes a queue and all associated flows and tasks.
| Param | Type | Description |
|---|---|---|
| qid (required) | string | Queue ID |
A flow is a DAG of tasks submitted to a queue. The engine resolves topological ordering and executes tasks as their dependencies complete.
Submits a DAG of tasks for execution. Tasks with no depends_on run immediately. Circular dependencies are rejected.
| Param | Type | Description |
|---|---|---|
| qid (required) | string | Queue ID |
| Field | Type | Description |
|---|---|---|
| tasks (required) | TaskDef[] | Array of task definitions |
| webhooks | FlowWebhooks? | Optional webhook URLs: { on_complete, on_failure } |
| Field | Type | Description |
|---|---|---|
| id (required) | string | Unique task identifier within the flow |
| executor (required) | string | One of: shell, http, noop, callback |
| config | object | Executor-specific config (see below) |
| input | any | Arbitrary JSON passed to the executor |
| depends_on | string[] | IDs of tasks that must complete first |
| timeout_secs | integer | Override queue default timeout |
| retries | integer | Override queue default retry count |
| backoff | BackoffStrategy | Override queue default backoff |
POST /api/v1/queues/ci/flows
{
  "tasks": [
    { "id": "build", "executor": "shell", "config": { "command": "cargo build --release" } },
    { "id": "test", "executor": "shell", "config": { "command": "cargo test" }, "depends_on": ["build"] },
    { "id": "deploy", "executor": "http", "config": { "url": "https://deploy.example.com/api", "method": "POST" }, "depends_on": ["test"], "retries": 3 }
  ]
}
{
"id": "f_7k2m3n4p",
"queue_id": "ci",
"state": "running",
"task_count": 3,
"tasks_succeeded": 0,
"tasks_failed": 0,
"created_at": "2024-01-15T10:01:00Z",
"updated_at": "2024-01-15T10:01:00Z"
}
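The topological ordering the engine resolves can be illustrated with a small client-side sketch using Kahn's algorithm. This is a hypothetical helper, not part of the API; it mirrors the server-side validation that rejects circular or missing dependencies with invalid_graph.

```python
from collections import deque

def execution_order(tasks):
    """Return task IDs in a valid execution order, or raise on a bad graph.

    `tasks` is a list of dicts shaped like the TaskDef objects above:
    {"id": ..., "depends_on": [...]}.
    """
    deps = {t["id"]: set(t.get("depends_on", [])) for t in tasks}
    # Tasks with no depends_on are runnable immediately.
    ready = deque(tid for tid, d in deps.items() if not d)
    order = []
    while ready:
        tid = ready.popleft()
        order.append(tid)
        # A task becomes runnable once its last dependency completes.
        for other, d in deps.items():
            if tid in d:
                d.discard(tid)
                if not d:
                    ready.append(other)
    if len(order) != len(deps):
        raise ValueError("invalid_graph: cycle or missing dependency")
    return order
```

Running this against the build/test/deploy example above yields the order the engine would dispatch them in.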
| Param | Type | Description |
|---|---|---|
| qid (required) | string | Queue ID |
[
{
"id": "f_7k2m3n4p",
"queue_id": "ci",
"state": "succeeded",
"task_count": 3,
"tasks_succeeded": 3,
"tasks_failed": 0,
"created_at": "2024-01-15T10:01:00Z",
"updated_at": "2024-01-15T10:01:28Z"
}
]
Returns the flow and all its tasks with current state, output, and timing information.
| Param | Type | Description |
|---|---|---|
| fid (required) | string | Flow ID |
{
"id": "f_7k2m3n4p",
"queue_id": "ci",
"state": "running",
"task_count": 2,
"tasks_succeeded": 1,
"tasks_failed": 0,
"created_at": "2024-01-15T10:01:00Z",
"updated_at": "2024-01-15T10:01:03Z",
"tasks": [
{
"id": "build",
"flow_id": "f_7k2m3n4p",
"queue_id": "ci",
"state": "succeeded",
"executor_type": "shell",
"input": null,
"output": { "exit_code": 0, "stdout": "", "stderr": "" },
"error": null,
"retries_remaining": 3,
"timeout_secs": 300,
"started_at": "2024-01-15T10:01:01Z",
"completed_at": "2024-01-15T10:01:02Z",
"created_at": "2024-01-15T10:01:00Z"
},
{
"id": "test",
"flow_id": "f_7k2m3n4p",
"queue_id": "ci",
"state": "running",
"executor_type": "shell",
"input": null,
"output": null,
"error": null,
"retries_remaining": 3,
"timeout_secs": 300,
"started_at": "2024-01-15T10:01:03Z",
"completed_at": null,
"created_at": "2024-01-15T10:01:00Z"
}
]
}
Cancels a running flow and all its pending/running tasks. Already-completed tasks are unaffected.
| Param | Type | Description |
|---|---|---|
| fid (required) | string | Flow ID |
Server-Sent Events stream of task state changes and flow completion. Connect with any SSE client (EventSource in JavaScript, curl with streaming).
| Param | Type | Description |
|---|---|---|
| fid (required) | string | Flow ID |
| Event | Description | Data Fields |
|---|---|---|
| task_state | Emitted when a task changes state | task_id, state, output, error, started_at, completed_at |
| task_output | Emitted when a running task's output updates (streaming) | task_id, output |
| flow_complete | Emitted when flow reaches terminal state. Stream closes after this event. | flow_id, state, task_count, tasks_succeeded, tasks_failed |
curl -N http://localhost:8080/api/v1/flows/f_7k2m/events
const es = new EventSource('/api/v1/flows/f_7k2m/events');
es.addEventListener('task_state', (e) => console.log(JSON.parse(e.data)));
es.addEventListener('flow_complete', (e) => {
  console.log(JSON.parse(e.data));
  es.close();
});
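For clients without an SSE library, each frame on the wire is plain text of the form `event: ...` / `data: ...` separated by blank lines. A minimal frame parser might look like this (a sketch of the standard SSE wire format, not an official client):

```python
import json

def parse_sse_frame(frame):
    """Parse one SSE frame into (event_name, decoded_data).

    Multiple data: lines in one frame are joined with newlines,
    per the SSE wire format.
    """
    event, data_lines = "message", []
    for line in frame.splitlines():
        if line.startswith("event:"):
            event = line[len("event:"):].strip()
        elif line.startswith("data:"):
            data_lines.append(line[len("data:"):].strip())
    data = json.loads("\n".join(data_lines)) if data_lines else None
    return event, data
```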
The callback executor pauses a task in the running state until an external system acknowledges it via this endpoint.
Reports the outcome of a callback task. On failure, set retryable: true to allow the engine to retry.
| Param | Type | Description |
|---|---|---|
| fid (required) | string | Flow ID |
| tid (required) | string | Task ID |
| Field | Type | Description |
|---|---|---|
| status (required) | string | "success" or "failed" |
| output | any | JSON output (for success) |
| error | string | Error message (for failed) |
| retryable | boolean | Allow engine to retry. Default: false |
POST /api/v1/flows/f_7k2m/tasks/process/ack
{ "status": "success", "output": { "records": 1042 } }
POST /api/v1/flows/f_7k2m/tasks/process/ack
{ "status": "failed", "error": "Upstream returned 503", "retryable": true }
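A small client-side helper for building valid ack bodies (hypothetical; it simply enforces the field rules above, including the "success"/"failed" restriction the server checks with invalid_status):

```python
def ack_payload(status, output=None, error=None, retryable=False):
    """Build a callback-ack request body per the fields documented above."""
    if status not in ("success", "failed"):
        raise ValueError('invalid_status: must be "success" or "failed"')
    body = {"status": status}
    if status == "success" and output is not None:
        body["output"] = output  # arbitrary JSON output for success
    if status == "failed":
        if error is not None:
            body["error"] = error  # human-readable failure reason
        if retryable:
            body["retryable"] = True  # allow the engine to retry
    return body
```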
Schedules attach cron expressions to flow definitions, automatically submitting flows on a recurring basis.
Creates a new cron schedule. The engine will submit the provided flow definition each time the cron expression fires.
| Param | Type | Description |
|---|---|---|
| qid (required) | string | Queue ID |
| Field | Type | Description |
|---|---|---|
| cron (required) | string | Cron expression (5-field standard or 7-field with seconds) |
| flow (required) | FlowDef | Flow definition to submit on each trigger |
POST /api/v1/queues/ops/schedules
{
  "cron": "0 0 * * *",
  "flow": {
    "tasks": [
      { "id": "backup", "executor": "shell", "config": { "command": "./backup.sh" } }
    ]
  }
}
{
"id": "s_9x4k7m",
"queue_id": "ops",
"cron": "0 0 * * *",
"created_at": "2024-01-15T10:00:00Z",
"updated_at": "2024-01-15T10:00:00Z"
}
Returns all schedules associated with a queue.
| Param | Type | Description |
|---|---|---|
| qid (required) | string | Queue ID |
[
{
"id": "s_9x4k7m",
"queue_id": "ops",
"cron": "0 0 * * *",
"last_fired_at": "2024-01-15T00:00:00Z",
"created_at": "2024-01-14T10:00:00Z",
"updated_at": "2024-01-15T00:00:00Z"
}
]
Returns the full schedule including the embedded flow definition.
| Param | Type | Description |
|---|---|---|
| sid (required) | string | Schedule ID |
{
"id": "s_9x4k7m",
"queue_id": "ops",
"cron": "0 0 * * *",
"flow": {
"tasks": [
{
"id": "backup",
"executor": "shell",
"config": { "command": "./backup.sh" }
}
]
},
"last_fired_at": "2024-01-15T00:00:00Z",
"created_at": "2024-01-14T10:00:00Z",
"updated_at": "2024-01-15T00:00:00Z"
}
Replaces the cron expression and/or flow definition for an existing schedule.
| Param | Type | Description |
|---|---|---|
| sid (required) | string | Schedule ID |
| Field | Type | Description |
|---|---|---|
| cron (required) | string | New cron expression |
| flow (required) | FlowDef | New flow definition |
Permanently deletes a schedule. Future cron triggers for this schedule will no longer fire.
| Param | Type | Description |
|---|---|---|
| sid (required) | string | Schedule ID |
Upload, download, and list shared files between tasks in a flow. Shell tasks access artifacts via the $TASKED_ARTIFACTS directory; container tasks use these HTTP endpoints via $TASKED_ARTIFACT_URL. Artifacts are cleaned up when the flow completes.
Uploads a file as a named artifact for the flow. The request body is the raw file content. Overwrites any existing artifact with the same name.
| Param | Type | Description |
|---|---|---|
| fid (required) | string | Flow ID |
| name (required) | string | Artifact name (filename) |
Raw file content (any content type).
curl -X PUT http://localhost:8080/api/v1/flows/f_7k2m3n4p/artifacts/myapp \
--data-binary @target/release/myapp
Downloads a named artifact from the flow. Returns the raw file content.
| Param | Type | Description |
|---|---|---|
| fid (required) | string | Flow ID |
| name (required) | string | Artifact name (filename) |
curl http://localhost:8080/api/v1/flows/f_7k2m3n4p/artifacts/myapp -o myapp
Lists all artifacts stored for the flow.
| Param | Type | Description |
|---|---|---|
| fid (required) | string | Flow ID |
[
{
"name": "myapp",
"size": 4523008,
"created_at": "2024-01-15T10:05:00Z"
},
{
"name": "report.csv",
"size": 2048,
"created_at": "2024-01-15T10:06:00Z"
}
]
Reference for shared types used across the API.
| Field | Type | Default | Description |
|---|---|---|---|
| concurrency | integer | 10 | Max concurrent running tasks |
| max_retries | integer | 3 | Default retry count for tasks |
| timeout_secs | integer | 300 | Default task timeout in seconds |
| backoff | BackoffStrategy | Exponential 1000ms | Default retry backoff strategy |
| rate_limit | RateLimitConfig? | null | Optional token-bucket rate limit |
| retention_secs | integer? | null | Auto-delete terminal flows older than this (seconds) |
// Fixed delay
{ "Fixed": { "delay_ms": 1000 } }

// Exponential (doubles each retry)
{ "Exponential": { "initial_delay_ms": 1000 } }

// Exponential with jitter
{ "ExponentialJitter": { "initial_delay_ms": 1000 } }
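To illustrate how these strategies space retries, here is a sketch assuming Exponential doubles the previous delay on each attempt (as noted above) and that ExponentialJitter picks a uniform value up to the exponential delay; the jitter distribution is an assumption, not something the API specifies.

```python
import random

def retry_delay_ms(strategy, attempt):
    """Delay before retry number `attempt` (1-based) for a
    BackoffStrategy JSON value shaped like the examples above."""
    if "Fixed" in strategy:
        return strategy["Fixed"]["delay_ms"]
    if "Exponential" in strategy:
        # initial_delay_ms, doubled for each subsequent retry
        return strategy["Exponential"]["initial_delay_ms"] * 2 ** (attempt - 1)
    if "ExponentialJitter" in strategy:
        # Assumed: uniform jitter in [0, exponential delay]
        base = strategy["ExponentialJitter"]["initial_delay_ms"] * 2 ** (attempt - 1)
        return random.uniform(0, base)
    raise ValueError("unknown backoff strategy")
```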
| Field | Type | Description |
|---|---|---|
| max_burst (required) | integer | Bucket capacity (burst size) |
| per_second (required) | number | Sustained token refill rate |
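The rate limit behaves like a standard token bucket: bursts of up to max_burst tasks, then a sustained per_second rate. A minimal model of that behavior (a sketch, not the engine's implementation):

```python
class TokenBucket:
    """Token bucket with capacity `max_burst`, refilled at
    `per_second` tokens per second. A task starts only if a
    whole token is available."""

    def __init__(self, max_burst, per_second):
        self.capacity = max_burst
        self.rate = per_second
        self.tokens = float(max_burst)  # start full: allows an initial burst
        self.last = 0.0

    def try_acquire(self, now):
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False
```

With the example config above ({ "max_burst": 10, "per_second": 5.0 }), ten tasks may start at once, after which roughly one task becomes eligible every 200 ms.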
| Field | Type | Description |
|---|---|---|
| on_complete | string? | URL to POST when the flow succeeds |
| on_failure | string? | URL to POST when the flow fails |
The config field on TaskDef varies by executor type.

| Executor | Config Fields |
|---|---|
| shell | command (string): shell command to execute. Returns { exit_code, stdout, stderr } |
| http | url, method, headers (object), body (string). Returns { status, body } |
| noop | No config. Completes immediately. |
| callback | No config. Waits for POST ack call. |
All errors return a JSON object with a machine-readable error code and a human-readable message.
{
"error": "queue_not_found",
"message": "Queue 'my-queue' not found"
}
| Code | HTTP | Description |
|---|---|---|
| queue_not_found | 404 | Queue does not exist |
| queue_already_exists | 409 | Queue ID is already taken |
| flow_not_found | 404 | Flow does not exist |
| task_not_found | 404 | Task does not exist in the specified flow |
| no_executor | 400 | Unknown executor type |
| invalid_graph | 400 | DAG validation failed (cycles, missing deps) |
| invalid_status | 400 | Ack status must be "success" or "failed" |
| invalid_state_transition | 409 | Task/flow not in a valid state for the operation |
| schedule_not_found | 404 | Schedule does not exist |
| invalid_cron | 400 | Cron expression could not be parsed |
| internal_error | 500 | Unexpected server error |
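Clients can map this envelope onto exceptions; the helper below is a minimal sketch (the TaskedError class and raise_for_error function are hypothetical, not part of any official client):

```python
class TaskedError(Exception):
    """API error carrying the machine-readable code and HTTP status."""

    def __init__(self, code, message, http_status):
        super().__init__(message)
        self.code = code
        self.http_status = http_status

def raise_for_error(http_status, body):
    """Return the decoded body on 2xx; otherwise raise TaskedError
    built from the error envelope shown above."""
    if 200 <= http_status < 300:
        return body
    raise TaskedError(
        body.get("error", "internal_error"),
        body.get("message", ""),
        http_status,
    )
```

This lets callers branch on e.code (e.g. retry on invalid_state_transition, surface invalid_graph to the user) instead of string-matching messages.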