ChasquiMQ · Redis 8.6 · Rust + Node + Python

The fastest open-source message broker for Redis.

Rust-native engine. Native Node.js and Python bindings. Built on Redis Streams with MessagePack on the wire and pipelined acks: the things naive queues leave on the table.

Pick your language

The engine is the same Rust binary in every package. Install, enqueue a job, run a worker.

All three quickstarts assume Redis 8.6 running on 127.0.0.1:6379:

docker run -d --name chasquimq-redis -p 6379:6379 redis:8.6

Requires Node.js 18+. In a fresh directory, save the snippet as worker.ts, then:

npm init -y && npm pkg set type=module
npm install chasquimq tsx
npx tsx worker.ts
worker.ts
import { Queue, Worker } from "chasquimq"
const connection = { host: "127.0.0.1", port: 6379 }
const queue = new Queue("emails", { connection })
const worker = new Worker(
  "emails",
  async (job) => {
    console.log(`[worker] sending email to ${job.data.to}`)
    return { delivered: true }
  },
  { connection, storeResults: true },
)
const job = await queue.add("welcome", { to: "ada@example.com" })
console.log(`[producer] enqueued job ${job.id}`)
await job.waitForResult({ timeoutMs: 30_000 })
console.log("🎉 your first ChasquiMQ job ran end-to-end")
await worker.close()
await queue.close()

Need a CLI? cargo install chasquimq-cli gets you chasqui inspect, chasqui watch, and chasqui dlq replay.

Why ChasquiMQ

Three things justify a new queue: performance, operability, and the option to write your handler in the language your team already ships.

Rust-native throughput

188,775 jobs/s sustained on a single M3 host. MessagePack on the wire. No Lua scripts. No JSON on the hot path. Pipelined XACK and XACKDEL close the round-trip tax that breaks naive Streams consumers.

See the numbers
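The "round-trip tax" claim comes down to simple arithmetic, sketched below. The function names and the RTT figure are illustrative assumptions, not numbers from the benchmark.

```typescript
// Illustrative sketch: why pipelined acks matter. Acking one job per XACK
// pays one network round trip per job; a pipelined batch pays one per batch.
function roundTrips(jobs: number, batchSize: number): number {
  return Math.ceil(jobs / batchSize)
}

// Total time spent waiting on ack round trips, with RTT in microseconds.
function ackOverheadUs(jobs: number, batchSize: number, rttUs: number): number {
  return roundTrips(jobs, batchSize) * rttUs
}

// 10,000 jobs at an assumed 100 µs loopback RTT:
console.log(ackOverheadUs(10_000, 1, 100))   // 1000000: a full second of RTT
console.log(ackOverheadUs(10_000, 256, 100)) // 4000: 4 ms when batched
```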

Operator-grade reliability

Per-job retry budgets. Exponential backoff with jitter. DLQ replay tooling baked in. Repeatable jobs with cron and every patterns: DST-aware, with a MissedFiresPolicy for clock skips.

Job lifecycle
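The retry math above can be sketched in a few lines. This is an illustrative model, not ChasquiMQ's actual scheduler: the function names, defaults, and "full jitter" formula are assumptions chosen to show the idea.

```typescript
// Illustrative sketch of exponential backoff with jitter (not ChasquiMQ's
// real implementation): the delay grows as base * 2^attempt, is capped,
// then a uniform random jitter spreads retries so failed jobs don't stampede.
function backoffDelayMs(
  attempt: number,                     // 0-based retry attempt
  baseMs = 1_000,                      // first retry window: ~1 s
  capMs = 60_000,                      // never wait longer than 60 s
  rng: () => number = Math.random,     // injectable for testing
): number {
  const exp = Math.min(capMs, baseMs * 2 ** attempt)
  return rng() * exp // "full jitter": uniform in [0, exp)
}

// A job that exhausts its per-job retry budget is dead-lettered instead:
function nextStep(attemptsMade: number, budget: number): "retry" | "dlq" {
  return attemptsMade < budget ? "retry" : "dlq"
}
```

With `rng` pinned to `() => 1` the delay ceilings come out as 1 s, 2 s, 4 s, 8 s, and so on up to the 60 s cap, which makes the growth curve easy to unit-test.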

Multi-language at zero cost

Native bindings via napi-rs (Node) and PyO3 (Python). The engine is the same Rust binary in every package: no protocol bridge, no sidecar process. Pick your language; pay no performance tax for it.
Requires Node 18+, Python 3.9+, or Rust 1.85+.

API reference

Performance, measured

188,775 jobs/s sustained on a single M3 host with Redis 8.6, measured against the leading Node.js Redis queue on identical hardware. Quiet-host runs reproduce 419k jobs/s on the consumer hot path.

Sustained throughput: 188,775 jobs/s
Native languages: 3 (Rust + Node + Python)
Producer ratio (baseline 1.0): 3.47× sustained

Apple M3 · Redis 8.6.2 (loopback) · queue-add-bulk, 50 jobs · methodology & reproduction →

The docs, by intent

Four kinds of documentation, four jobs they do.

Architecture at a glance

Producers XADD onto a Redis Stream. Delayed jobs sit in a sorted set until a promoter moves them. Consumers XREADGROUP, dispatch to your handler, then XACKDEL in a single batched round trip. Failures retry with backoff; exhausted ones land in the DLQ.

Producer ── XADD ─────────────▶ stream:emails  ──▶ XREADGROUP ──▶ Consumer
 │                                                                   │
 └─ delayed: ZADD ──▶ zset:emails:z ──▶ promoter ──▶ XADD            │
                                                                     ▼
                                                               handler(job)
                                                                     │
                                                          success ◀──┴──▶ failure
                                                             │               │
                                                          XACKDEL       retry / DLQ
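The delayed-job branch of the diagram can be modeled without Redis at all. This is an in-memory sketch under stated assumptions: the array stands in for the sorted set, the key names mirror the diagram, and `addDelayed`/`promote` are hypothetical helpers, not ChasquiMQ's API.

```typescript
// Illustrative in-memory model of the delayed-job path. The real engine uses
// a Redis sorted set scored by run-at time; here an array kept sorted by
// score plays that role, and the promoter moves due jobs onto the stream.
type Delayed = { id: string; runAt: number }

const zset: Delayed[] = []   // stands in for zset:emails:z
const stream: string[] = []  // stands in for stream:emails

function addDelayed(id: string, runAt: number): void {
  zset.push({ id, runAt })
  zset.sort((a, b) => a.runAt - b.runAt) // keep ordered by score, like ZADD
}

// One promoter tick: pop every due job from the zset and "XADD" it.
function promote(now: number): number {
  let moved = 0
  while (zset.length > 0 && zset[0].runAt <= now) {
    stream.push(zset.shift()!.id)
    moved++
  }
  return moved
}

addDelayed("job-1", 100)
addDelayed("job-2", 500)
console.log(promote(250)) // 1: only job-1 is due at t=250
console.log(stream)       // [ 'job-1' ]
```

Jobs whose score is still in the future stay put; a later tick (here, `promote(600)`) picks them up, which is why delayed jobs never block the live stream.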

Ready?

Read the five-minute getting-started. Hello-world job in your terminal, running through Redis, before your coffee cools.

Get started →