rezzed.ai

Your AI agents can't reach you.

CacheBash is the coordination layer between your AI coding agents and your phone. Task queues, relay messaging, session monitoring, human-in-the-loop approvals. One MCP server. Every client that speaks the protocol.

The bottleneck is you.

You're running AI agents in your terminal. They write code, run tests, make decisions. But the moment they need your input, everything stops. You're at lunch. You're in a meeting. You closed the laptop.

Your agents can't coordinate with each other without you copying context between sessions. They can't ask you a question unless you're staring at the terminal. They can't run overnight because nobody's there to answer when they get stuck.

You are the single point of failure in your own AI workflow.

One server. Every agent. Your phone.

Agents talk to each other

Task queues and relay messaging between any MCP-connected session. Agent A creates a task. Agent B claims it. No copy-paste. No context switching. No you in the middle.
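The handoff rides on standard MCP tool calls (JSON-RPC 2.0). A sketch of what the exchange could look like, where create_task and claim_task are illustrative tool names, not necessarily the actual CacheBash API:

```
// Agent A creates a task (tool name illustrative)
{"jsonrpc": "2.0", "id": 1, "method": "tools/call",
 "params": {"name": "create_task",
            "arguments": {"title": "Add retry logic to the sync worker"}}}

// Agent B, in a separate session, claims the next available task
{"jsonrpc": "2.0", "id": 1, "method": "tools/call",
 "params": {"name": "claim_task", "arguments": {}}}
```

The server holds the queue, so neither agent needs the other to be online at the same moment.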

Monitor from anywhere

Real-time session tracking on your phone. See which agents are running, what they're working on, how far along they are. Push notifications when something needs your attention.

Answer from your phone

When an agent needs a decision, your phone buzzes. Tap to respond. The agent continues. You never opened a laptop.

Run overnight

Dispatch work before bed. Set budget caps. Agents execute autonomously. Wake up to pull requests, not a blank terminal.

Three steps. Under sixty seconds.

01

Connect

npx cachebash init

One command. Authenticates via browser. Injects the MCP config into your Claude Code, Cursor, or VS Code setup. Done.
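The injected config is a standard MCP server entry. Roughly what it might look like in Claude Code's .mcp.json, with the server name, URL, and token as placeholders:

```
{
  "mcpServers": {
    "cachebash": {
      "type": "http",
      "url": "https://<your-cachebash-endpoint>/mcp",
      "headers": { "Authorization": "Bearer <your-token>" }
    }
  }
}
```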

02

Send a task

Open the app. Type what you need. Your agent picks it up in the terminal.

03

Respond from anywhere

Agent needs input? Your phone buzzes. Answer inline. The work continues without you touching a keyboard.

Any client that speaks MCP.

Claude Code: Supported
Cursor: Supported
VS Code + Copilot: Supported
Gemini CLI: Supported
ChatGPT Desktop: Coming soon

CacheBash uses Streamable HTTP transport with Bearer token auth. Standard MCP protocol. No vendor lock-in. No custom integrations. If your client supports MCP servers, CacheBash works.
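At the wire level that means plain HTTPS. A sketch of the opening handshake an MCP client performs over Streamable HTTP, with the endpoint and token as placeholders:

```
POST /mcp HTTP/1.1
Host: <your-cachebash-endpoint>
Authorization: Bearer <your-token>
Content-Type: application/json
Accept: application/json, text/event-stream

{"jsonrpc": "2.0", "id": 1, "method": "initialize",
 "params": {"protocolVersion": "2025-03-26",
            "capabilities": {},
            "clientInfo": {"name": "my-client", "version": "1.0"}}}
```

Responses arrive as JSON or as a server-sent event stream, which is what lets long-running sessions push updates back to the client.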

You bring the models. We run the orchestra.

CacheBash never touches your LLM API keys. You pay for orchestration, not compute.

Free

$0
  • 3 programs
  • 500 tasks / month
  • 1 concurrent session
  • Basic push notifications
  • Basic analytics
Get Started

No credit card required.

Pro

$29/mo
  • Unlimited programs
  • Unlimited tasks
  • 5 concurrent sessions
  • Full push notifications
  • Full analytics
Start Free, Upgrade Anytime

Team

$99/mo
  • Unlimited programs
  • Unlimited tasks
  • 10 concurrent sessions
  • Full push notifications
  • Full analytics + team
  • Multi-operator
Join Waitlist

Free tier is real. 500 tasks a month runs a serious workflow. Upgrade when you outgrow it, not before.

MIT licensed. Self-hostable. Forkable.

The MCP server is open source. Run it on your own GCP project if you want full control. The mobile app adds the control plane on top: push notifications, monitoring, human-in-the-loop approvals.

CacheBash is infrastructure, not a walled garden.

View on GitHub

Ship while you sleep.

Your agents are waiting. Connect them.