Token-efficient LLM commands using @@ markers.
Licensed to build, not to JSON.
**JSON-RPC:**

```json
{
  "tool": "write",
  "params": {
    "path": "src/app.ts",
    "content": "import express from 'express';\nconst app = express();\napp.listen(3000);"
  }
}
```

- Tokens: ~60
- Escaping: required (`\n`, `\"`)
- Failure: one missing `}` = total crash
**@@PROTOCOL:**

```
@@WRITE_FILE src/app.ts
import express from 'express';
const app = express();
app.listen(3000);
@@END_FILE
```

- Tokens: ~8
- Escaping: not required (raw content)
- Failure: graceful partial execution
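The contrast above can be made concrete with a parser sketch. The function and type names below are illustrative, not the crewswarm implementation; only the `@@WRITE_FILE … @@END_FILE` format itself comes from this document:

```typescript
// Minimal line-by-line parser for @@WRITE_FILE blocks.
// Names (parseStream, FileOp) are illustrative, not the crewswarm API.
interface FileOp {
  path: string;
  content: string;
}

function parseStream(text: string): FileOp[] {
  const ops: FileOp[] = [];
  let current: { path: string; lines: string[] } | null = null;

  for (const line of text.split("\n")) {
    if (line.startsWith("@@WRITE_FILE ")) {
      // Open a new block; the rest of the line is the target path.
      current = { path: line.slice("@@WRITE_FILE ".length).trim(), lines: [] };
    } else if (line.startsWith("@@END_FILE")) {
      // Only blocks that reach @@END_FILE are emitted.
      if (current) {
        ops.push({ path: current.path, content: current.lines.join("\n") });
      }
      current = null;
    } else if (current) {
      // Raw content line: no escaping needed.
      current.lines.push(line);
    }
  }
  return ops; // an unterminated trailing block is simply dropped
}
```

Because every line is self-describing, this loop can run directly on a live token stream and execute each block the moment its `@@END_FILE` arrives.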
| Command | Usage | Description |
|---|---|---|
| `@@READ_FILE` | `@@READ_FILE /path/to/file` | Read file contents |
| `@@WRITE_FILE` | `@@WRITE_FILE /path` | Write or overwrite file |
| `@@MKDIR` | `@@MKDIR /path/to/dir` | Create directory (with parents) |
| `@@RUN_CMD` | `@@RUN_CMD npm install` | Execute shell command |
| `@@DISPATCH` | `@@DISPATCH crew-coder` | Send task to specialist agent |
| `@@PIPELINE` | `@@PIPELINE [{"wave":1,...}]` | Multi-agent orchestration |
The @@ symbol is visually distinct and rarely appears in natural text, making it easy for LLMs to learn and reliable for parsers to detect.
Unlike JSON, which a parser can only consume once the payload is complete, @@ commands are parsed line-by-line during streaming, reducing latency and enabling graceful partial execution.
LLMs don't need to escape newlines, quotes, or special characters inside @@WRITE_FILE blocks. This eliminates common hallucination patterns where models forget to close strings.
If an LLM hits a context limit mid-generation, every completed @@ block executes successfully. With JSON-RPC, one missing brace invalidates the entire payload.
LLMs think in top-down narratives. @@PROTOCOL follows this: agents can explain why, then execute the @@ command, then continue their narrative — all in one natural flow.
JSON syntax (braces, quotes, escapes) consumes 15-30% more tokens than necessary. @@PROTOCOL reduces overhead from ~60 tokens to ~8 tokens per file operation.
| Feature | JSON-RPC | @@PROTOCOL |
|---|---|---|
| Token Overhead | 45-60 tokens per call | 8 tokens per call |
| Escaping | Required (`\n`, `\"`) | Not required |
| Streaming | Must close block before parsing | Line-by-line parseable |
| Partial Execution | One error = total failure | Graceful partial execution |
| LLM Hallucination | Frequent (unclosed braces) | Rare (simple markers) |
| Readability | Verbose nested JSON | Clean, narrative flow |
The current production protocol uses inline @@ markers that agents embed in their natural language replies.
The upcoming v2 adds adapter layers for MCP, JSON-RPC, and OpenAI-style tool calls while keeping @@ syntax as the canonical internal format.
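One plausible shape for such an adapter layer, sketched under assumptions: the `AtCommand`/`toToolCall` names and the tool names `write` and `run` are hypothetical, not the published v2 API; only the @@ command set and the OpenAI-style tool-call target come from this document:

```typescript
// Hypothetical adapter: lift a parsed @@ command into an
// OpenAI-style tool call. All names here are illustrative.
interface AtCommand {
  command: string; // e.g. "WRITE_FILE"
  arg: string;     // e.g. "src/app.ts"
  body?: string;   // raw block content, if the command carries one
}

interface ToolCall {
  type: "function";
  function: { name: string; arguments: string };
}

function toToolCall(cmd: AtCommand): ToolCall {
  switch (cmd.command) {
    case "WRITE_FILE":
      return {
        type: "function",
        function: {
          name: "write",
          // Escaping is reintroduced only here, at the adapter boundary,
          // by JSON.stringify — never by the model itself.
          arguments: JSON.stringify({ path: cmd.arg, content: cmd.body ?? "" }),
        },
      };
    case "RUN_CMD":
      return {
        type: "function",
        function: { name: "run", arguments: JSON.stringify({ cmd: cmd.arg }) },
      };
    default:
      throw new Error(`no adapter for @@${cmd.command}`);
  }
}
```

The design point is that @@ stays the canonical format: escaping and JSON framing exist only on the outbound edge, so the model never has to produce them.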
@@PROTOCOL is built into crewswarm. Clone the repo and start building with token-efficient agent commands today.