Harness · OpenAI Codex
OpenAI's Codex CLI agent reads MCP server config from
~/.codex/config.toml. One TOML block and Codex can
drive Workbooks alongside its other tool calls.
Auto-install
Paste this into Codex
Click below to copy a self-install prompt. Paste it into your Codex session and the agent will run through the steps for you.
Or follow the manual steps below.
1. Install the CLI
npm install -g @work.books/cli
workbook --version
2. Edit Codex config
mkdir -p ~/.codex
$EDITOR ~/.codex/config.toml
3. Add the Workbooks server
[mcp_servers.workbooks]
command = "workbook"
args = ["mcp", "serve"]
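If Codex can't find the binary, or your server needs credentials, the same table can be extended. A sketch, assuming Codex's MCP tables accept an absolute command path and an env sub-table (the variable name is illustrative, not part of the Workbooks CLI docs):

```toml
[mcp_servers.workbooks]
command = "/usr/local/bin/workbook"  # full path if `workbook` isn't on Codex's PATH
args = ["mcp", "serve"]

# Assumed env sub-table; verify against the Codex config reference.
[mcp_servers.workbooks.env]
WORKBOOKS_API_KEY = "wb-..."  # illustrative variable name
```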
4. Restart Codex
Close any running Codex session and run codex again. The new server is picked up on launch.
Quirks & tips
- TOML, not JSON. Other harnesses use JSON for MCP config; Codex specifically uses TOML. Don't paste a JSON block from the Claude docs.
- Approval prompts. Codex confirms each tool call by default. For unattended use, run Codex with --approval-mode auto (or set it in the same config file).
- Stdio is the only transport. Codex doesn't yet support SSE-based MCP servers, so stick to the stdio command above.
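For the config-file variant of the approval setting, a sketch of what the entry might look like; the key name here is an assumption derived from the --approval-mode flag, so check codex --help or the Codex config reference for the authoritative spelling:

```toml
# Top of ~/.codex/config.toml, above the [mcp_servers.*] tables.
# Key name assumed from the --approval-mode flag.
approval_mode = "auto"
```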
Verify
From a Codex session: "List my workbook groups via the workbooks server." Codex should call workbooks.list_groups.