Integrations Overview
Contox gives your AI coding assistant persistent project memory, regardless of which tool you use. There are two integration methods depending on your tool's capabilities.
Integration methods
MCP (Model Context Protocol) -- For tools that support the MCP standard, Contox provides a native server with 23 tools. The AI can load, search, save, and manage memory directly through the protocol. This is the richest integration.
File-based injection -- For tools that read context from rule files, Contox exports your project brain as a markdown document and injects it into the appropriate file. The AI reads the memory at session start and saves via the CLI at session end.
Supported tools
| AI Tool | Method | Config File | Setup |
|---|---|---|---|
| Claude Code | MCP (native) | .mcp.json | Guide |
| Claude Desktop | MCP (native) | claude_desktop_config.json | Guide |
| Cursor | File-based | .cursorrules | Guide |
| Windsurf | File-based | .windsurfrules | Guide |
| GitHub Copilot | File-based | .github/copilot-instructions.md | Guide |
| Cline | File-based / MCP | .clinerules or MCP | Guide |
| Aider | File-based | contox export | Guide |
| Antigravity | File-based | contox export | Guide |
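For MCP tools, the server entry goes in the config file listed above. A minimal sketch of a `.mcp.json` entry for Claude Code, assuming the server is launched via a `contox mcp` subcommand (the exact command and arguments are hypothetical here; follow the per-tool guide for the real values):

```json
{
  "mcpServers": {
    "contox": {
      "command": "contox",
      "args": ["mcp"]
    }
  }
}
```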
How memory flows to your AI
Regardless of integration method, the workflow follows the same pattern:
- Session start -- The AI loads your project brain (architecture, conventions, implementation history, decisions, bugs, todos)
- During work -- The AI uses the context to make informed decisions aligned with your project
- Session end -- The AI saves what was accomplished, feeding the enrichment pipeline
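The three phases above can be sketched in pseudocode (the names are illustrative, not actual API calls):

```
on session_start:
    brain = load_project_brain()   # architecture, conventions, history, decisions, bugs, todos

during work:
    use brain to inform edits and suggestions

on session_end:
    save_session_summary()         # feeds the enrichment pipeline
```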
For MCP tools, this happens natively through protocol calls. For file-based tools, the brain document is written to .contox/memory.md and injected into rule files between <!-- contox:start --> and <!-- contox:end --> markers. User content outside those markers is never touched.
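As an illustration, an injected rule file such as `.cursorrules` might look like this; the rules outside the markers are yours and are preserved, while the content between the markers is managed by Contox (the rule text shown is a made-up placeholder):

```
# My existing Cursor rules
Always use strict TypeScript settings.

<!-- contox:start -->
(Contox-managed project brain: architecture, conventions, decisions, ...)
<!-- contox:end -->
```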
Quick setup with the VS Code extension
The Contox VS Code extension includes a setup wizard that auto-configures your chosen tools:
- Run Contox: Setup Wizard from the command palette
- Authenticate and select your project
- In Step 4, check the AI tools you use (Claude, Cursor, Copilot, Windsurf)
- The extension auto-creates the appropriate config files
Quick setup with the CLI
Use the contox export command to generate rule files for any tool:
contox export -f cursorrules # creates .cursorrules
contox export -f copilot # creates .github/copilot-instructions.md
contox export -f markdown # creates contox-brain.md
contox export -f cursorrules --stdout # print to stdout
Choosing the right integration
- Use MCP if your tool supports it (Claude Code, Claude Desktop, Cline). You get the full 23-tool experience with real-time search, context packs, and native save.
- Use file-based for everything else. The brain document provides the same core knowledge, and the CLI handles saving.
- Use both if you switch between tools. Contox keeps all integrations in sync through the shared brain.
Next steps
- Claude Code -- MCP server setup for the richest integration
- Cursor -- File-based setup with .cursorrules
- Other Tools -- Generic setup for Cline, Aider, Antigravity, and more