Integrations Overview

Contox gives your AI coding assistant persistent project memory, regardless of which tool you use. There are two integration methods depending on your tool's capabilities.

Integration methods

MCP (Model Context Protocol) -- For tools that support the MCP standard, Contox provides a native server with 23 tools. The AI can load, search, save, and manage memory directly through the protocol. This is the richest integration.
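
For MCP-capable tools, registering the server generally means adding an entry to the tool's MCP config file. The sketch below shows the standard `mcpServers` shape; the command and arguments shown here are assumptions for illustration, so check the per-tool guide for the exact values:

```json
{
  "mcpServers": {
    "contox": {
      "command": "contox",
      "args": ["mcp"]
    }
  }
}
```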

File-based injection -- For tools that read context from rule files, Contox exports your project brain as a markdown document and injects it into the appropriate file. The AI reads the memory at session start and saves via the CLI at session end.

Supported tools

| AI Tool | Method | Config File | Setup |
| --- | --- | --- | --- |
| Claude Code | MCP (native) | .mcp.json | Guide |
| Claude Desktop | MCP (native) | claude_desktop_config.json | Guide |
| Cursor | File-based | .cursorrules | Guide |
| Windsurf | File-based | .windsurfrules | Guide |
| GitHub Copilot | File-based | .github/copilot-instructions.md | Guide |
| Cline | File-based / MCP | .clinerules or MCP | Guide |
| Aider | File-based | contox export | Guide |
| Antigravity | File-based | contox export | Guide |

How memory flows to your AI

Regardless of integration method, the workflow follows the same pattern:

  1. Session start -- The AI loads your project brain (architecture, conventions, implementation history, decisions, bugs, todos)
  2. During work -- The AI uses the context to make informed decisions aligned with your project
  3. Session end -- The AI saves what was accomplished, feeding the enrichment pipeline

For MCP tools, this happens natively through protocol calls. For file-based tools, the brain document is written to .contox/memory.md and injected into rule files between <!-- contox:start --> and <!-- contox:end --> markers. User content outside those markers is never touched.
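
As an illustration of the marker convention, a hypothetical .cursorrules file after injection might look like the following; the rule text is invented here, but the marker placement matches the behavior described above:

```markdown
# My own project rules (written by hand, never modified by Contox)
Always use strict TypeScript and avoid default exports.

<!-- contox:start -->
(Contox-managed brain document goes here and is
regenerated on each export)
<!-- contox:end -->
```

Everything outside the `contox:start`/`contox:end` pair is preserved verbatim on every re-export.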

Quick setup with the VS Code extension

The Contox VS Code extension includes a setup wizard that auto-configures your chosen tools:

  1. Run Contox: Setup Wizard from the command palette
  2. Authenticate and select your project
  3. In Step 4, check the AI tools you use (Claude, Cursor, Copilot, Windsurf)
  4. The extension auto-creates the appropriate config files

Quick setup with the CLI

Use the contox export command to generate rule files for any tool:

```bash
contox export -f cursorrules    # creates .cursorrules
contox export -f copilot        # creates .github/copilot-instructions.md
contox export -f markdown       # creates contox-brain.md
contox export -f cursorrules --stdout  # print to stdout
```
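
Because `--stdout` prints the document instead of writing a file, exports compose with ordinary shell redirection. This sketch assumes the flag works with every format, which is worth confirming against the CLI help:

```bash
# Preview the brain document without touching any rule file
contox export -f markdown --stdout | less

# Redirect an export to a path of your choosing
contox export -f copilot --stdout > .github/copilot-instructions.md
```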

Choosing the right integration

  • Use MCP if your tool supports it (Claude Code, Claude Desktop, Cline). You get the full 23-tool experience with real-time search, context packs, and native save.
  • Use file-based for everything else. The brain document provides the same core knowledge, and the CLI handles saving.
  • Use both if you switch between tools. Contox keeps all integrations in sync through the shared brain.

Next steps

  • Claude Code -- MCP server setup for the richest integration
  • Cursor -- File-based setup with .cursorrules
  • Other Tools -- Generic setup for Cline, Aider, Antigravity, and more