You have a brilliant debugging session with Claude at 2 AM, make a critical architectural decision with Copilot mid-sprint, or work through a complex refactor with Cursor over several days. Then all of that context ends up scattered across IDEs with poor search tooling.
ContextCore makes it permanent, searchable, and reusable.
Built by engineers for engineers. No fluff, just a solid memory layer.
Reads chat history from Claude Code, Cursor, Open Code, Kiro, and VS Code Copilot. Normalizes everything into a single unified format regardless of IDE.
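The unified format itself isn't documented here, but the normalization step can be sketched in TypeScript. The `UnifiedMessage` shape and the raw Claude Code line shape below are illustrative assumptions, not ContextCore's actual schema:

```typescript
// Hypothetical unified record that every harness's history is normalized into.
interface UnifiedMessage {
  id: string;
  source: "claude-code" | "cursor" | "opencode" | "kiro" | "copilot";
  role: "user" | "assistant";
  text: string;
  timestamp: string; // ISO 8601
}

// Claude Code stores sessions as JSONL, one JSON object per line.
// The raw field names here (uuid, type, text, ts) are assumed for illustration.
function fromClaudeCodeLine(line: string): UnifiedMessage {
  const raw = JSON.parse(line);
  return {
    id: raw.uuid,
    source: "claude-code",
    role: raw.type === "human" ? "user" : "assistant",
    text: raw.text,
    timestamp: raw.ts,
  };
}
```

One such adapter per harness, all converging on the same record type, is what makes cross-IDE search possible downstream.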
Transforms storage into a live memory layer, allowing any MCP-capable LLM to search your archive and build on prior reasoning automatically.
Fuzzy lexical search (Fuse.js) combined with semantic vector search (Qdrant). Find conversations by exact phrases or underlying meaning.
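In production this pairs Fuse.js lexical scores with Qdrant vector distances. A dependency-free sketch of the blending step follows; the 0.4/0.6 weighting and the assumption that both scores are normalized to [0, 1] are illustrative, not ContextCore's actual tuning:

```typescript
// Blend a lexical match score and a semantic similarity score into one
// ranking value. Both inputs are assumed normalized to [0, 1], higher = better.
// The default weight is an illustrative assumption.
function hybridScore(lexical: number, semantic: number, alpha = 0.4): number {
  return alpha * lexical + (1 - alpha) * semantic;
}

// Cosine similarity, a metric Qdrant supports for vector search.
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}
```

The blend is why an exact phrase match and a paraphrase of the same idea can land next to each other in the results.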
Compose and edit reusable AI agents from indexed architecture docs, upgrade notes, and code files directly in the visualizer UI.
A D3-powered zoomable card map. Explore your history spatially, save favorites, and spin up AI agents directly from your past chats & indexed project knowledge.
Package repeatable engineering workflows into reusable skills and memory assets so future agents can apply your team's proven thinking patterns.
Load prior session transcripts via MCP to give your agent full context of what was tried, what worked, and what was decided months ago.
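Under MCP, an agent retrieves a transcript through a tool call. The JSON-RPC envelope below follows the MCP `tools/call` shape; the tool name `load_session` and its arguments are hypothetical names for illustration:

```typescript
// A JSON-RPC 2.0 "tools/call" request, the envelope MCP clients send to servers.
// "load_session" and its argument names are hypothetical, for illustration only.
const request = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: {
    name: "load_session",
    arguments: { sessionId: "2024-08-14-auth-refactor", harness: "claude-code" },
  },
};

// Serialized form, as it would travel over the MCP transport.
const body = JSON.stringify(request);
```

The server's response would carry the transcript as tool output, which the agent folds into its context before continuing.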
When an LLM encounters a regression, it can search conversation history to understand what changed, when, and why using the rationale captured in previous chats.
A search for "authentication" returns results from Claude Code, Cursor, Kiro, and VS Code sessions equally, ranked by relevance regardless of origin.
No other system reads from JSONL streams, SQLite databases, and incremental JSON patches, normalizing all of it into the same searchable model.
More than storage: a feedback loop where AI agents build on each other's work via MCP tools, resources, and pre-built prompts.
A file watcher detects changes to harness source files as they happen. New conversations appear in the system within seconds, fully indexed.
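Incremental ingestion of an append-only JSONL history can be sketched as tracking how much of each file has already been indexed, so a change event only re-parses the tail. This is an illustrative simplification; a real implementation would track byte offsets and handle partially written lines:

```typescript
// Per-file count of lines already indexed. Illustrative simplification:
// a production watcher would persist byte offsets and guard against
// truncated writes mid-line.
const indexed = new Map<string, number>();

// Given a file's full current content, return only the lines that
// appeared since the last call for that path.
function newLines(path: string, content: string): string[] {
  const lines = content.split("\n").filter((l) => l.length > 0);
  const seen = indexed.get(path) ?? 0;
  indexed.set(path, lines.length);
  return lines.slice(seen);
}
```

Wired to a filesystem watch event, this is what keeps "within seconds" cheap: each event costs only the new lines, not a full re-index.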
Built for engineers, using a modern stack (Bun, bun:sqlite) designed for speed and extensibility.
Clone the repository, then spin up the orchestrator and visualizer locally:
# Install dependencies
cd server && bun install
cd ../visualizer && bun install
# Setup
cd ../server && bun setup
# Start the server (ingestion + API + MCP)
bun run dev
# Start the visualizer (in a second terminal, from the repo root)
cd visualizer && bun run dev