Local-first option
Run everything on your machine. No data ever leaves your network.
Four-stage pipeline that runs entirely on your machine
Everything you need to turn conversation history into institutional knowledge
Use the Claude API, local Llama models, or GLM-4.6 via Z.AI. Your choice.
Semantic search powered by knowledge graphs. Find solutions by intent, not keywords.
Identify recurring problems and solutions across your codebase and team.
Aggregate knowledge across team members. Build shared understanding.
Early feedback from power users
Choose the plan that fits your workflow
Your code conversations stay yours. Process locally or use our cloud with strict data controls.
Process everything locally, so no data ever leaves your network.
Use local Llama models for zero external API calls.
The cloud option processes your sessions and then deletes them. Nothing is stored long-term.
Full data portability. Export your graph anytime.
Common questions about Code Atlas
Code Atlas supports the standard JSONL format used by Claude Code projects. We also support legacy formats and can work with custom session logging implementations.
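As a rough illustration of why JSONL works well for session logs: each line is a self-contained JSON object, so files can be parsed incrementally. The field names below are illustrative, not the exact Claude Code schema.

```python
import json

# Illustrative session log: one JSON object per line (field names are
# examples only, not the actual Claude Code session schema).
sample = "\n".join([
    '{"role": "user", "content": "Why does the build fail on CI?"}',
    '{"role": "assistant", "content": "The lockfile is stale; run npm ci."}',
])

# JSONL is parsed line by line -- no need to load the whole file at once.
messages = [json.loads(line) for line in sample.splitlines() if line.strip()]
print(len(messages))  # 2
```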
Yes! The local-first option runs entirely on your machine using a local FalkorDB instance and optional local LLM models like Llama. No internet required after initial setup.
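A minimal sketch of what querying a local knowledge graph might look like. The `(:Problem)-[:SOLVED_BY]->(:Solution)` schema, the graph name `atlas`, and the helper names are assumptions for illustration, not Code Atlas's actual data model.

```python
def build_solution_query(problem_keyword: str) -> str:
    # Cypher-style query against an assumed (:Problem)-[:SOLVED_BY]->(:Solution)
    # schema; node labels and properties are illustrative.
    return (
        "MATCH (p:Problem)-[:SOLVED_BY]->(s:Solution) "
        f"WHERE p.summary CONTAINS '{problem_keyword}' "
        "RETURN s.summary"
    )

def fetch_solutions(problem_keyword: str, host: str = "localhost", port: int = 6379):
    # Requires a running local FalkorDB instance and the `falkordb` client
    # (pip install falkordb); the graph name "atlas" is an assumption.
    from falkordb import FalkorDB
    graph = FalkorDB(host=host, port=port).select_graph("atlas")
    return graph.query(build_solution_query(problem_keyword)).result_set
```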
Our prompts achieve 90%+ accuracy on problem-solution extraction in benchmarks. We're continuously improving extraction quality with feedback from beta users.
Team and Enterprise plans support shared knowledge graphs. Team members can contribute sessions and search across the collective knowledge base.
We support FalkorDB (recommended for GraphRAG performance) and Neo4j. Enterprise customers can request additional database integrations.
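One way the two backends could be selected is with a small factory over each client library. This is a hypothetical sketch under assumed connection details (`host:port` URI, graph name `atlas`, auth omitted), not Code Atlas's actual configuration.

```python
def connect(backend: str, uri: str):
    # Hypothetical backend factory; graph name and URI format are assumptions.
    if backend == "falkordb":
        from falkordb import FalkorDB       # pip install falkordb
        host, port = uri.split(":")
        return FalkorDB(host=host, port=int(port)).select_graph("atlas")
    if backend == "neo4j":
        from neo4j import GraphDatabase     # pip install neo4j
        return GraphDatabase.driver(uri)    # auth omitted for brevity
    raise ValueError(f"unsupported backend: {backend}")
```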
Join thousands of developers building searchable knowledge from their AI sessions.