# Developer Intelligence POC

A local proof-of-concept that builds a living knowledge graph from a Go codebase. Every file gets LLM-generated documentation. Every relationship gets documented. Everything stays current on every merge. Query it from Claude Code via MCP.
## Quick Start

```sh
cd dev-intel-poc
uv sync                          # install deps
uv run python ingest.py          # clone echo, parse, generate docs (~15-20 min)
claude --mcp-config .mcp.json    # start Claude Code with the knowledge graph
```
Then ask Claude Code:
- "How does routing work in echo?"
- "What files depend on context.go?"
- "Give me an overview of this project"
## Prerequisites

- Python 3.11+
- `uv` (`curl -LsSf https://astral.sh/uv/install.sh | sh`)
- Ollama running at `192.168.86.172:11434` with `qwen2.5:7b`
- Claude Code CLI (`claude`)
- git
## Demo Walkthrough

### Act 1: "Here's what your codebase knows about itself"

After `python ingest.py` completes, start Claude Code:

```sh
claude --mcp-config .mcp.json
```
Ask it:
> What does echo.go do?
> How does echo.go interact with router.go?
> Give me an overview of the whole project
> What files depend on context.go?
> Search for anything related to "middleware"
Every answer comes from LLM-generated documentation stored in the knowledge graph — not from reading raw source code.
### Act 2: "Now someone pushes a change"

In another terminal:

```sh
python simulate_merge.py echo.go
```
This:
- Regenerates echo.go's documentation (reflects the new code)
- Marks all relationships involving echo.go as STALE
- Marks the repo summary as STALE
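The stale-marking step above can be sketched with two SQL updates. Table and column names here are assumptions for illustration, not necessarily what `simulate_merge.py` uses:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE relationships (
        src TEXT, dst TEXT, doc TEXT, stale INTEGER DEFAULT 0
    );
    CREATE TABLE repo_summary (id INTEGER PRIMARY KEY, doc TEXT, stale INTEGER DEFAULT 0);
    INSERT INTO relationships VALUES
        ('echo.go', 'router.go', 'echo.go delegates matching to router.go', 0),
        ('context.go', 'response.go', 'context wraps the response writer', 0);
    INSERT INTO repo_summary (doc) VALUES ('Echo is a Go web framework ...');
""")

def mark_stale(changed_path: str) -> int:
    """Flag every relationship touching the changed file, plus the repo summary."""
    cur = conn.execute(
        "UPDATE relationships SET stale = 1 WHERE src = ? OR dst = ?",
        (changed_path, changed_path),
    )
    conn.execute("UPDATE repo_summary SET stale = 1")
    return cur.rowcount  # how many relationships were flagged

print(mark_stale("echo.go"))
```

Only relationships involving the changed file are flagged; unrelated edges (here, `context.go` → `response.go`) stay fresh.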
Back in Claude Code:
> What does echo.go do? # fresh doc — mentions the new tracing feature
> What's the repo overview? # shows [STALE] — knows it's outdated
> Show me all stale docs # lists everything that needs refresh
### Act 3: "The system heals itself"

```sh
python refresh_stale.py
```
Back in Claude Code:
> What's the repo overview? # fresh again — rewritten to include new capabilities
> Show me all stale docs # "Everything is fresh!"
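A refresh pass reduces to "find everything flagged stale, regenerate it, clear the flag." Here is a minimal sketch: `regenerate()` is a stub standing in for the real Ollama call in `docgen.py`, and the schema is an assumption:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE file_docs (path TEXT PRIMARY KEY, doc TEXT, stale INTEGER DEFAULT 0);
    INSERT INTO file_docs VALUES ('router.go', 'old doc', 1), ('echo.go', 'fresh doc', 0);
""")

def regenerate(path: str) -> str:
    # Placeholder for the LLM call that rereads the source and rewrites the doc.
    return f"regenerated documentation for {path}"

def refresh_stale() -> list[str]:
    refreshed = []
    rows = conn.execute("SELECT path FROM file_docs WHERE stale = 1").fetchall()
    for (path,) in rows:
        conn.execute(
            "UPDATE file_docs SET doc = ?, stale = 0 WHERE path = ?",
            (regenerate(path), path),
        )
        refreshed.append(path)
    return refreshed

print(refresh_stale())  # only stale entries are rewritten
```

Already-fresh docs are left untouched, which is what keeps refresh cheap relative to a full re-ingest.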
## Architecture

```
ingest.py ──→ repos/target/   (git clone)
    │              │
    │         parser.py       (tree-sitter AST)
    │              │
    │         docgen.py       (Ollama qwen2.5:7b)
    │              │
    └──────→  devintel.db     (SQLite)
                   │
             mcp_server.py ──→ Claude Code
```

No Docker. No external databases. One SQLite file. One MCP server.
## MCP Tools

| Tool | What it does |
|---|---|
| `get_file_doc(path)` | Read a file's generated documentation |
| `get_relationship(a, b)` | How two files interact |
| `get_repo_overview()` | Project-level summary |
| `get_dependents(path)` | What breaks if you change this file |
| `get_dependencies(path)` | What this file depends on |
| `search_docs(query)` | Keyword search across all docs |
| `get_stale_docs()` | List outdated documentation |
| `get_graph_stats()` | File count, relationship count, staleness |
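As one example of what sits behind these tools, `search_docs(query)` could be as simple as a case-insensitive `LIKE` scan over the stored docs. The schema and the lack of ranking are assumptions; the real `mcp_server.py` may do something smarter:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE file_docs (path TEXT PRIMARY KEY, doc TEXT);
    INSERT INTO file_docs VALUES
        ('middleware.go', 'Chains middleware functions around handlers.'),
        ('router.go',     'Radix-tree route matching.');
""")

def search_docs(query: str) -> list[str]:
    """Return paths whose generated doc mentions the query (SQLite LIKE is
    case-insensitive for ASCII by default)."""
    return [
        path for (path,) in conn.execute(
            "SELECT path FROM file_docs WHERE doc LIKE ? ORDER BY path",
            (f"%{query}%",),
        )
    ]

print(search_docs("middleware"))
```

Because the search runs over generated documentation rather than raw source, it can match concepts ("route matching") that the code itself never spells out.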
## Project Structure

```
dev-intel-poc/
├── requirements.txt    # Python deps
├── .mcp.json           # Claude Code MCP config
├── ingest.py           # Main ingestion pipeline
├── simulate_merge.py   # Simulate a code change
├── refresh_stale.py    # Refresh stale docs
├── db.py               # SQLite backend
├── parser.py           # tree-sitter Go AST parser
├── docgen.py           # Ollama doc generation
├── mcp_server.py       # MCP server for Claude Code
└── devintel.db         # Generated — the knowledge graph
```
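For a sense of what `docgen.py` does per file, here is a sketch of a call against Ollama's `/api/generate` endpoint (`model`, `prompt`, and `stream` are the documented request fields; the prompt wording and function names are made up for illustration):

```python
import json
import urllib.request

OLLAMA_URL = "http://192.168.86.172:11434"
OLLAMA_MODEL = "qwen2.5:7b"

def build_request(path: str, source: str) -> dict:
    return {
        "model": OLLAMA_MODEL,
        "prompt": f"Document the Go file {path}:\n\n{source}",
        "stream": False,  # one JSON response instead of a token stream
    }

def generate_doc(path: str, source: str) -> str:
    """POST the prompt to Ollama and return the generated documentation."""
    req = urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=json.dumps(build_request(path, source)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Build (but don't send) a request to show its shape:
payload = build_request("echo.go", "package echo ...")
print(payload["model"])
```

With `MAX_CONCURRENT=4`, the real pipeline would issue several of these requests in parallel.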
## Configuration

| Env Variable | Default | Description |
|---|---|---|
| `LLM_BACKEND` | `ollama` | `ollama` or `openai` (for Kiro gateway, OpenRouter, etc.) |
| `OLLAMA_URL` | `http://192.168.86.172:11434` | Ollama endpoint |
| `OLLAMA_MODEL` | `qwen2.5:7b` | Ollama model |
| `OPENAI_URL` | `http://192.168.86.11:8000` | OpenAI-compatible endpoint (Kiro gateway) |
| `OPENAI_MODEL` | `claude-haiku-4` | Model name for OpenAI-compatible API |
| `OPENAI_API_KEY` | `not-needed` | API key (if required by endpoint) |
| `TARGET_REPO` | `https://github.com/labstack/echo.git` | Repo to ingest |
| `MAX_CONCURRENT` | `4` | Parallel LLM requests |
| `DEVINTEL_DB` | `./devintel.db` | SQLite database path |
| `REPO_DIR` | `./repos/target` | Cloned repo location |