v0.10.13 · MIT · Python 3.11+
gnosis-mcp indexes your docs, git history, and crawled sites into a searchable knowledge base exposed over MCP. Zero config. SQLite by default. Hybrid FTS5 + vector with optional cross-encoder reranking.
Markdown, text, notebooks, TOML, CSV, JSON. Optional rST + PDF. Heading-aware chunking that never splits inside code blocks or tables.
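The core idea of heading-aware chunking can be sketched in a few lines. This is a simplified illustration, not gnosis-mcp's actual chunker (which also protects tables and enforces size limits): split on headings, but never while inside a fenced code block.

```python
def chunk_markdown(text: str) -> list[str]:
    """Split markdown on headings, but never inside a fenced code block.

    Simplified sketch of heading-aware chunking; the real chunker
    also keeps tables intact and caps chunk size.
    """
    chunks: list[str] = []
    current: list[str] = []
    in_fence = False
    for line in text.splitlines():
        if line.lstrip().startswith("```"):
            in_fence = not in_fence          # toggle on fence open/close
        # start a new chunk at a heading, unless we are inside a fence
        if line.startswith("#") and not in_fence and current:
            chunks.append("\n".join(current))
            current = []
        current.append(line)
    if current:
        chunks.append("\n".join(current))
    return chunks
```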
BM25 + local ONNX embeddings merged via Reciprocal Rank Fusion. Tune the fusion constant with GNOSIS_MCP_RRF_K. No API key required.
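Reciprocal Rank Fusion itself is simple: each document's score is the sum of 1/(k + rank) over every list it appears in. A minimal sketch (the constant `k` is what GNOSIS_MCP_RRF_K tunes; 60 is the value from the original RRF paper):

```python
def rrf_merge(rankings: list[list[str]], k: int = 60) -> list[str]:
    """Merge several ranked lists with Reciprocal Rank Fusion.

    Each document scores sum(1 / (k + rank)) across the lists it
    appears in; larger k damps the advantage of top ranks.
    """
    scores: dict[str, float] = {}
    for ranking in rankings:
        for rank, doc_id in enumerate(ranking, start=1):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)

bm25 = ["a", "b", "c"]
vect = ["b", "c", "d"]
merged = rrf_merge([bm25, vect])  # "b" ranks first: high in both lists
```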
Optional [reranking] extra. A 22M-param ONNX cross-encoder that re-scores the top candidates. Off by default.
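The reranking stage boils down to: take the top N fused candidates, score each (query, passage) pair with a stronger but slower model, and re-order only that head. A sketch of the stage, with a toy token-overlap scorer standing in for the actual cross-encoder:

```python
def rerank(query: str, candidates: list[str], top_n: int = 20) -> list[str]:
    """Re-score the head of the candidate list with a stronger model.

    `score` is a placeholder: a real cross-encoder scores each
    (query, passage) pair jointly instead of comparing token sets.
    """
    def score(q: str, passage: str) -> float:
        q_tokens = set(q.lower().split())
        p_tokens = set(passage.lower().split())
        return len(q_tokens & p_tokens) / (len(q_tokens) or 1)

    head, tail = candidates[:top_n], candidates[top_n:]
    head.sort(key=lambda p: score(query, p), reverse=True)
    return head + tail  # only the head is re-ordered
```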
Ingest commit messages as searchable context. Find the reason a line exists, not just the line itself.
Sitemap discovery or BFS. Robots.txt with same-host redirect guard. ETag/Last-Modified caching. Trafilatura extraction with per-page timeout.
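ETag/Last-Modified caching means a re-crawl sends conditional headers and skips any page the server answers with 304 Not Modified. A sketch of the revalidation-header logic, assuming a simple per-URL cache dict (field names here are illustrative, not gnosis-mcp's schema):

```python
def conditional_headers(cache: dict, url: str) -> dict[str, str]:
    """Build If-None-Match / If-Modified-Since headers from a prior fetch.

    When the cached copy is still fresh the server replies
    304 Not Modified, so the body is never re-downloaded or re-extracted.
    """
    headers: dict[str, str] = {}
    entry = cache.get(url)
    if entry:
        if etag := entry.get("etag"):
            headers["If-None-Match"] = etag
        if modified := entry.get("last_modified"):
            headers["If-Modified-Since"] = modified
    return headers
```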
9 MCP tools + 3 resources. Optional REST API on the same process with Bearer auth (timing-safe). File watcher for auto re-ingest.
SQLite keyword (FTS5 + BM25), in-memory, median of 3 runs on laptop CPU. Methodology →
| Corpus | QPS | p50 | p95 | p99 |
|---|---|---|---|---|
| 100 docs / 300 chunks | 9,463 | 0.10 ms | 0.16 ms | 0.19 ms |
| 1,000 docs / 3,000 chunks | 2,768 | 0.29 ms | 0.72 ms | 0.78 ms |
| 5,000 docs / 15,000 chunks | 839 | 0.80 ms | 2.97 ms | 3.54 ms |
| 10,000 docs / 30,000 chunks | 471 | 1.38 ms | 5.60 ms | 6.29 ms |
Through the full MCP stdio protocol: 8.7 ms mean, 13.0 ms p95 per tool call.
RAG eval across 10 cases: Hit@5 = 1.00, MRR = 0.95, Precision@5 = 0.67. Reproduce with gnosis-mcp eval.
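Hit@5, MRR, and Precision@5 are standard retrieval metrics; for reference, here is how each is computed per eval case (a generic sketch, not the harness itself — MRR is the mean of the reciprocal ranks across cases):

```python
def hit_at_k(ranked: list[str], relevant: set[str], k: int = 5) -> float:
    """1.0 if any relevant chunk appears in the top k, else 0.0."""
    return float(any(doc in relevant for doc in ranked[:k]))

def reciprocal_rank(ranked: list[str], relevant: set[str]) -> float:
    """1/rank of the first relevant chunk (0.0 if none is retrieved)."""
    for rank, doc in enumerate(ranked, start=1):
        if doc in relevant:
            return 1.0 / rank
    return 0.0

def precision_at_k(ranked: list[str], relevant: set[str], k: int = 5) -> float:
    """Fraction of the top k that is relevant."""
    return sum(doc in relevant for doc in ranked[:k]) / k
```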
| | gnosis-mcp | Context7 | docs-mcp-server | mcp-local-rag |
|---|---|---|---|---|
| Your own private docs | ● | — | ● | ● |
| Self-hosted | ● | hosted | ● | ● |
| Zero config install | ● | ● | ● | ● |
| Local embeddings, no API key | ONNX | — | opt | ● |
| Hybrid keyword + vector (RRF) | ● | — | opt | ● |
| Cross-encoder reranker | opt | — | — | — |
| PostgreSQL + pgvector | ● | — | — | — |
| Web crawl (sitemap + BFS) | ● | — | ● | — |
| Git history indexing | ● | — | — | — |
| REST API on same port | ● | — | — | — |
| File watcher auto re-ingest | ● | — | — | — |
| Write tools (upsert/delete) | ● | — | — | — |
| Published benchmarks | ● | — | — | — |
| Built-in eval harness | ● | — | — | — |
Context7 indexes public library docs. gnosis-mcp indexes your own private docs. Use both.
```shell
# Base install (SQLite, keyword search)
pip install gnosis-mcp
gnosis-mcp ingest ./docs && gnosis-mcp serve

# Local ONNX embeddings
pip install "gnosis-mcp[embeddings]"
gnosis-mcp ingest ./docs --embed

# Cross-encoder reranking
pip install "gnosis-mcp[reranking]"
GNOSIS_MCP_RERANK_ENABLED=true gnosis-mcp serve

# PostgreSQL + pgvector
pip install "gnosis-mcp[postgres]"
export GNOSIS_MCP_DATABASE_URL=postgresql://...
```
```json
{
  "mcpServers": {
    "gnosis": {
      "command": "gnosis-mcp",
      "args": ["serve"]
    }
  }
}
```

Full editor snippets in llms-install.md.