AutoDocs

We handle what AI editors won't: generating and maintaining documentation for your codebase, while also providing search with dependency-aware context that helps your tools understand your codebase and its conventions.


AutoDocs by Sita is a tool designed to automate documentation for any repository. It parses the repository using tree-sitter and SCIP, constructs a code dependency graph, and generates repository-wide, dependency-aware documentation and summaries. It provides a FastAPI backend for ingestion/search and a Next.js web UI for chat and exploration. Additionally, it includes an MCP server for deep search capabilities. The tool aims to simplify the process of generating accurate and high-signal documentation for codebases.

README:


Docs · Report Bug · Feature Request

Sita uses GitHub Discussions for Support and Feature Requests.


Apache 2.0 License



AutoDocs, by Sita

Automate documentation for any repo: we traverse your codebase, parse the AST, build a dependency graph, and walk that graph to generate accurate, high‑signal docs. A built‑in MCP server lets coding agents deep‑search your code via HTTP.

(Interested in our hosted or enterprise offerings? Join the waitlist at https://trysita.com)

What This Repo Does

  • Parses your repository using tree‑sitter (AST parsing) and SCIP (symbol resolution).
  • Constructs a code dependency graph (files, definitions, calls, imports) and topologically sorts the dependency order.
  • Traverses that graph to create repository‑wide, dependency‑aware documentation and summaries.
  • Exposes a FastAPI backend for ingestion/search and a Next.js web UI for chat and exploration.
  • Provides an MCP server at http://localhost:3000/api/mcp so agentic tools can query your repo with deep search.
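The graph walk described above can be sketched in a few lines: topologically sort the dependency graph, then document each unit only after its dependencies are documented, so every summary can see its dependencies' summaries. This is a minimal illustration under assumed names (`summarize` is a hypothetical stand-in for the LLM call), not AutoDocs' actual implementation:

```python
from graphlib import TopologicalSorter

# Toy dependency graph: each node maps to the definitions it depends on.
# (Illustrative only -- AutoDocs derives this from tree-sitter/SCIP output.)
deps = {
    "utils.parse": set(),
    "graph.build": {"utils.parse"},
    "docs.generate": {"graph.build", "utils.parse"},
}

def summarize(node, dep_docs):
    # Hypothetical stand-in for an LLM summary call: a real pipeline would
    # pass the node's source plus its dependencies' summaries as context.
    context = "; ".join(dep_docs) or "no dependencies"
    return f"{node} (uses: {context})"

docs = {}
# static_order() yields each node after all of its dependencies.
for node in TopologicalSorter(deps).static_order():
    docs[node] = summarize(node, [docs[d] for d in sorted(deps[node])])
```

Because the walk is dependency-first, `docs["docs.generate"]` is produced with the summaries of both of its dependencies already available.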

Prerequisites

Install these once on your machine:

  • pnpm 10+ (Node 20+ recommended; Corepack is fine). Docs
  • uv (fast Python package manager). Docs
  • Docker + Docker Compose (to run everything locally). Docs


GitHub Personal Access Token (optional)

Some features or scripts may call the GitHub API (e.g., fetching repo metadata). If you hit rate limits or need to access private repos, create a Personal Access Token (PAT) and set it in your environment.

Suggested scopes

  • Public repos only: use a fine-grained token with selected repositories (read-only) or a classic token with public_repo.
  • Private repos: fine-grained token with read-only repo access to the needed repositories, or a classic token with repo.

Add to your .env (or shell env):

GITHUB_TOKEN=ghp_your_token_here

Notes

  • Keep this token secret; do not commit .env.
  • Fine-grained tokens are preferred for tighter, per-repo permissions.

Quickstart (copy/paste)

  1. Environment
cp .env.example .env

Configuration

Preconfigured

  • Database: DATABASE_URL (local postgres DB). In Compose, DB is available at postgresql://postgres:postgres@db:5432/app.
  • Ingestion API: INGESTION_API_URL for the web app to call the FastAPI service.
  • Analysis storage: ANALYSIS_DB_DIR controls where generated per-repo SQLite files live.

To-be configured

  • Summaries: SUMMARIES_API_KEY, SUMMARIES_MODEL, SUMMARIES_BASE_URL (OpenAI-compatible, default OpenRouter)
  • Embeddings: EMBEDDINGS_API_KEY, EMBEDDINGS_MODEL, EMBEDDINGS_BASE_URL (OpenAI-compatible, default OpenAI)
  • Rate limiting: MAX_REQUESTS_PER_SECOND for LLM summary batching (default 15)
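Putting the variables above together, a filled-in .env might look like the fragment below. All values are placeholders; the model names and base URLs are examples of OpenAI-compatible endpoints, not requirements:

```shell
# Preconfigured for Docker Compose
DATABASE_URL=postgresql://postgres:postgres@db:5432/app
INGESTION_API_URL=http://api:8000

# LLM summaries (OpenRouter shown as an example provider)
SUMMARIES_API_KEY=sk-or-xxxxxxxx
SUMMARIES_MODEL=your-model-of-choice
SUMMARIES_BASE_URL=https://openrouter.ai/api/v1

# Embeddings (OpenAI shown as an example provider)
EMBEDDINGS_API_KEY=sk-xxxxxxxx
EMBEDDINGS_MODEL=text-embedding-3-small
EMBEDDINGS_BASE_URL=https://api.openai.com/v1

# Throttle LLM summary batching
MAX_REQUESTS_PER_SECOND=15
```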

  2. Run locally with Docker Compose
docker compose up -d

# If you want to see logs
docker compose up

You should now have:

  • The web UI at http://localhost:3000
  • The FastAPI ingestion service (reachable from the web container at http://api:8000)
  • A local Postgres database at postgresql://postgres:postgres@db:5432/app

Updating Docs

To refresh a repository’s docs after code changes, remove the repo and re‑ingest it (temporary workflow):

  • UI: delete the repo in your Workspace, then add it again (ingestion starts automatically).

We’re actively adding a one‑click "Resync" button in the UI, with automatic periodic re‑ingestion to follow.

Using the MCP Server

The MCP server is available at http://localhost:3000/api/mcp and is designed for coding agents and MCP‑compatible clients. It exposes a codebase-qna tool that answers repository‑scoped questions by querying the analysis databases that AutoDocs produces.

Tips

  • Point your MCP client at http://localhost:3000/api/mcp.
  • Include an x-repo-id header with the repo ID (you can find it in the UI).
  • For setup guides with popular tools (Claude, Cursor, Continue), see https://docs.trysita.com
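As a sketch of what an MCP-compatible client sends, the snippet below builds a JSON-RPC 2.0 `tools/call` request for the `codebase-qna` tool with the `x-repo-id` header. The argument name `question` is an assumption for illustration, and a real MCP client also performs an initialization handshake first; see the docs linked above for supported clients:

```python
import json
from urllib import request

MCP_URL = "http://localhost:3000/api/mcp"

def build_mcp_call(repo_id: str, question: str) -> tuple[dict, dict]:
    """Build headers and a JSON-RPC 2.0 payload for the codebase-qna tool."""
    headers = {
        "Content-Type": "application/json",
        "x-repo-id": repo_id,  # repo ID from the AutoDocs UI
    }
    payload = {
        "jsonrpc": "2.0",
        "id": 1,
        "method": "tools/call",
        "params": {
            "name": "codebase-qna",
            # "question" is an assumed argument name for this sketch.
            "arguments": {"question": question},
        },
    }
    return headers, payload

headers, payload = build_mcp_call("my-repo-id", "Where is ingestion triggered?")
req = request.Request(MCP_URL, data=json.dumps(payload).encode(),
                      headers=headers, method="POST")
# Uncomment to send against a running local stack:
# with request.urlopen(req) as resp:
#     print(resp.read().decode())
```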

Development Workflow (for contributing)

For a local dev loop without Docker Compose you can run the API and web dev servers directly:

# concurrent dev (API + Web + DB)
./tools/dev.sh --sync

Database migration (run if modifying the postgres schema)

cd packages/shared
DATABASE_URL=postgresql://postgres:postgres@localhost:5432/app pnpm drizzle-kit push --config drizzle.main.config.ts

Project Layout

  • ingestion/ — Python FastAPI service, AST parser, graph builder, embeddings, and search.
  • webview/ — Next.js app (Turborepo workspace) and shared TS packages.
  • docker-compose.yml — local Postgres, API, and Web services.
  • tools/ — helper scripts (build_all.sh, dev.sh, uv_export_requirements.sh).

Troubleshooting

  • Web can’t reach the API: ensure INGESTION_API_URL=http://api:8000 is set in .env.
  • Missing uv/pnpm: install them (see links above).

Known Issues

  • Code must live at the repository root, not in a nested folder.
  • Language support: TS, JS, and Python today; expansion to Go, Kotlin, Java, and Rust is in progress.
  • Polyglot repos (multiple languages in one repo): not supported yet, but we’re actively working on it.

License

Licensed under the Apache 2.0 License. See LICENSE.
