
aidermacs
Emacs AI Pair Programming Solution
Stars: 361

Aidermacs brings AI pair programming to Emacs by integrating Aider, a powerful open-source tool with top performance on SWE-bench, multi-file edits, real-time file synchronization, and broad language support. Aidermacs delivers an Emacs-centric experience with intelligent model selection, flexible terminal backend support, smarter syntax highlighting, enhanced file management, and streamlined transient menus. The project thrives on community involvement, encouraging code contributions, issue reports, idea sharing, and documentation improvements.
README:
Aidermacs brings AI-powered development to Emacs by integrating Aider, one of the most powerful open-source AI pair programming tools. If you're missing Cursor but prefer living in Emacs, Aidermacs provides similar AI capabilities while staying true to Emacs workflows.
- Intelligent model selection with multiple backends
- Built-in Ediff integration for AI-generated changes
- Enhanced file management from Emacs
- Great customizability and flexible ways to add content
Here's what the community is saying about Aidermacs:
"Are you using aidermacs? For me superior to cursor." - u/berenddeboer
"This is amazing... every time I upgrade my packages I see your new commits. I feel this the authentic aider for emacs" - u/wchmbo
"Between Aidermacs and Gptel it's wild how bleeding edge Emacs is with this stuff. My workplace is exploring MCP registries and even clients that are all the rage (E.g Cursor) lag behind what I can do with mcp.el and gptel for tool use." - u/no_good_names_avail
"This looks amazing... I have been using ellama with local llms, looks like that will work here too. Great stuff!!" - u/lugpocalypse
"Honestly huge fan of this. Thank you for the updates!" - u/ieoa
- Requirements
- Download Aidermacs from MELPA or NonGNU ELPA, or clone it manually
- Adapt this sample config and place it in your Emacs `init.el`:
(use-package aidermacs
  :bind (("C-c a" . aidermacs-transient-menu))
  :config
  ;; Set API keys in your shell profile (e.g. .bashrc), where Aider will
  ;; pick them up automatically, or set them here in Elisp:
  (setenv "ANTHROPIC_API_KEY" "sk-...")
  ;; Define `my-get-openrouter-api-key' yourself elsewhere for security reasons
  (setenv "OPENROUTER_API_KEY" (my-get-openrouter-api-key))
  :custom
  ;; See the Configuration section below
  (aidermacs-use-architect-mode t)
  (aidermacs-default-model "sonnet"))
- Open a project and run `M-x aidermacs-transient-menu` or `C-c a` (wherever you bound it)
- Add files and start coding with AI!
The main interface to Aidermacs is through its transient menu system (similar to Magit). Access it with:
M-x aidermacs-transient-menu
Or bind it to a key in your config:
(global-set-key (kbd "C-c a") 'aidermacs-transient-menu)
Once the transient menu is open, you can navigate and execute commands using the displayed keys. Here's a summary of the main menu structure:
- `a`: Start/Open Session (auto-detects project root)
- `.`: Start in Current Directory (good for monorepos)
- `l`: Clear Chat History
- `s`: Reset Session
- `x`: Exit Session

- `1`: Code Mode
- `2`: Chat/Ask Mode
- `3`: Architect Mode
- `4`: Help Mode

- `^`: Show Last Commit (if auto-commits enabled)
- `u`: Undo Last Commit (if auto-commits enabled)
- `R`: Refresh Repo Map
- `h`: Session History
- `o`: Change Main Model
- `?`: Aider Meta-level Help

- `f`: Add File (`C-u`: read-only)
- `F`: Add Current File
- `d`: Add From Directory (same type)
- `w`: Add From Window
- `m`: Add From Dired (marked)
- `j`: Drop File
- `J`: Drop Current File
- `k`: Drop From Dired (marked)
- `K`: Drop All Files
- `S`: Create Session Scratchpad
- `G`: Add File to Session
- `A`: List Added Files

- `c`: Code Change
- `e`: Question Code
- `r`: Architect Change
- `q`: General Question
- `p`: Question This Symbol
- `g`: Accept Proposed Changes
- `i`: Implement TODO
- `t`: Write Test
- `T`: Fix Test
- `!`: Debug Exception
The `All File Actions` and `All Code Actions` entries open submenus with more specialized commands. Use the displayed keys to navigate these submenus.
When using Aidermacs, you have the flexibility to decide which files the AI should read and edit. Here are some guidelines:
- Editable Files: Add files you want the AI to potentially edit. This grants the AI permission to both read and modify these files if necessary.
- Read-Only Files: If you want the AI to read a file without editing it, add it as read-only. In Aidermacs, every add-file command can be prefixed with `C-u` to specify read-only access.
- Session Scratchpads: Use the session scratchpad (`S`) to paste notes or documentation that will be fed to the AI as read-only.
- External Files: The "Add File to Session" (`G`) command lets you include files outside the current project (or files in `.gitignore`), since Aider doesn't automatically include these files in its context.
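As a small illustration of the read-only convention, the `C-u` prefix can also be supplied programmatically. This is a sketch only: the command name `aidermacs-add-current-file` is an assumption based on the transient menu's "Add Current File" entry.

```emacs-lisp
;; Hypothetical sketch: add the current buffer's file as read-only by
;; simulating the C-u prefix argument. `aidermacs-add-current-file' is
;; assumed from the "Add Current File" menu entry above.
(defun my/aidermacs-add-current-file-read-only ()
  "Add the file visited by the current buffer to Aider as read-only."
  (interactive)
  (let ((current-prefix-arg '(4)))  ; equivalent to pressing C-u
    (call-interactively #'aidermacs-add-current-file)))
```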
The AI can sometimes determine relevant files on its own, depending on the model and the context of the codebase. However, for precise control, it's often beneficial to manually specify files, especially when dealing with complex projects.
Aider encourages a collaborative approach, similar to working with a human co-worker. Sometimes the AI will need explicit guidance, while other times it can infer the necessary context on its own.
Aidermacs provides a minor mode that makes it easy to work with prompt files and other Aider-related files. When enabled, the minor mode provides these convenient keybindings:
- `C-c C-n` or `C-<return>`: Send line/region line-by-line
- `C-c C-c`: Send block/region as a whole
- `C-c C-z`: Switch to the Aidermacs buffer
The minor mode is automatically enabled for:
- `.aider.prompt.org` files (create with `M-x aidermacs-open-prompt-file`)
- `.aider.chat.md` files
- `.aider.chat.history.md` files
- `.aider.input.history` files
You can use the `aidermacs-before-run-backend-hook` to run custom setup code before starting the Aider backend. This is particularly useful for:
- Setting environment variables
- Injecting secrets
- Performing any other pre-run configuration
Example usage to securely set an OpenAI API key from password-store:
(add-hook 'aidermacs-before-run-backend-hook
          (lambda ()
            (setenv "OPENAI_API_KEY" (password-store-get "code/openai_api_key"))))
This approach keeps sensitive information out of your dotfiles while still making it available to Aidermacs.
You can customize the default AI model used by Aidermacs by setting the `aidermacs-default-model` variable:
(setq aidermacs-default-model "sonnet")
This enables easy switching between different AI models without modifying the `aidermacs-extra-args` variable.
Note: This setting is overridden if an `.aider.conf.yml` file exists (see details).
Aidermacs offers intelligent model selection for solo (non-Architect) mode, automatically detecting and integrating with multiple AI providers:
- Automatically fetches available models from supported providers (OpenAI, Anthropic, DeepSeek, Google Gemini, OpenRouter)
- Caches model lists for quick access
- Supports both popular pre-configured models and dynamically discovered ones
- Handles API keys and authentication automatically from your .bashrc
- Provides model compatibility checking
Note that dynamic model selection applies only to solo (non-Architect) mode.
To change models in solo mode:
- Use `M-x aidermacs-change-model` or press `o` in the transient menu
- Select from either:
  - Popular pre-configured models (fast)
  - Dynamically fetched models from all supported providers (comprehensive)
The system will automatically filter models to only show ones that are:
- Supported by your current Aider version
- Available through your configured API keys
- Compatible with your current workflow
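If you prefer a shorter pre-configured list than the default, you could override it. This is a sketch under an assumption: the variable name `aidermacs-popular-models` is inferred from the "popular pre-configured models" feature described above and may differ in your version; the model names below are the ones used elsewhere in this document.

```emacs-lisp
;; Assumed variable: `aidermacs-popular-models' holds the pre-configured
;; completion candidates offered before any dynamic fetch.
(setq aidermacs-popular-models
      '("sonnet"                    ; Anthropic shortcut understood by Aider
        "deepseek/deepseek-chat"
        "deepseek/deepseek-reasoner"))
```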
Aidermacs features an experimental mode using two specialized models for each coding task: an Architect model for reasoning and an Editor model for code generation. This approach has achieved state-of-the-art (SOTA) results on aider's code editing benchmark, as detailed in this blog post.
To enable this mode, set `aidermacs-use-architect-mode` to `t`. You must also configure the `aidermacs-architect-model` variable to specify the model to use for the Architect role.
By default, `aidermacs-editor-model` is the same as `aidermacs-default-model`. You only need to set `aidermacs-editor-model` if you want to use a different model for the Editor role.
When Architect mode is enabled, the `aidermacs-default-model` setting is ignored, and `aidermacs-architect-model` and `aidermacs-editor-model` are used instead.
(setq aidermacs-use-architect-mode t)
You can switch to it persistently with `M-x aidermacs-switch-to-architect-mode` (`3` in `aidermacs-transient-menu`), or temporarily with `M-x aidermacs-architect-this-code` (`r` in `aidermacs-transient-menu`).
You can configure each model independently:
;; Default model used for all modes unless overridden
(setq aidermacs-default-model "sonnet")
;; Optional: Set specific model for architect reasoning
(setq aidermacs-architect-model "deepseek/deepseek-reasoner")
;; Optional: Set specific model for code generation
(setq aidermacs-editor-model "deepseek/deepseek-chat")
The model hierarchy works as follows:
- When Architect mode is enabled:
  - The Architect model handles high-level reasoning and solution design
  - The Editor model executes the actual code changes
- When Architect mode is disabled, only `aidermacs-default-model` is used
- You can configure specific models or let them automatically fall back to the default model

Models will reflect changes to `aidermacs-default-model` unless they've been explicitly set to a different value.
Note: These settings are overridden if an `.aider.conf.yml` file exists (see details).
The Weak model is used for commit messages (if you have `aidermacs-auto-commits` set to `t`) and for chat-history summarization (the default depends on `--model`). You can customize it with:
;; default to nil
(setq aidermacs-weak-model "deepseek/deepseek-chat")
You can change the Weak model during a session with `C-u o` (`aidermacs-change-model` with a prefix argument). In most cases you won't need to change this, as Aider automatically selects an appropriate Weak model based on your main model.
Note: These settings are overridden if an `.aider.conf.yml` file exists (see details).
By default, Aidermacs requires explicit confirmation before applying changes proposed in Architect mode. This gives you a chance to review the AI's plan before any code is modified.
If you prefer to automatically accept all Architect mode changes without confirmation (similar to Aider's default behavior), you can enable this with:
(setq aidermacs-auto-accept-architect t)
Note: These settings are overridden if an `.aider.conf.yml` file exists (see details).
Choose your preferred terminal backend by setting `aidermacs-backend`: `vterm` offers better terminal compatibility, while `comint` provides a simple, built-in option that remains fully compatible with Aidermacs.
;; Use vterm backend (default is comint)
(setq aidermacs-backend 'vterm)
Available backends:
- `comint` (default): Uses Emacs' built-in terminal emulation
- `vterm`: Leverages vterm for better terminal compatibility
The vterm backend will use the faces defined by your active Emacs theme to set the colors for aider. It tries to guess some reasonable color values based on your themes. In some cases this will not work perfectly; if text is unreadable for you, you can turn this off as follows:
;; Don't match Emacs theme colors
(setopt aidermacs-vterm-use-theme-colors nil)
You can customize the keybinding for multiline input; this key lets you insert a newline without sending the command to Aider. Press `RET` as usual to send the command.
;; Comint backend:
(setq aidermacs-comint-multiline-newline-key "S-<return>")
;; Vterm backend:
(setq aidermacs-vterm-multiline-newline-key "S-<return>")
Aidermacs fully supports working with remote files through Emacs' Tramp mode. This allows you to use Aidermacs on files hosted on remote servers via SSH, Docker, and other protocols supported by Tramp.
When working with remote files:
- File paths are automatically localized for the remote system
- All Aidermacs features work seamlessly across local and remote files
- Edits are applied directly to the remote files
- Diffs and change reviews work as expected
Example usage:
;; Open a remote file via SSH
(find-file "/ssh:user@remotehost:/path/to/file.py")
;; Start Aidermacs session - it will automatically detect the remote context
M-x aidermacs-transient-menu
Aidermacs makes it easy to reuse prompts through:
- Prompt History: Your previously used prompts are saved and can be quickly selected
- Common Prompts: A curated list of frequently used prompts for common tasks, defined in `aidermacs-common-prompts`
When entering a prompt, you can:
- Select from your history or common prompts using completion
- Still type custom prompts when needed
The prompt history and common prompts are available across all sessions.
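For example, you might extend the curated list with project-specific instructions. This is a sketch under an assumption: the section above names `aidermacs-common-prompts` but does not show its shape, so a list of strings (matching the completion-based selection) is assumed, and the prompt texts are illustrative.

```emacs-lisp
;; Sketch: extend the curated prompt list with your own entries.
;; Assumes `aidermacs-common-prompts' is a list of prompt strings.
(setq aidermacs-common-prompts
      (append aidermacs-common-prompts
              '("Add docstrings to all public functions"
                "Refactor this code to reduce duplication"
                "Explain what this code does")))
```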
Control whether to show diffs for AI-generated changes with `aidermacs-show-diff-after-change`:
;; Enable/disable showing diffs after changes (default: t)
(setq aidermacs-show-diff-after-change t)
When enabled, Aidermacs will:
- Capture the state of files before AI edits
- Show diffs using Emacs' built-in ediff interface
- Allow you to review and accept/reject changes
Aider automatically commits AI-generated changes by default. We consider this behavior very intrusive, so we've disabled it. You can re-enable auto-commits by setting `aidermacs-auto-commits` to `t`:
;; Enable auto-commits
(setq aidermacs-auto-commits t)
With auto-commits disabled, you must manually commit changes using your preferred Git workflow.
Note: This setting is overridden if an `.aider.conf.yml` file exists (see details).
If these options aren't sufficient, the `aidermacs-extra-args` variable lets you pass any Aider-supported command-line option.
See the Aider configuration documentation for a full list of available options.
;; Set the verbosity:
(setq aidermacs-extra-args '("--verbose"))
These arguments are appended to the Aider command when it runs. Note that the `--model` argument is handled automatically by `aidermacs-default-model` and should not be included in `aidermacs-extra-args`.
Aidermacs supports project-specific configuration via `.aider.conf.yml` files. To enable this:
- Create a `.aider.conf.yml` in your home directory, your project's root, or the current directory, defining your desired settings. See the Aider documentation for available options.
- Tell Aidermacs to use the config file in one of two ways:

;; Set the `aidermacs-config-file` variable in your Emacs config:
(setq aidermacs-config-file "/path/to/your/project/.aider.conf.yml")
;; *Or*, include the `--config` or `-c` flag in `aidermacs-extra-args`:
(setq aidermacs-extra-args '("--config" "/path/to/your/project/.aider.conf.yml"))
Note: You can also rely on Aider's default behavior of automatically searching for `.aider.conf.yml` in the home directory, project root, or current directory, in that order. In this case, you do not need to set `aidermacs-config-file` or include `--config` in `aidermacs-extra-args`.
- Important: When using a config file, all other Aidermacs configuration variables that supply an argument option (e.g., `aidermacs-default-model`, `aidermacs-architect-model`, `aidermacs-use-architect-mode`) are IGNORED. Aider will only use the settings specified in your `.aider.conf.yml` file. Do not attempt to combine these Emacs settings with a config file, as the results will be unpredictable.
- Precedence: Settings in `.aider.conf.yml` always take precedence when a config file is explicitly specified.
- Avoid Conflicts: When using a config file, do not include model-related arguments (like `--model`, `--architect`, etc.) in `aidermacs-extra-args`. Configure all settings within your `.aider.conf.yml` file.
Aider can work with Claude 3.7 Sonnet's thinking tokens. You can enable and configure thinking tokens using the following methods:
- In-Chat Command: Use the `/think-tokens` command followed by the desired token budget, for example `/think-tokens 8k` or `/think-tokens 10000`. Supported formats include `8096`, `8k`, `10.5k`, and `0.5M`.
- Command-Line Argument: Set the `--thinking-tokens` argument when starting Aidermacs, for example by adding it to `aidermacs-extra-args`:

(setq aidermacs-extra-args '("--thinking-tokens" "16k"))
These methods provide a streamlined way to control thinking tokens without manually editing `.aider.model.settings.yml` files.
Note: If you are using an `.aider.conf.yml` file, you can also set the `thinking_tokens` option there.
The `.aider.prompt.org` file is particularly useful for:
- Storing frequently used prompts
- Documenting common workflows
- Quick access to complex instructions
You can customize which files automatically enable the minor mode by configuring `aidermacs-auto-mode-files`:
(setq aidermacs-auto-mode-files
      '(".aider.prompt.org"
        ".aider.chat.md"
        ".aider.chat.history.md"
        ".aider.input.history"
        "my-custom-aider-file.org")) ;; Add your own files
Please check Aider's FAQ for Aider-related questions.
Yes! Aidermacs supports any OpenAI-compatible API endpoint. Check Aider documentation on Ollama and LiteLLM.
Yes, the code you add to the session is sent to the AI provider. Be mindful of sensitive code.
Aider currently supports only Python 3.12; you can use uv to install it:
uv tool install --force --python python3.12 aider-chat@latest
If you encounter a proxy-related issue, such as an error indicating that the 'socksio' package is not installed, use:
uv tool install --force --python python3.12 aider-chat@latest --with 'httpx[socks]'
Then point Aidermacs at the installed binary with the following config:
(setq aidermacs-program (expand-file-name "~/.local/bin/aider"))
Aidermacs thrives on community involvement. We believe collaborative development with user and contributor input creates the best software. We encourage you to:
- Contribute Code: Submit pull requests with bug fixes, new features, or improvements to existing functionality.
- Report Issues: Let us know about any bugs, unexpected behavior, or feature requests through GitHub Issues.
- Share Ideas: Participate in discussions and propose new ideas for making Aidermacs even better.
- Improve Documentation: Help us make the documentation clearer, more comprehensive, and easier to use.
Your contributions are essential for making Aidermacs the best AI pair programming tool in Emacs!
Alternative AI tools for aidermacs
Similar Open Source Tools


openedai-speech
OpenedAI Speech is a free, private text-to-speech server compatible with the OpenAI audio/speech API. It offers custom voice cloning and supports various models like tts-1 and tts-1-hd. Users can map their own piper voices and create custom cloned voices. The server provides multilingual support with XTTS voices and allows fixing incorrect sounds with regex. Recent changes include bug fixes, improved error handling, and updates for multilingual support. Installation can be done via Docker or manual setup, with usage instructions provided. Custom voices can be created using Piper or Coqui XTTS v2, with guidelines for preparing audio files. The tool is suitable for tasks like generating speech from text, creating custom voices, and multilingual text-to-speech applications.

RA.Aid
RA.Aid is an AI software development agent powered by `aider` and advanced reasoning models like `o1`. It combines `aider`'s code editing capabilities with LangChain's agent-based task execution framework to provide an intelligent assistant for research, planning, and implementation of multi-step development tasks. It handles complex programming tasks by breaking them down into manageable steps, running shell commands automatically, and leveraging expert reasoning models like OpenAI's o1. RA.Aid is designed for everyday software development, offering features such as multi-step task planning, automated command execution, and the ability to handle complex programming tasks beyond single-shot code edits.

org-ai
org-ai is a minor mode for Emacs org-mode that provides access to generative AI models, including OpenAI API (ChatGPT, DALL-E, other text models) and Stable Diffusion. Users can use ChatGPT to generate text, have speech input and output interactions with AI, generate images and image variations using Stable Diffusion or DALL-E, and use various commands outside org-mode for prompting using selected text or multiple files. The tool supports syntax highlighting in AI blocks, auto-fill paragraphs on insertion, and offers block options for ChatGPT, DALL-E, and other text models. Users can also generate image variations, use global commands, and benefit from Noweb support for named source blocks.

cursor-tools
cursor-tools is a CLI tool designed to enhance AI agents with advanced skills, such as web search, repository context, documentation generation, GitHub integration, Xcode tools, and browser automation. It provides features like Perplexity for web search, Gemini 2.0 for codebase context, and Stagehand for browser operations. The tool requires API keys for Perplexity AI and Google Gemini, and supports global installation for system-wide access. It offers various commands for different tasks and integrates with Cursor Composer for AI agent usage.

shellChatGPT
ShellChatGPT is a shell wrapper for OpenAI's ChatGPT, DALL-E, Whisper, and TTS, featuring integration with LocalAI, Ollama, Gemini, Mistral, Groq, and GitHub Models. It provides text and chat completions, vision, reasoning, and audio models, voice-in and voice-out chatting mode, text editor interface, markdown rendering support, session management, instruction prompt manager, integration with various service providers, command line completion, file picker dialogs, color scheme personalization, stdin and text file input support, and compatibility with Linux, FreeBSD, MacOS, and Termux for a responsive experience.

yoyak
Yoyak is a small CLI tool powered by LLM for summarizing and translating web pages. It provides shell completion scripts for bash, fish, and zsh. Users can set the model they want to use and summarize web pages with the 'yoyak summary' command. Additionally, translation to other languages is supported using the '-l' option with ISO 639-1 language codes. Yoyak supports various models for summarization and translation tasks.

llm-term
LLM-Term is a Rust-based CLI tool that generates and executes terminal commands using OpenAI's language models or local Ollama models. It offers configurable model and token limits, works on both PowerShell and Unix-like shells, and provides a seamless user experience for generating commands based on prompts. Users can easily set up the tool, customize configurations, and leverage different models for command generation.

podscript
Podscript is a tool designed to generate transcripts for podcasts and similar audio files using Large Language Models (LLMs) and Speech-to-Text (STT) APIs. It provides a command-line interface (CLI) for transcribing audio from various sources, including YouTube videos and audio files, using different speech-to-text services like Deepgram, Assembly AI, and Groq. Additionally, Podscript offers a web-based user interface for convenience. Users can configure keys for supported services, transcribe audio, and customize the transcription models. The tool aims to simplify the process of creating accurate transcripts for audio content.

llm-vscode
llm-vscode is an extension designed for all things LLM, utilizing llm-ls as its backend. It offers features such as code completion with 'ghost-text' suggestions, the ability to choose models for code generation via HTTP requests, ensuring prompt size fits within the context window, and code attribution checks. Users can configure the backend, suggestion behavior, keybindings, llm-ls settings, and tokenization options. Additionally, the extension supports testing models like Code Llama 13B, Phind/Phind-CodeLlama-34B-v2, and WizardLM/WizardCoder-Python-34B-V1.0. Development involves cloning llm-ls, building it, and setting up the llm-vscode extension for use.

mcpdoc
The MCP LLMS-TXT Documentation Server is an open-source server that provides developers full control over tools used by applications like Cursor, Windsurf, and Claude Code/Desktop. It allows users to create a user-defined list of `llms.txt` files and use a `fetch_docs` tool to read URLs within these files, enabling auditing of tool calls and context returned. The server supports various applications and provides a way to connect to them, configure rules, and test tool calls for tasks related to documentation retrieval and processing.

ice-score
ICE-Score is a tool designed to instruct large language models to evaluate code. It provides a minimum viable product (MVP) for evaluating generated code snippets using inputs such as problem, output, task, aspect, and model. Users can also evaluate with reference code and enable zero-shot chain-of-thought evaluation. The tool is built on codegen-metrics and code-bert-score repositories and includes datasets like CoNaLa and HumanEval. ICE-Score has been accepted to EACL 2024.

generative-models
Generative Models by Stability AI is a repository that provides various generative models for research purposes. It includes models like Stable Video 4D (SV4D) for video synthesis, Stable Video 3D (SV3D) for multi-view synthesis, SDXL-Turbo for text-to-image generation, and more. The repository focuses on modularity and implements a config-driven approach for building and combining submodules. It supports training with PyTorch Lightning and offers inference demos for different models. Users can access pre-trained models like SDXL-base-1.0 and SDXL-refiner-1.0 under a CreativeML Open RAIL++-M license. The codebase also includes tools for invisible watermark detection in generated images.

hayhooks
Hayhooks is a tool that simplifies the deployment and serving of Haystack pipelines as REST APIs. It allows users to wrap their pipelines with custom logic and expose them via HTTP endpoints, including OpenAI-compatible chat completion endpoints. With Hayhooks, users can easily convert their Haystack pipelines into API services with minimal boilerplate code.

llm-context.py
LLM Context is a tool designed to assist developers in quickly injecting relevant content from code/text projects into Large Language Model chat interfaces. It leverages `.gitignore` patterns for smart file selection and offers a streamlined clipboard workflow using the command line. The tool also provides direct integration with Large Language Models through the Model Context Protocol (MCP). LLM Context is optimized for code repositories and collections of text/markdown/html documents, making it suitable for developers working on projects that fit within an LLM's context window. The tool is under active development and aims to enhance AI-assisted development workflows by harnessing the power of Large Language Models.

HuggingFaceModelDownloader
The HuggingFace Model Downloader is a utility tool for downloading models and datasets from the HuggingFace website. It offers multithreaded downloading for LFS files and ensures the integrity of downloaded models with SHA256 checksum verification. The tool provides features such as nested file downloading, filter downloads for specific LFS model files, support for HuggingFace Access Token, and configuration file support. It can be used as a library or a single binary for easy model downloading and inference in projects.
For similar tasks


sweep
Sweep is an AI junior developer that turns bugs and feature requests into code changes. It automatically handles developer experience improvements like adding type hints and improving test coverage.

sourcegraph
Sourcegraph is a code search and navigation tool that helps developers read, write, and fix code in large, complex codebases. It provides features such as code search across all repositories and branches, code intelligence for navigation and refactoring, and the ability to fix and refactor code across multiple repositories at once.

continue
Continue is an open-source autopilot for VS Code and JetBrains that allows you to code with any LLM. With Continue, you can ask coding questions, edit code in natural language, generate files from scratch, and more. Continue is easy to use and can help you save time and improve your coding skills.

cody
Cody is a free, open-source AI coding assistant that can write and fix code, provide AI-generated autocomplete, and answer your coding questions. Cody fetches relevant code context from across your entire codebase to write better code that uses more of your codebase's APIs, impls, and idioms, with less hallucination.

awesome-code-ai
A curated list of AI coding tools, including code completion, refactoring, and assistants. This list includes both open-source and commercial tools, as well as tools that are still in development. Some of the most popular AI coding tools include GitHub Copilot, CodiumAI, Codeium, Tabnine, and Replit Ghostwriter.

commanddash
Dash AI is an open-source coding assistant for Flutter developers. It is designed to not only write code but also run and debug it, allowing it to assist beyond code completion and automate routine tasks. Dash AI is powered by Gemini, integrated with the Dart Analyzer, and specifically tailored for Flutter engineers. The vision for Dash AI is to create a single-command assistant that can automate tedious development tasks, enabling developers to focus on creativity and innovation. It aims to assist with the entire process of engineering a feature for an app, from breaking down the task into steps to generating exploratory tests and iterating on the code until the feature is complete. To achieve this vision, Dash AI is working on providing LLMs with the same access and information that human developers have, including full contextual knowledge, the latest syntax and dependencies data, and the ability to write, run, and debug code. Dash AI welcomes contributions from the community, including feature requests, issue fixes, and participation in discussions. The project is committed to building a coding assistant that empowers all Flutter developers.

mentat
Mentat is an AI tool designed to assist with coding tasks directly from the command line. It combines human creativity with computer-like processing to help users understand new codebases, add new features, and refactor existing code. Unlike other tools, Mentat coordinates edits across multiple locations and files, with the context of the project already in mind. The tool aims to enhance the coding experience by providing seamless assistance and improving edit quality.
For similar jobs

sweep
Sweep is an AI junior developer that turns bugs and feature requests into code changes. It automatically handles developer experience improvements like adding type hints and improving test coverage.

teams-ai
The Teams AI Library is a software development kit (SDK) that helps developers create bots that can interact with Teams and Microsoft 365 applications. It is built on top of the Bot Framework SDK and simplifies the process of developing bots that interact with Teams' artificial intelligence capabilities. The SDK is available for JavaScript/TypeScript, .NET, and Python.

ai-guide
This guide is dedicated to Large Language Models (LLMs) that you can run on your home computer. It assumes your PC is a lower-end, non-gaming setup.

classifai
Supercharge WordPress Content Workflows and Engagement with Artificial Intelligence. Tap into leading cloud-based services like OpenAI, Microsoft Azure AI, Google Gemini and IBM Watson to augment your WordPress-powered websites. Publish content faster while improving SEO performance and increasing audience engagement. ClassifAI integrates Artificial Intelligence and Machine Learning technologies to lighten your workload and eliminate tedious tasks, giving you more time to create original content that matters.

chatbot-ui
Chatbot UI is an open-source AI chat app that allows users to create and deploy their own AI chatbots. It is easy to use and can be customized to fit any need. Chatbot UI is perfect for businesses, developers, and anyone who wants to create a chatbot.

BricksLLM
BricksLLM is a cloud native AI gateway written in Go. Currently, it provides native support for OpenAI, Anthropic, Azure OpenAI and vLLM. BricksLLM aims to provide enterprise level infrastructure that can power any LLM production use cases. Here are some use cases for BricksLLM: * Set LLM usage limits for users on different pricing tiers * Track LLM usage on a per user and per organization basis * Block or redact requests containing PIIs * Improve LLM reliability with failovers, retries and caching * Distribute API keys with rate limits and cost limits for internal development/production use cases * Distribute API keys with rate limits and cost limits for students

uAgents
uAgents is a Python library developed by Fetch.ai that allows for the creation of autonomous AI agents. These agents can perform various tasks on a schedule or take action on various events. uAgents are easy to create and manage, and they are connected to a fast-growing network of other uAgents. They are also secure, with cryptographically secured messages and wallets.

griptape
Griptape is a modular Python framework for building AI-powered applications that securely connect to your enterprise data and APIs. It offers developers the ability to maintain control and flexibility at every step. Griptape's core components include Structures (Agents, Pipelines, and Workflows), Tasks, Tools, Memory (Conversation Memory, Task Memory, and Meta Memory), Drivers (Prompt and Embedding Drivers, Vector Store Drivers, Image Generation Drivers, Image Query Drivers, SQL Drivers, Web Scraper Drivers, and Conversation Memory Drivers), Engines (Query Engines, Extraction Engines, Summary Engines, Image Generation Engines, and Image Query Engines), and additional components (Rulesets, Loaders, Artifacts, Chunkers, and Tokenizers). Griptape enables developers to create AI-powered applications with ease and efficiency.