ai-code-interface.el

Unified Emacs interface supporting Claude Code, OpenAI Codex, Gemini CLI, GitHub Copilot CLI, Opencode, and more


AI Code Interface is an Emacs package designed for AI-assisted software development, providing a uniform interface for various AI backends. It offers context-aware AI coding actions and seamless integration with AI-driven agile development workflows. The package supports multiple AI coding CLIs such as Claude Code, Gemini CLI, OpenAI Codex, GitHub Copilot CLI, Opencode, Grok CLI, Cursor CLI, Kiro CLI, CodeBuddy Code CLI, and Aider CLI. It aims to streamline the use of different AI tools within Emacs while maintaining a consistent user experience.

README:

[[file:./icon.png]]

  • AI Code Interface

[[https://melpa.org/#/ai-code][https://melpa.org/packages/ai-code-badge.svg]] [[https://stable.melpa.org/#/ai-code][https://stable.melpa.org/packages/ai-code-badge.svg]] [[https://github.com/tninja/ai-code.el/graphs/contributors][https://img.shields.io/github/contributors/tninja/ai-code.el.svg]]

An Emacs interface for AI-assisted software development. The purpose is to provide a uniform interface and experience for different AI backends, with context-aware AI coding actions, and integrating seamlessly with AI-driven agile development workflows.

[[./transient_menu.png]]

** Installation

Enable installation of packages from MELPA by adding an entry to =package-archives= after =(require 'package)= and before the call to =package-initialize= in your init.el or .emacs file:

#+begin_src emacs-lisp
(require 'package)
(add-to-list 'package-archives '("melpa" . "https://melpa.org/packages/") t)
(package-initialize)
#+end_src

  • Use =M-x package-refresh-contents= or =M-x package-list-packages= to ensure that Emacs has fetched the MELPA package list
  • Use =M-x package-install= to install =ai-code= package
  • Import and configure =ai-code= in your init.el or .emacs file:

#+begin_src emacs-lisp
(use-package ai-code
  ;; :straight (:host github :repo "tninja/ai-code-interface.el")
  ;; ^ if you use straight to install, the MELPA setting above is not needed
  :config
  ;; Use codex as backend; other options are 'claude-code, 'gemini,
  ;; 'github-copilot-cli, 'opencode, 'grok, 'cursor, 'kiro, 'codebuddy,
  ;; 'aider, 'claude-code-ide, 'claude-code-el
  (ai-code-set-backend 'codex)
  ;; Enable global keybinding for the main menu
  (global-set-key (kbd "C-c a") #'ai-code-menu)
  ;; Optional: use eat if you prefer; the default is vterm. This setting
  ;; configures all natively supported CLIs. For external backends such as
  ;; claude-code-ide.el and claude-code.el, check their own configuration.
  ;; (setq ai-code-backends-infra-terminal-backend 'eat)
  ;; Optional: enable @ file completion in comments and AI sessions
  (ai-code-prompt-filepath-completion-mode 1)
  ;; Optional: ask AI to run tests after code changes, for a tighter build-test loop
  (setq ai-code-auto-test-type 'test-after-change)
  ;; Optional: in AI session buffers, SPC in Evil normal state triggers the prompt-enter UI
  (with-eval-after-load 'evil
    (ai-code-backends-infra-evil-setup))
  ;; Optional: turn on auto-revert so AI code changes automatically appear in buffers
  (global-auto-revert-mode 1)
  (setq auto-revert-interval 1) ;; 1 second for faster updates
  ;; (global-set-key (kbd "C-c a C") #'ai-code-toggle-filepath-completion)
  ;; Optional: set up Magit integration for AI commands in Magit popups
  (with-eval-after-load 'magit
    (ai-code-magit-setup-transients)))
#+end_src

** Dependencies

*** Required Dependencies

  • Emacs 28.1 or later
  • org: Org-mode support
  • magit: Git integration
  • transient: For the menu system
  • vterm (default) or eat: one of these terminal emulators must be installed to run the AI coding CLI backends.

*** Optional Dependencies

  • helm: For an enhanced auto-completion experience (ai-code-input.el).
  • yasnippet: For snippet support in the prompt file. A library of snippets is included.
  • gptel: For intelligent, AI-generated headlines in the prompt file.
  • flycheck: To enable the ai-code-flycheck-fix-errors-in-scope command.
  • projectile: For project root initialization.
  • helm-gtags: For tags creation.
  • python-pytest: For running python tests in the TDD workflow.
  • jest: For running JavaScript / TypeScript tests in the TDD workflow.

** Key Features

  • Transient-Driven Hub (C-c a): One keystroke opens a contextual transient menu that groups every capability (CLI control, code actions, agile workflows, utilities) so you never need to memorize scattered keybindings.
  • AI CLI Session Management: Start (a), resume (R), or jump back into (z) the active AI CLI buffer, instantly swap backends (s), upgrade them (u), edit backend configs (g), and run prompts against the current file (|). It supports multiple sessions per project.
  • Context-Aware Code Actions: The menu exposes dedicated entries for changing code (c), implementing TODOs (i), asking questions (q), explaining code (x), sending free-form commands (<SPC>), and refreshing AI context (@). Each command automatically captures the surrounding function, region, or clipboard contents (via C-u) to keep prompts precise.
  • Agile Development Workflows: Use the refactoring navigator (r), the guided TDD cycle (t), and the pull/review diff helper (v) to keep AI-assisted work aligned with agile best practices. Prompt authoring is first-class through quick access to the prompt file (p), block sending (b), and AI-assisted shell/file execution (!).
  • Productivity & Debugging Utilities: Initialize project navigation assets (.), investigate exceptions (e), auto-fix Flycheck issues in scope (f), copy or open file paths formatted for prompts (k, o), generate MCP inspector commands (m), capture session notes straight into Org (n), and toggle desktop notifications (N) to get alerted when AI responses are ready in background sessions.
  • Seamless Prompt Management: Open the prompt file via ai-code-open-prompt-file (stored under .ai.code.files/.ai.code.prompt.org by default), send regions with ai-code-prompt-send-block, and reuse prompt snippets via yasnippet to keep conversations organized.
  • Interactive Chat & Context Tools: Dedicated buffers hold long-running chats, automatically enriched with file paths, diffs, and history from Magit or Git commands for richer AI responses.
  • AI-Assisted Bash Commands: From Dired, shell, eshell, or vterm, run C-c a ! and type natural-language commands prefixed with : (e.g., :count lines of python code recursively); the tool generates the shell command for review and executes it in a compile buffer.

*** Typical Workflows Example

  • Changing Code: Position the cursor on a function or select a region of code. Press C-c a, then c (ai-code-code-change). Describe the change you want to make in the prompt. The AI will receive the context of the function or region and your instruction.
  • Implementing a TODO: Write a comment in your code, like ;; TODO: Implement caching for this function. Place your cursor on that line and press C-c a, then i (ai-code-implement-todo). The AI will generate the implementation based on the comment.
  • Asking a Question: Place your cursor within a function, press C-c a, then q (ai-code-ask-question), type your question, and press Enter. The question, along with context, will be sent to the AI.
  • Refactoring a Function: With the cursor in a function, press C-c a, then r (ai-code-refactor-book-method). Select a refactoring technique from the list, provide any required input (e.g., a new method name), and the prompt will be generated.
  • Automatically run tests after change: When ai-code-auto-test-type is non-nil, AI will automatically run tests after code changes and follow up on results.
  • Reviewing a Pull Request: Press C-c a, then v (ai-code-pull-or-review-diff-file). Choose to generate a diff between two branches. The diff will be created in a new buffer, and you'll be prompted to start a review.
  • Multiple Sessions Support: Start additional AI coding sessions with C-c a a after launching the first one. Select the active session with C-c a z. Prompts from the commands above are sent to the selected session.

*** Context Engineering

Context engineering is the deliberate practice of selecting, structuring, and delivering the right information to an AI model so the output is specific, accurate, and actionable. For AI-assisted programming, the model cannot read your whole codebase by default, so the quality of the result depends heavily on the clarity and relevance of the provided context (file paths, functions, regions, related files, and repo-level notes). Good context engineering reduces ambiguity, prevents irrelevant suggestions, and keeps changes aligned with the current code.

This package makes context engineering easy by automatically assembling precise context blocks and letting you curate additional context on demand:

  • Automatic file and window context: prompts can include the current file and other visible files (ai-code--get-context-files-string), so the AI sees related code without manual copying.
  • Function or region scoping: most actions capture the current function or active region, keeping requests focused (e.g., ai-code-code-change, ai-code-implement-todo, ai-code-ask-question).
  • Manual context curation: C-c a @ (ai-code-context-action) stores file paths, function anchors, or region ranges in a repo-scoped list, which is appended to prompts via ai-code--format-repo-context-info.
  • Optional clipboard context: prefix with C-u to append clipboard content to prompts for external snippets or logs.
  • @-triggered filepath completion in comments and AI sessions. Type @ to open a completion list of recent and visible repo files, then select a path to insert.
  • Prompt suffix guardrails: set ai-code-prompt-suffix to append persistent constraints to every prompt (when ai-code-use-prompt-suffix is non-nil). Example: (setq ai-code-prompt-suffix "Only use English in code file, but Reply in Simplified Chinese language").
  • Optional GPTel headline generation: set ai-code-use-gptel-headline to auto-generate prompt headings with GPTel.
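Taken together, the context options above can be enabled with a few =setq= forms in your init file. This is a sketch using only the variables and modes named in this section; adjust the suffix string to your own guardrails:

#+begin_src emacs-lisp
;; Append a persistent guardrail to every prompt
(setq ai-code-use-prompt-suffix t)
(setq ai-code-prompt-suffix
      "Only use English in code file, but Reply in Simplified Chinese language")
;; Auto-generate prompt headings with GPTel (requires gptel)
(setq ai-code-use-gptel-headline t)
;; Enable @-triggered filepath completion in comments and AI sessions
(ai-code-prompt-filepath-completion-mode 1)
#+end_src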

Example (focused refactor with curated context):

  1. In a buffer, run C-c a @ to add the current function or selected region to stored repo context.
  2. Open another related file in a window so it is picked up by ai-code--get-context-files-string.
  3. Place the cursor in the target function and run C-c a c to request a change. The generated prompt will include the function/region scope, visible file list, and stored repo context entries, giving the AI exactly the surrounding information it needs.

*** Build / Test Feedback Loop

Use the prompt suffix and TDD helpers to keep a tight build + test loop. This reduces context switching, shortens the time between a change and verified feedback, and lets the AI work more independently with less human-in-the-loop effort:

  • ai-code-auto-test-type: Selects how prompts request tests after code changes (test-after-change, TDD Red+Green, or off).
  • ai-code--tdd-red-green-stage: Generates a single prompt for Red + Green with explicit test follow-up.
  • ai-code-build-or-test-project: Run the project build or test from the menu (C-c a b).
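The feedback-loop behavior above is controlled through ai-code-auto-test-type. A minimal sketch, using the 'test-after-change value shown in the installation snippet:

#+begin_src emacs-lisp
;; Ask the AI to run tests after every code change
(setq ai-code-auto-test-type 'test-after-change)
;; Or disable automatic test requests entirely
;; (setq ai-code-auto-test-type nil)
#+end_src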

*** Desktop Notifications (Experimental)

When working with multiple AI sessions, it can be useful to receive desktop notifications when AI responses are complete. This is especially helpful when you prompt an AI and then switch to other tasks while waiting for the response.

**** Enabling Notifications

  • Notifications are disabled by default.
  • Press C-c a then N to toggle notifications on/off.
  • Alternatively, use M-x ai-code-notifications-toggle.
  • To enable notifications in your config:

#+begin_src emacs-lisp
(setq ai-code-notifications-enabled t)
(setq ai-code-notifications-show-on-response t)
#+end_src

**** How It Works

  • The package monitors terminal activity in AI session buffers.
  • When the terminal has been idle for ~5 seconds (configurable via ai-code-backends-infra-idle-delay), it's considered a completed response.
  • If the AI session buffer is not currently visible/focused, a desktop notification is sent.
  • Notifications are throttled to avoid spam (minimum 2 seconds between notifications).
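The idle threshold described above can be tuned through ai-code-backends-infra-idle-delay. A sketch (the value shown is illustrative; the default is roughly 5 seconds):

#+begin_src emacs-lisp
;; Treat the terminal as "done responding" after 8 seconds of inactivity
(setq ai-code-backends-infra-idle-delay 8)
#+end_src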

**** Platform Support

  • On Linux with D-Bus, native desktop notifications are used.
  • On other platforms, notifications appear in the Emacs minibuffer.

** AI coding CLI backend

*** Backend Configuration

This package acts as a generic interface that requires a backend AI assistant package to function. You can configure it to work with different backends.

  • Press C-c a to open the AI menu, then s to "Select Backend".
  • Pick one of the supported backends and the integration will switch immediately.
  • The selection updates the start/switch/send commands and the CLI used by ai-code-apply-prompt-on-current-file.
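The same selection can be made programmatically with ai-code-set-backend, as shown in the installation snippet. A sketch, using one of the backend symbols listed there:

#+begin_src emacs-lisp
;; Switch to the Gemini CLI backend without opening the menu
(ai-code-set-backend 'gemini)
#+end_src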

Natively supported backends include Claude Code, Gemini CLI, OpenAI Codex, GitHub Copilot CLI, Opencode, Grok CLI, Cursor CLI, Kiro CLI, CodeBuddy Code CLI, and Aider CLI.

External backends are also supported through customization of the ai-code-backends variable; currently these include:

  • Claude Code IDE ([[https://github.com/manzaltu/claude-code-ide.el][claude-code-ide.el]])
  • Claude Code ([[https://github.com/stevemolitor/claude-code.el][claude-code.el]])

**** Grok CLI setup

Install [[https://grokcli.io/][grok-cli]] and ensure the grok executable is on your PATH. Customize grok-cli-program or grok-cli-program-switches if you want to point at a different binary or pass additional flags (for example, selecting a profile). After that, select the backend through ai-code-select-backend or bind a helper in your config.
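A minimal configuration sketch using the two variables named above. The binary path and the =--profile= flag are illustrative assumptions, not documented grok-cli options:

#+begin_src emacs-lisp
;; Point at a specific grok binary and pass extra flags
(setq grok-cli-program "/usr/local/bin/grok")
(setq grok-cli-program-switches '("--profile" "work")) ;; hypothetical flag
#+end_src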

**** CodeBuddy Code CLI setup

Install CodeBuddy Code CLI via npm: =npm install -g @tencent-ai/codebuddy-code=, or via Homebrew: =brew install Tencent-CodeBuddy/tap/codebuddy-code=. Ensure the codebuddy executable is on your PATH. Customize ai-code-codebuddy-cli-program or ai-code-codebuddy-cli-program-switches if you want to point at a different binary or pass additional flags. After that, select the backend through ai-code-select-backend or bind a helper in your config. To resume previous conversations, the resume command automatically passes the =-c= flag.
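A configuration sketch using the two variables named above; the binary path is an illustrative assumption for a Homebrew install:

#+begin_src emacs-lisp
;; Point at a specific codebuddy binary; switches are passed at launch
(setq ai-code-codebuddy-cli-program "/opt/homebrew/bin/codebuddy")
(setq ai-code-codebuddy-cli-program-switches '())
#+end_src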

You can add other backends by customizing the ai-code-backends variable.

**** Add a new AI coding CLI backend

  • [[https://github.com/tninja/ai-code-interface.el/pull/2][This PR]] adds github-copilot-cli. It can serve as an example for adding basic support for other AI coding CLIs.

  • Open an issue with information about the new AI coding CLI backend (e.g., Cursor CLI), providing at least the command-line name. You can also include the version-upgrade command, how to resume sessions, where the configuration files are located, and so on. We can then ask GitHub Copilot to add support based on the issue.

** [[https://github.com/tninja/aider.el/blob/main/appendix.org#be-careful-about-ai-generated-code][Why Agile development with AI?]]

** FAQ

*** Q: Using Opencode as a backend may cause performance issues with eat.el in Doom Emacs. See this [[https://github.com/tninja/ai-code-interface.el/issues/9#issuecomment-3543277108][issue]].

*** Q: Gemini CLI responses are relatively slow. How can I improve this?

  • A: Use the gemini-3-flash model. It is fast, free, and of good quality (able to solve LeetCode hard problems). Set the following in your Emacs config:

#+begin_src elisp
(setq ai-code-gemini-cli-program-switches '("--model" "gemini-3-flash-preview"))
#+end_src

*** Q: Codex CLI uses my API key instead of my ChatGPT Plus subscription, which costs money. How do I fix that?

  • A: Run =codex login= to sign in with the OpenAI account that has your ChatGPT Plus subscription. Afterwards, Codex CLI will use the subscription automatically. To confirm, check with /status inside the Codex CLI buffer.

** AI Assisted Programming related books

The following books introduce how to use AI to assist programming and may be helpful to aider / aider.el users.

** Related Emacs packages

** License

Apache-2.0 License

** Contributing

Contributions, issue reports, and improvement suggestions are welcome! Please open an issue or submit a pull request on the project's GitHub repository.
