
opencode.nvim
neovim frontend for opencode - a terminal-based AI coding agent
Stars: 68

Opencode.nvim is a neovim frontend for Opencode, a terminal-based AI coding agent. It provides a chat interface between neovim and the Opencode AI agent, capturing editor context to enhance prompts. The plugin maintains persistent sessions for continuous conversations with the AI assistant, similar to Cursor AI.
README:
neovim frontend for opencode - a terminal-based AI coding agent
This plugin is a fork of the original goose.nvim plugin by azorng. For git-history purposes, the original code was copied rather than forked.
This plugin provides a bridge between neovim and the opencode AI agent, creating a chat interface while capturing editor context (current file, selections) to enhance your prompts. It maintains persistent sessions tied to your workspace, allowing for continuous conversations with the AI assistant similar to what tools like Cursor AI offer.
- ⚠️ Caution
- Requirements
- Installation
- Configuration
- Usage
- Context
- Agents
- User Commands
- Contextual Actions for Snapshots
- Setting up opencode
This plugin is in early development and may have bugs and breaking changes. It is not recommended for production use yet. Please report any issues you encounter on the GitHub repository.
Opencode is also in early development and may have breaking changes. Ensure you are using a compatible version of the Opencode CLI (v0.6.3 or later).
If your upgrade breaks the plugin, please open an issue or downgrade to the last working version.
- Opencode CLI (v0.6.3 or later) installed and available (see Setting up opencode below)
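If you want Neovim to verify this requirement at startup, a minimal sketch (the notification text is an arbitrary example):
-- Warn if the opencode CLI is not on $PATH.
if vim.fn.executable('opencode') == 0 then
  vim.notify('opencode CLI not found in PATH; opencode.nvim will not work', vim.log.levels.WARN)
end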
Install the plugin with your favorite package manager. See the Configuration section below for customization options.
{
"sudo-tee/opencode.nvim",
config = function()
require("opencode").setup({})
end,
dependencies = {
"nvim-lua/plenary.nvim",
{
"MeanderingProgrammer/render-markdown.nvim",
opts = {
anti_conceal = { enabled = false },
file_types = { 'markdown', 'opencode_output' },
},
ft = { 'markdown', 'Avante', 'copilot-chat', 'opencode_output' },
}
},
}
-- Default configuration with all available options
require('opencode').setup({
prefered_picker = nil, -- 'telescope', 'fzf', 'mini.pick', 'snacks'; if nil, the best available picker is used
default_global_keymaps = true, -- If false, disables all default global keymaps
default_mode = 'build', -- 'build', 'plan', or any custom configured agent. @see [OpenCode Agents](https://opencode.ai/docs/modes/)
config_file_path = nil, -- Path to opencode configuration file if different from the default `~/.config/opencode/config.json` or `~/.config/opencode/opencode.json`
keymap = {
global = {
toggle = '<leader>oa', -- Open opencode. Close if opened
open_input = '<leader>oi', -- Opens and focuses on input window on insert mode
open_input_new_session = '<leader>oI', -- Opens and focuses on input window on insert mode. Creates a new session
open_output = '<leader>oo', -- Opens and focuses on output window
toggle_focus = '<leader>ot', -- Toggle focus between opencode and last window
close = '<leader>oq', -- Close UI windows
select_session = '<leader>os', -- Select and load a opencode session
configure_provider = '<leader>op', -- Quick provider and model switch from predefined list
diff_open = '<leader>od', -- Opens a diff tab of a modified file since the last opencode prompt
diff_next = '<leader>o]', -- Navigate to next file diff
diff_prev = '<leader>o[', -- Navigate to previous file diff
diff_close = '<leader>oc', -- Close diff view tab and return to normal editing
diff_revert_all_last_prompt = '<leader>ora', -- Revert all file changes since the last opencode prompt
diff_revert_this_last_prompt = '<leader>ort', -- Revert current file changes since the last opencode prompt
diff_revert_all = '<leader>orA', -- Revert all file changes since the last opencode session
diff_revert_this = '<leader>orT', -- Revert current file changes since the last opencode session
swap_position = '<leader>ox', -- Swap Opencode pane left/right
},
window = {
submit = '<cr>', -- Submit prompt (normal mode)
submit_insert = '<C-s>', -- Submit prompt (insert mode)
close = '<esc>', -- Close UI windows
stop = '<C-c>', -- Stop opencode while it is running
next_message = ']]', -- Navigate to next message in the conversation
prev_message = '[[', -- Navigate to previous message in the conversation
mention_file = '@', -- Pick a file and add to context. See File Mentions section
slash_command = '/', -- Pick a command to run in the input window
toggle_pane = '<tab>', -- Toggle between input and output panes
prev_prompt_history = '<up>', -- Navigate to previous prompt in history
next_prompt_history = '<down>', -- Navigate to next prompt in history
switch_mode = '<M-m>', -- Switch between modes (build/plan)
focus_input = '<C-i>', -- Focus on input window and enter insert mode at the end of the input from the output window
debug_messages = '<leader>oD', -- Open raw message in new buffer for debugging
debug_output = '<leader>oO', -- Open raw output in new buffer for debugging
},
},
ui = {
position = 'right', -- 'right' (default) or 'left'. Position of the UI split
input_position = 'bottom', -- 'bottom' (default) or 'top'. Position of the input window
window_width = 0.40, -- Width as percentage of editor width
input_height = 0.15, -- Input height as percentage of window height
display_model = true, -- Display model name on top winbar
display_context_size = true, -- Display context size in the footer
display_cost = true, -- Display cost in the footer
window_highlight = 'Normal:OpencodeBackground,FloatBorder:OpencodeBorder', -- Highlight group for the opencode window
icons = {
preset = 'emoji', -- 'emoji' | 'text'. Choose UI icon style (default: 'emoji')
overrides = {}, -- Optional per-key overrides, see section below
},
output = {
tools = {
show_output = true, -- Show tools output [diffs, cmd output, etc.] (default: true)
},
},
input = {
text = {
wrap = false, -- Wraps text inside input window
},
},
},
context = {
cursor_data = true, -- send cursor position and current line to opencode
diagnostics = {
info = false, -- Include diagnostics info in the context (defaults to false)
warn = true, -- Include diagnostics warnings in the context
error = true, -- Include diagnostics errors in the context
},
},
debug = {
enabled = false, -- Enable debug messages in the output window
},
})
By default, opencode.nvim uses emojis for icons in the UI. If you prefer a plain, emoji-free interface, you can switch to the text preset or override icons individually.
Minimal config to disable emojis everywhere:
require('opencode').setup({
ui = {
icons = {
preset = 'text', -- switch all icons to text
},
},
})
Override specific icons while keeping the preset:
require('opencode').setup({
ui = {
icons = {
preset = 'emoji',
overrides = {
header_user = '> U',
header_assistant = 'AI',
search = 'FIND',
border = '|',
},
},
},
})
Available icon keys (see implementation at lua/opencode/ui/icons.lua lines 7-29):
- header_user, header_assistant
- run, task, read, edit, write
- plan, search, web, list, tool
- snapshot, restore_point, restore_count, file
- status_on, status_off
- border, bullet
The plugin provides the following actions that can be triggered via keymaps, commands, or the Lua API:
Action | Default keymap | Command | API Function |
---|---|---|---|
Open opencode. Close if opened | <leader>og | :Opencode | require('opencode.api').toggle() |
Open input window (current session) | <leader>oi | :OpencodeOpenInput | require('opencode.api').open_input() |
Open input window (new session) | <leader>oI | :OpencodeOpenInputNewSession | require('opencode.api').open_input_new_session() |
Open output window | <leader>oo | :OpencodeOpenOutput | require('opencode.api').open_output() |
Create and switch to a named session | - | :OpencodeNewSession | require('opencode.api').new_session() |
Toggle focus opencode / last window | <leader>ot | :OpencodeToggleFocus | require('opencode.api').toggle_focus() |
Close UI windows | <leader>oq | :OpencodeClose | require('opencode.api').close() |
Select and load session | <leader>os | :OpencodeSelectSession | require('opencode.api').select_session() |
Configure provider and model | <leader>op | :OpencodeConfigureProvider | require('opencode.api').configure_provider() |
Open diff view of changes | <leader>od | :OpencodeDiff | require('opencode.api').diff_open() |
Navigate to next file diff | <leader>o] | :OpencodeDiffNext | require('opencode.api').diff_next() |
Navigate to previous file diff | <leader>o[ | :OpencodeDiffPrev | require('opencode.api').diff_prev() |
Close diff view tab | <leader>oc | :OpencodeDiffClose | require('opencode.api').diff_close() |
Revert all file changes since last prompt | <leader>ora | :OpencodeRevertAllLastPrompt | require('opencode.api').diff_revert_all_last_prompt() |
Revert current file changes since last prompt | <leader>ort | :OpencodeRevertThisLastPrompt | require('opencode.api').diff_revert_this_last_prompt() |
Revert all file changes since last session | <leader>orA | :OpencodeRevertAllLastSession | require('opencode.api').diff_revert_all() |
Revert current file changes since last session | <leader>orT | :OpencodeRevertThisLastSession | require('opencode.api').diff_revert_this() |
Initialize/update AGENTS.md file | - | :OpencodeInit | require('opencode.api').initialize() |
Run prompt (continue session) | - | :OpencodeRun <prompt> | require('opencode.api').run("prompt") |
Run prompt (new session) | - | :OpencodeRunNewSession <prompt> | require('opencode.api').run_new_session("prompt") |
Open config file | - | :OpencodeConfigFile | require('opencode.api').open_configuration_file() |
Stop opencode while it is running | <C-c> | :OpencodeStop | require('opencode.api').stop() |
Set mode to Build | - | :OpencodeAgentBuild | require('opencode.api').mode_build() |
Set mode to Plan | - | :OpencodeAgentPlan | require('opencode.api').mode_plan() |
Select and switch mode/agent | - | :OpencodeAgentSelect | require('opencode.api').select_agent() |
Display list of available MCP servers | - | :OpencodeMCP | require('opencode.api').list_mcp_servers() |
Run user commands | - | :RunUserCommand | require('opencode.api').run_user_command() |
Pick a file and add to context | @ | - | - |
Navigate to next message | ]] | - | - |
Navigate to previous message | [[ | - | - |
Navigate to previous prompt in history | <up> | - | require('opencode.api').prev_history() |
Navigate to next prompt in history | <down> | - | require('opencode.api').next_history() |
Toggle input/output panes | <tab> | - | - |
Swap Opencode pane left/right | <leader>ox | :OpencodeSwapPosition | require('opencode.api').swap_position() |
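Every API function above can also be bound to keys of your own; a brief sketch (the <leader>a* mappings below are arbitrary examples, not plugin defaults):
-- Bind a few opencode.nvim API functions to custom keys.
local api = require('opencode.api')
vim.keymap.set('n', '<leader>at', api.toggle, { desc = 'Opencode: toggle' })
vim.keymap.set('n', '<leader>an', api.open_input_new_session, { desc = 'Opencode: new session' })
vim.keymap.set('n', '<leader>ap', api.configure_provider, { desc = 'Opencode: provider/model' })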
The following editor context is automatically captured and included in your conversations.
Context Type | Description |
---|---|
Current file | Path to the focused file before entering opencode |
Selected text | Text and lines currently selected in visual mode |
Mentioned files | File info added through mentions |
Diagnostics | Diagnostics from the current file (if any) |
Cursor position | Current cursor position and line content in the file |
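How much of this context is sent can be tuned through the context table shown in the default configuration above; for example, a sketch that sends only error diagnostics and no cursor data:
require('opencode').setup({
  context = {
    cursor_data = false, -- skip cursor position and current line
    diagnostics = { info = false, warn = false, error = true },
  },
})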
You can reference files in your project directly in your conversations with Opencode. This is useful when you want to ask about or provide context about specific files. Type @ in the input window to trigger the file picker.
Supported pickers include fzf-lua, telescope, mini.pick, and snacks.
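If more than one picker is installed, you can pin a specific one through the prefered_picker option from the default configuration (note the key's spelling), e.g.:
-- Force telescope as the file picker for @ mentions.
require('opencode').setup({
  prefered_picker = 'telescope', -- or 'fzf', 'mini.pick', 'snacks'
})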
Opencode provides two built-in agents and supports custom ones:
- Build (default): Full development agent with all tools enabled for making code changes
- Plan: Restricted agent for planning and analysis without making file changes. Useful for code review and understanding code without modifications
Press <M-m> (Alt+M) in the input window to switch between agents during a session.
You can create custom agents through your opencode config file. Each agent can have its own:
- Model configuration
- Custom prompt
- Enabled/disabled tools
- And more
See Opencode Agents Documentation for full configuration options.
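The same switches are available programmatically through the API functions listed in the actions table, e.g.:
-- Plan first, then build.
require('opencode.api').mode_plan()
-- ...review the proposed approach, then switch back:
require('opencode.api').mode_build()
-- Or pick any configured agent from a list:
require('opencode.api').select_agent()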
You can run predefined user commands from the input window by typing /. This will open a command picker where you can select a command to execute. The output of the command will be included in your prompt context.
User commands are configured in:
- .opencode/command/ - Project-specific commands
- command/ - Global commands in the config directory
See User Commands Documentation for more details.
⚠️ Warning: Snapshots are an experimental feature in opencode, and the dev team may occasionally disable them or change their behavior. This repository will be updated to match the latest opencode changes as soon as possible.
Opencode.nvim automatically creates snapshots of your workspace at key moments (such as after running prompts or making changes). These snapshots are like lightweight git commits, allowing you to review, compare, and restore your project state at any time.
Contextual actions for snapshots are available directly in the output window. When a message in the output references a snapshot (look for 📸 Created Snapshot or similar), move your cursor to that line and a small action menu is displayed above it. The following actions are available:
- Diff: View the differences between the current state and the snapshot.
- Revert file: Revert the selected file to the state it was in at the snapshot.
- Revert all files: Revert all files in the workspace to the state they were in at the snapshot.
Tip: Reverting a snapshot will restore all files to the state they were in at that snapshot, so use it with caution!
Opencode.nvim automatically creates restore points before a revert operation. This allows you to undo a revert if needed.
You will see restore points under the Snapshot line, with the following actions:
- Restore file: Restore the selected file to the state it was in before the last revert operation.
- Restore all: Restore all files in the workspace to the state they were in before the revert operation.
The plugin defines several highlight groups that can be customized to match your colorscheme:
- OpencodeBorder: Border color for Opencode windows (default: #616161)
- OpencodeBackground: Background color for Opencode windows (linked to Normal)
- OpencodeSessionDescription: Session description text color (linked to Comment)
- OpencodeMention: Highlight for @file mentions (linked to Special)
- OpencodeToolBorder: Border color for tool execution blocks (default: #3b4261)
- OpencodeMessageRoleAssistant: Assistant message highlight (linked to Added)
- OpencodeMessageRoleUser: User message highlight (linked to Question)
- OpencodeDiffAdd: Highlight for added lines in diffs (default: #2B3328)
- OpencodeDiffDelete: Highlight for deleted lines in diffs (default: #43242B)
- OpencodeAgentPlan: Agent indicator in winbar for Plan mode (default: #61AFEF background)
- OpencodeAgentBuild: Agent indicator in winbar for Build mode (default: #616161 background)
- OpencodeAgentCustom: Agent indicator in winbar for custom modes (default: #3b4261 background)
- OpencodeContestualAction: Highlight for contextual actions in the output window (default: #3b4261 background)
- OpencodeInpuutLegend: Highlight for the input window legend (default: #CCCCCC background)
- OpencodeHint: Highlight for hint messages in the input window and token info in the output window footer (linked to Comment)
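Any of these groups can be overridden like normal highlights; a sketch that restyles the diff and border colors (the hex values are arbitrary examples):
-- Place after your colorscheme loads.
vim.api.nvim_set_hl(0, 'OpencodeDiffAdd', { bg = '#1e3a1e' })
vim.api.nvim_set_hl(0, 'OpencodeDiffDelete', { bg = '#3a1e1e' })
vim.api.nvim_set_hl(0, 'OpencodeBorder', { fg = '#7aa2f7' })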
If you're new to opencode:
- What is Opencode?
  - Opencode is an AI coding agent built for the terminal
  - It offers powerful AI assistance with extensible configurations such as LLMs and MCP servers
- Installation:
  - Visit Install Opencode for installation and configuration instructions
  - Ensure the opencode command is available after installation
- Configuration:
  - Run opencode auth login to set up your LLM provider
  - Configure your preferred LLM provider and model in the ~/.config/opencode/config.json or ~/.config/opencode/opencode.json file
Griptape is a modular Python framework for building AI-powered applications that securely connect to your enterprise data and APIs. It offers developers the ability to maintain control and flexibility at every step. Griptape's core components include Structures (Agents, Pipelines, and Workflows), Tasks, Tools, Memory (Conversation Memory, Task Memory, and Meta Memory), Drivers (Prompt and Embedding Drivers, Vector Store Drivers, Image Generation Drivers, Image Query Drivers, SQL Drivers, Web Scraper Drivers, and Conversation Memory Drivers), Engines (Query Engines, Extraction Engines, Summary Engines, Image Generation Engines, and Image Query Engines), and additional components (Rulesets, Loaders, Artifacts, Chunkers, and Tokenizers). Griptape enables developers to create AI-powered applications with ease and efficiency.