
mcp-redis
The official Redis MCP Server is a natural language interface designed for agentic applications to manage and search data in Redis efficiently.
Stars: 230

The Redis MCP Server is a natural language interface designed for agentic applications to efficiently manage and search data in Redis. It integrates seamlessly with MCP (Model Context Protocol) clients, enabling AI-driven workflows to interact with structured and unstructured data in Redis. The server supports natural language queries, seamless MCP integration, the full range of Redis data types, search and filtering, and a scalable, lightweight design. It provides tools for managing data stored in Redis: strings, hashes, lists, sets, sorted sets, pub/sub, streams, JSON, the query engine, and server management. It can be installed from PyPI or GitHub, with options for testing, development, and Docker deployment. Configuration is via command line arguments or environment variables. Integrations include the OpenAI Agents SDK, Augment, Claude Desktop, and VS Code with GitHub Copilot. Use cases include AI assistants, chatbots, data search & analytics, and event processing. Contributions are welcome under the MIT License.
README:
The Redis MCP Server is a natural language interface designed for agentic applications to efficiently manage and search data in Redis. It integrates seamlessly with MCP (Model Context Protocol) clients, enabling AI-driven workflows to interact with structured and unstructured data in Redis. Using this MCP Server, you can ask questions like:
- "Store the entire conversation in a stream"
- "Cache this item"
- "Store the session with an expiration time"
- "Index and search this vector"
- Overview
- Features
- Tools
- Installation
- Configuration
- Integrations
- Testing
- Example Use Cases
- Contributing
- License
- Badges
- Contact
- Natural Language Queries: Enables AI agents to query and update Redis using natural language.
- Seamless MCP Integration: Works with any MCP client for smooth communication.
- Full Redis Support: Handles hashes, lists, sets, sorted sets, streams, and more.
- Search & Filtering: Supports efficient data retrieval and searching in Redis.
- Scalable & Lightweight: Designed for high-performance data operations.
The Redis MCP Server supports the `stdio` transport. Support for the `streamable-http` transport will be added in the future.
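For illustration, an MCP client can spawn the server over the `stdio` transport and enumerate its tools. Below is a minimal sketch, assuming the MCP Python SDK (`pip install mcp`) and a Redis instance on localhost; the `uvx` invocation mirrors the configuration examples later in this README.

```python
# Sketch: connect to the Redis MCP Server over stdio and list its tools.
# Assumes the `mcp` Python SDK and a Redis instance on localhost:6379.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

params = StdioServerParameters(
    command="uvx",
    args=[
        "--from", "redis-mcp-server@latest",
        "redis-mcp-server",
        "--url", "redis://localhost:6379/0",
    ],
)

async def main() -> None:
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

asyncio.run(main())
```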
This MCP Server provides tools to manage the data stored in Redis (the redis-py sketch after this list illustrates the underlying operations).

- `string` tools to set and get strings with expiration. Useful for storing simple configuration values, session data, or caching responses.
- `hash` tools to store field-value pairs within a single key. The hash can store vector embeddings. Useful for representing objects with multiple attributes, user profiles, or product information where fields can be accessed individually.
- `list` tools with common operations to append and pop items. Useful for queues, message brokers, or maintaining a list of most recent actions.
- `set` tools to add, remove, and list set members. Useful for tracking unique values like user IDs or tags, and for performing set operations like intersection.
- `sorted set` tools to manage data with score-based ordering, e.g. for leaderboards, priority queues, or time-based analytics.
- `pub/sub` functionality to publish messages to channels and subscribe to receive them. Useful for real-time notifications, chat applications, or distributing updates to multiple clients.
- `streams` tools to add, read, and delete from data streams. Useful for event sourcing, activity feeds, or sensor data logging with consumer group support.
- `JSON` tools to store, retrieve, and manipulate JSON documents in Redis. Useful for complex nested data structures, document databases, or configuration management with path-based access.
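For orientation, the sketch below shows roughly which plain redis-py operations these tools correspond to. This is an illustration, not the server's implementation; key names such as `session:42` are made up for the example.

```python
# Illustrative redis-py equivalents of the data-type tools above.
# Assumes a Redis instance on localhost:6379; key names are examples only.
import redis

r = redis.Redis(host="localhost", port=6379, decode_responses=True)

# string: cache a value with a 60-second expiration
r.set("session:42", "cached-response", ex=60)

# hash: an object whose fields can be accessed individually
r.hset("user:1", mapping={"name": "Ada", "plan": "pro"})

# list: a simple queue of recent actions
r.rpush("queue:jobs", "job-1", "job-2")

# set: unique tags
r.sadd("tags:article:7", "redis", "mcp")

# sorted set: a score-ordered leaderboard
r.zadd("leaderboard", {"ada": 42, "bob": 17})

# stream: append an event and read it back
r.xadd("conversation:1", {"role": "user", "text": "Cache this item"})
print(r.xrange("conversation:1"))
```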
Additional tools:

- `query engine` tools to manage vector indexes and perform vector search (see the sketch after this list).
- `server management` tool to retrieve information about the database.
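As background on what the query engine tools manage, here is a sketch of creating a small vector index and running a KNN search directly with redis-py. The index name `docs`, the prefix `doc:`, and the 4-dimensional embeddings are invented for the example; the layout the MCP Server actually creates may differ.

```python
# Sketch: a minimal vector index and KNN search with redis-py.
# Requires Redis with the Query Engine (RediSearch) available.
import numpy as np
import redis
from redis.commands.search.field import TextField, VectorField
from redis.commands.search.indexDefinition import IndexDefinition, IndexType
from redis.commands.search.query import Query

r = redis.Redis(host="localhost", port=6379)

# Create an index over hashes with a 4-dimensional FLOAT32 vector field.
schema = (
    TextField("content"),
    VectorField("embedding", "FLAT",
                {"TYPE": "FLOAT32", "DIM": 4, "DISTANCE_METRIC": "COSINE"}),
)
r.ft("docs").create_index(
    schema,
    definition=IndexDefinition(prefix=["doc:"], index_type=IndexType.HASH),
)

# Store one document with its embedding.
vec = np.array([0.1, 0.2, 0.3, 0.4], dtype=np.float32)
r.hset("doc:1", mapping={"content": "hello redis", "embedding": vec.tobytes()})

# KNN search for the 2 nearest documents to the query vector.
q = (Query("*=>[KNN 2 @embedding $vec AS score]")
     .sort_by("score")
     .return_fields("content", "score")
     .dialect(2))
print(r.ft("docs").search(q, query_params={"vec": vec.tobytes()}).docs)
```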
The Redis MCP Server is available as a PyPI package and as direct installation from the GitHub repository.
As an example, the latest Redis MCP Server version from PyPI can be configured by importing the following JSON configuration in the desired framework or tool. The `uvx` command will download the server on the fly (if not already cached), create a temporary environment, and then run it.
```json
{
  "mcpServers": {
    "RedisMCPServer": {
      "command": "uvx",
      "args": [
        "--from",
        "redis-mcp-server@latest",
        "redis-mcp-server",
        "--url",
        "redis://localhost:6379/0"
      ]
    }
  }
}
```
You will find examples for different platforms throughout this README.
You can install the package as follows:
```sh
pip install redis-mcp-server
```
And start it using `uv` with the package in your environment:

```sh
uv python install 3.13
uv sync
uv run redis-mcp-server --url redis://localhost:6379/0
```
However, starting the MCP Server is most useful when delegated to the framework or tool where this MCP Server is configured.
You can configure the desired Redis MCP Server version with `uvx`, which allows you to run it directly from GitHub (from a branch, or using a tagged release).
It is recommended to use a tagged release; the `main` branch is under active development and may contain breaking changes.
As an example, you can execute the following command to run the 0.2.0 release:

```sh
uvx --from git+https://github.com/redis/mcp-redis.git@0.2.0 redis-mcp-server --url redis://localhost:6379/0
```
Check the release notes for the latest version in the Releases section. Additional examples are provided below.
```sh
# Run with Redis URI
uvx --from git+https://github.com/redis/mcp-redis.git redis-mcp-server --url redis://localhost:6379/0

# Run with Redis URI and SSL
uvx --from git+https://github.com/redis/mcp-redis.git redis-mcp-server --url "rediss://<USERNAME>:<PASSWORD>@<HOST>:<PORT>?ssl_cert_reqs=required&ssl_ca_certs=<PATH_TO_CERT>"

# Run with individual parameters
uvx --from git+https://github.com/redis/mcp-redis.git redis-mcp-server --host localhost --port 6379 --password mypassword

# See all options
uvx --from git+https://github.com/redis/mcp-redis.git redis-mcp-server --help
```
For development or if you prefer to clone the repository:
```sh
# Clone the repository
git clone https://github.com/redis/mcp-redis.git
cd mcp-redis

# Install dependencies using uv
uv venv
source .venv/bin/activate
uv sync

# Run with CLI interface
uv run redis-mcp-server --help

# Or run the main file directly (uses environment variables)
uv run src/main.py
```
Once you have cloned the repository, installed the dependencies, and verified you can run the server, you can configure Claude Desktop or any other MCP client to use this MCP Server by running the main file directly (it uses environment variables). This is usually preferred for development. The following example is for Claude Desktop, but the same applies to any other MCP client.
- Specify your Redis credentials and TLS configuration
- Retrieve the full path of your `uv` command (e.g. `which uv`)
- Edit the `claude_desktop_config.json` configuration file
  - on macOS, at `~/Library/Application\ Support/Claude/`
```json
{
  "mcpServers": {
    "redis": {
      "command": "<full_path_uv_command>",
      "args": [
        "--directory",
        "<your_mcp_server_directory>",
        "run",
        "src/main.py"
      ],
      "env": {
        "REDIS_HOST": "<your_redis_database_hostname>",
        "REDIS_PORT": "<your_redis_database_port>",
        "REDIS_PWD": "<your_redis_database_password>",
        "REDIS_SSL": True|False,
        "REDIS_CA_PATH": "<your_redis_ca_path>",
        "REDIS_CLUSTER_MODE": True|False
      }
    }
  }
}
```
You can troubleshoot problems by tailing the log file.
```sh
tail -f ~/Library/Logs/Claude/mcp-server-redis.log
```
You can use a dockerized deployment of this server. You can either build your own image or use the official Redis MCP Docker image.
If you'd like to build your own image, the Redis MCP Server provides a Dockerfile. Build this server's image with:
```sh
docker build -t mcp-redis .
```
Finally, configure the client to create the container at start-up. An example for Claude Desktop is provided below. Edit the `claude_desktop_config.json` and add:
```json
{
  "mcpServers": {
    "redis": {
      "command": "docker",
      "args": ["run",
        "--rm",
        "--name",
        "redis-mcp-server",
        "-i",
        "-e", "REDIS_HOST=<redis_hostname>",
        "-e", "REDIS_PORT=<redis_port>",
        "-e", "REDIS_USERNAME=<redis_username>",
        "-e", "REDIS_PWD=<redis_password>",
        "mcp-redis"]
    }
  }
}
```
To use the official Redis MCP Docker image, just replace your image name (`mcp-redis` in the example above) with `mcp/redis`.
The Redis MCP Server can be configured in two ways: via command line arguments or via environment variables. The precedence is: command line arguments > environment variables > default values.
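Conceptually, that precedence resolves each setting as in the sketch below (an illustration, not the server's actual code; `resolve` and its arguments are invented names).

```python
import os

def resolve(cli_value, env_name, default):
    """Illustrative precedence: CLI argument > environment variable > default."""
    if cli_value is not None:
        return cli_value
    return os.environ.get(env_name, default)

# e.g. host = resolve(args.host, "REDIS_HOST", "127.0.0.1")
```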
You can configure a Redis ACL to restrict access to the Redis database. For example, to create a read-only user:
```sh
127.0.0.1:6379> ACL SETUSER readonlyuser on >mypassword ~* +@read -@write
```
Configure the user via command line arguments or environment variables.
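You can verify the restricted user from a quick redis-py session; a sketch, reusing `readonlyuser` and `mypassword` from the ACL example above.

```python
# Sketch: confirm the read-only ACL user can read but not write.
import redis

r = redis.Redis(
    host="localhost",
    port=6379,
    username="readonlyuser",
    password="mypassword",
    decode_responses=True,
)

print(r.get("some-key"))  # reads are permitted by +@read
try:
    r.set("some-key", "value")  # writes are rejected by -@write
except redis.exceptions.NoPermissionError as err:
    print("write rejected:", err)
```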
When using the CLI interface, you can configure the server with command line arguments:
```sh
# Basic Redis connection
uvx --from redis-mcp-server@latest redis-mcp-server \
    --host localhost \
    --port 6379 \
    --password mypassword

# Using Redis URI (simpler)
uvx --from redis-mcp-server@latest redis-mcp-server \
    --url redis://user:pass@localhost:6379/0

# SSL connection
uvx --from redis-mcp-server@latest redis-mcp-server \
    --url rediss://user:pass@<host>:6379/0

# See all available options
uvx --from redis-mcp-server@latest redis-mcp-server --help
```
Available CLI Options:

- `--url` - Redis connection URI (`redis://user:pass@host:port/db`)
- `--host` - Redis hostname (default: 127.0.0.1)
- `--port` - Redis port (default: 6379)
- `--db` - Redis database number (default: 0)
- `--username` - Redis username
- `--password` - Redis password
- `--ssl` - Enable SSL connection
- `--ssl-ca-path` - Path to CA certificate file
- `--ssl-keyfile` - Path to SSL key file
- `--ssl-certfile` - Path to SSL certificate file
- `--ssl-cert-reqs` - SSL certificate requirements (default: required)
- `--ssl-ca-certs` - Path to CA certificates file
- `--cluster-mode` - Enable Redis cluster mode
If desired, you can use environment variables. Defaults are provided for all variables.
| Name | Description | Default Value |
|---|---|---|
| `REDIS_HOST` | Redis IP or hostname | `"127.0.0.1"` |
| `REDIS_PORT` | Redis port | `6379` |
| `REDIS_DB` | Database | `0` |
| `REDIS_USERNAME` | Default database username | `"default"` |
| `REDIS_PWD` | Default database password | `""` |
| `REDIS_SSL` | Enables or disables SSL/TLS | `False` |
| `REDIS_CA_PATH` | CA certificate for verifying server | None |
| `REDIS_SSL_KEYFILE` | Client's private key file for client authentication | None |
| `REDIS_SSL_CERTFILE` | Client's certificate file for client authentication | None |
| `REDIS_CERT_REQS` | Whether the client should verify the server's certificate | `"required"` |
| `REDIS_CA_CERTS` | Path to the trusted CA certificates file | None |
| `REDIS_CLUSTER_MODE` | Enable Redis Cluster mode | `False` |
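To make the table concrete, this is roughly how those variables would map onto a redis-py connection; an illustration of what each setting means, not the server's internals.

```python
# Sketch: reading the documented environment variables into a connection.
import os

import redis

r = redis.Redis(
    host=os.environ.get("REDIS_HOST", "127.0.0.1"),
    port=int(os.environ.get("REDIS_PORT", "6379")),
    db=int(os.environ.get("REDIS_DB", "0")),
    username=os.environ.get("REDIS_USERNAME", "default"),
    password=os.environ.get("REDIS_PWD", ""),
    ssl=os.environ.get("REDIS_SSL", "False").lower() == "true",
)
print(r.ping())
```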
There are several ways to set environment variables:
- Using a `.env` file: Place a `.env` file in your project directory with key-value pairs for each environment variable. Tools like `python-dotenv`, `pipenv`, and `uv` can automatically load these variables when running your application. This is a convenient and secure way to manage configuration, as it keeps sensitive data out of your shell history and version control (if `.env` is in `.gitignore`). For example, create a `.env` file from the `.env.example` file provided in the repository:

  ```sh
  cp .env.example .env
  ```

  Then edit the `.env` file to set your Redis configuration.

- Setting variables in the shell: You can export environment variables directly in your shell before running your application. For example:

  ```sh
  export REDIS_HOST=your_redis_host
  export REDIS_PORT=6379
  # Other variables can be set similarly...
  ```

  This method is useful for temporary overrides or quick testing.
Integrating this MCP Server to development frameworks like OpenAI Agents SDK, or with tools like Claude Desktop, VS Code, or Augment is described in the following sections.
Integrate this MCP Server with the OpenAI Agents SDK. Read the SDK documentation to learn more about its integration with MCP.
Install the Python SDK:

```sh
pip install openai-agents
```

Configure the OpenAI token:

```sh
export OPENAI_API_KEY="<openai_token>"
```

And run the application:

```sh
python3.13 redis_assistant.py
```
You can troubleshoot your agent workflows using the OpenAI dashboard.
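For reference, a minimal agent wiring might look like the sketch below. It assumes the `agents` package installed by `openai-agents` and reuses the `uvx` invocation shown earlier; `redis_assistant.py` in the repository is the complete example.

```python
# Sketch: attach the Redis MCP Server to an OpenAI Agents SDK agent.
import asyncio

from agents import Agent, Runner
from agents.mcp import MCPServerStdio

async def main() -> None:
    async with MCPServerStdio(
        params={
            "command": "uvx",
            "args": [
                "--from", "redis-mcp-server@latest",
                "redis-mcp-server",
                "--url", "redis://localhost:6379/0",
            ],
        }
    ) as server:
        agent = Agent(
            name="Redis Assistant",
            instructions="Use the Redis tools to store and retrieve data.",
            mcp_servers=[server],
        )
        result = await Runner.run(agent, "Store the session with an expiration time")
        print(result.final_output)

asyncio.run(main())
```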
The preferred way of configuring the Redis MCP Server in Augment is to use the Easy MCP feature.
You can also configure the Redis MCP Server in Augment manually by importing the server via JSON:
```json
{
  "mcpServers": {
    "Redis MCP Server": {
      "command": "uvx",
      "args": [
        "--from",
        "redis-mcp-server@latest",
        "redis-mcp-server",
        "--url",
        "redis://localhost:6379/0"
      ]
    }
  }
}
```
The simplest way to configure MCP clients is using `uvx`. Add the following JSON to your `claude_desktop_config.json`, remembering to provide the full path to `uvx`.
```json
{
  "mcpServers": {
    "redis-mcp-server": {
      "type": "stdio",
      "command": "/Users/mortensi/.local/bin/uvx",
      "args": [
        "--from", "redis-mcp-server@latest",
        "redis-mcp-server",
        "--url", "redis://localhost:6379/0"
      ]
    }
  }
}
```
If you'd like to test the Redis MCP Server via Smithery, you can configure Claude Desktop automatically:
```sh
npx -y @smithery/cli install @redis/mcp-redis --client claude
```
Follow the prompts and provide the details to configure the server and connect to Redis (e.g. using a Redis Cloud database).
The procedure will create the proper configuration in the `claude_desktop_config.json` configuration file.
To use the Redis MCP Server with VS Code, you must enable agent mode tools. Add the following to your `settings.json`:
```json
{
  "chat.agent.enabled": true
}
```
You can start the desired GitHub version of the Redis MCP Server using `uvx` by adding the following JSON to your `settings.json`:
"mcp": {
"servers": {
"Redis MCP Server": {
"type": "stdio",
"command": "uvx",
"args": [
"--from", "redis-mcp-server@latest",
"redis-mcp-server",
"--url", "redis://localhost:6379/0"
]
},
}
},
Alternatively, you can start the server using `uv` and configure your `mcp.json` or `settings.json`. This is usually desired for development.
In `mcp.json`:

```json
{
  "servers": {
    "redis": {
      "type": "stdio",
      "command": "<full_path_uv_command>",
      "args": [
        "--directory",
        "<your_mcp_server_directory>",
        "run",
        "src/main.py"
      ],
      "env": {
        "REDIS_HOST": "<your_redis_database_hostname>",
        "REDIS_PORT": "<your_redis_database_port>",
        "REDIS_USERNAME": "<your_redis_database_username>",
        "REDIS_PWD": "<your_redis_database_password>"
      }
    }
  }
}
```
In `settings.json`:

```json
{
  "mcp": {
    "servers": {
      "redis": {
        "type": "stdio",
        "command": "<full_path_uv_command>",
        "args": [
          "--directory",
          "<your_mcp_server_directory>",
          "run",
          "src/main.py"
        ],
        "env": {
          "REDIS_HOST": "<your_redis_database_hostname>",
          "REDIS_PORT": "<your_redis_database_port>",
          "REDIS_USERNAME": "<your_redis_database_username>",
          "REDIS_PWD": "<your_redis_database_password>"
        }
      }
    }
  }
}
```
For more information, see the VS Code documentation.
You can use the MCP Inspector for visual debugging of this MCP Server.
```sh
npx @modelcontextprotocol/inspector uv run src/main.py
```
- AI Assistants: Enable LLMs to fetch, store, and process data in Redis.
- Chatbots & Virtual Agents: Retrieve session data, manage queues, and personalize responses.
- Data Search & Analytics: Query Redis for real-time insights and fast lookups.
- Event Processing: Manage event streams with Redis Streams, as sketched below.
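As a concrete view of the event-processing use case, this is the Redis Streams consumer-group pattern in plain redis-py; the stream and group names are illustrative.

```python
# Sketch: event processing with a Redis Streams consumer group.
import redis

r = redis.Redis(host="localhost", port=6379, decode_responses=True)
stream, group = "events", "workers"

# Create the consumer group once (ignore the error if it already exists).
try:
    r.xgroup_create(stream, group, id="0", mkstream=True)
except redis.exceptions.ResponseError:
    pass

r.xadd(stream, {"type": "page_view", "user": "1"})

# A worker reads new events for the group and acknowledges them.
for _stream, entries in r.xreadgroup(group, "worker-1", {stream: ">"}, count=10):
    for msg_id, fields in entries:
        print(msg_id, fields)
        r.xack(stream, group, msg_id)
```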
- Fork the repo
- Create a new branch (`feature-branch`)
- Commit your changes
- Push to your branch and submit a PR!
This project is licensed under the MIT License.
For questions or support, reach out via GitHub Issues.
Alternative AI tools for mcp-redis
Similar Open Source Tools

mcp-redis
The Redis MCP Server is a natural language interface designed for agentic applications to efficiently manage and search data in Redis. It integrates seamlessly with MCP (Model Content Protocol) clients, enabling AI-driven workflows to interact with structured and unstructured data in Redis. The server supports natural language queries, seamless MCP integration, full Redis support for various data types, search and filtering capabilities, scalability, and lightweight design. It provides tools for managing data stored in Redis, such as string, hash, list, set, sorted set, pub/sub, streams, JSON, query engine, and server management. Installation can be done from PyPI or GitHub, with options for testing, development, and Docker deployment. Configuration can be via command line arguments or environment variables. Integrations include OpenAI Agents SDK, Augment, Claude Desktop, and VS Code with GitHub Copilot. Use cases include AI assistants, chatbots, data search & analytics, and event processing. Contributions are welcome under the MIT License.

Gmail-MCP-Server
Gmail AutoAuth MCP Server is a Model Context Protocol (MCP) server designed for Gmail integration in Claude Desktop. It supports auto authentication and enables AI assistants to manage Gmail through natural language interactions. The server provides comprehensive features for sending emails, reading messages, managing labels, searching emails, and batch operations. It offers full support for international characters, email attachments, and Gmail API integration. Users can install and authenticate the server via Smithery or manually with Google Cloud Project credentials. The server supports both Desktop and Web application credentials, with global credential storage for convenience. It also includes Docker support and instructions for cloud server authentication.

swarmzero
SwarmZero SDK is a library that simplifies the creation and execution of AI Agents and Swarms of Agents. It supports various LLM Providers such as OpenAI, Azure OpenAI, Anthropic, MistralAI, Gemini, Nebius, and Ollama. Users can easily install the library using pip or poetry, set up the environment and configuration, create and run Agents, collaborate with Swarms, add tools for complex tasks, and utilize retriever tools for semantic information retrieval. Sample prompts are provided to help users explore the capabilities of the agents and swarms. The SDK also includes detailed examples and documentation for reference.

Lumos
Lumos is a Chrome extension powered by a local LLM co-pilot for browsing the web. It allows users to summarize long threads, news articles, and technical documentation. Users can ask questions about reviews and product pages. The tool requires a local Ollama server for LLM inference and embedding database. Lumos supports multimodal models and file attachments for processing text and image content. It also provides options to customize models, hosts, and content parsers. The extension can be easily accessed through keyboard shortcuts and offers tools for automatic invocation based on prompts.

vim-ai
vim-ai is a plugin that adds Artificial Intelligence (AI) capabilities to Vim and Neovim. It allows users to generate code, edit text, and have interactive conversations with GPT models powered by OpenAI's API. The plugin uses OpenAI's API to generate responses, requiring users to set up an account and obtain an API key. It supports various commands for text generation, editing, and chat interactions, providing a seamless integration of AI features into the Vim text editor environment.

redis-vl-python
The Python Redis Vector Library (RedisVL) is a tailor-made client for AI applications leveraging Redis. It enhances applications with Redis' speed, flexibility, and reliability, incorporating capabilities like vector-based semantic search, full-text search, and geo-spatial search. The library bridges the gap between the emerging AI-native developer ecosystem and the capabilities of Redis by providing a lightweight, elegant, and intuitive interface. It abstracts the features of Redis into a grammar that is more aligned to the needs of today's AI/ML Engineers or Data Scientists.

json-repair
JSON Repair is a toolkit designed to address JSON anomalies that can arise from Large Language Models (LLMs). It offers a comprehensive solution for repairing JSON strings, ensuring accuracy and reliability in your data processing. With its user-friendly interface and extensive capabilities, JSON Repair empowers developers to seamlessly integrate JSON repair into their workflows.

mcp
Semgrep MCP Server is a beta server under active development for using Semgrep to scan code for security vulnerabilities. It provides a Model Context Protocol (MCP) for various coding tools to get specialized help in tasks. Users can connect to Semgrep AppSec Platform, scan code for vulnerabilities, customize Semgrep rules, analyze and filter scan results, and compare results. The tool is published on PyPI as semgrep-mcp and can be installed using pip, pipx, uv, poetry, or other methods. It supports CLI and Docker environments for running the server. Integration with VS Code is also available for quick installation. The project welcomes contributions and is inspired by core technologies like Semgrep and MCP, as well as related community projects and tools.

nvim.ai
nvim.ai is a powerful Neovim plugin that enables AI-assisted coding and chat capabilities within the editor. Users can chat with buffers, insert code with an inline assistant, and utilize various LLM providers for context-aware AI assistance. The plugin supports features like interacting with AI about code and documents, receiving relevant help based on current work, code insertion, code rewriting (Work in Progress), and integration with multiple LLM providers. Users can configure the plugin, add API keys to dotfiles, and integrate with nvim-cmp for command autocompletion. Keymaps are available for chat and inline assist functionalities. The chat dialog allows parsing content with keywords and supports roles like /system, /you, and /assistant. Context-aware assistance can be accessed through inline assist by inserting code blocks anywhere in the file.

scylla
Scylla is an intelligent proxy pool tool designed for humanities, enabling users to extract content from the internet and build their own Large Language Models in the AI era. It features automatic proxy IP crawling and validation, an easy-to-use JSON API, a simple web-based user interface, HTTP forward proxy server, Scrapy and requests integration, and headless browser crawling. Users can start using Scylla with just one command, making it a versatile tool for various web scraping and content extraction tasks.

deep-searcher
DeepSearcher is a tool that combines reasoning LLMs and Vector Databases to perform search, evaluation, and reasoning based on private data. It is suitable for enterprise knowledge management, intelligent Q&A systems, and information retrieval scenarios. The tool maximizes the utilization of enterprise internal data while ensuring data security, supports multiple embedding models, and provides support for multiple LLMs for intelligent Q&A and content generation. It also includes features like private data search, vector database management, and document loading with web crawling capabilities under development.

flux-aio
Flux All-In-One is a lightweight distribution optimized for running the GitOps Toolkit controllers as a single deployable unit on Kubernetes clusters. It is designed for bare clusters, edge clusters, clusters with restricted communication, clusters with egress via proxies, and serverless clusters. The distribution follows semver versioning and provides documentation for specifications, installation, upgrade, OCI sync configuration, Git sync configuration, and multi-tenancy configuration. Users can deploy Flux using Timoni CLI and a Timoni Bundle file, fine-tune installation options, sync from public Git repositories, bootstrap repositories, and uninstall Flux without affecting reconciled workloads.

mcp-server-mysql
The MCP Server for MySQL based on NodeJS is a Model Context Protocol server that provides access to MySQL databases. It enables users to inspect database schemas and execute SQL queries. The server offers tools for executing SQL queries, providing comprehensive database information, security features like SQL injection prevention, performance optimizations, monitoring, and debugging capabilities. Users can configure the server using environment variables and advanced options. The server supports multi-DB mode, schema-specific permissions, and includes troubleshooting guidelines for common issues. Contributions are welcome, and the project roadmap includes enhancing query capabilities, security features, performance optimizations, monitoring, and expanding schema information.

aws-mcp
AWS MCP is a Model Context Protocol (MCP) server that facilitates interactions between AI assistants and AWS environments. It allows for natural language querying and management of AWS resources during conversations. The server supports multiple AWS profiles, SSO authentication, multi-region operations, and secure credential handling. Users can locally execute commands with their AWS credentials, enhancing the conversational experience with AWS resources.

memobase
Memobase is a user profile-based memory system designed to enhance Generative AI applications by enabling them to remember, understand, and evolve with users. It provides structured user profiles, scalable profiling, easy integration with existing LLM stacks, batch processing for speed, and is production-ready. Users can manage users, insert data, get memory profiles, and track user preferences and behaviors. Memobase is ideal for applications that require user analysis, tracking, and personalized interactions.

instructor
Instructor is a popular Python library for managing structured outputs from large language models (LLMs). It offers a user-friendly API for validation, retries, and streaming responses. With support for various LLM providers and multiple languages, Instructor simplifies working with LLM outputs. The library includes features like response models, retry management, validation, streaming support, and flexible backends. It also provides hooks for logging and monitoring LLM interactions, and supports integration with Anthropic, Cohere, Gemini, Litellm, and Google AI models. Instructor facilitates tasks such as extracting user data from natural language, creating fine-tuned models, managing uploaded files, and monitoring usage of OpenAI models.
For similar tasks

Azure-Analytics-and-AI-Engagement
The Azure-Analytics-and-AI-Engagement repository provides packaged Industry Scenario DREAM Demos with ARM templates (Containing a demo web application, Power BI reports, Synapse resources, AML Notebooks etc.) that can be deployed in a customer’s subscription using the CAPE tool within a matter of few hours. Partners can also deploy DREAM Demos in their own subscriptions using DPoC.

sorrentum
Sorrentum is an open-source project that aims to combine open-source development, startups, and brilliant students to build machine learning, AI, and Web3 / DeFi protocols geared towards finance and economics. The project provides opportunities for internships, research assistantships, and development grants, as well as the chance to work on cutting-edge problems, learn about startups, write academic papers, and get internships and full-time positions at companies working on Sorrentum applications.

tidb
TiDB is an open-source distributed SQL database that supports Hybrid Transactional and Analytical Processing (HTAP) workloads. It is MySQL compatible and features horizontal scalability, strong consistency, and high availability.

zep-python
Zep is an open-source platform for building and deploying large language model (LLM) applications. It provides a suite of tools and services that make it easy to integrate LLMs into your applications, including chat history memory, embedding, vector search, and data enrichment. Zep is designed to be scalable, reliable, and easy to use, making it a great choice for developers who want to build LLM-powered applications quickly and easily.

telemetry-airflow
This repository codifies the Airflow cluster that is deployed at workflow.telemetry.mozilla.org (behind SSO) and commonly referred to as "WTMO" or simply "Airflow". Some links relevant to users and developers of WTMO: the `dags` directory in this repository contains some custom DAG definitions; many of the DAGs registered with WTMO don't live in this repository, but are instead generated from ETL task definitions in bigquery-etl; and the Data SRE team maintains a WTMO Developer Guide (behind SSO).

mojo
Mojo is a new programming language that bridges the gap between research and production by combining Python syntax and ecosystem with systems programming and metaprogramming features. Mojo is still young, but it is designed to become a superset of Python over time.

pandas-ai
PandasAI is a Python library that makes it easy to ask questions to your data in natural language. It helps you to explore, clean, and analyze your data using generative AI.

databend
Databend is an open-source cloud data warehouse that serves as a cost-effective alternative to Snowflake. With its focus on fast query execution and data ingestion, it's designed for complex analysis of the world's largest datasets.
For similar jobs

sweep
Sweep is an AI junior developer that turns bugs and feature requests into code changes. It automatically handles developer experience improvements like adding type hints and improving test coverage.

teams-ai
The Teams AI Library is a software development kit (SDK) that helps developers create bots that can interact with Teams and Microsoft 365 applications. It is built on top of the Bot Framework SDK and simplifies the process of developing bots that interact with Teams' artificial intelligence capabilities. The SDK is available for JavaScript/TypeScript, .NET, and Python.

ai-guide
This guide is dedicated to Large Language Models (LLMs) that you can run on your home computer. It assumes your PC is a lower-end, non-gaming setup.

classifai
Supercharge WordPress Content Workflows and Engagement with Artificial Intelligence. Tap into leading cloud-based services like OpenAI, Microsoft Azure AI, Google Gemini and IBM Watson to augment your WordPress-powered websites. Publish content faster while improving SEO performance and increasing audience engagement. ClassifAI integrates Artificial Intelligence and Machine Learning technologies to lighten your workload and eliminate tedious tasks, giving you more time to create original content that matters.

chatbot-ui
Chatbot UI is an open-source AI chat app that allows users to create and deploy their own AI chatbots. It is easy to use and can be customized to fit any need. Chatbot UI is perfect for businesses, developers, and anyone who wants to create a chatbot.

BricksLLM
BricksLLM is a cloud native AI gateway written in Go. Currently, it provides native support for OpenAI, Anthropic, Azure OpenAI and vLLM. BricksLLM aims to provide enterprise level infrastructure that can power any LLM production use cases. Here are some use cases for BricksLLM: set LLM usage limits for users on different pricing tiers; track LLM usage on a per-user and per-organization basis; block or redact requests containing PIIs; improve LLM reliability with failovers, retries and caching; distribute API keys with rate limits and cost limits for internal development/production use cases; and distribute API keys with rate limits and cost limits for students.

uAgents
uAgents is a Python library developed by Fetch.ai that allows for the creation of autonomous AI agents. These agents can perform various tasks on a schedule or take action on various events. uAgents are easy to create and manage, and they are connected to a fast-growing network of other uAgents. They are also secure, with cryptographically secured messages and wallets.

griptape
Griptape is a modular Python framework for building AI-powered applications that securely connect to your enterprise data and APIs. It offers developers the ability to maintain control and flexibility at every step. Griptape's core components include Structures (Agents, Pipelines, and Workflows), Tasks, Tools, Memory (Conversation Memory, Task Memory, and Meta Memory), Drivers (Prompt and Embedding Drivers, Vector Store Drivers, Image Generation Drivers, Image Query Drivers, SQL Drivers, Web Scraper Drivers, and Conversation Memory Drivers), Engines (Query Engines, Extraction Engines, Summary Engines, Image Generation Engines, and Image Query Engines), and additional components (Rulesets, Loaders, Artifacts, Chunkers, and Tokenizers). Griptape enables developers to create AI-powered applications with ease and efficiency.