
mcp-server-mysql
A Model Context Protocol server that provides read-only access to MySQL databases by default. This server enables LLMs to inspect database schemas and execute read-only queries, with optional write operations that can be enabled via configuration.
Stars: 101

README:
A Model Context Protocol server that provides access to MySQL databases. This server enables LLMs to inspect database schemas and execute SQL queries.
- Requirements
- Installation
- Components
- Configuration
- Environment Variables
- Testing
- Troubleshooting
- Contributing
- License
- Node.js v18 or higher
- MySQL 5.7 or higher (MySQL 8.0+ recommended)
- MySQL user with appropriate permissions for the operations you need
- For write operations: MySQL user with INSERT, UPDATE, and/or DELETE privileges
There are several ways to install and configure the MCP server:
To manually configure the MCP server for Claude Desktop App, add the following to your claude_desktop_config.json
file (typically located in your user directory):
{
  "mcpServers": {
    "mcp_server_mysql": {
      "command": "/path/to/node",
      "args": [
        "/full/path/to/mcp-server-mysql/dist/index.js"
      ],
      "env": {
        "MYSQL_HOST": "127.0.0.1",
        "MYSQL_PORT": "3306",
        "MYSQL_USER": "root",
        "MYSQL_PASS": "your_password",
        "MYSQL_DB": "your_database",
        "ALLOW_INSERT_OPERATION": "false",
        "ALLOW_UPDATE_OPERATION": "false",
        "ALLOW_DELETE_OPERATION": "false"
      }
    }
  }
}
For Cursor IDE, you can install this MCP server with the following command in your project:
npm install -g @benborla29/mcp-server-mysql
Then configure it in your Cursor settings.
The easiest way to install and configure this MCP server is through Smithery:
npx -y @smithery/cli@latest install @benborla29/mcp-server-mysql --client claude
During configuration, you'll be prompted to enter your MySQL connection details. Smithery will automatically:
- Set up the correct environment variables
- Configure your LLM application to use the MCP server
- Test the connection to your MySQL database
- Provide helpful troubleshooting if needed
- Configure write operation settings (INSERT, UPDATE, DELETE permissions)
The installation will ask for the following connection details:
- MySQL Host (default: 127.0.0.1)
- MySQL Port (default: 3306)
- MySQL Username
- MySQL Password
- MySQL Database name
- SSL Configuration (if needed)
- Write operations permissions:
- Allow INSERT operations (default: false)
- Allow UPDATE operations (default: false)
- Allow DELETE operations (default: false)
For security reasons, write operations are disabled by default. Enable them only if you need Claude to modify your database data.
You can also install this package using MCP Get:
npx @michaellatman/mcp-get@latest install @benborla29/mcp-server-mysql
MCP Get provides a centralized registry of MCP servers and simplifies the installation process.
For manual installation:
# Using npm
npm install -g @benborla29/mcp-server-mysql
# Using pnpm
pnpm add -g @benborla29/mcp-server-mysql
After manual installation, you'll need to configure your LLM application to use the MCP server (see Configuration section below).
If you want to clone and run this MCP server directly from the source code, follow these steps:
- Clone the repository
  git clone https://github.com/benborla/mcp-server-mysql.git
  cd mcp-server-mysql
- Install dependencies
  npm install
  # or: pnpm install
- Build the project
  npm run build
  # or: pnpm run build
- Configure Claude Desktop
  Add the following to your Claude Desktop configuration file (claude_desktop_config.json):
  {
    "mcpServers": {
      "mcp_server_mysql": {
        "command": "/path/to/node",
        "args": [
          "/full/path/to/mcp-server-mysql/dist/index.js"
        ],
        "env": {
          "MYSQL_HOST": "127.0.0.1",
          "MYSQL_PORT": "3306",
          "MYSQL_USER": "root",
          "MYSQL_PASS": "your_password",
          "MYSQL_DB": "your_database",
          "ALLOW_INSERT_OPERATION": "false",
          "ALLOW_UPDATE_OPERATION": "false",
          "ALLOW_DELETE_OPERATION": "false"
        }
      }
    }
  }
  Replace:
  - /path/to/node with the full path to your Node.js binary (find it with which node)
  - /full/path/to/mcp-server-mysql with the full path to where you cloned the repository
  - Set the MySQL credentials to match your environment
- Test the server
  # Run the server directly to test
  node dist/index.js
  If it connects to MySQL successfully, you're ready to use it with Claude Desktop.
- mysql_query
  - Execute SQL queries against the connected database
  - Input: sql (string): The SQL query to execute
  - By default, limited to READ ONLY operations
  - Optional write operations (when enabled via configuration):
    - INSERT: Add new data to tables (requires ALLOW_INSERT_OPERATION=true)
    - UPDATE: Modify existing data (requires ALLOW_UPDATE_OPERATION=true)
    - DELETE: Remove data (requires ALLOW_DELETE_OPERATION=true)
- All operations are executed within a transaction with proper commit/rollback handling
- Supports prepared statements for secure parameter handling
- Configurable query timeouts and result pagination
- Built-in query execution statistics
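To make the tool contract concrete, here is a minimal sketch of invoking mysql_query from a TypeScript client built on the official @modelcontextprotocol/sdk. The binary paths, credentials, and the exact shape of the returned content are placeholders/assumptions for illustration; in normal use your LLM application (Claude Desktop, Cursor, etc.) issues this call for you.

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Spawn the server over stdio, the same way Claude Desktop does (paths are placeholders)
const transport = new StdioClientTransport({
  command: "/path/to/node",
  args: ["/full/path/to/mcp-server-mysql/dist/index.js"],
  env: {
    MYSQL_HOST: "127.0.0.1",
    MYSQL_PORT: "3306",
    MYSQL_USER: "root",
    MYSQL_PASS: "your_password",
    MYSQL_DB: "your_database",
  },
});

const client = new Client({ name: "example-client", version: "0.1.0" }, { capabilities: {} });
await client.connect(transport);

// List the advertised tools, then run a read-only query through mysql_query
console.log(await client.listTools());
const result = await client.callTool({
  name: "mysql_query",
  arguments: { sql: "SELECT table_name FROM information_schema.tables LIMIT 5" },
});
console.log(result);

await client.close();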
The server provides comprehensive database information:
- Table Schemas
  - JSON schema information for each table
  - Column names and data types
  - Index information and constraints
  - Foreign key relationships
  - Table statistics and metrics
  - Automatically discovered from database metadata
- SQL injection prevention through prepared statements
- Query whitelisting/blacklisting capabilities
- Rate limiting for query execution
- Query complexity analysis
- Configurable connection encryption
- Read-only transaction enforcement
- Optimized connection pooling
- Query result caching
- Large result set streaming
- Query execution plan analysis
- Configurable query timeouts
- Comprehensive query logging
- Performance metrics collection
- Error tracking and reporting
- Health check endpoints
- Query execution statistics
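The combination of prepared statements and read-only transactions is the core of the safety story above. The following is a minimal sketch of that pattern using the mysql2 driver; it is illustrative only, not the server's actual implementation, and the defaults shown are placeholders mirroring the environment variables documented below.

import mysql from "mysql2/promise";

// Connection pool sized via MYSQL_POOL_SIZE (placeholder defaults)
const pool = mysql.createPool({
  host: process.env.MYSQL_HOST ?? "127.0.0.1",
  port: Number(process.env.MYSQL_PORT ?? 3306),
  user: process.env.MYSQL_USER ?? "root",
  password: process.env.MYSQL_PASS ?? "",
  database: process.env.MYSQL_DB,
  connectionLimit: Number(process.env.MYSQL_POOL_SIZE ?? 10),
});

// Run a query as a prepared statement inside a read-only transaction
export async function readOnlyQuery(sql: string, params: unknown[] = []) {
  const conn = await pool.getConnection();
  try {
    // The READ ONLY characteristic applies to the next transaction started on this connection
    await conn.query("SET TRANSACTION READ ONLY");
    await conn.beginTransaction();
    // execute() uses a prepared statement, keeping parameters out of the SQL text
    const [rows] = await conn.execute(sql, params);
    await conn.commit();
    return rows;
  } catch (err) {
    await conn.rollback();
    throw err;
  } finally {
    conn.release();
  }
}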
If you installed using Smithery, your configuration is already set up. You can view or modify it with:
smithery configure @benborla29/mcp-server-mysql
When reconfiguring, you can update any of the MySQL connection details as well as the write operation settings:
- Basic connection settings:
  - MySQL Host, Port, User, Password, Database
  - SSL/TLS configuration (if your database requires secure connections)
- Write operation permissions:
  - Allow INSERT Operations: Set to true if you want to allow adding new data
  - Allow UPDATE Operations: Set to true if you want to allow updating existing data
  - Allow DELETE Operations: Set to true if you want to allow deleting data
For security reasons, all write operations are disabled by default. Only enable these settings if you specifically need Claude to modify your database data.
For more control over the MCP server's behavior, you can use these advanced configuration options:
{
  "mcpServers": {
    "mcp_server_mysql": {
      "command": "/path/to/npx/binary/npx",
      "args": [
        "-y",
        "@benborla29/mcp-server-mysql"
      ],
      "env": {
        // Basic connection settings
        "MYSQL_HOST": "127.0.0.1",
        "MYSQL_PORT": "3306",
        "MYSQL_USER": "root",
        "MYSQL_PASS": "",
        "MYSQL_DB": "db_name",
        "PATH": "/path/to/node/bin:/usr/bin:/bin",
        // Performance settings
        "MYSQL_POOL_SIZE": "10",
        "MYSQL_QUERY_TIMEOUT": "30000",
        "MYSQL_CACHE_TTL": "60000",
        // Security settings
        "MYSQL_RATE_LIMIT": "100",
        "MYSQL_MAX_QUERY_COMPLEXITY": "1000",
        "MYSQL_SSL": "true",
        // Monitoring settings
        "MYSQL_ENABLE_LOGGING": "true",
        "MYSQL_LOG_LEVEL": "info",
        "MYSQL_METRICS_ENABLED": "true",
        // Write operation flags
        "ALLOW_INSERT_OPERATION": "false",
        "ALLOW_UPDATE_OPERATION": "false",
        "ALLOW_DELETE_OPERATION": "false"
      }
    }
  }
}
- MYSQL_HOST: MySQL server host (default: "127.0.0.1")
- MYSQL_PORT: MySQL server port (default: "3306")
- MYSQL_USER: MySQL username (default: "root")
- MYSQL_PASS: MySQL password
- MYSQL_DB: Target database name
- MYSQL_POOL_SIZE: Connection pool size (default: "10")
- MYSQL_QUERY_TIMEOUT: Query timeout in milliseconds (default: "30000")
- MYSQL_CACHE_TTL: Cache time-to-live in milliseconds (default: "60000")
- MYSQL_RATE_LIMIT: Maximum queries per minute (default: "100")
- MYSQL_MAX_QUERY_COMPLEXITY: Maximum query complexity score (default: "1000")
- MYSQL_SSL: Enable SSL/TLS encryption (default: "false")
- ALLOW_INSERT_OPERATION: Enable INSERT operations (default: "false")
- ALLOW_UPDATE_OPERATION: Enable UPDATE operations (default: "false")
- ALLOW_DELETE_OPERATION: Enable DELETE operations (default: "false")
- MYSQL_ENABLE_LOGGING: Enable query logging (default: "false")
- MYSQL_LOG_LEVEL: Logging level (default: "info")
- MYSQL_METRICS_ENABLED: Enable performance metrics (default: "false")
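As a reference for how these variables fit together, here is a hypothetical configuration reader that encodes the documented defaults. The interface and helper names are invented for illustration and are not taken from the project's source.

// Hypothetical sketch: read the documented environment variables with their defaults
interface ServerConfig {
  host: string; port: number; user: string; password: string; database: string | undefined;
  poolSize: number; queryTimeoutMs: number; cacheTtlMs: number;
  rateLimitPerMinute: number; maxQueryComplexity: number; ssl: boolean;
  allowInsert: boolean; allowUpdate: boolean; allowDelete: boolean;
}

const bool = (v: string | undefined, def = false) => (v ?? String(def)) === "true";
const int = (v: string | undefined, def: number) => Number(v ?? def);

export const config: ServerConfig = {
  host: process.env.MYSQL_HOST ?? "127.0.0.1",
  port: int(process.env.MYSQL_PORT, 3306),
  user: process.env.MYSQL_USER ?? "root",
  password: process.env.MYSQL_PASS ?? "",
  database: process.env.MYSQL_DB,
  poolSize: int(process.env.MYSQL_POOL_SIZE, 10),
  queryTimeoutMs: int(process.env.MYSQL_QUERY_TIMEOUT, 30000),
  cacheTtlMs: int(process.env.MYSQL_CACHE_TTL, 60000),
  rateLimitPerMinute: int(process.env.MYSQL_RATE_LIMIT, 100),
  maxQueryComplexity: int(process.env.MYSQL_MAX_QUERY_COMPLEXITY, 1000),
  ssl: bool(process.env.MYSQL_SSL),
  allowInsert: bool(process.env.ALLOW_INSERT_OPERATION),
  allowUpdate: bool(process.env.ALLOW_UPDATE_OPERATION),
  allowDelete: bool(process.env.ALLOW_DELETE_OPERATION),
};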
Before running tests, you need to set up the test database and seed it with test data:
- Create Test Database and User
  -- Connect as root and create test database
  CREATE DATABASE IF NOT EXISTS mcp_test;
  -- Create test user with appropriate permissions
  CREATE USER IF NOT EXISTS 'mcp_test'@'localhost' IDENTIFIED BY 'mcp_test_password';
  GRANT ALL PRIVILEGES ON mcp_test.* TO 'mcp_test'@'localhost';
  FLUSH PRIVILEGES;
- Run Database Setup Script
  # Run the database setup script
  pnpm run setup:test:db
  This will create the necessary tables and seed data. The script is located in scripts/setup-test-db.ts
- Configure Test Environment
  Create a .env.test file in the project root (if it does not exist):
  MYSQL_HOST=127.0.0.1
  MYSQL_PORT=3306
  MYSQL_USER=mcp_test
  MYSQL_PASS=mcp_test_password
  MYSQL_DB=mcp_test
- Update package.json Scripts
  Add these scripts to your package.json:
  {
    "scripts": {
      "setup:test:db": "ts-node scripts/setup-test-db.ts",
      "pretest": "pnpm run setup:test:db",
      "test": "vitest run",
      "test:watch": "vitest",
      "test:coverage": "vitest run --coverage"
    }
  }
The project includes a comprehensive test suite to ensure functionality and reliability:
# First-time setup
pnpm run setup:test:db
# Run all tests
pnpm test
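As a sketch of what an additional test against the seeded mcp_test database might look like (the real suite lives in the repository and its table names may differ, so this example only checks connectivity with the .env.test credentials), a Vitest case using mysql2 could be:

import { describe, expect, it } from "vitest";
import mysql from "mysql2/promise";

describe("mcp_test database", () => {
  it("accepts connections with the .env.test credentials", async () => {
    // Placeholder defaults mirror the .env.test values described above
    const conn = await mysql.createConnection({
      host: process.env.MYSQL_HOST ?? "127.0.0.1",
      port: Number(process.env.MYSQL_PORT ?? 3306),
      user: process.env.MYSQL_USER ?? "mcp_test",
      password: process.env.MYSQL_PASS ?? "mcp_test_password",
      database: process.env.MYSQL_DB ?? "mcp_test",
    });
    const [rows] = await conn.query("SELECT 1 AS ok");
    expect(rows).toEqual([{ ok: 1 }]);
    await conn.end();
  });
});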
- Connection Issues
  - Verify MySQL server is running and accessible
  - Check credentials and permissions
  - Ensure SSL/TLS configuration is correct if enabled
  - Try connecting with a MySQL client to confirm access (see the connectivity probe sketch after this troubleshooting list)
- Performance Issues
  - Adjust connection pool size
  - Configure query timeout values
  - Enable query caching if needed
  - Check query complexity settings
  - Monitor server resource usage
- Security Restrictions
  - Review rate limiting configuration
  - Check query whitelist/blacklist settings
  - Verify SSL/TLS settings
  - Ensure the user has appropriate MySQL permissions
- Path Resolution
  If you encounter the error "Could not connect to MCP server mcp-server-mysql", explicitly set the path of all required binaries:
  {
    "env": {
      "PATH": "/path/to/node/bin:/usr/bin:/bin"
    }
  }
- Claude Desktop Specific Issues
  - If you see "Server disconnected" logs in Claude Desktop, check the logs at ~/Library/Logs/Claude/mcp-server-mcp_server_mysql.log
  - Ensure you're using the absolute path to both the Node binary and the server script
  - Check if your .env file is being properly loaded; use explicit environment variables in the configuration
  - Try running the server directly from the command line to see if there are connection issues
  - If you need write operations (INSERT, UPDATE, DELETE), set the appropriate flags to "true" in your configuration:
    "env": {
      "ALLOW_INSERT_OPERATION": "true",  // Enable INSERT operations
      "ALLOW_UPDATE_OPERATION": "true",  // Enable UPDATE operations
      "ALLOW_DELETE_OPERATION": "true"   // Enable DELETE operations
    }
  - Ensure your MySQL user has the appropriate permissions for the operations you're enabling
  - For direct execution configuration, use:
    {
      "mcpServers": {
        "mcp_server_mysql": {
          "command": "/full/path/to/node",
          "args": [
            "/full/path/to/mcp-server-mysql/dist/index.js"
          ],
          "env": {
            "MYSQL_HOST": "127.0.0.1",
            "MYSQL_PORT": "3306",
            "MYSQL_USER": "root",
            "MYSQL_PASS": "your_password",
            "MYSQL_DB": "your_database"
          }
        }
      }
    }
- Authentication Issues
  - For MySQL 8.0+, ensure the server supports the caching_sha2_password authentication plugin
  - Check if your MySQL user is configured with the correct authentication method
  - Try creating a user with legacy authentication if needed:
    CREATE USER 'user'@'localhost' IDENTIFIED WITH mysql_native_password BY 'password';
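As referenced under Connection Issues above, a tiny standalone probe that reuses the exact environment variables from your MCP configuration is often the fastest way to isolate a problem. This is a hedged sketch using the mysql2 driver; the defaults are placeholders and the script is not part of the project.

import mysql from "mysql2/promise";

// Minimal connectivity probe: run it with the same env vars you give the MCP server
try {
  const conn = await mysql.createConnection({
    host: process.env.MYSQL_HOST ?? "127.0.0.1",
    port: Number(process.env.MYSQL_PORT ?? 3306),
    user: process.env.MYSQL_USER ?? "root",
    password: process.env.MYSQL_PASS ?? "",
    database: process.env.MYSQL_DB,
    ssl: process.env.MYSQL_SSL === "true" ? {} : undefined,
  });
  await conn.ping();
  console.log("Connection OK");
  await conn.end();
} catch (err: any) {
  // Error codes such as ER_ACCESS_DENIED_ERROR, ECONNREFUSED, or ER_BAD_DB_ERROR point at the root cause
  console.error("Connection failed:", err.code ?? err.message);
  process.exit(1);
}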
Contributions are welcome! Please feel free to submit a Pull Request to https://github.com/benborla/mcp-server-mysql
- Clone the repository
- Install dependencies: pnpm install
- Build the project: pnpm run build
- Run tests: pnpm test
We're actively working on enhancing this MCP server. Check our CHANGELOG.md for details on planned features, including:
- Enhanced query capabilities with prepared statements
- Advanced security features
- Performance optimizations
- Comprehensive monitoring
- Expanded schema information
If you'd like to contribute to any of these areas, please check the issues on GitHub or open a new one to discuss your ideas.
- Fork the repository
- Create a feature branch: git checkout -b feature/your-feature-name
- Commit your changes: git commit -am 'Add some feature'
- Push to the branch: git push origin feature/your-feature-name
- Submit a pull request
This MCP server is licensed under the MIT License. See the LICENSE file for details.
Similar Open Source Tools

pipecat-flows
Pipecat Flows is a framework designed for building structured conversations in AI applications. It allows users to create both predefined conversation paths and dynamically generated flows, handling state management and LLM interactions. The framework includes a Python module for building conversation flows and a visual editor for designing and exporting flow configurations. Pipecat Flows is suitable for scenarios such as customer service scripts, intake forms, personalized experiences, and complex decision trees.

ZerePy
ZerePy is an open-source Python framework for deploying agents on X using OpenAI or Anthropic LLMs. It offers CLI interface, Twitter integration, and modular connection system. Users can fine-tune models for creative outputs and create agents with specific tasks. The tool requires Python 3.10+, Poetry 1.5+, and API keys for LLM, OpenAI, Anthropic, and X API.

firecrawl-mcp-server
Firecrawl MCP Server is a Model Context Protocol (MCP) server implementation that integrates with Firecrawl for web scraping capabilities. It supports features like scrape, crawl, search, extract, and batch scrape. It provides web scraping with JS rendering, URL discovery, web search with content extraction, automatic retries with exponential backoff, credit usage monitoring, comprehensive logging system, support for cloud and self-hosted FireCrawl instances, mobile/desktop viewport support, and smart content filtering with tag inclusion/exclusion. The server includes configurable parameters for retry behavior and credit usage monitoring, rate limiting and batch processing capabilities, and tools for scraping, batch scraping, checking batch status, searching, crawling, and extracting structured information from web pages.

aider-desk
AiderDesk is a desktop application that enhances coding workflow by leveraging AI capabilities. It offers an intuitive GUI, project management, IDE integration, MCP support, settings management, cost tracking, structured messages, visual file management, model switching, code diff viewer, one-click reverts, and easy sharing. Users can install it by downloading the latest release and running the executable. AiderDesk also supports Python version detection and auto update disabling. It includes features like multiple project management, context file management, model switching, chat mode selection, question answering, cost tracking, MCP server integration, and MCP support for external tools and context. Development setup involves cloning the repository, installing dependencies, running in development mode, and building executables for different platforms. Contributions from the community are welcome following specific guidelines.

aiavatarkit
AIAvatarKit is a tool for building AI-based conversational avatars quickly. It supports various platforms like VRChat and cluster, along with real-world devices. The tool is extensible, allowing unlimited capabilities based on user needs. It requires VOICEVOX API, Google or Azure Speech Services API keys, and Python 3.10. Users can start conversations out of the box and enjoy seamless interactions with the avatars.

scylla
Scylla is an intelligent proxy pool tool designed for humanities, enabling users to extract content from the internet and build their own Large Language Models in the AI era. It features automatic proxy IP crawling and validation, an easy-to-use JSON API, a simple web-based user interface, HTTP forward proxy server, Scrapy and requests integration, and headless browser crawling. Users can start using Scylla with just one command, making it a versatile tool for various web scraping and content extraction tasks.

ruby-openai
Use the OpenAI API with Ruby! 🤖🩵 Stream text with GPT-4, transcribe and translate audio with Whisper, or create images with DALL·E. The gem covers chat (including streaming, vision, and JSON mode), function calling, edits, embeddings, batches, files, fine-tunes, assistants, threads, messages and runs, image generation with DALL·E 2 and 3, image edits and variations, moderations, and speech, and can also be used with Azure and Ollama endpoints.

json-repair
JSON Repair is a toolkit designed to address JSON anomalies that can arise from Large Language Models (LLMs). It offers a comprehensive solution for repairing JSON strings, ensuring accuracy and reliability in your data processing. With its user-friendly interface and extensive capabilities, JSON Repair empowers developers to seamlessly integrate JSON repair into their workflows.

hf-waitress
HF-Waitress is a powerful server application for deploying and interacting with HuggingFace Transformer models. It simplifies running open-source Large Language Models (LLMs) locally on-device, providing on-the-fly quantization via BitsAndBytes, HQQ, and Quanto. It requires no manual model downloads, offers concurrency, streaming responses, and supports various hardware and platforms. The server uses a `config.json` file for easy configuration management and provides detailed error handling and logging.

deep-searcher
DeepSearcher is a tool that combines reasoning LLMs and Vector Databases to perform search, evaluation, and reasoning based on private data. It is suitable for enterprise knowledge management, intelligent Q&A systems, and information retrieval scenarios. The tool maximizes the utilization of enterprise internal data while ensuring data security, supports multiple embedding models, and provides support for multiple LLMs for intelligent Q&A and content generation. It also includes features like private data search, vector database management, and document loading with web crawling capabilities under development.

aws-mcp
AWS MCP is a Model Context Protocol (MCP) server that facilitates interactions between AI assistants and AWS environments. It allows for natural language querying and management of AWS resources during conversations. The server supports multiple AWS profiles, SSO authentication, multi-region operations, and secure credential handling. Users can locally execute commands with their AWS credentials, enhancing the conversational experience with AWS resources.

vim-ai
vim-ai is a plugin that adds Artificial Intelligence (AI) capabilities to Vim and Neovim. It allows users to generate code, edit text, and have interactive conversations with GPT models powered by OpenAI's API. The plugin uses OpenAI's API to generate responses, requiring users to set up an account and obtain an API key. It supports various commands for text generation, editing, and chat interactions, providing a seamless integration of AI features into the Vim text editor environment.

supergateway
Supergateway is a tool that allows running MCP stdio-based servers over SSE (Server-Sent Events) with one command. It is useful for remote access, debugging, or connecting to SSE-based clients when your MCP server only speaks stdio. The tool supports running in SSE to Stdio mode as well, where it connects to a remote SSE server and exposes a local stdio interface for downstream clients. Supergateway can be used with ngrok to share local MCP servers with remote clients and can also be run in a Docker containerized deployment. It is designed with modularity in mind, ensuring compatibility and ease of use for AI tools exchanging data.