
# mcp

[beta] Use Semgrep in LLMs using the MCP framework
This beta Semgrep MCP server is under active development; we would love your feedback, bug reports, and feature requests. For more support, join the #mcp channel in our community Slack.
An MCP server for using Semgrep to scan code for security vulnerabilities. To try it out, run:

```bash
uvx semgrep-mcp -t sse
```
Example Cursor `mcp.json` config:

```json
{
  "mcpServers": {
    "semgrep": {
      "command": "uvx",
      "args": ["semgrep-mcp"]
    }
  }
}
```
The Model Context Protocol (MCP) is like Unix pipes or an API for LLMs: it lets agents and coding tools such as Cursor, VS Code, Windsurf, Claude, or any other tool that supports MCP get specialized help with a task by calling a tool.
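Under the hood, MCP messages are JSON-RPC 2.0. As an illustration only (the exact argument schema is defined by the server), a client invoking this server's `semgrep_scan` tool sends a request shaped like:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "semgrep_scan",
    "arguments": {
      "code_files": [
        { "filename": "app.py", "content": "import os\nos.system(user_input)" }
      ]
    }
  }
}
```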
To optionally connect to Semgrep AppSec Platform:

- Log in or sign up
- Generate a token from the Settings page
- Add it to your environment variables:
  - CLI: `export SEMGREP_APP_TOKEN=<token>`
  - Docker: `docker run -e SEMGREP_APP_TOKEN=<token>`
  - MCP config JSON: `"env": { "SEMGREP_APP_TOKEN": "<token>" }`

Semgrep will automatically use the API token to connect and use the remote configuration. Please reach out to [email protected] if you have any problems.
### Scanning Code

- `semgrep_scan`: Scan code snippets for security vulnerabilities
- `scan_directory`: Perform a Semgrep scan on a directory

### Customization

- `list_rules`: List available Semgrep rules, with optional language filtering
- `create_rule`: Create custom Semgrep rules (see the example rule after this list)

### Results

- `analyze_results`: Analyze scan results, including severity counts and top affected files
- `filter_results`: Filter scan results by severity, rule ID, file path, etc.
- `export_results`: Export scan results in various formats (JSON, SARIF, text)
- `compare_results`: Compare two scan results to identify new and fixed issues
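Custom Semgrep rules are standard Semgrep YAML; the exact parameters `create_rule` accepts are defined by the server. A minimal sketch, with a made-up rule id and pattern for illustration:

```yaml
rules:
  - id: no-os-system # hypothetical rule id, for illustration
    pattern: os.system(...)
    message: Avoid os.system; prefer subprocess with a list of arguments.
    languages: [python]
    severity: WARNING
```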
This package is published to PyPI as `semgrep-mcp`. You can install and run it with pip, pipx, uv, poetry, or any other way to install Python packages. For example:

```bash
pipx install semgrep-mcp
semgrep-mcp --help
```
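Once installed, the console script should accept the same transport flag as the `uvx` invocation shown earlier; this assumes `semgrep-mcp` and `uvx semgrep-mcp` expose the same CLI:

```bash
# Assumption: the installed script takes the same -t flag as `uvx semgrep-mcp`
semgrep-mcp -t sse    # serve over SSE (HTTP)
semgrep-mcp -t stdio  # serve over stdio, for clients that spawn the process
```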
To run from source with uv:

- Install `uv` using their installation instructions
- Ensure you have Python 3.13+ installed
- Clone this repository
- Install Semgrep (other installation methods are also available):

```bash
pip install semgrep
```

To run with Docker instead:

- Install `docker` using their installation instructions
- Clone this repository
- Build the server:

```bash
docker build -t semgrep-mcp .
```
Start the server over SSE:

```bash
uv run mcp run server.py -t sse
```

Or run it as a uv script:

```bash
chmod +x server.py
./server.py
```

Or over stdio:

```bash
uv run mcp run server.py -t stdio
```

See the official Python MCP SDK for more details and configuration options.
```bash
docker run -p 8000:8000 semgrep-mcp
```

The image is also published to ghcr.io/semgrep/mcp:

```bash
docker run -p 8000:8000 ghcr.io/semgrep/mcp:latest
```
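To sanity-check that the container is serving, you can hit the SSE endpoint (the `/sse` path matches the Cursor setup below); a long-lived `text/event-stream` response means the server is up:

```bash
# -N disables output buffering so the event stream prints as it arrives
curl -N http://localhost:8000/sse
```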
A minimal Python client sketch, assuming the official MCP Python SDK (the `mcp` package) and its SSE client helpers:

```python
import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client


async def main():
    # Connect to the MCP server over SSE (sketch; assumes the official `mcp` SDK)
    async with sse_client("http://localhost:8000/sse") as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Scan code for security issues
            results = await session.call_tool(
                "semgrep_scan",
                {
                    "code_files": [
                        {"filename": "hello_world.py", "content": "def hello(): ..."}
                    ]
                },
            )
            print(results)


asyncio.run(main())
```
Click the install buttons at the top of this section for the quickest installation method. Alternatively, you can manually configure the server using one of the methods below.

Add the following JSON block to your User Settings (JSON) file in VS Code. You can do this by pressing `Ctrl + Shift + P` and typing `Preferences: Open User Settings (JSON)`.
```json
{
  "mcp": {
    "servers": {
      "semgrep": {
        "command": "uv",
        "args": ["run", "mcp", "run", "server.py", "-t", "sse"]
      }
    }
  }
}
```
Optionally, you can add it to a file called `.vscode/mcp.json` in your workspace:

```json
{
  "servers": {
    "semgrep": {
      "command": "uv",
      "args": ["run", "mcp", "run", "server.py", "-t", "sse"]
    }
  }
}
```
Add the following JSON block to your User Settings (JSON) file in VS Code:

```json
{
  "mcp": {
    "servers": {
      "semgrep": {
        "command": "docker",
        "args": ["run", "-p", "8000:8000", "ghcr.io/semgrep/mcp:latest"]
      }
    }
  }
}
```
Optionally, you can add it to a file called `.vscode/mcp.json` in your workspace:

```json
{
  "servers": {
    "semgrep": {
      "command": "docker",
      "args": ["run", "-p", "8000:8000", "ghcr.io/semgrep/mcp:latest"]
    }
  }
}
```
- Ensure your Semgrep MCP server is running in SSE mode in the terminal
- Go to Cursor > Settings > Cursor Settings
- Choose the MCP tab
- Click "Add new MCP server"
- Name: `Semgrep`, Type: `sse`, Server URL: `http://127.0.0.1:8000/sse`
- Ensure the MCP server is enabled

You can also set it up by adding this to `~/.cursor/mcp.json`:
```json
{
  "mcpServers": {
    "Semgrep": {
      "url": "http://localhost:8000/sse"
    }
  }
}
```
Your contributions to this project are most welcome. Please see the "good first issue" label for easy tasks.
Start the MCP server in development mode:

```bash
uv run mcp dev server.py
```

By default, the MCP server runs on `http://localhost:8000`, with the inspector server on `http://localhost:6274`.

Note: when opening the inspector server, add a query parameter to the URL to raise its default 10s timeout, e.g. `http://localhost:6274/?timeout=300000`.
This project builds upon and is inspired by several awesome community projects:
- Semgrep - The underlying static analysis engine that powers this project
- Model Context Protocol (MCP) - The protocol that enables AI agent communication
- semgrep-vscode - Official VSCode extension for Semgrep
- semgrep-intellij - IntelliJ plugin for Semgrep
- semgrep-rules - The official collection of Semgrep rules
- mcp-server-semgrep - Original inspiration written by Szowesgad and stefanskiasan