sonarqube-mcp-server
SonarQube MCP Server
Stars: 377
The SonarQube MCP Server is a Model Context Protocol (MCP) server that enables seamless integration with SonarQube Server or Cloud for code quality and security. It supports the analysis of code snippets directly within the agent context. The server provides various tools for analyzing code, managing issues, accessing metrics, and interacting with SonarQube projects. It also supports advanced features like dependency risk analysis, enterprise portfolio management, and system health checks. The server can be configured for different transport modes, proxy settings, and custom certificates. Telemetry data collection can be disabled if needed.
README:
The SonarQube MCP Server is a Model Context Protocol (MCP) server that enables seamless integration with SonarQube Server or Cloud for code quality and security. It also supports the analysis of code snippets directly within the agent context.
The simplest method is to rely on our container image hosted at mcp/sonarqube. Read below if you want to build it locally.
Note: While the examples below use docker, any OCI-compatible container runtime works (e.g., Podman, nerdctl). Simply replace docker with your preferred tool.
🔒 Important: Your SonarQube token is a sensitive credential. Follow these security practices:
When using CLI commands:
- Avoid hardcoding tokens in command-line arguments - they get saved in shell history
- Use environment variables - set tokens in environment variables before running commands
When using configuration files:
- Never commit tokens to version control
- Use environment variable substitution in config files when possible
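The CLI practice above can be sketched as follows; the token value is a placeholder for illustration only, and the docker command is assembled and printed rather than executed:

```shell
# Hypothetical token value, for illustration only -- never hardcode a real one.
SONAR_TOKEN="squ_example_placeholder"

# Export under the name the server expects; docker's `-e NAME` (with no value)
# forwards the variable from the environment, so the secret never appears
# on the command line or in shell history.
export SONARQUBE_TOKEN="$SONAR_TOKEN"
CMD="docker run -i --rm -e SONARQUBE_TOKEN -e SONARQUBE_ORG mcp/sonarqube"
echo "$CMD"
```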
Claude Code
- To connect with SonarQube Cloud:
claude mcp add sonarqube \
--env SONARQUBE_TOKEN=$SONAR_TOKEN \
--env SONARQUBE_ORG=$SONAR_ORG \
-- docker run -i --rm -e SONARQUBE_TOKEN -e SONARQUBE_ORG mcp/sonarqube

For SonarQube Cloud US, add --env SONARQUBE_URL=https://sonarqube.us to the command.
- To connect with SonarQube Server:
claude mcp add sonarqube \
--env SONARQUBE_TOKEN=$SONAR_USER_TOKEN \
--env SONARQUBE_URL=$SONAR_URL \
-- docker run -i --rm -e SONARQUBE_TOKEN -e SONARQUBE_URL mcp/sonarqube

Codex CLI
Manually edit the configuration file at ~/.codex/config.toml and add the following configuration:
- To connect with SonarQube Cloud:
[mcp_servers.sonarqube]
command = "docker"
args = ["run", "--rm", "-i", "-e", "SONARQUBE_TOKEN", "-e", "SONARQUBE_ORG", "mcp/sonarqube"]
env = { "SONARQUBE_TOKEN" = "<YOUR_USER_TOKEN>", "SONARQUBE_ORG" = "<YOUR_ORG>" }

For SonarQube Cloud US, add "SONARQUBE_URL" = "https://sonarqube.us" to the env section and "-e", "SONARQUBE_URL" to the args array.
- To connect with SonarQube Server:
[mcp_servers.sonarqube]
command = "docker"
args = ["run", "--rm", "-i", "-e", "SONARQUBE_TOKEN", "-e", "SONARQUBE_URL", "mcp/sonarqube"]
env = { "SONARQUBE_TOKEN" = "<YOUR_TOKEN>", "SONARQUBE_URL" = "<YOUR_SERVER_URL>" }

Cursor
- To connect with SonarQube Cloud:
For SonarQube Cloud US, manually add "SONARQUBE_URL": "https://sonarqube.us" to the env section in your MCP configuration after installation.
- To connect with SonarQube Server:
Gemini CLI
You can install our MCP server extension by using the following command:
gemini extensions install https://github.com/SonarSource/sonarqube-mcp-server

You will need to set the required environment variables before starting Gemini:
Environment Variables Required:
- For SonarQube Cloud:
  - SONARQUBE_TOKEN - Your SonarQube Cloud token
  - SONARQUBE_ORG - Your organization key
  - SONARQUBE_URL - (Optional) Set to https://sonarqube.us for SonarQube Cloud US
- For SonarQube Server:
  - SONARQUBE_TOKEN - Your SonarQube Server USER token
  - SONARQUBE_URL - Your SonarQube Server URL
Once installed, the extension is located at <home>/.gemini/extensions/sonarqube-mcp-server/gemini-extension.json.
GitHub Copilot CLI
After starting Copilot CLI, run the following command to add the SonarQube MCP server:
/mcp add

You will be prompted for details about the MCP server; use Tab to navigate between fields.
- To connect with SonarQube Cloud:
Server Name: sonarqube
Server Type: Local (Press 1)
Command: docker
Arguments: run, --rm, -i, -e, SONARQUBE_TOKEN, -e, SONARQUBE_ORG, mcp/sonarqube
Environment Variables: SONARQUBE_TOKEN=<YOUR_TOKEN>,SONARQUBE_ORG=<YOUR_ORG>
Tools: *
For SonarQube Cloud US, add -e, SONARQUBE_URL to Arguments and SONARQUBE_URL=https://sonarqube.us to Environment Variables.
- To connect with SonarQube Server:
Server Name: sonarqube
Server Type: Local (Press 1)
Command: docker
Arguments: run, --rm, -i, -e, SONARQUBE_TOKEN, -e, SONARQUBE_URL, mcp/sonarqube
Environment Variables: SONARQUBE_TOKEN=<YOUR_USER_TOKEN>,SONARQUBE_URL=<YOUR_SERVER_URL>
Tools: *
The configuration file is located at ~/.copilot/mcp-config.json.
GitHub Copilot coding agent
GitHub Copilot coding agent can leverage the SonarQube MCP server directly in your CI/CD.
To add the secrets to your Copilot environment, follow the Copilot documentation. Only secrets with names prefixed with COPILOT_MCP_ will be available to your MCP configuration.
In your GitHub repository, navigate under Settings -> Copilot -> Coding agent, and add the following configuration in the MCP configuration section:
- To connect with SonarQube Cloud:
{
"mcpServers": {
"sonarqube": {
"type": "local",
"command": "docker",
"args": [
"run",
"--rm",
"-i",
"-e",
"SONARQUBE_TOKEN=$SONAR_TOKEN",
"-e",
"SONARQUBE_ORG=$SONAR_ORG",
"mcp/sonarqube"
],
"env": {
"SONAR_TOKEN": "COPILOT_MCP_SONARQUBE_TOKEN",
"SONAR_ORG": "COPILOT_MCP_SONARQUBE_ORG"
},
"tools": ["*"]
}
}
}
For SonarQube Cloud US, add "-e", "SONARQUBE_URL=$SONAR_URL" to the args array and "SONAR_URL": "COPILOT_MCP_SONARQUBE_URL" to the env section, then set the secret COPILOT_MCP_SONARQUBE_URL=https://sonarqube.us.
- To connect with SonarQube Server:
{
"mcpServers": {
"sonarqube": {
"type": "local",
"command": "docker",
"args": [
"run",
"--rm",
"-i",
"-e",
"SONARQUBE_TOKEN=$SONAR_TOKEN",
"-e",
"SONARQUBE_URL=$SONAR_URL",
"mcp/sonarqube"
],
"env": {
"SONAR_TOKEN": "COPILOT_MCP_SONARQUBE_USER_TOKEN",
"SONAR_URL": "COPILOT_MCP_SONARQUBE_URL"
},
"tools": ["*"]
}
}
}
Kiro
Create a .kiro/settings/mcp.json file in your workspace directory (or edit if it already exists), add the following configuration:
- To connect with SonarQube Cloud:
{
"mcpServers": {
"sonarqube": {
"command": "docker",
"args": [
"run",
"-i",
"--rm",
"-e",
"SONARQUBE_TOKEN",
"-e",
"SONARQUBE_ORG",
"mcp/sonarqube"
],
"env": {
"SONARQUBE_TOKEN": "<YOUR_TOKEN>",
"SONARQUBE_ORG": "<YOUR_ORG>"
},
"disabled": false,
"autoApprove": []
}
}
}
For SonarQube Cloud US, add "-e", "SONARQUBE_URL" to the args array and "SONARQUBE_URL": "https://sonarqube.us" to the env section.
- To connect with SonarQube Server:
{
"mcpServers": {
"sonarqube": {
"command": "docker",
"args": [
"run",
"-i",
"--rm",
"-e",
"SONARQUBE_TOKEN",
"-e",
"SONARQUBE_URL",
"mcp/sonarqube"
],
"env": {
"SONARQUBE_TOKEN": "<YOUR_USER_TOKEN>",
"SONARQUBE_URL": "<YOUR_SERVER_URL>"
},
"disabled": false,
"autoApprove": []
}
}
}
VS Code
You can use the following buttons to simplify the installation process within VS Code.
For SonarQube Cloud US, manually add "SONARQUBE_URL": "https://sonarqube.us" to the env section in your MCP configuration after installation.
Windsurf
SonarQube MCP Server is available as a Windsurf plugin. Follow these instructions:
- Open Windsurf Settings > Cascade > MCP Servers and select Open MCP Marketplace
- Search for sonarqube on the Cascade MCP Marketplace
- Choose the SonarQube MCP Server and select Install
- Add the required SonarQube User token. Then add the organization key if you want to connect with SonarQube Cloud, or the SonarQube URL if you want to connect to SonarQube Server or Community Build.
For SonarQube Cloud US, set the URL to https://sonarqube.us.
Zed
Navigate to the Extensions view in Zed and search for SonarQube MCP Server. When installing the extension, you will be prompted to provide the necessary environment variables:
- When using SonarQube Cloud:
{
"sonarqube_token": "YOUR_SONARQUBE_TOKEN",
"sonarqube_org": "SONARQUBE_ORGANIZATION_KEY",
"docker_path": "DOCKER_PATH"
}
For SonarQube Cloud US, add "sonarqube_url": "https://sonarqube.us" to the configuration.
- When using SonarQube Server:
{
"sonarqube_token": "YOUR_SONARQUBE_USER_TOKEN",
"sonarqube_url": "YOUR_SONARQUBE_SERVER_URL",
"docker_path": "DOCKER_PATH"
}
The docker_path is the path to a docker executable. Examples:
Linux/macOS: /usr/bin/docker or /usr/local/bin/docker
Windows: C:\Program Files\Docker\Docker\resources\bin\docker.exe
You can manually install the SonarQube MCP server by copying the following snippet in the MCP servers configuration file:
- To connect with SonarQube Cloud:
{
"sonarqube": {
"command": "docker",
"args": [
"run",
"-i",
"--rm",
"-e",
"SONARQUBE_TOKEN",
"-e",
"SONARQUBE_ORG",
"mcp/sonarqube"
],
"env": {
"SONARQUBE_TOKEN": "<token>",
"SONARQUBE_ORG": "<org>"
}
}
}

- To connect with SonarQube Server:
{
"sonarqube": {
"command": "docker",
"args": [
"run",
"-i",
"--rm",
"-e",
"SONARQUBE_TOKEN",
"-e",
"SONARQUBE_URL",
"mcp/sonarqube"
],
"env": {
"SONARQUBE_TOKEN": "<token>",
"SONARQUBE_URL": "<url>"
}
}
}

The SonarQube MCP Server can integrate with SonarQube for IDE to further enhance your development workflow, providing better code analysis and insights directly within your IDE.
When using SonarQube for IDE, the SONARQUBE_IDE_PORT environment variable should be set with the correct port number. SonarQube for VS Code includes a Quick Install button, which automatically sets the correct port configuration.
For example, with SonarQube Cloud:
{
"sonarqube": {
"command": "docker",
"args": [
"run",
"-i",
"--rm",
"-e",
"SONARQUBE_TOKEN",
"-e",
"SONARQUBE_ORG",
"-e",
"SONARQUBE_IDE_PORT",
"mcp/sonarqube"
],
"env": {
"SONARQUBE_TOKEN": "<token>",
"SONARQUBE_ORG": "<org>",
"SONARQUBE_IDE_PORT": "<64120-64130>"
}
}
}

When running the MCP server in a container on Linux, the container cannot access the SonarQube for IDE embedded server running on localhost. To allow the container to connect to the SonarQube for IDE server, add the --network=host option to your container run command.
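As a sketch, the Linux invocation described in the note above would look like this (the command is only assembled and printed here, not executed):

```shell
# Host networking lets the container reach the SonarQube for IDE embedded
# server listening on the host's localhost (Linux only).
CMD="docker run -i --rm --network=host \
  -e SONARQUBE_TOKEN -e SONARQUBE_ORG -e SONARQUBE_IDE_PORT mcp/sonarqube"
echo "$CMD"
```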
SonarQube MCP Server requires a Java Development Kit (JDK) version 21 or later to build.
Run the following Gradle command to clean the project and build the application:
./gradlew clean build -x test

The JAR file will be created in build/libs/.
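For a quick local smoke test of the freshly built JAR over stdio, a sketch like the following could be used; the exact JAR filename and storage directory are assumptions, so adjust them to what build/libs/ actually contains (the command is only printed, not run):

```shell
# Assumed storage location -- the server needs a writable STORAGE_PATH.
STORAGE_PATH="${TMPDIR:-/tmp}/sonarqube-mcp-storage"
mkdir -p "$STORAGE_PATH"

# Assumed JAR name -- check build/libs/ for the real artifact.
CMD="STORAGE_PATH=$STORAGE_PATH SONARQUBE_TOKEN=<token> SONARQUBE_ORG=<org> \
java -jar build/libs/sonarqube-mcp-server.jar"
echo "$CMD"
```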
You will then need to manually copy and paste the MCP configuration, as follows:
- To connect with SonarQube Cloud:
{
"sonarqube": {
"command": "java",
"args": [
"-jar",
"<path_to_sonarqube_mcp_server_jar>"
],
"env": {
"STORAGE_PATH": "<path_to_your_mcp_storage>",
"SONARQUBE_TOKEN": "<token>",
"SONARQUBE_ORG": "<org>"
}
}
}

- To connect with SonarQube Server:
{
"sonarqube": {
"command": "java",
"args": [
"-jar",
"<path_to_sonarqube_mcp_server_jar>"
],
"env": {
"STORAGE_PATH": "<path_to_your_mcp_storage>",
"SONARQUBE_TOKEN": "<token>",
"SONARQUBE_URL": "<url>"
}
}
}

Depending on your environment, you may need to provide additional environment variables. Add the following variables when running the MCP Server:
| Environment variable | Description |
|---|---|
| STORAGE_PATH | Mandatory. Absolute path to a writable directory where SonarQube MCP Server will store its files (e.g., for creation, updates, and persistence). It is automatically provided when using the container image. |
| SONARQUBE_IDE_PORT | Optional. Port number between 64120 and 64130 used to connect SonarQube MCP Server with SonarQube for IDE. |
Beta availability: The advanced analysis tool is currently in beta and only available to specific SonarQube Cloud users with access enabled. If your organization does not have access, enabling it will not provide any additional functionality.
| Environment variable | Description |
|---|---|
| SONARQUBE_ADVANCED_ANALYSIS_ENABLED | When set to true on SonarQube Cloud, enables the advanced analysis tool run_advanced_code_analysis. When enabled, it replaces all local analysis tools (analyze_code_snippet, analyze_file_list, toggle_automatic_analysis). On SonarQube Server, the flag is ignored and the server falls back to standard analysis. Default: false. |
By default, all tools are enabled. You can selectively enable specific toolsets to reduce context overhead and focus on specific functionality.
| Environment variable | Description |
|---|---|
| SONARQUBE_TOOLSETS | Comma-separated list of toolsets to enable. When set, only these toolsets will be available. If not set, all tools are enabled. Note: The projects toolset is always enabled as it's required to find project keys for other operations. |
| SONARQUBE_READ_ONLY | When set to true, enables read-only mode, which disables all write operations (changing issue status, for example). This filter is cumulative with SONARQUBE_TOOLSETS if both are set. Default: false. |
Available Toolsets
| Toolset | Key | Description |
|---|---|---|
| Analysis | analysis | Code analysis tools (local analysis and advanced remote analysis) |
| Issues | issues | Search and manage SonarQube issues |
| Projects | projects | Browse and search SonarQube projects |
| Quality Gates | quality-gates | Access quality gates and their status |
| Rules | rules | Browse and search SonarQube rules |
| Sources | sources | Access source code and SCM information |
| Duplications | duplications | Find code duplications across projects |
| Measures | measures | Retrieve metrics and measures (includes both measures and metrics tools) |
| Languages | languages | List supported programming languages |
| Portfolios | portfolios | Manage portfolios and enterprises (Cloud and Server) |
| System | system | System administration tools (Server only) |
| Webhooks | webhooks | Manage webhooks |
| Dependency Risks | dependency-risks | Analyze dependency risks and security issues (SCA) |
Enable analysis, issues, and quality gates toolsets (using Docker with SonarQube Cloud):
docker run -i --rm \
-e SONARQUBE_TOKEN="<token>" \
-e SONARQUBE_ORG="<org>" \
-e SONARQUBE_TOOLSETS="analysis,issues,quality-gates" \
mcp/sonarqube

Note: The projects toolset is always enabled automatically, so you don't need to include it in SONARQUBE_TOOLSETS.
Enable read-only mode (using Docker with SonarQube Cloud):
docker run -i --rm \
-e SONARQUBE_TOKEN="<token>" \
-e SONARQUBE_ORG="<org>" \
-e SONARQUBE_READ_ONLY="true" \
mcp/sonarqube

To enable full functionality, the following environment variables must be set before starting the server:
| Environment variable | Description | Required |
|---|---|---|
| SONARQUBE_TOKEN | Your SonarQube Cloud token | Yes |
| SONARQUBE_ORG | Your SonarQube Cloud organization key | Yes |
| SONARQUBE_URL | Custom SonarQube Cloud URL (defaults to https://sonarcloud.io). Use https://sonarqube.us for SonarQube Cloud US. | No |
Examples:
- SonarQube Cloud: Only SONARQUBE_TOKEN and SONARQUBE_ORG are needed
- SonarQube Cloud US: Set SONARQUBE_TOKEN, SONARQUBE_ORG, and SONARQUBE_URL=https://sonarqube.us
| Environment variable | Description | Required |
|---|---|---|
| SONARQUBE_TOKEN | Your SonarQube Server USER token | Yes |
| SONARQUBE_URL | Your SonarQube Server URL | Yes |
⚠️ Connection to SonarQube Server requires a token of type USER and will not function properly if project tokens or global tokens are used.
💡 Configuration Tip: The presence of SONARQUBE_ORG determines whether you're connecting to SonarQube Cloud or Server. If SONARQUBE_ORG is set, SonarQube Cloud is used; otherwise, SonarQube Server is used.
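The selection rule described in the tip can be sketched as a simple check (an illustration of the documented behavior, not the server's actual code):

```shell
# Mirrors the documented rule: SONARQUBE_ORG set => Cloud, otherwise Server.
if [ -n "${SONARQUBE_ORG:-}" ]; then
  TARGET="SonarQube Cloud"
else
  TARGET="SonarQube Server"
fi
echo "Connecting to $TARGET"
```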
The SonarQube MCP Server supports three transport modes:
The recommended mode for local development and single-user setups, used by all MCP clients.
Example - Docker with SonarQube Cloud:
{
"mcpServers": {
"sonarqube": {
"command": "docker",
"args": ["run", "-i", "--rm", "-e", "SONARQUBE_TOKEN", "-e", "SONARQUBE_ORG", "mcp/sonarqube"],
"env": {
"SONARQUBE_TOKEN": "<your-token>",
"SONARQUBE_ORG": "<your-org>"
}
}
}
}

Unencrypted HTTP transport. Use HTTPS instead for multi-user deployments.
⚠️ Not Recommended: Use Stdio for local development or HTTPS for multi-user production deployments.
| Environment variable | Description | Default |
|---|---|---|
| SONARQUBE_TRANSPORT | Set to http to enable HTTP transport | Not set (stdio) |
| SONARQUBE_HTTP_PORT | Port number (1024-65535) | 8080 |
| SONARQUBE_HTTP_HOST | Host to bind (defaults to localhost for security) | 127.0.0.1 |
Note: In HTTP(S) mode, each client sends their own token via the SONARQUBE_TOKEN header. The server's SONARQUBE_TOKEN is only used for initialization.
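As a hedged sketch of what a raw client request might look like under this scheme — assuming the /mcp endpoint path shown in the HTTPS client configuration and a minimal JSON-RPC initialize message; the curl command is only assembled and printed, not executed:

```shell
# Each client supplies its own token via the SONARQUBE_TOKEN header.
BODY='{"jsonrpc":"2.0","id":1,"method":"initialize","params":{}}'
CMD="curl -s http://127.0.0.1:8080/mcp \
  -H 'Content-Type: application/json' \
  -H 'SONARQUBE_TOKEN: <your-token>' \
  -d '$BODY'"
echo "$CMD"
```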
Secure multi-user transport with TLS encryption. Requires SSL certificates.
✅ Recommended for Production: Use HTTPS when deploying the MCP server for multiple users. The server binds to 127.0.0.1 (localhost) by default for security.
| Environment variable | Description | Default |
|---|---|---|
| SONARQUBE_TRANSPORT | Set to https to enable HTTPS transport | Not set (stdio) |
| SONARQUBE_HTTP_PORT | Port number (typically 8443 for HTTPS) | 8080 |
| SONARQUBE_HTTP_HOST | Host to bind (defaults to localhost for security) | 127.0.0.1 |
SSL Certificate Configuration (Optional):
| Environment variable | Description | Default |
|---|---|---|
| SONARQUBE_HTTPS_KEYSTORE_PATH | Path to keystore file (.p12 or .jks) | /etc/ssl/mcp/keystore.p12 |
| SONARQUBE_HTTPS_KEYSTORE_PASSWORD | Keystore password | sonarlint |
| SONARQUBE_HTTPS_KEYSTORE_TYPE | Keystore type (PKCS12 or JKS) | PKCS12 |
Example - Docker with SonarQube Cloud:
docker run -p 8443:8443 \
-v $(pwd)/keystore.p12:/etc/ssl/mcp/keystore.p12:ro \
-e SONARQUBE_TRANSPORT=https \
-e SONARQUBE_HTTP_PORT=8443 \
-e SONARQUBE_TOKEN="<init-token>" \
-e SONARQUBE_ORG="<your-org>" \
mcp/sonarqube

Client Configuration:
{
"mcpServers": {
"sonarqube-https": {
"url": "https://your-server:8443/mcp",
"headers": {
"SONARQUBE_TOKEN": "<your-token>"
}
}
}
}

Note: For local development, use Stdio transport instead (the default). HTTPS is intended for multi-user production deployments with proper SSL certificates.
If your SonarQube Server uses a self-signed certificate or a certificate from a private Certificate Authority (CA), you can add custom certificates to the container that will automatically be installed.
Configuration
Mount a directory containing your certificates when running the container:
docker run -i --rm \
-v /path/to/your/certificates/:/usr/local/share/ca-certificates/:ro \
-e SONARQUBE_TOKEN="<token>" \
-e SONARQUBE_URL="<url>" \
mcp/sonarqube

The container supports the following certificate formats:
- .crt files (PEM or DER encoded)
- .pem files (PEM encoded)
When using custom certificates, you can modify your MCP configuration to mount the certificates:
{
"sonarqube": {
"command": "docker",
"args": [
"run",
"-i",
"--rm",
"-v",
"/path/to/your/certificates/:/usr/local/share/ca-certificates/:ro",
"-e",
"SONARQUBE_TOKEN",
"-e",
"SONARQUBE_URL",
"mcp/sonarqube"
],
"env": {
"SONARQUBE_TOKEN": "<token>",
"SONARQUBE_URL": "<url>"
}
}
}

The SonarQube MCP Server supports HTTP proxies through standard Java proxy system properties.
Configuration
You can configure proxy settings using Java system properties. These can be set as environment variables or passed as JVM arguments.
Common Proxy Properties:
| Property | Description | Example |
|---|---|---|
| http.proxyHost | HTTP proxy hostname | proxy.example.com |
| http.proxyPort | HTTP proxy port | 8080 |
| https.proxyHost | HTTPS proxy hostname | proxy.example.com |
| https.proxyPort | HTTPS proxy port | 8443 |
| http.nonProxyHosts | Hosts that bypass the proxy (pipe-separated) | localhost\|127.0.0.1\|*.internal.com |
If your proxy requires authentication, the SonarQube MCP Server uses Java's standard authentication mechanism. You can set up proxy credentials using Java system properties:
| Property | Description | Example |
|---|---|---|
| http.proxyUser | HTTP proxy username | myuser |
| http.proxyPassword | HTTP proxy password | mypassword |
| https.proxyUser | HTTPS proxy username | myuser |
| https.proxyPassword | HTTPS proxy password | mypassword |
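One way to pass these system properties to the containerized JVM is the standard JAVA_TOOL_OPTIONS environment variable, which any HotSpot JVM picks up automatically at startup; the proxy hostname below is a placeholder and the command is only assembled and printed:

```shell
# Standard JVM mechanism: properties listed in JAVA_TOOL_OPTIONS are
# appended to the JVM's command line when it starts.
JAVA_TOOL_OPTIONS="-Dhttps.proxyHost=proxy.example.com -Dhttps.proxyPort=8443"
CMD="docker run -i --rm \
  -e JAVA_TOOL_OPTIONS='$JAVA_TOOL_OPTIONS' \
  -e SONARQUBE_TOKEN -e SONARQUBE_URL mcp/sonarqube"
echo "$CMD"
```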
- analyze_code_snippet - Analyze a file or code snippet with SonarQube analyzers to identify code quality and security issues. Specify the language of the snippet to improve analysis accuracy.
  - codeSnippet - Code snippet or full file content - Required String
  - language - Optional language of the code snippet - String
  - scope - Optional scope of the file: MAIN or TEST (default: MAIN) - String

When integration with SonarQube for IDE is enabled:

- analyze_file_list - Analyze files in the current working directory using SonarQube for IDE. This tool connects to a running SonarQube for IDE instance to perform code quality analysis on a list of files.
  - file_absolute_paths - List of absolute file paths to analyze - Required String[]
- toggle_automatic_analysis - Enable or disable SonarQube for IDE automatic analysis. When enabled, SonarQube for IDE will automatically analyze files as they are modified in the working directory. When disabled, automatic analysis is turned off.
  - enabled - Enable or disable the automatic analysis - Required Boolean
When advanced analysis is enabled (SONARQUBE_ADVANCED_ANALYSIS_ENABLED=true):

Beta availability: The advanced analysis tool is currently in beta and only available to specific SonarQube Cloud users with access enabled.

- run_advanced_code_analysis - Run advanced code analysis on SonarQube Cloud for a single file. Organization is inferred from MCP configuration.
  - projectKey - The key of the project - Required String
  - branchName - Branch name used to retrieve the latest analysis context - Required String
  - filePath - Project-relative path of the file to analyze (e.g., 'src/main/java/MyClass.java') - Required String
  - fileContent - The original content of the file to analyze - Required String
  - fileScope - Defines in which scope the file originates from: 'MAIN' or 'TEST' (default: MAIN) - String
Note: Dependency risks are only available when connecting to SonarQube Server 2025.4 Enterprise or higher with SonarQube Advanced Security enabled.

- search_dependency_risks - Search for software composition analysis issues (dependency risks) of a SonarQube project, paired with releases that appear in the analyzed project, application, or portfolio.
  - projectKey - Project key - String
  - branchKey - Optional branch key - String
  - pullRequestKey - Optional pull request key - String
Note: Enterprises are only available when connecting to SonarQube Cloud.

- list_enterprises - List the enterprises available in SonarQube Cloud that you have access to. Use this tool to discover enterprise IDs that can be used with other tools.
  - enterpriseKey - Optional enterprise key to filter results - String
- change_sonar_issue_status - Change the status of a SonarQube issue to "accept", "falsepositive" or to "reopen" an issue.
  - key - Issue key - Required String
  - status - New issue status - Required Enum {"accept", "falsepositive", "reopen"}
- search_sonar_issues_in_projects - Search for SonarQube issues in my organization's projects.
  - projects - Optional list of Sonar projects - String[]
  - branch - Optional branch name to search for issues in - String
  - pullRequestId - Optional pull request identifier - String
  - severities - Optional list of severities to filter by. Possible values: INFO, LOW, MEDIUM, HIGH, BLOCKER - String[]
  - impactSoftwareQualities - Optional list of software qualities to filter by. Possible values: MAINTAINABILITY, RELIABILITY, SECURITY - String[]
  - issueStatuses - Optional list of issue statuses to filter by. Possible values: OPEN, CONFIRMED, FALSE_POSITIVE, ACCEPTED, FIXED, IN_SANDBOX - String[]
  - issueKey - Optional issue key to fetch a specific issue - String
  - p - Optional page number (default: 1) - Integer
  - ps - Optional page size. Must be greater than 0 and less than or equal to 500 (default: 100) - Integer
- list_languages - List all programming languages supported in this SonarQube instance.
  - q - Optional pattern to match language keys/names against - String
- get_component_measures - Get SonarQube measures for a component (project, directory, file).
  - component - Optional component key to get measures for - String
  - branch - Optional branch to analyze for measures - String
  - metricKeys - Optional metric keys to retrieve (e.g. ncloc, complexity, violations, coverage) - String[]
  - pullRequest - Optional pull request identifier to analyze for measures - String
- search_metrics - Search for SonarQube metrics.
  - p - Optional page number (default: 1) - Integer
  - ps - Optional page size. Must be greater than 0 and less than or equal to 500 (default: 100) - Integer
list_portfolios - List enterprise portfolios available in SonarQube with filtering and pagination options.
For SonarQube Server:
-
q- Optional search query to filter portfolios by name or key - String -
favorite- If true, only returns favorite portfolios - Boolean -
pageIndex- Optional 1-based page number (default: 1) - Integer -
pageSize- Optional page size, max 500 (default: 100) - Integer
For SonarQube Cloud:
-
enterpriseId- Enterprise uuid. Can be omitted only if 'favorite' parameter is supplied with value true - String -
q- Optional search query to filter portfolios by name - String -
favorite- Required to be true if 'enterpriseId' parameter is omitted. If true, only returns portfolios favorited by the logged-in user. Cannot be true when 'draft' is true - Boolean -
draft- If true, only returns drafts created by the logged-in user. Cannot be true when 'favorite' is true - Boolean -
pageIndex- Optional index of the page to fetch (default: 1) - Integer -
pageSize- Optional size of the page to fetch (default: 50) - Integer
-
- search_my_sonarqube_projects - Find SonarQube projects. The response is paginated.
  - page - Optional page number - String
- get_project_quality_gate_status - Get the Quality Gate Status for the SonarQube project.
  - analysisId - Optional analysis ID - String
  - branch - Optional branch key - String
  - projectId - Optional project ID - String
  - projectKey - Optional project key - String
  - pullRequest - Optional pull request ID - String
- list_quality_gates - List all quality gates in my SonarQube.
- list_rule_repositories - List rule repositories available in SonarQube.
  - language - Optional language key - String
  - q - Optional search query - String
- show_rule - Shows detailed information about a SonarQube rule.
  - key - Rule key - Required String
- search_duplicated_files - Search for files with code duplications in a SonarQube project. By default, automatically fetches all duplicated files across all pages (up to 10,000 files max). Returns only files with duplications.
  - projectKey - Project key - Required String
  - branch - Optional branch key - String
  - pullRequest - Optional pull request id - String
  - pageSize - Optional number of results per page for manual pagination (max: 500). If not specified, auto-fetches all duplicated files - Integer
  - pageIndex - Optional page number for manual pagination (starts at 1). If not specified, auto-fetches all duplicated files - Integer
- get_duplications - Get duplications for a file. Requires Browse permission on the file's project.
  - key - File key - Required String
  - branch - Optional branch key - String
  - pullRequest - Optional pull request id - String
- get_raw_source - Get source code as raw text from SonarQube. Requires 'See Source Code' permission on the file.
  - key - File key - Required String
  - branch - Optional branch key - String
  - pullRequest - Optional pull request id - String
- get_scm_info - Get SCM information of SonarQube source files. Requires 'See Source Code' permission on the file's project.
  - key - File key - Required String
  - commits_by_line - Group lines by SCM commit if value is false, else display commits for each line - String
  - from - First line to return. Starts at 1 - Number
  - to - Last line to return (inclusive) - Number
Note: System tools are only available when connecting to SonarQube Server.

- get_system_health - Get the health status of the SonarQube Server instance. Returns GREEN (fully operational), YELLOW (usable but needs attention), or RED (not operational).
- get_system_info - Get detailed information about SonarQube Server system configuration including JVM state, database, search indexes, and settings. Requires 'Administer' permissions.
- get_system_logs - Get SonarQube Server system logs in plain-text format. Requires system administration permission.
  - name - Optional name of the logs to get. Possible values: access, app, ce, deprecation, es, web. Default: app - String
- ping_system - Ping the SonarQube Server system to check if it's alive. Returns 'pong' as plain text.
- get_system_status - Get state information about SonarQube Server. Returns status (STARTING, UP, DOWN, RESTARTING, DB_MIGRATION_NEEDED, DB_MIGRATION_RUNNING), version, and id.
- create_webhook - Create a new webhook for the SonarQube organization or project. Requires 'Administer' permission on the specified project, or global 'Administer' permission.
  - name - Webhook name - Required String
  - url - Webhook URL - Required String
  - projectKey - Optional project key for project-specific webhook - String
  - secret - Optional webhook secret for securing the webhook payload - String
- list_webhooks - List all webhooks for the SonarQube organization or project. Requires 'Administer' permission on the specified project, or global 'Administer' permission.
  - projectKey - Optional project key to list project-specific webhooks - String
Once you've set up the SonarQube MCP Server, here are some example prompts for common real-world scenarios:
Fixing a Failing Quality Gate
My quality gate is failing for my project. Can you help me understand why and fix the most critical issues?
The quality gate on my feature branch is red. What do I need to fix to get it passing before I can merge to main?
Pre-Release and Pre-Merge Checks
I'm about to merge my pull request <#247> for the <web-app> project. Can you check if there are any quality issues I should address first?
We're deploying to production tomorrow. Can you check the quality gate status and alert me to any critical issues in this branch?
Improving Code Quality
I want to reduce technical debt in my project. What are the top issues I should prioritize?
Our code coverage dropped below 70%. Can you identify which files have the lowest coverage and help me improve it?
Understanding and Fixing Issues
I have 15 new code smells in my latest commit. Can you explain what they are and help me fix them?
SonarQube flagged a critical security vulnerability in <AuthController.java>. What's the issue and how do I fix it?
Security and Dependency Management
We need to pass a security audit. Can you check all our projects for security vulnerabilities and create a prioritized list of what needs to be fixed?
Are there any known vulnerabilities in our dependencies? Check this project for dependency risks.
Code Review Assistance
I just wrote this authentication function. Can you analyze it for security issues and code quality problems before I commit?
Review the changes in <src/database/migrations> for any potential bugs or security issues.
Project Health Monitoring
Give me a health report for my project: quality gate status, number of bugs, security hotspots, and code coverage.
Compare code quality between our main branch and the develop branch. Are we introducing new issues?
Team Collaboration
What are the most common rule violations across all our projects? We might need to update our coding standards.
Show me all the issues that were marked as false positives in the last month. Are we seeing patterns that suggest our rules need adjustment?
Application logs will be written to the STORAGE_PATH/logs/mcp.log file.
This server collects anonymous usage data and sends it to SonarSource to help improve the product. No source code or IP address is collected, and SonarSource does not share the data with anyone else. Collection of telemetry can be disabled with the following system property or environment variable: TELEMETRY_DISABLED=true. Click here to see a sample of the data that are collected.
Copyright 2025 SonarSource.
Licensed under the SONAR Source-Available License v1.0. Using the SonarQube MCP Server in compliance with this documentation is a Non-Competitive Purpose and so is allowed under the SSAL.
Your use of SonarQube via MCP is governed by the SonarQube Cloud Terms of Service or SonarQube Server Terms and Conditions, including use of the Results Data solely for your internal software development purposes.