
Dive AI Agent 🤿 🤖


Dive is an open-source MCP Host Desktop Application that seamlessly integrates with any LLMs supporting function calling capabilities. ✨

Dive Demo

Features 🎯

  • 🌐 Universal LLM Support: Compatible with ChatGPT, Anthropic, Ollama, and OpenAI-compatible models
  • 💻 Cross-Platform: Available for Windows, macOS, and Linux
  • 🔄 Model Context Protocol: Enabling seamless MCP AI agent integration on both stdio and SSE mode
  • ☁️ OAP Cloud Integration: One-click access to managed MCP servers via OAPHub.ai - eliminates complex local deployments
  • 🏗️ Dual Architecture: Modern Tauri version alongside traditional Electron version for optimal performance
  • 🌍 Multi-Language Support: Traditional Chinese, Simplified Chinese, English, Spanish, Japanese, Korean with more coming soon
  • ⚙️ Advanced API Management: Multiple API keys and model switching support with model_settings.json
  • 🛠️ Granular Tool Control: Enable/disable individual MCP tools for precise customization
  • 💡 Custom Instructions: Personalized system prompts for tailored AI behavior
  • 🔄 Auto-Update Mechanism: Automatically checks for and installs the latest application updates

Recent updates (2025/9/11) - v0.9.5 🎉

Major Architecture Changes

  • 🏗️ Dual Architecture Support: Dive now supports both Electron and Tauri frameworks simultaneously
  • Tauri Version: New modern architecture with optimized installer size (Windows < 30MB)
  • 🌐 OAP Platform Integration: Native support for OAPHub.ai cloud services with one-click MCP server deployment

New Features & Improvements

  • 🔐 OAP Authentication: Comprehensive OAP login and authentication support
  • 📁 Enhanced Model Configuration: Complete restructuring with model_settings.json for managing multiple models
  • 🛠️ Granular MCP Control: Individual tool enable/disable functionality for better customization
  • 🎨 UI/UX Enhancements: Streamlined settings interface with combined pages for better user experience
  • 🔧 Improved Network Handling: Enhanced port resolution logic with interval polling for better connectivity
  • ⚙️ Enhanced Model Settings: Improved OpenAI compatible model settings and tool integration in prompts
  • 🐧 Linux Tauri Support: Full Tauri framework support now available on Linux platforms
  • 📦 Smart Dependency Management: Automatic detection and updating of MCP host dependencies
  • 🔄 Updated dive-mcp-host: Latest architectural improvements incorporated

Platform Availability

  • Windows: Available in both Electron and Tauri versions ✅
  • macOS: Currently Electron only 🔜
  • Linux: Available in both Electron and Tauri versions ✅

Migration Note: Existing local MCP/LLM configurations remain fully supported. OAP integration is additive and does not affect current workflows.

Download and Install ⬇️

Get the latest version of Dive: Download

Windows users: 🪟

Choose between two architectures:

  • Tauri Version (Recommended): Smaller installer (<30MB), modern architecture
  • Electron Version: Traditional architecture, fully stable
  • Python and Node.js environments will be downloaded automatically after launching

macOS users: 🍎

  • Electron Version: Download the .dmg version
  • You need to install the Python and Node.js environments (including npx and uvx) yourself
  • Follow the installation prompts to complete setup

Linux users: 🐧

Choose between two architectures:

  • Tauri Version (Recommended): Modern architecture with smaller installer size
  • Electron Version: Traditional architecture with .AppImage format
  • You need to install the Python and Node.js environments (including npx and uvx) yourself
  • For Ubuntu/Debian users (see the example after this list):
    • You may need to add the --no-sandbox parameter
    • Or modify system settings to allow the sandbox
    • Run chmod +x to make the AppImage executable
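
For example, assuming the downloaded file is named Dive-x.y.z.AppImage (the exact filename varies by release):

# Make the AppImage executable, then launch it
chmod +x Dive-x.y.z.AppImage
./Dive-x.y.z.AppImage

# On Ubuntu/Debian, if startup fails with a sandbox error, launch without the sandbox
./Dive-x.y.z.AppImage --no-sandbox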

MCP Setup Options

Dive offers two ways to access MCP tools: Local MCP Servers (for advanced users) and OAP Cloud Services (recommended for beginners).

Option 1: Local MCP Servers 🛠️

For advanced users who prefer local control. The system comes with a default echo MCP Server, and you can add more powerful tools like Fetch and Youtube-dl.

Set MCP

Option 2: OAP Cloud Services ☁️

The easiest way to get started! Access enterprise-grade MCP tools instantly:

  1. Sign up at OAPHub.ai
  2. Connect to Dive using one-click deep links or configuration files
  3. Enjoy managed MCP servers with zero setup - no Python, Docker, or complex dependencies required

Benefits:

  • ✅ Zero configuration needed
  • ✅ Cross-platform compatibility
  • ✅ Enterprise-grade reliability
  • ✅ Automatic updates and maintenance

Quick Local Setup

Add this JSON configuration to your Dive MCP settings to enable local tools (replace /path/to/allowed/files with a directory the filesystem server is allowed to access):

 "mcpServers":{
    "fetch": {
      "command": "uvx",
      "args": [
        "mcp-server-fetch",
        "--ignore-robots-txt"
      ],
      "enabled": true
    },
    "filesystem": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-filesystem",
        "/path/to/allowed/files"
      ],
      "enabled": true
    },
    "youtubedl": {
      "command": "npx",
      "args": [
        "@kevinwatt/yt-dlp-mcp"
      ],
      "enabled": true
    }
  }

Using Streamable HTTP for Cloud MCP Services

You can connect to external cloud MCP servers via the Streamable HTTP transport. Here is a Dive configuration example for the SearXNG service from OAPHub:

{
  "mcpServers": {
    "SearXNG_MCP_Server": {
      "transport": "streamable",
      "url": "https://proxy.oaphub.ai/v1/mcp/181672830075666436",
      "headers": {
        "Authorization": "GLOBAL_CLIENT_TOKEN"
      }
    }
  }
}

Reference: https://oaphub.ai/mcp/181672830075666436
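
If you want to confirm the endpoint is reachable before adding it to Dive, a plain HTTP client works. The sketch below is an assumption based on the standard MCP Streamable HTTP handshake (a JSON-RPC initialize POST); replace GLOBAL_CLIENT_TOKEN with your actual token and adjust details to match your server:

# Minimal reachability check against the Streamable HTTP endpoint
# (assumes the standard MCP initialize request; not specific to Dive)
curl -X POST "https://proxy.oaphub.ai/v1/mcp/181672830075666436" \
  -H "Authorization: GLOBAL_CLIENT_TOKEN" \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -d '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2025-03-26","capabilities":{},"clientInfo":{"name":"curl-check","version":"0.0.1"}}}'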

Using SSE Server (Non-Local MCP)

You can also connect to external MCP servers (not local ones) via SSE (Server-Sent Events). Add this configuration to your Dive MCP settings:

{
  "mcpServers": {
    "MCP_SERVER_NAME": {
      "enabled": true,
      "transport": "sse",
      "url": "YOUR_SSE_SERVER_URL"
    }
  }
}

Additional Setup for yt-dlp-mcp

yt-dlp-mcp requires the yt-dlp package. Install it based on your operating system:

Windows

winget install yt-dlp

macOS

brew install yt-dlp

Linux

pip install yt-dlp
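
After installing with the command for your platform, you can confirm that yt-dlp is available on your PATH:

# Print the installed yt-dlp version to verify the installation
yt-dlp --version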

Build 🛠️

See BUILD.md for more details.
