
fast-mcp
A Ruby Implementation of the Model Context Protocol
Stars: 340

Fast MCP is a Ruby gem that simplifies the integration of AI models with your Ruby applications. It provides a clean implementation of the Model Context Protocol, eliminating complex communication protocols, integration challenges, and compatibility issues. With Fast MCP, you can easily connect AI models to your servers, share data resources, choose from multiple transports, integrate with frameworks like Rails and Sinatra, and secure your AI-powered endpoints. The gem also offers real-time updates and authentication support, making AI integration a seamless experience for developers.
README:
No complex protocols, no integration headaches, no compatibility issues. Just beautiful, expressive Ruby code.
AI models are powerful, but they need to interact with your applications to be truly useful. Traditional approaches mean wrestling with:
- Complex communication protocols and custom JSON formats
- Integration challenges with different model providers
- Compatibility issues between your app and AI tools
- Managing state between AI interactions and your data
Fast MCP solves all these problems by providing a clean, Ruby-focused implementation of the Model Context Protocol, making AI integration a joy, not a chore.
- Tools API - Let AI models call your Ruby functions securely, with in-depth argument validation through Dry-Schema.
- Resources API - Share data between your app and AI models.
- Multiple Transports - Choose from STDIO, HTTP, or SSE based on your needs.
- Framework Integration - Works seamlessly with Rails, Sinatra, or any Rack app.
- Authentication Support - Secure your AI-powered endpoints with ease.
- Real-time Updates - Subscribe to changes for interactive applications.
require 'fast_mcp'

# Create a server and define tools for AI models to use
server = FastMcp::Server.new(name: 'popular-users', version: '1.0.0')

# Define a tool by inheriting from FastMcp::Tool
class CreateUserTool < FastMcp::Tool
  description "Create a user"

  # These arguments generate the JSON schema presented to the MCP client,
  # and they are validated at runtime.
  # Validation is based on Dry-Schema, with the addition of the description.
  arguments do
    required(:first_name).filled(:string).description("First name of the user")
    optional(:age).filled(:integer).description("Age of the user")
    required(:address).hash do
      optional(:street).filled(:string)
      optional(:city).filled(:string)
      optional(:zipcode).filled(:string)
    end
  end

  def call(first_name:, age: nil, address: {})
    User.create!(first_name:, age:, address:)
  end
end

# Register the tool with the server
server.register_tool(CreateUserTool)

# Share data resources with AI models by inheriting from FastMcp::Resource
class PopularUsers < FastMcp::Resource
  uri "file://popular_users.json"
  resource_name "Popular Users"
  mime_type "application/json"

  def content
    JSON.generate(User.popular.limit(5).as_json)
  end
end

# Register the resource with the server
server.register_resource(PopularUsers)

# Access the resource through the server
server.read_resource(PopularUsers.uri)

# Notify clients that the resource content has been updated
server.notify_resource_updated(PopularUsers.uri)
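To actually serve the tool and resource to an MCP client, start the server. A minimal sketch, assuming the default STDIO transport (the complete example further down shows the same call in context):

# Start serving over STDIO
server.start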
Install the gem, then run the Rails generator:
bundle add fast-mcp
bin/rails generate fast_mcp:install
This will add a configurable fast_mcp.rb initializer:
require 'fast_mcp'

FastMcp.mount_in_rails(
  Rails.application,
  name: Rails.application.class.module_parent_name.underscore.dasherize,
  version: '1.0.0',
  path_prefix: '/mcp' # This is the default path prefix
  # authenticate: true, # Uncomment to enable authentication
  # auth_token: 'your-token', # Required if authenticate: true
) do |server|
  Rails.application.config.after_initialize do
    # FastMcp will automatically discover and register:
    # - All classes that inherit from ApplicationTool (which uses ActionTool::Base)
    # - All classes that inherit from ApplicationResource (which uses ActionResource::Base)
    server.register_tools(*ApplicationTool.descendants)
    server.register_resources(*ApplicationResource.descendants)

    # Alternatively, you can register tools and resources manually:
    # server.register_tool(MyTool)
    # server.register_resource(MyResource)
  end
end
The install script will also:
- add an app/resources folder
- add an app/tools folder
- add app/tools/sample_tool.rb
- add app/resources/sample_resource.rb
- add an ApplicationTool base class for your tools to inherit from
- add an ApplicationResource base class for your resources to inherit from
For Rails applications, FastMCP provides Rails-style class names to better fit with Rails conventions:
- ActionTool::Base - an alias for FastMcp::Tool
- ActionResource::Base - an alias for FastMcp::Resource
These are automatically set up in Rails applications. You can use either naming convention in your code:
# Using Rails-style naming:
class MyTool < ActionTool::Base
  description "My awesome tool"

  arguments do
    required(:input).filled(:string)
  end

  def call(input:)
    # Your implementation
  end
end

# Using standard FastMcp naming:
class AnotherTool < FastMcp::Tool
  # Both styles work interchangeably in Rails apps
end
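Resources follow the same pattern. A minimal sketch of a Rails-style resource, mirroring the FastMcp::Resource example above (the class name, URI, and payload are illustrative):

# Rails-style resource; ActionResource::Base is an alias for FastMcp::Resource
class ServerStatusResource < ActionResource::Base
  uri "data/server_status" # illustrative URI
  resource_name "Server Status"
  mime_type "application/json"

  def content
    # Illustrative payload; any JSON string works here
    JSON.generate(status: "ok", checked_at: Time.now.to_i)
  end
end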
When creating new tools or resources, the generators will use the Rails naming convention by default:
# app/tools/application_tool.rb
class ApplicationTool < ActionTool::Base
  # Base methods for all tools
end

# app/resources/application_resource.rb
class ApplicationResource < ActionResource::Base
  # Base methods for all resources
end
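With those base classes in place, a concrete tool only needs to live under app/tools to be picked up by the ApplicationTool.descendants registration in the initializer above. A minimal sketch (the class name and behavior are illustrative):

# app/tools/greet_user_tool.rb (hypothetical example)
class GreetUserTool < ApplicationTool
  description "Greet a user by name"

  arguments do
    required(:name).filled(:string).description("Name of the person to greet")
  end

  def call(name:)
    "Hello, #{name}!"
  end
end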
I'll let you check out the dedicated Sinatra integration docs. Here is a complete standalone example:
require 'fast_mcp'

# Create an MCP server
server = FastMcp::Server.new(name: 'my-ai-server', version: '1.0.0')

# Define a tool by inheriting from FastMcp::Tool
class SummarizeTool < FastMcp::Tool
  description "Summarize a given text"

  arguments do
    required(:text).filled(:string).description("Text to summarize")
    optional(:max_length).filled(:integer).description("Maximum length of summary")
  end

  def call(text:, max_length: 100)
    # Your summarization logic here
    text.split('.').first(3).join('.') + '...'
  end
end

# Register the tool with the server
server.register_tool(SummarizeTool)

# Create a resource by inheriting from FastMcp::Resource
class StatisticsResource < FastMcp::Resource
  uri "data/statistics"
  resource_name "Usage Statistics"
  description "Current system statistics"
  mime_type "application/json"

  def content
    JSON.generate({
      users_online: 120,
      queries_per_minute: 250,
      popular_topics: ["Ruby", "AI", "WebDev"]
    })
  end
end

# Register the resource with the server
server.register_resource(StatisticsResource)

# Start the server
server.start
The MCP team has developed a very useful inspector that you can use to validate your implementation. I suggest using the examples provided with this project as easy boilerplate: clone this project, then give it a go!
npx @modelcontextprotocol/inspector examples/server_with_stdio_transport.rb
Or to test with an SSE transport using a Rack middleware:
npx @modelcontextprotocol/inspector examples/rack_middleware.rb
Or to test over SSE with an authenticated Rack middleware:
npx @modelcontextprotocol/inspector examples/authenticated_rack_middleware.rb
You can test your custom implementation with the official MCP inspector by using:
# Test with a stdio transport:
npx @modelcontextprotocol/inspector path/to/your_ruby_file.rb
# Test with an HTTP / SSE server. In the UI select SSE and input your address.
npx @modelcontextprotocol/inspector
# app.rb
require 'sinatra'
require 'fast_mcp'

use FastMcp::RackMiddleware.new(name: 'my-ai-server', version: '1.0.0') do |server|
  # Register tools and resources here
  server.register_tool(SummarizeTool)
end

get '/' do
  'Hello World!'
end
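Because the integration is plain Rack middleware, the same setup should work outside Sinatra too. A minimal sketch using a bare config.ru (the required file name and the inline handler are illustrative):

# config.ru - mounting Fast MCP in a plain Rack app (a sketch)
require 'fast_mcp'
require_relative 'summarize_tool' # hypothetical file defining SummarizeTool

use FastMcp::RackMiddleware.new(name: 'my-ai-server', version: '1.0.0') do |server|
  server.register_tool(SummarizeTool)
end

# Any Rack-compatible app can sit underneath the middleware
run ->(env) { [200, { 'Content-Type' => 'text/plain' }, ['Hello World!']] }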
Add your server to your Claude Desktop configuration at:
- macOS:
~/Library/Application Support/Claude/claude_desktop_config.json
- Windows:
%APPDATA%\Claude\claude_desktop_config.json
{
  "mcpServers": {
    "my-great-server": {
      "command": "ruby",
      "args": [
        "/Users/path/to/your/awesome/fast-mcp/server.rb"
      ]
    }
  }
}
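The server.rb the config points at can be as small as the complete example above. A minimal sketch, assuming a SummarizeTool like the one defined earlier (the required file name is illustrative):

# server.rb - minimal STDIO server for Claude Desktop (a sketch)
require 'fast_mcp'
require_relative 'summarize_tool' # hypothetical file defining SummarizeTool

server = FastMcp::Server.new(name: 'my-great-server', version: '1.0.0')
server.register_tool(SummarizeTool)
server.start # STDIO transport, which is what this Claude Desktop config expects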
Please refer to the configuring_mcp_clients guide for more details.
| Feature | Status |
|---|---|
| ✅ JSON-RPC 2.0 | Full implementation for communication |
| ✅ Tool Definition & Calling | Define and call tools with rich argument types |
| ✅ Resource Management | Create, read, update, and subscribe to resources |
| ✅ Transport Options | STDIO, HTTP, and SSE for flexible integration |
| ✅ Framework Integration | Rails, Sinatra, Hanami, and any Rack-compatible framework |
| ✅ Authentication | Secure your AI endpoints with token authentication (see the sketch below) |
| ✅ Schema Support | Full JSON Schema for tool arguments with validation |
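For token authentication in Rails, the authenticate and auth_token options from the initializer shown earlier are all that's needed. A minimal sketch (the app name and environment variable are illustrative):

# config/initializers/fast_mcp.rb - enabling token authentication
FastMcp.mount_in_rails(
  Rails.application,
  name: 'my-app',                   # illustrative name
  version: '1.0.0',
  authenticate: true,               # require a token on MCP requests
  auth_token: ENV['MCP_AUTH_TOKEN'] # illustrative: keep the token out of source control
) do |server|
  Rails.application.config.after_initialize do
    server.register_tools(*ApplicationTool.descendants)
  end
end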
Use cases:
- AI-powered Applications: Connect LLMs to your Ruby app's functionality
- Real-time Dashboards: Build dashboards with live AI-generated insights
- Microservice Communication: Use MCP as a clean protocol between services
- Interactive Documentation: Create AI-enhanced API documentation
- Chatbots and Assistants: Build AI assistants with access to your app's data
Documentation:
- Getting Started Guide
- Integration Guide
- Rails Integration
- Sinatra Integration
- Resources
- Tools
Check out the examples directory for more detailed examples:
- Basic Examples
- Web Integration
Requirements:
- Ruby 3.2+
We welcome contributions to Fast MCP! Here's how you can help:
- Fork the repository
- Create your feature branch (git checkout -b my-new-feature)
- Commit your changes (git commit -am 'Add some feature')
- Push to the branch (git push origin my-new-feature)
- Create a new Pull Request
Please read our Contributing Guide for more details.
This project is available as open source under the terms of the MIT License.
Thanks to:
- The Model Context Protocol team for creating the specification
- The Dry-Schema team for the argument validation library
- All contributors to this project
Similar Open Source Tools


fraim
Fraim is an AI-powered toolkit designed for security engineers to enhance their workflows by leveraging AI capabilities. It offers solutions to find, detect, fix, and flag vulnerabilities throughout the development lifecycle. The toolkit includes features like Risk Flagger for identifying risks in code changes, Code Security Analysis for context-aware vulnerability detection, and Infrastructure as Code Analysis for spotting misconfigurations in cloud environments. Fraim can be run as a CLI tool or integrated into Github Actions, making it a versatile solution for security teams and organizations looking to enhance their security practices with AI technology.

comfyui-web-viewer
The ComfyUI Web Viewer by vrch.ai is a real-time AI-generated interactive art framework that integrates realtime streaming into ComfyUI workflows. It supports keyboard control nodes, OSC control nodes, sound input nodes, and more, accessible from any device with a web browser. It enables real-time interaction with AI-generated content, ideal for interactive visual projects and enhancing ComfyUI workflows with efficient content management and display.

docs-mcp-server
The docs-mcp-server repository contains the server-side code for the documentation management system. It provides functionalities for managing, storing, and retrieving documentation files. Users can upload, update, and delete documents through the server. The server also supports user authentication and authorization to ensure secure access to the documentation system. Additionally, the server includes APIs for integrating with other systems and tools, making it a versatile solution for managing documentation in various projects and organizations.

aigne-doc-smith
AIGNE DocSmith is a powerful AI-driven documentation generation tool that automates the creation of detailed, structured, and multi-language documentation directly from source code. It intelligently analyzes codebase to generate a comprehensive document structure, populates content with high-quality AI-powered generation, supports seamless translation into 12+ languages, integrates with AIGNE Hub for large language models, offers Discuss Kit publishing, automatically updates documentation with source code changes, and allows for individual document optimization.

action_mcp
Action MCP is a powerful tool for managing and automating your cloud infrastructure. It provides a user-friendly interface to easily create, update, and delete resources on popular cloud platforms. With Action MCP, you can streamline your deployment process, reduce manual errors, and improve overall efficiency. The tool supports various cloud providers and offers a wide range of features to meet your infrastructure management needs. Whether you are a developer, system administrator, or DevOps engineer, Action MCP can help you simplify and optimize your cloud operations.

llmgateway
The llmgateway repository is a tool that provides a gateway for interacting with various LLM (Large Language Model) models. It allows users to easily access and utilize pre-trained language models for tasks such as text generation, sentiment analysis, and language translation. The tool simplifies the process of integrating LLMs into applications and workflows, enabling developers to leverage the power of state-of-the-art language models for various natural language processing tasks.

MetaGPT
MetaGPT is a multi-agent framework that enables GPT to work in a software company, collaborating to tackle more complex tasks. It assigns different roles to GPTs to form a collaborative entity for complex tasks. MetaGPT takes a one-line requirement as input and outputs user stories, competitive analysis, requirements, data structures, APIs, documents, etc. Internally, MetaGPT includes product managers, architects, project managers, and engineers. It provides the entire process of a software company along with carefully orchestrated SOPs. MetaGPT's core philosophy is "Code = SOP(Team)", materializing SOP and applying it to teams composed of LLMs.

search_with_ai
Build your own conversation-based search with AI, a simple implementation with Node.js & Vue3. Live Demo Features: * Built-in support for LLM: OpenAI, Google, Lepton, Ollama(Free) * Built-in support for search engine: Bing, Sogou, Google, SearXNG(Free) * Customizable pretty UI interface * Support dark mode * Support mobile display * Support local LLM with Ollama * Support i18n * Support Continue Q&A with contexts.

RooFlow
RooFlow is a VS Code extension that enhances AI-assisted development by providing persistent project context and optimized mode interactions. It reduces token consumption and streamlines workflow by integrating Architect, Code, Test, Debug, and Ask modes. The tool simplifies setup, offers real-time updates, and provides clearer instructions through YAML-based rule files. It includes components like Memory Bank, System Prompts, VS Code Integration, and Real-time Updates. Users can install RooFlow by downloading specific files, placing them in the project structure, and running an insert-variables script. They can then start a chat, select a mode, interact with Roo, and use the 'Update Memory Bank' command for synchronization. The Memory Bank structure includes files for active context, decision log, product context, progress tracking, and system patterns. RooFlow features persistent context, real-time updates, mode collaboration, and reduced token consumption.

chunkr
Chunkr is an open-source document intelligence API that provides a production-ready service for document layout analysis, OCR, and semantic chunking. It allows users to convert PDFs, PPTs, Word docs, and images into RAG/LLM-ready chunks. The API offers features such as layout analysis, OCR with bounding boxes, structured HTML and markdown output, and VLM processing controls. Users can interact with Chunkr through a Python SDK, enabling them to upload documents, process them, and export results in various formats. The tool also supports self-hosted deployment options using Docker Compose or Kubernetes, with configurations for different AI models like OpenAI, Google AI Studio, and OpenRouter. Chunkr is dual-licensed under the GNU Affero General Public License v3.0 (AGPL-3.0) and a commercial license, providing flexibility for different usage scenarios.

eliza
Eliza is a versatile AI agent operating system designed to support various models and connectors, enabling users to create chatbots, autonomous agents, handle business processes, create video game NPCs, and engage in trading. It offers multi-agent and room support, document ingestion and interaction, retrievable memory and document store, and extensibility to create custom actions and clients. Eliza is easy to use and provides a comprehensive solution for AI agent development.

Pixelle-MCP
Pixelle-MCP is a multi-channel publishing tool designed to streamline the process of publishing content across various social media platforms. It allows users to create, schedule, and publish posts simultaneously on platforms such as Facebook, Twitter, and Instagram. With a user-friendly interface and advanced scheduling features, Pixelle-MCP helps users save time and effort in managing their social media presence. The tool also provides analytics and insights to track the performance of posts and optimize content strategy. Whether you are a social media manager, content creator, or digital marketer, Pixelle-MCP is a valuable tool to enhance your online presence and engage with your audience effectively.

openai-kotlin
OpenAI Kotlin API client is a Kotlin client for OpenAI's API with multiplatform and coroutines capabilities. It allows users to interact with OpenAI's API using Kotlin programming language. The client supports various features such as models, chat, images, embeddings, files, fine-tuning, moderations, audio, assistants, threads, messages, and runs. It also provides guides on getting started, chat & function call, file source guide, and assistants. Sample apps are available for reference, and troubleshooting guides are provided for common issues. The project is open-source and licensed under the MIT license, allowing contributions from the community.

code2prompt
Code2Prompt is a powerful command-line tool that generates comprehensive prompts from codebases, designed to streamline interactions between developers and Large Language Models (LLMs) for code analysis, documentation, and improvement tasks. It bridges the gap between codebases and LLMs by converting projects into AI-friendly prompts, enabling users to leverage AI for various software development tasks. The tool offers features like holistic codebase representation, intelligent source tree generation, customizable prompt templates, smart token management, Gitignore integration, flexible file handling, clipboard-ready output, multiple output options, and enhanced code readability.

gpt-computer-assistant
GPT Computer Assistant (GCA) is an open-source framework designed to build vertical AI agents that can automate tasks on Windows, macOS, and Ubuntu systems. It leverages the Model Context Protocol (MCP) and its own modules to mimic human-like actions and achieve advanced capabilities. With GCA, users can empower themselves to accomplish more in less time by automating tasks like updating dependencies, analyzing databases, and configuring cloud security settings.
For similar tasks


flute
FLUTE (Flexible Lookup Table Engine for LUT-quantized LLMs) is a tool designed for uniform quantization and lookup table quantization of weights in lower-precision intervals. It offers flexibility in mapping intervals to arbitrary values through a lookup table. FLUTE supports various quantization formats such as int4, int3, int2, fp4, fp3, fp2, nf4, nf3, nf2, and even custom tables. The tool also introduces new quantization algorithms like Learned Normal Float (NFL) for improved performance and calibration data learning. FLUTE provides benchmarks, model zoo, and integration with frameworks like vLLM and HuggingFace for easy deployment and usage.

KernelBench
KernelBench is a benchmark tool designed to evaluate Large Language Models' (LLMs) ability to generate GPU kernels. It focuses on transpiling operators from PyTorch to CUDA kernels at different levels of granularity. The tool categorizes problems into four levels, ranging from single-kernel operators to full model architectures, and assesses solutions based on compilation, correctness, and speed. The repository provides a structured directory layout, setup instructions, usage examples for running single or multiple problems, and upcoming roadmap features like additional GPU platform support and integration with other frameworks.
For similar jobs

sweep
Sweep is an AI junior developer that turns bugs and feature requests into code changes. It automatically handles developer experience improvements like adding type hints and improving test coverage.

teams-ai
The Teams AI Library is a software development kit (SDK) that helps developers create bots that can interact with Teams and Microsoft 365 applications. It is built on top of the Bot Framework SDK and simplifies the process of developing bots that interact with Teams' artificial intelligence capabilities. The SDK is available for JavaScript/TypeScript, .NET, and Python.

ai-guide
This guide is dedicated to Large Language Models (LLMs) that you can run on your home computer. It assumes your PC is a lower-end, non-gaming setup.

classifai
Supercharge WordPress Content Workflows and Engagement with Artificial Intelligence. Tap into leading cloud-based services like OpenAI, Microsoft Azure AI, Google Gemini and IBM Watson to augment your WordPress-powered websites. Publish content faster while improving SEO performance and increasing audience engagement. ClassifAI integrates Artificial Intelligence and Machine Learning technologies to lighten your workload and eliminate tedious tasks, giving you more time to create original content that matters.

chatbot-ui
Chatbot UI is an open-source AI chat app that allows users to create and deploy their own AI chatbots. It is easy to use and can be customized to fit any need. Chatbot UI is perfect for businesses, developers, and anyone who wants to create a chatbot.

BricksLLM
BricksLLM is a cloud native AI gateway written in Go. Currently, it provides native support for OpenAI, Anthropic, Azure OpenAI and vLLM. BricksLLM aims to provide enterprise level infrastructure that can power any LLM production use cases. Here are some use cases for BricksLLM: * Set LLM usage limits for users on different pricing tiers * Track LLM usage on a per user and per organization basis * Block or redact requests containing PIIs * Improve LLM reliability with failovers, retries and caching * Distribute API keys with rate limits and cost limits for internal development/production use cases * Distribute API keys with rate limits and cost limits for students

uAgents
uAgents is a Python library developed by Fetch.ai that allows for the creation of autonomous AI agents. These agents can perform various tasks on a schedule or take action on various events. uAgents are easy to create and manage, and they are connected to a fast-growing network of other uAgents. They are also secure, with cryptographically secured messages and wallets.

griptape
Griptape is a modular Python framework for building AI-powered applications that securely connect to your enterprise data and APIs. It offers developers the ability to maintain control and flexibility at every step. Griptape's core components include Structures (Agents, Pipelines, and Workflows), Tasks, Tools, Memory (Conversation Memory, Task Memory, and Meta Memory), Drivers (Prompt and Embedding Drivers, Vector Store Drivers, Image Generation Drivers, Image Query Drivers, SQL Drivers, Web Scraper Drivers, and Conversation Memory Drivers), Engines (Query Engines, Extraction Engines, Summary Engines, Image Generation Engines, and Image Query Engines), and additional components (Rulesets, Loaders, Artifacts, Chunkers, and Tokenizers). Griptape enables developers to create AI-powered applications with ease and efficiency.