rig

⚙️🦀 Build modular and scalable LLM Applications in Rust

Rig is a Rust library designed for building scalable, modular, and user-friendly applications powered by large language models (LLMs). It provides full support for LLM completion and embedding workflows, along with simple yet powerful abstractions over LLM providers like OpenAI and Cohere and over vector stores such as MongoDB and in-memory storage. With Rig, you can integrate LLMs into your applications with minimal boilerplate code.

📑 Docs   •   🌐 Website   •   🤝 Contribute   •   ✍🏽 Blogs

✨ If you would like to help spread the word about Rig, please consider starring the repo!

[!WARNING] Here be dragons! As we plan to ship a torrent of features in the following months, future updates will contain breaking changes. With Rig evolving, we'll annotate changes and highlight migration paths as we encounter them.


What is Rig?

Rig is a Rust library for building scalable, modular, and ergonomic LLM-powered applications.

More information about this crate can be found in the official documentation (docs.rig.rs) and in the crate's API reference on docs.rs.

High-level features

  • Full support for LLM completion and embedding workflows (see the embedding sketch after this list)
  • Simple but powerful common abstractions over LLM providers (e.g. OpenAI, Cohere) and vector stores (e.g. MongoDB, SQLite, in-memory)
  • Integrate LLMs in your app with minimal boilerplate
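
To illustrate the embedding side of the workflow, here is a minimal sketch. It assumes an OpenAI embedding model obtained from the provider client and rig-core's EmbeddingsBuilder; the exact method and constant names (embedding_model, TEXT_EMBEDDING_ADA_002, simple_document) differ between crate versions, so treat this as a shape rather than a drop-in snippet.

use rig::{embeddings::EmbeddingsBuilder, providers::openai};

#[tokio::main]
async fn main() {
    // Requires the `OPENAI_API_KEY` environment variable, as in the completion example below.
    let openai_client = openai::Client::from_env();

    // Pick an embedding model exposed by the provider client.
    let model = openai_client.embedding_model(openai::TEXT_EMBEDDING_ADA_002);

    // Embed a small batch of documents in one call.
    let embeddings = EmbeddingsBuilder::new(model)
        .simple_document("doc1", "Rig is a Rust library for building LLM applications.")
        .simple_document("doc2", "Vector stores hold document embeddings for retrieval.")
        .build()
        .await
        .expect("Failed to embed documents");

    println!("Embedded {} documents", embeddings.len());
}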

Who is using Rig in production?

Below is a non-exhaustive list of companies and people who are using Rig in production:

  • Dria Compute Node - a node that serves computation results within the Dria Knowledge Network
  • The MCP Rust SDK - the official Model Context Protocol Rust SDK. It includes an example of using it with Rig.
  • Probe - an AI-friendly, fully local semantic code search tool.
  • NINE - Neural Interconnected Nodes Engine, by Nethermind.
  • rig-onchain-kit - the Rig Onchain Kit. Intended to make interactions between Solana/EVM and Rig much easier to implement.
  • Linera Protocol - Decentralized blockchain infrastructure designed for highly scalable, secure, low-latency Web3 applications.
  • Listen - a framework that aims to be the go-to toolkit for AI portfolio-management agents; it powers the Listen app.

Are you also using Rig in production? Open an issue to have your name added!

Get Started

cargo add rig-core

Simple example:

use rig::{completion::Prompt, providers::openai};

#[tokio::main]
async fn main() {
    // Create OpenAI client and model
    // This requires the `OPENAI_API_KEY` environment variable to be set.
    let openai_client = openai::Client::from_env();

    let gpt4 = openai_client.agent("gpt-4").build();

    // Prompt the model and print its response
    let response = gpt4
        .prompt("Who are you?")
        .await
        .expect("Failed to prompt GPT-4");

    println!("GPT-4: {response}");
}

Note: using #[tokio::main] requires you to enable tokio's macros and rt-multi-thread features, or just full to enable all features (cargo add tokio --features macros,rt-multi-thread).
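
For reference, the resulting Cargo.toml dependencies for the example above look roughly like this (version numbers are illustrative; cargo add will pin current releases for you):

[dependencies]
# Versions below are illustrative; `cargo add` resolves the current ones.
rig-core = "0.5"
tokio = { version = "1", features = ["macros", "rt-multi-thread"] }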

You can find more examples in each crate's examples directory (i.e. rig-core/examples). More detailed use-case walkthroughs are regularly published on our Dev.to blog and added to Rig's official documentation (docs.rig.rs).

Supported Integrations

Vector stores are available as separate companion crates, and a number of additional model providers ship as companion crates as well. The core abstractions stay the same regardless of which integration you pull in; a short sketch of adding one follows.
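
As a concrete sketch, pulling in an integration is just a matter of adding the companion crate next to rig-core (rig-mongodb is used here as an example; check crates.io for the current list of rig-* companion crates):

cargo add rig-core rig-mongodb

Each companion crate exposes a provider or vector store type that plugs into the same abstractions rig-core uses, so the application code stays largely unchanged when you swap integrations.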


