rag-chat

Batteries included SDK for RAG development.

Stars: 154

The `@upstash/rag-chat` package simplifies the development of retrieval-augmented generation (RAG) chat applications. It offers Next.js compatibility with streaming support, a built-in vector store, optional Redis support for fast chat history management, rate limiting, and a disableRag option. After setting a few environment variables, users can initialize RAGChat to interact with AI models, manage the knowledge base and chat history, and enable debugging. Advanced configuration options allow customizing the RAGChat instance with built-in rate limiting, observability via Helicone, and integration with Next.js route handlers and the Vercel AI SDK. The package supports OpenAI models, Upstash-hosted models, and custom providers such as TogetherAI and Replicate.
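
For illustration, here is a rough sketch of the provider options mentioned above. The `openai` helper appears in the README below; the `custom` helper and its option fields are assumptions based on this description and are not verified against the current API:

import { RAGChat, custom, openai } from "@upstash/rag-chat";

// OpenAI-compatible model (requires OPENAI_API_KEY)
const withOpenAI = new RAGChat({
  model: openai("gpt-4-turbo"),
});

// Custom provider such as TogetherAI (helper name and fields are assumptions)
const withTogether = new RAGChat({
  model: custom("meta-llama/Meta-Llama-3-8B-Instruct", {
    apiKey: process.env.TOGETHER_AI_API_KEY,
    baseUrl: "https://api.together.xyz/v1",
  }),
});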

README:

Upstash RAG Chat SDK

The `@upstash/rag-chat` package makes it easy to develop powerful retrieval-augmented generation (RAG) chat applications with minimal setup and configuration.

Features:

  • Next.js compatibility with streaming support
  • Ingest entire websites, PDFs and more out of the box
  • Built-in Vector store for your knowledge base
  • (Optional) built-in Redis compatibility for fast chat history management
  • (Optional) built-in rate limiting
  • (Optional) disableRag option to use it as a plain LLM with chat history
  • (Optional) Analytics via Helicone, Langsmith and Cloudflare AI Gateway
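
The optional pieces above are configured on the RAGChat instance. A minimal sketch, assuming a `ratelimit` constructor option that accepts an @upstash/ratelimit limiter and a `debug` flag for verbose logging (both option names are assumptions based on the feature descriptions):

import { RAGChat, openai } from "@upstash/rag-chat";
import { Ratelimit } from "@upstash/ratelimit";
import { Redis } from "@upstash/redis";

export const ragChat = new RAGChat({
  model: openai("gpt-4-turbo"),
  // Assumed option: throttle chat calls using @upstash/ratelimit
  ratelimit: new Ratelimit({
    redis: Redis.fromEnv(),
    limiter: Ratelimit.slidingWindow(10, "10 s"),
  }),
  // Assumed option: log retrieval and prompt details while developing
  debug: true,
});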

Getting started

Installation

Install the package using your preferred package manager:

pnpm add @upstash/rag-chat

bun add @upstash/rag-chat

npm i @upstash/rag-chat

Quick start

  1. Set up your environment variables:
UPSTASH_VECTOR_REST_URL="XXXXX"
UPSTASH_VECTOR_REST_TOKEN="XXXXX"


# if you use OpenAI compatible models
OPENAI_API_KEY="XXXXX"

# or if you use Upstash hosted models
QSTASH_TOKEN="XXXXX"

# Optional: For Redis-based chat history (default is in-memory)
UPSTASH_REDIS_REST_URL="XXXXX"
UPSTASH_REDIS_REST_TOKEN="XXXXX"
  2. Initialize and use RAGChat:
import { RAGChat } from "@upstash/rag-chat";

const ragChat = new RAGChat();

const response = await ragChat.chat("Tell me about machine learning");
console.log(response);
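
Streaming is also available for incremental rendering. Continuing from the snippet above, a hedged sketch assuming a per-call `streaming` option and that the resulting `output` is an async-iterable stream of text chunks (both are assumptions based on the feature list):

// Assumed option: stream the answer instead of returning it in one piece
const streamed = await ragChat.chat("Tell me about machine learning", {
  streaming: true,
});

// Assuming streamed.output is an async-iterable stream of text chunks
for await (const chunk of streamed.output) {
  process.stdout.write(chunk);
}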

Basic Usage

import { RAGChat, openai } from "@upstash/rag-chat";

export const ragChat = new RAGChat({
  model: openai("gpt-4-turbo"),
});

await ragChat.context.add({
  type: "text",
  data: "The speed of light is approximately 299,792,458 meters per second.",
});

await ragChat.context.add({
  type: "pdf",
  fileSource: "./data/physics_basics.pdf",
});
const response = await ragChat.chat("What is the speed of light?");

console.log(response.output);
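
The feature list also mentions ingesting entire websites. A sketch of what that might look like, assuming an `html` context type that takes a URL source (the type name and field are assumptions based on the feature description):

// Assumed context type: crawl and index a web page by URL
await ragChat.context.add({
  type: "html",
  source: "https://en.wikipedia.org/wiki/Speed_of_light",
});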

Docs

Check out the documentation for integrations and advanced options.
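
As a pointer toward the Next.js integration mentioned above, here is a rough sketch, assuming a streaming chat option and an `aiUseChatAdapter` helper exported from `@upstash/rag-chat/nextjs` that bridges to the Vercel AI SDK's useChat hook (both are assumptions; consult the documentation for the actual integration API):

// app/api/chat/route.ts (Next.js App Router; paths are hypothetical)
import { aiUseChatAdapter } from "@upstash/rag-chat/nextjs"; // assumed helper
import { ragChat } from "../../../lib/rag-chat"; // the instance from "Basic Usage"

export async function POST(req: Request) {
  const { messages } = await req.json();
  const question = messages.at(-1)?.content ?? "";

  // Assumed: stream the answer and adapt it for the Vercel AI SDK
  const response = await ragChat.chat(question, { streaming: true });
  return aiUseChatAdapter(response);
}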
