llm-on-openshift

Resources, demos, recipes... to work with LLMs on OpenShift with OpenShift AI or Open Data Hub.


This repository provides resources, demos, and recipes for working with Large Language Models (LLMs) on OpenShift using OpenShift AI or Open Data Hub. It includes instructions for deploying standalone inference servers for LLMs, such as vLLM, Hugging Face Text Generation Inference (TGI), Caikit-TGIS-Serving, and Ollama, as well as guidance on importing serving runtimes, such as the vLLM Serving Runtime and Hugging Face TGI, into the Single-Model Serving stack of Open Data Hub or OpenShift AI. It also covers databases that can be used as a vector store for Retrieval Augmented Generation (RAG) applications, including Milvus, PostgreSQL+pgvector, and Redis, and it provides inference and application examples covering Caikit, Langchain, Langflow, and UI examples.

README:

LLM on OpenShift

In this repo you will find resources, demos, recipes... to work with LLMs on OpenShift with OpenShift AI or Open Data Hub.

Content

Inference Servers

The following Inference Servers for LLMs can be deployed standalone on OpenShift:

  • vLLM
  • Hugging Face Text Generation Inference (TGI)
  • Caikit-TGIS-Serving
  • Ollama
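
Once one of these servers is deployed and exposed through a Route, it can be queried over plain HTTP. Below is a minimal sketch, assuming a vLLM deployment (which exposes an OpenAI-compatible API) behind a hypothetical Route URL and model name; adjust both for your own deployment.

```python
# Minimal sketch: query a standalone vLLM server through its OpenAI-compatible API.
# The Route URL and model name below are placeholders, not values from this repo.
import requests

INFERENCE_URL = "https://vllm-my-project.apps.example.com"  # hypothetical OpenShift Route
MODEL_NAME = "mistralai/Mistral-7B-Instruct-v0.2"           # whichever model the server loaded

response = requests.post(
    f"{INFERENCE_URL}/v1/completions",
    json={
        "model": MODEL_NAME,
        "prompt": "What is OpenShift AI?",
        "max_tokens": 128,
        "temperature": 0.1,
    },
    timeout=60,
)
response.raise_for_status()
print(response.json()["choices"][0]["text"])
```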

Serving Runtimes deployment

The following Runtimes can be imported into the Single-Model Serving stack of Open Data Hub or OpenShift AI:

  • vLLM Serving Runtime
  • Hugging Face Text Generation Inference (TGI)
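
Models deployed through these runtimes show up as KServe InferenceService resources. As a small sketch, assuming the kubernetes Python client and a hypothetical project name, you can list the deployed models and the URLs they expose:

```python
# Sketch: list the InferenceServices created by the Single-Model Serving stack
# and print the endpoint URL each one exposes. The namespace is a placeholder.
from kubernetes import client, config

config.load_kube_config()  # use config.load_incluster_config() when running in-cluster
api = client.CustomObjectsApi()

isvcs = api.list_namespaced_custom_object(
    group="serving.kserve.io",
    version="v1beta1",
    namespace="my-llm-project",  # hypothetical Data Science Project / namespace
    plural="inferenceservices",
)

for isvc in isvcs.get("items", []):
    name = isvc["metadata"]["name"]
    url = isvc.get("status", {}).get("url", "<not ready yet>")
    print(f"{name}: {url}")
```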

Vector Databases

The following Databases can be used as a Vector Store for Retrieval Augmented Generation (RAG) applications:

  • Milvus: Full recipe to deploy the Milvus vector store in standalone or cluster mode.
  • PostgreSQL+pgvector: Full recipe to create an instance of PostgreSQL with the pgvector extension, making it usable as a vector store (a usage sketch follows this list).
  • Redis: Full recipe to deploy Redis, create a cluster, and configure a database suitable for use as a vector store.
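
Whichever database you choose, the RAG pattern is the same: store document embeddings, then retrieve the nearest neighbours of a query embedding. A minimal sketch with PostgreSQL+pgvector follows; the connection details, table name, and vector size are placeholders, and the toy embeddings stand in for the output of a real embeddings model.

```python
# Sketch: store and search embeddings with PostgreSQL + pgvector.
# Host, credentials, table name and vector dimension are placeholders.
import psycopg2


def to_pgvector(vec):
    # pgvector's text input format is "[v1,v2,...]"
    return "[" + ",".join(str(x) for x in vec) + "]"


conn = psycopg2.connect(
    host="postgresql.my-llm-project.svc.cluster.local",  # hypothetical Service hostname
    dbname="vectordb",
    user="vectoruser",
    password="changeme",
)

with conn, conn.cursor() as cur:
    cur.execute("CREATE EXTENSION IF NOT EXISTS vector;")
    cur.execute(
        "CREATE TABLE IF NOT EXISTS documents "
        "(id bigserial PRIMARY KEY, content text, embedding vector(384));"
    )
    # In a real application the embedding comes from an embeddings model.
    cur.execute(
        "INSERT INTO documents (content, embedding) VALUES (%s, %s::vector);",
        ("OpenShift AI can serve LLMs.", to_pgvector([0.1] * 384)),
    )
    # Nearest-neighbour search with the L2 distance operator (<->).
    cur.execute(
        "SELECT content FROM documents ORDER BY embedding <-> %s::vector LIMIT 3;",
        (to_pgvector([0.1] * 384),),
    )
    for (content,) in cur.fetchall():
        print(content)
```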

Inference and application examples

  • Caikit: Basic example demonstrating how to work with Caikit+TGIS for LLM serving.
  • Langchain examples: Various notebooks demonstrating how to work with Langchain. Examples are provided for different types of LLM servers (standalone or using the Single-Model Serving stack of Open Data Hub or OpenShift AI) and different vector databases (a condensed sketch follows this list).
  • Langflow examples: Various examples demonstrating how to work with Langflow.
  • UI examples: Various examples showing how to create and deploy a UI to interact with your LLM.
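
The notebooks cover many combinations of model servers and vector stores; condensed into a hedged sketch, assuming an OpenAI-compatible endpoint (for example a vLLM deployment) and the langchain-openai package, the basic pattern looks roughly like this:

```python
# Sketch: point Langchain at an OpenAI-compatible LLM endpoint served on OpenShift.
# The Route URL, API key and model name are placeholders.
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    base_url="https://vllm-my-project.apps.example.com/v1",  # hypothetical Route
    api_key="EMPTY",  # a standalone vLLM server does not validate the key by default
    model="mistralai/Mistral-7B-Instruct-v0.2",
    temperature=0.1,
)

prompt = ChatPromptTemplate.from_messages(
    [
        ("system", "You answer questions about OpenShift concisely."),
        ("human", "{question}"),
    ]
)

chain = prompt | llm
print(chain.invoke({"question": "What is a ServingRuntime?"}).content)
```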
