llm-hosting-container

Large Language Model Hosting Container

Stars: 80


The LLM Hosting Container repository provides Dockerfiles and associated resources for building and hosting containers for large language models, most notably the HuggingFace Text Generation Inference (TGI) container. It lets users deploy and manage large language models in a containerized environment, enabling efficient inference for language-based applications.
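The repository itself only ships the container build resources, but as a rough illustration of what a running deployment looks like from the client side, the sketch below queries TGI's HTTP /generate endpoint with Python's requests library. The host, port (8080), prompt, and generation parameters are assumptions about a hypothetical local deployment, not values taken from this repository.

# Minimal sketch of calling a running TGI container's /generate endpoint.
# Assumes the container was started locally and mapped to port 8080;
# adjust the URL to match your own deployment.
import requests

TGI_URL = "http://localhost:8080/generate"  # assumed local endpoint

payload = {
    "inputs": "Explain what a container registry is in one sentence.",
    "parameters": {
        "max_new_tokens": 64,   # cap the length of the completion
        "temperature": 0.7,     # sampling temperature
    },
}

response = requests.post(TGI_URL, json=payload, timeout=60)
response.raise_for_status()

# TGI returns a JSON object with the completion under "generated_text".
print(response.json()["generated_text"])

The same endpoint is exposed regardless of which model the container serves, so the client code stays unchanged when the hosted model is swapped.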

README:

LLM Hosting Container

Welcome to the LLM Hosting Container GitHub repository!

This repository contains Dockerfiles and associated resources for building and hosting containers for large language models.

  • HuggingFace Text Generation Inference (TGI) container

Security

See CONTRIBUTING for more information.

License

This project is licensed under the Apache-2.0 License.
