chatlas

Your friendly guide to building LLM chat apps in Python with less effort and more clarity.

Chatlas is a Python library that provides a simple, unified interface across large language model providers. It helps you prototype faster by abstracting away the complexity of streaming chat, tool calling, and structured output. You can switch providers by changing a single line of code, while still reaching provider-specific features when needed. Chatlas emphasizes developer experience, with type hints, rich console output, and extension points.
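
For example, switching providers is typically a one-line change. A minimal sketch (the model names here are illustrative):

from chatlas import ChatAnthropic, ChatOpenAI

# The Chat interface is the same across providers; swapping the
# constructor (and model name) is the only change needed.
chat = ChatOpenAI(model="gpt-4.1-mini")
# chat = ChatAnthropic(model="claude-sonnet-4-0")

chat.chat("Hello!")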

README:

Your friendly guide to building LLM chat apps in Python with less effort and more clarity.

Install

Install the latest stable release from PyPI:

pip install -U chatlas

Or, install the latest development version from GitHub:

pip install -U git+https://github.com/posit-dev/chatlas

Quick start

Get started in 3 simple steps:

  1. Choose a model provider, such as ChatOpenAI or ChatAnthropic.
  2. Visit the provider's reference page to get set up with the necessary credentials.
  3. Create the relevant Chat client and start chatting!

from chatlas import ChatOpenAI

# Optional (but recommended) model and system_prompt
chat = ChatOpenAI(
    model="gpt-4.1-mini",
    system_prompt="You are a helpful assistant.",
)

# Optional tool registration
def get_current_weather(lat: float, lng: float):
    "Get the current weather for a given location."
    return "sunny"

chat.register_tool(get_current_weather)

# Send user prompt to the model for a response.
chat.chat("How's the weather in San Francisco?")

(Screenshot: the model's response to the query "How's the weather in San Francisco?")
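
Continuing the example above, chatlas can also stream responses instead of waiting for the full reply. A minimal sketch, assuming Chat.stream() yields text chunks:

# Iterate over the response as it is generated, printing each chunk.
for chunk in chat.stream("Write a haiku about the fog in San Francisco."):
    print(chunk, end="")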

Learn more at https://posit-dev.github.io/chatlas
