langwatch

The ultimate LLM Ops platform - Monitoring, Analytics, Evaluations, Datasets and Prompt Optimization ✨


LangWatch is a monitoring and analytics platform designed to track, visualize, and analyze interactions with Large Language Models (LLMs). It offers real-time telemetry to optimize LLM cost and latency, a user-friendly interface for deep insights into LLM behavior, user analytics for engagement metrics, detailed debugging capabilities, and guardrails to monitor LLM outputs for issues like PII leaks and toxic language. The platform supports OpenAI and LangChain integrations, simplifying the process of tracing LLM calls and generating API keys for usage. LangWatch also provides documentation for easy integration and self-hosting options for interested users.
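
As an illustration, tracing calls made with the OpenAI Python client looks roughly like the sketch below. This is a minimal sketch, not a verified snippet: the langwatch.trace decorator, the autotrack_openai_calls helper, and the LANGWATCH_API_KEY environment variable are assumptions drawn from the LangWatch Python SDK docs and may differ between SDK versions.

# Minimal tracing sketch (SDK names are assumptions; verify against the current docs)
import langwatch                  # pip install langwatch; assumed to read LANGWATCH_API_KEY from the environment
from openai import OpenAI

client = OpenAI()                 # requires OPENAI_API_KEY in the environment

@langwatch.trace()                # assumed decorator: opens a LangWatch trace for this function call
def answer(question: str) -> str:
    # assumed helper: automatically captures every OpenAI call made inside this trace
    langwatch.get_current_trace().autotrack_openai_calls(client)
    completion = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": question}],
    )
    return completion.choices[0].message.content

print(answer("What does LangWatch monitor?"))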

README:


LangWatch - LLM Monitoring & Optimization Studio

LangWatch is a visual interface for DSPy and a complete LLM Ops platform for monitoring, experimenting, measuring and improving LLM pipelines, with a fair-code distribution model.

LangWatch Optimization Studio Screenshot

Demo

📺 Watch the short video (3 min) for a sneak peek of LangWatch and a brief introduction to the concepts.

Features

🎯 Optimization Studio

  • Drag-and-drop interface for LLM pipeline optimization
  • Built on Stanford's DSPy framework
  • Automatic prompt and few-shot example generation
  • Visual experiment tracking and version control

📊 Quality Assurance

  • 30+ off-the-shelf evaluators
  • Custom evaluation builder
  • Full dataset management
  • Compliance and safety checks
  • DSPy Visualizer

📈 Monitoring & Analytics

  • Cost and performance tracking
  • Real-time debugging and tracing details
  • User analytics and custom business metrics
  • Custom dashboards and alerts

LangWatch Cloud

The easiest way to get started is to sign up for a free account on LangWatch Cloud.

Getting Started (local setup)

You need Docker installed on your machine to run LangWatch locally.

Get started with:

git clone https://github.com/langwatch/langwatch.git
cp langwatch/.env.example langwatch/.env
cd langwatch
docker compose up --build

Then, open LangWatch at http://localhost:5560
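
If you want your own application's traces to go to this local instance instead of LangWatch Cloud, point the Python SDK at it before importing it. The environment variable names below (LANGWATCH_ENDPOINT and LANGWATCH_API_KEY) are assumptions based on common SDK configuration conventions; confirm the exact names in the documentation.

# Hypothetical sketch: route SDK traces to the local instance started above
import os

os.environ["LANGWATCH_ENDPOINT"] = "http://localhost:5560"      # assumed variable name for the API endpoint
os.environ["LANGWATCH_API_KEY"] = "<project API key from the local dashboard>"  # generated in the local UI

import langwatch   # assumed to pick up the endpoint and key from the environment at import time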

Development

You can also run LangWatch locally without Docker to develop and contribute to the project.

Start just the databases with Docker and leave them running:

docker compose up redis postgres opensearch

Then, in another terminal, install the dependencies and start LangWatch:

make install
make start

On-Prem (Self-Hosting)

LangWatch also offers commercial support for self-hosting on your own infrastructure. For more information, please refer to the Self-Hosting section of the documentation.

Contributing

Contributions are what make the open-source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.

Please read our Contribution Guidelines for details on our code of conduct, and the process for submitting pull requests.

Support

If you have questions or need help, join our community on Discord.
