lanarky

The web framework for building LLM microservices

Lanarky is a Python web framework for building microservices with Large Language Models (LLMs). It is LLM-first and unopinionated, built on top of FastAPI (so its features will feel familiar to FastAPI users), supports streaming over both HTTP and WebSockets, and guarantees zero vendor lock-in with any LLM tooling framework or cloud provider. It is open-source and free to use. The project is now in maintenance mode, with no active development planned, but community contributions are encouraged.

README:

⚠️ Disclaimer: This project is now in maintenance mode. I won't be adding new features or actively maintaining the project as I have moved on to other projects and priorities. While I will address critical bugs and security issues as needed, active development has ceased from my end. I do encourage the community to continue to contribute to the project if they find it useful. Thank you for using lanarky!

Lanarky is a Python (3.9+) web framework for developers who want to build microservices using LLMs. Here are some of its key features:

  • LLM-first: Unlike other web frameworks, Lanarky is built specifically for LLM developers. It is unopinionated about how you build your microservices and guarantees zero vendor lock-in with any LLM tooling framework or cloud provider.
  • Fast & Modern: Built on top of FastAPI, Lanarky offers all the FastAPI features you know and love. If you are new to FastAPI, visit fastapi.tiangolo.com to learn more.
  • Streaming: Streaming is essential for many real-time LLM applications, like chatbots. Lanarky has you covered with built-in streaming support over HTTP and WebSockets.
  • Open-source: Lanarky is open-source and free to use. Forever.
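Streaming over HTTP is typically delivered as server-sent events (SSE), where each token is sent as a `data:` line as soon as it is available. As a rough, framework-agnostic sketch of what a token stream looks like on the wire (this illustrates the SSE format only, not Lanarky's internals):

```python
def to_sse(tokens):
    """Format an iterable of tokens as server-sent events (SSE).

    Each event is a `data:` line followed by a blank line, which is how a
    streaming HTTP response delivers tokens to the client incrementally.
    """
    for token in tokens:
        yield f"data: {token}\n\n"


# Example: stream three tokens the way a chatbot response might arrive.
events = list(to_sse(["Hello", ",", " world"]))
```

A real streaming endpoint would yield these events from an HTTP response body with the `text/event-stream` content type instead of collecting them into a list.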

To learn more about Lanarky and get started, see the full documentation at lanarky.ajndkr.com.

Installation

The library is available on PyPI and can be installed via pip:

pip install lanarky

Getting Started

Lanarky provides a powerful abstraction layer to allow developers to build simple LLM microservices in just a few lines of code.

Here's an example that builds a simple microservice using OpenAI's ChatCompletion service:

from lanarky import Lanarky
from lanarky.adapters.openai.resources import ChatCompletionResource
from lanarky.adapters.openai.routing import OpenAIAPIRouter

app = Lanarky()  # the Lanarky application, built on top of FastAPI
router = OpenAIAPIRouter()  # router that wires endpoints to OpenAI resources


@router.post("/chat")
def chat(stream: bool = True) -> ChatCompletionResource:
    # The system prompt sets the assistant's persona; the `stream` query
    # parameter toggles between streaming and non-streaming responses.
    system = "You are a sassy assistant"
    return ChatCompletionResource(stream=stream, system=system)


app.include_router(router)
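Since Lanarky is built on FastAPI, the app can be served with any ASGI server, such as uvicorn. A hedged sketch, assuming the example above is saved as app.py; the request payload shown here is an illustrative guess, so see the Getting Started tutorial for the exact shape:

```shell
# The OpenAI adapter needs an API key in the environment.
export OPENAI_API_KEY="sk-..."

# Serve the app with uvicorn.
uvicorn app:app --reload

# In another terminal, query the endpoint; -N disables curl's buffering
# so streamed tokens appear as they arrive.
curl -N -X POST "http://localhost:8000/chat?stream=true" \
  -H "Content-Type: application/json" \
  -d '{"messages": [{"role": "user", "content": "Hi there!"}]}'
```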

Visit Getting Started for the full tutorial on building and testing your first LLM microservice with Lanarky.

Contributing

Contributions are more than welcome! If you have an idea for a new feature or want to help improve lanarky, please create an issue or submit a pull request on GitHub.

See CONTRIBUTING.md for more information.

See Lanarky Roadmap for the list of new features and future milestones.

License

The library is released under the MIT License.
