
PySpur - Graph-Based Editor for LLM Workflows


Join Our Discord

https://github.com/user-attachments/assets/9128885b-47ba-4fc6-ab6b-d567f52e332c

✨ Core Benefits

Modular Building Blocks

https://github.com/user-attachments/assets/6442f0ad-86d8-43d9-aa70-e5c01e55e876

Debug at Node Level:

https://github.com/user-attachments/assets/6e82ad25-2a46-4c50-b030-415ea9994690

Evaluate Final Performance

https://github.com/user-attachments/assets/4dc2abc3-c6e6-4d6d-a5c3-787d518de7ae

Coming soon: Self-improvement

https://github.com/user-attachments/assets/5bef7a16-ef9f-4650-b385-4ea70fa54c8a

🕸️ Why PySpur?

  • Easy to hack: e.g., you can add a new workflow node by creating a single Python file.
  • Workflow graphs are stored as JSON configs, enabling easy sharing and version control.
  • Lightweight, with minimal dependencies, avoiding bloated LLM frameworks.
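
To illustrate the single-Python-file node idea, here is a hypothetical sketch of a custom node. The class shape, config object, and run() signature are illustrative assumptions, not PySpur's actual node API; check the node implementations in the repository for the real base class before relying on these names.

```python
# Hypothetical sketch of a custom workflow node. The class name, config
# object, and run() signature are illustrative assumptions, not PySpur's
# real node API.
from dataclasses import dataclass


@dataclass
class UppercaseNodeConfig:
    field: str = "text"  # which input field to transform


class UppercaseNode:
    """Reads one input field and emits its uppercased value."""

    def __init__(self, config: UppercaseNodeConfig):
        self.config = config

    def run(self, inputs: dict) -> dict:
        value = inputs[self.config.field]
        return {self.config.field: value.upper()}


node = UppercaseNode(UppercaseNodeConfig())
print(node.run({"text": "hello spur"}))  # {'text': 'HELLO SPUR'}
```

The point of the pattern is that the node is self-describing: its config and its run logic live in one file, so dropping that file into the nodes directory is all the registration the claim above implies.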

⚡ Quick start

You can launch PySpur using pre-built Docker images by following these steps:

  1. Clone the repository:

    git clone https://github.com/PySpur-com/pyspur.git
    cd pyspur
  2. Create a .env file:

    Create a .env file at the root of the project. You may use .env.example as a starting point.

    cp .env.example .env

    Please go through the .env file and change configs wherever necessary. If you plan to use third-party model providers, add their API keys to the .env file in this step.

  3. Start the docker services:

    docker compose -f ./docker-compose.prod.yml up --build -d

    This will start a local instance of PySpur that will store spurs and other state information in a postgres database. A local postgres service is used by default. Override POSTGRES_* variables in the .env file to use an external postgres database.

  4. Access the portal:

    Go to http://localhost:6080/ in your browser.

Setup is complete. Click on "New Spur" to create a workflow, or start with one of the stock templates.
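
Under the hood, each spur is stored as a JSON graph config, which is what makes spurs easy to diff, share, and version-control. A purely illustrative sketch — the node type names and field names below are assumptions, not the actual schema:

```json
{
  "nodes": [
    {"id": "input_1", "type": "InputNode", "config": {}},
    {"id": "llm_1", "type": "LLMNode", "config": {"model": "gpt-4o-mini"}}
  ],
  "links": [
    {"source": "input_1", "target": "llm_1"}
  ]
}
```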

  5. [Optional] Manage your LLM provider keys from the app:

    Once the PySpur app is running, you can manage your LLM provider keys through the portal:

    image

    Select API keys tab

    image

    Enter your provider's key and click save (the save button appears after you add or modify a key)

    image

🛠️ PySpur Development Setup

[ Instructions for development on Unix-like systems. Development on Windows/PC not tested ]

The steps for dev setup are the same as above, except for step 3: we launch the app in dev mode instead.

  3. Start the docker services:

    docker compose up --build -d

    This will start a local instance of PySpur that will store spurs and other state information in a postgres database. A local postgres service is used by default. Override POSTGRES_* variables in the .env file to use an external postgres database.

🦙 Using PySpur with Ollama (Local Models)

PySpur can work with local models served using Ollama.

Follow these steps to configure PySpur to work with Ollama running on the same host.

1. Configure Ollama

To ensure the Ollama API is reachable from PySpur, we need to start the Ollama service with the environment variable OLLAMA_HOST=0.0.0.0. This allows requests coming from PySpur's Docker bridge network to get through to Ollama. An easy way to do this is to launch the Ollama service with the following command:

OLLAMA_HOST="0.0.0.0" ollama serve

2. Update the PySpur .env file

Next, we need to update the OLLAMA_BASE_URL environment value in the .env file. If your Ollama port is 11434 (the default), the entry in the .env file should look like this:

OLLAMA_BASE_URL=http://host.docker.internal:11434 

(Please make sure there is no trailing slash at the end!)

In PySpur's set up, host.docker.internal refers to the host machine where both PySpur and Ollama are running.
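
As a quick sanity check on the trailing-slash rule above, a small hypothetical helper (not part of PySpur itself) that parses the OLLAMA_BASE_URL line from .env and strips any trailing slash might look like this:

```python
# Hypothetical helper for validating the OLLAMA_BASE_URL entry in .env;
# not part of PySpur itself.
def check_ollama_base_url(line: str) -> str:
    key, _, value = line.partition("=")
    if key.strip() != "OLLAMA_BASE_URL":
        raise ValueError("expected an OLLAMA_BASE_URL entry")
    # A trailing slash can break request URL construction, so strip it.
    return value.strip().rstrip("/")


print(check_ollama_base_url("OLLAMA_BASE_URL=http://host.docker.internal:11434/"))
# -> http://host.docker.internal:11434
```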

3. Launch the PySpur app

Follow the usual steps to launch the PySpur app, starting with the command:

docker compose -f docker-compose.prod.yml up --build -d

If you wish to do PySpur development with Ollama, run the following command instead of the above:

docker compose -f docker-compose.yml up --build -d

4. Using Ollama models in the app

You will be able to select Ollama models [ollama/llama3.2, ollama/llama3, ...] from the sidebar for LLM nodes. Please make sure the model you select has been downloaded in Ollama; you will need to manage these models manually via Ollama. To download a model, simply run ollama pull <model-name>.

Note on supported models

PySpur only works with models that support structured output and JSON mode. Most newer models should qualify, but please confirm this in the Ollama documentation for the model you wish to use.
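
For reference, JSON mode in Ollama is requested through the format field of its chat API. A minimal request body, constructed locally here rather than sent (the model name is an example and must already be pulled via ollama pull), would look like:

```python
import json

# Minimal Ollama /api/chat request body asking for JSON-mode output.
# "format": "json" asks the model to emit valid JSON; the model name
# is an example and must already be downloaded locally.
body = {
    "model": "llama3.2",
    "messages": [
        {"role": "user", "content": 'Reply with a JSON object like {"ok": true}'}
    ],
    "format": "json",
    "stream": False,
}
payload = json.dumps(body)  # this is what would be POSTed to /api/chat
print(payload[:40])
```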

⭐ Support us

You can support us in our work by leaving a star! Thank you!

star

🗺️ Roadmap

  • [X] Canvas
  • [X] Async/Batch Execution
  • [X] Evals
  • [X] Spur API
  • [X] Support Ollama
  • [ ] New Nodes
    • [X] LLM Nodes
    • [X] If-Else
    • [X] Merge Branches
    • [ ] Tools
    • [ ] Loops
  • [ ] RAG
  • [ ] Pipeline optimization via DSPy and related methods
  • [ ] Templates
  • [ ] Compile Spurs to Code
  • [ ] Multimodal support
  • [ ] Containerization of Code Verifiers
  • [ ] Leaderboard
  • [ ] Generate Spurs via AI

Your feedback will be massively appreciated. Please tell us which features on that list you would like to see next, or request entirely new ones.
