promptpanel

Accelerate your AI agent adoption. A platform for building AI assistants. Run LLMs from OpenAI, Anthropic, Ollama, Mistral, Llama, and Google Gemini.

Stars: 53


Prompt Panel is a platform designed to accelerate the adoption of AI agents. It lets users run large language models across any inference provider, create custom agent plugins, and use their own data safely. Rather than being locked into walled gardens, users keep full control over their models, conversations, and logic: they can pair their data with any language model, online or offline, and customize the system to meet their business needs without restrictions.

README:


Prompt Panel
Accelerating your AI agent adoption
Documentation | DockerHub | GitHub

Installation

Via Docker Compose:

curl -sSL https://promptpanel.com/manifest/docker-compose.yml | docker compose -f - up

which runs the following docker-compose.yml:

version: "3.9"
services:
  promptpanel:
    image: promptpanel/promptpanel:latest
    container_name: promptpanel
    restart: always
    volumes:
      - ./database:/app/database
      - ./media:/app/media
    ports:
      - 4000:4000
    environment:
      PROMPT_OLLAMA_HOST: http://ollama:11434
  ollama:
    image: ollama/ollama:latest
    container_name: ollama
    restart: always

After running, your environment will be available at: http://localhost:4000

Read more on running Prompt Panel.


Your models, conversations, and logic are locked in walled-gardens.

Let's free your AI interface.

  • Run any large language model, across any inference provider, any way you want. From commercial models like OpenAI, Anthropic, Gemini, or Cohere - to open source models, either hosted or running locally via Ollama.
  • Access controls to assign users to agents without revealing your API tokens or credentials. Enable user sign-up and login with OpenID Connect (OIDC) single sign-on.
  • Bring your own data and store it locally on your instance. Use it safely by pairing it with any language model, whether online or offline.
  • Create custom agent plugins in Python to extend your AI agent's capabilities and build retrieval-augmented generation (RAG) pipelines.

Build your own agent plugins

Get started developing with a one-click cloud development environment on Gitpod:

Open in Gitpod

The ./plugins directory contains the community plugin agents that ship with Prompt Panel, as well as a sample agent you can use as a template for your own development.

  • The ./hello_agent directory gives you some scaffolding for a sample agent.
  • The other community plugin agents give you references to sample from.
  • The docker-compose-agent-dev.yml file gives you a sample with the various mounts and environment variables we recommend for development.
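As a purely hypothetical sketch of the kind of logic an agent plugin wraps (the real plugin interface is defined by the scaffolding in ./hello_agent, not by this example; the names `Message` and `respond` below are illustrative assumptions), a plugin ultimately reduces to taking a user message plus conversation history and returning a reply:

```python
# Hypothetical sketch only: the actual Prompt Panel plugin interface is
# defined by the ./hello_agent scaffolding. `Message` and `respond` are
# illustrative names, not the real API.
from dataclasses import dataclass


@dataclass
class Message:
    role: str      # "user" or "assistant"
    content: str


def respond(history: list[Message], user_input: str) -> Message:
    """Echo-style agent: a stand-in for a call out to an LLM provider."""
    # A real plugin would forward `history` and `user_input` to a model
    # (OpenAI, Anthropic, Ollama, ...) and return its completion here.
    reply = f"You said: {user_input}"
    return Message(role="assistant", content=reply)


history: list[Message] = []
answer = respond(history, "hello")
print(answer.content)  # -> You said: hello
```

The sample agent in ./hello_agent shows the actual entry points and configuration Prompt Panel expects.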

To learn more about building your first plugin, we recommend reading:

Running DEV_PORT=4001 docker compose -f docker-compose-agent-dev.yml up from this directory, with a development port set, brings up a development environment you can use to start building your agent plugin.

Command:

DEV_PORT=4001 docker compose -f docker-compose-agent-dev.yml up

With these settings, your development environment will be available at: http://localhost:4001

Questions?

Feel free to get in contact with us at:
[email protected]


App Screenshot

Development Experience via GitPod
