
bolt-python-ai-chatbot
Bring AI into your workspace using a chatbot powered by Anthropic and OpenAI
Stars: 52

README:
This Slack chatbot app template offers a customizable solution for integrating AI-powered conversations into your Slack workspace. Here's what the app can do out of the box:
- Interact with the bot by mentioning it in conversations and threads
- Send direct messages to the bot for private interactions
- Use the `/ask-bolty` command to communicate with the bot in channels where it hasn't been added
- Utilize a custom function for integration with Workflow Builder to summarize messages in conversations
- Select your preferred API/model from the app home to customize the bot's responses
- Bring Your Own Language Model (BYO LLM) for customization
- A custom FileStateStore creates a file in `/data` per user to store API/model preferences
Inspired by ChatGPT-in-Slack
Before getting started, make sure you have a development workspace where you have permissions to install apps. If you don't have one set up, go ahead and create one.
- To use the OpenAI and Anthropic models, you must have an account with sufficient credits.
- To use the Vertex models, you must have a Google Cloud project with sufficient credits.
- Open https://api.slack.com/apps/new and choose "From an app manifest"
- Choose the workspace you want to install the application to
- Copy the contents of `manifest.json` into the text box that says *Paste your manifest code here* (within the JSON tab) and click Next
- Review the configuration and click Create
- Click Install to Workspace and Allow on the screen that follows. You'll then be redirected to the App Configuration dashboard.
Before you can run the app, you'll need to store some environment variables.
- Open your app's configuration page from this list, click OAuth & Permissions in the left hand menu, then copy the Bot User OAuth Token. You will store this in your environment as `SLACK_BOT_TOKEN`.
- Click Basic Information from the left hand menu and follow the steps in the App-Level Tokens section to create an app-level token with the `connections:write` scope. Copy this token. You will store this in your environment as `SLACK_APP_TOKEN`.
Next, set the gathered tokens as environment variables using the following commands:
```
# MacOS/Linux
export SLACK_BOT_TOKEN=<your-bot-token>
export SLACK_APP_TOKEN=<your-app-token>
```

```
# Windows
set SLACK_BOT_TOKEN=<your-bot-token>
set SLACK_APP_TOKEN=<your-app-token>
```
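Because the app won't work without these tokens, it helps to fail fast at startup rather than on the first Slack request. A minimal sketch of that check (the `require_env` helper is illustrative, not part of the template):

```python
import os


def require_env(*names: str) -> dict:
    """Return the requested environment variables, raising if any are missing."""
    missing = [name for name in names if not os.environ.get(name)]
    if missing:
        raise RuntimeError(f"Missing environment variables: {', '.join(missing)}")
    return {name: os.environ[name] for name in names}


# Example: validate both Slack tokens before starting the server.
# tokens = require_env("SLACK_BOT_TOKEN", "SLACK_APP_TOKEN")
```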
Different models from different AI providers are available if the corresponding environment variable is added, as shown in the sections below.
To interact with Anthropic models, navigate to your Anthropic account dashboard to create an API key, then export the key as follows:
```
export ANTHROPIC_API_KEY=<your-api-key>
```
To use Google Cloud Vertex AI, follow this quick start to create a project for sending requests to the Gemini API, then gather Application Default Credentials with the strategy to match your development environment.
Once your project and credentials are configured, export environment variables to select from Gemini models:
```
export VERTEX_AI_PROJECT_ID=<your-project-id>
export VERTEX_AI_LOCATION=<location-to-deploy-model>
```
The project location, along with details about the available Gemini models, can be found under Region on the Vertex AI dashboard.
Unlock the OpenAI models from your OpenAI account dashboard by clicking create a new secret key, then export the key like so:
```
export OPENAI_API_KEY=<your-api-key>
```
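Since each provider is unlocked by its own environment variables, the selection logic can be sketched as a small helper. This is an illustrative snippet, not the template's actual code; the mapping mirrors the variables listed in the sections above:

```python
import os

# Map each provider to the environment variables that unlock it.
PROVIDER_ENV_VARS = {
    "anthropic": ["ANTHROPIC_API_KEY"],
    "openai": ["OPENAI_API_KEY"],
    "vertexai": ["VERTEX_AI_PROJECT_ID", "VERTEX_AI_LOCATION"],
}


def available_providers(env=None):
    """Return the providers whose required environment variables are all set."""
    env = os.environ if env is None else env
    return [
        provider
        for provider, required in PROVIDER_ENV_VARS.items()
        if all(env.get(var) for var in required)
    ]
```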
```
# Clone this project onto your machine
git clone https://github.com/slack-samples/bolt-python-ai-chatbot.git

# Change into this project directory
cd bolt-python-ai-chatbot

# Set up your Python virtual environment
python3 -m venv .venv
source .venv/bin/activate

# Install the dependencies
pip install -r requirements.txt

# Start your local server
python3 app.py
```

```
# Run flake8 from the root directory for linting
flake8 *.py && flake8 listeners/

# Run black from the root directory for code formatting
black .
```
`manifest.json` is a configuration for Slack apps. With a manifest, you can create an app with a pre-defined configuration, or adjust the configuration of an existing app.
`app.py` is the entry point for the application and is the file you'll run to start the server. This project aims to keep this file as thin as possible, primarily using it as a way to route inbound requests.
Every incoming request is routed to a "listener". Inside this directory, we group each listener based on the Slack Platform feature used, so `/listeners/commands` handles incoming Slash Command requests, `/listeners/events` handles Events, and so on.
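The routing pattern described above can be sketched as follows. The `FakeApp` class is a stand-in so the sketch runs without Slack credentials, and the `register` function name is an assumption for illustration, not the template's exact API:

```python
class FakeApp:
    """Stand-in for a Bolt App: maps command names to handler functions."""

    def __init__(self):
        self.commands = {}

    def command(self, name):
        # Mirrors Bolt's decorator style: @app.command("/ask-bolty")
        def decorator(handler):
            self.commands[name] = handler
            return handler
        return decorator


def register(app):
    """Each /listeners/<feature> module exposes a register(), keeping app.py thin."""

    @app.command("/ask-bolty")
    def handle_ask(body):
        return f"asked: {body}"
```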
- `ai_constants.py`: Defines constants used throughout the AI module.
This module contains classes for communicating with different API providers, such as Anthropic, OpenAI, and Vertex AI. To add your own LLM, create a new class for it using `base_api.py` as an example, then update `ai/providers/__init__.py` to include and utilize your new class for API communication.
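A BYO LLM provider might look like the sketch below. The class and method names (`BaseAPIProvider`, `set_model`, `generate_response`) are assumptions made for illustration; check `base_api.py` for the template's actual interface:

```python
class BaseAPIProvider:
    """Illustrative base class in the spirit of base_api.py."""

    MODELS: dict = {}

    def set_model(self, model_name: str) -> None:
        raise NotImplementedError

    def generate_response(self, prompt: str, system_content: str) -> str:
        raise NotImplementedError


class MyLLMProvider(BaseAPIProvider):
    """Hypothetical provider for your own model."""

    MODELS = {"my-model-v1": {"name": "My Model v1", "max_tokens": 4096}}

    def set_model(self, model_name: str) -> None:
        if model_name not in self.MODELS:
            raise ValueError(f"Unknown model: {model_name}")
        self.current_model = model_name

    def generate_response(self, prompt: str, system_content: str) -> str:
        # Call your model's API here; a canned reply keeps the sketch runnable.
        return f"[{self.current_model}] echo: {prompt}"
```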
- `__init__.py`: This file contains utility functions for handling responses from the provider APIs and retrieving available providers.
- `user_identity.py`: This file defines the UserIdentity class for creating user objects. Each object represents a user with the user_id, provider, and model attributes.
- `user_state_store.py`: This file defines the base class for FileStateStore.
- `file_state_store.py`: This file defines the FileStateStore class, which handles the logic for creating and managing files for each user.
- `set_user_state.py`: This file creates a user object and uses a FileStateStore to save the user's selected provider to a JSON file.
- `get_user_state.py`: This file retrieves a user's selected provider from the JSON file created with `set_user_state.py`.
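The per-user file storage described above can be sketched like this. The directory layout and method names are assumptions for illustration; see `file_state_store.py` for the template's actual implementation:

```python
import json
from pathlib import Path


class FileStateStore:
    """Minimal sketch: one JSON file per user holding provider/model preferences."""

    def __init__(self, base_dir: str = "data"):
        self.base_dir = Path(base_dir)
        self.base_dir.mkdir(parents=True, exist_ok=True)

    def _path(self, user_id: str) -> Path:
        return self.base_dir / f"{user_id}.json"

    def set_state(self, user_id: str, provider: str, model: str) -> None:
        state = {"user_id": user_id, "provider": provider, "model": model}
        self._path(user_id).write_text(json.dumps(state))

    def get_state(self, user_id: str):
        """Return the saved state dict, or None if the user has no file yet."""
        path = self._path(user_id)
        return json.loads(path.read_text()) if path.exists() else None
```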
Only implement OAuth if you plan to distribute your application across multiple workspaces. A separate `app_oauth.py` file can be found with relevant OAuth settings.
When using OAuth, Slack requires a public URL where it can send requests. In this template app, we've used `ngrok`. Check out this guide for setting it up.
Start `ngrok` to access the app on an external network and create a redirect URL for OAuth.

```
ngrok http 3000
```
This output should include a forwarding address for `http` and `https` (we'll use `https`). It should look something like the following:

```
Forwarding https://3cb89939.ngrok.io -> http://localhost:3000
```
Navigate to OAuth & Permissions in your app configuration and click Add a Redirect URL. The redirect URL should be set to your `ngrok` forwarding address with the `slack/oauth_redirect` path appended. For example:

```
https://3cb89939.ngrok.io/slack/oauth_redirect
```
Alternative AI tools for bolt-python-ai-chatbot
Similar Open Source Tools

chat-ollama
ChatOllama is an open-source chatbot based on LLMs (Large Language Models). It supports a wide range of language models, including Ollama served models, OpenAI, Azure OpenAI, and Anthropic. ChatOllama supports multiple types of chat, including free chat with LLMs and chat with LLMs based on a knowledge base. Key features of ChatOllama include Ollama models management, knowledge bases management, chat, and commercial LLMs API keys management.

fastllm
A collection of LLM services you can self host via docker or modal labs to support your applications development. The goal is to provide docker containers or modal labs deployments of common patterns when using LLMs and endpoints to integrate easily with existing codebases using the openai api. It supports GPT4all's embedding api, JSONFormer api for chat completion, Cross Encoders based on sentence transformers, and provides documentation using MkDocs.

ai-town
AI Town is a virtual town where AI characters live, chat, and socialize. This project provides a deployable starter kit for building and customizing your own version of AI Town. It features a game engine, database, vector search, auth, text model, deployment, pixel art generation, background music generation, and local inference. You can customize your own simulation by creating characters and stories, updating spritesheets, changing the background, and modifying the background music.

TypeGPT
TypeGPT is a Python application that enables users to interact with ChatGPT or Google Gemini from any text field in their operating system using keyboard shortcuts. It provides global accessibility, keyboard shortcuts for communication, and clipboard integration for larger text inputs. Users need to have Python 3.x installed along with specific packages and API keys from OpenAI for ChatGPT access. The tool allows users to run the program normally or in the background, manage processes, and stop the program. Users can use keyboard shortcuts like `/ask`, `/see`, `/stop`, `/chatgpt`, `/gemini`, `/check`, and `Shift + Cmd + Enter` to interact with the application in any text field. Customization options are available by modifying files like `keys.txt` and `system_prompt.txt`. Contributions are welcome, and future plans include adding support for other APIs and a user-friendly GUI.

AilyticMinds
AilyticMinds Chatbot UI is an open-source AI chat app designed for easy deployment and improved backend compatibility. It provides a user-friendly interface for creating and hosting chatbots, with features like mobile layout optimization and support for various providers. The tool utilizes Supabase for data storage and management, offering a secure and scalable solution for chatbot development. Users can quickly set up their own instances locally or in the cloud, with detailed instructions provided for installation and configuration.

aides-jeunes
The user interface (and the main server) of the simulator of aids and social benefits for young people. It is based on the free socio-fiscal simulator Openfisca.

opencommit
OpenCommit is a tool that auto-generates meaningful commits using AI, allowing users to quickly create commit messages for their staged changes. It provides a CLI interface for easy usage and supports customization of commit descriptions, emojis, and AI models. Users can configure local and global settings, switch between different AI providers, and set up Git hooks for integration with IDE Source Control. Additionally, OpenCommit can be used as a GitHub Action to automatically improve commit messages on push events, ensuring all commits are meaningful and not generic. Payments for OpenAI API requests are handled by the user, with the tool storing API keys locally.

LLM-Engineers-Handbook
The LLM Engineer's Handbook is an official repository containing a comprehensive guide on creating an end-to-end LLM-based system using best practices. It covers data collection & generation, LLM training pipeline, a simple RAG system, production-ready AWS deployment, comprehensive monitoring, and testing and evaluation framework. The repository includes detailed instructions on setting up local and cloud dependencies, project structure, installation steps, infrastructure setup, pipelines for data processing, training, and inference, as well as QA, tests, and running the project end-to-end.

chatbot-ui
Chatbot UI is an open-source AI chat app that allows users to create and deploy their own AI chatbots. It is easy to use and can be customized to fit any need. Chatbot UI is perfect for businesses, developers, and anyone who wants to create a chatbot.

aider-composer
Aider Composer is a VSCode extension that integrates Aider into your development workflow. It allows users to easily add and remove files, toggle between read-only and editable modes, review code changes, use different chat modes, and reference files in the chat. The extension supports multiple models, code generation, code snippets, and settings customization. It has limitations such as lack of support for multiple workspaces, Git repository features, linting, testing, voice features, in-chat commands, and configuration options.

seer
Seer is a service that provides AI capabilities to Sentry by running inference on Sentry issues and providing user insights. It is currently in early development and not yet compatible with self-hosted Sentry instances. The tool requires access to internal Sentry resources and is intended for internal Sentry employees. Users can set up the environment, download model artifacts, integrate with local Sentry, run evaluations for Autofix AI agent, and deploy to a sandbox staging environment. Development commands include applying database migrations, creating new migrations, running tests, and more. The tool also supports VCRs for recording and replaying HTTP requests.

temporal-ai-agent
Temporal AI Agent is a demo showcasing a multi-turn conversation with an AI agent running inside a Temporal workflow. The agent collects information towards a goal using a simple DSL input. It is currently set up to search for events, book flights around those events, and create an invoice for those flights. The AI agent responds with clarifications and prompts for missing information. Users can configure the agent to use ChatGPT 4o or a local LLM via Ollama. The tool requires Rapidapi key for sky-scrapper to find flights and a Stripe key for creating invoices. Users can customize the agent by modifying tool and goal definitions in the codebase.

qrev
QRev is an open-source alternative to Salesforce, offering AI agents to scale sales organizations infinitely. It aims to provide digital workers for various sales roles or a superagent named Qai. The tech stack includes TypeScript for frontend, NodeJS for backend, MongoDB for app server database, ChromaDB for vector database, SQLite for AI server SQL relational database, and Langchain for LLM tooling. The tool allows users to run client app, app server, and AI server components. It requires Node.js and MongoDB to be installed, and provides detailed setup instructions in the README file.

unsight.dev
unsight.dev is a tool built on Nuxt that helps detect duplicate GitHub issues and areas of concern across related repositories. It utilizes Nitro server API routes, GitHub API, and a GitHub App, along with UnoCSS. The tool is deployed on Cloudflare with NuxtHub, using Workers AI, Workers KV, and Vectorize. It also offers a browser extension soon to be released. Users can try the app locally for tweaking the UI and setting up a full development environment as a GitHub App.