SalesGPT
Context-aware AI Sales Agent to automate sales outreach.
Stars: 1872
SalesGPT is an open-source AI agent designed for sales, utilizing context-awareness and LLMs to work across various communication channels like voice, email, and texting. It aims to enhance sales conversations by understanding the stage of the conversation and providing tools like product knowledge base to reduce errors. The agent can autonomously generate payment links, handle objections, and close sales. It also offers features like automated email communication, meeting scheduling, and integration with various LLMs for customization. SalesGPT is optimized for low latency in voice channels and ensures human supervision where necessary. The tool provides enterprise-grade security and supports LangSmith tracing for monitoring and evaluation of intelligent agents built on LLM frameworks.
README:
This repo is an implementation of a context-aware AI Agent for Sales that uses LLMs and works across voice, email, and texting (SMS, WhatsApp, WeChat, Weibo, Telegram, etc.).
SalesGPT is context-aware, which means it can understand what stage of a sales conversation it is in and act accordingly. Moreover, SalesGPT has access to tools, such as your own pre-defined product knowledge base, which significantly reduces hallucinations.
We are building SalesGPT to power your best AI Sales Agents. Hence, we would love to learn more about the use cases you are building towards, as they will fuel the SalesGPT development roadmap, so please don't hesitate to reach out.
- The AI Agent can access data from any data system (new, old, or legacy) via an integration with Mindware's AI Solutions Engineer.
- AI Sales Agents can now ACTUALLY sell! They autonomously generate Stripe payment links to sell products and services to customers.
- You can now test your AI Sales Agents via our frontend.
- The Sales Agent can now take advantage of tools, such as looking up products in a product catalog!
Unload AI Sales Agent Demos - Powered by SalesGPT: Our new virtual workforce? 🤔
Demo #1: Sarah - Patient Coordinator at South Orange Pediatrics
- 100X Your Healthcare Admin with our Virtual Workforce
Demo #2: Ted - Autonomously create payment links and collect customer payments
Try it yourself on our SalesGPT hosted frontend demo.
If you want us to build better towards your needs, or need help with your AI Sales Agents, please reach out to chat with us: SalesGPT Use Case Intake Survey
The AI Sales Agent understands the conversation stage (you can define your own stages fitting your needs):
- Introduction: Start the conversation by introducing yourself and your company.
- Qualification: Qualify the prospect by confirming if they are the right person to talk to regarding your product/service.
- Value Proposition: Briefly explain how your product/service can benefit the prospect.
- Needs Analysis: Ask open-ended questions to uncover the prospect's needs and pain points.
- Solution Presentation: Based on the prospect's needs, present your product/service as the solution that can address their pain points.
- Objection Handling: Address any objections that the prospect may have regarding your product/service.
- Close: Ask for the sale by proposing a next step.
- End Conversation: The user does not want to continue the conversation, so end the call.
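Since the stages are configurable, here is a minimal sketch of what custom stage definitions might look like as plain Python data. This is an illustration only; the exact variable name and how it is passed to the agent may differ between versions, so check the repo's examples and stage-analyzer configuration:
custom_conversation_stages = {
    "1": "Introduction: Introduce yourself and your company, and state why you are reaching out.",
    "2": "Qualification: Confirm the prospect is the right person to talk to about your product.",
    "3": "Needs Analysis: Ask open-ended questions to uncover the prospect's pain points.",
    "4": "Close: Propose a concrete next step, such as a demo or a payment link.",
}
# A stage-analyzer step classifies each turn into one of these keys so the agent
# can condition its reply on the selected stage (see the repo for how stages are wired in).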
- Reference only your business information & products and significantly reduce hallucinations!
- The AI Agent can actually close sales by generating Stripe payment links and closing orders from customers.
- The AI Agent can leverage data from any data system (new, old, or legacy) via an integration with Mindware.
- Enhance your sales process with automated email communication. SalesGPT can now send personalized emails to prospects, including follow-ups or product information.
- The AI Agent can now facilitate scheduling meetings by generating Calendly links.
- Thanks to our integration with LiteLLM, you can choose any closed/open-sourced LLM to work with SalesGPT! Thanks to LiteLLM maintainers for this contribution!
- Synchronous & Asynchronous Completion with LLMs
- Synchronous & Asynchronous Streaming from LLMs (see the async sketch after this list)
- The voice AI Sales Agent boasts a <1s round-trip response time to human speakers, covering the entire pipeline (speech-to-text, LLM inference, and text-to-speech) while ensuring stability and scalability.
- Human-in-the-loop support for use cases where the AI Sales Agent needs human supervision.
- LangSmith tracing to debug, test, evaluate, and monitor chains and intelligent agents built on any LLM framework
- Upcoming integration with PromptArmor to protect your AI Sales Agents against security vulnerabilities (see our roadmap).
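As a rough illustration of the asynchronous usage mentioned in the list above, here is a hedged sketch. It assumes your installed version exposes async counterparts of the agent methods (for example an astep() coroutine); confirm the exact names against the salesgpt source, since they may differ:
import asyncio

async def run_turn(sales_agent, user_message):
    # Assumes sales_agent was created and seeded as in the quickstart below.
    sales_agent.human_step(user_message)
    await sales_agent.astep()  # hypothetical async counterpart of step()

asyncio.run(run_turn(sales_agent, "Tell me about your mattresses"))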
import os
from salesgpt.agents import SalesGPT
from langchain_community.chat_models import ChatLiteLLM
from dotenv import load_dotenv
load_dotenv() # make sure you have a .env file with your API keys, e.g., OPENAI_API_KEY=sk-xxx, MINDWARE_API_KEY, etc.
# select your model - we support 50+ LLMs via LiteLLM https://docs.litellm.ai/docs/providers
llm = ChatLiteLLM(temperature=0.4, model_name="gpt-4-0125-preview")
sales_agent = SalesGPT.from_llm(
    llm,
    use_tools=True,
    verbose=False,
    product_catalog="examples/sample_product_catalog.txt",
    salesperson_name="Ted Lasso",
    salesperson_role="Sales Representative",
    company_name="Sleep Haven",
    company_business='''Sleep Haven is a premium mattress company that provides
customers with the most comfortable and supportive sleeping experience possible.
We offer a range of high-quality mattresses, pillows, and bedding accessories
that are designed to meet the unique needs of our customers.''',
)
sales_agent.seed_agent()
sales_agent.determine_conversation_stage() # optional for demonstration, built into the prompt
# agent
sales_agent.step()
# user
user_input = input('Your response: ') # Yea, sure
sales_agent.human_step(user_input)
# agent
sales_agent.determine_conversation_stage() # optional for demonstration, built into the prompt
sales_agent.step()
# user
user_input = input('Your response: ') # What pricing do you have for your mattresses?
sales_agent.human_step(user_input)
# agent
sales_agent.determine_conversation_stage() # optional for demonstration, built into the prompt
sales_agent.step()
Conversation Stage: Introduction: Start the conversation by introducing yourself and your company.
Thought: Do I need to use a tool? No
Ted Lasso: Hello, my name is Ted Lasso and I'm calling on behalf of Sleep Haven. We are a premium mattress company that provides customers with the most comfortable and supportive sleeping experience possible. I was wondering if you would be interested in learning more about our products and how they can improve your sleep?
User: Yea, sure
Conversation Stage: Value proposition: Briefly explain how your product/service can benefit the prospect.
Thought: Do I need to use a tool? No
Ted Lasso: Great to hear that! Our mattresses are specially designed to contour to your body shape, providing the perfect level of support and comfort for a better night's sleep. Plus, they're made with high-quality materials that are built to last. Would you like to hear more about our different mattress options?
User: What pricing do you have for your mattresses?
Thought: Do I need to use a tool? Yes --> AI Agent uses tools here.
Action: ProductSearch
Action Input: pricing for mattresses
Observation: Sleep Haven offers three mattresses: the Classic Harmony Spring Mattress for $1,299, the Plush Serenity Bamboo Mattress for $2,599, and the Luxury Cloud-Comfort Memory Foam Mattress for $999. The sizes available vary by mattress.
Thought: Do I need to use a tool? No
Ted Lasso: Sleep Haven offers three mattresses at different price points. The Classic Harmony Spring Mattress is priced at $1,299, the Plush Serenity Bamboo Mattress is priced at $2,599, and the Luxury Cloud-Comfort Memory Foam Mattress is priced at $999. The prices may vary depending on the size you choose. Would you like more information about the specific sizes and features of each mattress?
User: No, I will take two Classic Harmony Spring Mattress please.
Thought: Do I need to use a tool? Yes --> AI Agent uses tools here.
Action: GeneratePaymentLink
Action Input: Classic Harmony Spring Mattress, 2, John Doe
Observation: {"response":"https://buy.stripe.com/test_9AQ6rygc397Bbv228p"}
Thought: Do I need to use a tool? No
Ted Lasso: Fantastic choice! I've generated a payment link for you to order two Classic Harmony Spring Mattresses. You can complete your purchase by clicking on this link: https://buy.stripe.com/test_9AQ6rygc397Bbv228p. Is there anything else I can assist you with today?
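The scripted turns above can be generalized into an open-ended session. Below is a minimal interactive loop built only from the calls already shown in the quickstart; the quit/exit handling is our own convention for the example, not part of SalesGPT:
sales_agent.seed_agent()
while True:
    sales_agent.determine_conversation_stage()  # optional, as noted above
    sales_agent.step()                          # agent speaks
    user_input = input("Your response: ")       # human replies
    if user_input.strip().lower() in {"quit", "exit"}:
        break
    sales_agent.human_step(user_input)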
Make sure you have Python >=3.8,<3.12:
Create a virtual environment at a location on your computer. We use the generic name "env" for the virtual environment in this setup. You can rename it, but then make sure to use that name consistently when working with the environment later (and rename the VENV variable in the Makefile accordingly so the make commands still work after cloning our repository):
On Windows:
- Open Command Prompt or PowerShell.
- Navigate to your project directory:
cd path\to\your\project
- Create a virtual environment:
python -m venv env
- Activate the virtual environment:
.\env\Scripts\activate
On macOS/Linux:
- Open Terminal.
- Navigate to your project directory:
cd path/to/your/project
- Create a virtual environment:
python3 -m venv env
- Activate the virtual environment:
source env/bin/activate
To deactivate a virtual environment after you have stopped using it simply run: deactivate
Clone the SalesGPT Github repository:
git clone https://github.com/filip-michalsky/SalesGPT.git
Navigate to the repository and, in case you used a different venv name, rename the VENV variable in the Makefile:
cd SalesGPT
If you simply want to work with SalesGPT as an end user without local changes, you can install it from PyPI using:
pip install salesgpt
If you want to work on your own version of SalesGPT or contribute to our open-source version, activate your virtual environment as described above and then run:
make setup
For more detailed installation steps, along with the reasons for each, please visit CONTRIBUTING.md.
Finally, to use SalesGPT, create a .env file (modeled on our .env.example) and put your API keys there, one per line, just as we have done, for example:
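A minimal sketch of such a file, using key names that appear elsewhere in this README (the values below are placeholders, not real keys):
OPENAI_API_KEY=sk-xxx
MINDWARE_API_KEY=xxx
# add any further keys listed in .env.example, one per line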
Navigate into the SalesGPT directory:
cd SalesGPT
SalesGPT can be run in various ways, tailored to your preferences and setup. Below are the methods available:
For those who prefer containerization, Docker offers an isolated and consistent environment. Ensure Docker is installed on your system by following the official Docker installation guide.
To run SalesGPT with Docker, execute the following steps:
- Start the Application with Docker Compose:
Use the command below to start SalesGPT in detached mode:
docker-compose up -d
If you've made changes and want them reflected, append --build to the command above.
- Stopping the Application:
To stop and remove all running containers related to SalesGPT, execute:
docker-compose down
Troubleshooting:
- Clean Up Docker Resources: If you encounter errors, you can clean up Docker by removing all unused containers, networks, images, and volumes (use with caution):
docker system prune --volumes
- Rebuild Without Cache: To rebuild the images without using the cache and start the services afresh, run:
docker-compose build --no-cache
docker-compose up -d
After successful setup, access SalesGPT at localhost:3000/chat in your browser.
If Docker is not part of your workflow, you can launch the SalesGPT user interface directly. Please refer to the README.md file in the frontend directory for instructions on setting up the UI locally.
For terminal enthusiasts or automation scripts, run SalesGPT with the following command:
python run.py --verbose True --config examples/example_agent_setup.json
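The config file describes the agent. Here is a hypothetical sketch based on the parameters shown in the Python quickstart above; the shipped examples/example_agent_setup.json is the authoritative reference, and its exact keys and value formats may differ:
{
  "salesperson_name": "Ted Lasso",
  "salesperson_role": "Sales Representative",
  "company_name": "Sleep Haven",
  "company_business": "Sleep Haven is a premium mattress company ...",
  "use_tools": "True",
  "product_catalog": "examples/sample_product_catalog.txt"
}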
For those who wish to integrate SalesGPT's backend with their own user interface or application, running only the backend is a straightforward process. This allows you to leverage the powerful features of SalesGPT while maintaining full control over the user experience.
To run only the backend of SalesGPT, follow these steps:
- Start the Backend Service:
Use the following command to start the backend service. This will start the server on port 8000 by default, making the API accessible:
docker-compose up -d backend
If you need to rebuild the backend image, perhaps after making changes, add --build to the command above.
- Accessing the Backend:
With the backend running, you can access the API endpoints at http://localhost:8000. Refer to the API documentation for details on available endpoints and their usage.
- Stopping the Backend:
To stop the backend service, execute:
docker-compose stop backend
If you wish to remove the backend container entirely, use:
docker-compose down
This setup is ideal for developers looking to integrate SalesGPT's backend into custom applications or those who prefer to use a different frontend technology.
- Activate your environment as described above (run source env/bin/activate on Unix-like systems and .\env\Scripts\activate on Windows; replace env with the name of your virtual environment).
- cd SalesGPT, if you haven't already navigated to the SalesGPT home directory.
- Run:
make test
All tests should pass. Warnings can be ignored.
Regular updates are added to the SalesGPT repository. If your code is not working, please check the current README file for possible changes to the launch instructions. Updates often add extra environment variables that are required for certain functionality, so please check the current .env.example file.
For any problems you cannot solve, please don't hesitate to reach out on our Discord #support channel.
To delete the virtual environment you used for SalesGPT development and remove your SalesGPT repository from your system, navigate to the directory where you created the virtual environment and cloned SalesGPT, then run:
make clean
We leverage the langchain library in this implementation, specifically its custom agent configuration, and are inspired by the BabyAGI architecture.
LangSmith is a platform for building production-grade LLM applications.
It lets you debug, test, evaluate, and monitor chains and intelligent agents built on any LLM framework and seamlessly integrates with LangChain, the go-to open source framework for building with LLMs.
LangSmith is developed by LangChain, the company behind the open source LangChain framework.
To switch on LangSmith tracing, follow these steps:
- Create a LangSmith account
- Create an API key in settings
- Add your API key and project name from your LangSmith account to the .env file or the run.py module
- Set the "LANGCHAIN_TRACING_V2" setting in run.py to "true"
- That's it. You'll get a better understanding of your agents' and chains' performance in the LangSmith admin panel.
For further reading, take a look at the docs.
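For reference, here is a minimal sketch of setting the tracing-related environment variables from Python before constructing the agent. LANGCHAIN_API_KEY and LANGCHAIN_PROJECT are the standard LangSmith variable names; confirm against the LangSmith docs and the current .env.example:
import os

os.environ["LANGCHAIN_TRACING_V2"] = "true"       # switch tracing on
os.environ["LANGCHAIN_API_KEY"] = "ls-..."        # API key created in LangSmith settings (placeholder)
os.environ["LANGCHAIN_PROJECT"] = "salesgpt-dev"  # project name to group traces under
# ...then create and run SalesGPT as shown in the quickstart above.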
- Improve SalesGPT observability
- Enhance prompt versioning
- Add prompt evaluation
- Write more documentation
- Better code documentation
- Refactor
- Improve reliability of the parser (issues here and here)
- Improve Deployment Instructions
- Calling Functionality - sample code
- Enterprise-Grade Security - integration with PromptArmor to protect your AI Sales Agents against security vulnerabilities
- LLM evaluations
- Resolve tickets and PRs (ongoing)
- Add an example implementation of an OpenAI functions agent (issue here)
- Add support for multiple tools (issue here)
- Add an agent controller for when stages need to be traversed linearly without skips (issue here)
- Add a tool_getter to choose a tool based on vector distance to the tasks that need to be done (a sketch of the idea follows this list)
- What tools should the agent have? (e.g., the ability to search the internet)
- Add the ability of Sales Agent to interact with AI plugins on your website (.well-known/ai-plugin.json)
- More SalesGPT examples
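To make the tool_getter roadmap item above concrete, here is an illustrative sketch of the idea: pick the tool whose description embedding is closest to the task. Nothing here is existing SalesGPT code; tools are assumed to be dicts with a "description" field, and embed() is a placeholder for whatever embedding model you use:
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def tool_getter(task_description, tools, embed):
    # embed() is a placeholder: any function mapping text to a vector of floats.
    task_vec = embed(task_description)
    scored = [(cosine_similarity(task_vec, embed(t["description"])), t) for t in tools]
    return max(scored, key=lambda pair: pair[0])[1]  # tool with the highest similarity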
Lead Maintainer: Filip Michalsky
- Contact Email
- Follow us on X at @FilipMichalsky
Our Support Team:
- AI Engineering: Honza Michna (LinkedIn)
- Software Engineer: Illia Makovoz
Contributions are highly encouraged! Please fork and submit a PR.
Alternative AI tools for SalesGPT
Similar Open Source Tools
promptflow
**Prompt flow** is a suite of development tools designed to streamline the end-to-end development cycle of LLM-based AI applications, from ideation, prototyping, testing, evaluation to production deployment and monitoring. It makes prompt engineering much easier and enables you to build LLM apps with production quality.
ZetaForge
ZetaForge is an open-source AI platform designed for rapid development of advanced AI and AGI pipelines. It allows users to assemble reusable, customizable, and containerized Blocks into highly visual AI Pipelines, enabling rapid experimentation and collaboration. With ZetaForge, users can work with AI technologies in any programming language, easily modify and update AI pipelines, dive into the code whenever needed, utilize community-driven blocks and pipelines, and share their own creations. The platform aims to accelerate the development and deployment of advanced AI solutions through its user-friendly interface and community support.
LaVague
LaVague is an open-source Large Action Model framework that uses advanced AI techniques to compile natural language instructions into browser automation code. It leverages Selenium or Playwright for browser actions. Users can interact with LaVague through an interactive Gradio interface to automate web interactions. The tool requires an OpenAI API key for default examples and offers a Playwright integration guide. Contributors can help by working on outlined tasks, submitting PRs, and engaging with the community on Discord. The project roadmap is available to track progress, but users should exercise caution when executing LLM-generated code using 'exec'.
langwatch
LangWatch is a monitoring and analytics platform designed to track, visualize, and analyze interactions with Large Language Models (LLMs). It offers real-time telemetry to optimize LLM cost and latency, a user-friendly interface for deep insights into LLM behavior, user analytics for engagement metrics, detailed debugging capabilities, and guardrails to monitor LLM outputs for issues like PII leaks and toxic language. The platform supports OpenAI and LangChain integrations, simplifying the process of tracing LLM calls and generating API keys for usage. LangWatch also provides documentation for easy integration and self-hosting options for interested users.
crewAI
CrewAI is a cutting-edge framework designed to orchestrate role-playing autonomous AI agents. By fostering collaborative intelligence, CrewAI empowers agents to work together seamlessly, tackling complex tasks. It enables AI agents to assume roles, share goals, and operate in a cohesive unit, much like a well-oiled crew. Whether you're building a smart assistant platform, an automated customer service ensemble, or a multi-agent research team, CrewAI provides the backbone for sophisticated multi-agent interactions. With features like role-based agent design, autonomous inter-agent delegation, flexible task management, and support for various LLMs, CrewAI offers a dynamic and adaptable solution for both development and production workflows.
Open_Data_QnA
Open Data QnA is a Python library that allows users to interact with their PostgreSQL or BigQuery databases in a conversational manner, without needing to write SQL queries. The library leverages Large Language Models (LLMs) to bridge the gap between human language and database queries, enabling users to ask questions in natural language and receive informative responses. It offers features such as conversational querying with multiturn support, table grouping, multi schema/dataset support, SQL generation, query refinement, natural language responses, visualizations, and extensibility. The library is built on a modular design and supports various components like Database Connectors, Vector Stores, and Agents for SQL generation, validation, debugging, descriptions, embeddings, responses, and visualizations.
conversational-agent-langchain
This repository contains a Rest-Backend for a Conversational Agent that allows embedding documents, semantic search, QA based on documents, and document processing with Large Language Models. It uses Aleph Alpha and OpenAI Large Language Models to generate responses to user queries, includes a vector database, and provides a REST API built with FastAPI. The project also features semantic search, secret management for API keys, installation instructions, and development guidelines for both backend and frontend components.
pgai
pgai simplifies the process of building search and Retrieval Augmented Generation (RAG) AI applications with PostgreSQL. It brings embedding and generation AI models closer to the database, allowing users to create embeddings, retrieve LLM chat completions, reason over data for classification, summarization, and data enrichment directly from within PostgreSQL in a SQL query. The tool requires an OpenAI API key and a PostgreSQL client to enable AI functionality in the database. Users can install pgai from source, run it in a pre-built Docker container, or enable it in a Timescale Cloud service. The tool provides functions to handle API keys using psql or Python, and offers various AI functionalities like tokenizing, detokenizing, embedding, chat completion, and content moderation.
langdrive
LangDrive is an open-source AI library that simplifies training, deploying, and querying open-source large language models (LLMs) using private data. It supports data ingestion, fine-tuning, and deployment via a command-line interface, YAML file, or API, with a quick, easy setup. Users can build AI applications such as question/answering systems, chatbots, AI agents, and content generators. The library provides features like data connectors for ingestion, fine-tuning of LLMs, deployment to Hugging Face hub, inference querying, data utilities for CRUD operations, and APIs for model access. LangDrive is designed to streamline the process of working with LLMs and making AI development more accessible.
genai-for-marketing
This repository provides a deployment guide for utilizing Google Cloud's Generative AI tools in marketing scenarios. It includes step-by-step instructions, examples of crafting marketing materials, and supplementary Jupyter notebooks. The demos cover marketing insights, audience analysis, trendspotting, content search, content generation, and workspace integration. Users can access and visualize marketing data, analyze trends, improve search experience, and generate compelling content. The repository structure includes backend APIs, frontend code, sample notebooks, templates, and installation scripts.
classifai
Supercharge WordPress Content Workflows and Engagement with Artificial Intelligence. Tap into leading cloud-based services like OpenAI, Microsoft Azure AI, Google Gemini and IBM Watson to augment your WordPress-powered websites. Publish content faster while improving SEO performance and increasing audience engagement. ClassifAI integrates Artificial Intelligence and Machine Learning technologies to lighten your workload and eliminate tedious tasks, giving you more time to create original content that matters.
pathway
Pathway is a Python data processing framework for analytics and AI pipelines over data streams. It's the ideal solution for real-time processing use cases like streaming ETL or RAG pipelines for unstructured data. Pathway comes with an **easy-to-use Python API** , allowing you to seamlessly integrate your favorite Python ML libraries. Pathway code is versatile and robust: **you can use it in both development and production environments, handling both batch and streaming data effectively**. The same code can be used for local development, CI/CD tests, running batch jobs, handling stream replays, and processing data streams. Pathway is powered by a **scalable Rust engine** based on Differential Dataflow and performs incremental computation. Your Pathway code, despite being written in Python, is run by the Rust engine, enabling multithreading, multiprocessing, and distributed computations. All the pipeline is kept in memory and can be easily deployed with **Docker and Kubernetes**. You can install Pathway with pip: `pip install -U pathway` For any questions, you will find the community and team behind the project on Discord.
promptpanel
Prompt Panel is a tool designed to accelerate the adoption of AI agents by providing a platform where users can run large language models across any inference provider, create custom agent plugins, and use their own data safely. The tool allows users to break free from walled-gardens and have full control over their models, conversations, and logic. With Prompt Panel, users can pair their data with any language model, online or offline, and customize the system to meet their unique business needs without any restrictions.
gpt-researcher
GPT Researcher is an autonomous agent designed for comprehensive online research on a variety of tasks. It can produce detailed, factual, and unbiased research reports with customization options. The tool addresses issues of speed, determinism, and reliability by leveraging parallelized agent work. The main idea involves running 'planner' and 'execution' agents to generate research questions, seek related information, and create research reports. GPT Researcher optimizes costs and completes tasks in around 3 minutes. Features include generating long research reports, aggregating web sources, an easy-to-use web interface, scraping web sources, and exporting reports to various formats.
design-studio
Tiledesk Design Studio is an open-source, no-code development platform for creating chatbots and conversational apps. It offers a user-friendly, drag-and-drop interface with pre-ready actions and integrations. The platform combines the power of LLM/GPT AI with a flexible 'graph' approach for creating conversations and automations with ease. Users can automate customer conversations, prototype conversations, integrate ChatGPT, enhance user experience with multimedia, provide personalized product recommendations, set conditions, use random replies, connect to other tools like HubSpot CRM, integrate with WhatsApp, send emails, and seamlessly enhance existing setups.
For similar tasks
tiledesk-dashboard
Tiledesk is an open-source live chat platform with integrated chatbots written in Node.js and Express. It is designed to be a multi-channel platform for web, Android, and iOS, and it can be used to increase sales or provide post-sales customer service. Tiledesk's chatbot technology allows for automation of conversations, and it also provides APIs and webhooks for connecting external applications. Additionally, it offers a marketplace for apps and features such as CRM, ticketing, and data export.
J.A.R.V.I.S.
J.A.R.V.I.S.1.0 is an advanced virtual assistant tool designed to assist users in various tasks. It provides a wide range of functionalities including voice commands, task automation, information retrieval, and communication management. With its intuitive interface and powerful capabilities, J.A.R.V.I.S.1.0 aims to enhance productivity and streamline daily activities for users.
For similar jobs
qrev
QRev is an open-source alternative to Salesforce, offering AI agents to scale sales organizations infinitely. It aims to provide digital workers for various sales roles or a superagent named Qai. The tech stack includes TypeScript for frontend, NodeJS for backend, MongoDB for app server database, ChromaDB for vector database, SQLite for AI server SQL relational database, and Langchain for LLM tooling. The tool allows users to run client app, app server, and AI server components. It requires Node.js and MongoDB to be installed, and provides detailed setup instructions in the README file.
wppconnect
WPPConnect is an open source project developed by the JavaScript community with the aim of exporting functions from WhatsApp Web to Node.js, which can be used to support the creation of any interaction, such as customer service, media sending, AI-based phrase recognition, and many other things.
linkedin-api
The Linkedin API for Python allows users to programmatically search profiles, send messages, and find jobs using a regular Linkedin user account. It does not require 'official' API access, just a valid Linkedin account. However, it is important to note that this library is not officially supported by LinkedIn and using it may violate LinkedIn's Terms of Service. Users can authenticate using any Linkedin account credentials and access features like getting profiles, profile contact info, and connections. The library also provides commercial alternatives for extracting data, scraping public profiles, and accessing a full LinkedIn API. It is not endorsed or supported by LinkedIn and is intended for educational purposes and personal use only.
ChatFAQ
ChatFAQ is an open-source comprehensive platform for creating a wide variety of chatbots: generic ones, business-trained, or even capable of redirecting requests to human operators. It includes a specialized NLP/NLG engine based on a RAG architecture and customized chat widgets, ensuring a tailored experience for users and avoiding vendor lock-in.
anything-llm
AnythingLLM is a full-stack application that enables you to turn any document, resource, or piece of content into context that any LLM can use as references during chatting. This application allows you to pick and choose which LLM or Vector Database you want to use as well as supporting multi-user management and permissions.
chatbot-ui
Chatbot UI is an open-source AI chat app that allows users to create and deploy their own AI chatbots. It is easy to use and can be customized to fit any need. Chatbot UI is perfect for businesses, developers, and anyone who wants to create a chatbot.
deep-chat
Deep Chat is a fully customizable AI chat component that can be injected into your website with minimal to no effort. Whether you want to create a chatbot that leverages popular APIs such as ChatGPT or connect to your own custom service, this component can do it all! Explore deepchat.dev to view all of the available features, how to use them, examples and more!