
awesome-llm-apps
Collection of awesome LLM apps with AI Agents and RAG using OpenAI, Anthropic, Gemini, and open-source models.
Stars: 66590

Awesome LLM Apps is a curated collection of applications that leverage RAG with OpenAI, Anthropic, Gemini, and open-source models. The repository contains projects such as Local Llama-3 with RAG for chatting with webpages locally, Chat with Gmail for interacting with Gmail using natural language, Chat with Substack Newsletter for conversing with Substack newsletters using GPT-4, Chat with PDF for intelligent conversation based on PDF documents, and Chat with YouTube Videos for engaging with YouTube video content through natural language. Users can clone the repository, navigate to specific project directories, install dependencies, and follow project-specific instructions to set up and run the apps. Contributions are encouraged, and new app ideas or improvements can be submitted via pull requests.
README:
Deutsch | Español | français | 日本語 | 한국어 | Português | Русский | 中文
A curated collection of Awesome LLM apps built with RAG, AI Agents, Multi-agent Teams, MCP, Voice Agents, and more. This repository features LLM apps that use models from OpenAI, Anthropic, Google, xAI, and open-source models like Qwen or Llama that you can run locally on your computer.
- 💡 Discover practical and creative ways LLMs can be applied across different domains, from code repositories to email inboxes and more.
- 🔥 Explore apps that combine LLMs from OpenAI, Anthropic, Gemini, and open-source alternatives with AI Agents, Agent Teams, MCP & RAG.
- 🎓 Learn from well-documented projects and contribute to the growing open-source ecosystem of LLM-powered applications.
- 🎙️ AI Blog to Podcast Agent
- ❤️🩹 AI Breakup Recovery Agent
- 📊 AI Data Analysis Agent
- 🩻 AI Medical Imaging Agent
- 😂 AI Meme Generator Agent (Browser)
- 🎵 AI Music Generator Agent
- 🛫 AI Travel Agent (Local & Cloud)
- ✨ Gemini Multimodal Agent
- 🌐 Local News Agent (OpenAI Swarm)
- 🔄 Mixture of Agents
- 📊 xAI Finance Agent
- 🔍 OpenAI Research Agent
- 🕸️ Web Scraping AI Agent (Local & Cloud)
- 🔍 AI Deep Research Agent
- 🤝 AI Consultant Agent
- 🏗️ AI System Architect Agent
- 🎯 AI Lead Generation Agent
- 💰 AI Financial Coach Agent
- 🎬 AI Movie Production Agent
- 📈 AI Investment Agent
- 🏋️‍♂️ AI Health & Fitness Agent
- 🚀 AI Product Launch Intelligence Agent
- 🗞️ AI Journalist Agent
- 🧠 AI Mental Wellbeing Agent
- 📑 AI Meeting Agent
- 🧬 AI Self-Evolving Agent
- 🎧 AI Social Media News and Podcast Agent
- 🧲 AI Competitor Intelligence Agent Team
- 💲 AI Finance Agent Team
- 🎨 AI Game Design Agent Team
- 👨‍⚖️ AI Legal Agent Team (Cloud & Local)
- 💼 AI Recruitment Agent Team
- 🏠 AI Real Estate Agent Team
- 👨‍💼 AI Services Agency (CrewAI)
- 👨‍🏫 AI Teaching Agent Team
- 💻 Multimodal Coding Agent Team
- ✨ Multimodal Design Agent Team
- 🌏 AI Travel Planner Agent Team
- 🔥 Agentic RAG with Embedding Gemma
- 🧐 Agentic RAG with Reasoning
- 📰 AI Blog Search (RAG)
- 🔍 Autonomous RAG
- 🔄 Contextual AI RAG Agent
- 🔄 Corrective RAG (CRAG)
- 🐋 Deepseek Local RAG Agent
- 🤔 Gemini Agentic RAG
- 👀 Hybrid Search RAG (Cloud)
- 🔄 Llama 3.1 Local RAG
- 🖥️ Local Hybrid Search RAG
- 🦙 Local RAG Agent
- 🧩 RAG-as-a-Service
- ✨ RAG Agent with Cohere
- ⛓️ Basic RAG Chain (a minimal sketch of this pattern follows the list)
- 📠 RAG with Database Routing
- 🖼️ Vision RAG
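
The RAG apps above all follow the same underlying pattern: embed documents, retrieve the chunks most similar to a query, and pass them to an LLM as context. Below is a minimal, framework-agnostic sketch of that loop; `embed()` is a toy placeholder for a real embedding model, the final LLM call is left as a comment, and none of these names come from a specific project in this repository.

```python
import math

# Toy retrieval-augmented generation loop. embed() stands in for a real
# embedding model; the LLM call at the end is only indicated by a comment.
def embed(text: str) -> list[float]:
    # Placeholder embedding: normalized letter-frequency vector.
    counts = [text.lower().count(ch) for ch in "abcdefghijklmnopqrstuvwxyz"]
    norm = math.sqrt(sum(c * c for c in counts)) or 1.0
    return [c / norm for c in counts]

def cosine(a: list[float], b: list[float]) -> float:
    return sum(x * y for x, y in zip(a, b))

documents = [
    "Llama 3 can be run locally with an inference server.",
    "Gmail messages can be searched through the Gmail API.",
]
index = [(doc, embed(doc)) for doc in documents]

def retrieve(query: str, k: int = 1) -> list[str]:
    # Rank stored documents by similarity to the query and keep the top k.
    qv = embed(query)
    ranked = sorted(index, key=lambda pair: cosine(qv, pair[1]), reverse=True)
    return [doc for doc, _ in ranked[:k]]

question = "How do I run Llama locally?"
context = "\n".join(retrieve(question))
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
print(prompt)  # In a real app, this prompt would be sent to an LLM.
```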
- 💾 AI ArXiv Agent with Memory
- 🛩️ AI Travel Agent with Memory
- 💬 Llama3 Stateful Chat
- 📝 LLM App with Personalized Memory
- 🗄️ Local ChatGPT Clone with Memory
- 🧠 Multi-LLM Application with Shared Memory
- 💬 Chat with GitHub (GPT & Llama3)
- 📨 Chat with Gmail
- 📄 Chat with PDF (GPT & Llama3)
- 📚 Chat with Research Papers (ArXiv) (GPT & Llama3)
- 📝 Chat with Substack
- 📽️ Chat with YouTube Videos
- Starter agent; model‑agnostic (OpenAI, Claude)
- Structured outputs with Pydantic (see the sketch after this list)
- Tools: built‑in, function, third‑party, MCP tools
- Memory; callbacks; Plugins
- Simple multi‑agent; Multi‑agent patterns
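
As a taste of the structured-outputs topic above, here is a minimal sketch assuming Pydantic v2; the `Itinerary` schema and its fields are illustrative examples, not taken from any course or project in this repository.

```python
from pydantic import BaseModel

# Hypothetical schema for a travel-planner style response; the field names
# are illustrative only.
class Itinerary(BaseModel):
    destination: str
    days: int
    activities: list[str]

# An agent framework with structured-output support asks the model to emit
# JSON matching this schema; validation is then a single call.
raw = '{"destination": "Kyoto", "days": 3, "activities": ["temples", "tea ceremony"]}'
itinerary = Itinerary.model_validate_json(raw)
print(itinerary.days, itinerary.activities)
```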
OpenAI Agents SDK Crash Course
- Starter agent; function calling; structured outputs (a starter-agent sketch follows this list)
- Tools: built‑in, function, third‑party integrations
- Memory; callbacks; evaluation
- Multi‑agent patterns; agent handoffs
- Swarm orchestration; routing logic
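
And for the starter-agent material, a minimal sketch assuming the `openai-agents` package and an `OPENAI_API_KEY` in the environment; it illustrates the pattern the crash course covers rather than reproducing its code.

```python
from agents import Agent, Runner  # pip install openai-agents

# A single-purpose starter agent; the name and instructions are illustrative.
agent = Agent(
    name="Travel Assistant",
    instructions="Answer travel questions concisely.",
)

# Run one turn synchronously and print the model's final answer.
result = Runner.run_sync(agent, "Suggest a weekend itinerary for Lisbon.")
print(result.final_output)
```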
1. Clone the repository:
   git clone https://github.com/Shubhamsaboo/awesome-llm-apps.git
2. Navigate to the desired project directory:
   cd awesome-llm-apps/starter_ai_agents/ai_travel_agent
3. Install the required dependencies:
   pip install -r requirements.txt
4. Follow the project-specific instructions in each project's README.md file to set up and run the app.
Contributions are welcome! If you have any ideas, improvements, or new apps to add, please create a new GitHub Issue or submit a pull request. Make sure to follow the existing project structure and include a detailed README.md for each new app.
🌟 Don’t miss out on future updates! Star the repo now and be the first to know about new and exciting LLM apps with RAG and AI Agents.
Alternative AI tools for awesome-llm-apps
Similar Open Source Tools

AI-Engineering.academy
AI Engineering Academy aims to provide a structured learning path for individuals looking to learn Applied AI effectively. The platform offers multiple roadmaps covering topics like Retrieval Augmented Generation, Fine-tuning, and Deployment. Each roadmap equips learners with the knowledge and skills needed to excel in applied GenAI. Additionally, the platform will feature Hands-on End-to-End AI projects in the future.

local-deep-research
Local Deep Research is a powerful AI-powered research assistant that performs deep, iterative analysis using multiple LLMs and web searches. It can be run locally for privacy or configured to use cloud-based LLMs for enhanced capabilities. The tool offers advanced research capabilities, flexible LLM support, rich output options, privacy-focused operation, enhanced search integration, and academic & scientific integration. It also provides a web interface, command line interface, and supports multiple LLM providers and search engines. Users can configure AI models, search engines, and research parameters for customized research experiences.

growi
GROWI is a collaborative wiki platform that allows users to create hierarchical pages with markdown, edit simultaneously with multiple people, and support authentication with LDAP/Active Directory, OAuth, and SAML. It also integrates with Slack/Mattermost, IFTTT, and allows for plugin customization. GROWI is Docker and Docker Compose ready, supports multiple sites, HTTPS with Let's Encrypt proxy integration, and offers migration guides for on-premise installations. The tool is built with Node.js, npm, pnpm, Turborepo, and requires MongoDB, with optional dependencies on Redis and ElasticSearch for full-text search functionality.

aide
Aide is a Visual Studio Code extension that offers AI-powered features to help users master any code. It provides functionalities such as code conversion between languages, code annotation for readability, quick copying of files/folders as AI prompts, executing custom AI commands, defining prompt templates, multi-file support, setting keyboard shortcuts, and more. Users can enhance their productivity and coding experience by leveraging Aide's intelligent capabilities.

bytebot
Bytebot is an open-source AI desktop agent that provides a virtual employee with its own computer to complete tasks for users. It can use various applications, download and organize files, log into websites, process documents, and perform complex multi-step workflows. By giving AI access to a complete desktop environment, Bytebot unlocks capabilities not possible with browser-only agents or API integrations, enabling complete task autonomy, document processing, and usage of real applications.

cherry-studio
Cherry Studio is a desktop client that supports multiple Large Language Model (LLM) providers, available on Windows, Mac, and Linux. It allows users to create multiple Assistants and topics, use multiple models to answer questions in the same conversation, and supports drag-and-drop sorting, code highlighting, and Mermaid chart. The tool is designed to enhance productivity and streamline the process of interacting with various language models.

mcp-pointer
MCP Pointer is a local tool that combines an MCP Server with a Chrome Extension to allow users to visually select DOM elements in the browser and make textual context available to agentic coding tools like Claude Code. It bridges between the browser and AI tools via the Model Context Protocol, enabling real-time communication and compatibility with various AI tools. The tool extracts detailed information about selected elements, including text content, CSS properties, React component detection, and more, making it a valuable asset for developers working with AI-powered web development.

abi
ABI (Agentic Brain Infrastructure) is a Python-based AI Operating System designed to serve as the core infrastructure for building an Agentic AI Ontology Engine. It empowers organizations to integrate, manage, and scale AI-driven operations with multiple AI models, focusing on ontology, agent-driven workflows, and analytics. ABI emphasizes modularity and customization, providing a customizable framework aligned with international standards and regulatory frameworks. It offers features such as configurable AI agents, ontology management, integrations with external data sources, data processing pipelines, workflow automation, analytics, and data handling capabilities.

databend
Databend is an open-source cloud data warehouse built in Rust, offering fast query execution and data ingestion for complex analysis of large datasets. It integrates with major cloud platforms, provides high performance with AI-powered analytics, supports multiple data formats, ensures data integrity with ACID transactions, offers flexible indexing options, and features community-driven development. Users can try Databend through a serverless cloud or Docker installation, and perform tasks such as data import/export, querying semi-structured data, managing users/databases/tables, and utilizing AI functions.

superagentx
SuperAgentX is a lightweight open-source AI framework designed for multi-agent applications with Artificial General Intelligence (AGI) capabilities. It offers goal-oriented multi-agents with retry mechanisms, easy deployment through WebSocket, RESTful API, and IO console interfaces, streamlined architecture with no major dependencies, contextual memory using SQL + Vector databases, flexible LLM configuration supporting various Gen AI models, and extendable handlers for integration with diverse APIs and data sources. It aims to accelerate the development of AGI by providing a powerful platform for building autonomous AI agents capable of executing complex tasks with minimal human intervention.

chatbox
Chatbox is a desktop client for ChatGPT, Claude, and other LLMs, providing features like local data storage, multiple LLM provider support, image generation, enhanced prompting, keyboard shortcuts, and more. It offers a user-friendly interface with dark theme, team collaboration, cross-platform availability, web version access, iOS & Android apps, multilingual support, and ongoing feature enhancements. Developed for prompt and API debugging, it has gained popularity for daily chatting and professional role-playing with AI assistance.

memU
MemU is an open-source memory framework designed for AI companions, offering high accuracy, fast retrieval, and cost-effectiveness. It serves as an intelligent 'memory folder' that adapts to various AI companion scenarios. With MemU, users can create AI companions that remember them, learn their preferences, and evolve through interactions. The framework provides advanced retrieval strategies, 24/7 support, and is specialized for AI companions. MemU offers cloud, enterprise, and self-hosting options, with features like memory organization, interconnected knowledge graph, continuous self-improvement, and adaptive forgetting mechanism. It boasts high memory accuracy, fast retrieval, and low cost, making it suitable for building intelligent agents with persistent memory capabilities.

ComfyUI-Copilot
ComfyUI-Copilot is an intelligent assistant built on the Comfy-UI framework that simplifies and enhances the AI algorithm debugging and deployment process through natural language interactions. It offers intuitive node recommendations, workflow building aids, and model querying services to streamline development processes. With features like interactive Q&A bot, natural language node suggestions, smart workflow assistance, and model querying, ComfyUI-Copilot aims to lower the barriers to entry for beginners, boost development efficiency with AI-driven suggestions, and provide real-time assistance for developers.

chatbox
Chatbox is a desktop client for ChatGPT, Claude, and other LLMs, providing a user-friendly interface for AI copilot assistance on Windows, Mac, and Linux. It offers features like local data storage, multiple LLM provider support, image generation with Dall-E-3, enhanced prompting, keyboard shortcuts, and more. Users can collaborate, access the tool on various platforms, and enjoy multilingual support. Chatbox is constantly evolving with new features to enhance the user experience.

llamator
LLAMATOR is a Red Teaming Python framework designed for testing chatbots and LLM systems. It provides support for custom attacks, a wide range of attack options in English and Russian, custom configuration of chat clients, history tracking of attack requests and responses in Excel and CSV formats, and test report generation in DOCX format. The tool is classified under OWASP as addressing prompt injection, system prompt leakage, and misinformation. It is supported by the AI Security Lab ITMO, Raft Security, and AI Talent Hub, and is licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International license.
For similar tasks

page-assist
Page Assist is an open-source Chrome Extension that provides a Sidebar and Web UI for your Local AI model. It allows you to interact with your model from any webpage.
For similar jobs

weave
Weave is a toolkit for developing Generative AI applications, built by Weights & Biases. With Weave, you can log and debug language model inputs, outputs, and traces; build rigorous, apples-to-apples evaluations for language model use cases; and organize all the information generated across the LLM workflow, from experimentation to evaluations to production. Weave aims to bring rigor, best-practices, and composability to the inherently experimental process of developing Generative AI software, without introducing cognitive overhead.

LLMStack
LLMStack is a no-code platform for building generative AI agents, workflows, and chatbots. It allows users to connect their own data, internal tools, and GPT-powered models without any coding experience. LLMStack can be deployed to the cloud or on-premise and can be accessed via HTTP API or triggered from Slack or Discord.

VisionCraft
The VisionCraft API is a free API for using over 100 different AI models. From images to sound.

kaito
Kaito is an operator that automates the AI/ML inference model deployment in a Kubernetes cluster. It manages large model files using container images, avoids tuning deployment parameters to fit GPU hardware by providing preset configurations, auto-provisions GPU nodes based on model requirements, and hosts large model images in the public Microsoft Container Registry (MCR) if the license allows. Using Kaito, the workflow of onboarding large AI inference models in Kubernetes is largely simplified.

PyRIT
PyRIT is an open access automation framework designed to empower security professionals and ML engineers to red team foundation models and their applications. It automates AI Red Teaming tasks to allow operators to focus on more complicated and time-consuming tasks and can also identify security harms such as misuse (e.g., malware generation, jailbreaking), and privacy harms (e.g., identity theft). The goal is to allow researchers to have a baseline of how well their model and entire inference pipeline is doing against different harm categories and to be able to compare that baseline to future iterations of their model. This allows them to have empirical data on how well their model is doing today, and detect any degradation of performance based on future improvements.

tabby
Tabby is a self-hosted AI coding assistant, offering an open-source and on-premises alternative to GitHub Copilot. It boasts several key features: * Self-contained, with no need for a DBMS or cloud service. * OpenAPI interface, easy to integrate with existing infrastructure (e.g Cloud IDE). * Supports consumer-grade GPUs.

spear
SPEAR (Simulator for Photorealistic Embodied AI Research) is a powerful tool for training embodied agents. It features 300 unique virtual indoor environments with 2,566 unique rooms and 17,234 unique objects that can be manipulated individually. Each environment is designed by a professional artist and features detailed geometry, photorealistic materials, and a unique floor plan and object layout. SPEAR is implemented as Unreal Engine assets and provides an OpenAI Gym interface for interacting with the environments via Python.

Magick
Magick is a groundbreaking visual AIDE (Artificial Intelligence Development Environment) for no-code data pipelines and multimodal agents. Magick can connect to other services and comes with nodes and templates well-suited for intelligent agents, chatbots, complex reasoning systems and realistic characters.