
inbox-zero
The world's best AI personal assistant for email. Open source app to help you reach inbox zero fast.
Stars: 8774

Inbox Zero is an open-source email app that helps you reach inbox zero fast with AI assistance. It offers various features such as a newsletter cleaner, AI assistant for auto-responding, archiving, labeling, and forwarding emails, a cold email blocker, email analytics, tracking of new senders and unreplied emails, and a large email finder to free up space. Inbox Zero is built with Next.js, Tailwind CSS, Prisma, Tinybird, Upstash, and Turbo.
README:
Organizes your inbox, pre-drafts replies, and tracks follow‑ups - so you reach inbox zero faster. Open source alternative to Fyxer, but more customisable and secure.
Website · Discord · Issues
To help you spend less time in your inbox, so you can focus on what matters.
- AI Personal Assistant: Organizes your inbox and pre-drafts replies in your tone and style.
- Cursor Rules for email: Explain in plain English how your AI should handle your inbox.
- Reply Zero: Track emails to reply to and those awaiting responses.
- Smart Categories: Automatically categorize every sender.
- Bulk Unsubscriber: One-click unsubscribe and archive emails you never read.
- Cold Email Blocker: Auto‑block cold emails.
- Email Analytics: Track your activity and trends over time.
Learn more in our docs.
Screenshots: AI Assistant · Reply Zero · Gmail client · Bulk Unsubscriber
To request a feature, open a GitHub issue, or join our Discord.
We offer a hosted version of Inbox Zero at https://getinboxzero.com. To self-host follow the steps below.
For a complete guide on deploying Inbox Zero to a VPS using Docker, see our Docker Self-Hosting Guide.
Here's a video on how to set up the project. It covers the same steps as this document, but goes into greater detail on setting up the external services.
- Node.js >= 18.0.0
- pnpm >= 8.6.12
- Docker Desktop (recommended but optional)
Make sure you have the above installed before starting.
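A quick way to confirm the prerequisites are in place (output will vary by machine):

```sh
node --version    # should be >= 18.0.0
pnpm --version    # should be >= 8.6.12
docker --version  # only needed if you use the Docker setup
```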
Several external services are also required; detailed setup instructions for each are below.
Create your own `.env` file from the example supplied:

cp apps/web/.env.example apps/web/.env
cd apps/web
pnpm install
Set the environment variables in the newly created `.env`. You can see a list of required variables in `apps/web/env.ts`.
The required environment variables:

- `AUTH_SECRET` -- can be any random string (try using `openssl rand -hex 32` for a quick secure random string)
- `EMAIL_ENCRYPT_SECRET` -- secret key for encrypting OAuth tokens (try using `openssl rand -hex 32` for a secure key)
- `EMAIL_ENCRYPT_SALT` -- salt for encrypting OAuth tokens (try using `openssl rand -hex 16` for a secure salt)
- `NEXT_PUBLIC_BASE_URL` -- the URL where your app is hosted (e.g., `http://localhost:3000` for local development or `https://yourdomain.com` for production)
- `INTERNAL_API_KEY` -- a secret key for internal API calls (try using `openssl rand -hex 32` for a secure key)
- `UPSTASH_REDIS_URL` -- Redis URL from Upstash (can be empty if you are using Docker Compose)
- `UPSTASH_REDIS_TOKEN` -- Redis token from Upstash (or specify your own random string if you are using Docker Compose)
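A minimal sketch of filling these in from the shell (the values are generated or placeholders; if the keys already exist in `apps/web/.env` from the example file, edit them in place instead of appending):

```sh
# Sketch: generate secrets and append them to apps/web/.env
cat >> apps/web/.env <<EOF
AUTH_SECRET=$(openssl rand -hex 32)
EMAIL_ENCRYPT_SECRET=$(openssl rand -hex 32)
EMAIL_ENCRYPT_SALT=$(openssl rand -hex 16)
INTERNAL_API_KEY=$(openssl rand -hex 32)
NEXT_PUBLIC_BASE_URL=http://localhost:3000
EOF
# UPSTASH_REDIS_URL / UPSTASH_REDIS_TOKEN: copy from Upstash, or set a random
# token and leave the URL empty if you use the bundled Docker Compose Redis.
```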
When using Vercel with Fluid Compute turned off, you should set `MAX_DURATION=300` or lower. See Vercel limits for different plans here.
- `GOOGLE_CLIENT_ID` -- Google OAuth client ID. More info here
- `GOOGLE_CLIENT_SECRET` -- Google OAuth client secret. More info here
Go to Google Cloud. Create a new project if necessary.
Create new credentials:
- If the banner shows up, configure the consent screen (if not, you can do this later)
  - Click the banner, then click `Get Started`.
  - Choose a name for your app, and enter your email.
  - In Audience, choose `External`.
  - Enter your contact information.
  - Agree to the User Data policy and then click `Create`.
  - Return to APIs and Services using the left sidebar.
- Create new credentials:
  - Click the `+ Create Credentials` button. Choose `OAuth client ID`.
  - In `Application type`, choose `Web application`.
  - Choose a name for your web client.
  - In Authorized JavaScript origins, add a URI and enter `http://localhost:3000`.
  - In `Authorized redirect URIs`, enter:
    - `http://localhost:3000/api/auth/callback/google`
    - `http://localhost:3000/api/google/linking/callback`
  - Click `Create`. A popup will show up with the new credentials, including the Client ID and secret.
- Update the `.env` file:
  - Copy the Client ID to `GOOGLE_CLIENT_ID`
  - Copy the Client secret to `GOOGLE_CLIENT_SECRET`
- Update scopes:
  - Go to `Data Access` in the left sidebar (or click the link above)
  - Click `Add or remove scopes`
  - Copy and paste the below into the `Manually add scopes` box:
    https://www.googleapis.com/auth/userinfo.profile
    https://www.googleapis.com/auth/userinfo.email
    https://www.googleapis.com/auth/gmail.modify
    https://www.googleapis.com/auth/gmail.settings.basic
    https://www.googleapis.com/auth/contacts
  - Click `Update`
  - Click `Save` on the Data Access page.
- Add yourself as a test user:
  - Go to Audience
  - In the `Test users` section, click `+ Add users`
  - Enter your email and press `Save`
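For reference, the resulting entries in `apps/web/.env` look something like this (placeholder values shown; use the Client ID and secret from the popup):

```sh
GOOGLE_CLIENT_ID=1234567890-abcdefg.apps.googleusercontent.com
GOOGLE_CLIENT_SECRET=GOCSPX-your-client-secret
```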
- `MICROSOFT_CLIENT_ID` -- Microsoft OAuth client ID
- `MICROSOFT_CLIENT_SECRET` -- Microsoft OAuth client secret
Go to Microsoft Azure Portal. Create a new Azure Active Directory app registration:
- Navigate to Azure Active Directory
- Go to "App registrations" in the left sidebar, or search for it in the search bar
- Click "New registration"
  - Choose a name for your application
  - Under "Supported account types" select "Accounts in any organizational directory (Any Azure AD directory - Multitenant) and personal Microsoft accounts (e.g. Skype, Xbox)"
  - Set the Redirect URI:
    - Platform: Web
    - URL: `http://localhost:3000/api/auth/callback/microsoft`
  - Click "Register"
  - In the "Manage" menu click "Authentication (Preview)"
  - Add the Redirect URI: `http://localhost:3000/api/outlook/linking/callback`
- Get your credentials:
  - The "Application (client) ID" shown is your `MICROSOFT_CLIENT_ID`
  - To get your client secret:
    - Click "Certificates & secrets" in the left sidebar
    - Click "New client secret"
    - Add a description and choose an expiry
    - Click "Add"
    - Copy the secret Value (not the ID) - this is your `MICROSOFT_CLIENT_SECRET`
- Configure API permissions:
  - In the "Manage" menu click "API permissions" in the left sidebar
  - Click "Add a permission"
  - Select "Microsoft Graph"
  - Select "Delegated permissions"
  - Add the following permissions:
    - openid
    - profile
    - User.Read
    - offline_access
    - Mail.ReadWrite
    - Mail.Send
    - Mail.ReadBasic
    - Mail.Read
    - Mail.Read.Shared
    - MailboxSettings.ReadWrite
    - Contacts.ReadWrite
  - Click "Add permissions"
  - Click "Grant admin consent" if you're an admin
- Update your `.env` file with the credentials:
  MICROSOFT_CLIENT_ID=your_client_id_here
  MICROSOFT_CLIENT_SECRET=your_client_secret_here
You need to set an LLM, but you can use a local one too:
For the LLM, you can use Anthropic, OpenAI, or Anthropic on AWS Bedrock. You can also use Ollama by setting the following environment variables:
OLLAMA_BASE_URL=http://localhost:11434/api
NEXT_PUBLIC_OLLAMA_MODEL=phi3
Note: If you need to access an Ollama instance hosted locally while the application is running in Docker, you can use `http://host.docker.internal:11434/api` as the base URL. You might also need to set `OLLAMA_HOST` to `0.0.0.0` in the Ollama configuration file.
You can select the model you wish to use on the `/settings` page of the app.
If you are using local Ollama, you can set it as the default:
DEFAULT_LLM_PROVIDER=ollama
If this is the case, you must also set the `ECONOMY_LLM_PROVIDER` environment variable.
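As a sketch of a fully local setup (assumes Ollama's default port; `phi3` is just the example model from above, and pointing `ECONOMY_LLM_PROVIDER` at Ollama as well is an assumption — use whichever provider you prefer for cheaper calls):

```sh
# Pull the model and confirm Ollama is reachable before starting the app
ollama pull phi3
curl http://localhost:11434/api/tags   # should list the pulled model

# Corresponding entries in apps/web/.env
# OLLAMA_BASE_URL=http://localhost:11434/api
# NEXT_PUBLIC_OLLAMA_MODEL=phi3
# DEFAULT_LLM_PROVIDER=ollama
# ECONOMY_LLM_PROVIDER=ollama
```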
We use Postgres for the database. For Redis, you can use Upstash Redis or set up your own Redis instance.
You can run Postgres & Redis locally using `docker-compose`:
docker-compose up -d # -d will run the services in the background
To run the migrations:
pnpm prisma migrate dev
To run the app locally for development (slower):
pnpm run dev
Or from the project root:
turbo dev
To build and run the app locally in production mode (faster):
pnpm run build
pnpm start
Open http://localhost:3000 to view the app in your browser.
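Putting the local steps together, a typical first run looks roughly like this (a sketch; it assumes Docker Compose provides Postgres and Redis and that `apps/web/.env` is already filled in):

```sh
docker-compose up -d        # start Postgres & Redis in the background
cd apps/web
pnpm install
pnpm prisma migrate dev     # apply database migrations
pnpm run dev                # then open http://localhost:3000
```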
Many features are available only to premium users. To upgrade yourself, make yourself an admin in the `.env`: [email protected]
Then upgrade yourself at: http://localhost:3000/admin.
Follow instructions here.
Set the env var `GOOGLE_PUBSUB_TOPIC_NAME`.
When creating the subscription, select Push, and the URL should look something like: https://www.getinboxzero.com/api/google/webhook?token=TOKEN or https://abc.ngrok-free.app/api/google/webhook?token=TOKEN, where the domain is your domain. Set `GOOGLE_PUBSUB_VERIFICATION_TOKEN` in your `.env` file to the value of `TOKEN`.
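If you prefer the CLI to the console, the topic and push subscription can be created with `gcloud` along these lines (a sketch; the names, domain, and token are placeholders you choose, and the env vars should match what you create):

```sh
# Create the topic (GOOGLE_PUBSUB_TOPIC_NAME should reference this topic,
# e.g. the full path projects/<your-project-id>/topics/gmail-webhook)
gcloud pubsub topics create gmail-webhook

# Create a push subscription pointing at your deployment (or ngrok) URL
gcloud pubsub subscriptions create gmail-webhook-sub \
  --topic=gmail-webhook \
  --push-endpoint="https://your-domain.com/api/google/webhook?token=YOUR_TOKEN"
```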
To run in development, ngrok can be helpful:
ngrok http 3000
# or with an ngrok domain to keep your endpoint stable (set `XYZ`):
ngrok http --domain=XYZ.ngrok-free.app 3000
And then update the webhook endpoint in the Google PubSub subscriptions dashboard.
To start watching emails visit: /api/watch/all
Set up cron jobs to run the following endpoints. The Google watch is necessary; the others are optional.
"crons": [
{
"path": "/api/watch/all",
"schedule": "0 1 * * *"
},
{
"path": "/api/resend/summary/all",
"schedule": "0 16 * * 1"
},
{
"path": "/api/reply-tracker/disable-unused-auto-draft",
"schedule": "0 3 * * *"
}
]
Here are some easy ways to run cron jobs. Upstash is a free, easy option. I could never get the Vercel `vercel.json` crons to work. Open to PRs if you find a fix for that.
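If you run your own server, a plain crontab hitting the endpoints is another option (a sketch, assuming the app is reachable at `https://your-domain.com`; depending on your deployment these routes may require an authorization header or secret):

```sh
# m  h  dom mon dow  command
0    1  *   *   *    curl -s https://your-domain.com/api/watch/all
0    16 *   *   1    curl -s https://your-domain.com/api/resend/summary/all
0    3  *   *   *    curl -s https://your-domain.com/api/reply-tracker/disable-unused-auto-draft
```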
When building the Docker image, you must specify your `NEXT_PUBLIC_BASE_URL` as a build argument. This is because Next.js embeds `NEXT_PUBLIC_*` variables at build time, not runtime.
# For production with your custom domain
docker build \
--build-arg NEXT_PUBLIC_BASE_URL="https://your-domain.com" \
-t inbox-zero \
-f docker/Dockerfile.prod .
# For local development (default)
docker build -t inbox-zero -f docker/Dockerfile.prod .
After building, run the container with your runtime secrets:
docker run -p 3000:3000 \
-e DATABASE_URL="your-database-url" \
-e AUTH_SECRET="your-auth-secret" \
-e GOOGLE_CLIENT_ID="your-google-client-id" \
-e GOOGLE_CLIENT_SECRET="your-google-client-secret" \
# ... other runtime environment variables
inbox-zero
Important: If you need to change `NEXT_PUBLIC_BASE_URL`, you must rebuild the Docker image. It cannot be changed at runtime.
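Rather than passing each secret with `-e`, you can also point Docker at an env file (a sketch, assuming your runtime variables live in `apps/web/.env`):

```sh
docker run -p 3000:3000 --env-file apps/web/.env inbox-zero
```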
For more detailed Docker build instructions and security considerations, see docker/DOCKER_BUILD_GUIDE.md.
You can view open tasks in our GitHub Issues. Join our Discord to discuss tasks and check what's being worked on.
ARCHITECTURE.md explains the architecture of the project (LLM generated).
Alternative AI tools for inbox-zero
Similar Open Source Tools

inbox-zero
Inbox Zero is an open-source email app that helps you reach inbox zero fast with AI assistance. It offers various features such as a newsletter cleaner, AI assistant for auto-responding, archiving, labeling, and forwarding emails, a cold email blocker, email analytics, tracking of new senders and unreplied emails, and a large email finder to free up space. Inbox Zero is built with Next.js, Tailwind CSS, Prisma, Tinybird, Upstash, and Turbo.

iffy
Iffy is a tool for intelligent content moderation at scale, allowing users to keep unwanted content off their platform without the need to manage a team of moderators. It features a Moderation Dashboard to view and manage all moderation activities, User Lifecycle for automatically suspending users with flagged content, Appeals Management for efficient handling of user appeals, and Powerful Rules & Presets to create custom moderation rules based on unique business needs. Users can choose between the managed Iffy Cloud or the free self-hosted Iffy Community version, each offering different features and setups.

shortest
Shortest is a project for local development that helps set up environment variables and services for a web application. It provides a guide for setting up Node.js and pnpm dependencies, configuring services like Clerk, Vercel Postgres, Anthropic, Stripe, and GitHub OAuth, and running the application and tests locally.

phospho
Phospho is a text analytics platform for LLM apps. It helps you detect issues and extract insights from text messages of your users or your app. You can gather user feedback, measure success, and iterate on your app to create the best conversational experience for your users.

cursor-tools
cursor-tools is a CLI tool designed to enhance AI agents with advanced skills, such as web search, repository context, documentation generation, GitHub integration, Xcode tools, and browser automation. It provides features like Perplexity for web search, Gemini 2.0 for codebase context, and Stagehand for browser operations. The tool requires API keys for Perplexity AI and Google Gemini, and supports global installation for system-wide access. It offers various commands for different tasks and integrates with Cursor Composer for AI agent usage.

rclip
rclip is a command-line photo search tool powered by the OpenAI's CLIP neural network. It allows users to search for images using text queries, similar image search, and combining multiple queries. The tool extracts features from photos to enable searching and indexing, with options for previewing results in supported terminals or custom viewers. Users can install rclip on Linux, macOS, and Windows using different installation methods. The repository follows the Conventional Commits standard and welcomes contributions from the community.

llm-vscode
llm-vscode is an extension designed for all things LLM, utilizing llm-ls as its backend. It offers features such as code completion with 'ghost-text' suggestions, the ability to choose models for code generation via HTTP requests, ensuring prompt size fits within the context window, and code attribution checks. Users can configure the backend, suggestion behavior, keybindings, llm-ls settings, and tokenization options. Additionally, the extension supports testing models like Code Llama 13B, Phind/Phind-CodeLlama-34B-v2, and WizardLM/WizardCoder-Python-34B-V1.0. Development involves cloning llm-ls, building it, and setting up the llm-vscode extension for use.

Shell-AI (`shai`) is a CLI utility that enables users to input commands in natural language and receive single-line command suggestions. It leverages natural language understanding and interactive CLI tools to enhance command line interactions. Users can describe tasks in plain English and receive corresponding command suggestions, making it easier to execute commands efficiently. Shell-AI supports cross-platform usage and is compatible with Azure OpenAI deployments, offering a user-friendly and efficient way to interact with the command line.

code2prompt
Code2Prompt is a powerful command-line tool that generates comprehensive prompts from codebases, designed to streamline interactions between developers and Large Language Models (LLMs) for code analysis, documentation, and improvement tasks. It bridges the gap between codebases and LLMs by converting projects into AI-friendly prompts, enabling users to leverage AI for various software development tasks. The tool offers features like holistic codebase representation, intelligent source tree generation, customizable prompt templates, smart token management, Gitignore integration, flexible file handling, clipboard-ready output, multiple output options, and enhanced code readability.

Steel is an open-source browser API designed for AI agents and applications, simplifying the process of building live web agents and browser automation tools. It serves as a core building block for a production-ready, containerized browser sandbox with features like stealth capabilities, text-to-markdown session management, UI for session viewing/debugging, and full browser control through popular automation frameworks. Steel allows users to control, run, and manage a production-ready browser environment via a REST API, offering features such as full browser control, session management, proxy support, extension support, debugging tools, anti-detection mechanisms, resource management, and various browser tools. It aims to streamline complex browsing tasks programmatically, enabling users to focus on their AI applications while Steel handles the underlying complexity.

raycast_api_proxy
The Raycast AI Proxy is a tool that acts as a proxy for the Raycast AI application, allowing users to utilize the application without subscribing. It intercepts and forwards Raycast requests to various AI APIs, then reformats the responses for Raycast. The tool supports multiple AI providers and allows for custom model configurations. Users can generate self-signed certificates, add them to the system keychain, and modify DNS settings to redirect requests to the proxy. The tool is designed to work with providers like OpenAI, Azure OpenAI, Google, and more, enabling tasks such as AI chat completions, translations, and image generation.

ChatOpsLLM
ChatOpsLLM is a project designed to empower chatbots with effortless DevOps capabilities. It provides an intuitive interface and streamlined workflows for managing and scaling language models. The project incorporates robust MLOps practices, including CI/CD pipelines with Jenkins and Ansible, monitoring with Prometheus and Grafana, and centralized logging with the ELK stack. Developers can find detailed documentation and instructions on the project's website.

tiledesk-dashboard
Tiledesk is an open-source live chat platform with integrated chatbots written in Node.js and Express. It is designed to be a multi-channel platform for web, Android, and iOS, and it can be used to increase sales or provide post-sales customer service. Tiledesk's chatbot technology allows for automation of conversations, and it also provides APIs and webhooks for connecting external applications. Additionally, it offers a marketplace for apps and features such as CRM, ticketing, and data export.

pastemax
PasteMax is a modern file viewer application designed for developers to easily navigate, search, and copy code from repositories. It provides features such as file tree navigation, token counting, search capabilities, selection management, sorting options, dark mode, binary file detection, and smart file exclusion. Built with Electron, React, and TypeScript, PasteMax is ideal for pasting code into ChatGPT or other language models. Users can download the application or build it from source, and customize file exclusions. Troubleshooting steps are provided for common issues, and contributions to the project are welcome under the MIT License.

Shortest is an AI-powered natural language end-to-end testing framework built on Playwright. It provides a seamless testing experience by allowing users to write tests in natural language and execute them using Anthropic Claude API. The framework also offers GitHub integration with 2FA support, making it suitable for testing web applications with complex authentication flows. Shortest simplifies the testing process by enabling users to run tests locally or in CI/CD pipelines, ensuring the reliability and efficiency of web applications.