
AIOLists
Stremio Addon with a bunch of features to manage lists from MDBList and Trakt
Stars: 85

AIOLists is a stateless open source list management addon for Stremio that allows users to import and manage lists from various sources in one place. It offers unified search, metadata customization, Trakt integration, MDBList integration, external lists import, list sorting, customization options, watchlist updates, RPDB support, genre filtering, discovery lists, and shareable configurations. The addon aims to enhance the list management experience for Stremio users by providing a comprehensive set of features and functionalities.
README:
AIOLists is a stateless open source list management addon for Stremio. The project originated from this post; since then I have continued development to add features I would personally want in a list management addon and fixed bugs reported by users.
- Unified List Management: Import and manage lists from various sources in one place.
- Unified Search: Choose between Cinemeta, Trakt, TMDB, or an aggregated search across all three.
- Metadata: Choose between Cinemeta or TMDB metadata, and pick one of their many supported languages.
- MDBList & Trakt URL Imports: Directly import lists by pasting URLs from MDBList.com and Trakt.tv; no API key or account connection needed.
- Trakt Integration: Connect your Trakt account to access personal lists, watchlist, recommendations, trending, and popular content.
- MDBList Integration: Enter your MDBList API Key and import all your personal lists and watchlists into one place.
- External Lists from Addons: From Letterboxd to anime lists, import the manifest.json of any external addon into AIOLists.
- Sorting: If the sorting option exists, it's there.
- List Customization:
- Change type: Instead of movies/series change it to whatever you want, even make it blank.
- Reorder: Drag and drop to arrange lists as you like.
- Rename: Give custom names to any list for better organization.
- Merge/Split: If a list contains both movies and series you can merge it into a single Stremio row so it doesn't take up more space than it needs to.
- Hide/Show from home view: Hide lists from the home view while still accessing them through the Discover tab.
- Instant Watchlist Updates: Fetches watchlist content on load.
- RPDB Support: Optional RatingPosterDB (RPDB) integration for enhanced poster images across all your lists (requires your own RPDB API key).
- Configurable Genre Filtering: If you add too many lists you might hit the 8 KB manifest size limit. Disabling genre filtering roughly halves the manifest size, so you can fit more lists.
- Discovery Lists: A randomly selected MDBList from a curated set of users; a new random list is delivered every time you refresh the catalog.
- Share Your Setup: Generate a shareable hash of your AIOLists configuration (list order, names, imported addons) to share with others.
Planned improvements and fixes:
- Speed up the loading time of lists in Stremio, Merged and Anime Search
- Rework genre filtering
- Add sorting option for MDBList lists added without a key
- Support for Streams/TV lists from external addons
- Randomize option for lists without sort options
- Better TMDB list support
Maybe features:
- Switching Profiles
- Native Trakt persistence
Due to the stateless nature of this addon, Trakt keys can't automatically update when they expire. I have added an option to make the Trakt connection persistent through Upstash, where you can create a free account. Here's a short guide:
- Create an account, using a temp-mail works fine.
- After logging in you will be prompted to create a database; press Create Database.
- Input a Name and the region closest to you.
- Next -> Next -> Create
- Scroll down to the REST API section, copy UPSTASH_REDIS_REST_URL and UPSTASH_REDIS_REST_TOKEN, and paste them into AIOLists.
Your Trakt tokens are now stored in the Redis database and will automatically refresh when they expire.
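For reference, Upstash's Redis REST API can be exercised with plain curl, which is a quick way to confirm that the URL and token you pasted into AIOLists are valid. The key name below is only an illustration, not the key AIOLists itself uses:
# Store and read back a test value through the Upstash REST API
curl -H "Authorization: Bearer $UPSTASH_REDIS_REST_TOKEN" "$UPSTASH_REDIS_REST_URL/set/aiolists-test/hello"
# -> {"result":"OK"}
curl -H "Authorization: Bearer $UPSTASH_REDIS_REST_TOKEN" "$UPSTASH_REDIS_REST_URL/get/aiolists-test"
# -> {"result":"hello"}
If both requests succeed, the credentials are good and AIOLists can refresh tokens against the same database.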
If you find this project useful, the best way to support me is to star this repository on GitHub!
AIOLists supports several environment variables for advanced configuration:
Create a .env file in the root directory with the following variables:
# TMDB Configuration
TMDB_REDIRECT_URI=your_tmdb_redirect_uri_here
TMDB_BEARER_TOKEN=your_tmdb_bearer_token_here
- TMDB_REDIRECT_URI: Redirect URI for TMDB OAuth. When set, users will be redirected after authentication.
- TMDB_BEARER_TOKEN: Your TMDB Read Access Token. When set, the bearer token field is hidden in the UI and this token is used automatically.
When both TMDB_BEARER_TOKEN and TMDB_REDIRECT_URI are configured, the "Connect to TMDB" button will redirect users directly to the authentication pages instead of showing manual steps.
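If you deploy with Docker (see the self-hosting section below), the same variables can be supplied at runtime instead of being baked into the image. A minimal sketch, reusing the aiolists-addon image built later in this README:
# Pass the TMDB settings directly into the container
docker run -d -p 7000:7000 \
  -e TMDB_REDIRECT_URI=your_tmdb_redirect_uri_here \
  -e TMDB_BEARER_TOKEN=your_tmdb_bearer_token_here \
  --restart unless-stopped aiolists-addon
# Or load them from the .env file created above
docker run -d -p 7000:7000 --env-file .env --restart unless-stopped aiolists-addon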
The easiest way to host this project for free is through Hugging Face Spaces.
Steps:
- Create a Hugging Face account at https://huggingface.co/
- Go to https://huggingface.co/new-space?sdk=docker
- Fill in the Space name and press Create Space
- Scroll down to "Create your Dockerfile" and press "create the Dockerfile" at the bottom of the section.
- Paste in
FROM ghcr.io/sebastianmorel/aiolists:latest
ENV PORT=7860
- Press "Commit new file to main"
- Go to settings and add your TMDB_REDIRECT_URI (username-projectname.hf.space) and TMDB_BEARER_TOKEN
- Wait for it to finish building and you should have your own instance.
Most modern PaaS providers that support Docker can deploy AIOLists.
- Railway: Connect your GitHub repository and let Railway build from the Dockerfile. Set the PORT environment variable if needed (Railway usually injects it).
- Render: Create a new "Web Service", connect your repository, and choose Docker as the environment. Render will build and deploy from the Dockerfile. Set the PORT environment variable.
- Fly.io: Use the flyctl CLI to launch a new app. It can often detect and use your Dockerfile (see the sketch after this list).
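As a minimal sketch of the Fly.io route (assuming flyctl is installed and you are logged in; exact flags can vary between flyctl versions):
# Create the app from the repository root; flyctl picks up the Dockerfile
flyctl launch --no-deploy
# Store the TMDB variables as app secrets (names from the .env section above)
flyctl secrets set TMDB_REDIRECT_URI=your_tmdb_redirect_uri_here TMDB_BEARER_TOKEN=your_tmdb_bearer_token_here
# Build and deploy
flyctl deploy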
Steps to self-host with Docker:
- Clone your fork (or the original repository):
git clone https://github.com/YOUR_USERNAME/AIOLists.git # Replace YOUR_USERNAME if you forked
cd AIOLists
- Build the Docker image:
docker build -t aiolists-addon .
- Run the container:
docker run -d -p 7000:7000 -e NODE_ENV=production --restart unless-stopped aiolists-addon
Your addon will be available at http://YOUR_SERVER_IP:7000. You can then access the configuration panel at http://YOUR_SERVER_IP:7000/configure.
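To verify the container is serving correctly, you can request the addon manifest (assuming AIOLists follows the usual Stremio convention of exposing manifest.json at the root):
# Should return the addon manifest as JSON
curl http://YOUR_SERVER_IP:7000/manifest.json
# If it doesn't, inspect the container logs
docker logs $(docker ps -q --filter ancestor=aiolists-addon)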
You can also run the addon directly with Node.js if you prefer not to use Docker.
- Clone your fork:
git clone https://github.com/YOUR_USERNAME/AIOLists.git
cd AIOLists
- Install dependencies:
npm install --production
- Start the server:
npm run prod
The server will start on port 7000 by default. Access the configuration panel at /configure.
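If port 7000 is already in use, the PORT environment variable mentioned in the hosting sections above should work here as well; a sketch, assuming the server honours PORT:
# Run the addon on port 8080 instead of the default 7000
PORT=8080 npm run prod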
License: MIT