
AIOStreams
Combine all your streams into one addon and display them with consistent formatting, sorting, and filtering
Stars: 174

AIOStreams is a versatile tool that combines streams from various addons into one platform, offering extensive customization options. Users can change result formats, filter results by various criteria, remove duplicates, prioritize services, sort results, specify size limits, and more. The tool scrapes results from selected addons, applies user configurations, and presents the results in a unified manner. It simplifies the process of finding and accessing desired content from multiple sources, enhancing user experience and efficiency.
README:
Combines streams from other addons into one and provides much greater customisation:
- Change the format of the results
- Filter all results by resolution, quality, visual tags, audio tags, and encodes
- Remove duplicate results and prioritise specific services for a given file
- Sort all results by quality, resolution, size, cached, visual tags, audio tags, encodes, seeders, service, and language
- Prioritise or exclude specific languages
- Specify a minimum and/or maximum size
- Limit the number of results per resolution
- Proxy your streams with MediaFlow
You simply configure your options, add API keys for any services you use, then enable whichever addons you want, and install.
The addon will scrape all results from all addons, apply your configuration, and give the results back to you in one go.
[!NOTE] Do not install other addons that you have enabled through this addon. You will only cause unnecessary requests to the addon. I also do not recommend installing/enabling too many addons as they all scrape mostly the same sources.
The addon has parsers for specific addons and can extract the relevant information. It goes through each addon you selected and obtains the results with all the parsed information.
Once it has all the parsed information for each result, it can apply your configured sorting and filtering options.
I wanted to have a single addon that could scrape all the sources I wanted and apply my own custom filters and sorting options. Many addons lack the ability to finely tune how you want your results to be sorted and filtered.
Being able to change the format of every result was also a big factor in creating this addon. I preferred the format of my GDrive addon and wanted to use that format for all my results. This makes it easier to parse the results and explain to less tech-savvy people how to pick the best result.
It also means you only have to install one addon instead of configuring multiple addons.
Furthermore, being able to apply a global filter and sort to all results means that you can get the best results from all sources displayed first, rather than having to check each addon individually.
It currently supports:
- Torrentio
- MediaFusion
- Comet
- Torbox Addon
- Debridio
- Jackettio
- Peerflix
- DMM Cast
- Orion Stremio Addon
- Easynews
- Easynews+
- Stremio GDrive
- Custom: You can input an addon URL and name and it will parse as much information as it can.
[!NOTE] The URL can either be a URL to the manifest.json or the URL without the manifest.json, e.g.
https://torrentio.strem.fun/ or https://torrentio.strem.fun/manifest.json
The addon can display your results in different formats. The formats available are:
- gdrive: Uses the format from this Stremio GDrive addon
- minimalistic-gdrive: A modified version of the gdrive format where the filename is not shown. Emojis are used for languages, and seeders are not shown for cached results.
- torrentio: Uses the format from the Torrentio addon.
- torbox: Uses the format from the Torbox Stremio addon.
Read my Stremio guide.
[!IMPORTANT] Torrentio is disabled on the public instance! However, most users don't need Torrentio, and MediaFusion also provides streams from Torrentio, so try the public instance first; you may not need to self-host.
ElfHosted have been kind enough to host a community instance of AIOStreams.
This community instance does have a rate limit in place, but it is unlikely you will reach it. It also avoids the rate limits of ElfHosted addons like Comet and MediaFusion, as AIOStreams' requests to these addons are routed internally. However, other non-ElfHosted addons may rate limit the community instance.
This addon can be deployed as a Cloudflare Worker.
[!NOTE] Cloudflare Workers cannot make requests to other Cloudflare Workers from the same account. If you have deployed the Stremio GDrive addon already on a Cloudflare account, the AIOStreams worker on the same account will not be able to fetch streams from your Stremio GDrive worker.
[!WARNING] A Cloudflare Worker may get blocked by Torrentio. You may also encounter a build error, in which case you will have to edit the code slightly and lose the functionality of the ADDON_PROXY environment variable.
There are two methods to do this. Method 2 requires Git and Node.js to be installed; Method 1 does not, and only requires a web browser and a Cloudflare account.
Method 1
- Fork my GitHub repository.
- Head to the Cloudflare Dashboard, signing up for an account if needed.
- Click the Create button and call your worker aiostreams
- Click Continue to project after it's done creating
- Go to the Settings tab.
- Scroll down to the Build section, and click Connect on the Git repository option.
- Choose your GitHub account, and the repository you created earlier when forking my repository
- Leave the branch as main
- Build command: npm install && npm run build
- Deploy command: npm run deploy:cloudflare-worker
- Click Connect
- Trigger a redeployment by editing the README file at your fork (you can just add a letter and click commit changes)
- You can find the URL for your Cloudflare Worker by clicking View version in the Deployments tab under the Active deployments section
If you get an error about the node:sqlite module, follow these instructions, editing the code at your forked GitHub repository.
Method 2
- Sign up for a Cloudflare Account
- Install Node.js (I would recommend using package managers e.g. fnm on Windows)
- Install Git
- Run the following commands:
git clone https://github.com/Viren070/AIOStreams.git
cd AIOStreams
npm i
npm run build
npm run deploy:cloudflare-worker
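If the deploy command fails with an authentication error, you will most likely need to log in to Cloudflare through Wrangler first. A minimal sketch, assuming Wrangler is available via the project's dependencies:
# authenticate Wrangler with your Cloudflare account (opens a browser window)
npx wrangler login
# then retry the deployment
npm run deploy:cloudflare-worker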
If you get an error about the node:sqlite module, follow these instructions.
Method 1
To update the addon, go to your forked GitHub repository and click Sync fork. This should trigger a deployment; if not, follow the same steps above to redeploy.
Method 2
To update the addon, you can simply run the following commands to pull the latest changes, build the project, and deploy the worker.
This will update the worker with the latest changes, which may not be stable. If you get the build error about node:sqlite again, follow the instructions linked above.
git pull --rebase
npm run build
npm run deploy:cloudflare-worker
[!NOTE] Use the link below to support me; 33% of your AIOStreams subscription will go to me ❤️
AIOStreams is available as a paid product on ElfHosted. This offers you a no-hassle experience where you can expect things to "just work".
[!TIP] Heroku have a student offer which gives you $13 worth of credit each month for 24 months.
To deploy AIOStreams on Heroku, you can fork this repository and create a new app on the Heroku Dashboard, using GitHub as the deployment method in the Deploy tab and choosing the Node.js buildpack in the Settings tab.
Docker is a quick and convenient way to run this. Official images are available at the ghcr.io and docker.io registries.
Rather than running this on a personal device, you can follow these instructions to run it on a server or VPS. You can use a free VPS from Oracle for this, or some cheap ones found on LowEndBox.
- Download the compose.yaml and .env.sample files:
curl -O https://raw.githubusercontent.com/Viren070/AIOStreams/main/compose.yaml
curl https://raw.githubusercontent.com/Viren070/AIOStreams/main/.env.sample -o .env
- Edit the .env file to your liking.
- Run the following command:
docker compose up -d
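Once the stack is up, a few standard Docker Compose commands are useful for checking on and updating it. A minimal sketch, assuming the service is named aiostreams in compose.yaml:
# check that the container is running
docker compose ps
# follow the logs for the aiostreams service
docker compose logs -f aiostreams
# pull the latest image and recreate the container to update
docker compose pull
docker compose up -d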
For an easy way to get HTTPS (which is required to install Stremio addons outside of localhost), you can use Traefik as a reverse proxy.
Requirements:
- A domain/subdomain with a CNAME/A record pointing to the public IP of the server
- A server with port 443 open
[!TIP] You can use free domains from DuckDNS and Afraid
- Use the compose.traefik.yaml file instead of the compose.yaml file:
curl https://raw.githubusercontent.com/Viren070/AIOStreams/main/compose.traefik.yaml -o compose.yaml
If you haven't already downloaded the .env file, do so now:
curl https://raw.githubusercontent.com/Viren070/AIOStreams/main/.env.sample -o .env
- Ensure that the 'TRAEFIK CONFIGURATION' section in the .env file is filled in.
- Then run the following command:
docker compose up -d
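Once the stack starts, you can sanity-check that Traefik is serving the addon over HTTPS before installing it in Stremio. A minimal sketch, assuming aiostreams.example.com is the domain you pointed at the server and the Traefik service in compose.traefik.yaml is named traefik:
# the configure page should be reachable over HTTPS with a valid certificate
curl -I https://aiostreams.example.com/configure
# if the certificate has not been issued yet, inspect the Traefik logs
docker compose logs traefik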
If you have other self-hosted addons you'd like AIOStreams to use, fill in their URLs in the .env file.
If you are looking for a more complete compose.yaml with more addons, you can use my template here.
You can use the prebuilt images using one of the following commands:
GitHub Container Registry:
docker run -p 8080:3000 ghcr.io/viren070/aiostreams:latest
Docker Hub:
docker run -p 8080:3000 viren070/aiostreams:latest
If you would like to pass one of the environment variables, you can provide the -e flag, e.g. to provide a SECRET_KEY (recommended; see CONFIGURING.md for how to generate a secret key):
docker run -p 8080:3000 -e SECRET_KEY=... viren070/aiostreams:latest
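CONFIGURING.md has the exact requirements for the secret key; one common way to generate a random hex value is with openssl. A minimal sketch, assuming a 64-character hex string is acceptable:
# generate 32 random bytes as a 64-character hex string
SECRET_KEY=$(openssl rand -hex 32)
# pass it to the container along with the port mapping
docker run -p 8080:3000 -e SECRET_KEY=$SECRET_KEY viren070/aiostreams:latest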
If you don't want to use a prebuilt image, or want to build from a commit that isn't tagged with a version yet, you can build the image yourself using the following commands:
git clone https://github.com/Viren070/aiostreams.git
cd aiostreams
docker build -t aiostreams .
docker run -p 8080:3000 aiostreams
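If you would rather keep your settings in a file than pass individual -e flags, Docker can also load them from the same .env file used in the compose instructions above. A minimal sketch:
# load all environment variables from a local .env file
docker run --env-file .env -p 8080:3000 aiostreams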
This addon can be deployed using some free solutions, but these should not be considered permanent solutions and can stop working at any point.
You need Node.js and Git installed. Node v22 and npm v10.9 were used in the development of this project. I cannot guarantee earlier versions will work.
- Clone the project and set it as the current directory
git clone https://github.com/Viren070/AIOStreams.git
cd aiostreams
- Install project dependencies
npm i
- Build project
npm run build
- Run project
npm run start:addon
- Go to http://localhost:3000/configure
You can change the PORT environment variable to change the port that the addon will listen on.
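For example, to listen on a different port for a single run without editing any files, you can set PORT inline (a minimal sketch; see the sample .env file for the full list of variables):
# listen on port 8080 instead of the default 3000
PORT=8080 npm run start:addon
Then open http://localhost:8080/configure instead.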
If you would like an explanation of the configuration options on the /configure page, have a look at this guide for AIOStreams that I made.
Outside of the configuration page, the behaviour of this addon can also be changed with environment variables.
Most users don't need to set any environment variables. However, if you do, the SECRET_KEY is the one you might want to configure. This key enables encrypted manifest URLs, which help protect your API keys.
With encryption, someone who has your manifest URL can't directly see your API keys. However, they can still install the addon using the encrypted URL. Once installed, they can view API keys within other addons' URLs that are contained within AIOStreams' responses, as most addons don’t encrypt their manifest URLs.
Please see CONFIGURING and the sample .env file for a list of environment variables that can be set.
Below, you can find how to set environment variables for the different methods of deployment.
Unfortunately, it is not currently possible to set environment variables for this addon on a Cloudflare Worker. You will have to modify the code directly. You can look in packages/utils/src/settings.ts to change the default values.
You can set environment variables using a .env file in the root of the project.
ADDON_NAME=AIOStreams
ADDON_ID=aiostreams.viren070.com
PORT=3000
SECRET_KEY=your_secret_key
COMET_URL=https://comet.elfhosted.com/
...
- Clone the project and set it as the current directory
git clone https://github.com/Viren070/AIOStreams.git
cd aiostreams
- Install project dependencies
npm i
Now, you can run various aspects of the project in development.
[!NOTE] Most of these commands require that you build the project beforehand. Changes in other packages are not reflected immediately, as they need to be compiled into JavaScript first. Run npm run build to build the project.
To start the addon in development mode, run the following command:
npm run start:addon:dev
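For example, after changing code in one of the other packages, a typical loop is to rebuild and then restart the dev server (a minimal sketch using the commands above):
# recompile the packages so the changes are picked up
npm run build
# restart the addon in development mode
npm run start:addon:dev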
To run the Cloudflare Worker in development mode, run the following command:
npm run start:cloudflare-worker:dev
To run the frontend of the project, run the following command:
npm run start:frontend:dev
To deploy your cloudflare worker, run the following command:
npm run deploy:cloudflare-worker
AIOStreams and its developer do not host, store, or distribute any content that is found using this addon. All content is sourced from publicly available addons. AIOStreams does not endorse or promote piracy in any form. It is the user's responsibility to ensure that their use of this addon is in compliance with their local laws and regulations.
- Thanks to sleeyax/stremio-easynews-addon for the repository structure and dockerfile.
- Thanks to all addon devs for creating the upstream addons that AIOStreams scrapes.
- MediaFlow for MediaFlow Proxy, which is used in this addon to proxy your streams.
- Issue templates were stolen from 5rahim/seanime (You should really try out this app)
Agently Daily News Collector is an open-source project showcasing a workflow powered by the Agent ly AI application development framework. It allows users to generate news collections on various topics by inputting the field topic. The AI agents automatically perform the necessary tasks to generate a high-quality news collection saved in a markdown file. Users can edit settings in the YAML file, install Python and required packages, input their topic idea, and wait for the news collection to be generated. The process involves tasks like outlining, searching, summarizing, and preparing column data. The project dependencies include Agently AI Development Framework, duckduckgo-search, BeautifulSoup4, and PyYAM.