fabric
fabric is an open-source framework for augmenting humans using AI. It provides a modular framework for solving specific problems using a crowdsourced set of AI prompts that can be used anywhere.
Fabric is an open-source framework for augmenting humans using AI. It provides a structured approach to breaking down problems into individual components and applying AI to them one at a time. Fabric includes a collection of pre-defined Patterns (prompts) that can be used for a variety of tasks, such as extracting the most interesting parts of YouTube videos and podcasts, writing essays, summarizing academic papers, creating AI art prompts, and more. Users can also create their own custom Patterns. Fabric is designed to be easy to use, with a command-line interface and a variety of helper apps. It is also extensible, allowing users to integrate it with their own AI applications and infrastructure.
README:
Updates • What and Why • Philosophy • Installation • Usage • Examples • Just Use the Patterns • Custom Patterns • Helper Apps • Meta
fabric
[!NOTE] November 8, 2024
- Multimodal Support: You can now use `-a` (attachment) for multimodal submissions to OpenAI models that support it. Example: `fabric -a https://path/to/image "Give me a description of this image."`
Since the start of 2023 and GenAI we've seen a massive number of AI applications for accomplishing tasks. It's powerful, but it's not easy to integrate this functionality into our lives.
Fabric was created to address this by enabling everyone to granularly apply AI to everyday challenges.
Keep in mind that many of the demo videos were recorded when Fabric was Python-based, so remember to use the current install instructions below.
AI isn't a thing; it's a magnifier of a thing. And that thing is human creativity.
We believe the purpose of technology is to help humans flourish, so when we talk about AI we start with the human problems we want to solve.
Our approach is to break problems into individual pieces (see below) and then apply AI to them one at a time. See below for some examples.
Prompts are good for this, but the biggest challenge I faced in 2023 (which still exists today) is the sheer number of AI prompts out there. We all have prompts that are useful, but it's hard to discover new ones, know if they are good or not, and manage different versions of the ones we like.
One of `fabric`'s primary features is helping people collect and integrate prompts, which we call Patterns, into various parts of their lives.
Fabric has Patterns for all sorts of life and work activities, including:
- Extracting the most interesting parts of YouTube videos and podcasts
- Writing an essay in your own voice with just an idea as an input
- Summarizing opaque academic papers
- Creating perfectly matched AI art prompts for a piece of writing
- Rating the quality of content to see if you want to read/watch the whole thing
- Getting summaries of long, boring content
- Explaining code to you
- Turning bad documentation into usable documentation
- Creating social media posts from any content input
- And a million more…
To install Fabric, you can use the latest release binaries or install it from source.
# Windows:
curl -L https://github.com/danielmiessler/fabric/releases/latest/download/fabric-windows-amd64.exe > fabric.exe && fabric.exe --version
# MacOS (arm64):
curl -L https://github.com/danielmiessler/fabric/releases/latest/download/fabric-darwin-arm64 > fabric && chmod +x fabric && ./fabric --version
# MacOS (amd64):
curl -L https://github.com/danielmiessler/fabric/releases/latest/download/fabric-darwin-amd64 > fabric && chmod +x fabric && ./fabric --version
# Linux (amd64):
curl -L https://github.com/danielmiessler/fabric/releases/latest/download/fabric-linux-amd64 > fabric && chmod +x fabric && ./fabric --version
# Linux (arm64):
curl -L https://github.com/danielmiessler/fabric/releases/latest/download/fabric-linux-arm64 > fabric && chmod +x fabric && ./fabric --version
Alternatively, to install Fabric from source, make sure Go is installed, and then run the following command.
# Install Fabric directly from the repo
go install github.com/danielmiessler/fabric@latest
You may need to set some environment variables in your `~/.bashrc` file on Linux or `~/.zshrc` file on macOS to be able to run the `fabric` command. Here is an example of what you can add:
For Intel-based Macs or Linux:
# Golang environment variables
export GOROOT=/usr/local/go
export GOPATH=$HOME/go
# Update PATH to include GOPATH and GOROOT binaries
export PATH=$GOPATH/bin:$GOROOT/bin:$HOME/.local/bin:$PATH
For Apple Silicon-based Macs:
# Golang environment variables
export GOROOT=$(brew --prefix go)/libexec
export GOPATH=$HOME/go
export PATH=$GOPATH/bin:$GOROOT/bin:$HOME/.local/bin:$PATH
Now run the following command:
# Run the setup to set up your directories and keys
fabric --setup
If everything works, you are good to go.
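If you want to sanity-check the install, a couple of the flags listed under Usage below are handy, e.g.:
# Confirm the binary runs and that patterns were pulled down during setup
fabric --version
fabric --listpatterns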
To add aliases for all your patterns and use them directly as commands (e.g. `summarize` instead of `fabric --pattern summarize`), you can add the following to your `.zshrc` or `.bashrc` file.
# Loop through all files in the ~/.config/fabric/patterns directory
for pattern_file in $HOME/.config/fabric/patterns/*; do
  # Get the base name of the file (i.e., remove the directory path)
  pattern_name=$(basename "$pattern_file")
  # Create an alias in the form: alias pattern_name="fabric --pattern pattern_name"
  alias_command="alias $pattern_name='fabric --pattern $pattern_name'"
  # Evaluate the alias command to add it to the current shell
  eval "$alias_command"
done
yt() {
  local video_link="$1"
  fabric -y "$video_link" --transcript
}
You can add the below code for the equivalent aliases inside PowerShell by running `notepad $PROFILE` inside a PowerShell window:
# Path to the patterns directory
$patternsPath = Join-Path $HOME ".config/fabric/patterns"

foreach ($patternDir in Get-ChildItem -Path $patternsPath -Directory) {
    $patternName = $patternDir.Name

    # Dynamically define a function for each pattern
    $functionDefinition = @"
function $patternName {
    [CmdletBinding()]
    param(
        [Parameter(ValueFromPipeline = `$true)]
        [string] `$InputObject,
        [Parameter(ValueFromRemainingArguments = `$true)]
        [String[]] `$patternArgs
    )
    begin {
        # Initialize an array to collect pipeline input
        `$collector = @()
    }
    process {
        # Collect pipeline input objects
        if (`$InputObject) {
            `$collector += `$InputObject
        }
    }
    end {
        # Join all pipeline input into a single string, separated by newlines
        `$pipelineContent = `$collector -join "`n"
        # If there's pipeline input, include it in the call to fabric
        if (`$pipelineContent) {
            `$pipelineContent | fabric --pattern $patternName `$patternArgs
        } else {
            # No pipeline input; just call fabric with the additional args
            fabric --pattern $patternName `$patternArgs
        }
    }
}
"@

    # Add the function to the current session
    Invoke-Expression $functionDefinition
}

# Define the 'yt' function as well
function yt {
    [CmdletBinding()]
    param(
        [Parameter(Mandatory = $true)]
        [string]$videoLink
    )
    fabric -y $videoLink --transcript
}
This also creates a `yt` alias that allows you to use `yt https://www.youtube.com/watch?v=4b0iet22VIk` to get transcripts, comments, and metadata.
If, in addition to the above aliases, you would like the option to save the output to your favourite Markdown note vault, like Obsidian, then instead of the above add the following to your `.zshrc` or `.bashrc` file:
# Define the base directory for Obsidian notes
obsidian_base="/path/to/obsidian"

# Loop through all files in the ~/.config/fabric/patterns directory
for pattern_file in ~/.config/fabric/patterns/*; do
  # Get the base name of the file (i.e., remove the directory path)
  pattern_name=$(basename "$pattern_file")

  # Unalias any existing alias with the same name
  unalias "$pattern_name" 2>/dev/null

  # Define a function dynamically for each pattern
  eval "
  $pattern_name() {
    local title=\$1
    local date_stamp=\$(date +'%Y-%m-%d')
    local output_path=\"\$obsidian_base/\${date_stamp}-\${title}.md\"

    # Check if a title was provided
    if [ -n \"\$title\" ]; then
      # If a title is provided, use the output path
      fabric --pattern \"$pattern_name\" -o \"\$output_path\"
    else
      # If no title is provided, use --stream
      fabric --pattern \"$pattern_name\" --stream
    fi
  }
  "
done

yt() {
  local video_link="$1"
  fabric -y "$video_link" --transcript
}
This will allow you to use the patterns as aliases like in the above, for example `summarize` instead of `fabric --pattern summarize --stream`. However, if you pass in an extra argument, like `summarize "my_article_title"`, your output will be saved in the destination that you set in `obsidian_base="/path/to/obsidian"`, in the format `YYYY-MM-DD-my_article_title.md`, where the date gets autogenerated for you. You can tweak the date format by tweaking the `date_stamp` format.
If you have the Legacy (Python) version installed and want to migrate to the Go version, here's how you do it. It's basically two steps: 1) uninstall the Python version, and 2) install the Go version.
# Uninstall Legacy Fabric
pipx uninstall fabric
# Clear any old Fabric aliases
# (check your .bashrc, .zshrc, etc.)
# Install the Go version
go install github.com/danielmiessler/fabric@latest
# Run setup for the new version. Important because things have changed
fabric --setup
Then set your environment variables as shown above.
The great thing about Go is that it's super easy to upgrade. Just run the same command you used to install it in the first place and you'll always get the latest version.
go install github.com/danielmiessler/fabric@latest
Once you have it all set up, here's how to use it.
fabric -h
Usage:
fabric [OPTIONS]
Application Options:
-p, --pattern= Choose a pattern from the available patterns
-v, --variable= Values for pattern variables, e.g. -v=#role:expert -v=#points:30"
-C, --context= Choose a context from the available contexts
--session= Choose a session from the available sessions
-a, --attachment= Attachment path or URL (e.g. for OpenAI image recognition messages)
-S, --setup Run setup for all reconfigurable parts of fabric
-t, --temperature= Set temperature (default: 0.7)
-T, --topp= Set top P (default: 0.9)
-s, --stream Stream
-P, --presencepenalty= Set presence penalty (default: 0.0)
-r, --raw Use the defaults of the model without sending chat options (like temperature etc.) and use the user role instead of the system role for patterns.
-F, --frequencypenalty= Set frequency penalty (default: 0.0)
-l, --listpatterns List all patterns
-L, --listmodels List all available models
-x, --listcontexts List all contexts
-X, --listsessions List all sessions
-U, --updatepatterns Update patterns
-c, --copy Copy to clipboard
-m, --model= Choose model
-o, --output= Output to file
--output-session Output the entire session (also a temporary one) to the output file
-n, --latest= Number of latest patterns to list (default: 0)
-d, --changeDefaultModel Change default model
-y, --youtube= YouTube video "URL" to grab transcript, comments from it and send to chat
--transcript Grab transcript from YouTube video and send to chat (used by default).
--comments Grab comments from YouTube video and send to chat
--metadata Grab metadata from YouTube video and send to chat
-g, --language= Specify the Language Code for the chat, e.g. -g=en -g=zh
-u, --scrape_url= Scrape website URL to markdown using Jina AI
-q, --scrape_question= Search question using Jina AI
-e, --seed= Seed to be used for LLM generation
-w, --wipecontext= Wipe context
-W, --wipesession= Wipe session
--printcontext= Print context
--printsession= Print session
--readability Convert HTML input into a clean, readable view
--serve Initiate the API server
--dry-run Show what would be sent to the model without actually sending it
--version Print current version
Help Options:
-h, --help Show this help message
Fabric Patterns are different than most prompts you'll see.
- First, we use Markdown to help ensure maximum readability and editability. This not only helps the creator make a good one, but also anyone who wants to deeply understand what it does. Importantly, this also includes the AI you're sending it to!
Here's an example of a Fabric Pattern.
https://github.com/danielmiessler/fabric/blob/main/patterns/extract_wisdom/system.md
- Next, we are extremely clear in our instructions, and we use the Markdown structure to emphasize what we want the AI to do, and in what order.
- And finally, we tend to use the System section of the prompt almost exclusively. In over a year of being heads-down with this stuff, we've just seen more efficacy from doing that. If that changes, or we're shown data that says otherwise, we will adjust.
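To make that concrete, here is a minimal sketch of what a Pattern's `system.md` can look like. The section names below mirror the layout of built-in Patterns like the `extract_wisdom` example linked above, but the content itself is purely illustrative:
# IDENTITY and PURPOSE
You are an expert at summarizing any text you are given.
# STEPS
- Read the entire input carefully.
- Identify the main argument and the most important supporting ideas.
# OUTPUT INSTRUCTIONS
- Output a one-sentence summary followed by five bullet points.
- Output only Markdown, with no warnings or notes.
# INPUT
INPUT: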
The following examples use the macOS `pbpaste` command to paste from the clipboard. See the pbpaste section below for Windows and Linux alternatives.
Now let's look at some things you can do with Fabric.
- Run the `summarize` Pattern based on input from `stdin`. In this case, the body of an article.
pbpaste | fabric --pattern summarize
- Run the `analyze_claims` Pattern with the `--stream` option to get immediate and streaming results.
pbpaste | fabric --stream --pattern analyze_claims
- Run the `extract_wisdom` Pattern with the `--stream` option to get immediate and streaming results from any YouTube video (much like in the original introduction video).
fabric -y "https://youtube.com/watch?v=uXs-zPc63kM" --stream --pattern extract_wisdom
- Create your own Patterns: create a `.md` file with the pattern and save it to `~/.config/fabric/patterns/[yourpatternname]` (see the sketch below).
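For instance, a minimal sketch of creating and running a custom Pattern. The name my_pattern is just a placeholder, and the directory-plus-system.md layout follows the built-in Patterns (e.g. extract_wisdom/system.md):
# Each Pattern lives in its own directory containing a system.md file
mkdir -p ~/.config/fabric/patterns/my_pattern
"$EDITOR" ~/.config/fabric/patterns/my_pattern/system.md
# Then use it like any built-in Pattern
pbpaste | fabric --pattern my_pattern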
If you're not looking to do anything fancy, and you just want a lot of great prompts, you can navigate to the `/patterns` directory and start exploring!
We hope that even if you use nothing else from Fabric, the Patterns by themselves will make the project useful.
You can use any of the Patterns you see there in any AI application that you have, whether that's ChatGPT or some other app or website. Our plan and prediction is that people will soon be sharing many more than those we've published, and they will be way better than ours.
The wisdom of crowds for the win.
You may want to use Fabric to create your own custom Patterns but not share them with others. No problem! Just make a directory in `~/.config/custompatterns/` (or wherever) and put your `.md` files in there.
When you're ready to use them, copy them into:
~/.config/fabric/patterns/
You can then use them like any other Patterns, but they won't be public unless you explicitly submit them as Pull Requests to the Fabric project. So don't worry—they're private to you.
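A minimal sketch of that copy step, with a placeholder pattern name:
# Copy your drafts into the live Patterns directory when you're ready to use them
cp -r ~/.config/custompatterns/* ~/.config/fabric/patterns/
# Then run one of them like any other Pattern (name is a placeholder)
fabric --pattern my_private_pattern --stream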
This feature works with all OpenAI and Ollama models but does NOT work with Claude. You can specify your model with the `-m` flag.
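For example (the model name here is only an illustration; use `fabric -L` / `--listmodels` to see what is actually available in your setup):
# Send the summarize Pattern to an explicitly chosen model
pbpaste | fabric --pattern summarize -m gpt-4o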
Fabric also makes use of some core helper apps (tools) to make it easier to integrate with your various workflows. Here are some examples:
`to_pdf` is a helper command that converts LaTeX files to PDF format. You can use it like this:
to_pdf input.tex
This will create a PDF file from the input LaTeX file in the same directory.
You can also use it with stdin, which works perfectly with the `write_latex` pattern:
echo "ai security primer" | fabric --pattern write_latex | to_pdf
This will create a PDF file named `output.pdf` in the current directory.
To install `to_pdf`, install it the same way as you install Fabric, just with a different package path.
go install github.com/danielmiessler/fabric/plugins/tools/to_pdf@latest
Make sure you have a LaTeX distribution (like TeX Live or MiKTeX) installed on your system, as `to_pdf` requires `pdflatex` to be available in your system's PATH.
The examples use the macOS program `pbpaste` to paste content from the clipboard and pipe it into `fabric` as the input. `pbpaste` is not available on Windows or Linux, but there are alternatives.
On Windows, you can use the PowerShell command `Get-Clipboard` from a PowerShell command prompt. If you like, you can also alias it to `pbpaste`. If you are using classic PowerShell, edit the file `~\Documents\WindowsPowerShell\.profile.ps1`, or if you are using PowerShell Core, edit `~\Documents\PowerShell\.profile.ps1`, and add the alias:
Set-Alias pbpaste Get-Clipboard
On Linux, you can use `xclip -selection clipboard -o` to paste from the clipboard. You will likely need to install `xclip` with your package manager. For Debian-based systems, including Ubuntu:
sudo apt update
sudo apt install xclip -y
You can also create an alias by editing `~/.bashrc` or `~/.zshrc` and adding the alias:
alias pbpaste='xclip -selection clipboard -o'
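With that alias in place, the earlier clipboard examples work the same way on Linux, e.g.:
pbpaste | fabric --pattern summarize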
Fabric now includes a built-in web interface that provides a GUI alternative to the command-line interface and an out-of-the-box website for those who want to get started with web development or blogging.
You can use this app as a GUI interface for Fabric, a ready-to-go blog site, or a website template for your own projects.
The `web/src/lib/content` directory includes starter `.obsidian/` and `templates/` directories, allowing you to open up the `web/src/lib/content/` directory as an Obsidian.md vault. You can place your posts in the `posts` directory when you're ready to publish.
The GUI can be installed by navigating to the `web` directory and using `npm install`, `pnpm install`, or your favorite package manager. Then simply run the development server to start the app.
You will need to run fabric in a separate terminal with the `fabric --serve` command.
From the fabric project `web/` directory:
npm run dev
## or ##
pnpm run dev
## or your equivalent
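Putting the two pieces together, a minimal sketch of a local run looks like this (run from your fabric checkout, using two terminals):
# Terminal 1: start the Fabric API server
fabric --serve
# Terminal 2: install dependencies and start the web GUI dev server
cd web
npm install
npm run dev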
To run the Streamlit user interface:
# Install required dependencies
pip install streamlit pandas matplotlib seaborn numpy python-dotenv
# Run the Streamlit app
streamlit run streamlit.py
The Streamlit UI provides a user-friendly interface for:
- Running and chaining patterns
- Managing pattern outputs
- Creating and editing patterns
- Analyzing pattern results
[!NOTE] Special thanks to the following people for their inspiration and contributions!
- Jonathan Dunn for being the absolute MVP dev on the project, including spearheading the new Go version, as well as the GUI! All this while also being a full-time medical doctor!
- Caleb Sima for pushing me over the edge of whether to make this a public project or not.
- Eugen Eisler and Frederick Ros for their invaluable contributions to the Go version
- David Peters for his work on the web interface.
- Joel Parish for super useful input on the project's GitHub directory structure.
- Joseph Thacker for the idea of a `-c` context flag that adds pre-created context in the `./config/fabric/` directory to all Pattern queries.
- Jason Haddix for the idea of a stitch (chained Pattern) to filter content using a local model before sending on to a cloud model, i.e., cleaning customer data using `llama2` before sending on to `gpt-4` for analysis.
- Andre Guerra for assisting with numerous components to make things simpler and more maintainable.
`fabric` was created by Daniel Miessler in January of 2024.