fabric
fabric is an open-source framework for augmenting humans using AI. It provides a modular framework for solving specific problems using a crowdsourced set of AI prompts that can be used anywhere.
Fabric is an open-source framework for augmenting humans using AI. It provides a structured approach to breaking down problems into individual components and applying AI to them one at a time. Fabric includes a collection of pre-defined Patterns (prompts) that can be used for a variety of tasks, such as extracting the most interesting parts of YouTube videos and podcasts, writing essays, summarizing academic papers, creating AI art prompts, and more. Users can also create their own custom Patterns. Fabric is designed to be easy to use, with a command-line interface and a variety of helper apps. It is also extensible, allowing users to integrate it with their own AI applications and infrastructure.
README:
Updates • What and Why • Philosophy • Installation • Usage • Examples • Just Use the Patterns • Custom Patterns • Helper Apps • Meta
[!NOTE] February 24, 2025
- Fabric now supports Sonnet 3.7! Update and use -S to select it as your default if you want, or just use the shortcut -m claude-3-7-sonnet-latest. Enjoy!
Since the start of 2023 and the rise of GenAI, we've seen a massive number of AI applications for accomplishing tasks. The technology is powerful, but it's not easy to integrate this functionality into our lives.
Fabric was created to address this by enabling everyone to granularly apply AI to everyday challenges.
Keep in mind that many of the demo videos were recorded when Fabric was Python-based, so remember to use the current install instructions below.
AI isn't a thing; it's a magnifier of a thing. And that thing is human creativity.
We believe the purpose of technology is to help humans flourish, so when we talk about AI we start with the human problems we want to solve.
Our approach is to break problems into individual pieces and then apply AI to them one at a time. See below for some examples.
Prompts are good for this, but the biggest challenge I faced in 2023, which still exists today, is the sheer number of AI prompts out there. We all have prompts that are useful, but it's hard to discover new ones, know if they are good or not, and manage different versions of the ones we like.
One of fabric's primary features is helping people collect and integrate prompts, which we call Patterns, into various parts of their lives.
Fabric has Patterns for all sorts of life and work activities, including:
- Extracting the most interesting parts of YouTube videos and podcasts
- Writing an essay in your own voice with just an idea as an input
- Summarizing opaque academic papers
- Creating perfectly matched AI art prompts for a piece of writing
- Rating the quality of content to see if you want to read/watch the whole thing
- Getting summaries of long, boring content
- Explaining code to you
- Turning bad documentation into usable documentation
- Creating social media posts from any content input
- And a million more…
To install Fabric, you can use the latest release binaries or install it from source.
Windows (AMD64):
https://github.com/danielmiessler/fabric/releases/latest/download/fabric-windows-amd64.exe
macOS (Apple Silicon):
curl -L https://github.com/danielmiessler/fabric/releases/latest/download/fabric-darwin-arm64 > fabric && chmod +x fabric && ./fabric --version
macOS (Intel):
curl -L https://github.com/danielmiessler/fabric/releases/latest/download/fabric-darwin-amd64 > fabric && chmod +x fabric && ./fabric --version
Linux (AMD64):
curl -L https://github.com/danielmiessler/fabric/releases/latest/download/fabric-linux-amd64 > fabric && chmod +x fabric && ./fabric --version
Linux (ARM64):
curl -L https://github.com/danielmiessler/fabric/releases/latest/download/fabric-linux-arm64 > fabric && chmod +x fabric && ./fabric --version
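After downloading a release binary on macOS or Linux, you will usually want it somewhere on your PATH. A minimal sketch (the destination directory is just a common choice, not a requirement):
# Move the downloaded binary onto your PATH (destination is an example)
sudo mv fabric /usr/local/bin/fabric
fabric --version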
NOTE: installing via Homebrew or the Arch Linux package manager makes fabric available as fabric-ai, so add the following alias to your shell startup files to account for this:
alias fabric='fabric-ai'
Homebrew:
brew install fabric-ai
Arch Linux:
yay -S fabric-ai
To install Fabric from source, make sure Go is installed, and then run the following command.
# Install Fabric directly from the repo
go install github.com/danielmiessler/fabric@latest
You may need to set some environment variables in your ~/.bashrc file on Linux or ~/.zshrc file on macOS to be able to run the fabric command. Here is an example of what you can add:
For Intel-based Macs or Linux:
# Golang environment variables
export GOROOT=/usr/local/go
export GOPATH=$HOME/go
# Update PATH to include GOPATH and GOROOT binaries
export PATH=$GOPATH/bin:$GOROOT/bin:$HOME/.local/bin:$PATH
For Apple Silicon-based Macs:
# Golang environment variables
export GOROOT=$(brew --prefix go)/libexec
export GOPATH=$HOME/go
export PATH=$GOPATH/bin:$GOROOT/bin:$HOME/.local/bin:$PATH
Now run the following command:
# Run the setup to set up your directories and keys
fabric --setup
If everything works, you are good to go.
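To confirm the setup worked, a quick sanity check might look like this (summarize is one of the stock Patterns referenced throughout this README):
# List the Patterns installed during setup
fabric --listpatterns
# Pipe some text through a Pattern
echo "Fabric is an open-source framework for augmenting humans using AI." | fabric --pattern summarize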
To add aliases for all your Patterns so you can use them directly as commands (e.g., summarize instead of fabric --pattern summarize), you can add the following to your .zshrc or .bashrc file.
# Loop through all files in the ~/.config/fabric/patterns directory
for pattern_file in $HOME/.config/fabric/patterns/*; do
# Get the base name of the file (i.e., remove the directory path)
pattern_name=$(basename "$pattern_file")
# Create an alias in the form: alias pattern_name="fabric --pattern pattern_name"
alias_command="alias $pattern_name='fabric --pattern $pattern_name'"
# Evaluate the alias command to add it to the current shell
eval "$alias_command"
done
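After reloading your shell (or sourcing the file), each Pattern name works as its own command. For example (article.txt is just a placeholder file; pbpaste is macOS, see the pbpaste section below for alternatives):
# These are equivalent to fabric --pattern <pattern_name>
pbpaste | summarize
cat article.txt | extract_wisdom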
yt() {
if [ "$#" -eq 0 ] || [ "$#" -gt 2 ]; then
echo "Usage: yt [-t | --timestamps] youtube-link"
echo "Use the '-t' flag to get the transcript with timestamps."
return 1
fi
transcript_flag="--transcript"
if [ "$1" = "-t" ] || [ "$1" = "--timestamps" ]; then
transcript_flag="--transcript-with-timestamps"
shift
fi
local video_link="$1"
fabric -y "$video_link" $transcript_flag
}
You can add the below code for the equivalent aliases inside PowerShell by running notepad $PROFILE inside a PowerShell window:
# Path to the patterns directory
$patternsPath = Join-Path $HOME ".config/fabric/patterns"
foreach ($patternDir in Get-ChildItem -Path $patternsPath -Directory) {
$patternName = $patternDir.Name
# Dynamically define a function for each pattern
$functionDefinition = @"
function $patternName {
[CmdletBinding()]
param(
[Parameter(ValueFromPipeline = `$true)]
[string] `$InputObject,
[Parameter(ValueFromRemainingArguments = `$true)]
[String[]] `$patternArgs
)
begin {
# Initialize an array to collect pipeline input
`$collector = @()
}
process {
# Collect pipeline input objects
if (`$InputObject) {
`$collector += `$InputObject
}
}
end {
# Join all pipeline input into a single string, separated by newlines
`$pipelineContent = `$collector -join "`n"
# If there's pipeline input, include it in the call to fabric
if (`$pipelineContent) {
`$pipelineContent | fabric --pattern $patternName `$patternArgs
} else {
# No pipeline input; just call fabric with the additional args
fabric --pattern $patternName `$patternArgs
}
}
}
"@
# Add the function to the current session
Invoke-Expression $functionDefinition
}
# Define the 'yt' function as well
function yt {
[CmdletBinding()]
param(
[Parameter()]
[Alias("timestamps")]
[switch]$t,
[Parameter(Position = 0, ValueFromPipeline = $true)]
[string]$videoLink
)
begin {
$transcriptFlag = "--transcript"
if ($t) {
$transcriptFlag = "--transcript-with-timestamps"
}
}
process {
if (-not $videoLink) {
Write-Error "Usage: yt [-t | --timestamps] youtube-link"
return
}
}
end {
if ($videoLink) {
# Execute and allow output to flow through the pipeline
fabric -y $videoLink $transcriptFlag
}
}
}
This also creates a yt alias that allows you to use yt https://www.youtube.com/watch?v=4b0iet22VIk to get transcripts, comments, and metadata.
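Because yt prints the transcript to stdout, you can pipe it straight into fabric or one of the Pattern aliases above. A small sketch (assuming the shell functions above are loaded):
# Summarize a video's transcript, with timestamps included
yt -t https://www.youtube.com/watch?v=4b0iet22VIk | summarize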
If, in addition to the above aliases, you would like the option to save the output to your favorite Markdown note vault (such as Obsidian), then instead of the above, add the following to your .zshrc or .bashrc file:
# Define the base directory for Obsidian notes
obsidian_base="/path/to/obsidian"
# Loop through all files in the ~/.config/fabric/patterns directory
for pattern_file in ~/.config/fabric/patterns/*; do
# Get the base name of the file (i.e., remove the directory path)
pattern_name=$(basename "$pattern_file")
# Remove any existing alias with the same name
unalias "$pattern_name" 2>/dev/null
# Define a function dynamically for each pattern
eval "
$pattern_name() {
local title=\$1
local date_stamp=\$(date +'%Y-%m-%d')
local output_path=\"\$obsidian_base/\${date_stamp}-\${title}.md\"
# Check if a title was provided
if [ -n \"\$title\" ]; then
# If a title is provided, use the output path
fabric --pattern \"$pattern_name\" -o \"\$output_path\"
else
# If no title is provided, use --stream
fabric --pattern \"$pattern_name\" --stream
fi
}
"
done
This will allow you to use the Patterns as aliases as in the example above, e.g. summarize instead of fabric --pattern summarize --stream. However, if you pass in an extra argument, e.g. summarize "my_article_title", the output will be saved to the destination you set in obsidian_base="/path/to/obsidian" in the format YYYY-MM-DD-my_article_title.md, where the date is generated automatically.
You can change the date format by adjusting the date_stamp format string.
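For example, with the functions above loaded and obsidian_base pointing at your vault (pbpaste is macOS; see the pbpaste section below for alternatives):
# No title argument: streams to the terminal as usual
pbpaste | summarize
# With a title argument: saves to /path/to/obsidian/YYYY-MM-DD-my_article_title.md
pbpaste | summarize "my_article_title"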
If you have the Legacy (Python) version installed and want to migrate to the Go version, here's how you do it. It's basically two steps: 1) uninstall the Python version, and 2) install the Go version.
# Uninstall Legacy Fabric
pipx uninstall fabric
# Clear any old Fabric aliases
(check your .bashrc, .zshrc, etc.)
# Install the Go version
go install github.com/danielmiessler/fabric@latest
# Run setup for the new version. Important because things have changed
fabric --setup
Then set your environment variables as shown above.
The great thing about Go is that it's super easy to upgrade. Just run the same command you used to install it in the first place and you'll always get the latest version.
go install github.com/danielmiessler/fabric@latest
Once you have it all set up, here's how to use it.
fabric -h
Usage:
fabric [OPTIONS]
Application Options:
-p, --pattern= Choose a pattern from the available patterns
-v, --variable= Values for pattern variables, e.g. -v=#role:expert -v=#points:30
-C, --context= Choose a context from the available contexts
--session= Choose a session from the available sessions
-a, --attachment= Attachment path or URL (e.g. for OpenAI image recognition messages)
-S, --setup Run setup for all reconfigurable parts of fabric
-t, --temperature= Set temperature (default: 0.7)
-T, --topp= Set top P (default: 0.9)
-s, --stream Stream
-P, --presencepenalty= Set presence penalty (default: 0.0)
-r, --raw Use the defaults of the model without sending chat options (like temperature etc.) and use the user role instead of the system role for patterns.
-F, --frequencypenalty= Set frequency penalty (default: 0.0)
-l, --listpatterns List all patterns
-L, --listmodels List all available models
-x, --listcontexts List all contexts
-X, --listsessions List all sessions
-U, --updatepatterns Update patterns
-c, --copy Copy to clipboard
-m, --model= Choose model
--modelContextLength= Model context length (only affects ollama)
-o, --output= Output to file
--output-session Output the entire session (also a temporary one) to the output file
-n, --latest= Number of latest patterns to list (default: 0)
-d, --changeDefaultModel Change default model
-y, --youtube= YouTube video or playlist "URL" to grab transcript, comments from it and send to chat or print it to the console and store it in the output file
--playlist Prefer playlist over video if both ids are present in the URL
--transcript Grab transcript from YouTube video and send to chat (used by default).
--transcript-with-timestamps Grab transcript from YouTube video with timestamps and send to chat
--comments Grab comments from YouTube video and send to chat
--metadata Output video metadata
-g, --language= Specify the Language Code for the chat, e.g. -g=en -g=zh
-u, --scrape_url= Scrape website URL to markdown using Jina AI
-q, --scrape_question= Search question using Jina AI
-e, --seed= Seed to be used for LLM generation
-w, --wipecontext= Wipe context
-W, --wipesession= Wipe session
--printcontext= Print context
--printsession= Print session
--readability Convert HTML input into a clean, readable view
--input-has-vars Apply variables to user input
--dry-run Show what would be sent to the model without actually sending it
--serve Serve the Fabric Rest API
--serveOllama Serve the Fabric Rest API with ollama endpoints
--address= The address to bind the REST API (default: :8080)
--config= Path to YAML config file
--version Print current version
--listextensions List all registered extensions
--addextension= Register a new extension from config file path
--rmextension= Remove a registered extension by name
--strategy= Choose a strategy from the available strategies
--liststrategies List all strategies
Help Options:
-h, --help Show this help message
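For instance, several of these flags can be combined in a single call. A small sketch (the model name is only an example; run fabric --listmodels to see what is available in your setup):
# Summarize clipboard content with a chosen model and temperature, saving the result to a file
pbpaste | fabric --pattern summarize -m gpt-4o -t 0.5 -o summary.md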
Fabric Patterns are different from most prompts you'll see.
- First, we use Markdown to help ensure maximum readability and editability. This not only helps the creator make a good one, but also anyone who wants to deeply understand what it does. Importantly, this also includes the AI you're sending it to!
Here's an example of a Fabric Pattern.
https://github.com/danielmiessler/fabric/blob/main/patterns/extract_wisdom/system.md
- Next, we are extremely clear in our instructions, and we use the Markdown structure to emphasize what we want the AI to do, and in what order.
- And finally, we tend to use the System section of the prompt almost exclusively. In over a year of being heads-down with this stuff, we've just seen more efficacy from doing that. If that changes, or we're shown data that says otherwise, we will adjust.
The following examples use the macOS pbpaste command to paste from the clipboard. See the pbpaste section below for Windows and Linux alternatives.
Now let's look at some things you can do with Fabric.
- Run the summarize Pattern based on input from stdin. In this case, the body of an article.
pbpaste | fabric --pattern summarize
- Run the analyze_claims Pattern with the --stream option to get immediate and streaming results.
pbpaste | fabric --stream --pattern analyze_claims
- Run the extract_wisdom Pattern with the --stream option to get immediate and streaming results from any YouTube video (much like in the original introduction video).
fabric -y "https://youtube.com/watch?v=uXs-zPc63kM" --stream --pattern extract_wisdom
- Create Patterns: you must create a .md file with the pattern and save it to ~/.config/fabric/patterns/[yourpatternname].
- Run the analyze_claims Pattern on a website. Fabric uses Jina AI to scrape the URL into Markdown format before sending it to the model.
fabric -u https://github.com/danielmiessler/fabric/ -p analyze_claims
If you're not looking to do anything fancy, and you just want a lot of great prompts, you can navigate to the /patterns directory and start exploring!
We hope that if you used nothing else from Fabric, the Patterns by themselves will make the project useful.
You can use any of the Patterns you see there in any AI application that you have, whether that's ChatGPT or some other app or website. Our plan and prediction is that people will soon be sharing many more than those we've published, and they will be way better than ours.
The wisdom of crowds for the win.
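For example, to grab a Pattern's text for use in another app, you can pull it straight from the repository (the URL below is simply the raw form of the extract_wisdom link shown above; pbcopy is macOS):
# Copy the extract_wisdom system prompt to the clipboard for pasting elsewhere
curl -s https://raw.githubusercontent.com/danielmiessler/fabric/main/patterns/extract_wisdom/system.md | pbcopy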
Fabric also implements prompt strategies such as "Chain of Thought" or "Chain of Draft", which can be used in addition to the basic Patterns.
See the Thinking Faster by Writing Less paper and the Thought Generation section of Learn Prompting for examples of prompt strategies.
Each strategy is available as a small json file in the /strategies directory.
The prompt modification of the strategy is applied to the system prompt and passed on to the LLM in the chat session.
Use fabric -S and select the option to install the strategies in your ~/.config/fabric directory.
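Once installed, a strategy can be layered onto any Pattern with the --strategy flag. A minimal sketch (the strategy name cot is an assumption about what ships in /strategies; check --liststrategies for the real names):
# See which strategies are installed
fabric --liststrategies
# Apply a Chain-of-Thought style strategy on top of a Pattern (strategy name assumed)
pbpaste | fabric --strategy cot --pattern analyze_claims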
You may want to use Fabric to create your own custom Patterns—but not share them with others. No problem!
Just make a directory in ~/.config/custompatterns/ (or wherever) and put your .md files in there.
When you're ready to use them, copy them into ~/.config/fabric/patterns/
You can then use them like any other Patterns, but they won't be public unless you explicitly submit them as Pull Requests to the Fabric project. So don't worry—they're private to you.
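Here is a minimal sketch of that flow, using a hypothetical Pattern named my_pattern and the folder-plus-system.md layout the stock Patterns use (see the extract_wisdom example above):
# Create the custom Pattern (directory name = pattern name)
mkdir -p ~/.config/custompatterns/my_pattern
printf '# IDENTITY and PURPOSE\n\nYou summarize meeting notes.\n' > ~/.config/custompatterns/my_pattern/system.md
# Copy it into the live patterns directory when you're ready to use it
cp -r ~/.config/custompatterns/my_pattern ~/.config/fabric/patterns/
# Use it like any other Pattern
pbpaste | fabric --pattern my_pattern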
Fabric also makes use of some core helper apps (tools) to make it easier to integrate with your various workflows. Here are some examples:
to_pdf is a helper command that converts LaTeX files to PDF format. You can use it like this:
to_pdf input.tex
This will create a PDF file from the input LaTeX file in the same directory.
You can also use it with stdin which works perfectly with the write_latex pattern:
echo "ai security primer" | fabric --pattern write_latex | to_pdfThis will create a PDF file named output.pdf in the current directory.
To install to_pdf, install it the same way as you install Fabric, just with a different repo name.
go install github.com/danielmiessler/fabric/plugins/tools/to_pdf@latest
Make sure you have a LaTeX distribution (like TeX Live or MiKTeX) installed on your system, as to_pdf requires pdflatex to be available in your system's PATH.
code_helper is used in conjunction with the create_coding_feature pattern.
It generates a JSON representation of a directory of code that can be fed into an AI model
with instructions to create a new feature or edit the code in a specified way.
See the Create Coding Feature Pattern README for details.
Install it first using:
go install github.com/danielmiessler/fabric/plugins/tools/code_helper@latest
The examples use the macOS program pbpaste to paste content from the clipboard to pipe into fabric as the input. pbpaste is not available on Windows or Linux, but there are alternatives.
On Windows, you can use the PowerShell command Get-Clipboard from a PowerShell command prompt. If you like, you can also alias it to pbpaste. If you are using classic PowerShell, edit the file ~\Documents\WindowsPowerShell\.profile.ps1, or if you are using PowerShell Core, edit ~\Documents\PowerShell\.profile.ps1 and add the alias,
Set-Alias pbpaste Get-Clipboard
On Linux, you can use xclip -selection clipboard -o to paste from the clipboard. You will likely need to install xclip with your package manager. For Debian-based systems, including Ubuntu:
sudo apt update
sudo apt install xclip -y
You can also create an alias by editing ~/.bashrc or ~/.zshrc and adding the alias,
alias pbpaste='xclip -selection clipboard -o'
Fabric now includes a built-in web interface that provides a GUI alternative to the command-line interface and an out-of-the-box website for those who want to get started with web development or blogging. You can use this app as a GUI interface for Fabric, a ready-to-go blog-site, or a website template for your own projects.
The web/src/lib/content directory includes starter .obsidian/ and templates/ directories, allowing you to open up the web/src/lib/content/ directory as an Obsidian.md vault. You can place your posts in the posts directory when you're ready to publish.
The GUI can be installed by navigating to the web directory and using npm install, pnpm install, or your favorite package manager. Then simply run the development server to start the app.
You will need to run fabric in a separate terminal with the fabric --serve command.
From the fabric project web/ directory:
npm run dev
## or ##
pnpm run dev
## or your equivalent
To run the Streamlit user interface:
# Install required dependencies
pip install -r requirements.txt
# Or manually install dependencies
pip install streamlit pandas matplotlib seaborn numpy python-dotenv pyperclip
# Run the Streamlit app
streamlit run streamlit.py
The Streamlit UI provides a user-friendly interface for:
- Running and chaining patterns
- Managing pattern outputs
- Creating and editing patterns
- Analyzing pattern results
The Streamlit UI supports clipboard operations across different platforms:
- macOS: Uses pbcopy and pbpaste (built-in)
- Windows: Uses the pyperclip library (install with pip install pyperclip)
- Linux: Uses xclip (install with sudo apt-get install xclip or equivalent for your distro)
[!NOTE] Special thanks to the following people for their inspiration and contributions!
- Jonathan Dunn for being the absolute MVP dev on the project, including spearheading the new Go version, as well as the GUI! All this while also being a full-time medical doctor!
- Caleb Sima for pushing me over the edge of whether to make this a public project or not.
- Eugen Eisler and Frederick Ros for their invaluable contributions to the Go version.
- David Peters for his work on the web interface.
- Joel Parish for super useful input on the project's GitHub directory structure.
- Joseph Thacker for the idea of a -c context flag that adds pre-created context in the ./config/fabric/ directory to all Pattern queries.
- Jason Haddix for the idea of a stitch (chained Pattern) to filter content using a local model before sending on to a cloud model, i.e., cleaning customer data using llama2 before sending on to gpt-4 for analysis.
- Andre Guerra for assisting with numerous components to make things simpler and more maintainable.
fabric was created by Daniel Miessler in January of 2024.