latex2ai
LaTeX Plugin for Adobe Illustrator
Stars: 174
LaTeX2AI is a plugin for Adobe Illustrator that allows users to use editable text labels typeset in LaTeX inside an Illustrator document. It provides a seamless integration of LaTeX functionality within the Illustrator environment, enabling users to create and edit LaTeX labels, manage item scaling behavior, set global options, and save documents as PDF with included LaTeX labels. The tool simplifies the process of including LaTeX-generated content in Illustrator designs, ensuring accurate scaling and alignment with other elements in the document.
README:
LaTeX2AI is a plugin for Adobe Illustrator (MacOS and Windows) that enables the use of editable text labels typeset in LaTeX inside an Illustrator document.
LaTeX2AI is under the MIT license, see ./LICENSE. If you use LaTeX2AI to create figures for your work, please acknowledge it with a link to the GitHub repository. For example:
- Sketches in this work have been created using the Adobe Illustrator plug-in LaTeX2AI (https://github.com/isteinbrecher/latex2ai).
Feel free to leave a ⭐ on GitHub. You can also add your work to Work that uses LaTeX2AI.
The following software dependencies are required to run LaTeX2AI:
- A LaTeX compiler such as TeX Live or MiKTeX
- Ghostscript
Installation on Windows:
- Download LaTeX2AI from the GitHub release page.
- Unzip the .zip file.
- Copy the user interface folder com.isteinbrecher.latex2ai:
  - Installation only for the current user: copy com.isteinbrecher.latex2ai to C:\Users\<USERNAME>\AppData\Roaming\Adobe\CEP\extensions (the directory might have to be created).
  - Installation for all users (requires administrator privileges): copy com.isteinbrecher.latex2ai to C:\Program Files\Common Files\Adobe\CEP\extensions\.
- Copy the plugin:
  - Installation only for the current user: copy WIN/LaTeX2AI.aip to an arbitrary directory. This directory has to be set as the Adobe Illustrator plugin directory via Edit/Preferences/Plug-ins & Scratch Disks.../Additional Plug-ins Folder/.
  - Installation for all users (requires administrator privileges): copy WIN/LaTeX2AI.aip to C:\Program Files\Adobe\Adobe Illustrator <YOUR VERSION>\Plug-ins\.
- After a restart of Adobe Illustrator, you can display the LaTeX2AI tools with Window/Toolbars/Advanced.
To uninstall LaTeX2AI, delete the files you copied.
Installation on macOS:
- Download LaTeX2AI from the GitHub release page.
- Unzip the file.
- Copy the user interface folder com.isteinbrecher.latex2ai:
  - Installation only for the current user: copy com.isteinbrecher.latex2ai to ~/Library/Application Support/Adobe/CEP/extensions/.
  - Installation for all users (requires administrator privileges): copy com.isteinbrecher.latex2ai to /Library/Application Support/Adobe/CEP/extensions/.
- Copy the plugin:
  - Installation only for the current user: copy macOS/LaTeX2AI.aip to an arbitrary directory. This directory has to be set as the Adobe Illustrator plugin directory via Edit/Preferences/Plug-ins & Scratch Disks.../Additional Plug-ins Folder/.
  - Installation for all users (requires administrator privileges): copy macOS/LaTeX2AI.aip to /Applications/Adobe Illustrator <YOUR VERSION>/Plug-ins/.
- After a restart of Adobe Illustrator, you can display the LaTeX2AI tools with Window/Toolbars/Advanced.
Depending on your system settings, you might get an error message when starting Illustrator.
This can be resolved by explicitly allowing Gatekeeper to run LaTeX2AI (see also this thread). To do so, open the terminal and type:
xattr -d com.apple.quarantine <PATH TO LaTeX2AI.aip>
If you have installed LaTeX2AI for all users, you need to run this command with administrator privileges:
sudo xattr -d com.apple.quarantine <PATH TO LaTeX2AI.aip>
To uninstall LaTeX2AI, delete the files you copied.
LaTeX2AI adds four buttons to the main toolbar:
- Create / Edit: Edit an existing label by clicking on it, or create a new one by clicking somewhere in the document.
- Redo LaTeX2AI labels: This allows for the LaTeX recompilation and/or scaling reset of all existing LaTeX2AI labels.
- LaTeX2AI options: Open a form where the global LaTeX2AI options can be set. Also the LaTeX header can be opened in an external application.
- Save as PDF: Save the current .ai document as a .pdf document with the same name. The LaTeX2AI labels are included in the created .pdf document.
These buttons are the main way of interacting with LaTeX2AI. Additionally, double clicking on a LaTeX2AI label will enable the edit mode for that label.
The following form appears when creating or editing LaTeX2AI labels:
This option defines how a label behaves when its size changes.
Take for example the well-known formula $\sum_{k=0}^{\infty}\frac{x^k}{k!}$
which is placed inside a rectangle:
The green box indicates the size of the LaTeX2AI label and the dot marks the placement of the label. If the size changes due to a change in the LaTeX code, the position of this dot relative to the label will stay the same.
If the previous label is now changed to \displaystyle, we get the following result:
The size of the label changed due to a change in the underlying LaTeX code, but the position (the indicated dot) stayed the same.
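For illustration, the label source before and after this change might look as follows (a minimal sketch; the exact code is simply whatever is typed into the LaTeX2AI label form):
% original label
$\sum_{k=0}^{\infty}\frac{x^k}{k!}$
% same label typeset in display style
$\displaystyle\sum_{k=0}^{\infty}\frac{x^k}{k!}$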
Additionally, there is the possibility of a baseline placement. This will result in a label where the baseline is exactly in the vertical center of the label. This label can now be easily adjusted and snapped to align with another baseline label or Illustrator text:
By default, LaTeX2AI places the LaTeX labels inside Illustrator with a scale of 1:1, i.e., 1pt in the LaTeX label is 1pt in the Illustrator document. For example, if the LaTeX item is created with a font size of 12pt, the font will match Illustrator text set at 12pt. The user can resize the labels like any other Illustrator object; however, be aware that every time the LaTeX code of a label changes, the scale is reset to 1:1.
The recommended way of using LaTeX2AI is to always have items at a scale of 1:1.
By doing so, the exported Illustrator document can simply be included in a LaTeX document with the \includegraphics option scale=1, and the font size of the labels in the figures will exactly match the font size of the document (assuming that the header options are the same).
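For example, assuming the exported file is named figure.pdf (a hypothetical name), it can be included at a 1:1 scale like this:
% include the exported figure at its native size, so 12pt label text matches 12pt document text
\includegraphics[scale=1]{figure.pdf}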
With the LaTeX2AI tool Redo items, one can easily reset the scaling of all LaTeX2AI items in the document.
LaTeX2AI assumes that all Illustrator files in the same directory use the same LaTeX header LaTeX2AI_header.tex (if none exists in the directory, it will be created the first time it is needed).
This header can be edited to include packages and macros needed for the labels.
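As a rough sketch of what such edits can look like (the header file generated by LaTeX2AI may differ; the packages and the macro below are purely illustrative), one might add lines such as:
% illustrative additions to LaTeX2AI_header.tex: packages and macros used by the labels
\usepackage{amsmath}
\usepackage{bm}
\newcommand{\mat}[1]{\bm{#1}} % hypothetical macro, then available in every label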
The .pdf files for the LaTeX labels are stored in the links subfolder of the document directory. It is not required to keep track of the files in the links folder; LaTeX2AI manages and deletes unused label files.
LaTeX2AI currently does not work with Creative Cloud documents. However, if the document is stored in the Creative Cloud Files folder on the disk, LaTeX2AI should work as expected.
An incomplete list of work that uses LaTeX2AI:
- Toenniessen, F.: Die Homotopie der Sphären - Eine Einführung in Spektralsequenzen, Lokalisierungen und Kohomologie-Operationen, Springer Verlag Heidelberg, 2023.
- Steinbrecher, I., Popp, A., Meier, C.: Consistent coupling of positions and rotations for embedding 1D Cosserat beams into 3D solid volumes. Comput Mech (2021), Open Access
- Steinbrecher, I., Mayr, M., Grill, M.J., Kremheller, J., Meier, C., Popp, A.: A mortar-type finite element approach for embedding 1D beams into 3D solid volumes, Comput Mech (2020), 66(6):1377-1398, Open Access
- Steinbrecher, I., Humer, A., Vu-Quoc, L.: On the numerical modeling of sliding beams: A comparison of different approaches, Journal of Sound and Vibration, 408:270-290, Open Access
If you are interested in contributing to LaTeX2AI, we welcome your collaboration. For general questions, feature requests, and bug reports please open an issue.
If you contribute actual code, fork the repository and make the changes in a feature branch.
Depending on the topic and amount of changes you also might want to open an issue.
To merge your changes into the main LaTeX2AI repository, create a pull request to the develop
branch (this branch will be merged into main
with the next release).
A few things to keep in mind:
- Compile a debug build and run the framework tests in Illustrator. You might also consider adding tests for your changes (./src/tests).
- Run the python3 script ./scripts/check_license.py to ensure that all added source files have the correct license header.
- LaTeX2AI uses clang-format to format the C++ code. Make sure to apply clang-format to the changed source files:
  - With the Visual Studio solution in the repository this can be done with Ctrl-K followed by Ctrl-D.
  - On macOS you can use the following command (run in the root directory of LaTeX2AI): find src -iname '*.h' -o -iname '*.cpp' | xargs clang-format -i
- Add a short description of your changes to the Changelog.
- Feel free to add yourself to the ./CONTRIBUTORS file.
Instructions on how to build LaTeX2AI from source can be found here
A detailed changelog can be found here
Similar Open Source Tools
geti-sdk
The Intel® Geti™ SDK is a python package that enables teams to rapidly develop AI models by easing the complexities of model development and enhancing collaboration between teams. It provides tools to interact with an Intel® Geti™ server via the REST API, allowing for project creation, downloading, uploading, deploying for local inference with OpenVINO, setting project and model configuration, launching and monitoring training jobs, and media upload and prediction. The SDK also includes tutorial-style Jupyter notebooks demonstrating its usage.
mosec
Mosec is a high-performance and flexible model serving framework for building ML model-enabled backend and microservices. It bridges the gap between any machine learning models you just trained and the efficient online service API. * **Highly performant** : web layer and task coordination built with Rust 🦀, which offers blazing speed in addition to efficient CPU utilization powered by async I/O * **Ease of use** : user interface purely in Python 🐍, by which users can serve their models in an ML framework-agnostic manner using the same code as they do for offline testing * **Dynamic batching** : aggregate requests from different users for batched inference and distribute results back * **Pipelined stages** : spawn multiple processes for pipelined stages to handle CPU/GPU/IO mixed workloads * **Cloud friendly** : designed to run in the cloud, with the model warmup, graceful shutdown, and Prometheus monitoring metrics, easily managed by Kubernetes or any container orchestration systems * **Do one thing well** : focus on the online serving part, users can pay attention to the model optimization and business logic
RepoAgent
RepoAgent is an LLM-powered framework designed for repository-level code documentation generation. It automates the process of detecting changes in Git repositories, analyzing code structure through AST, identifying inter-object relationships, replacing Markdown content, and executing multi-threaded operations. The tool aims to assist developers in understanding and maintaining codebases by providing comprehensive documentation, ultimately improving efficiency and saving time.
vscode-pddl
The vscode-pddl extension provides comprehensive support for Planning Domain Description Language (PDDL) in Visual Studio Code. It enables users to model planning domains, validate them, industrialize planning solutions, and run planners. The extension offers features like syntax highlighting, auto-completion, plan visualization, plan validation, plan happenings evaluation, search debugging, and integration with Planning.Domains. Users can create PDDL files, run planners, visualize plans, and debug search algorithms efficiently within VS Code.
kaito
Kaito is an operator that automates the AI/ML inference model deployment in a Kubernetes cluster. It manages large model files using container images, avoids tuning deployment parameters to fit GPU hardware by providing preset configurations, auto-provisions GPU nodes based on model requirements, and hosts large model images in the public Microsoft Container Registry (MCR) if the license allows. Using Kaito, the workflow of onboarding large AI inference models in Kubernetes is largely simplified.
agentok
Agentok Studio is a visual tool built for AutoGen, a cutting-edge agent framework from Microsoft and various contributors. It offers intuitive visual tools to simplify the construction and management of complex agent-based workflows. Users can create workflows visually as graphs, chat with agents, and share flow templates. The tool is designed to streamline the development process for creators and developers working on next-generation Multi-Agent Applications.
LlamaIndexTS
LlamaIndex.TS is a data framework for your LLM application. Use your own data with large language models (LLMs, OpenAI ChatGPT and others) in Typescript and Javascript.
unitycatalog
Unity Catalog is an open and interoperable catalog for data and AI, supporting multi-format tables, unstructured data, and AI assets. It offers plugin support for extensibility and interoperates with Delta Sharing protocol. The catalog is fully open with OpenAPI spec and OSS implementation, providing unified governance for data and AI with asset-level access control enforced through REST APIs.
lmql
LMQL is a programming language designed for large language models (LLMs) that offers a unique way of integrating traditional programming with LLM interaction. It allows users to write programs that combine algorithmic logic with LLM calls, enabling model reasoning capabilities within the context of the program. LMQL provides features such as Python syntax integration, rich control-flow options, advanced decoding techniques, powerful constraints via logit masking, runtime optimization, sync and async API support, multi-model compatibility, and extensive applications like JSON decoding and interactive chat interfaces. The tool also offers library integration, flexible tooling, and output streaming options for easy model output handling.
cognita
Cognita is an open-source framework to organize your RAG codebase along with a frontend to play around with different RAG customizations. It provides a simple way to organize your codebase so that it becomes easy to test it locally while also being able to deploy it in a production ready environment. The key issues that arise while productionizing RAG system from a Jupyter Notebook are: 1. **Chunking and Embedding Job** : The chunking and embedding code usually needs to be abstracted out and deployed as a job. Sometimes the job will need to run on a schedule or be trigerred via an event to keep the data updated. 2. **Query Service** : The code that generates the answer from the query needs to be wrapped up in a api server like FastAPI and should be deployed as a service. This service should be able to handle multiple queries at the same time and also autoscale with higher traffic. 3. **LLM / Embedding Model Deployment** : Often times, if we are using open-source models, we load the model in the Jupyter notebook. This will need to be hosted as a separate service in production and model will need to be called as an API. 4. **Vector DB deployment** : Most testing happens on vector DBs in memory or on disk. However, in production, the DBs need to be deployed in a more scalable and reliable way. Cognita makes it really easy to customize and experiment everything about a RAG system and still be able to deploy it in a good way. It also ships with a UI that makes it easier to try out different RAG configurations and see the results in real time. You can use it locally or with/without using any Truefoundry components. However, using Truefoundry components makes it easier to test different models and deploy the system in a scalable way. Cognita allows you to host multiple RAG systems using one app. ### Advantages of using Cognita are: 1. A central reusable repository of parsers, loaders, embedders and retrievers. 2. Ability for non-technical users to play with UI - Upload documents and perform QnA using modules built by the development team. 3. Fully API driven - which allows integration with other systems. > If you use Cognita with Truefoundry AI Gateway, you can get logging, metrics and feedback mechanism for your user queries. ### Features: 1. Support for multiple document retrievers that use `Similarity Search`, `Query Decompostion`, `Document Reranking`, etc 2. Support for SOTA OpenSource embeddings and reranking from `mixedbread-ai` 3. Support for using LLMs using `Ollama` 4. Support for incremental indexing that ingests entire documents in batches (reduces compute burden), keeps track of already indexed documents and prevents re-indexing of those docs.
VoiceStreamAI
VoiceStreamAI is a Python 3-based server and JavaScript client solution for near-realtime audio streaming and transcription using WebSocket. It employs Huggingface's Voice Activity Detection (VAD) and OpenAI's Whisper model for accurate speech recognition. The system features real-time audio streaming, modular design for easy integration of VAD and ASR technologies, customizable audio chunk processing strategies, support for multilingual transcription, and secure sockets support. It uses a factory and strategy pattern implementation for flexible component management and provides a unit testing framework for robust development.
NeMo-Guardrails
NeMo Guardrails is an open-source toolkit for easily adding _programmable guardrails_ to LLM-based conversational applications. Guardrails (or "rails" for short) are specific ways of controlling the output of a large language model, such as not talking about politics, responding in a particular way to specific user requests, following a predefined dialog path, using a particular language style, extracting structured data, and more.
0chain
Züs is a high-performance cloud on a fast blockchain offering privacy and configurable uptime. It uses erasure code to distribute data between data and parity servers, allowing flexibility for IT managers to design for security and uptime. Users can easily share encrypted data with business partners through a proxy key sharing protocol. The ecosystem includes apps like Blimp for cloud migration, Vult for personal cloud storage, and Chalk for NFT artists. Other apps include Bolt for secure wallet and staking, Atlus for blockchain explorer, and Chimney for network participation. The QoS protocol challenges providers based on response time, while the privacy protocol enables secure data sharing. Züs supports hybrid and multi-cloud architectures, allowing users to improve regulatory compliance and security requirements.
RAVE
RAVE is a variational autoencoder for fast and high-quality neural audio synthesis. It can be used to generate new audio samples from a given dataset, or to modify the style of existing audio samples. RAVE is easy to use and can be trained on a variety of audio datasets. It is also computationally efficient, making it suitable for real-time applications.
LLMeBench
LLMeBench is a flexible framework designed for accelerating benchmarking of Large Language Models (LLMs) in the field of Natural Language Processing (NLP). It supports evaluation of various NLP tasks using model providers like OpenAI, HuggingFace Inference API, and Petals. The framework is customizable for different NLP tasks, LLM models, and datasets across multiple languages. It features extensive caching capabilities, supports zero- and few-shot learning paradigms, and allows on-the-fly dataset download and caching. LLMeBench is open-source and continuously expanding to support new models accessible through APIs.
For similar jobs
learnhouse
LearnHouse is an open-source platform that allows anyone to easily provide world-class educational content. It supports various content types, including dynamic pages, videos, and documents. The platform is still in early development and should not be used in production environments. However, it offers several features, such as dynamic Notion-like pages, ease of use, multi-organization support, support for uploading videos and documents, course collections, user management, quizzes, course progress tracking, and an AI-powered assistant for teachers and students. LearnHouse is built using various open-source projects, including Next.js, TailwindCSS, Radix UI, Tiptap, FastAPI, YJS, PostgreSQL, LangChain, and React.
languagemodels
Language Models is a Python package that provides building blocks to explore large language models with as little as 512MB of RAM. It simplifies the usage of large language models from Python, ensuring all inference is performed locally to keep data private. The package includes features such as text completions, chat capabilities, code completions, external text retrieval, semantic search, and more. It outperforms Hugging Face transformers for CPU inference and offers sensible default models with varying parameters based on memory constraints. The package is suitable for learners and educators exploring the intersection of large language models with modern software development.
curriculum
The 'curriculum' repository is an open-source content repository by Enki, providing a community-driven curriculum for education. It follows a contributor covenant code of conduct to ensure a safe and engaging learning environment. The content is licensed under Creative Commons, allowing free use for non-commercial purposes with attribution to Enki and the author.
obsidian-arcana
Arcana is a plugin for Obsidian that offers a collection of AI-powered tools inspired by famous historical figures to enhance creativity and productivity. It includes tools for conversation, text-to-speech transcription, speech-to-text replies, metadata markup, text generation, file moving, flashcard generation, auto tagging, and note naming. Users can interact with these tools using the command palette and sidebar views, with an OpenAI API key required for usage. The plugin aims to assist users in various note-taking and knowledge management tasks within the Obsidian vault environment.
Neurite
Neurite is an innovative project that combines chaos theory and graph theory to create a digital interface that explores hidden patterns and connections for creative thinking. It offers a unique workspace blending fractals with mind mapping techniques, allowing users to navigate the Mandelbrot set in real-time. Nodes in Neurite represent various content types like text, images, videos, code, and AI agents, enabling users to create personalized microcosms of thoughts and inspirations. The tool supports synchronized knowledge management through bi-directional synchronization between mind-mapping and text-based hyperlinking. Neurite also features FractalGPT for modular conversation with AI, local AI capabilities for multi-agent chat networks, and a Neural API for executing code and sequencing animations. The project is actively developed with plans for deeper fractal zoom, advanced control over node placement, and experimental features.
commonplace-bot
Commonplace Bot is a modern representation of the commonplace book, leveraging modern technological advancements in computation, data storage, machine learning, and networking. It aims to capture, engage, and share knowledge by providing a platform for users to collect ideas, quotes, and information, organize them efficiently, engage with the data through various strategies and triggers, and transform the data into new mediums for sharing. The tool utilizes embeddings and cached transformations for efficient data storage and retrieval, flips traditional engagement rules by engaging with the user, and enables users to alchemize raw data into new forms like art prompts. Commonplace Bot offers a unique approach to knowledge management and creative expression.
AI-Prompt-Genius
AI Prompt Genius is a Chrome extension that allows you to curate a custom library of AI prompts. It is built using React web app and Tailwind CSS with DaisyUI components. The extension enables users to create and manage AI prompts for various purposes. It provides a user-friendly interface for organizing and accessing AI prompts efficiently. AI Prompt Genius is designed to enhance productivity and creativity by offering a personalized collection of prompts tailored to individual needs. Users can easily install the extension from the Chrome Web Store and start using it to generate AI prompts for different tasks.
Advanced-GPTs
Nerority's Advanced GPT Suite is a collection of 33 GPTs that can be controlled with natural language prompts. The suite includes tools for various tasks such as strategic consulting, business analysis, career profile building, content creation, educational purposes, image-based tasks, knowledge engineering, marketing, persona creation, programming, prompt engineering, role-playing, simulations, and task management. Users can access links, usage instructions, and guides for each GPT on their respective pages. The suite is designed for public demonstration and usage, offering features like meta-sequence optimization, AI priming, prompt classification, and optimization. It also provides tools for generating articles, analyzing contracts, visualizing data, distilling knowledge, creating educational content, exploring topics, generating marketing copy, simulating scenarios, managing tasks, and more.