
Transformer Tricks


A collection of tricks to simplify and speed up transformer models:

  • Flash normalization (FlashNorm)
  • Slim attention
  • Matrix-shrink
  • Precomputing the first layer
  • Removing weights from skipless transformers

Many of these tricks follow a recent trend of removing parts from neural networks, such as RMSNorm's removal of mean centering from LayerNorm, PaLM's removal of bias parameters, the decoder-only transformer's removal of the encoder stack, and of course the transformer's revolutionary removal of recurrent layers.

For example, our FlashNorm removes the weights from RMSNorm and merges them into the next linear layer, and slim attention removes the entire V-cache from the context memory of MHA transformers.
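The FlashNorm merge relies on a simple identity: applying RMSNorm's elementwise weights `g` and then a linear layer `W` is the same as a weight-free normalization followed by a pre-folded matrix `diag(g) @ W`. A minimal NumPy sketch of that folding (all names and shapes here are illustrative, not the package's actual API):

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8
x = rng.standard_normal(d)        # activation vector
g = rng.standard_normal(d)        # RMSNorm weights
W = rng.standard_normal((d, d))   # next linear layer

def rms(v, eps=1e-6):
    # root-mean-square over the last axis
    return np.sqrt(np.mean(v**2, axis=-1, keepdims=True) + eps)

# baseline: RMSNorm (normalize, then scale by g) followed by the linear layer
y_ref = (x / rms(x) * g) @ W

# FlashNorm-style folding: merge g into the weights once, offline;
# at inference the normalization itself is weight-free
W_folded = g[:, None] * W         # equivalent to diag(g) @ W
y_flash = (x / rms(x)) @ W_folded

assert np.allclose(y_ref, y_flash)
```

The folding is exact (up to floating-point rounding), so it can be applied to pretrained checkpoints without retraining.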


Explainer videos



Installation

Install the transformer tricks package:

pip install transformer-tricks

Alternatively, to run from the latest repo:

git clone https://github.com/OpenMachine-ai/transformer-tricks.git
cd transformer-tricks
python3 -m venv .venv
source .venv/bin/activate
pip3 install --quiet -r requirements.txt

Documentation

Follow the links below for documentation of the Python code in this directory:


Notebooks

The papers are accompanied by the following Jupyter notebooks:

  • Slim attention: Colab
  • Flash normalization: Colab Colab
  • Removing weights from skipless transformers: Colab
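The slim attention notebook builds on one observation: in MHA, both K and V are linear projections of the same input X, so when the key projection is square and invertible, V can be recovered from K and the V-cache becomes redundant. A hedged NumPy sketch of that identity (shapes and names are illustrative only):

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 4, 8                        # n cached tokens, projection dim d
X = rng.standard_normal((n, d))    # layer input
W_K = rng.standard_normal((d, d))  # key projection (assumed square and invertible)
W_V = rng.standard_normal((d, d))  # value projection

K = X @ W_K                        # what the K-cache stores
V = X @ W_V                        # what the V-cache would store

# V = X W_V = (K W_K^{-1}) W_V, so V can be recomputed from K alone
# and the V-cache can be dropped from context memory
V_rec = K @ np.linalg.solve(W_K, W_V)

assert np.allclose(V, V_rec)
```

In practice the product `W_K^{-1} W_V` can be precomputed once per layer, trading a small amount of extra compute for roughly half the KV-cache memory.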

Newsletter

Please subscribe to our newsletter on Substack to get the latest news about this project. We will never send you more than one email per month.



Contributing

We pay cash for high-impact contributions. Please check out CONTRIBUTING for how to get involved.


Sponsors

The Transformer Tricks project is currently sponsored by OpenMachine. We'd love to hear from you if you'd like to join us in supporting this project.


Please give us a ⭐ if you like this repo, and check out TinyFive.


Star History Chart
