Reverse Dependencies of local-attention
The following projects declare a dependency on local-attention:
- audiolm-pytorch — AudioLM - Language Modeling Approach to Audio Generation from Google Research - PyTorch
- audiolm-superfeel — AudioLM - Language Modeling Approach to Audio Generation from Google Research - PyTorch
- CoLT5-attention — Conditionally Routed Attention
- funcodec — FunCodec: A Fundamental, Reproducible and Integrable Open-source Toolkit for Neural Speech Codec
- geidiprime — Paper - PyTorch
- harmonai-tools — Training and inference tools for generative audio models from Harmonai
- linear-attention-transformer — Linear Attention Transformer
- meshgpt-pytorch — MeshGPT PyTorch
- mixture-of-attention — Mixture of Attention
- naturalspeech2-pytorch — Natural Speech 2 - PyTorch
- performer-pytorch — Performer - PyTorch
- reformer-pytorch — Reformer, the Efficient Transformer, PyTorch
- relay-transformer — Relay Transformer, a long-range transformer
- routing-transformer — Routing Transformer (PyTorch)
- rvq-vae-gpt — Yet another attempt at GPT in quantized latent space
- shiba-model — An efficient character-level transformer encoder, pretrained for Japanese
- simple-hierarchical-transformer — Simple Hierarchical Transformer
- simplified-transormer-torch — Paper - PyTorch
- sinkhorn-transformer — Sinkhorn Transformer - Sparse Sinkhorn Attention
- stable-audio-tools — Training and inference tools for generative audio models from Stability AI
- teragpt — Paper - PyTorch
- tikt-performer-pytorch — Performer - PyTorch
- transcript-transformer — Transformers for Transcripts
- zetascale — Rapidly Build, Optimize, and Deploy SOTA AI Models
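To confirm such a declaration in a local environment, one can read a project's installed distribution metadata. Below is a minimal sketch using Python's standard importlib.metadata; reformer-pytorch (taken from the list above) is used as the example project and is assumed to be installed.

```python
# Minimal sketch: check whether an installed project declares
# local-attention in its distribution metadata. The project name
# passed in must be installed in the current environment.
from importlib.metadata import PackageNotFoundError, requires

def declares_local_attention(project: str) -> bool:
    try:
        # requires() returns the raw requirement strings, possibly
        # with version specifiers, or None if none are declared.
        deps = requires(project) or []
    except PackageNotFoundError:
        return False
    return any(dep.startswith("local-attention") for dep in deps)

print(declares_local_attention("reformer-pytorch"))
```

Note that requires() reports what a distribution declares, not what is actually importable at runtime, which matches the "declared dependency" wording used in this listing.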