Reverse Dependencies of einops
The following projects have a declared dependency on einops:
- self-rewarding-lm-pytorch — Self Rewarding LM - Pytorch
- self-supervised-dermatology — Unsupervised Pre-Training for Texture Segmentation in Dermatology
- selfclean — A holistic self-supervised data cleaning strategy to detect irrelevant samples, near duplicates and label errors.
- selfextend — SelfExtendAttn - Pytorch
- sensortransformer — Transformer Network for Time-Series and Wearable Sensor Data
- separability — LLM Tools for looking at separability of LLM Capabilities
- seperability — Seperability of LLM Capabilities
- serval-ml-commons — SerVal Machine Learning Commons is a toolbox that eases the development of ML experiments at SerVal.
- sft-dpo-qlora — SFT-DPO-QLora Trainer Package
- shapeaxi — Shape Analysis Exploration and Interpretability
- ShapeChecker — ShapeChecker assists you with tensor manipulation
- shazbot — Sound Hierarchy Attribute Zeitgeist Before Oligarchy Take
- simba-torch — Paper - Pytorch
- simgen — no summary
- simple-deployment — no summary
- simple-hierarchical-transformer — Simple Hierarchical Transformer
- simplified-transormer-torch — Paper - Pytorch
- singd — KFAC-like Structured Inverse-Free Natural Gradient Descent
- singleline-models — ML Models for Single-Line Drawings
- siren-pytorch — Implicit Neural Representations with Periodic Activation Functions
- skillful-nowcasting — PyTorch Skillful Nowcasting GAN Implementation
- sldl — Single-line inference of SOTA deep learning models
- smalldiffusion — A minimal but functional implementation of diffusion model training and sampling
- smplkit — SMPL-KIT: Use SMPL models more easily.
- snac — Multi-Scale Neural Audio Codec
- soft-mixture-of-experts — soft-mixture-of-experts
- soft-moe-pytorch — Soft MoE - Pytorch
- solo-learn — no summary
- soundstorm-pytorch — SoundStorm - Efficient Parallel Audio Generation from Google Deepmind, in Pytorch
- spacy-llm — Integrating LLMs into structured NLP pipelines
- spanda — Utilities to do research in soil spectroscopy harnessing the fastai framework and mindset
- spandrel — Give your project support for a variety of PyTorch model architectures, including auto-detecting model architecture from just .pth files. spandrel gives you arch support.
- spandrel-extra-arches — Implements extra model architectures for spandrel
- spandrel-foss — Give your project support for a variety of PyTorch model architectures, including auto-detecting model architecture from just .pth files. This version of Spandrel is FOSS compliant, as it removes support for model architectures that are under a non-commercial license.
- sparrow-python — no summary
- sparse_autoencoder — Sparse Autoencoder for Mechanistic Interpretability
- sparseml — Libraries for applying sparsification recipes to neural networks with a few lines of code, enabling faster and smaller models
- sparseml-nightly — Libraries for applying sparsification recipes to neural networks with a few lines of code, enabling faster and smaller models
- spbnet — spbnet
- speaker-diarization-pyaudio — A speaker diarization pipeline made with pyannote
- spear-tts-pytorch — Spear-TTS - Pytorch
- speculative-decoding — Speculative Decoding
- speech_user_interface — A universal speech user interface for wrapping applications to provide them with a user interface that uses speech rather than text commands or traditional user interfaces
- speechtokenizer — Unified speech tokenizer for speech language models
- spliceai-pytorch — SpliceAI - Pytorch
- sportslabkit — A Python package for sports analytics.
- srf-attention — Simplex random feature attention in PyTorch for both training and inference
- st-moe-pytorch — ST - Mixture of Experts - Pytorch
- stable-audio-tools — Training and inference tools for generative audio models from Stability AI
- stable-diffusion-sdkit — High-Resolution Image Synthesis with Latent Diffusion Models. This is a wrapper around the original repo, to allow installing via pip.
- stam-pytorch — Space Time Attention Model (STAM) - Pytorch
- starlight-vision — Starlight - unprecedented photorealism × deep level of language understanding
- statecraft — Store, manage and remix states for SSMs and other Stateful models
- stDiff-sc — A diffusion model to impute ST data by learning from scRNA-seq data
- step-kit — STEP, an acronym for Spatial Transcriptomics Embedding Procedure, is a deep learning-based tool for the analysis of single-cell RNA (scRNA-seq) and spatially resolved transcriptomics (SRT) data. STEP introduces a unified approach to process and analyze multiple samples of scRNA-seq data as well as align several sections of SRT data, disregarding location relationships. Furthermore, STEP conducts integrative analysis across different modalities like scRNA-seq and SRT.
- STFD — STFD: Series of deep learning-based foundation models for spatial transcriptomic data analysis
- studiosr — PyTorch library to accelerate super-resolution research
- stuned — Utility code from STAI (https://scalabletrustworthyai.github.io/)
- stylegan2-pytorch — StyleGan2 in Pytorch
- styletts2 — StyleTTS 2: Towards Human-Level Text-to-Speech through Style Diffusion and Adversarial Training with Large Speech Language Models. Original authors: Yinghao Aaron Li, Cong Han, Vinay S. Raghavan, Gavin Mischler, Nima Mesgarani.
- styletts2-fork — Fork of the StyleTTS 2 Python package. StyleTTS 2: Towards Human-Level Text-to-Speech through Style Diffusion and Adversarial Training with Large Speech Language Models. Original authors: Yinghao Aaron Li, Cong Han, Vinay S. Raghavan, Gavin Mischler, Nima Mesgarani, Sidharth Rajaram.
- super-gradients — SuperGradients
- svdiff-pytorch — Implementation of 'SVDiff: Compact Parameter Space for Diffusion Fine-Tuning'
- swarmalator — swarmalator - Pytorch
- swarms-cloud — Swarms Cloud - Pytorch
- swarms-torch — swarms-torch - Pytorch
- swin-transformer-pytorch — Swin Transformer - Pytorch
- SwissArmyTransformer — A transformer-based framework with finetuning as the first class citizen.
- switch-transformers — SwitchTransformers - Pytorch
- t2iadapter — T2I-Adapter
- tab-transformer-pytorch — Tab Transformer - Pytorch
- tacos — Transformer components
- tailors — no summary
- taker — Tools for Transformer Activations Knowledge ExtRaction
- taming-transformers-hugf — Taming Transformers for High-Resolution Image Synthesis, augmented with some utils of hugging-face
- tartanair — TartanAir
- taylor-series-linear-attention — Taylor Series Linear Attention
- tensor-parallel — Automatically shard your large model between multiple GPUs, works without torch.distributed
- tensorneko — Tensor Neural Engine Kompanion. An util library based on PyTorch and PyTorch Lightning.
- tensorneko-util — The Utils for Library TensorNeko.
- teragpt — Paper - Pytorch
- TerraByte — TerraByte - Pytorch
- testgailbot002 — GailBot API
- testgailbotapi — GailBot Test API
- testgailbotapi001 — GailBot Test API
- text-embeddings — zero-vocab or low-vocab embeddings
- tf-bind-transformer — Transformer for Transcription Factor Binding
- tf-lambda — A TensorFlow 2 implementation of LambdaNetworks.
- tf-lambda-resnet — LambdaResNet implementation in TensorFlow 2.2
- tf-x-transformers — TF-X-Transformers - TF2.x
- thebestllmever — andromeda - Pytorch
- threefiner — Threefiner: a text-guided mesh refiner
- tiggy — tiggy
- tikt-performer-pytorch — Performer - Pytorch
- timesformer-pytorch — TimeSformer - Pytorch
- titan-iris — no summary
- tl2 — A personal package for research
- tldream — A tiny little diffusion drawing app
- tnn-pytorch — Toeplitz Neural Network for Sequence Modeling
- tnt-tensorflow — An implementation of Transformer in Transformer for image classification, attention inside local patches