Reverse Dependencies of einops
The following projects have a declared dependency on einops:
- hamburger-pytorch — Hamburger - Pytorch
- harmonai-tools — Training and inference tools for generative audio models from Harmonai
- Havina — Havina is a Python library that can generate knowledge graph triplets from an input text. Its implementation is based on the paper "Language models are open knowledge graphs" with some tweaks to improve performance. Havina can be used to evaluate the language comprehension of AI models or as a tool to extract triplets from text and build knowledge graphs.
- hcpdiff — A universal Stable-Diffusion toolbox
- heliumos.bixi — An open platform for loading large language models.
- hgru — no summary
- hindsight-replay — Hindsight - Pytorch
- HITrack — 3D scene on a monocular video
- hlt-torch — Paper - Pytorch
- hnn-utils — Various utilities used throughout my research
- hordelib — A thin wrapper around ComfyUI to allow use by AI Horde.
- horqrux — Jax-based quantum state vector simulator.
- hourglass-transformer-pytorch — Hourglass Transformer
- hrtx — HRTX - Pytorch
- hsss — Paper - Pytorch
- htm-pytorch — Hierarchical Transformer Memory - Pytorch
- hugging-gan-test — Testing pip
- huixiangdou — Overcoming Group Chat Scenarios with LLM-based Technical Assistance
- hyperbox — Hyperbox: An easy-to-use NAS framework.
- i3dFeatureExtraction — This package helps extract i3D features with a ResNet-50 backbone given a folder of videos
- igfold-pytorch — IgFold - Pytorch
- igniter — no summary
- imagen-pytorch — Imagen - unprecedented photorealism × deep level of language understanding
- imaginAIry — AI imagined images. Pythonic generation of stable diffusion images.
- imaginarium — no summary
- incendio — A mini-library of PyTorch utilities, wrappers, and miscellaneous tools for deep learning.
- infini-torch — infini - Pytorch
- infini-transformer-pytorch — Infini-Transformer in Pytorch
- iNNsole — This is a placeholder wheel until we get a usable version of this package working. Do not use.
- inox — Stainless neural networks in JAX
- instruct-goose — Implementation of Reinforcement Learning from Human Feedback (RLHF)
- Invariant-Attention — An implementation of Invariant Point Attention from Deepmind's Alphafold 2
- invariant-point-attention — Invariant Point Attention
- invoice-parser — Tools for parsing and extracting information from invoices.
- invokeai — An implementation of Stable Diffusion which provides various new features and options to aid the image generation process
- iqa-torch — Image quality assessment toolbox for Pytorch.
- iqapt — An Image Quality Assessment Package based on PyTorch
- irisml-tasks-llava — Irisml adapter tasks for LLAVA models
- isab — An implementation of Set Transformer: A Framework for Attention-based Permutation-Invariant Neural Networks in TensorFlow
- isab-pytorch — Induced Set Attention Block - Pytorch
- iTransformer — iTransformer - Inverted Transformers Are Effective for Time Series Forecasting
- ITTR-pytorch — ITTR - Implementation of the Hybrid Perception Block and Dual-Pruned Self-Attention block
- ivy — The unified machine learning framework, enabling framework-agnostic functions, layers and libraries.
- ivy-web — Run the unified AI framework Ivy on the web.
- jacksung — no summary
- jam_data — some data utils
- jam-dist — Distribution Research Toolbox
- jammy — A Versatile ToolBox
- jax-cfd — no summary
- jax-md — Differentiable, Hardware Accelerated, Molecular Dynamics
- jax-metrics — no summary
- jax-nca — Neural Cellular Automata (https://distill.pub/2020/growing-ca/ -- Mordvintsev, et al., "Growing Neural Cellular Automata", Distill, 2020) implemented in JAX
- jax-relax — JAX-based Recourse Explanation Library
- jaxtorch — A jax based nn library
- JJukE — Framework and utilities for Deep Learning models with Pytorch by JJukE
- jjuke-diffusion — Use diffusions for various models
- k-diffusion — Karras et al. (2022) diffusion models for PyTorch
- kani-vision — Kani extension for supporting vision-language models (VLMs). Comes with model-agnostic support for GPT-Vision and LLaVA.
- kappadata — pytorch dataset wrappers for in-memory caching
- kappamodules — efficient building blocks for pytorch models
- kappautils — utilities for training machine learning models with pytorch
- kindle — Kindle - Making a PyTorch model easier than ever!
- kolsol — Pseudospectral Kolmogorov Flow Solver
- koopmanlab — A library for Koopman Neural Operator with Pytorch
- kosmos-2 — kosmos-2 - Pytorch
- kosmosg — kosmosg - Pytorch
- KosmosX — Transformers at zeta scales
- kronecker-attention-pytorch — Kronecker Attention - Pytorch
- kronfluence — Influence Functions with (Eigenvalue-corrected) Kronecker-factored Approximate Curvature
- kubiki — kubiki
- labml-nn — 🧑🏫 Implementations/tutorials of deep learning papers with side-by-side notes 📝; including transformers (original, xl, switch, feedback, vit), optimizers (adam, radam, adabelief), gans(dcgan, cyclegan, stylegan2), 🎮 reinforcement learning (ppo, dqn), capsnet, distillation, diffusion, etc. 🧠
- labml-python-autocomplete — A simple model that learns to predict Python source code
- lambda-networks — Lambda Networks - Pytorch
- langchain-llm — langchain llm wrapper
- languru — The general purpose LLM app stacks.
- last-asr — The LAttice-based Speech Transducer (LAST) library
- latentscope — Quickly embed, project, cluster and explore a dataset.
- latentshift — A method to generate counterfactuals
- lczerolens — Interpretability for LeelaChessZero networks.
- legrad-torch — LeGrad
- lerobot — Le robot is learning
- libcom — Image Composition Toolbox
- lie-transformer-pytorch — Lie Transformer - Pytorch
- light-cnns — Implementation of Lightweight Network in Pytorch
- lightning-attn — no summary
- lightning-thunder — Lightning Thunder project.
- lightning-uq-box — Lightning-UQ-Box: A toolbox for uncertainty quantification in deep learning
- lightweight-gan — Lightweight GAN
- lilac — Organize unstructured data
- limoe — LiMoE - Pytorch
- linear-attention-transformer — Linear Attention Transformer
- liquidnet — Liquid Net - Pytorch
- lisa-on-cuda — no summary
- lit-saint — Pytorch Lightning implementation of SAINT Model
- litGPT — Hackable implementation of state-of-the-art open-source LLMs
- llama-index-embeddings-nomic — llama-index embeddings nomic integration
- llamatune — Haven's Tuning Library for LLM finetuning
- llava-torch — Towards GPT-4 like large language and visual assistant.
- llm-foundry — LLM Foundry
- LLM-keyword-extractor — This is a python package to extract keywords from a given text using LLMs