Reverse Dependencies of torch-optimizer
The following projects have a declared dependency on torch-optimizer:
- asteroid — PyTorch-based audio source separation toolkit
- bitfount — Machine Learning and Federated Learning Library
- cellpose-omni — cellpose fork developed for omnipose
- cellpose-omni-acdc — cellpose fork developed for omnipose
- crank-vc — crank: a non-parallel voice conversion toolkit
- deep-daze — Deep Daze
- disent — VAE disentanglement framework built with PyTorch Lightning
- eir-dl — no summary
- enot-autodl — AutoDL framework for neural network compression & acceleration
- espnet — ESPnet: end-to-end speech processing toolkit
- high-order-implicit-representation — no summary
- high-order-layers-torch — High order layers in pytorch
- igfold — no summary
- lightning-flash — Your PyTorch AI Factory - Flash enables you to easily configure and run complex AI recipes.
- mosaicml — Composer is a PyTorch library that enables you to train neural networks faster, at lower cost, and to higher accuracy.
- netharn — Train and deploy PyTorch models
- PaPie — A Framework for Joint Learning of Sequence Labeling Tasks, forked from Pie
- perceiver-io — Perceiver IO
- plato-learn — Packaged version of the Plato framework for federated learning research
- protoattend — Protoattend library for interpretable machine learning
- root-tissue-seg-package — An mlf-core prediction package for root tissue segmentation.
- satflow — Satellite Optical Flow
- stVAE — Style transfer variational autoencoder
- tailors-trainer — no summary
- torch-ecg — A Deep Learning Framework for ECG Processing Tasks Based on PyTorch
- torch-yolo3 — YOLO v3 in PyTorch
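A project ends up in this list by declaring torch-optimizer in its packaging metadata. A minimal sketch of such a declaration with setuptools (the package name and version pin are illustrative, not taken from any project above):

```python
# setup.py for a hypothetical project that depends on torch-optimizer.
# Listing it in install_requires is what registers the "declared
# dependency" that reverse-dependency indexes like this one pick up.
from setuptools import setup, find_packages

setup(
    name="my-training-toolkit",      # hypothetical package name
    version="0.1.0",
    packages=find_packages(),
    install_requires=[
        "torch-optimizer",           # the dependency indexed above
        # pin a range if your code relies on a specific API, e.g.:
        # "torch-optimizer>=0.3,<0.4",
    ],
)
```

Projects using `pyproject.toml` instead declare the same thing under `[project] dependencies = ["torch-optimizer"]`; either form produces the declared dependency shown in this listing.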