Reverse Dependencies of flax
The following projects have a declared dependency on flax:
- absl-extra — A wrapper to run and monitor absl app.
- ai2-tango — A library for choreographing your machine learning research.
- alpa — Alpa automatically parallelizes large tensor computation graphs and runs them on a distributed cluster.
- anacal — no summary
- aqtp — Accurate Quantized Training library.
- autobound — no summary
- aws-fortuna — A Library for Uncertainty Quantification.
- axlearn — AXLearn
- baxa — Models and examples built with JAX
- bayesnf — Scalable spatiotemporal prediction with Bayesian neural fields
- benchmarx — Tools for benchmarking optimization methods
- best-package — no summary
- bobbin — Tools for making training loops with flax.linen models.
- bottleneck-transformer-flax — Bottleneck Transformer - Flax
- brax — A differentiable physics engine written in JAX.
- cfrx — Counterfactual Regret Minimization in Jax
- chemise — Wrapper for training flax models
- chromatix — Differentiable computational optics library using JAX!
- ciclo — no summary
- clax — Prebuilt jax classifiers
- cleanrl — High-quality single file implementation of Deep Reinforcement Learning algorithms with research-friendly features
- clip-jax — Training of CLIP in JAX
- clu — Set of libraries for ML training loops in JAX.
- cody-adapter-transformers — A friendly fork of HuggingFace's Transformers, adding Adapters to PyTorch language models
- coix — Inference Combinators in JAX
- continuation-jax — Continuation Methods for Deep Neural Networks.
- coreax — Jax coreset algorithms.
- craftax — An open-world environment for training RL agents
- crikit — Constitutive Relation Inference Toolkit
- cupbearer — A library for mechanistic anomaly detection
- dalle-mini — DALL·E mini - Generate images from a text prompt
- danling — Scaffold for experienced Machine Learning Researchers
- deluca — no summary
- dgenerate — Batch image generation and manipulation tool supporting Stable Diffusion and related techniques / algorithms, with support for video and animated image processing.
- diffqc — Differentiable Quantum Simulator
- diffusers — State-of-the-art diffusion in PyTorch and JAX.
- diffusers-unchained — Diffusers
- diffusersv — State-of-the-art diffusion in PyTorch and JAX.
- dinf — discriminator-based inference for population genetics
- dm-haiku — Haiku is a library for building neural networks in JAX.
- dMO — A package for learning cutting planes for mixed-integer optimization problems.
- dopamine-rl — Dopamine: A framework for flexible Reinforcement Learning research
- dynamax — Dynamic State Space Models in JAX.
- e3nn-jax — Equivariant convolutional neural networks for the group E(3) of 3 dimensional rotations, translations, and mirrors.
- EasyDeL — An open-source library to make training faster and more optimized in Jax/Flax
- evojax — EvoJAX: Hardware-accelerated Neuroevolution.
- evosax — JAX-Based Evolution Strategies
- evox — evox
- fiddle — Fiddle: A Python-first configuration library
- fjformer — Embark on a journey of paralleled/unparalleled computational prowess with FJFormer - an arsenal of custom Jax Flax Functions and Utils that elevate your AI endeavors to new heights!
- flaim — Flax Image Models
- flashbax — Flashbax is an experience replay library oriented around JAX. Tailored to integrate seamlessly with JAX's Just-In-Time (JIT) compilation.
- flax-extra — The package provides extra flexibility to Flax using ideas originated at Trax
- flax-gated-linear-rnn — GatedLinearRNN Model
- flax-trainer — Flax Trainer
- flax-vision-models — A repository of Deep Learning models in Flax
- flaxmodels — A collection of pretrained models in Flax.
- flaxsr — Super Resolution tools with Jax/Flax
- fttjax — Feature Tokenizer + Transformer - JAX
- fusions — Diffusion meets sampling
- galilei — the galilei project.
- gemma-llm — Open weights large language model (LLM) from Google DeepMind.
- git-t5 — Open source machine learning framework for training T5 models on source code in JAX/Flax.
- git-theta — Version control system for machine learning model checkpoints.
- google-jetstream — JetStream is a throughput and memory optimized engine for LLM inference on XLA devices, starting with TPUs (and GPUs in future -- PRs welcome).
- google-vizier — Open Source Vizier: Distributed service framework for blackbox optimization and research.
- google-vizier-dev — Open Source Vizier: Distributed service framework for blackbox optimization and research.
- gplugins — gdsfactory plugins
- gtech-optimus — Optimus library for real-time marketing personalization using RL.
- gymnax — JAX-compatible version of OpenAI's Gym environments
- hamux — A Deep Learning framework built around ENERGY
- harmonic — Python package for efficient Bayesian evidence computation
- Helx — Interoperate among reinforcement learning libraries with jax, pytorch, gym and dm_env
- Helx-agents — Interoperate among reinforcement learning libraries with jax, pytorch, gym and dm_env
- HELX-base — Interoperate among reinforcement learning libraries with jax, pytorch, gym and dm_env
- hj-reachability — Hamilton-Jacobi reachability analysis in JAX.
- horqrux — Jax-based quantum state vector simulator.
- hugging-gan-test — Testing pip
- hyper-nn — Easy hypernetworks in Pytorch and Flax
- impt — Auto-diff Estimator of Lensing Perturbations
- iqa-jax — IQA library for Jax
- jax-codex — COders and DEcoders for jaX.
- jax-dataclasses — Dataclasses + JAX
- jax-dips — Differentiable 3D interfacial PDE solvers written in JAX using the Neural Bootstrapping Method.
- jax-fid — FID computation in Jax/Flax.
- jax-md — Differentiable, Hardware Accelerated, Molecular Dynamics
- jax-models — Unofficial JAX implementations of deep learning research papers
- jax-nca — Neural Cellular Automata (https://distill.pub/2020/growing-ca/ -- Mordvintsev, et al., "Growing Neural Cellular Automata", Distill, 2020) implemented in JAX
- jax-nerf — Jax implementation of neural radiance fields
- jax-resnet — Framework-agnostic library for checking array shapes at runtime.
- jax-sysid — jax-sysid - A Python package for linear and nonlinear system identification and nonlinear regression using Jax.
- jaxcam — no summary
- jaxformers — 'Attention is all you need' in JAX (Flax)
- jaxfss — JAX/Flax implementation of finite-size scaling
- jaxGW — Gravitational wave data analysis tool in Jax
- jaxlie — Matrix Lie groups in JAX
- jaxmarl — Multi-Agent Reinforcement Learning with JAX
- jeometric — Graph Neural Networks in JAX
- jestimator — Implementation of the Amos optimizer from the JEstimator lib.
- jimm — JAX Image Models