Reverse Dependencies of jaxlib
The following projects have a declared dependency on jaxlib:
- dopamax — Reinforcement learning in pure JAX.
- dopamine-rl — Dopamine: A framework for flexible Reinforcement Learning research
- drpangloss — Python package to fit interferometric data, accelerated using Jax.
- dvoryan — Reproducible and efficient diffusion kurtosis imaging in Python.
- dynamax — Dynamic State Space Models in JAX.
- e3nn-jax — Equivariant convolutional neural networks for the group E(3) of 3 dimensional rotations, translations, and mirrors.
- EasyDeL — An open-source library to make training faster and more optimized in Jax/Flax
- econpizza — Solve nonlinear perfect foresight models with heterogeneous agents
- egnn-jax — E(3) GNN in jax
- einconv — Convolutions as tensor contractions (einsums) for PyTorch
- einshape — DSL-based reshaping library for JAX and other frameworks
- elle — Library of anonymous finite elements with analytic derivatives.
- elrpy — A lightweight package for individual level inference in elections backed by jax.
- emme — Object oriented finite element analysis.
- eulerpi — The Eulerian parameter inference (eulerpi) returns a parameter distribution consistent with the observed data by solving the inverse problem directly. In the case of a one-to-one mapping, this is the true underlying distribution.
- evermore — Differentiable (binned) likelihoods in JAX.
- evofr — Tools for evolutionary forecasting.
- evojax — EvoJAX: Hardware-accelerated Neuroevolution.
- evosax — JAX-Based Evolution Strategies
- evox — evox
- ex2mcmc — Local-Global MCMC kernels: the best of both worlds (NeurIPS 2022)
- exoplanet-core — The compiled backend for exoplanet
- factopy — no summary
- fastax — A Jax based neural network library for research
- fastreg — Fast sparse regressions
- FContin — Numerical continuation using just the function
- fedjax — Federated learning simulation with JAX.
- fedml-afaf — A research and production integrated edge-cloud library for federated/distributed machine learning anywhere at any scale.
- fennel-seed — Light-yields for tracks, and cascades
- fjformer — Embark on a journey of paralleled/unparalleled computational prowess with FJFormer - an arsenal of custom Jax Flax Functions and Utils that elevate your AI endeavors to new heights!
- flaim — Flax Image Models
- flarejax — Pytree modules classes with easy manipulation and serialization
- flashbax — Flashbax is an experience replay library oriented around JAX. Tailored to integrate seamlessly with JAX's Just-In-Time (JIT) compilation.
- flax — Flax: A neural network library for JAX designed for flexibility
- flax-addons — flax addons
- flax-trainer — Flax Trainer
- flaxmodels — A collection of pretrained models in Flax.
- flaxsr — Super Resolution tools with Jax/Flax
- floral — the best neural network library
- flowMC — Normalizing flow enhanced sampler in jax
- fmmax — Fourier modal method with Jax
- folx — Forward Laplacian for JAX
- fouriax — A jax port of auraloss
- funsor — A tensor-like library for functions and distributions
- galilei — the galilei project.
- gaul — no summary
- gdec — Linear decoders for angled grating stimuli
- geometricalgebra — A package for conformal geometric algebra
- glworia — A package for wave-optics lensing calculations
- google-jetstream — JetStream is a throughput and memory optimized engine for LLM inference on XLA devices, starting with TPUs (and GPUs in future -- PRs welcome).
- google-vizier — Open Source Vizier: Distributed service framework for blackbox optimization and research.
- google-vizier-dev — Open Source Vizier: Distributed service framework for blackbox optimization and research.
- gpcm — Implementation of the GPCM and variations
- gpfy — Gaussian process with spherical harmonic features in JAX
- GPJax — Gaussian processes in JAX.
- gpjax-nightly — Didactic Gaussian processes in Jax.
- gplugins — gdsfactory plugins
- grad-info-opt — Implementation of Gradient Information Optimization for efficient and scalable training data selection
- grain — Grain: A library for loading and transforming data for neural network training.
- grain-nightly — Grain: A library for loading and transforming data for neural network training.
- gravlax — Basic training utils for JAX.
- grgrjax — Some generic tools for JAX
- grgrlib — Various insanely helpful functions
- GridPolator — Interpolate a grid of spectroscopic models.
- gsmvi — Implementation of Gaussian score matching for variational inference (arXiv:2307.07849)
- gwpopulation — Unified population inference
- gymnasium — A standard API for reinforcement learning and a diverse set of reference environments (formerly Gym).
- gymnax — JAX-compatible version of OpenAI's gym environments
- halospec — Halo Spectroscopy for JWST/MIRI
- hamux — A Deep Learning framework built around ENERGY
- harmonic — Python package for efficient Bayesian evidence computation
- harmonix — Analytic interferometry of stellar surfaces using spherical harmonics in Jax
- helax — Python package for computing helicity amplitudes
- hgan — A package to infer interpretable dynamics from images of a mechanical system.
- hotaru — High performance Optimizer to extract spike Timing And cell location from calcium imaging data via lineaR impUlse
- hssm — Bayesian inference for hierarchical sequential sampling models.
- hxrate — HX Rate Fitting
- hybridq — Hybrid Simulator for Quantum Circuits
- hypervecs — no summary
- igenerator — Generating random numbers faster than numpy
- ikpls — no summary
- imax — Image augmentation library for Jax
- iminuit — Jupyter-friendly Python frontend for MINUIT2 in C++
- IMNN — Using neural networks to extract sufficient statistics from data by maximising the Fisher information
- impt — Auto-diff Estimator of Lensing Perturbations
- imt-tree_utils — Utilities for working with Pytrees
- invrs-gym — A collection of inverse design challenges
- invrs-opt — Algorithms for inverse design
- jaims — Library for jax based affine-invariant MCMC sampling
- jammer — Library for jax based affine-invariant MCMC sampling
- janus-sim — A JAX Neural Simulator
- jars — Next-generation objects for omics data
- javiche — A JAX wrapper around ceviche to make interoperability easier. In the future it might make sense to update ceviche itself to use JAX internally.
- jax — Differentiate, compile, and transform NumPy code.
- jax-am — GPU-accelerated simulation toolbox for additive manufacturing based on JAX.
- jax-autovmap — Automatically broadcast inputs by dynamically applying jax.vmap
- jax-data — Native data handling for JAX
- jax-dataclasses — Dataclasses + JAX
- jax-ddp — no summary
- jax-dimenet — DimeNet++ in Jax.