Reverse Dependencies of hf-doc-builder
The following projects declare a dependency on hf-doc-builder:
- accelerate — Accelerate
- adapter-transformers — A friendly fork of HuggingFace's Transformers, adding Adapters to PyTorch language models
- alignment-handbook — The Alignment Handbook
- competitions — Hugging Face Competitions
- diffusers — State-of-the-art diffusion in PyTorch and JAX.
- diffusers-unchained — Diffusers
- diffusersv — State-of-the-art diffusion in PyTorch and JAX.
- mw-adapter-transformers — A friendly fork of HuggingFace's Transformers, adding Adapters to PyTorch language models
- nbquarto — A minimal nbdev version, focused on writing quarto extensions
- neurocache — Neurocache: A library for augmenting language models with external memory.
- optimum-habana — Optimum Habana is the interface between the Hugging Face Transformers and Diffusers libraries and Habana's Gaudi processor (HPU). It provides a set of tools enabling easy model loading, training and inference on single- and multi-HPU settings for different downstream tasks.
- optimum-neuron — Optimum Neuron is the interface between the Hugging Face Transformers and Diffusers libraries and AWS Trainium and Inferentia accelerators. It provides a set of tools enabling easy model loading, training and inference on single and multiple Neuron core settings for different downstream tasks.
- peft — Parameter-Efficient Fine-Tuning (PEFT)
- peft-machinify — Parameter-Efficient Fine-Tuning (PEFT)
- setfit — Efficient few-shot learning with Sentence Transformers
- souJpg-diffusers — State-of-the-art diffusion in PyTorch and JAX.
- transformers — State-of-the-art Machine Learning for JAX, PyTorch and TensorFlow
- vfpeft — Parameter-Efficient Fine-Tuning (PEFT)
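For a locally installed project from this list, its declared dependency on hf-doc-builder can be verified with the standard library's `importlib.metadata`. This is a minimal sketch: it only inspects packages installed in the current environment, and the helper name `declares_hf_doc_builder` is illustrative, not part of any of the projects above.

```python
from importlib.metadata import requires, PackageNotFoundError

def declares_hf_doc_builder(project: str) -> bool:
    """Return True if the installed `project` lists hf-doc-builder
    among its declared requirements (including extras)."""
    try:
        reqs = requires(project) or []
    except PackageNotFoundError:
        # Project is not installed in this environment.
        return False
    for req in reqs:
        # A requirement string looks like "hf-doc-builder>=0.3; extra == 'dev'";
        # keep only the name part and normalize underscores to hyphens.
        name = req.split(";")[0].strip()
        for sep in ("=", "<", ">", "!", "~", "[", " "):
            name = name.split(sep)[0]
        if name.replace("_", "-").lower() == "hf-doc-builder":
            return True
    return False
```

For example, `declares_hf_doc_builder("transformers")` returns True in an environment where transformers is installed, since transformers declares hf-doc-builder in its dev extras.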