Improve model card: Add pipeline tag, paper link, code link, abstract, and sample usage (#3)
Co-authored-by: Niels Rogge <nielsr@users.noreply.huggingface.co>
README.md CHANGED

```diff
@@ -1,11 +1,45 @@
 ---
-license: mit
 datasets:
 - bayes-group-diffusion/GAS-teachers
+license: mit
 tags:
 - arxiv:2510.17699
+pipeline_tag: unconditional-image-generation
+paper: https://huggingface.co/papers/2510.17699
 ---
 
+# GAS: Improving Discretization of Diffusion ODEs via Generalized Adversarial Solver
+
+This repository contains the models and code for the paper [GAS: Improving Discretization of Diffusion ODEs via Generalized Adversarial Solver](https://huggingface.co/papers/2510.17699).
+
+Code: https://github.com/3145tttt/GAS
+
+
+
+## Abstract
+
+While diffusion models achieve state-of-the-art generation quality, they still suffer from computationally expensive sampling. Recent works address this issue with gradient-based optimization methods that distill a few-step ODE diffusion solver from the full sampling process, reducing the number of function evaluations from dozens to just a few. However, these approaches often rely on intricate training techniques and do not explicitly focus on preserving fine-grained details. In this paper, we introduce the **Generalized Solver (GS)**: a simple parameterization of the ODE sampler that does not require additional training tricks and improves quality over existing approaches. We further combine the original distillation loss with adversarial training, which mitigates artifacts and enhances detail fidelity. We call the resulting method the **Generalized Adversarial Solver (GAS)** and demonstrate its superior performance compared to existing solver training methods under similar resource constraints.
+
+## Usage
+
+To use the trained Generalized Solver (GS) or Generalized Adversarial Solver (GAS) for image generation, you can run the `generate.py` script from the project's GitHub repository.
+
+First, ensure your environment is set up and the necessary data (including teacher models) are downloaded as per the [GitHub instructions](https://github.com/3145tttt/GAS#setup-environment).
+
+Here is an example command to generate 50,000 images using 2 GPUs from a trained GS checkpoint on **CIFAR-10** with four sampling steps:
+
+```bash
+# Generate 50000 images using 2 GPUs and a checkpoint from checkpoint_path
+torchrun --standalone --nproc_per_node=2 generate.py \
+    --config=configs/edm/cifar10.yaml \
+    --outdir=data/teachers/cifar10 \
+    --seeds=50000-99999 \
+    --batch=1024 \
+    --steps=4 \
+    --checkpoint_path=checkpoint_path
+```
+
+Replace `checkpoint_path` with the actual path to your trained GS/GAS model checkpoint. For more details and specific configurations, refer to the [GitHub repository's inference section](https://github.com/3145tttt/GAS#inference-with-trained-gs).
+
 ## Citation
 
 ```bibtex
```
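
The setup step in the Usage section defers to the GitHub instructions for fetching the teacher data. As a minimal sketch, assuming the teacher files are those published in the `bayes-group-diffusion/GAS-teachers` dataset listed in this card's metadata, and that `data/teachers` is the expected local directory (an assumption based on the `--outdir` in the command above; the GitHub setup guide is authoritative), they could be pulled with the Hugging Face CLI:

```bash
# Sketch only: fetch the teacher data referenced in this card's metadata.
# "data/teachers" is an assumed target directory; follow the GitHub setup
# instructions if the repository expects a different layout.
pip install -U "huggingface_hub[cli]"
huggingface-cli download bayes-group-diffusion/GAS-teachers \
    --repo-type dataset \
    --local-dir data/teachers
```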
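
The example command generates 50,000 images, the sample count commonly used for FID evaluation. The repository may ship its own evaluation tooling, which should be preferred; purely as an illustration, and assuming the generated samples land in a single folder of readable image files, FID against a reference folder or precomputed `.npz` statistics (placeholder path below) can be estimated with the third-party `pytorch-fid` package:

```bash
# Illustration with a third-party tool, not part of the GAS repository.
# Assumes the generated images sit in a single folder and that a CIFAR-10
# reference folder (or .npz statistics file) is available at the second path.
pip install pytorch-fid
python -m pytorch_fid data/teachers/cifar10 path/to/cifar10_reference --device cuda:0
```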