Diffusers contains multiple pre-built schedule functions for the diffusion process.

## What is a scheduler?

The schedule functions, denoted *Schedulers* in the library, take in the output of a trained model, a sample which the diffusion process is iterating on, and a timestep, and return a denoised sample.
- Schedulers define the methodology for iteratively adding noise to an image or for updating a sample based on model outputs.
- Schedulers are often defined by a *noise schedule* and an *update rule* to solve the differential equation.
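
The interaction described above can be sketched as a generic denoising loop. This is a hedged illustration only: `model`, `scheduler`, and the `step(...)` signature below are stand-ins, not the library's exact API.

```python
# Hypothetical sketch of the denoising loop described above: the trained model
# predicts, and the scheduler's update rule produces the next sample.
def denoise(model, scheduler, sample, timesteps):
    for t in timesteps:
        model_output = model(sample, t)                   # model prediction
        sample = scheduler.step(model_output, t, sample)  # scheduler's update rule
    return sample
```

The key point is the separation of concerns: the loop never needs to know which noise schedule or update rule the scheduler implements.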
24
25
25
26
### Discrete versus continuous schedulers
All schedulers take in a timestep to predict the updated version of the sample being diffused.
The timesteps dictate where in the diffusion process the step is: data is generated by iterating forward in time, while inference is executed by propagating backwards through the timesteps.
Different algorithms use timesteps that are either discrete (accepting `int` inputs), such as the [`DDPMScheduler`] or [`PNDMScheduler`], or continuous (accepting `float` inputs), such as the score-based schedulers [`ScoreSdeVeScheduler`] or [`ScoreSdeVpScheduler`].
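
As a rough illustration (not the schedules the library actually computes), a discrete scheduler might walk integer timesteps backwards, while a continuous one walks float times in (0, 1]:

```python
# Illustrative timestep sequences only; real schedulers derive these internally
# from their noise schedule and the requested number of inference steps.
discrete_timesteps = list(range(999, -1, -100))            # int timesteps, DDPM/PNDM-style
continuous_timesteps = [1.0 - i / 10 for i in range(10)]   # float times, score-based style
```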
## Designing re-usable schedulers
The core design principle behind the schedule functions is to be model-, system-, and framework-independent.
This allows for rapid experimentation and cleaner abstractions in the code, where the model prediction is separated from the sample update.
To this end, the design of schedulers is such that:
- Schedulers can be used interchangeably between diffusion models in inference to find the preferred trade-off between speed and generation quality.
- Schedulers are implemented in PyTorch by default, but are designed to be framework-independent (partial NumPy support currently exists).
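
To make the interchangeability concrete, here is a hedged sketch: any object exposing the same `step(...)` contract can be dropped into one sampling function unchanged. The two classes below are toy stand-ins, not real schedulers.

```python
# Two toy "schedulers" sharing one interface; swapping them requires no change
# to the sampling code, mirroring the design principle above.
class ToySmallStepScheduler:
    def step(self, model_output, t, sample):
        return sample - 0.1 * model_output  # small, cautious update

class ToyLargeStepScheduler:
    def step(self, model_output, t, sample):
        return sample - 0.5 * model_output  # aggressive update

def sample_with(scheduler, model, sample, timesteps):
    for t in timesteps:
        sample = scheduler.step(model(sample, t), t, sample)
    return sample
```

Swapping the scheduler object is exactly how one explores the speed/quality trade-off without touching the model.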
## API
The core API for any new scheduler must follow a limited structure.
- Schedulers should provide one or more `def step(...)` functions that should be called to update the generated sample iteratively.
- Schedulers should provide a `set_timesteps(...)` method that configures the parameters of a schedule function for a specific inference task.
- Schedulers should be framework-agnostic, but provide a simple functionality to convert the scheduler into a specific framework, such as PyTorch, with a `set_format(...)` method.
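
A minimal toy class following this structure might look as follows. Only the method names mirror the contract above; the noise schedule and update rule are deliberately trivial (plain Python floats rather than tensors) and do not correspond to any published algorithm.

```python
class ToyScheduler:
    """Toy illustration of the scheduler API contract; not a real algorithm."""

    def __init__(self, num_train_timesteps=1000):
        self.num_train_timesteps = num_train_timesteps
        self.timesteps = []

    def set_timesteps(self, num_inference_steps):
        # Configure the schedule for a specific inference task: pick evenly
        # spaced timesteps, walked backwards at inference time.
        stride = self.num_train_timesteps // num_inference_steps
        self.timesteps = list(range(self.num_train_timesteps - 1, -1, -stride))

    def step(self, model_output, timestep, sample):
        # Trivial update rule: nudge the sample using the model's prediction.
        return sample - model_output / (timestep + 1)
```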
The base class [`SchedulerMixin`] implements low-level utilities used by multiple schedulers.
### SchedulerMixin
[[autodoc]] SchedulerMixin
### SchedulerOutput
The class [`SchedulerOutput`] contains the outputs from any scheduler's `step(...)` call.
[[autodoc]] schedulers.scheduling_utils.SchedulerOutput
### Implemented Schedulers
#### Denoising diffusion implicit models (DDIM)
Original paper can be found [here](https://arxiv.org/abs/2010.02502).
[[autodoc]] DDIMScheduler
#### Denoising diffusion probabilistic models (DDPM)
Original paper can be found [here](https://arxiv.org/abs/2006.11239).
[[autodoc]] DDPMScheduler
#### Variance exploding, stochastic sampling from Karras et al.
Original paper can be found [here](https://arxiv.org/abs/2206.00364).
[[autodoc]] KarrasVeScheduler
#### Linear multistep scheduler for discrete beta schedules
Original implementation can be found [here](https://github.com/crowsonkb/k-diffusion/blob/481677d114f6ea445aa009cf5bd7a9cdee909e47/k_diffusion/sampling.py#L181).
[[autodoc]] LMSDiscreteScheduler
#### Pseudo numerical methods for diffusion models (PNDM)
Original paper can be found [here](https://arxiv.org/abs/2202.09778).
[[autodoc]] PNDMScheduler
#### Variance exploding stochastic differential equation (SDE) scheduler
Original paper can be found [here](https://arxiv.org/abs/2011.13456).
[[autodoc]] ScoreSdeVeScheduler
#### Variance preserving stochastic differential equation (SDE) scheduler