Diffusers documentation

HeliosDMDScheduler



HeliosDMDScheduler is based on the pyramidal flow-matching sampling introduced in Helios.

HeliosDMDScheduler

class diffusers.HeliosDMDScheduler


( num_train_timesteps: int = 1000 shift: float = 1.0 stages: int = 3 stage_range: list = [0, 0.3333333333333333, 0.6666666666666666, 1] gamma: float = 0.3333333333333333 prediction_type: str = 'flow_prediction' use_flow_sigmas: bool = True use_dynamic_shifting: bool = False time_shift_type: typing.Literal['exponential', 'linear'] = 'linear' )
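The default `stage_range` splits the unit interval into three equal stages, one per pyramid level. As a minimal sketch (plain Python, independent of Diffusers) of how such a partition maps a normalized timestep to its stage, assuming stages are contiguous half-open intervals:

```python
def stage_of(t: float, stage_range: list) -> int:
    """Return the pyramid stage whose [start, end) interval contains t.

    `stage_range` holds `stages + 1` boundaries covering [0, 1], as in the
    default `[0, 1/3, 2/3, 1]` above.
    """
    for i in range(len(stage_range) - 1):
        if stage_range[i] <= t < stage_range[i + 1]:
            return i
    return len(stage_range) - 2  # t == 1.0 falls in the last stage


# Default three-stage partition from the signature above.
stage_range = [0, 1 / 3, 2 / 3, 1]
print(stage_of(0.5, stage_range))  # → 1 (the middle stage)
```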

init_sigmas


( )

Initialize the global timesteps and sigmas.

init_sigmas_for_each_stage


( )

Initialize the timesteps for each stage.

set_begin_index


( begin_index: int = 0 )

Parameters

  • begin_index (int) — The begin index for the scheduler.

Sets the begin index for the scheduler. This function should be run from the pipeline before inference.
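A begin index lets pipelines that start denoising partway through the schedule (e.g. image-to-image) tell the scheduler where its step counter should start. The stand-in class below is a hypothetical sketch of that bookkeeping, not the actual Diffusers implementation:

```python
class TinyScheduler:
    """Minimal stand-in illustrating begin-index bookkeeping only."""

    def __init__(self, sigmas: list):
        self.sigmas = sigmas
        self._begin_index = 0

    def set_begin_index(self, begin_index: int = 0):
        # Record where the inference loop should start in `sigmas`.
        self._begin_index = begin_index

    def remaining_sigmas(self) -> list:
        return self.sigmas[self._begin_index:]


sched = TinyScheduler(sigmas=[1.0, 0.75, 0.5, 0.25, 0.0])
sched.set_begin_index(2)  # e.g. img2img starting at ~50% strength
print(sched.remaining_sigmas())  # → [0.5, 0.25, 0.0]
```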

set_timesteps


( num_inference_steps: int stage_index: int | None = None device: str | torch.device = None sigmas: bool | None = None mu: bool | None = None is_amplify_first_chunk: bool = False )

Set the timesteps and sigmas for the given stage.
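In a pyramidal schedule, each stage covers only its slice of the full sigma range. The helper below is a hypothetical illustration (its name and the evenly spaced spacing are assumptions, not the scheduler's actual discretization) of restricting `num_inference_steps` sigmas to one stage, using the `stage_range` boundaries from the class signature:

```python
def stage_sigmas(num_inference_steps: int, stage_index: int, stage_range: list) -> list:
    """Hypothetical sketch: evenly spaced sigmas restricted to one stage.

    Sigmas run from 1 (pure noise) down to 0; stage `i` spans the sigmas
    between `1 - stage_range[i]` and `1 - stage_range[i + 1]`.
    """
    start = 1.0 - stage_range[stage_index]       # sigma at stage start
    end = 1.0 - stage_range[stage_index + 1]     # sigma at stage end
    step = (start - end) / num_inference_steps
    return [start - i * step for i in range(num_inference_steps + 1)]


# First stage of the default three-stage partition, 4 steps.
print(stage_sigmas(4, 0, [0, 1 / 3, 2 / 3, 1]))
```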

time_shift


( mu: float sigma: float t: Tensor ) torch.Tensor

Parameters

  • mu (float) — The mu parameter for the time shift.
  • sigma (float) — The sigma parameter for the time shift.
  • t (torch.Tensor) — The input timesteps.

Returns

torch.Tensor

The time-shifted timesteps.

Apply time shifting to the timesteps.
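The shift formula itself is not documented here. A common convention in flow-matching schedulers (an assumption for illustration, not verified against the Helios source) maps `t` to `exp(mu) / (exp(mu) + (1/t - 1)^sigma)` in the exponential case and `mu / (mu + (1/t - 1)^sigma)` in the linear case:

```python
import math


def time_shift(mu: float, sigma: float, t: list, kind: str = "linear") -> list:
    """Sketch of the usual flow-matching time shift (assumed formula).

    `t` holds normalized timesteps in (0, 1]; larger `mu` pushes the
    shifted timesteps toward 1 (more steps spent at high noise).
    """
    scale = math.exp(mu) if kind == "exponential" else mu
    return [scale / (scale + (1.0 / ti - 1.0) ** sigma) for ti in t]


# With mu = 1 and sigma = 1, the linear shift reduces to the identity.
print(time_shift(1.0, 1.0, [0.25, 0.5, 0.75]))  # → [0.25, 0.5, 0.75]
```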
