BézierFlow: Bézier Stochastic Interpolant Schedulers for Few-Step Generation
arXiv:2512.13255v1 Announce Type: new
Abstract: We introduce BézierFlow, a lightweight training approach for few-step generation with pretrained diffusion and flow models. BézierFlow achieves a 2-3x performance improvement for sampling with $\leq$ 10 NFEs while requiring only 15 minutes of training. Recent lightweight training approaches have shown promise by learning optimal timesteps, but their scope remains restricted to ODE discretizations. To broaden this scope, we propose learning the optimal transformation of the sampling trajectory by parameterizing stochastic interpolant (SI) schedulers. The main challenge lies in designing a parameterization that satisfies critical desiderata, including boundary conditions, differentiability, and monotonicity of the SNR. To effectively meet these requirements, we represent scheduler functions as Bézier functions, where control points naturally enforce these properties. This reduces the problem to learning an ordered set of points in the time range, while the interpretation of the points changes from ODE timesteps to Bézier control points. Across a range of pretrained diffusion and flow models, BézierFlow consistently outperforms prior timestep-learning methods, demonstrating the effectiveness of expanding the search space from discrete timesteps to Bézier-based trajectory transformations.
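To make the parameterization idea concrete, below is a minimal, hypothetical sketch (not the authors' implementation) of a monotone scheduler represented as a Bézier curve. The class name `BezierScheduler`, the parameter `n_ctrl`, and the cumulative-softmax trick for ordering the control points are illustrative assumptions; the sketch only shows how ordered control points with fixed endpoints automatically give a differentiable, monotone map from $[0,1]$ to $[0,1]$ with the required boundary conditions.

```python
import math
import torch


class BezierScheduler(torch.nn.Module):
    """Hypothetical sketch: a monotone scheduler s(t) on [0, 1] as a Bezier curve.

    Ordered control points p_0 = 0 <= p_1 <= ... <= p_n = 1 are built from
    unconstrained logits via a cumulative softmax, which enforces the
    boundary conditions s(0) = 0, s(1) = 1 and monotonicity by construction.
    """

    def __init__(self, n_ctrl: int = 8):
        super().__init__()
        # n_ctrl positive increments -> n_ctrl + 1 ordered control points
        self.logits = torch.nn.Parameter(torch.zeros(n_ctrl))

    def control_points(self) -> torch.Tensor:
        increments = torch.softmax(self.logits, dim=0)  # positive, sums to 1
        return torch.cat([torch.zeros(1), torch.cumsum(increments, dim=0)])

    def forward(self, t: torch.Tensor) -> torch.Tensor:
        """Evaluate the Bezier curve at times t in [0, 1] (differentiable in logits)."""
        pts = self.control_points()
        n = pts.numel() - 1
        t = t.unsqueeze(-1)                                   # (..., 1)
        i = torch.arange(n + 1, dtype=t.dtype)
        binom = torch.tensor([math.comb(n, k) for k in range(n + 1)], dtype=t.dtype)
        bernstein = binom * t**i * (1 - t)**(n - i)           # Bernstein basis
        return (bernstein * pts).sum(-1)                      # s(t), monotone in t
```

Because the curve is a polynomial in $t$ and differentiable in the control-point logits, such a scheduler could in principle be optimized end-to-end with a few-step sampling loss; how the paper actually trains and applies it is described in the full text, not in this sketch.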