Diffusion-based image super-resolution (SR) methods are primarily limited by their low inference speed, which stems from the hundreds or even thousands of sampling steps they require. Existing acceleration sampling techniques inevitably sacrifice performance to some extent, leading to over-blurry SR results. To address this issue, we propose a novel and efficient diffusion model for SR that significantly reduces the number of diffusion steps, thereby eliminating the need for post-acceleration during inference and its associated performance deterioration. Our method constructs a Markov chain that transfers between the high-resolution image and the low-resolution image by shifting the residual between them, substantially improving the transition efficiency. Additionally, an elaborate noise schedule is developed to flexibly control the shifting speed and the noise strength during the diffusion process. Extensive experiments demonstrate that the proposed method obtains superior or at least comparable performance to current state-of-the-art methods on both synthetic and real-world datasets, even with only 15 sampling steps.
The motivation of this work derives from an intuitive observation: the transition from a high-resolution (HR) image to its low-resolution (LR) counterpart is more efficient, requiring fewer diffusion steps, than the transition from the HR image to Gaussian noise in existing works (e.g., DDPM, LDM). Based on this motivation, we design a diffusion model that enables a seamless transition between HR-LR image pairs. In addition, a more flexible noise schedule is proposed to better control the perception-distortion trade-off. More detailed mathematical formulations can be found in our paper.
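The residual-shifting idea can be sketched as follows: the forward process gradually shifts the HR image toward the LR image by moving the residual between them, while injecting noise scaled by a monotone schedule. The snippet below is a minimal illustrative sketch, not the repository's implementation; the schedule parameters (`eta_1`, `eta_T`, `p`, `kappa`) and the geometric spacing of the schedule are assumptions chosen for illustration.

```python
import numpy as np

def make_eta_schedule(T=15, eta_1=0.04, eta_T=0.999, p=0.3):
    """Illustrative monotone noise schedule (assumed form):
    sqrt(eta_t) moves geometrically from sqrt(eta_1) to sqrt(eta_T),
    with exponent p warping how the steps are spaced."""
    sqrt_eta_1, sqrt_eta_T = np.sqrt(eta_1), np.sqrt(eta_T)
    # Non-uniform step positions in [0, T-1]; p < 1 concentrates steps early.
    steps = ((np.arange(T) / (T - 1)) ** p) * (T - 1)
    # Common ratio so that the last step lands exactly on sqrt(eta_T).
    b0 = (sqrt_eta_T / sqrt_eta_1) ** (1.0 / (T - 1))
    return (sqrt_eta_1 * b0 ** steps) ** 2

def forward_shift(x0, y, eta_t, kappa=2.0, rng=None):
    """One marginal of the residual-shifting forward process:
    x_t = x_0 + eta_t * (y - x_0) + kappa * sqrt(eta_t) * eps,
    so x_t interpolates from the HR image x_0 (eta_t ~ 0) to a noisy
    version of the LR image y (eta_t ~ 1)."""
    rng = rng or np.random.default_rng(0)
    eps = rng.standard_normal(x0.shape)
    return x0 + eta_t * (y - x0) + kappa * np.sqrt(eta_t) * eps
```

With `eta_t` near 0 the sample stays close to the HR image, and with `eta_t` near 1 it concentrates around the LR image plus Gaussian noise, which is why the chain can be short: it never has to travel all the way to pure noise.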
@InProceedings{yue2023resshift,
author = {Zongsheng Yue and Jianyi Wang and Chen Change Loy},
title = {ResShift: Efficient Diffusion Model for Image Super-resolution by Residual Shifting},
booktitle = {Advances in Neural Information Processing Systems (NeurIPS)},
year = {2023},
}