
Yingbin Liang
Convergence Theory: How Fast Do Discrete Diffusion Models Generate?
Summary
Discrete diffusion models, especially masked diffusion (often called diffusion language models), have rapidly emerged as a compelling alternative to autoregressive generation for discrete domains, including natural language, code, and molecular design. Alongside their striking empirical progress, rigorous theory on convergence behavior and error dynamics is drawing growing attention: understanding how quickly sampling converges to realistic data is not only a foundational question, but also directly informs inference-time compute, which ultimately governs real-world power efficiency of generative AI systems.
In this talk, I will present the current state of the art and our recent results providing non-asymptotic error bounds and convergence guarantees for discrete diffusion models, both for widely used samplers and for several newly developed accelerated sampling methods. I will cover the two major families, uniform-rate diffusion and absorbing-rate diffusion (the latter corresponding to masked diffusion), and highlight their different convergence behaviors. I will also introduce a set of general analysis techniques that can be applied broadly to other discrete diffusion formulations. I will conclude with open directions at the intersection of foundational theory and practical sampler design, including principled acceleration with provable guarantees and post-training/fine-tuning discrete diffusion models toward downstream objectives and constraints.
Short bio
Dr. Yingbin Liang is currently a Professor in the Department of Electrical and Computer Engineering at the Ohio State University (OSU), and a core faculty member of the Ohio State Translational Data Analytics Institute (TDAI). She also serves as the Deputy Director of the NSF AI-EDGE Institute and the Co-Lead of the Foundational AI Pillar of the OSU AI^X Hub. Dr. Liang received her Ph.D. degree in Electrical Engineering from the University of Illinois at Urbana-Champaign in 2005, and served on the faculty of the University of Hawaii and Syracuse University before joining OSU. Dr. Liang’s research lies at the intersection of machine learning, large-scale optimization, statistical signal processing, information theory, and wireless networks, with growing applications to other scientific domains. She received the National Science Foundation CAREER Award and the State of Hawaii Governor Innovation Award in 2009, and the EURASIP Best Paper Award in 2014. She is currently an Information Theory Society Distinguished Lecturer for 2026–2027. Dr. Liang is an IEEE Fellow.