Publications

You can also find my articles on my Google Scholar profile.

Preprints


Conference Papers


Discrete Markov Probabilistic Models: An Improved Discrete Score-Based Framework with sharp convergence bounds under minimal assumptions

Shariatian D.*, Pham L.T.N.*, Ocello A., Conforti G., Durmus A.O.

Published in ICML, 2025

This paper introduces a novel framework for discrete data generation on the hypercube $\{0, 1\}^d$. We establish theoretical and methodological alignment with classical continuous score-based models. We demonstrate the effectiveness of this approach on low- and high-dimensional datasets (Binary MNIST), outperforming state-of-the-art methods such as Discrete Flow Matching.

See paper | GitHub Repository
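To give an intuition for discrete diffusion on the hypercube, here is a minimal sketch of a forward noising process that corrupts binary data by independent bit flips; the flip schedule and rate-1 flip chain are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

def forward_noise(x0, t, rng):
    """Corrupt binary data x0 in {0,1}^d by independent bit flips.

    Each coordinate flips with probability (1 - exp(-2t)) / 2, the
    transition probability of a rate-1 flip Markov chain, so x_t
    converges to the uniform distribution on the hypercube as t grows.
    """
    p_flip = 0.5 * (1.0 - np.exp(-2.0 * t))
    flips = rng.random(x0.shape) < p_flip
    return np.where(flips, 1 - x0, x0)

rng = np.random.default_rng(0)
x0 = rng.integers(0, 2, size=(4, 8))        # toy batch of binary vectors
xt = forward_noise(x0, t=0.1, rng=rng)      # mildly corrupted
x_unif = forward_noise(x0, t=50.0, rng=rng) # near-uniform noise
```

A score-based model would then be trained to reverse this corruption, predicting which bits were flipped at each noise level.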

Denoising Lévy Probabilistic Models (DLPM)

Shariatian D., Simsekli U., Durmus A.O.

Published in ICLR, 2025

This paper introduces a novel framework for using heavy-tailed noise in the denoising diffusion paradigm, generalizing the original DDPM method. Heavy-tailed noise is shown to bring benefits in several settings: heavy-tailed data distributions, improved robustness to class imbalance, and reduced computation time.

See paper | See slides | GitHub Repository
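The heavy-tailed noise in question is alpha-stable noise, which can be sampled with the Chambers-Mallows-Stuck method. Below is a minimal sketch for the symmetric case; it only illustrates the noise family, not the DLPM training procedure.

```python
import numpy as np

def sample_symmetric_alpha_stable(alpha, size, rng):
    """Draw symmetric alpha-stable noise via the Chambers-Mallows-Stuck
    transform. alpha = 2 recovers a Gaussian (with variance 2);
    alpha < 2 gives heavy tails with infinite variance.
    """
    u = rng.uniform(-np.pi / 2, np.pi / 2, size)  # uniform angle
    w = rng.exponential(1.0, size)                # exponential weight
    if alpha == 1.0:
        return np.tan(u)  # Cauchy special case
    factor = np.sin(alpha * u) / np.cos(u) ** (1.0 / alpha)
    tail = (np.cos(u - alpha * u) / w) ** ((1.0 - alpha) / alpha)
    return factor * tail

rng = np.random.default_rng(0)
gaussian_like = sample_symmetric_alpha_stable(2.0, 10_000, rng)  # light tails
heavy = sample_symmetric_alpha_stable(1.7, 10_000, rng)          # heavy tails
```

Replacing the Gaussian increments of DDPM with draws like `heavy` is what produces the robustness to heavy-tailed data discussed above.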

Piecewise Deterministic Generative Models

Bertazzi A., Shariatian D., Durmus A.O., Simsekli U., Moulines É.

Published in NeurIPS, 2024

We introduce a novel class of generative models based on piecewise deterministic Markov processes (PDMPs), which combine deterministic motion with random jumps. Like diffusions, PDMPs can be reversed in time. We derive explicit expressions for jump rates and kernels in the time-reversed processes and propose efficient training methods and approximate simulation techniques. Additionally, we provide bounds on the total variation distance between the data and model distributions, supported by promising numerical simulations.

See paper | GitHub Repository
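As a concrete illustration of the "deterministic motion with random jumps" structure, here is a minimal 1D Zig-Zag process targeting a standard Gaussian. It is a classical member of the PDMP family, not the generative model of the paper: the particle drifts at constant velocity and flips direction at a state-dependent rate, simulated here by exact inversion of the integrated rate for $U(x) = x^2/2$.

```python
import numpy as np

def zigzag_1d(n_events, rng):
    """Simulate a 1D Zig-Zag process with potential U(x) = x^2 / 2:
    deterministic motion at velocity v in {-1, +1}, velocity flips at
    rate max(0, v * x). Returns the positions at the flip events.
    """
    x, v = 0.0, 1.0
    xs = []
    for _ in range(n_events):
        e = rng.exponential(1.0)
        # Exact inversion of the integrated flip rate max(0, v*(x + v*t)).
        tau = -v * x + np.sqrt(max(v * x, 0.0) ** 2 + 2.0 * e)
        x += v * tau  # deterministic drift until the event
        v = -v        # random jump: flip the velocity
        xs.append(x)
    return np.array(xs)

rng = np.random.default_rng(0)
events = zigzag_1d(5000, rng)  # flip positions; the path stays near the mode
```

Time-reversing such a process, as the paper does, requires the explicit jump rates and kernels of the reversed dynamics, which is what the derived expressions provide.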