Posts by Collection

mini_projects

Denoising Lévy Probabilistic Models (DLPM)

Shariatian, D., Simsekli, U., & Durmus, A.O. (2024). Denoising Lévy Probabilistic Models. ArXiv, abs/2407.18609.

This paper introduces a novel framework to use heavy-tailed noise in the denoising diffusion paradigm, which constitutes a generalization of the original DDPM method. Using heavy-tailed noise is shown to bring benefits in various contexts: heavy-tailed data distributions, better robustness to class imbalance, and smaller computational time.

See paper | See slides

Piecewise deterministic generative models

Bertazzi, A., Durmus, A.O., Shariatian, D., Simsekli, U., & Moulines, É. (2024). Piecewise deterministic generative models. ArXiv, abs/2407.19448.

We introduce a novel class of generative models based on piecewise deterministic Markov processes (PDMPs), which combine deterministic motion with random jumps. Like diffusions, PDMPs can be reversed in time. We derive explicit expressions for jump rates and kernels in the time-reversed processes and propose efficient training methods and approximate simulation techniques. Additionally, we provide bounds on the total variation distance between the data and model distributions, supported by promising numerical simulations.

See paper

news

MAA106 Teaching

Finished TA’ing the MAA106 numerical analysis course at Ecole Polytechnique. Thanks to Maxime Breden for the teaching experience, and congrats to the students!

PDMP on the Road

Presenting PDMP both at NeurIPS in Paris and at the official NeurIPS in Vancouver.

Diffusion Reading Group Launch

Launching the diffusion model reading group at Inria Paris. Feel free to reach out if you want to join!

Oberwolfach Workshop

Attending the mini-workshop ‘Statistical Challenges for Deep Generative Models’ in legendary Oberwolfach, Germany.

Visit to Padova

Invited to Padova by Giovanni Conforti to explore cosmological applications of diffusion models. Looking forward to working with such a great person!

Sakana AI Internship

I am starting an internship at Sakana AI in Tokyo, advised by legendary Stefano Peluchetti (who discovered flow matching a year before… flow matching).

Sakana Research Retreat

I am co-organising a 5-day research retreat with Sakana’s research staff!

portfolio

publications

Piecewise Deterministic Generative Models

Bertazzi A., Shariatian D., Durmus A.O., Simsekli U., Moulines É.

Published in NeurIPS 2024, 2024

We introduce a novel class of generative models based on piecewise deterministic Markov processes (PDMPs), which combine deterministic motion with random jumps. Like diffusions, PDMPs can be reversed in time. We derive explicit expressions for jump rates and kernels in the time-reversed processes and propose efficient training methods and approximate simulation techniques. Additionally, we provide bounds on the total variation distance between the data and model distributions, supported by promising numerical simulations.

See paper | GitHub Repository
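To give a flavour of what a PDMP looks like, here is a minimal sketch of the one-dimensional Zig-Zag process, a classic piecewise deterministic Markov process: the position moves at constant velocity between events, and at random event times the velocity flips sign. For simplicity the event rate is taken constant (so inter-event times are exponential); the generative processes in the paper use state-dependent jump rates and kernels, which this sketch does not implement.

```python
import numpy as np

def zigzag_path(T=10.0, rate=1.0, rng=None):
    """Simulate a constant-rate 1D Zig-Zag PDMP on [0, T].

    Returns the event times and positions of the piecewise-linear path.
    """
    rng = np.random.default_rng(rng)
    t, x, v = 0.0, 0.0, 1.0
    times, positions = [t], [x]
    while True:
        tau = rng.exponential(1.0 / rate)  # time until the next velocity flip
        if t + tau > T:
            # deterministic motion up to the final time, then stop
            times.append(T)
            positions.append(x + v * (T - t))
            break
        t += tau
        x += v * tau   # deterministic linear motion between jumps
        v = -v         # random jump: flip the velocity
        times.append(t)
        positions.append(x)
    return np.array(times), np.array(positions)

times, positions = zigzag_path(T=5.0, rate=2.0, rng=0)
```

Between any two recorded times the path is a straight line with slope ±1, which is exactly the "deterministic motion with random jumps" structure described above.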

Denoising Lévy Probabilistic Models (DLPM)

Shariatian D., Simsekli U., Durmus A.O.

Published in ICLR 2025, 2025

This paper introduces a novel framework to use heavy-tailed noise in the denoising diffusion paradigm, which constitutes a generalization of the original DDPM method. Using heavy-tailed noise is shown to bring benefits in various contexts: heavy-tailed data distributions, better robustness to class imbalance, and smaller computational time.

See paper | See slides | GitHub Repository
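As a rough illustration of the idea, the sketch below replaces the Gaussian noise of a DDPM-style forward step with symmetric alpha-stable (heavy-tailed) noise, using SciPy's `levy_stable`. The schedule name `alpha_bar_t` and the scaling follow the standard DDPM convention; the exact DLPM parameterisation in the paper may differ, so treat this as a toy sketch rather than the paper's method.

```python
import numpy as np
from scipy.stats import levy_stable

def forward_noising(x0, alpha_bar_t, alpha=1.8, rng=None):
    """Noise a clean sample x0 with symmetric alpha-stable noise.

    alpha_bar_t : cumulative noise-schedule value at step t, in (0, 1)
    alpha       : stability index in (0, 2]; alpha = 2 is the Gaussian case
    """
    rng = np.random.default_rng(rng)
    # symmetric (beta = 0) alpha-stable noise, heavy-tailed for alpha < 2
    noise = levy_stable.rvs(alpha, 0.0, size=x0.shape, random_state=rng)
    # for alpha = 2 this reduces to the usual DDPM perturbation
    # (up to the scale convention of the stable distribution)
    return np.sqrt(alpha_bar_t) * x0 + (1 - alpha_bar_t) ** (1 / alpha) * noise

x0 = np.zeros(4)
xt = forward_noising(x0, alpha_bar_t=0.5, alpha=1.8, rng=0)
```

Lowering `alpha` below 2 makes large noise excursions far more likely, which is what gives the method its robustness to heavy-tailed data.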

Discrete Markov Probabilistic Models: An Improved Discrete Score-Based Framework with sharp convergence bounds under minimal assumptions

Shariatian D.*, Pham L.T.N.*, Ocello A., Conforti G., Durmus A.O.

Published in ICML 2025, 2025

This paper introduces a novel framework for discrete data generation on the hypercube $\{0, 1\}^d$. We establish theoretical and methodological alignment with classical continuous score-based models. We demonstrate the effectiveness of this approach on low- and high-dimensional datasets (Binary MNIST), beating other state-of-the-art methods like Discrete Flow Matching.

See paper | GitHub Repository
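The kind of forward corruption used on the hypercube can be illustrated by a toy discrete noising process: each bit flips independently with a probability that grows with time, so the data distribution is gradually driven towards the uniform distribution on $\{0,1\}^d$. This is only a sketch; the paper works with a continuous-time Markov jump process, not the simple per-step flipping below.

```python
import numpy as np

def corrupt(x, t, T=100, rng=None):
    """Flip each bit of x independently with probability p_t = t / (2T).

    At t = T we have p_t = 1/2, so the state is uniform on {0, 1}^d
    (fully noised, independent of the input).
    """
    rng = np.random.default_rng(rng)
    p_t = t / (2 * T)
    flips = rng.random(x.shape) < p_t  # which coordinates to flip
    return np.where(flips, 1 - x, x)

x = np.array([0, 1, 1, 0, 1])
x_noisy = corrupt(x, t=100, rng=0)
```

A generative model then learns to reverse this corruption, mapping uniform binary noise back to data, analogously to the continuous score-based case.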

talks

teaching

Numerical Analysis MAA106

Undergraduate course, Ecole Polytechnique, CMAP, 2024

TA’d a 4-month course for 1st-year students on numerical analysis.