February 19, 2026 at 11 AM
A common route to differentially private training is DP-SGD, which clips per-example gradients and adds independent Gaussian noise at each iteration to bound any single example’s influence. Recent work shows that correlating the noise across iterations can significantly improve utility. Matrix factorization provides a principled way to introduce such correlations, but generating correlated noise incurs substantial memory and computational overhead. In this talk, I will present our recent work on memory-free, efficient noise correlation using pseudorandomness.
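To give a flavor of the idea (this is a minimal sketch, not the speaker's actual construction): correlated noise at step t is a linear combination of per-step Gaussian draws, and rather than storing all past draws, each one can be regenerated on demand from a pseudorandom seed. The decoder matrix B, the seeding scheme, and all function names below are illustrative assumptions.

```python
import numpy as np

def step_noise(seed, step, dim, sigma):
    """Regenerate the Gaussian draw for a given step from a PRG seed,
    rather than keeping it in memory."""
    rng = np.random.default_rng([seed, step])  # deterministic per (seed, step)
    return sigma * rng.standard_normal(dim)

def correlated_noise(B, t, seed, dim, sigma):
    """Noise injected at iteration t: a linear combination of per-step
    Gaussian draws, weighted by row t of a decoder matrix B from a
    factorization A = B @ C of the prefix-sum workload A.
    No past draw is stored; each is recomputed on demand from the seed."""
    return sum(B[t, j] * step_noise(seed, j, dim, sigma) for j in range(t + 1))

# Example: T = 4 steps, dim = 3; with the trivial factorization A = A @ I,
# B is the lower-triangular all-ones prefix-sum matrix.
T, dim, sigma, seed = 4, 3, 1.0, 1234
B = np.tril(np.ones((T, T)))
z = correlated_noise(B, t=2, seed=seed, dim=dim, sigma=sigma)
```

A straightforward implementation would store all T past draws (O(T · dim) memory); regenerating them from a seed trades that memory for recomputation, which the pseudorandom constructions presented in the talk aim to make efficient.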
Inria, Building B, Room B21