Differentially Private Online to Batch

Google TechTalks
Published on 30 May 2023

A Google TechTalk, presented by Ashok Cutkosky, 2023/02/15

ABSTRACT: Most algorithms for privacy-preserving stochastic optimization fall into two camps. In the first camp, one applies a simple noisy preprocessing step to privatize the gradients, which are then fed to a standard non-private algorithm. This approach is relatively easy to implement and analyze, but may yield suboptimal convergence rates. In the second camp, one carefully designs a new optimization algorithm with privacy in mind in order to achieve the best possible convergence rates. This approach can attain optimal theoretical guarantees, but is difficult to generalize and to combine with the heuristic improvements popular in optimization practice. The ideal solution would be a simple way to incorporate privacy into *any* non-private algorithm such that optimal non-private convergence rates translate into optimal private convergence rates. In this talk, I will discuss significant progress toward that ideal: a variation on the classical online-to-batch conversion that turns any online optimization algorithm into a private stochastic optimization algorithm for which optimal regret guarantees imply optimal convergence guarantees.
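To make the general idea concrete, below is a minimal Python sketch of the standard noisy-gradient online-to-batch pattern the talk builds on: run an online learner on clipped, Gaussian-noised stochastic gradients and output the average iterate, so that the learner's regret bound transfers to a convergence bound. This is an illustration under stated assumptions, not Cutkosky's specific conversion; the function name dp_online_to_batch, the parameters clip and sigma, and the choice of online gradient descent as the base learner are all hypothetical choices for the example.

import numpy as np

def dp_online_to_batch(grad_fn, x0, T, lr=0.1, clip=1.0, sigma=1.0, rng=None):
    """Sketch: feed privatized stochastic gradients to an online learner
    (here, online gradient descent) and return the average iterate.

    grad_fn(x) must return a stochastic gradient at x computed on a fresh
    sample each round; clipping bounds the per-sample sensitivity so that
    Gaussian noise scaled by sigma * clip yields a differential privacy
    guarantee for a suitable choice of sigma (not derived here).
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float).copy()
    avg = np.zeros_like(x)
    for t in range(1, T + 1):
        g = grad_fn(x)
        # Privatize the gradient: clip its norm, then add Gaussian noise.
        norm = np.linalg.norm(g)
        if norm > clip:
            g = g * (clip / norm)
        g = g + rng.normal(scale=sigma * clip, size=g.shape)
        # One step of the (non-private) online learner.
        x = x - lr * g
        # Running average of the iterates: the "batch" output.
        avg += (x - avg) / t
    return avg

# Hypothetical usage: privately minimize f(x) = 0.5 * ||x - c||^2
# given access only to noisy gradient samples.
c = np.array([1.0, -2.0])
data_rng = np.random.default_rng(0)
noisy_grad = lambda x: (x - c) + 0.1 * data_rng.normal(size=x.shape)
x_bar = dp_online_to_batch(noisy_grad, x0=np.zeros(2), T=5000, lr=0.05, sigma=0.5)

The point of the construction is the classical online-to-batch transfer: if the online learner suffers regret R(T) on the sequence of (privatized) gradient losses, the averaged output satisfies E[f(x_bar)] - f(x*) <= R(T)/T, so better regret bounds translate directly into better convergence rates.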
