Privacy Amplification by Random Allocation
Authors: Vitaly Feldman, Moshe Shenfeld†
We consider the privacy amplification properties of a sampling scheme in which a user's data is used in k steps chosen randomly and uniformly from a sequence (or set) of t steps. This sampling scheme has recently been applied in the context of differentially private optimization (Chua et al., 2024; Choquette-Choo et al., 2024) and is also motivated by communication-efficient high-dimensional private aggregation (Asi et al., 2025). Existing analyses of this scheme either rely on privacy amplification by shuffling, which leads to overly conservative bounds, or require Monte Carlo simulations that are computationally prohibitive in most practical scenarios.
We give the first theoretical guarantees and numerical estimation algorithms for this sampling scheme. In particular, we demonstrate that the privacy guarantees of random k-out-of-t allocation can be upper bounded by the privacy guarantees of the well-studied independent (or Poisson) subsampling, in which each step uses the user's data with probability approximately k/t. Further, we provide two additional analysis techniques that lead to numerical improvements in several parameter regimes. Altogether, our bounds give efficiently computable and nearly tight numerical results for random allocation applied to Gaussian noise addition.
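To make the comparison concrete, here is a minimal sketch (not the paper's estimation algorithm) of the two sampling schemes the abstract contrasts; the parameter values t, k, and num_trials are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
t, k, num_trials = 100, 5, 10_000   # illustrative values, not from the paper

def random_allocation(t, k, rng):
    """k-out-of-t allocation: the data is used in exactly k uniformly chosen steps."""
    mask = np.zeros(t, dtype=bool)
    mask[rng.choice(t, size=k, replace=False)] = True
    return mask

def poisson_subsampling(t, k, rng):
    """Independent (Poisson) subsampling: each step uses the data with probability k/t."""
    return rng.random(t) < k / t

alloc = np.array([random_allocation(t, k, rng).sum() for _ in range(num_trials)])
poisson = np.array([poisson_subsampling(t, k, rng).sum() for _ in range(num_trials)])

# Both schemes use the data in k steps on average, but only Poisson subsampling
# has a random participation count.
print("allocation:", alloc.mean(), alloc.std())     # ~5.0, 0.0
print("poisson:   ", poisson.mean(), poisson.std()) # ~5.0, ~sqrt(k*(1 - k/t)) ≈ 2.18
```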
A Survey on Privacy from Statistical, Information and Estimation-Theoretic Views
September 21, 2021 · research area: Privacy · conference: IEEE BITS the Information Theory Magazine
Privacy risk has become an emerging challenge in both information theory and computer science due to the massive (centralized) collection of user data. In this paper, we review privacy-preserving mechanisms and metrics through the lens of information theory, and unify different privacy metrics, including f-divergences, Renyi divergences, and differential privacy, via the likelihood ratio (and its logarithm). We introduce…
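As a concrete illustration of the unifying object mentioned above, the sketch below (my own example, not taken from the survey) computes the log-likelihood ratio, i.e., the privacy loss, for Gaussian noise addition with sensitivity 1, and uses it to estimate a Renyi divergence and an approximate-DP delta; sigma, alpha, and eps are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(1)
sigma, n = 2.0, 1_000_000          # noise scale and number of Monte Carlo samples

# Gaussian mechanism with sensitivity 1: P = N(0, sigma^2) vs. Q = N(1, sigma^2).
x = rng.normal(0.0, sigma, size=n)                 # samples from P
loss = (1.0 - 2.0 * x) / (2.0 * sigma**2)          # log(dP/dQ)(x): the privacy loss

# Renyi divergence of order alpha: D_alpha(P||Q) = log E_P[exp((alpha-1)*loss)] / (alpha-1).
alpha = 4.0
renyi_mc = np.log(np.mean(np.exp((alpha - 1.0) * loss))) / (alpha - 1.0)
print(renyi_mc, alpha / (2.0 * sigma**2))          # Monte Carlo estimate vs. closed form 0.5

# Approximate DP: delta(eps) is the hockey-stick divergence E_P[(1 - exp(eps - loss))_+].
eps = 1.0
delta_mc = np.mean(np.maximum(0.0, 1.0 - np.exp(eps - loss)))
print(delta_mc)
```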
A Simple and Nearly Optimal Analysis of Privacy Amplification by Shuffling
July 13, 2021 · research areas: Methods and Algorithms, Privacy · conference: FOCS
Recent work of Erlingsson, Feldman, Mironov, Raghunathan, Talwar, and Thakurta demonstrates that random shuffling amplifies the differential privacy guarantees of locally randomized data. Such amplification implies substantially stronger privacy guarantees for systems in which data is contributed anonymously and has led to significant interest in the shuffle model of privacy.
We show that random shuffling of data records that are input to…
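Below is a minimal sketch of the shuffle-model setting described above, assuming a randomized-response local randomizer; n and eps0 are illustrative choices, and the amplified central guarantee is what the paper's analysis bounds rather than what this snippet computes.

```python
import numpy as np

rng = np.random.default_rng(2)
n, eps0 = 10_000, 1.0              # number of users and per-record local epsilon

def randomized_response(bit, eps0, rng):
    """eps0-DP local randomizer: report the true bit with probability e^eps0 / (e^eps0 + 1)."""
    if rng.random() < np.exp(eps0) / (np.exp(eps0) + 1.0):
        return bit
    return 1 - bit

data = rng.integers(0, 2, size=n)                          # users' private bits
reports = np.array([randomized_response(b, eps0, rng) for b in data])
shuffled = rng.permutation(reports)                        # curator sees only the shuffled reports

# Amplification-by-shuffling results bound the central (eps, delta)-DP guarantee of
# `shuffled`; for large n it is far stronger than the per-report eps0 guarantee.
print(shuffled.mean())
```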