This paper resolves an open question regarding the separation of oblivious and adaptive differential privacy (DP) in the continual observation model. The authors present an $(\varepsilon,0)$-DP algorithm for correlated vector queries in the oblivious setting that remains accurate for exponentially many time steps in the input dimension. In contrast, they prove that every $(\varepsilon,\delta)$-DP adaptive algorithm becomes inaccurate after only a constant number of releases, yielding the first explicit separation between the two settings.
Oblivious differential privacy can remain accurate for exponentially many time steps under continual observation, while adaptive differential privacy provably fails after a constant number of releases, revealing a stark separation.
We resolve an open question of Jain, Raskhodnikova, Sivakumar, and Smith (ICML 2023) by exhibiting a problem separating differential privacy under continual observation in the oblivious and adaptive settings. The continual observation (a.k.a. continual release) model formalizes privacy for streaming algorithms, where data is received over time and output is released at each time step. In the oblivious setting, privacy need only hold for data streams fixed in advance; in the adaptive setting, privacy is required even for streams that can be chosen adaptively based on the streaming algorithm's output. We describe the first explicit separation between the oblivious and adaptive settings. The problem showing this separation is based on the correlated vector queries problem of Bun, Steinke, and Ullman (SODA 2017). Specifically, we present an $(\varepsilon,0)$-DP algorithm for the oblivious setting that remains accurate for exponentially many time steps in the dimension of the input. On the other hand, we show that every $(\varepsilon,\delta)$-DP adaptive algorithm fails to be accurate after releasing output for only a constant number of time steps.
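As background, the oblivious guarantee can be stated roughly as follows; this is a standard formulation of differential privacy under continual release, not quoted from the paper, and the symbols $M$, $M_t$, $T$, and $S$ are notation introduced here for illustration. For a mechanism $M$ that processes a stream $x = (x_1, \dots, x_T)$ and releases an output $M_t(x)$ at each time step $t$, privacy in the oblivious setting requires that for any two neighboring streams $x, x'$ fixed in advance (differing in a single entry) and every measurable set $S$ of output sequences,
$$\Pr\big[(M_1(x), \dots, M_T(x)) \in S\big] \;\le\; e^{\varepsilon}\,\Pr\big[(M_1(x'), \dots, M_T(x')) \in S\big] + \delta.$$
In the adaptive setting, the same inequality must hold even when each stream entry may be chosen by an adversary after observing all previously released outputs; that extra strength is exactly what the paper shows cannot be sustained beyond a constant number of releases for the correlated vector queries problem.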