Two prevalent models in the data stream literature are the insertion-only and turnstile models. Unfortunately, many important streaming problems require a $\Theta(\log(n))$ multiplicative factor more memory for turnstile streams than for insertion-only streams. This complexity gap often arises because the hard instances are streams in which the underlying frequency vector $f$ is very close to $0$ once all insertions and deletions have cancelled, making the remaining signal difficult to detect amid the large number of deletions.
In this talk we propose an intermediate model which, given a parameter $\alpha \geq 1$, lower bounds the norm $\|f\|_p$ by a $1/\alpha$-fraction of the $L_p$ mass the stream would have had if all updates had been positive. We show that for streams with this $\alpha$-property, a $\log(n)$ factor in many fundamental turnstile algorithms can be replaced with a $\log(\alpha)$ factor. This holds for identifying heavy hitters, inner product estimation, $L_0$ estimation, $L_1$ estimation, $L_1$ sampling, and support sampling. Since in practice many important turnstile data streams are in fact $\alpha$-property streams for small values of $\alpha$, these results represent significant improvements in efficiency for these applications.
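Concretely, writing $I$ for the frequency vector the stream would produce if every update were an insertion (a notational choice made here only for illustration), the $\alpha$-property can be stated as
$$\|f\|_p \;\geq\; \frac{1}{\alpha}\,\|I\|_p,$$
so that $\alpha = 1$ corresponds to streams with no cancellation, while letting $\alpha$ grow unboundedly recovers the general turnstile model.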
Joint work with David Woodruff.