Denoising the Future: Top-p Distributions for Moving Through Time, by Florian Andreas Marwitz and 3 other authors
Abstract: Inference in dynamic probabilistic models is a complex task involving expensive operations. In particular, for Hidden Markov Models, the whole state space has to be enumerated to advance in time. Even states with negligible probabilities are considered, resulting in computational inefficiency and possibly increased noise due to the propagation of unlikely probability mass. We propose to denoise the future and speed up inference by using only the top-p transitions, i.e., the most probable transitions with accumulated probability p. We show that the error introduced by using only the top-p transitions is bounded by $p$ and the so-called minimal mixing rate of the underlying model. We show an analogous bound when using only the top-p states, i.e., the most probable states with accumulated probability p. Moreover, our empirical evaluation shows that, when using top-p transitions, we can expect speedups of at least an order of magnitude, while the error in terms of total variation distance stays below 0.09. Using top-p states is slower than using top-p transitions, since all states are still iterated over in each time step, and it empirically sometimes leads to a higher error. Even with a more sophisticated implementation, the speed-up, if any, would be small. While top-p transitions look very promising, we cannot recommend top-p states, and we discuss why they are slower while the error does not necessarily decrease.
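The core idea described in the abstract can be sketched as follows: before each forward step of the HMM, each row of the transition matrix is truncated to its most probable entries whose cumulative mass reaches p, and renormalized. This is a minimal illustrative sketch, not the authors' implementation; the function names (`top_p_filter`, `forward_step`) and the example matrix are assumptions made for illustration.

```python
import numpy as np

def top_p_filter(row, p):
    """Keep the most probable transitions whose cumulative mass reaches p,
    zero out the rest, and renormalize. (Illustrative sketch, not the
    paper's implementation.)"""
    order = np.argsort(row)[::-1]           # indices by descending probability
    csum = np.cumsum(row[order])
    k = int(np.searchsorted(csum, p)) + 1   # smallest prefix with mass >= p
    kept = np.zeros_like(row)
    kept[order[:k]] = row[order[:k]]
    return kept / kept.sum()

def forward_step(belief, T, p):
    """Advance the belief state one step using only top-p transitions."""
    T_p = np.apply_along_axis(top_p_filter, 1, T, p)  # truncate each row
    new_belief = belief @ T_p
    return new_belief / new_belief.sum()

# Toy 3-state transition matrix (hypothetical example data).
T = np.array([[0.70, 0.25, 0.05],
              [0.10, 0.80, 0.10],
              [0.05, 0.15, 0.80]])
belief = np.array([1.0, 0.0, 0.0])
print(forward_step(belief, T, 0.9))  # low-probability transitions are dropped
```

In this toy example with p = 0.9, the 0.05 entry of the first row is pruned, so the advanced belief assigns zero mass to the third state; the paper's bound limits how far such a truncated belief can drift from the exact one.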
Submission history
From: Florian Andreas Marwitz
[v1] Mon, 9 Jun 2025 09:23:09 UTC (928 KB)
[v2] Mon, 20 Oct 2025 17:51:25 UTC (928 KB)
[v3] Tue, 21 Oct 2025 12:18:34 UTC (887 KB)
[v4] Tue, 31 Mar 2026 09:06:12 UTC (2,197 KB)


