[PDF][PDF] Interactive communication: Balanced distributions, correlated files, and average-case complexity
A Orlitsky - FOCS, 1991 - scholar.archive.org
Abstract: (X, Y) is a pair of random variables distributed over a support set S. Person PX knows X, person PY knows Y, and both know S. Using a predetermined protocol, they exchange binary messages in order for PY to learn X; PX may or may not learn Y. The m-message complexity, Cm, is the number of information bits that must be transmitted (by both persons) in the worst case if only m messages are allowed. C∞ is the number of bits required when there is no restriction on the number of messages exchanged.

We consider a natural class of random pairs. Let μ̂ be the maximum number of X values possible with any given Y value, and let ν̂ be the maximum number of Y values possible with any given X value. The random pair (X, Y) is balanced if μ̂ = ν̂. The following hold for all balanced random pairs. One-way communication requires at most twice the minimum number of bits: C1 ≤ 2C∞ + 1. This bound is almost tight: for every α, there is a balanced random pair for which C1 ≥ 2C∞ − 6 ≥ α. Three messages are asymptotically optimum: C3 ≤ C∞ + 3 log C∞ + 11. More importantly, the number of bits required is only negligibly larger than that needed when PX knows Y in advance: C∞ ≤ C3 ≤ log μ̂ + 3 log log μ̂ + 11. We apply these results to obtain efficient protocols for the correlated files problem, where X and Y are binary strings (files) within a small edit distance of each other.
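The benchmark quantity in these bounds, log μ̂, is the cost of the trivial one-way scheme available when PX already knows Y: PX simply indexes X among the at most μ̂ values compatible with that Y. The following toy sketch (not the paper's protocol; the support set and function names are illustrative assumptions) computes μ̂ and ν̂ for a small support set and simulates that side-information scheme:

```python
from math import ceil, log2

def ambiguities(support):
    """Given a support set S of (x, y) pairs, return (mu_hat, nu_hat):
    mu_hat = max number of x values possible for any given y,
    nu_hat = max number of y values possible for any given x.
    The pair is 'balanced' when mu_hat == nu_hat."""
    xs_for_y, ys_for_x = {}, {}
    for x, y in support:
        xs_for_y.setdefault(y, set()).add(x)
        ys_for_x.setdefault(x, set()).add(y)
    mu_hat = max(len(v) for v in xs_for_y.values())
    nu_hat = max(len(v) for v in ys_for_x.values())
    return mu_hat, nu_hat

def one_way_with_side_info(support, x, y):
    """Toy scheme assuming PX knows y in advance: PX sends the index of x
    among the x values compatible with y, using ceil(log2(mu_hat)) bits,
    and PY decodes. Illustrates the log mu_hat benchmark, not the paper's
    interactive protocol."""
    candidates = sorted({xx for xx, yy in support if yy == y})
    mu_hat, _ = ambiguities(support)
    bits = max(1, ceil(log2(mu_hat)))
    message = format(candidates.index(x), f"0{bits}b")  # PX -> PY
    decoded = candidates[int(message, 2)]               # PY decodes
    return decoded, bits

# Example: each y is compatible with 2 x values and vice versa (balanced).
S = {(0, 0), (1, 0), (1, 1), (2, 1), (2, 2), (0, 2)}
print(ambiguities(S))                   # (2, 2): mu_hat = nu_hat, so balanced
print(one_way_with_side_info(S, 1, 0))  # PY recovers x = 1 using 1 bit
```

The point of the paper's C3 bound is that three rounds of interaction come within an additive 3 log log μ̂ + 11 bits of this benchmark even though PX does not know Y.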
We also consider the average number of bits required for PY to learn X when at most m messages are permitted. We show that for all random pairs, not only balanced ones, four messages are asymptotically optimum, and that the number of bits required is only negligibly larger than that needed when PX knows Y in advance.