Tight bounds for synchronous communication of information using bits and silence
Published in: Discrete Applied Mathematics, Vol. 129, No. 1, pp. 195-209
Format: Journal Article
Language: English
Published: Elsevier B.V., 15.06.2003
ISSN: 0166-218X; 1872-6771
DOI: 10.1016/S0166-218X(02)00240-8
Summary: We establish worst-case and average-case lower bounds on the trade-off between the time and bit complexity of two-party communication in synchronous networks. We prove that the bounds are tight by presenting protocols whose bit-time complexity matches the ones expressed by the lower bounds. We actually show that the algorithms are everywhere optimal: at any point of the trade-off, and for any universe of data to be communicated, no other solution has better complexity for communicating any element of that universe (within a fixed relabeling). Similar results are derived when transmissions are subject to corruption.

In these results, the number of bits is agreed upon a priori. We also derive lower bounds on the worst-case complexity of two-party communication when the number of bits is variable; the bounds prove that any improvement would be by at most an additive constant (even in the presence of an oracle).
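As a rough illustration of the kind of bit/time trade-off the summary refers to (and not the optimal protocols analysed in the paper), the sketch below simulates a naive bits-and-silence scheme over a synchronous channel: the sender spends exactly b one-bit pulses, and the lengths of the silent gaps between pulses carry base-r digits of the value, so an element of a universe of size n is communicated in roughly b·⌈n^(1/b)⌉ rounds. The function names and the particular encoding are illustrative assumptions introduced here, not taken from the paper.

```python
# Toy bits-and-silence encoding (illustrative sketch only):
# b pulses are sent; the silent gap before the i-th pulse encodes
# the i-th base-r digit of the value x, with r = ceil(n ** (1/b)).
import math

def encode(x: int, n: int, b: int) -> list[int]:
    """Return the rounds (0-indexed) at which the sender transmits a pulse."""
    r = math.ceil(n ** (1.0 / b))        # radix chosen so r ** b covers the universe
    digits = []
    for _ in range(b):                   # base-r digits of x, least significant first
        digits.append(x % r)
        x //= r
    schedule, t = [], 0
    for d in digits:                     # stay silent for d rounds, then pulse
        t += d
        schedule.append(t)
        t += 1                           # the pulse itself occupies one round
    return schedule

def decode(schedule: list[int], n: int, b: int) -> int:
    """Recover x from the pulse schedule by measuring the silent gaps."""
    r = math.ceil(n ** (1.0 / b))
    x, prev_end, weight = 0, 0, 1
    for t in schedule:                   # gap before each pulse is one base-r digit
        x += (t - prev_end) * weight
        weight *= r
        prev_end = t + 1
    return x

if __name__ == "__main__":
    n, b = 1000, 3                       # universe size and bit budget
    for x in (0, 7, 999):
        schedule = encode(x, n, b)
        assert decode(schedule, n, b) == x
        print(f"x = {x:3d}  pulses at rounds {schedule}")
```

In this toy scheme, varying b trades bits against time: choosing b close to log₂ n makes every silent gap at most one round long, which is close to ordinary bit-by-bit transmission, while b = 1 uses a single pulse preceded by up to n − 1 silent rounds.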