Meetings of the St. Petersburg Mathematical Society
July 5, 2011, 18:00, St. Petersburg, PDMI, Fontanka 27, room 311
Joint meeting of the St. Petersburg Mathematical Society and the Chebyshev Laboratory of St. Petersburg State University
Information and interactive communication
M. Braverman (University of Toronto)
Abstract:
Notions of entropy and information, pioneered by Shannon, have been very powerful tools in coding theory. Coding theory aims to solve the problem of one-way communication: sending a message from Alice to Bob using as little communication as possible, sometimes over a noisy channel.
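For context, in the one-way setting Shannon's source coding theorem makes this precise (standard definitions, not taken from the abstract): the entropy
$$H(X) = -\sum_x \Pr[X = x]\,\log_2 \Pr[X = x]$$
determines, up to one bit, the expected number of bits needed to transmit a sample of $X$ over a noiseless channel; the optimal expected length lies between $H(X)$ and $H(X)+1$.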
We will discuss several extensions of information-theoretic notions to the two-way communication setting. We use them to prove a direct sum theorem for randomized communication complexity, showing that implementing $k$ copies of a functionality requires substantially more communication than just one copy.
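One bound of this flavor, from the related work of Barak, Braverman, Chen and Rao (stated here as background; the exact quantitative form proved in the talk may differ, and $R$ below denotes randomized communication complexity), is
$$R(f^k) \;\ge\; \tilde{\Omega}\!\left(\sqrt{k}\cdot R(f)\right),$$
so the communication needed for $k$ independent copies of $f$ grows at least like $\sqrt{k}$ times the cost of a single copy, rather than staying close to the cost of one copy.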
More generally, we will show that information cost $I(f)$ can be defined as a natural, fundamental property of a functionality $f$. We will describe several new tight connections between $I(f)$, direct sum theorems, interactive compression schemes, and amortized communication complexity.
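As background for the notion the abstract denotes by $I(f)$ (the notation $\pi$, $\mu$, $\Pi$, $\mathrm{IC}_\mu$ and $D^{\,\epsilon}_\mu$ below is standard in this line of work and is assumed, not taken from the abstract): for a protocol $\pi$ on inputs $(X,Y)$ drawn from a distribution $\mu$, with transcript $\Pi$, the internal information cost is
$$\mathrm{IC}_\mu(\pi) = I(\Pi; X \mid Y) + I(\Pi; Y \mid X),$$
the amount each party learns about the other's input, and $\mathrm{IC}_\mu(f,\epsilon)$ is the infimum of this quantity over protocols computing $f$ with error at most $\epsilon$. The connection to amortized communication, in the form due to Braverman and Rao, states that
$$\lim_{k\to\infty} \frac{D^{\,\epsilon}_\mu(f^k)}{k} = \mathrm{IC}_\mu(f,\epsilon),$$
where $D^{\,\epsilon}_\mu$ denotes distributional communication complexity with error $\epsilon$ under $\mu$.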