2011 Barbados Workshop on Complexity
Information Theory in Theoretical Computer Science and Discrete Mathematics
Outline of topics (tentative)
Michael Saks

Note: This outline is a work in progress. I will be adjusting topics between now and the time of the workshop, and would appreciate feedback.

As for prerequisites: I plan to develop material "from scratch" but of course it will be helpful to have some prior exposure. For information theory, I suggest reviewing the basic definitions and properties of entropy, conditional entropy, mutual information, Kullback-Leibler divergence, and Shannon's noiseless coding theorem. There are many sources for this material; one good one is Chapter 2 of the book "Elements of Information Theory" by Cover and Thomas. It would also be good to have some familiarity with communication complexity (which I expect most participants have), such as the first three chapters of the Kushilevitz-Nisan book "Communication Complexity".
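
As a quick reminder, here are the standard definitions (all logarithms base 2, notation as in Cover-Thomas):

  Entropy:               H(X) = - \sum_x p(x) \log p(x)
  Conditional entropy:   H(X|Y) = H(X,Y) - H(Y) = \sum_y p(y) H(X|Y=y)
  Mutual information:    I(X;Y) = H(X) - H(X|Y) = H(X) + H(Y) - H(X,Y)
  KL divergence:         D(p||q) = \sum_x p(x) \log( p(x)/q(x) )

Shannon's noiseless coding theorem says that the expected length of an optimal prefix-free binary encoding of X lies between H(X) and H(X)+1.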

For those ambitious enough to read some of the papers below in advance of the workshop, I have marked with a (**) those papers I think would be most useful to read in advance.

I. Communication complexity

  1. Lower bounds for communication complexity and streaming (the notion of information cost used throughout is recalled at the end of this section):
    1. (**) Z. Bar-Yossef, T.S. Jayram, R. Kumar, D. Sivakumar, An information statistics approach to data stream and communication complexity, JCSS 68 (2004) 702-732.
    2. Z. Bar-Yossef, T.S. Jayram, R. Kumar, and D. Sivakumar, Information theory methods in communication complexity.
    3. T.S. Jayram, Hellinger strikes back: A note on the multi-party information complexity of AND.
    4. T.S. Jayram and D.P. Woodruff, The data stream space complexity of cascaded norms.
    5. A. Andoni, T.S. Jayram, M. Patrascu, Lower Bounds for Edit Distance and Product Metrics via Poincaré-Type Inequalities, SODA 2010, pp. 184-192.
    6. N. Leonardos, M. Saks, Lower Bounds on the Randomized Communication Complexity of Read-Once Functions. Computational Complexity 19(2010), 153-181.
  2. Compressing interactive computation
    1. (**) B. Barak, M. Braverman, X. Chen and A. Rao, How to compress interactive communication.
    2. M. Braverman and A. Rao, Information equals amortized communication.
  3. Direct product theorems for communication complexity
    1. B. Barak, M. Braverman, X. Chen and A. Rao, How to compress interactive communication.
    2. R. Jain, Strong direct product conjecture holds for all relations in public coin randomized one-way communication complexity.
  4. Noisy broadcast
      N. Goyal, G. Kindler, M. Saks, Lower Bounds for the Noisy Broadcast Problem. SIAM J. Comput. 37(2008), 1806-1841.
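
Since it recurs throughout items 1-3 above, here is the notion of information cost in its standard form (my notation, not tied to any single paper above). For a protocol \pi with transcript \Pi, run on inputs (X,Y) distributed according to \mu:

  external information cost:  IC^{ext}_\mu(\pi) = I(XY;\Pi)
  internal information cost:  IC_\mu(\pi) = I(X;\Pi|Y) + I(Y;\Pi|X)

The information complexity of a function f is the infimum of this quantity over protocols computing f with the allowed error. Internal information cost is additive over independent copies of a problem, which is what drives the direct sum and direct product applications; the compression results in item 2 bound communication in terms of information cost in the other direction.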
II. Data structure lower bounds (the cell-probe model is recalled after the list)
  1. A. Chakrabarti, O. Regev, An optimal randomised cell probe lower bound for approximate nearest neighbour searching.
  2. (**) M. Patrascu, Unifying the landscape of cell-probe lower bounds.
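
For orientation, the standard setup (not specific to either paper): in the cell-probe model a data structure stores S cells of w bits each, and a query is answered by adaptively probing t cells; only the probes are charged. A t-probe scheme yields a communication protocol between a querier and the memory:

  t probes  ==>  t rounds, with (\log S)-bit addresses from the querier and w-bit cell contents in reply,

so lower bounds for the associated communication game translate into cell-probe lower bounds.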
III. Sample complexity of approximation algorithms
  1. (**) Z. Bar-Yossef, R. Kumar, D. Sivakumar, Sampling algorithms: lower bounds and applications.
  2. Z. Bar-Yossef, Sampling lower bounds via information theory.
IV. Graph Entropy
  1. Basic notions and alternative formulations; connection to perfect graphs (Körner's definition is recalled at the end of this section)
    1. (**) G. Simonyi, Graph Entropy: A survey, DIMACS Series in Discrete Mathematics and Theoretical Computer Science.
    2. I. Csiszár, J. Körner, L. Lovász, K. Marton, and G. Simonyi, Entropy Splitting for Antiblocking Corners and Perfect Graphs, Combinatorica 10 (1) (1990) 27-40.
    3. J. Körner, Coding of an information source having ambiguous alphabet and the entropy of graphs, in Transactions of the 6th Prague Conference on Information Theory, etc., Academia, Prague, 1973, 411-425.
    4. N. Alon and A. Orlitsky, Source coding and graph entropies.
  2. Applications
    1. Perfect hashing families, formula size and circuit size lower bounds
    2. G. Simonyi, Graph Entropy: A survey, DIMACS Series in Discrete Mathematics and Theoretical Computer Science.
    3. J. Körner, Fredman-Komlós bounds and information theory, SIAM J. on Algebraic and Discrete Methods 7 (1986) 560-570.
    4. I. Newman, P. Ragde, and A. Wigderson, Perfect Hashing, Graph Entropy, and Circuit Complexity. Structure in Complexity Theory Conference 1990: 91-99.
    5. J. Radhakrishnan, Better Lower Bounds for Monotone Threshold Formulas. J. Comput. Syst. Sci. 54(2): 221-226 (1997).
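
For reference, Körner's definition in its mutual-information form (standard): for a graph G and a distribution P on its vertices,

  H(G,P) = \min I(X;Y),

where the minimum is over joint distributions of pairs (X,Y) such that X is a vertex with distribution P, Y is an independent set of G, and X \in Y. An equivalent formulation is H(G,P) = \min_{a \in VP(G)} \sum_v P(v) \log(1/a_v), where VP(G) is the vertex packing polytope of G. The connection to perfect graphs in item 1 is the entropy splitting theorem of Csiszár-Körner-Lovász-Marton-Simonyi: G is perfect if and only if H(G,P) + H(\bar{G},P) = H(P) for every P.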
V. Comparison searching and sorting
  1. Sorting (the basic counting bound is recalled at the end of this section)
    1. J. Kahn and J.H. Kim, Entropy and sorting, J. Comput. Syst. Sci. 51 (1995) 390-399.
    2. J. Cardinal, S. Fiorini, G. Joret, R.M. Jungers and J.I. Munro, An efficient algorithm for partial order production, SIAM J. Computing 39 (2010) 2927-2940.
    3. J. Cardinal, S. Fiorini, G. Joret, R.M. Jungers, and J.I. Munro, Sorting under Partial Information (without the Ellipsoid Algorithm), Proc. ACM Symposium on Theory of Computing (STOC) 2010, 359-368.
  2. Searching
      N. Linial and M. Saks, Every poset has a central element, JCT A 40 (1985) 195-210. (Note: This paper has no explicit information theory; I plan to present an unpublished information-theoretic version of the proof suggested by James Shearer.)
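
For orientation, the counting bound behind item 1 (standard): if the unknown linear order is known to be a linear extension of a partial order P with e(P) linear extensions, then any comparison-based algorithm needs at least

  \log_2 e(P)

comparisons, since each comparison reveals at most one bit. Kahn-Kim show that O(\log e(P)) comparisons always suffice, using the graph entropy of the incomparability graph of P to find comparisons that each reduce \log e(P) by a constant; the Cardinal et al. papers make the selection of comparisons efficient.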
VI. Other applications in (discrete) mathematics
  1. Inequalities via information theory: Cauchy-Schwarz, Hölder, Bregman, Shearer, hypercontractive inequalities, concentration bounds (Shearer's lemma, as a representative example, is stated at the end of this section)
    1. E. Friedgut, Hypergraphs, Entropy and Inequalities, The American Mathematical Monthly 111 (2004), no. 9, 749-760.
    2. E. Friedgut and V. Rödl, Proof of a Hypercontractive Estimate via Entropy, Israel J. Math. 125 (2001), 369-380.
    3. J. Radhakrishnan, Entropy and Counting.
    4. R. Impagliazzo, V. Kabanets, Constructive proofs of concentration bounds.
  2. Approximate counting
    1. J. Kahn, Entropy, independent sets and antichains: a new approach to Dedekind's problem, Proc. of the AMS 130 (2) (2002), 371-378.
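
As promised above, here is Shearer's lemma (standard statement): let X = (X_1, ..., X_n) be a random vector, and let \mathcal{F} be a collection of subsets of {1, ..., n} such that every index i belongs to at least k members of \mathcal{F}. Then

  \sum_{F \in \mathcal{F}} H(X_F) >= k H(X),   where X_F = (X_i : i \in F).

A typical counting corollary: applying this to a uniform random element of a finite set A of n-tuples gives k \log |A| <= \sum_F \log |A_F|, where A_F is the projection of A onto the coordinates in F. This style of argument is the engine behind Kahn's bound for Dedekind's problem above.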