Speaker: Kannan Ramchandran, University of California, Berkeley, USA
In this talk, we will highlight some snippets from our research journey, both recent and past, that share the common attribute of being unmistakably Shannon-inspired. There will be four short stories dedicated to this theme and unified by the Shannon spirit: (i) Duality between source coding and channel coding; (ii) Compression and Encryption done in the wrong order; (iii) Sampling theory meets Coding theory; and (iv) Network codes for the modern age of large-scale distributed storage systems.
Speaker: Emre Telatar, EPFL, Switzerland
Shannon's 1948 paper "A Mathematical Theory of Communication" put
telecommunications on a sound scientific footing by identifying the
elements of a communication system and defining its operational
concepts, creating the entire discipline of information theory in one
extraordinary stroke. The talk will walk through the paper and attempt
to highlight various vistas we encounter.
Speaker: Daniel J. Costello, Jr., University of Notre Dame, USA
This talk gives a historical overview of the field of channel coding, dating back to the work of Shannon in 1948. The major advances in
coding theory since 1948 are viewed from a common perspective: the
power and bandwidth efficiencies needed to achieve a targeted level of
performance. The most important contributions in coding over the last
60 plus years are highlighted, including Hamming codes, Reed-Muller
codes, Reed-Solomon codes, convolutional codes, soft decision
decoding, trellis coded modulation, multilevel coding, concatenated
codes, turbo codes, low-density parity-check codes, spatially coupled
codes, polar codes, and iterative decoding. Finally, areas of potential
future research in channel coding are briefly discussed.
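As a concrete taste of the earliest family on this list, here is a sketch of Hamming(7,4) encoding and syndrome decoding. The systematic generator and parity-check matrices below are one standard textbook choice, not necessarily the form used in the talk:

```python
import numpy as np

# Hamming(7,4): encodes 4 message bits into 7, correcting any single bit error.
# G is a systematic generator matrix; H is the matching parity-check matrix.
G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

def encode(msg):
    return (msg @ G) % 2

def decode(received):
    syndrome = (H @ received) % 2
    if syndrome.any():
        # A single-bit error at position j produces syndrome = column j of H.
        err_pos = next(j for j in range(7) if np.array_equal(H[:, j], syndrome))
        received = received.copy()
        received[err_pos] ^= 1
    return received[:4]  # systematic code: first four bits are the message

msg = np.array([1, 0, 1, 1])
word = encode(msg)
word[2] ^= 1  # the channel flips one bit
assert np.array_equal(decode(word), msg)
```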
Speaker: Himanshu Tyagi, Indian Institute of Science Bangalore
Shannon formalized the mathematical definition of a "secret"
and ushered in the era of modern cryptography. In this talk, we shall review Shannon's notion of information theoretic secrecy and track its evolution into the modern avatar of semantic security due to Goldwasser and Micali. Along the way, we shall review some major breakthroughs in
information theoretic security and cryptography.
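Shannon's definition of perfect secrecy requires the ciphertext to be statistically independent of the message; he showed that the one-time pad achieves it, provided the key is uniform, as long as the message, and never reused. A minimal sketch in Python, using only the standard library:

```python
import secrets

def otp_encrypt(plaintext: bytes, key: bytes) -> bytes:
    """One-time pad: XOR each plaintext byte with a fresh uniform key byte.

    Since XOR is its own inverse, the same function decrypts."""
    assert len(key) == len(plaintext), "key must be as long as the message"
    return bytes(p ^ k for p, k in zip(plaintext, key))

message = b"attack at dawn"
key = secrets.token_bytes(len(message))   # uniform random, used exactly once
ciphertext = otp_encrypt(message, key)
recovered = otp_encrypt(ciphertext, key)  # decrypt by XORing again
assert recovered == message
```

With a uniform key, every ciphertext is equally likely for every message, which is exactly Shannon's independence condition; his converse result is that no scheme with a shorter key can achieve it.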
Speaker: Ajit K. Chaturvedi, Indian Institute of Technology Kanpur
At the undergraduate level, and often at the postgraduate level as well, information theory and practical communication systems are taught in separate courses, making it hard for students to see the relationship between the two. The talk will start by pointing out some connections between the fundamental limits provided by information theory and practical communication systems. A physical-layer-centric
perspective on information theory will be presented. In the second half of the talk, some examples and interesting scenarios from wireless communications, including standards, will be discussed to bring out how various results from information theory, such as channel capacity, not only influence but often guide the design of modern
communication systems.
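One such fundamental limit is Shannon's capacity formula for the band-limited AWGN channel, C = B log2(1 + SNR), the benchmark against which practical wireless links are measured. A quick illustration; the 20 MHz bandwidth and 20 dB SNR figures are hypothetical values chosen only for scale:

```python
import math

def awgn_capacity_bps(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon capacity of an AWGN channel: C = B * log2(1 + SNR)."""
    return bandwidth_hz * math.log2(1.0 + snr_linear)

# Example: a 20 MHz channel at 20 dB SNR (hypothetical illustrative numbers).
snr = 10 ** (20 / 10)  # convert 20 dB to a linear power ratio of 100
capacity_mbps = awgn_capacity_bps(20e6, snr) / 1e6
print(f"{capacity_mbps:.1f} Mbit/s")
```

No code rate or modulation achieving reliable communication above this rate exists, which is why standards bodies quote their spectral efficiencies as fractions of this limit.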
Speaker: Michelle Effros, California Institute of Technology, Pasadena, USA
Claude Shannon changed the world. By demonstrating the possibility of substantial and reliable communication even in the presence of noise, he sparked a communication revolution. While the impact of Shannon's solutions can be seen in the devices that shape our everyday lives, the impact of his questions is, perhaps, even more profound. This talk will explore Shannon's question of reliable communication and some of the questions that it continues to inspire to this day.
Speaker: Thomas Schneider, National Institutes of Health, USA
In this talk I will sweep across the major ideas I have
developed using information theory to understand biology (see
https://alum.mit.edu/www/toms/ ). We will begin with measuring the
information of protein or RNA binding sites on DNA or RNA (Rsequence,
bits per site) using Claude Shannon's information theory. The
resulting information curve can be displayed by the now-popular
graphical method of sequence logos, which we invented. The total
information of binding sites (area under a logo) is predicted by the
genome size and number of sites (Rfrequency, bits per site), and this
leads to a model for the evolution of binding sites which you can run
(https://alum.mit.edu/www/toms/papers/ev/). I will then introduce how
to apply the same theory to individual binding sites using sequence
walkers. An important question is the relationship between binding
site information and binding energy. This led to my discovery
that many molecular systems are 70% efficient. Surprisingly, the
mathematics that explains 70% efficiency applies to all biological
systems that have distinct states.
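The per-column computation behind a sequence logo can be sketched in a few lines: the information at each position is the maximum entropy, log2(4) = 2 bits for DNA, minus the Shannon entropy of the observed base frequencies there. The version below omits the small-sample correction Schneider's method applies, so it is only illustrative:

```python
import math
from collections import Counter

def per_position_information(sites):
    """Information (bits) at each column of aligned DNA binding sites:
    Rsequence_i = 2 - H_i, where H_i is the Shannon entropy of the base
    frequencies in column i (small-sample correction omitted)."""
    length = len(sites[0])
    info = []
    for i in range(length):
        counts = Counter(site[i] for site in sites)
        n = len(sites)
        h = -sum((c / n) * math.log2(c / n) for c in counts.values())
        info.append(2.0 - h)  # 2 bits = log2(4) possible bases
    return info

# Toy alignment: column 0 fully conserved (2 bits), column 1 uniform (0 bits).
sites = ["AA", "AC", "AG", "AT"]
print(per_position_information(sites))  # -> [2.0, 0.0]
```

The letter heights in a sequence logo column then scale these bits by each base's frequency, so a fully conserved position shows one tall letter of height 2 bits.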
Speaker: Satyadev Nandkumar, Indian Institute of Technology Kanpur
Shannon's pathbreaking work in laying the foundations of information theory
inspired Kolmogorov to tackle the seemingly paradoxical problem of
defining which individual finite strings are random. Kolmogorov was able
to achieve this without assuming an underlying probability space, by using
the theory of algorithms. The notion of information thus introduced,
Kolmogorov complexity, is formally analogous to Shannon entropy. We survey the formal analogies between the theory of Kolmogorov
complexity of finite strings and Shannon's entropy, as established by
Solomonoff, Kolmogorov, Martin-Löf, Levin, C.-P. Schnorr, P. Gács and G.
Chaitin. We describe a mechanical way to convert many inequalities in
Shannon theory to their corresponding versions in Kolmogorov complexity,
and indicate some open areas of research in the relationship between
Shannon Information and Kolmogorov Complexity.
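The analogy the abstract describes can be made concrete: standard Shannon inequalities have Kolmogorov-complexity counterparts that hold up to logarithmic additive terms. A sketch of two such pairs, subadditivity and the chain rule (the latter is the Kolmogorov-Levin symmetry of information), stated informally for plain complexity:

```latex
% Subadditivity: joint information is at most the sum of the parts.
H(X, Y) \le H(X) + H(Y)
\quad\longleftrightarrow\quad
K(x, y) \le K(x) + K(y) + O\bigl(\log(K(x) + K(y))\bigr)

% Chain rule (Shannon) vs. symmetry of information (Kolmogorov--Levin).
H(X, Y) = H(X) + H(Y \mid X)
\quad\longleftrightarrow\quad
K(x, y) = K(x) + K(y \mid x) + O\bigl(\log K(x, y)\bigr)
```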
Speaker: Adrish Banerjee, Indian Institute of Technology Kanpur
In this talk, we will explore some of Shannon's exploits outside of his academic pursuits. In particular, we will talk about his work on juggling, his solution to the Tower of Hanoi puzzle, his mind-reading machine for playing the game of matching pennies, and his chess-playing machine. Finally, we will conclude with a poem Shannon wrote about the Rubik's Cube.