WHAT IS INFORMATION?
Wojciech Szpankowski - Purdue University
Jan 24, 2007
Size: 219.6 MB
Abstract
Information permeates every corner of our lives and shapes our
universe. Understanding and harnessing information holds the potential for
significant advances. The breadth and depth of underlying concepts of
the science of information transcend traditional disciplinary boundaries
of scientific and commercial endeavors. Information can be manifested
in various forms: business information is measured in dollars;
chemical information is contained in shapes of molecules;
biological information stored and processed in our cells prolongs life.
So what is information? In this talk we first attempt to identify the
most important features of information and define it in the broadest
possible sense. We subsequently turn to the notion and theory of information
introduced by Claude Shannon in 1948 that served as the backbone for
digital communication. We go on to bridge Shannon information with
Boltzmann's entropy, Maxwell's demon, Landauer's principle and
Bennett's irreversible computations. We point out, however,
that while Shannon created a successful and beautiful theory
of information for communication, widespread application of information
theory to economics, biology, the life sciences, and complex networks still
awaits us. We shall discuss some examples that have recently cropped up in
biology, chemistry, computer science, and quantum physics. We conclude
with a list of challenges for future research.
We hope to put forward some educated questions, rather than answers,
to the issues and tools that lie before researchers interested in information.
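As a concrete illustration of the two notions the abstract connects (this sketch is not part of the talk itself), the short Python example below computes Shannon's entropy of a discrete distribution in bits, and the minimum energy cost of erasing one bit given by Landauer's principle; the 300 K room temperature is an illustrative assumption.

```python
import math

# Boltzmann's constant in joules per kelvin (CODATA value).
K_B = 1.380649e-23

def shannon_entropy(probs):
    """Shannon entropy H(X) = -sum(p * log2(p)) of a discrete
    distribution, in bits; terms with p = 0 contribute nothing."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def landauer_bound(temperature_kelvin):
    """Landauer's principle: erasing one bit dissipates at least
    k_B * T * ln(2) joules."""
    return K_B * temperature_kelvin * math.log(2)

# A fair coin toss carries exactly one bit of information ...
print(shannon_entropy([0.5, 0.5]))   # 1.0
# ... while a biased coin carries less than one bit.
print(shannon_entropy([0.9, 0.1]))
# Minimum energy to erase one bit at 300 K, in joules (about 3e-21 J).
print(landauer_bound(300.0))
```

The first quantity is Shannon's 1948 measure; the second is the thermodynamic price of forgetting it, which is the bridge to Boltzmann, Maxwell's demon, and Bennett discussed in the talk.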
About the Speaker
Before coming to Purdue, Wojciech Szpankowski was an assistant professor at the Technical University of Gdansk, and in 1984 he was an assistant professor at McGill University, Montreal. During 1992-93 he was professeur invité at INRIA, Rocquencourt, France. His research interests cover the analysis of algorithms, data compression, information theory, analytic combinatorics, random structures, networking, stability problems in distributed systems, modeling of computer systems and computer communication networks, queueing theory, and operations research. His recent work is devoted to the probabilistic analysis of algorithms on words, analytic information theory, and the design of efficient multimedia data compression schemes based on approximate pattern matching.
He is a recipient of the Humboldt Fellowship. He has been a guest editor for special issues in IEEE Transactions on Automatic Control, Theoretical Computer Science, Random Structures & Algorithms, and Algorithmica. Currently, he is editing a special issue on "Analysis of Algorithms" in Algorithmica. He serves on the editorial boards of Theoretical Computer Science, Discrete Mathematics and Theoretical Computer Science, and the book series Advances in the Theory of Computation and Computational Mathematics.
The views, opinions, and assumptions expressed in these videos are those of the presenter and do not necessarily reflect the official policy or position of CERIAS or Purdue University. All content included in these videos is the property of Purdue University, the presenter, and/or the presenter's organization, and is protected by U.S. and international copyright laws. The collection, arrangement, and assembly of all content in these videos and on the hosting website is the exclusive property of Purdue University. You may not copy, reproduce, distribute, publish, display, perform, modify, create derivative works from, transmit, or in any other way exploit any part of the copyrighted material without permission from CERIAS, Purdue University.