Mike's Notes
A great summary introduction from Quanta Magazine.
Resources
- https://www.quantamagazine.org/authors/brubaker_ben/
- https://en.wikipedia.org/wiki/Claude_Shannon
- https://www.quantamagazine.org/how-claude-shannons-concept-of-entropy-quantifies-information-20220906/
- https://www.linkedin.com/pulse/math-keeps-messages-error-free-quanta-magazine-v3tve/
- https://www.quantamagazine.org/how-math-can-improve-your-wordle-score-20220525/
- https://www.quantamagazine.org/how-claude-shannons-information-theory-invented-the-future-20201222/
- https://www.quantamagazine.org/what-is-entropy-a-measure-of-just-how-little-we-really-know-20241213/
- https://www.quantamagazine.org/quantum-theory-rebuilt-from-simple-physical-principles-20170830/
- https://www.quantamagazine.org/the-information-theory-of-life-20151119/
- https://www.quantamagazine.org/why-everything-in-the-universe-turns-more-complex-20250402/
- https://spectrum.ieee.org/claude-shannon-tinkerer-prankster-and-father-of-information-theory
- https://www.youtube.com/watch?v=fzTVHxLhjDc
- https://www.cell.com/fulltext/S0092-8674(13)00453-4
References
- Claude E. Shannon and Warren Weaver, The Mathematical Theory of Communication (University of Illinois Press, 1949).
Repository
- Home > Ajabbi Research > Library > Subscriptions > Fundamentals from Quanta Magazine
- Home > Handbook >
Last Updated
06/06/2025
What Is Information? The Answer Is a Surprise.
Ben Brubaker is a staff writer covering computer science for Quanta Magazine. He previously covered physics as a freelance journalist, and his writing has also appeared in Scientific American, Physics Today, and elsewhere. He has a Ph.D. in physics from Yale University and conducted postdoctoral research at the University of Colorado, Boulder.
It’s often said that we live in the information age. But what exactly is information? It seems a more nebulous resource than iron, steam and other key substances that have powered technological transformations. Indeed, information didn’t have a precise meaning until the work of the computer science pioneer Claude Shannon in the 1940s.
Shannon was inspired by a practical problem: What’s the most efficient way to transmit a message over a communication channel like a telephone line? To answer that question, it’s helpful to reframe it as a game. I choose a random number between 1 and 100, and your goal is to guess the number as quickly as possible by asking yes-or-no questions. “Is the number greater than zero?” is clearly a bad move — you already know that the answer will be yes, so there’s no point in asking. Intuitively, “Is the number greater than 50?” is the best opening move. That’s because the two possible answers are equally likely: Either way, you’ll learn something you couldn’t have predicted.
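The halving strategy is easy to make concrete. Here is a minimal sketch (the range 1–100, the question wording, and the `questions_needed` helper are illustrative choices, not anything from Shannon's paper): each "Is the number greater than the midpoint?" question cuts the remaining range in half, so any number from 1 to 100 is pinned down in at most 7 questions, roughly log2(100) ≈ 6.6.

```python
import math

def questions_needed(secret: int, low: int = 1, high: int = 100) -> int:
    """Count the yes/no questions the halving strategy uses to find `secret`."""
    count = 0
    while low < high:
        mid = (low + high) // 2
        count += 1               # ask: "Is the number greater than mid?"
        if secret > mid:
            low = mid + 1        # answer "yes": keep the upper half
        else:
            high = mid           # answer "no": keep the lower half
    return count

worst_case = max(questions_needed(n) for n in range(1, 101))
print(worst_case, math.ceil(math.log2(100)))   # 7 7 -- matches ceil(log2(100))
```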
In his famous 1948 paper “A Mathematical Theory of Communication,” Shannon devised a formula that translated this intuition into precise mathematical terms, and he showed how the same formula can be used to quantify the information in any message. Roughly speaking, the formula defines information as the number of yes-or-no questions needed to determine the contents of a message. More predictable messages, by this measure, contain less information, while more surprising ones are more informative. Shannon’s information theory laid the mathematical foundation for data storage and transmission methods that are now ubiquitous (including the error correction techniques that I discussed in the August 5, 2024, issue of Fundamentals). It also has more whimsical applications. As Patrick Honner explained in a 2022 column, information theory can help you win at the online word-guessing game Wordle.
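Shannon's measure is the entropy H = −Σ p(x) log2 p(x), in bits. A short sketch with made-up toy distributions (the probabilities below are only for illustration) shows the intuition from the paragraph above: a fair coin flip carries 1 bit, a heavily biased coin far less, and a uniformly random number between 1 and 100 about 6.64 bits, matching the roughly seven questions needed in the guessing game.

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over outcomes with p > 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))        # fair coin: 1.0 bit
print(entropy([0.99, 0.01]))      # biased coin: ~0.08 bits -- predictable, so less informative
print(entropy([1 / 100] * 100))   # uniform 1..100: ~6.64 bits, ~7 yes/no questions
```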
In a 2020 essay for Quanta, the electrical engineer David Tse reflected on a curious feature of information theory. Shannon developed his iconic formula to solve a real-world engineering problem, yet the underlying mathematics is so elegant and pervasive that it increasingly seems as if he hit upon something more fundamental. “It’s as if he discovered the universe’s laws of communication, rather than inventing them,” Tse wrote. Indeed, Shannon’s information theory has turned out to have unexpected connections to many different subjects in physics and biology.
What’s New and Noteworthy
The first surprising link between information theory and physics was already present in Shannon’s seminal paper. Shannon had previously discussed his theory with the legendary mathematician John von Neumann, who observed that Shannon’s formula for information resembled the formula for a mysterious quantity called entropy that plays a central role in the laws of thermodynamics. Last year, Zack Savitsky traced the history of entropy from its origins in the physics of steam engines to the nanoscale “information engines” that researchers are developing today. It’s a beautiful piece of science writing that also explores the philosophical implications of introducing information — an inherently subjective quantity — into the laws of physics.
Such philosophical questions are especially relevant for researchers studying quantum theory. The laws of quantum physics were devised in the 1920s to explain the behavior of atoms and molecules. But in the past few decades, researchers have realized that it’s possible to derive all the same laws from principles that don’t seem to have anything to do with physics — instead, they’re based on information. In 2017, Philip Ball explored what researchers have learned from these attempts to rebuild quantum theory.
Physics isn’t the only field influenced by ideas from information theory. Soon after Shannon’s paper, information became central to the way researchers think about genetics. More recently, some researchers have brought principles from information theory to bear on some of the thorniest questions in biology. In a 2015 Q&A with Kevin Hartnett, the biologist Christoph Adami described how he uses information theory to explore the origins of life. In April, Ball wrote about a new effort to reframe biological evolution as a special case of a more fundamental “functional information theory” that drives the emergence of complexity in the universe. This theory is still speculative, but it illustrates the striking extent of information theory’s influence.
As the astrobiologist Michael Wong told Ball, “Information itself might be a vital parameter of the cosmos, similar to mass, charge and energy.” One thing seems certain: Researchers studying information can surely expect more surprises in the coming years.