What is information theory? | Journey into information theory | Computer Science | Khan Academy

Imagine Alice has an idea, and she wants to share it. There are so many ways to share an idea: she could draw a picture, make an engraving, write a song, send a telegraph or an email. But how are these things different? And, more importantly, why are they the same? This story is about a fundamental particle of all forms of communication. It begins with a special skill you likely take for granted: language. All language allows you to take a thought, or mental object, and break it down into a series of conceptual chunks. These chunks are externalized using a series of signals or symbols. Humans express themselves using variations in sound and physical action, as do chirping birds and dancing bees, and man-made machines exchange a dancing stream of electrical vibrations. Even our bodies are built according to instructions stored inside microscopic books known as DNA. All are different forms of one thing: information.

In simplest terms, information is what allows one mind to influence another. It’s based on the idea of communication as selection. Information, no matter the form, can be measured using a fundamental unit, in the same way we can measure the mass of different objects using a standard measure such as kilograms or pounds. This allows us to precisely measure and compare the weight of, say, rocks, water, or wheat using a scale. Information, too, can be measured and compared, using a measurement called entropy. Think of it as an information scale. We intuitively know that a single page from some unknown book has less information than the entire book. We can describe exactly how much using a unit called the bit, a measure of surprise.

So no matter how Alice wants to communicate a specific message, in hieroglyphics, music, or computer code, each would contain the same number of bits, though in different densities. And a bit is linked to a very simple idea: the answer to a yes-or-no question. Think of it as the language of coins.

So how is information actually measured? Does information have a speed limit? A maximum density? Information theory holds the exciting answers to these questions. It’s an idea over 3,000 years in the making. But before we can understand this, we must step back and explore perhaps the most powerful invention in human history: the alphabet. And for this, we return to the cave.
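
To make the “bit as a yes-or-no question” and the “information scale” ideas concrete, here is a minimal Python sketch. It is an editorial illustration, not from the video; the function names and example values are invented.

```python
import math
from collections import Counter

def bits_to_identify(n_outcomes: int) -> float:
    # Each yes/no question can at best halve the remaining candidates,
    # so identifying one of n equally likely outcomes takes log2(n) bits.
    return math.log2(n_outcomes)

def entropy_bits_per_symbol(message: str) -> float:
    # Shannon entropy H = -sum(p * log2(p)) over symbol frequencies:
    # an "information scale" for a message, in bits per symbol.
    total = len(message)
    return -sum((count / total) * math.log2(count / total)
                for count in Counter(message).values())

print(bits_to_identify(52))                    # ~5.70 bits to name one playing card
print(entropy_bits_per_symbol("hello world"))  # ~2.85 bits per symbol
```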

17 thoughts on “What is information theory? | Journey into information theory | Computer Science | Khan Academy”

  1. Dejan Bogosavljev

    I must say that this is the best intro to a topic I have ever heard; it did its task perfectly, making me curious about the topic and willing to learn. I don't know what exactly caught me, but there's something special in this particular combination of words, pronunciations, and sounds.

  2. How many bits? Want to have fun answering these questions with me? (I think research is fun.) How many bits are in a hard disk? How many bits in the human mind? How many bits in the universe? If you would like to ponder these questions by having a conversation and researching the web, contact me.

  3. Again, a wrong definition of what "amount of information" is. "A bit is linked to a very simple idea, the answer to a YES or NO question. Think of it as the language of coins." That yes-or-no question assumes the two possible answers refer to two events that are exactly equally probable. If you do not explain the PROBABILITY, then your definition is TOTALLY incorrect and confusing. Too bad it has so many views.
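
    To make the probability point above concrete, here is a small Python sketch of the binary entropy function (an editorial illustration, not part of the original comment or video): a yes-or-no answer carries a full bit only when both outcomes are equally likely.

    ```python
    import math

    def answer_entropy(p_yes: float) -> float:
        # Shannon entropy of a yes/no answer with P(yes) = p_yes, in bits:
        # H(p) = -p*log2(p) - (1 - p)*log2(1 - p)
        if p_yes in (0.0, 1.0):
            return 0.0  # a certain answer carries no information
        return -p_yes * math.log2(p_yes) - (1 - p_yes) * math.log2(1 - p_yes)

    print(answer_entropy(0.5))  # 1.0  -> fair coin: the answer is worth a full bit
    print(answer_entropy(0.9))  # ~0.47 -> predictable answer: less than half a bit
    ```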

  4. Minimum information theory: in a situation where no one shares context, how much information do you minimally need to share if you cannot use body language, like pointing a finger? You want the person in the room to look at the thing you found interesting; how much information do you need to share? "Look at the ugly clock hanging over the exit door, on the left side."
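
    Reading this as "communication as selection", here is a toy Python sketch (the object list and counts are hypothetical, invented for illustration): given a shared, agreed-upon list of candidates, picking one of N takes about log2(N) yes/no answers.

    ```python
    import math

    # Hypothetical shared context: eight things the listener might look at.
    objects = ["clock", "exit door", "window", "lamp",
               "chair", "table", "shelf", "rug"]

    # With an agreed ordering, each yes/no answer halves the candidates,
    # so selecting one object needs ceil(log2(N)) answers.
    print(math.ceil(math.log2(len(objects))))  # 3 -> e.g. "Is it in the first half?" ...
    ```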

  5. Thinking of reality as information relies on the fallacy that all ideas are on the same level of abstraction.

    For example, saying that an almost perfectly defined, not very abstract concept, like a specific chair, is on the same level of abstraction as a very loosely defined, very abstract concept, like pride, is fallacious.

    This is why computers still cannot think as critically and as creatively as humans do. Artificial intelligence relies on the false assumption that all concepts are perfectly defined and at the same level of abstraction. This presumed single level of abstraction is called “information”.

    For everything to be on the same level of abstraction, one could only look at the Universe at the infinitesimal level. Here, the level of abstraction is zero. It’s the most fundamental level of reality.

    But of course that would be computationally impossible, as well as nonsensical, because at the infinitesimal level there are no such things as observers, like humans or computers.

  6. The claim at 2:22, based on 2:42, is simply not true. "Information" refers to the length, complexity, and meaning of the message, while bits are just one unit among many for transmitting and encoding state changes. Information's encoding is not necessarily binary, and when a message is expressed in a binary format or medium, the number of bits needed can explode: in reality, information is hardly ever emitted by an exclusively binary source without any context, so encoding, translating, or converting such messages into an artificial binary scheme increases the number of bits needed to represent the message losslessly, preserving its informational meaning.
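
    One way to quantify the conversion this comment describes, as a rough Python sketch (an editorial illustration, assuming symbols drawn uniformly from a fixed alphabet): re-encoding in binary multiplies the symbol count by about log2(k), while the information content in bits stays the same.

    ```python
    import math

    def min_bits(length: int, alphabet_size: int) -> int:
        # Lower bound on bits to losslessly encode `length` symbols drawn
        # uniformly from an alphabet of `alphabet_size`: ceil(n * log2(k)).
        return math.ceil(length * math.log2(alphabet_size))

    # A 100-symbol message over a 26-letter alphabet vs. over {0, 1}:
    print(min_bits(100, 26))  # 471 -> 471 binary digits replace 100 letters
    print(min_bits(100, 2))   # 100 -> already binary: one bit per symbol
    ```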
