
Information entropy | Journey into information theory | Computer Science | Khan Academy



Finally, we arrive at our quantitative measure of entropy.


Computer Science on Khan Academy: Learn select topics from computer science – algorithms (how we solve common problems in computer science and measure the efficiency of our solutions), cryptography (how we protect secret information), and information theory (how we encode and compress information).

About Khan Academy: Khan Academy is a nonprofit with a mission to provide a free, world-class education for anyone, anywhere. We believe learners of all ages should have unlimited access to free educational content they can master at their own pace. We use intelligent software, deep data analytics and intuitive user interfaces to help students and teachers around the world. Our resources cover preschool through early college education, including math, biology, chemistry, physics, economics, finance, history, grammar and more. We offer free personalized SAT test prep in partnership with the test developer, the College Board. Khan Academy has been translated into dozens of languages, and 100 million people use our platform worldwide every year. For more information, visit www.khanacademy.org, join us on Facebook or follow us on Twitter at @khanacademy. And remember, you can learn anything.

For free. For everyone. Forever. #YouCanLearnAnything



23 Comments
  1. 이영섭 says

    Wonderful idea, using "bounces" to express the amount of information. It's so exciting.

  2. Samir EL ZEIN says

    If only you didn't read from a script.

  3. Tyler Clark says

    Yessss I finally got the concept after this video.

  4. surya charan says

    What a video! This is how education should be.

  5. Zain Khandwala says

    Not to knock this, but I want to voice an issue I have with it and every other video I've found on the topic: they always use probabilities that are an integral power of 1/2, which greatly simplifies the explanation but doesn't generalize to the majority of real-world scenarios, which this simplified exposition doesn't adequately cover. I worry, then, that people come away thinking they understand the topic better than they actually do. Of course, I'm open to the perspective of others here…
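To the point above: with non-dyadic probabilities the entropy formula H = -Σ p·log2(p) still applies; it just stops being an exact whole-number average of questions. A minimal Python sketch (the example distributions are chosen here for illustration, not taken from the video):

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Dyadic case, as in the video: every p is a power of 1/2,
# so H is an exact average of whole-number question counts.
print(entropy([0.5, 0.25, 0.25]))   # 1.5 bits

# Non-dyadic case: a biased coin with p = 0.3 gives a fractional
# entropy that no single yes/no questioning scheme hits exactly.
print(entropy([0.3, 0.7]))          # ≈ 0.881 bits
```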

  6. 서창범 says

    I understood it really well.

  7. Alexander Soare says

    This should have more likes!

  8. Vivek Subramanian says

    Normally, I would give this video a like or a pass without a second thought, but for a Khan Academy video, which I hold to a very high standard since I have seen high quality videos from them in the past, the fact that this video had mistakes and glossed over some equations made me drop a thumbs down. Sorry, you guys can do better than this.

  9. Vàng Văn Lợi says

    The Vietnamese subtitles at 4:34 are wrong; machine 2 produces less information than machine 1.

  10. BlackStar Nae says

    I'm really having a hard fucking time with entropy.

  11. Youssef Dirani says

    At 4:45, is it Markoff or Markov?

  12. Kemp Isabel says

    This video blew my mind. Thank you! I love these intelligent yet fun videos!

  13. Zain Ul Abydeen says

    Can anyone explain how the answer becomes 3/2 in the solved example? Any help would be appreciated.
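On the 3/2 question: it falls out of weighting each outcome's question count by its probability. Assuming the solved example uses the distribution {1/2, 1/4, 1/4} (an assumption here; check against the video), a quick Python sketch:

```python
import math

# Entropy = expected number of yes/no questions. A symbol with
# probability p (a power of 1/2) sits log2(1/p) questions deep
# in the optimal question tree.
probs = [0.5, 0.25, 0.25]                      # assumed distribution
questions = [math.log2(1 / p) for p in probs]  # [1.0, 2.0, 2.0]
H = sum(p * q for p, q in zip(probs, questions))
print(H)  # 0.5*1 + 0.25*2 + 0.25*2 = 1.5 = 3/2
```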

  14. Kartik Bansal says

    Loved the piano bit towards the conclusion!

  15. Sammy Z says

    Why can't we ask whether it is AB, for the second distribution, same as the first distribution?

  16. Pedro Gorilla says

    I have asked several professors at different universities and in different countries why we adopted a binary system to process information, and they all answered that it's because you can modulate it with electricity: the state is either on or off. That never satisfied me. Today I finally understand the deeper meaning and the brilliance of binary states in computing and their interface with our reality.

  17. Jonathan Gasser says

    Wow, what a presentation!

  18. Jay Rar says

    So just to clarify: is the reason the decision tree for machine B is not the same as for machine A that it asks fewer questions overall? And how do you ensure that the structure of the decision tree is such that it asks the minimum number of questions?
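On the second question above: the standard way to build a question tree with the minimum expected number of questions is Huffman's greedy rule — repeatedly merge the two least likely subtrees. A sketch, with machine B's probabilities assumed to be {1/2, 1/4, 1/8, 1/8} for illustration (not necessarily the video's exact numbers):

```python
import heapq
import itertools

def huffman_lengths(probs):
    """Return each symbol's depth in an optimal (Huffman) question tree.

    Merging the two least likely subtrees at every step provably
    minimizes the expected number of yes/no questions.
    """
    counter = itertools.count()          # unique tie-breaker for the heap
    heap = [(p, next(counter), [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    depths = [0] * len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:                # every merged symbol gains one question
            depths[s] += 1
        heapq.heappush(heap, (p1 + p2, next(counter), s1 + s2))
    return depths

# Uniform machine A vs. an assumed skewed machine B
for probs in ([0.25] * 4, [0.5, 0.25, 0.125, 0.125]):
    d = huffman_lengths(probs)
    print(probs, d, sum(p * q for p, q in zip(probs, d)))
```

The uniform machine needs 2 questions per symbol; the skewed one averages only 1.75, which is exactly its entropy because its probabilities are powers of 1/2.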

  19. Грим Морген says

    The concept had been presented to me in some online course, but until this video I didn't really understand it. Thank you!

  20. Maierdan Efan says

    Thank you!

  21. Shep Bryan says

    Why is the number of bounces the log of the number of outcomes?
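On the question above: each bounce is a binary split, so d bounces can distinguish 2^d equally likely outcomes; inverting gives d = log2(outcomes). A tiny sketch:

```python
import math

# Each bounce doubles the number of distinguishable outcomes,
# so the bounce count recovers as log2 of the outcome count.
for bounces in range(1, 5):
    outcomes = 2 ** bounces
    print(bounces, outcomes, math.log2(outcomes))
```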

  22. Mostafa Omar says

    Thank you. This explains the intuition behind entropy very clearly.
