Summer of Math Exposition

Entropy Actually Explained [Machine Learning | Information Theory]

Entropy is one of the most important, yet least understood, statistical quantities. This video clears up your questions and takes a deep dive into Entropy as it appears in different areas of mathematics, in particular Artificial Intelligence. We build an ML Decision Tree Classifier from scratch!
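For a concrete sense of the quantity the video builds its classifier around, here is a minimal Python sketch, not taken from the video itself, of Shannon entropy and the information-gain criterion that entropy-based decision trees use to choose splits. The function names and example data are illustrative assumptions.

import math
from collections import Counter

def entropy(labels):
    # Shannon entropy H = sum_i p_i * log2(1 / p_i), measured in bits.
    counts = Counter(labels)
    total = len(labels)
    return sum((c / total) * math.log2(total / c) for c in counts.values())

def information_gain(parent, left, right):
    # How much a split reduces entropy: H(parent) minus the
    # size-weighted entropy of the two child nodes.
    n = len(parent)
    weighted = (len(left) / n) * entropy(left) + (len(right) / n) * entropy(right)
    return entropy(parent) - weighted

# A fair coin carries 1 bit of uncertainty; a constant outcome carries none.
print(entropy(["H", "T", "H", "T"]))   # 1.0
print(entropy(["H", "H", "H", "H"]))   # 0.0

# An entropy-based decision tree picks the split with the largest gain.
parent = ["yes", "yes", "yes", "no", "no", "no"]
print(information_gain(parent, ["yes", "yes", "yes"], ["no", "no", "no"]))  # 1.0 (perfect split)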

Analytics

6.1 Overall score*
25 Votes
11 Comments
Rank 40

Comments

I really enjoyed your video, learned a lot of new information from it, so thank you!

7

An excellent video. I like the historical context with Gibbs. An ambitious topic, and you did a great job of covering it.

8.8

I can tell you put a lot of effort into this video, but I had a very hard time following it. You assumed a lot of background familiarity with things without making it clear that that was expected, and you often presented equations without fully explaining them, resulting in me completely losing track of what was going on about 10 minutes in. Furthermore, there were several parts of the video where the visuals/equations completely disappeared for 10-20 seconds, which only exacerbated the issue (I myself always make a point of watching my entire video right before uploading to make sure I catch things like this).

3.1

very informative, but a bit dense

4.9

This is not really bad, but it's waaaay too long. It should be split into, say, four or five videos.

4.8

That's too long.

4.9

It goes from the basics to some complexity with clear explanations, but this is a classroom-teacher video (with good examples). It is long for somebody who is not interested, but I watched to the end (as it reminded me of my signal processing course at university).

5.7

Great video with a comprehensive introduction. Very logical and clearly explained. Seems to assume a good amount of mathematical maturity: if someone is mathematically proficient enough to know exactly what "injective" and "random variable" mean off the top of their head, they probably already know a good portion of the introduction. It is still important to lay the foundation for the other 80% of the video, so making it clearer somewhere what the prerequisites are would help. There are also a couple of references to various fields of math, physics, CS, etc. that might fly over the average viewer's head, which is not a huge deal but could be off-putting. The video is interesting for someone who wants to dive into information theory, but maybe a bit dry for a more casual math enthusiast; I think it could benefit from more examples, real-world applications or analogies, motivation at the beginning, or meta-commentary. It was a bit difficult to follow at times because of the monotonous nature of formula after formula. It's already very long, but to be easier to follow it would need fewer assumptions (when you say "We can see that ...", can the viewer necessarily see it easily?), more walking through the full logic, and so on.

4.5

I remember you posting this on the server. I think it's a crazy accomplishment of a video. Has some new creator dust, but overall I really enjoyed this one.

6.9

Feels a bit more like an average lecture, with lots of concepts coming seemingly from nowhere. Audio quality could be improved with a bit of audio processing (to reduce noise).

4.2

I appreciate the history and sourcing of information you've produced on how to understand entropy as uncertainty, as well as how it applies to many fields of study. A lot was covered, as you've said, thank you! I find that I'll need to rewatch this a few times, but I would not mind, because the comprehensive treatment is engaging to learn from. In terms of a SoME entry, I'd recommend breaking this down into digestible chunks like 3b1b's "Essence of Linear Algebra" playlist.

6.2