Summer of Math Exposition

Latent Space Visualisation: PCA, t-SNE, UMAP | Deep Learning Animated

We explore three common dimensionality reduction techniques, keeping in mind their application to deep learning.
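
As a taste of what the video covers, here is a minimal sketch (our own illustration, not code from the video) that projects the 64-dimensional scikit-learn digits dataset to 2D with each of the three techniques; it assumes scikit-learn, matplotlib, and the umap-learn package are installed.

```python
# Minimal sketch: PCA, t-SNE, and UMAP side by side on the digits dataset.
# Assumes scikit-learn, matplotlib, and umap-learn (pip install umap-learn).
import matplotlib.pyplot as plt
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.manifold import TSNE
import umap

X, y = load_digits(return_X_y=True)  # 1797 samples, 64 features each

embeddings = {
    "PCA": PCA(n_components=2).fit_transform(X),
    "t-SNE": TSNE(n_components=2, perplexity=30).fit_transform(X),
    "UMAP": umap.UMAP(n_components=2).fit_transform(X),
}

fig, axes = plt.subplots(1, 3, figsize=(15, 5))
for ax, (name, emb) in zip(axes, embeddings.items()):
    ax.scatter(emb[:, 0], emb[:, 1], c=y, cmap="tab10", s=5)
    ax.set_title(name)
plt.show()
```

Of the three, only PCA is a deterministic linear projection; t-SNE and UMAP are stochastic, so their layouts vary from run to run.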

Analytics

Overall score: 7
Votes: 44
Comments: 12
Rank: 6

Comments

This video was really fantastic. I am not a coder and know very little about deep learning, but the material was presented very clearly and beautifully. I was able to follow almost everything, though I did get a little confused by some of the t-SNE and UMAP details. I guess that's not surprising. However, I believe I got the gist of the three methods, so the author succeeded in their stated goal. Well done!

9

Great video! I really learned a lot, especially because I only really understood PCA before. You did go through the linear algebra somewhat quickly, though (speaking as someone who hasn't taken linear algebra yet), so maybe take a little more time to explain eigenvectors and such.

8.5
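
For readers who, like the commenter above, want a slower look at where the eigenvectors come in: below is a minimal from-scratch sketch of the standard PCA recipe (our own illustration, not code from the video). The principal components are the eigenvectors of the data's covariance matrix, ordered by decreasing eigenvalue (i.e., by how much variance each direction captures).

```python
# PCA from scratch: eigenvectors of the covariance matrix.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))           # toy data: 200 samples, 5 features
Xc = X - X.mean(axis=0)                 # center each feature

cov = Xc.T @ Xc / (len(Xc) - 1)         # 5x5 sample covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)  # eigh: symmetric input, ascending order

order = np.argsort(eigvals)[::-1]       # re-sort by decreasing variance
components = eigvecs[:, order[:2]]      # keep the top-2 principal components

X2d = Xc @ components                   # project the data to 2D
print(X2d.shape)                        # (200, 2)
```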

Good work. Very informative and fairly clear, though a few unstated details could have improved accessibility for non-specialists. The visual style is very good.

7

The animations greatly support the exposition, and the narration is clear and focused. It seems like this kind of video might be very useful to a lot of interested folks. Motivation: 10/10, clarity: 8/10, novelty: 8/10, memorability: 5/10.

5.7

This video is excellent! ... Before watching it, my only basic knowledge was that there exists a method in statistics and data mining called PCA... After watching the video, I have a much more precise idea of how and why one can reduce dimensions in data analysis, and of when each method is used to reach an accurate and proper visualization... The explanation is super clear and easy to follow... All in all, in my opinion, this video represents the spirit of the SoME competition and deserves a high ranking among all the others presented this year... Very, very good work!!

8.7

It's been a while since I took Linear Algebra, but this was probably the clearest explanation of PCA I've seen! The explanations of the other two algorithms were also done well (though I imagine they could be confusing for people who don't know what, e.g., the KL divergence is).

6.2
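
For anyone in the same position as the commenter above: the KL divergence is the mismatch measure t-SNE minimizes between P, the distribution of pairwise similarities in the original space, and Q, the corresponding distribution in the low-dimensional embedding. In the standard formulation (not notation taken from the video):

```latex
\mathrm{KL}(P \,\|\, Q) = \sum_{i \neq j} p_{ij} \log \frac{p_{ij}}{q_{ij}}
```

It is zero only when Q matches P exactly, and it penalizes placing truly similar points (large p_ij) far apart (small q_ij) much more than the reverse, which is why t-SNE is so good at preserving local structure.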

Very nice explanation at an elementary level.

8.2

Very interesting topic, beautifully animated. I already knew about PCA, but I hadn't heard about the other methods yet. I learned a lot, thank you! I have downloaded one of the papers you link in the description; I plan to read it soon. If you want to see my personal approach to explaining PCA, here's a video I published recently: https://www.youtube.com/watch?v=iNEK9IEWKQI Thanks again, great video!

8.1

Well animated, well explained. Nice pace. Great topic. Enjoyed the video.

7.1

Great submission. Thanks. Years ago, I read this blog and enjoyed it. You might like it as well: https://colah.github.io/posts/2014-10-Visualizing-MNIST/

6.9

Excellent. Focuses on its topic and takes its time for a complete exposition. The visualization is always spot on.

9

Some more emphasis on motivation would have been helpful: why, and for what, do we need dimensionality reduction? Nice visualisations!

7.2