This is a very nice presentation. I like the tie-in to data science and the covariance matrix. One small thing: at 27:22 you say you can compute V similarly to U. But there is a hazard: the eigenvectors of V are dependent on the choices made for U (even Gil Strang ran into this issue). It's best to substitute U back into the original decomposition definition and solve for V (the remaining unknown). I'm enjoying this series.
@AllAnglesMath
3 months ago
Thanks for pointing that out!
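A minimal NumPy sketch of the substitution approach described in the comment above (with a made-up matrix, not one from the video): once U and the singular values are known, each column of V follows from v_i = Aᵀu_i / σ_i, which keeps the signs consistent with the choices made for U instead of eigendecomposing AᵀA independently.

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 3.0]])

# Eigendecomposition of A A^T gives U and the squared singular values.
eigvals, U = np.linalg.eigh(A @ A.T)
order = np.argsort(eigvals)[::-1]          # sort by decreasing singular value
eigvals, U = eigvals[order], U[:, order]
sigma = np.sqrt(eigvals)

# Instead of eigendecomposing A^T A (whose eigenvector signs would be
# chosen independently of U), substitute U back into A = U S V^T
# and solve for the remaining unknown:  V = A^T U S^{-1}.
V = A.T @ U / sigma

# Reconstruction check: U S V^T recovers A.
assert np.allclose((U * sigma) @ V.T, A)
```

This assumes a full-rank square A for simplicity; for rank-deficient matrices the zero singular values need separate handling.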
@JobBouwman
18 days ago
Some remarks: 1) At 9:50 you say that the height has a bigger variance than the shoe sizes. However, this depends on your unit of measurement! If you had measured the heights in meters, this would not be the case. (A way to standardize is to use the "coefficient of variation", which is the standard deviation divided by the mean value.) 2) At around 12:30 you state that in the PCA analysis, the height would be the principal component. That's not the case either. The principal component is a new, diagonal axis that describes "the average relative bone length factor". This factor predicts the height from the shoe size and vice versa. The part of the data which is not explained by this factor is the second principal component.
@AllAnglesMath
18 days ago
Thank you so much for these corrections. I love it when I learn something in the comments.
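Both remarks above can be illustrated with a small NumPy sketch on made-up height/shoe-size data (the numbers are hypothetical, not from the video): the variance changes with the unit, the coefficient of variation does not, and the first principal component mixes both variables rather than lying along the height axis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: heights in cm and shoe sizes, strongly correlated.
height_cm = rng.normal(175, 8, 500)
shoe_size = 0.25 * height_cm + rng.normal(0, 1, 500)

# Remark 1: variance depends on the unit of measurement
# (switching cm -> m scales the variance by exactly 100^2).
height_m = height_cm / 100

# The coefficient of variation (std / mean) is unit-free.
cv_cm = np.std(height_cm) / np.mean(height_cm)
cv_m = np.std(height_m) / np.mean(height_m)
assert np.isclose(cv_cm, cv_m)

# Remark 2: the first principal component is a diagonal axis mixing
# both variables, not the height axis itself.
X = np.column_stack([height_cm, shoe_size])
X = X - X.mean(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.cov(X.T))
pc1 = eigvecs[:, np.argmax(eigvals)]
print(pc1)   # both components are nonzero: a mix of height and shoe size
```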
@robin1826
4 days ago
Thank you! Excellent series, the highest quality explanations. I signed up for the Patreon. Keep up the incredible work. What a tremendous gift to the world!
@rylieweaver1516
3 months ago
Love your vids as always 🙌🏻
@oxbmaths
3 months ago
Very nice video. At 10:00 should the green and purple lines correspond to the respective lengths of the major and minor axes instead?
@AllAnglesMath
3 months ago
Yes, they probably should. That's a subtlety that escaped me. Thanks for sharing!
@oxbmaths
3 months ago
@@AllAnglesMath 😊 Thanks. Keep up the good work!
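The correction discussed in this thread can be checked numerically (a sketch with an assumed covariance matrix, not data from the video): the semi-axis lengths of the 1-sigma data ellipse are the square roots of the covariance eigenvalues, along the corresponding eigenvectors.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical correlated 2D data cloud.
cov_true = np.array([[4.0, 1.5],
                     [1.5, 1.0]])
X = rng.multivariate_normal([0, 0], cov_true, size=5000)

# The 1-sigma ellipse of the data has semi-axis lengths equal to the
# square roots of the covariance eigenvalues, along the eigenvectors.
eigvals, eigvecs = np.linalg.eigh(np.cov(X.T))
major, minor = np.sqrt(eigvals[::-1])      # eigh returns ascending order
print(major, minor)   # lengths of the major and minor semi-axes
```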
@hellfishii
3 months ago
If you are brave enough, read Linear Algebra Done Right; this shit is not PG.
@ngruhn
3 months ago
18:24 subtly bashing the Last Jedi 👍
@AllAnglesMath
3 months ago
Not even very subtle to be honest. It was just the perfect example for showing what happens to a column with all zeroes 😆
@desmondcampbell9358
3 months ago
Fantastic exposition. Thanks very much for your great work and insights.
@williammartin4416
3 months ago
Fantastic lecture
@mph8759
3 months ago
Thank you for the well-explained video. I wonder how this could be applied to financial modelling and risk analysis. My first thought is to run a Monte Carlo analysis with as many variables as possible and record all variable values together with the output (for example profit or IRR). Then "just" do the "ellipse thing" to figure out which variables are the most impactful?
@AllAnglesMath
3 months ago
Sounds like an amazing application. Ambitious, but it can be done.
@mph8759
3 months ago
@@AllAnglesMath Unfortunately I'm not advanced enough at math... Really enjoyed the video though.
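The idea from this thread can be sketched in a few lines of NumPy (a toy profit model with made-up variables, not a real financial one): sample the inputs, record them alongside the output, standardize everything so units don't dominate, and read the impact of each variable off the covariance/correlation structure.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000

# Hypothetical Monte Carlo inputs for a toy profit model.
price  = rng.normal(100, 10, n)
volume = rng.normal(1000, 50, n)
cost   = rng.normal(60, 2, n)

profit = (price - cost) * volume   # toy model, purely illustrative

# Standardize so the result is unit-free (variance depends on units,
# as noted in an earlier comment), then look at how strongly each
# input correlates with the output.
data = np.column_stack([price, volume, cost, profit])
z = (data - data.mean(axis=0)) / data.std(axis=0)
corr = z.T @ z / n
print(corr[-1, :-1])   # correlation of profit with price, volume, cost
```

Here price dominates because its spread, scaled by its effect on profit, is largest; a full treatment would use proper global sensitivity analysis, but this captures the "ellipse thing" intuition.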
@tiruarthanari3039
1 month ago
I enjoy going through your videos. Thank you for your effort and time. How are you animating your slides? What software are you using?
@AllAnglesMath
1 month ago
A custom-made Python library, with OpenCV to generate the final video.
@shahulrahman2516
3 months ago
Keep doing such lectures. Kudos
@05degrees
3 months ago
👏👍
@gfbtfbtfilyfxbtyewqqef
3 months ago
Aw man, if only this had been uploaded before my linear algebra exam, I would've had a better understanding.
@AllAnglesMath
3 months ago
The video was already available on Patreon for several months. I hope your exam went well!
@APaleDot
3 months ago
tfw The Last Jedi was the only good movie in the final trilogy 😔
Comments: 26