I'm loving the frequent videos! Such a happy surprise to open KZitem and see a new functional analysis video every day! :)
@brightsideofmaths
4 years ago
Thank you! I am working hard at the moment :)
@saadtahir96
4 years ago
@@brightsideofmaths Thank you for this! Can you please share your email address or inbox me at saadtahir96@gmail.com? I have some useful material that you may like, and which may ultimately help me with this course too! :D
@brightsideofmaths
4 years ago
@@saadtahir96 kzitem.infoabout
@NorwegianFr34k
4 years ago
Not even joking, yesterday I was about to ask whether you would make a video on this topic. So this was a nice surprise :D
@RepTheoAndFriends
3 years ago
I know that this is a meme, but: I really enjoy the statements of basic functional analysis, because they resemble things from representation theory (a continuous function H -> F seems to be an analytic version of an exact functor from a 'nice' triangulated category T to k-Vect). For a certain triangulated category T one can show that K_0(T) is the power series ring k[[t]]. Altogether we obtain a map K_0(T) = k[[t]] -> k = K_0(k-Vect) (at least after tensoring with k). The statement is that every such functor which is continuous (i.e. exact and sends arbitrary coproducts to arbitrary coproducts) is representable (this is the Brown representability theorem).
@jorgearturomartinezsanchez4882
2 years ago
Thank you, good sir. I'm writing my thesis and never took functional analysis so your videos help a lot
@psyspin
4 years ago
Congratulations man! This is an amazing intro to a topic that I like very much (although I am not a mathematician) but I struggle to understand through my self-study. It really helps me a lot! Again congrats and keep up the good work :)
@brightsideofmaths
4 years ago
Thank you! I am glad that I can help :)
@qiaohuizhou6960
3 years ago
Hi, thank you so much for your video! I am sorry if I am throwing too many questions at you on the same day... I am wondering whether you could share some insight on why x_l must belong to the orthogonal complement of the kernel of l? I know the kernel is a subspace of a vector space, and I know the row space (or column space) is orthogonal to the null space. I can sort of follow every step up to where l(x) = ⟨x_l, x⟩, but I don't get the insight behind choosing x_l from the orthogonal complement of the kernel. Also, it seems this special x_l is analogous to the singular vector in a finite-dimensional space... are these two concepts somehow connected? Sorry, I didn't major in maths and have a very limited background in all sorts of maths subjects. I hope you don't find my question naive or lacking in basic understanding. I would be glad if you could point me in the right direction of study!
@brightsideofmaths
3 years ago
Don't worry at all. All questions are welcome here; even naive ones can help other viewers quite a lot. The choice of x_l makes sense here because, in the inner product, all elements of ker(l) have to be sent to 0 as well. That is precisely what the inner product can do.
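Spelled out in symbols (writing the representation as l(x) = ⟨x_l, x⟩; the slot convention here is my reading of the series, so take it as an assumption):

```latex
\[
  k \in \ker(l) \;\Longrightarrow\; 0 = l(k) = \langle x_l, k\rangle ,
\]
\[
  \text{so } \langle x_l, k\rangle = 0 \text{ for all } k \in \ker(l),
  \quad\text{i.e.}\quad x_l \in \ker(l)^{\perp}.
\]
```

In other words, any representing vector is forced to be orthogonal to the whole kernel, which is why it is natural to look for x_l inside ker(l)^⊥ from the start.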
@arturo3511
A year ago
At 4:45, is it always true that, by continuity, the pre-images of closed sets are closed? You said that continuity translates to closed sets via complements. I don't understand what you mean by complements; is there an extra criterion for it to translate to closed sets? Or is it always true that, for a continuous map, the pre-image of a closed set is closed? I'm simply asking this to know whether it's possible for the preimage of a closed set to be open, which wouldn't go against the definition of continuity we saw. Thank you! Additionally, it seems that at 5:16, x_l can be defined by any x^ (x-hat) that satisfies the given properties; is it true that only one x^ satisfies these properties, since x_l is unique?
@brightsideofmaths
A year ago
By the abstract definition of continuity we have: preimages of open sets are open. This translates to: preimages of closed sets are closed. Please also note that a set can be closed and open at the same time.
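The translation via complements is a standard set identity, not specific to this video:

```latex
\[
  f^{-1}(A^{c}) \;=\; \{\, x : f(x) \notin A \,\} \;=\; \bigl(f^{-1}(A)\bigr)^{c}.
\]
% If A is closed, then A^c is open, so f^{-1}(A^c) is open by continuity,
% and its complement f^{-1}(A) is therefore closed.
```

So the preimage of a closed set under a continuous map is always closed; no extra criterion is needed.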
@tensorfeld295
3 years ago
Can you do a course on differential geometry? Starting elementary then continuing with manifolds. Maybe you can do something with Banach- and Hilbert-Manifolds. Would be nice! ^^
@StratosFair
2 years ago
I wanted to give myself a quick refresher on the proof of the Riesz representation theorem, and this was extremely clear and helpful, just like I remembered it to be! I hope you will get the chance to cover orthogonal projections at some point.
@MikhailBarabanovA
3 years ago
Finally an answer to why we can just transpose vector space elements and they work as functionals. Thanks!
@anne-catherine_gagne
5 months ago
Thank you! Your video really helped me understand the material better. I feel more confident for my final tomorrow.
@brightsideofmaths
5 months ago
Nice! Good luck :)
@hoijanlai
3 years ago
The course I am taking also has a step that proves that the dimension of the orthogonal complement of ker(l) is 1. Do you know why that is? Thanks
@dibeos
3 years ago
I don't understand why λ·x̂ − x is in the kernel of l... (7:08)
@brightsideofmaths
3 years ago
If you apply the map l to it, you get zero. This is the same calculation as the one done in the blue brackets above.
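Explicitly, with λ = l(x)/l(x̂) as chosen in the proof (my reading of the construction):

```latex
\[
  l(\lambda \hat{x} - x)
  \;=\; \lambda\, l(\hat{x}) - l(x)
  \;=\; \frac{l(x)}{l(\hat{x})}\, l(\hat{x}) - l(x)
  \;=\; 0,
  \qquad\text{so}\qquad \lambda \hat{x} - x \in \ker(l).
\]
```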
@dibeos
3 years ago
Ahhh, got it! And by the way, thanks for the videos. They are really amazing. I even already signed up to your website on steadyhq.
@brightsideofmaths
3 years ago
@@dibeos Thank you very much :)
@RangQuid
A year ago
The proofs are very elegant, they really bring out the beauty and bright side of mathematics!
@brightsideofmaths
A year ago
Glad you like them! And thanks for the support :)
@AadityaVicramSaraf
A year ago
I'm unsure if I'm being dumb, but at 6:34, doesn't the complex conjugate come when we multiply by a scalar in the second component? I might be confused, but kindly clarify.
@AadityaVicramSaraf
A year ago
On Wikipedia, and in Conway's functional analysis book as well, I saw that it is the conjugate for the second component and normal for the first component. I checked out your previous video, though, where you said linear in the second component.
@AadityaVicramSaraf
A year ago
Ok, sorry, I rewatched that video and found at 6:18 that you clarified that you had chosen this definition. I also understood that ultimately it is there to ensure positivity, so it is our choice to put linearity in the first (or second) argument. Thanks. Things are much clearer now.
@brightsideofmaths
A year ago
Great :)
@Hold_it
4 years ago
I hope you still get enough sleep with all these high quality videos coming out in a short time ;)
@luciaperez4400
A year ago
Excellent video! Could you point to a reference for the proof that the orthogonal complement of a closed set in a Hilbert space contains elements other than 0?
@lonjezosithole6285
3 years ago
I am learning a lot from your videos, man. Thank you for posting this content
@h-bar8649
11 months ago
No clue where you would put it, but it would be great if somehow the Fréchet and Gâteaux derivatives were discussed in this functional analysis series. Unless you think they belong elsewhere? Thanks for the videos!
@brightsideofmaths
11 months ago
Great suggestion!
@mathieumaticien
3 years ago
Why does the closedness of ker(l) imply that ker(l)^ortho is nontrivial?
@brightsideofmaths
3 years ago
We also have the assumption that ker(l) is not the whole space. Hence closedness means that ker(l) is a proper closed subspace, and itself a Hilbert space inside the Hilbert space X. Does this already help you?
@mathieumaticien
3 years ago
@@brightsideofmaths Hmmm, now I'm wondering why the closedness is necessary. If we say ker(l) is a strict subset of X, and k is in ker(l), and we let x be in X but not in ker(l), then ⟨x, k⟩ = 0 by definition, so x is in ker(l)^⊥. Since 0 is in ker(l) and we defined x to not be in ker(l), x is not 0, and ker(l)^⊥ is nontrivial. Where does the closedness of ker(l) come into play?
@hanfsi
3 years ago
@@mathieumaticien To even be able to split the whole space into a subspace and its orthogonal complement, you need to apply the Hilbert projection theorem (which is done implicitly in the video), and that theorem requires a closed subspace (just look at its proof). So it's really a condition imposed by that theorem if you want to be able to split the space up in the first place.
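Concretely, the argument runs like this (a sketch, using the Hilbert projection theorem as stated in standard references):

```latex
% Assume ker(l) is closed and ker(l) != X. The projection theorem gives
\[
  X \;=\; \ker(l) \,\oplus\, \ker(l)^{\perp}.
\]
% Pick x in X \setminus ker(l) and decompose x = k + h with
% k in ker(l) and h in ker(l)^perp. If h were 0, then x = k would
% lie in ker(l), a contradiction; hence h != 0 and ker(l)^perp
% is nontrivial.
```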
@hyperduality2838
2 years ago
Domain (pre-image) is dual to the co-domain (image) -- rank nullity theorem in linear algebra. Isomorphism (sameness) is dual to homomorphism (similar or relative sameness) -- Group Theory.
@scollyer.tuition
3 years ago
In a finite-dimensional Euclidean space, we often represent linear functionals via row vectors, which map column vectors into the underlying field via a dot/inner product. I guess the Riesz representation theorem guarantees: a) that this operation can be justified rigorously, and b) that the analogue of this operation also exists in infinite-dimensional vector spaces.
@brightsideofmaths
3 years ago
I think that is a short, rough summary one can always have in mind. However, in infinite-dimensional spaces some technical details are involved as well: we need completeness, for example, and the dual space consists of *continuous* functionals.
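In the finite-dimensional case the correspondence can be checked directly; here is a minimal NumPy sketch (the vector `x_l` and the functional `l` are made-up examples, not from the video):

```python
import numpy as np

# A linear functional on R^3, represented by a "row vector" x_l
# acting via the dot product: l(x) = <x_l, x>.
x_l = np.array([2.0, -1.0, 3.0])

def l(x):
    return x_l @ x

# Riesz direction: recover the representing vector from the
# functional alone, by evaluating l on the standard basis.
recovered = np.array([l(e) for e in np.eye(3)])
assert np.allclose(recovered, x_l)

# Linearity check: l(a*x + y) == a*l(x) + l(y)
x = np.array([1.0, 0.0, 2.0])
y = np.array([0.5, 4.0, -1.0])
a = 3.0
assert np.isclose(l(a * x + y), a * l(x) + l(y))
```

The infinite-dimensional theorem says essentially that this picture survives, provided the space is complete and the functional is continuous.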
@anowarali668
A year ago
Thanks for the video. My doubt is: whenever you enter l(x^) into the inner product, you take the conjugate of l(x^). Why? We know the conjugate comes when we multiply by a scalar in the second term of the inner product. Please clear this up.
@brightsideofmaths
A year ago
I defined the conjugate in the first term of the inner product.
@anowarali668
A year ago
@@brightsideofmaths Is it not against the inner product formula, since we know ⟨ax, y⟩ = a⟨x, y⟩ and ⟨x, by⟩ = b*⟨x, y⟩?
@brightsideofmaths
A year ago
@@anowarali668 What is not against it?
@anowarali668
A year ago
@@brightsideofmaths l(x^)
@brightsideofmaths
A year ago
@@anowarali668 As I said: we defined the inner product with the property ⟨x, by⟩ = b⟨x, y⟩.
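For reference, the two conventions being compared (which slot is conjugate-linear is a pure choice; both are stated here as I understand the discussion, not quoted from the video verbatim):

```latex
% Convention used in this series: conjugate-linear in the first
% argument, linear in the second:
\[
  \langle \lambda x, y\rangle = \overline{\lambda}\,\langle x, y\rangle,
  \qquad
  \langle x, \lambda y\rangle = \lambda\,\langle x, y\rangle .
\]
% Convention in Conway and on Wikipedia: linear in the first
% argument, conjugate-linear in the second:
\[
  \langle \lambda x, y\rangle = \lambda\,\langle x, y\rangle,
  \qquad
  \langle x, \lambda y\rangle = \overline{\lambda}\,\langle x, y\rangle .
\]
```

Both conventions yield the same theory; one just fixes where the conjugate lands.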
@ecologypig
2 years ago
Thanks for your super helpful videos! 😀 I have a quick question: how do we know that $x_l := l(\hat{x})\,\hat{x}$ is still inside the set $X$? Since we have scaled $\hat{x}$ by $l(\hat{x})$, and the scaling might be large, could it be that $x_l$ now lies outside of the set $X$?
@brightsideofmaths
2 years ago
X is not just a set but a vector space. Hence you can never leave it just by scaling :)
@ecologypig
2 years ago
@@brightsideofmaths oh got you! Thanks very much for your quick reply!😃
@xwyl
2 years ago
With your constructed x̂, the proof goes like a knife through butter. But it raises a bigger question: how did you come up with the construction?
@brightsideofmaths
2 years ago
Thanks. We know the start and the goal. One just tries to fill in the gaps and finds x_l.
@xwyl
2 years ago
@@brightsideofmaths I'm trying to understand this without any construction (for these constructions were perhaps invented after the theorem was proven, and may hinder deeper understanding). The original inspiration may be the Euclidean space R^n. Consider a vector r in R^3, r = (x,0,0) + (0,y,0) + (0,0,z). When we study l(r), taking l((x,0,0)) for example, the linearity of l implies that l((x,0,0)) is just a multiple of x; therefore l(r) is just an inner product ⟨x_l, r⟩ for some fixed vector x_l. This also implies that dim(ker(l)) = n-1 for the space R^n. Knowing that x_l exists, any vector is the sum of a part parallel to x_l and a part orthogonal to x_l. Then it's natural to propose the unit parallel component x̂ (meaning x_l is a multiple of x̂), and the parallel part is easily l(x)/l(x̂) · x̂ = λ · x̂. The next big leap is at 6:17, where ⟨x_l, x⟩ is miraculously put there. It's natural to approach it from l(x) = ⟨x_l, x⟩, comparing with the equation l(x) = λ · l(x̂); knowing that x_l is a multiple of x̂, say x_l = a · x̂, we finally get ⟨a · x̂, x⟩ = λ · l(x̂), and solving gives a = l(x̂), i.e. x_l = a · x̂ = l(x̂) · x̂. And this process can be generalized to Hilbert spaces. Sorry for the messy writing, but the reasoning is completely natural without any prior construction, all from what we already have in the derivation. I prefer this derivation, for it's more basic and learner-friendly.
@tigernov_425
2 years ago
Why is the bound for the norm of l given by the norm of l evaluated at a unit vector of X?
@Domzies
3 years ago
3:50 has made me realise I didn't understand this. The professor in my functional analysis course did say that the theorem wouldn't work without X being a Hilbert space, but he didn't explicitly say why. Judging from your video, I also probably don't understand orthogonal projectors as well as I'd like to. I've tried looking into the book Functional Analysis by Peter Lax, but got even more confused. There it almost seems like you need a vector subspace (not just an arbitrary set) in order to even define an orthogonal complement. Besides this, it would seem that the classical relation from linear algebra, namely that X = Y ⊕ Y^⊥ for any vector subspace Y, only holds true in a general Hilbert space if Y is closed?
@brightsideofmaths
3 years ago
This is something I really want to cover later :)
@zaccandels6695
3 months ago
Excellent video
@brightsideofmaths
3 months ago
Thank you very much!
@moritzbecker5703
3 years ago
Thank you very much for your excellent videos!
@hectormerinocruz7965
3 years ago
A beautiful and very well explained proof of this important theorem.
@zazinjozaza6193
4 years ago
Wow this was a really cool topic, can't wait to see the applications.
@JR-iu8yl
2 years ago
Thank You
@weirdo-jw9kc
4 years ago
Do a series on topology and algebra too. If you have done one before, please share the link. I like how you present the ideas, and it gives the right intuition.
@pan19682
2 years ago
We are looking forward to you giving us a video series on topology.
@brightsideofmaths
2 years ago
Check out my manifold series :)
@Zero-es-natural
3 years ago
Great video!
@chenliou2578
4 years ago
Thx
@munausef3891
2 years ago
Thanks, good explanation. Can you give me a site with questions to solve? Thx
@jaimelima2420
4 years ago
Thank you so much!
@TheWombatGuru
4 years ago
Thank you for this video :)
@RohanKumar-zn4qg
4 years ago
Can you share your slides?
@brightsideofmaths
4 years ago
Oh sorry! I totally forgot. Now they are all in :) steadyhq.com/en/brightsideofmaths/posts/c6641292-1666-4a24-a4b9-cd9c4147d7d3
@RohanKumar-zn4qg
4 years ago
@@brightsideofmaths it is asking for member access... please share on some open source platform
@brightsideofmaths
4 years ago
@@RohanKumar-zn4qg PDFs are a perk for my Steady members.
@lorenzougo6571
9 months ago
want to cry, calculus 3 incoming hahahah
@sanjursan
3 years ago
My proof is much simpler than this. Of course, it is wrong!
@JaspreetSingh-zp2nm
A month ago
Why does the orthogonal complement being closed mean it has to contain something other than the zero vector? Closedness is something topological, so I am confused here. In finite dimensions the Gram-Schmidt process may help, but in general I am not sure.
@brightsideofmaths
A month ago
The orthogonal complement is always a closed set. So maybe you can clarify your question?
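Closedness of the orthogonal complement itself follows from the continuity of the inner product; a short sketch, for any subset M of X:

```latex
% Let x_n in M^perp with x_n -> x. For every m in M,
\[
  \langle x, m\rangle \;=\; \lim_{n\to\infty} \langle x_n, m\rangle \;=\; 0,
\]
% so x in M^perp, i.e. M^perp is closed.
```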
Comments: 84