This channel is so underrated. There is a lot of info if you are learning this for the first time, but the best approach is to watch the video 3-4 times.
@AICoffeeBreak3 жыл бұрын
Thanks, I appreciate your words! If you think the channel is underrated, then do not hesitate to share the content! 😁 By doing so, you can actively help! Thank you so much!
@malekaburaddaha59103 жыл бұрын
I have been watching videos about GNNs for two days and I got the idea, but here I completely understood everything. Thank you very much, I am glad that I found your channel. Keep going!
@mirjunaid262 жыл бұрын
The way you break down and explain the mathematical formulae of GNN is amazing and beautiful. At the same time introducing and clearing the concept of permutation-invariance in such a short video is commendable. Thank you ❤️ Liking, sharing, & subscribing.
@AICoffeeBreak2 жыл бұрын
Wow, thank you! This comment made my day.
@vincent_hall3 жыл бұрын
Love it, thanks Coffee Bean. I'd worked with Graph Theory and worked a lot with NNs, but I didn't know what Graph convolutional NNs were. Thanks for updating my skills. I came here because I kept seeing the "Graph Neural Network" term everywhere.
@AICoffeeBreak3 жыл бұрын
Ms. Coffee Bean is so glad this video was helpful for you!
@nicohambauer3 жыл бұрын
Nice! Keep up the good work! You are a true researcher, helping us other researchers keep track of the really relevant things in the literature!
@swarajshinde39504 жыл бұрын
Thank you for such Clear and Concise information !!
@DavenH3 жыл бұрын
Really well presented and animated. Keep it up!
@MCMelonslice2 жыл бұрын
I love your channel. Just found you yesterday, and as a nearly complete scrub in ML, this helps build a solid foundation!
@miladaghajohari23083 жыл бұрын
That is a nice intro. Thanks for taking the time to make it.
@AICoffeeBreak3 жыл бұрын
Glad you like it!
@firdawsyahya37493 жыл бұрын
I finally have the formula on lock. Thank you
@UnrecycleRubdish3 жыл бұрын
Very cute and entertaining idea with the coffee bean. Makes an otherwise dry subject a little bit... moist? Anyway, thank you for the informative video. FYI, I played it at 1.25x speed because she spoke a little too slowly for me! Great work!
@AICoffeeBreak3 жыл бұрын
Thanks for watching! The speaking speed is something I am still adjusting to find my pace. Happy that you used the speedup functionality to help yourself. :)
@pureeight70033 жыл бұрын
I find this video very useful. I have subscribed!
@AICoffeeBreak3 жыл бұрын
Awesome! 👍
@pradyumnagupta39893 жыл бұрын
Oh my GOD, I have been trying to study GNNs for so long and this is by far the best video I have seen on this topic. Thank you for clarifying that "convolution" is misleading in GCNs.
@AICoffeeBreak3 жыл бұрын
Thanks, I really tried to convey the important concepts. The other Schnick-schnack in GNNs is very good at scaring away people trying to learn about GNNs for the first time.
@chronomo972 жыл бұрын
Great intro!
@deepakravikumar6743 жыл бұрын
2:54 - I was happy. 3:00 - RIP.
@AICoffeeBreak3 жыл бұрын
😅
@boscojay13813 жыл бұрын
I was about to leave... then her voice at the end said, "hey, do not forget to like & subscribe...", and that's how she caught her fish!
@AICoffeeBreak3 жыл бұрын
Love your "fish-catching" formulation.😅 Is it right for her to assume from your profile picture that you like sailing?
@boscojay13813 жыл бұрын
@@AICoffeeBreak you’re absolutely right! lol
@AICoffeeBreak3 жыл бұрын
Cool! I will tell you a secret: Ms. Coffee Bean loves sailing too (but she is quite the beginner in the matter)!
@TheAIEpiphany3 жыл бұрын
Really useful and I like the creative animations! Keep up the great work, subscribed!
@AICoffeeBreak3 жыл бұрын
Thanks for the sub, really nice to have you around! I discovered your channel just yesterday and subscribed, but completely unrelated to this comment you left here! 😎
@TheAIEpiphany3 жыл бұрын
@@AICoffeeBreak heheh nice, glad to hear that and glad to be here!
@thomastorku90023 жыл бұрын
She is phenomenal
@AICoffeeBreak3 жыл бұрын
You are phenomenal!
@samarthagarwal72193 жыл бұрын
Nice explanation thanks!
@AICoffeeBreak3 жыл бұрын
Glad to help!
@omarmafia2343 жыл бұрын
I can not thank you enough!! Great great work!
@AICoffeeBreak3 жыл бұрын
So glad it helped! Means a lot to Ms. Coffee Bean.
@ИльясХарунов2 жыл бұрын
I can't quite understand how the weighted sum of neighbour vectors is permutation invariant. For example, if we swap the node 4 and node 2 vectors, then the node 4 vector will now get the weight c_i2 and the node 2 vector will get c_i4. Why won't the sum change?
@AICoffeeBreak Жыл бұрын
First, note that the coefficient c_ij belongs to node j itself (it comes from the node degrees), so it is not reassigned when we change the order in which we list the neighbours. We multiply node 4's vector representation with the shared weight matrix, and we multiply node 2's vector representation with the same matrix, then add the terms up. Since addition is commutative, swapping the order of the terms in the sum does not change anything.
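To make the commutativity argument concrete, here is a tiny numerical sketch of my own (not code from the video; U, h, and c are just stand-ins for the shared weight matrix, the node vectors, and the normalization coefficients from the formula):

```python
import numpy as np

rng = np.random.default_rng(0)
U = rng.normal(size=(4, 4))            # shared weight matrix applied to every neighbour
h = {2: rng.normal(size=4),            # feature vector of node 2
     4: rng.normal(size=4)}            # feature vector of node 4
c = {2: 2.0, 4: 3.0}                   # normalization coefficients c_i2, c_i4

def aggregate(order):
    # sum over neighbours in the given order; c[j] stays attached to its node j
    return sum((1.0 / c[j]) * (U @ h[j]) for j in order)

print(np.allclose(aggregate([2, 4]), aggregate([4, 2])))  # True: the order does not matter
```

The coefficient is looked up by node id, not by position in the list, which is exactly why permuting the neighbours leaves the sum unchanged.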
@wellisonraul582510 ай бұрын
Thank you!
@nikhilmanali920 Жыл бұрын
Thank you so much.
@abhinavmishra94013 жыл бұрын
This is so beautiful. I am crying. Thanks a lot
@AICoffeeBreak3 жыл бұрын
Thank you so much! It means a lot to me that it meant something to you!
@robertramji7613 жыл бұрын
Such a clear and accessible explanation, thank you!!!
@AICoffeeBreak3 жыл бұрын
Great to hear, thanks! ☺️
@DeepFindr3 жыл бұрын
I also made a video on GNNs but I have to admit yours is more compressed and gets faster "to the point" :) Well done!
@AICoffeeBreak3 жыл бұрын
Haha, "to the point" is kind of the motto of the whole channel! But especially for everyone keen to find out everything around the topic, they have you! 😉
@AICoffeeBreak3 жыл бұрын
BTW, great channel! Your GNN videos are also accompanied by a blog post! You have something for everyone!
@DeepFindr3 жыл бұрын
Thanks! Sounds good, I look forward to your future videos :)
@sunaryaseo2 жыл бұрын
Thanks very much for the explanation. I still didn't get the notation of 'W' and 'U' in the formula. Where are they exactly in the figure? Are they on the edges of the graph, so we can multiply them with the 'H'? Or somewhere else (another figure) you didn't show? If there is another NN after this graph, I am curious how you connect this graph with that NN.
@prachi07kgp Жыл бұрын
Very helpful video, such complicated concept explained so beautifully and in simple manner
@newbie8051 Жыл бұрын
Great explanation, loved this. Thanks a ton ma'am !
@sir_aken9706 Жыл бұрын
Just discovered this channel and, ngl, I think I'm in love with Coffee Bean 😂 Very good and succinct video.
@rohith24542 жыл бұрын
Thanks a lot !
@7369393 жыл бұрын
Are you sure that for each "h" we need to apply a weight matrix "W", and not a scalar "w"???
@AICoffeeBreak3 жыл бұрын
Each h is a vector, therefore W is a matrix. 😃
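In terms of shapes, a tiny illustration of my own (the dimensions are made up): a matrix W can map the vector h into a new vector space, whereas a scalar w could only rescale h without mixing its dimensions.

```python
import numpy as np

d_in, d_out = 4, 3
h = np.random.randn(d_in)         # node feature vector
W = np.random.randn(d_out, d_in)  # weight matrix: maps d_in-dim vectors to d_out-dim vectors
print((W @ h).shape)              # (3,) -- a scalar w would only give w * h, same dimension as h
```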
@7369393 жыл бұрын
@@AICoffeeBreak Thank you.
@denshaSai2 жыл бұрын
So what is the ground truth and loss function for graphs? (How do you actually learn the weights?)
@AICoffeeBreak2 жыл бұрын
Depends on what you are trying to do: for node classification, for example, you have a cross-entropy loss for predicting the label of each node.
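As a rough sketch of my own (not from the video), a per-node cross-entropy loss could look like this in PyTorch, assuming the GNN has already produced one logit vector per node:

```python
import torch
import torch.nn as nn

N, num_classes = 4, 3
node_logits = torch.randn(N, num_classes, requires_grad=True)  # stand-in for GNN outputs
node_labels = torch.tensor([0, 2, 1, 1])                        # one ground-truth class per node

loss = nn.functional.cross_entropy(node_logits, node_labels)    # averaged over all nodes
loss.backward()                                                  # gradients for learning the weights
```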
@ditherycarbon86612 жыл бұрын
@@AICoffeeBreak So, we are still using backpropagation to learn the weights, right? And how do we get the final output from the individual node embeddings? (Considering that the output is global for the entire graph and not local to each node.) Thanks in advance
@AICoffeeBreak2 жыл бұрын
@@ditherycarbon8661 Yes, it is still backprop and gradient descent. If we need to classify the whole graph, we apply a few extra classification layers on top of an aggregation of all node embeddings. Just an idea (GNNs are not my area of expertise): one could also do graph classification if there are so many GNN layers that the information has had time to smooth out over the entire graph; then a single output node alone could say something about the whole graph.
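A minimal sketch of my own of such a graph-level readout (mean pooling is just one possible aggregation choice; the classifier head and shapes are made up for illustration):

```python
import torch
import torch.nn as nn

N, d, num_classes = 5, 8, 3
node_embeddings = torch.randn(N, d, requires_grad=True)  # stand-in for the last GNN layer's output

# Graph-level readout: aggregate all node embeddings (here: mean pooling),
# then apply a small classification head on top.
readout = node_embeddings.mean(dim=0)                     # shape (d,)
classifier = nn.Linear(d, num_classes)
logits = classifier(readout).unsqueeze(0)                 # shape (1, num_classes)

target = torch.tensor([1])                                # ground-truth label for the whole graph
loss = nn.functional.cross_entropy(logits, target)
loss.backward()  # gradients flow back through the readout into the node embeddings
                 # (and, in a full model, into the GNN weights)
```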
@ditherycarbon86612 жыл бұрын
@@AICoffeeBreak Thanks, makes sense
@diwakerkumar5910 Жыл бұрын
Thank you
@VenkatIyer3 жыл бұрын
Is there an easy implementation of simple GNNs available somewhere? On GitHub I can see only sophisticated approaches for weird problems.
@AICoffeeBreak Жыл бұрын
If there is one, I would also like to know. :)
@sukanya44982 жыл бұрын
Great introduction with examples 👏🏽🙌🏽👍🏽!!
@AICoffeeBreak2 жыл бұрын
Glad you liked it!
@cesar73silva3 жыл бұрын
Yes. This is good.
@AICoffeeBreak3 жыл бұрын
Thanks!
@cesar73silva3 жыл бұрын
@@AICoffeeBreak A question: how do I get the initial representations for h_i? Or what possible ways are there?
@AICoffeeBreak3 жыл бұрын
@@cesar73silva Depends a lot on what you want to do. As an example: for NLP, it is quite common to initialize h_0 with word embeddings. In applications where you really have nothing to start with, h_0 could even be initialized randomly.
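For illustration only (the 'embeddings' lookup below is a hypothetical stand-in for a pretrained embedding table, not a specific library), the two options could look like this:

```python
import numpy as np

d = 300  # embedding dimension

# Option 1 (e.g. NLP): initialize h_0 from a pretrained word-embedding lookup.
embeddings = {"coffee": np.random.randn(d), "bean": np.random.randn(d)}  # hypothetical table
h0_node_a = embeddings["coffee"]

# Option 2: no meaningful node features available -> random initialization.
h0_node_b = np.random.randn(d)
```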
@scottt93822 жыл бұрын
This is excellent
@keithsanders63092 жыл бұрын
This was a helpful and stellar video!!
@nicholasliu-sontag15853 жыл бұрын
Thanks for this video. I wanted to ask: does this video not also describe GNNs with RNNs, given that the nodes you describe have some short-term memory?
@st0a Жыл бұрын
Really cute channel. Why you don't have more subscribers is beyond my understanding.
@prathamprasoon25352 жыл бұрын
Awesome video!
@AICoffeeBreak2 жыл бұрын
Thanks! You are THE Pratham Prasoon! I know you from Twitter. 😄
@safaelaat18682 жыл бұрын
Thank you very much for all your videos. We always hear that computer vision performance has exceeded human performance. But where does this information come from, and how about other domains like NLP, speech recognition, financial fraud detection, autonomous driving, and malware detection? Thanks again.
@zhangkin78962 жыл бұрын
Hi Ms. Coffee Bean, could I translate the video and publish it on another video platform (bilibili) in that language, citing the original source?
@AICoffeeBreak2 жыл бұрын
Hey, thanks for reaching out! I would love to upload on bilibili myself, the problem is that I do not get past the account creation verification due to my poor Chinese. I would love some help with that. Would you mind writing me an email? ☺️
@goldfishjy953 жыл бұрын
Thank you so much! #prayinghandsemoji
@rembautimes88083 жыл бұрын
The term $\sum_{j \in N(i)} \frac{1}{c_{ij}} U h_j^t$: does this follow from the Kolmogorov-Arnold representation theorem for continuous multivariate functions? Excellent video, I may add.
@AICoffeeBreak3 жыл бұрын
There is certainly a striking similarity to it. But I am not really sure, I can only cite a paper I found about this and point you to it: "The Kolmogorov-Arnold representation decomposes a multivariate function into an interior and an outer function and therefore has indeed a similar structure as a neural network with two hidden layers. But there are distinctive differences. One of the main obstacles is that the outer function depends on the represented function and can be wildly varying even if the represented function is smooth." export.arxiv.org/pdf/2007.15884.pdf
@ehsanelahi1902 жыл бұрын
Good explanation of such a dry topic
@AICoffeeBreak2 жыл бұрын
🏜️ thanks!
@CallSaul4892 жыл бұрын
Really nice explanation!
@l3nn134 ай бұрын
I would love some links or names of the original papers you are referring to.
@ckng61262 жыл бұрын
Very succinct explanation
@bibs20912 жыл бұрын
i still have many questions in mind, but this is surely a very good introduction
@heathenfire2 жыл бұрын
Wow, this was such a good explanation.
@sarahjamal862 жыл бұрын
Well done 🥳
@marcusbluestone2822 Жыл бұрын
Well explained. Thanks!
@IndoPakComparison2 жыл бұрын
Very basic info, but you still used terms and notations that are hard for a new learner, haha. But it is a good one. 加油 (keep it up)!
@christianmoreno73902 жыл бұрын
You have a beautiful accent girl
@sachin634422 жыл бұрын
Why can't you use XGBoost or decision trees for node-level classification instead of a GCN?
@baqirhusain56522 жыл бұрын
I love you ma'am
@odiiibo Жыл бұрын
One-way communication. No discussion in the comments. What is the right word for it? A lecture. Thank you.
@AICoffeeBreak Жыл бұрын
Yeah, I try my best to keep up with many comments. I am one person and there are many questions in the world, but I do always find some time to respond to lots of questions. Is there any in particular you would have liked to see answered?
@odiiibo Жыл бұрын
@@AICoffeeBreak Sorry, I was wrong about discussions here. I just grew tired of this issue elsewhere. I see a problem of centralization in producing these models, where predictions are actually influenced by data selection. Do you want to discuss this?
@AICoffeeBreak Жыл бұрын
@@odiiibo I mean, sure. It is not only the data selection that matters, but also the prompt that, for example, ChatGPT always sees but that is not shown in the conversation UI. In other words, there are a lot of ways in which the large data and model holders can influence predictions. Therefore it is great to see that open source and organizations gathered on the Internet (LAION, Eleuther, OpenAssistant) are breathing down their necks. :) Or do you think open source is not enough?
@odiiibo Жыл бұрын
@@AICoffeeBreak Open source is fine but has its limits. I have a couple of concerns. One of them is fundamental: a board of experts is not able to create the tool that suits all applications. The obstacle is the same as with the semantic web. Life supplies us with unexpected situations, and even computer-aided knowledge is limited. Material products can be produced by specialists, but knowledge and communication can't fit into a rigid frame. Word meanings are created each second by the thousands. In any software, the same function name can have drastically different meanings, and prompt history is not enough to overcome this shortcoming. Isn't it so?
@AICoffeeBreak Жыл бұрын
@@odiiibo This is why we need research into models that update their knowledge. Systems relying on knowledge bases would be one direction: you update the president in the knowledge graph, and the model relying on it is kept up to date this way.
@tallwaters9708 Жыл бұрын
GCNNs, what a horrible name!...
@AICoffeeBreak11 ай бұрын
Gone are the times of AlexNet... 😅
@kvnptl44002 ай бұрын
Honestly, I couldn't understand it on the first attempt :(
@amanoswal73913 жыл бұрын
The coffee bean acts as a distraction. Good explanation otherwise.