*My takeaways:*
1. Background 0:48
2. Graph neural networks (GNN) and neural message passing 6:35
 - Gated GNN 26:35
 - Graph convolutional networks 29:27
3. Expressing GGNNs as matrix operations 33:36
4. GNN application examples 41:25
5. Other models as special cases of GNNs 47:53
6. ML in practice 49:28
@shawnz9833 2 years ago
Can you help with this: what is an MLP in "you multiply it with a single-layer MLP" @23:29?
@leixun 2 years ago
@@shawnz9833 Multilayer perceptron
@shawnz9833 2 years ago
@@leixun Cool, thank you mate!
@leixun 2 years ago
@@shawnz9833 You're welcome mate. I'm doing some deep learning research as well; you're welcome to check out our research talks on my YouTube channel.
@mehmetf.demirel8647 4 years ago
Great talk! The audience questions were helpful, but I felt there were a bit too many of them, to the point that they negatively affected the flow of the talk.
@susmitislam1910 3 years ago
So GNNs are basically something like computing word embeddings in NLP. We have a dataset describing the relationships between pairs of words (nodes), and we want a vector representation that reflects how often they co-occur (the weight of the edge between the nodes), i.e., how related the two words are. Once we have such vectors, we can build a vanilla, recurrent, or convolutional neural net to learn a mapping between the vectors and the output we desire.
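The analogy above can be made concrete with a tiny sketch. This is my own illustrative numpy code, not anything from the talk; the 3-node graph, state size, and single-layer-"MLP" weights are all made up:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy graph: A[i, j] = 1 means an edge from node i to node j.
A = np.array([[0, 0, 0],
              [1, 0, 0],
              [1, 1, 0]])
D = 4                              # node state dimension
H = rng.standard_normal((3, D))    # current node states, one row per node
W = rng.standard_normal((D, D))    # single-layer "MLP" applied to messages

# One round of neural message passing: each node sums the (transformed)
# states of the nodes pointing AT it, then applies a nonlinearity.
H_new = np.tanh(A.T @ (H @ W))
print(H_new.shape)                 # (3, 4): one updated state per node
```

After a few such rounds, the rows of H_new play the same role as word vectors: a downstream feed-forward or recurrent net can map them to whatever output you want.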
@syllogismo 3 years ago
Don't know why people are criticizing this video and the audience. Great introduction to graph neural networks!
@iltseng 4 years ago
At 34:11, the product of matrix A and matrix N should be [b + c; c; 0].
@MingshanJia 4 years ago
Actually, what is used here are the incoming edges (see 14:55), but it's true that the slide is confusing about that.
@michaelkovaliov8877 4 years ago
It seems the mistake is in the graph adjacency matrix, because the result vector is correct given the drawing of the graph.
@PenguinMaths 4 years ago
That is a mistake in the slide: A should be transposed to describe the incoming edges instead of the outgoing ones.
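A quick numpy check of that point, with a made-up 3-node graph chosen to reproduce the [b + c; c; 0] vector discussed above (the letters stand for scalar node states):

```python
import numpy as np

# A[i, j] = 1 means an edge FROM node i TO node j.
# Edges here: b->a, c->a, c->b (nodes ordered a, b, c).
A = np.array([[0, 0, 0],
              [1, 0, 0],
              [1, 1, 0]])

a, b, c = 1.0, 2.0, 3.0            # stand-in scalar node states
N = np.array([a, b, c])

outgoing = A @ N                   # sums states of the nodes each node points TO
incoming = A.T @ N                 # sums states of the nodes pointing AT each node

print(incoming)                    # [5. 3. 0.], i.e. [b + c, c, 0]
```

So the slide's [b + c; c; 0] result corresponds to A.T @ N, aggregation over incoming edges, which is exactly the transpose fix described above.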
@khwajawisal1220 4 years ago
It's using Einstein notation, not the normal one that we use.
@kristofneys2349 3 years ago
There are many mistakes and confusing comments in this presentation; no wonder the audience keeps asking questions. Not a good talk at all...
@ashutoshukey3803 3 years ago
The presenter's explanations and high-level diagrams were phenomenal. Questions from the audience definitely messed up the flow of the presentation quite a bit, though.
@rembautimes8808 3 years ago
"Spherical cow" - the funniest analogy yet for neural net layers. Great talk.
@Peaceluvr18 3 years ago
Wow, what an excellent presentation, from someone with an ML background. It explains the basics a bit but also covers deep concepts, with super clear graphics! Seriously, whoever made the graphics for this - can I hire you to do my slide graphics? I also thought it was very cool that the lecture attendees were bold enough to ask so many questions. I wish people asked more questions during my lectures and talks.
@runggp 3 years ago
Awesome talk! The MSR audience asked quite a few questions, which were actually helpful, e.g., what are they, how do they work and update, why are they created and designed this way, etc.
@MobileComputing 4 years ago
While the audience questions were mildly irritating (to put it mildly) - bombarding the speaker during his intro with questions that could reasonably be expected to be answered eventually in a one-hour talk - why would the speaker give a talk on one of the most advanced neural network architectures to an audience without any machine learning background?
@ziranshuzhang6831 3 years ago
You are right. I was expecting to pick up the GNN concepts quickly, but the audience keeps asking irritating questions, so I constantly have to hit the skip button.
@ohmyumbrella 3 years ago
I agree. I do see the point of giving this lecture to an audience without previous exposure to ML, if the purpose is to attract them to the subject, but in that case there should have been another video of the same lecture without so many interruptions. It would take extra time and effort, but for people who are trying to learn GNNs effectively and have some knowledge of basic ML, these questions are very annoying and hinder the learning experience.
@robbat1209 2 months ago
The questions from the audience are absolutely valid. If they bother you, there are plenty of other videos without an audience that you could watch.
@MobileComputing 2 months ago
@@robbat1209 This comment is absolutely valid. If it bothers you, there are plenty of other comments that you could read. This talk was from 4 years ago, when it was one of the only sources on GNNs - before GenAI video summaries allowed audiences like myself to comfortably skip ahead without fear of missing important information.
@robbat1209 2 months ago
@@MobileComputing 🤣🤣 True, my bad. The questions weren't too horrible, though.
@alphaO27 3 years ago
At 16:51, I think he meant "for each node connecting to n" (instead of n_j), because from the expression we take all nodes n_j connected to n in order to calculate the new state h_t^n of node n.
@heejuneAhn 3 years ago
He explains using time progression, which caused some confusion for the audience and me.
@jonfe 8 months ago
I don't understand why a GRU is used. The input to the GRU is a (nodes x features) matrix, so where is the temporal dimension?
@bibiworm 3 years ago
Is there any way to get access to the slides? Great talk! Thanks.
@eljangoolak 3 years ago
Very good presentation, but it is very difficult to follow with all the interrupting questions.
@arjitarora8418 7 months ago
Great introduction!
@Exhora 1 year ago
29:35 About GCNs, he said you multiply the sum of the messages with your own state, but in the equation it is a sum. I didn't get which one is correct.
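For reference, in the standard GCN layer (Kipf and Welling) the node's own state enters the same sum as its neighbors' states, via an added self-loop; there is no separate multiplication by one's own state. A minimal numpy sketch of that textbook formulation (the toy graph and random weights are mine, and the slide's exact equation may differ):

```python
import numpy as np

rng = np.random.default_rng(1)

A = np.array([[0., 1., 0.],        # small undirected toy graph
              [1., 0., 1.],
              [0., 1., 0.]])
H = rng.standard_normal((3, 4))    # node states
W = rng.standard_normal((4, 4))    # layer weights

A_hat = A + np.eye(3)              # self-loop: the node's own state joins the SUM
d = A_hat.sum(axis=1)
A_norm = A_hat / np.sqrt(np.outer(d, d))   # symmetric normalization D^-1/2 A_hat D^-1/2

H_new = np.tanh(A_norm @ H @ W)    # H' = sigma(A_norm H W)
print(H_new.shape)                 # (3, 4)
```

The "multiply with your own state" phrasing likely refers to the self-loop term in this sum rather than a separate product.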
@instantpotenjoyer 4 years ago
Discussion of the actual topic starts at ~6:40.
@mansurZ01 4 years ago
35:40 I think the dimensionality of M should be (num_nodes x D), unless D == M. EDIT: from what follows, it should be M = HE, and D can be different from M.
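That reading checks out dimensionally. A shape-only sketch (the sizes and names are my guesses at the slide's notation, with H the node states and E the per-edge-type message transformation):

```python
import numpy as np

num_nodes, D, M = 5, 8, 3          # deliberately D != M
H = np.ones((num_nodes, D))        # node states: (num_nodes x D)
E = np.ones((D, M))                # message transformation: (D x M)

msgs = H @ E                       # messages to be sent: (num_nodes x M)
print(msgs.shape)                  # (5, 3)
```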
@sm_xiii 4 years ago
Where can we get the slide deck, please?
@losiu998 3 years ago
Great presentation, but I have to point out something: I have no idea why you would use Einstein notation instead of simple matrix multiplication. It raises unnecessary confusion and isn't related to GNNs.
@lidiias5976 3 months ago
36:53 What is M in the shape (num_nodes by M)?
@sunaxes 2 years ago
So a GNN is just message passing on a graph, or did I miss something? This has been around since way back, hasn't it?
@codewithyouml8994 3 years ago
The miss at 40:00 was right... I was also really confused: all the matrix operations seemed to be invalid unless swapped... lol, what kind of inverted conventions are these...
@aladdin1776 3 months ago
The talk is quite intriguing; however, the interruptions from the audience are quite disturbing.
@pielang5524 3 years ago
Let the speaker talk!
@klaimouad740 3 years ago
Can we propagate messages depending on the edge features? For example, if the distance from node n to m is greater than their distance to p, can we propagate the message to p first and then perform the propagation to the other node m?
@FsimulatorX 3 years ago
OMG! For a second I thought he looked like the CEO of Google, and I was wondering to myself: why would the CEO of Google give a presentation about neural networks AT MICROSOFT?!
@peter-holzer-dev 1 year ago
I am pretty sure this is a great talk, but unfortunately all the questions in between disturb the flow a lot (also because most of them are hard to understand acoustically).
@hamishhall5423 4 years ago
What is the dimension M for msg_to_be_sent and received_messages, etc.? I get that D is the dimension of the node representation, N the num_nodes, etc.
@pharofx5884 4 years ago
That cough frequency is suspiciously high. Distance thyself socially, sir.
@maloxi1472 4 years ago
Don't worry, it's from November 2019.
@jmplaza4947 4 years ago
@@maloxi1472 COVID was already spreading then, right? I hope he's ok... wherever he is now...
@danielliu9616 4 years ago
Maybe he's THE patient zero.
@yanzexu6912 4 years ago
Bless him.
@stackexchange7353 3 years ago
@@danielliu9616 Oh shii
@arashjavanmard5911 3 years ago
It would be great if you could also publish the slides!
@mayankgolhar8761 3 years ago
Slides from the presenter's website: miltos.allamanis.com/files/slides/2020gnn.pdf
@arashjavanmard5911 3 years ago
@@mayankgolhar8761 Thanks
@vritansh14 3 months ago
Amazing talk!
@mohammedamraoui4147 7 months ago
And for edge classification?
@BapiKAR 4 years ago
Could you please post the ppt here? Thanks
@leventesipos1456 1 month ago
The notation is incomplete or incorrect in so many places in the presentation that it was hard to follow.
@soniagupta2851 2 years ago
Excellent explanation
@halilibrahimakgun7569 1 year ago
Can you share the slides, please? I like them.
@minghan111 4 years ago
Too many questions; just wait for the speaker, please.
@pharofx5884 4 years ago
In formulating their questions, they re-explained what was going on an order of magnitude better than the speaker. That's kinda sad.
@MobileComputing 4 years ago
@@pharofx5884 I just watched the version from 2 years ago. It's only 18 minutes long and almost identical in content, yet much clearer. Really sad to see.
@sebamurgui 4 years ago
OH MY GOD that audience
@miketurner3461 3 years ago
I had to stop watching because of them
@kellybrower301 3 years ago
I wanted to believe it wasn't true.
@NewtonInDaHouseYo 2 years ago
Excellent introduction, thanks a lot!
@samuisuman 4 years ago
There is no intuitive explanation, but it is quite informative.
@r.dselectronics3349 4 years ago
I am a researcher. The video contains beautiful concepts, which I like very much :) especially the binary classification part. I am so excited about these concepts!
@arisioz 2 years ago
Are those actual MS employees in the crowd? They're worse than first-year CS students.
@pielang5524 3 years ago
At 35:51, the adjacency matrix A should also depend on the edge type k, imo.
@pielang5524 3 years ago
OK... the presenter confirmed this shortly after.
@patrickadjei9676 3 years ago
People are not happy with the many questions. However, I'm kinda sad that he doesn't restate the questions before answering :( like, why?
@giannismanousaridis4010 3 years ago
I found the slides for everyone who asked: miltos.allamanis.com/files/slides/2020gnn.pdf (Idk if I'm supposed to or allowed to post the link here; if not, sorry for that, I'll delete my comment. Just let me know.)
@warpdrive9229 3 months ago
You can let it remain here :)
@blanamaxima 3 years ago
Is this related to bread baking?
@sherazbaloch1642 2 years ago
Need more tutorials on GNNs
@yb801 3 years ago
Is this a 2016 talk?
@goblue1011 2 years ago
Honestly, some of the audience members who raised questions had quite big egos and no idea what they were talking about.
@rocking4joy 3 years ago
I don't understand the praise in the comment section. I actually found it kind of sloppy, with typos, but the audience and the questions are really great.
@hfkssadfrew 3 years ago
34:46 It is NOT A * N, it is N' * A...
4 years ago
Where are the inputs and outputs?
@miketurner3461 3 years ago
Clearly you would've been one of the people asking foolish questions they could have answered using Google.
@CodingwithIndy 4 years ago
Is that SPJ I hear in the audience at 9:18?
@RARa12812 3 years ago
How to turn off the questions?
@losiu998 3 years ago
34:22 Is it a vector-matrix multiplication? If so, the result is wrong, I guess. Edit: matrix A should have ones below the diagonal, not above - then the result is as presented.
@2000sunnybunny 4 years ago
Great session!
@lifelemons8176 3 years ago
Horror crowd. This is something I see in every Microsoft talk.
@khwajawisal1220 4 years ago
That's why software engineers should not teach: you assume everything is a design/modeling detail when in reality it is part of the mathematics behind it. And I seriously miss those old days when professors taught with chalk and a board.
@BruceChen27 3 years ago
Seems several people were not healthy
@ZephyrineFreiberg 3 years ago
It should be (A^T)*N
@WeeeAffandi 1 year ago
You can tell this lecture was recorded during peak COVID from the constant coughing of the audience (and the speaker).
@aichemozenerator8446 2 years ago
good
@friendlywavingrobot 4 years ago
I hear Simon Peyton Jones in the audience
@rherrmann 3 years ago
Easily recognizable indeed!
@sirisaksirisak6981 3 years ago
At least it saves time in doing strategy.
@JK-sy4ym 4 months ago
Unfortunately, many good researchers can't present their work well.
@barriesosinsky9566 3 years ago
An aromatic ring is not a "single bond" next to a "double bond." The bonds are a resonance form in a single state. Treating them with graph theory is not supported by current models.
@guest1754 1 year ago
Wondering how many people had COVID in that recording...
@kushalneo 3 years ago
👍
@eaglesofmai 3 years ago
Are GNNs patented? Does anyone know if using a particular ANN construct can be subject to litigation?
@carloderamo 1 year ago
Some good person should take this video and remove all the awful questions from the audience.
@jebinjames6317 3 years ago
People need to remember they're watching a free video on YouTube... it's not your advanced ML private tutoring session...
@dalahmah 3 months ago
I found the lecture's atmosphere dull and depressing. It seems the lecturer was forced to give the lecture!
@kutilkol 3 years ago
RIP ears. WTF is with the coughing? At least use some compression on the vocal audio, omg.
@yanyipu4029 3 years ago
Great video but an annoying audience
@yiweijiang 3 years ago
So that's machine learning! Haha, lol
@thevanisher4609 8 months ago
Horrible audience, great talk!
@aglaiawong8058 3 years ago
The interruptions are so annoying...
@KeshavDial 2 years ago
The audience ruined this presentation. I have never felt worse for a presenter.
@iSpades0 3 years ago
That audience was pretty annoying, tbh.
@МихаилМихайловичГенкин 3 years ago
Kind of confusing for me. And the audience is very annoying.
@MrArmas555 4 years ago
++
@miketurner3461 3 years ago
The audience needs to take a freaking ML 101 class before asking stupid questions.
@williamashbee 3 years ago
The coughing totally ruined the presentation.
@robodoc1446 3 years ago
Appalling talk! It shows why coders are terrible at public speaking and often fail to explain things transparently. Before explaining how message passing is done in an end-to-end learning architecture, he jumped to Gated GNNs, leaving the impression that a GRU may be an essential part of a GNN. This is one of the reasons he got so many questions and there was so much confusion surrounding his lecture... What is h_t? "Well, this is not something that changes"... Seriously, Microsoft!
@Tyomas1 3 years ago
Awful introduction
@TheDavidlloydjones 2 years ago
Word salad: a hopeless mess of talking at and around a topic without actually touching it. Take it down and tell the guy to try again.
@saulrojas2679 2 years ago
Slides can be found at: miltos.allamanis.com/files/slides/2020gnn.pdf