What an amazing series on few-shot learning! I've seen some papers, such as FUNIT, that discuss few-shot image generation, but this is the first time I've looked into few-shot image classification. I honestly believe there is no better explanation than this series. I've just subscribed and look forward to seeing more series like this. Many thanks!
@MicheleMaestrini 6 months ago
Thank you so much for this series of lectures and slides. I am doing a thesis on few-shot learning and this has really helped me understand the fundamentals of this algorithm.
@hjiang3456 3 years ago
For sure the best intro to few-shot learning one can find on YouTube. Thank you for the great content. Hope to see more content like this in English.
@adityasrivastava8903 3 years ago
I seriously want to thank you for making videos on few-shot learning. Please try making some videos on few-shot generative modelling if possible.
@marooncabbagemaroon6532 3 years ago
One of the clearest few-shot learning lessons!
@impulse1712 3 months ago
What a detailed explanation. I loved the way you explain things, thank you very much, sir.
@norman9174 1 year ago
I am from India. Such an amazing lecture; it blows my mind how everything works. Now I understand it isn't anything terribly advanced... it's just a bunch of vectors we have to deal with.
@jcrobin1991 3 years ago
Thank you for the amazing lecture series! I've learnt so much from them. Just one suggestion: it would be very nice if you could point us to a simple code example that applies the fine-tuning techniques mentioned in the lecture, which we could play around with to better understand those concepts. Thanks again!
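For anyone else looking for a starting point, here is a minimal PyTorch sketch of the general fine-tuning idea (not the lecturer's own code; the backbone, shapes, and data below are placeholders): freeze a pretrained feature extractor, put a small classifier head on top, and take a few gradient steps on the support set.

```python
# Minimal few-shot fine-tuning sketch (illustrative only, assumes torchvision).
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision import models

backbone = models.resnet18(weights="IMAGENET1K_V1")
backbone.fc = nn.Identity()              # keep the 512-d feature vector
backbone.eval()
for p in backbone.parameters():          # freeze the pretrained CNN
    p.requires_grad = False

classifier = nn.Linear(512, 5)           # new head for the 5 novel classes

support_x = torch.randn(25, 3, 224, 224)           # placeholder 5-way 5-shot images
support_y = torch.arange(5).repeat_interleave(5)   # labels 0..4, five each

optimizer = torch.optim.Adam(classifier.parameters(), lr=1e-3)
for step in range(100):                  # a few gradient steps on the support set
    with torch.no_grad():
        feats = backbone(support_x)      # (25, 512) embeddings
    loss = F.cross_entropy(classifier(feats), support_y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# Classify a query image with the fine-tuned head.
query_x = torch.randn(1, 3, 224, 224)
probs = F.softmax(classifier(backbone(query_x)), dim=1)
```

Unfreezing the backbone, or initializing the head from the class-mean embeddings if that trick from the lecture applies to your setup, are natural variations to experiment with.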
@dnnsdd5418 3 years ago
Hi! Thank you for a great video, very interesting topic! I have some questions. When you say that we could train the "pretrained CNN" using the Siamese network, what is meant by that? Isn't the Siamese network made for embedding and then comparing? Why would we need to pretrain another CNN this way, when the SNN already does that? Or is the "pretrained CNN" another name for the two "twin" CNN models used in the SNN? Thanks in advance!
@gabiryoussef9194 3 years ago
Many thanks for your great work. Please extend this to other machine learning related topics.
@banalasaritha570 1 year ago
Amazing explanation. Bow to you 👏
@alphonseinbaraj7602 3 years ago
Really wonderful and great explanation. I used to study from books to learn, but here all the references are available. Will you please share some tips and steps for transfer learning techniques, and a few real-world projects as well? That's my request. Thanks!
@RyanMcCoppin 1 year ago
Dude, you are a boss teacher. Thanks for sharing.
@fionamukimba1956 3 years ago
This is a great lecture, prof. Please do a review lecture on Siamese networks, trends, and areas of application. I hope my request and others in the comments will be attended to.
@chanramouliseshadri512 2 years ago
Thank you Shusen. Great explanation
@stewartmuchuchuti20 9 months ago
Awesome. Well explained. Well simplified.
@sheikhshafayat6984 2 years ago
This video is really, really good. I wish you would make more like it!
@jasonwang9990 2 years ago
Amazing tutorials! Absolutely great job!
@t.pranav2834 2 years ago
Great explanation. Thanks for this series.
@stracci_5698 3 months ago
Aren't the Siamese networks also performing a kind of fine-tuning, since the model weights are learned to perform the task?
@8eck 2 years ago
What if there are 10,000 classes and we need to predict which class a query belongs to? Would we have to build a matrix over all 10,000 classes the same way you have shown in your video?
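One way to picture it (a toy NumPy sketch with made-up numbers, not from the lecture): you keep a single matrix of per-class mean embeddings, computed once from the support images, and scoring a query is then one matrix-vector product, which scales fine to 10,000 classes.

```python
# Toy nearest-prototype lookup over many classes (NumPy, placeholder data).
import numpy as np

num_classes, dim = 10_000, 512
rng = np.random.default_rng(0)

# M[j] = L2-normalized mean embedding of class j's support images.
M = rng.normal(size=(num_classes, dim))
M /= np.linalg.norm(M, axis=1, keepdims=True)

query = rng.normal(size=dim)              # embedding of the query image
query /= np.linalg.norm(query)

scores = M @ query                        # cosine similarity to every class at once
predicted_class = int(np.argmax(scores))  # nearest-prototype prediction
```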
@AbhishekSinghSambyal 3 months ago
Which app do you use to make presentations? How do you hide some images/arrows in the slides like an animation? Thanks.
@impactguide 2 years ago
Thanks for this great lecture! I really like having a "real world" example before going into the mathematical details. I was wondering something, though... Is taking the mean of the support-set vectors always justified or a good idea, or is there some possible generalization? As an example, for some bird species the males are brightly coloured while the females are more plainly coloured. If I have both a female and a male bird in my support set, the average of these two might be a "strange" vector which is relatively distant from either a male or a female bird. I suspect the discussed method would probably still work as far as classifying birds goes, and of course you could work around this by having a female_bird and a male_bird class in the support set and adding the p_j values up. But on the other hand, they are still examples of the same "thing", i.e. a bird; it's just that the "thing" in question has several forms. Is there some smart way of approaching such a problem?
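One way to experiment with this (a NumPy sketch with made-up data, not from the lecture): compare the single mean prototype against keeping one prototype per sub-group, e.g. male and female plumage, and scoring the query against whichever sub-prototype matches best.

```python
# Single-prototype vs. multi-prototype scoring for one class (illustrative data).
import numpy as np

def normalize(v):
    return v / np.linalg.norm(v, axis=-1, keepdims=True)

rng = np.random.default_rng(0)
male_bird   = rng.normal(size=(3, 64))    # 3 support embeddings, 64-d each
female_bird = rng.normal(size=(3, 64))
query       = normalize(rng.normal(size=64))

# Option 1: one prototype = mean of all support embeddings for "bird".
bird_proto = normalize(np.concatenate([male_bird, female_bird]).mean(axis=0))
score_mean = bird_proto @ query

# Option 2: one prototype per sub-group; the class score is the best match.
protos = normalize(np.stack([male_bird.mean(axis=0), female_bird.mean(axis=0)]))
score_multi = (protos @ query).max()
```

Keeping several sub-prototypes per class is essentially the female_bird/male_bird workaround described above, just without having to treat the sub-groups as separate output classes.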
@JuanMiguelMoralesOliva 3 years ago
This excellent and didactic explanation has been really useful to me, thank you very much!!
@lchunleo 2 years ago
Can few-shot classification replace supervised classification even when data is available?
Softmax associates during learning and identifies during inference.
@williamberriosrojas595 3 years ago
Great videos!! Thanks a lot :)
@woddenhorse 3 years ago
Amazing Playlist 🔥🔥
@amoldumrewal 3 years ago
Hey, really nice series. Great work, man! I have one question: since we are now using cosine similarity, the range of inputs to the softmax is [-1, 1]. This might limit the maximum probability for the correct class, as it can only go up to ~89% in the best-case scenario. Are you aware of a way to make the probability for a 100%-sure prediction be close to ~1.0?
@EranM 2 years ago
Take the absolute value of the cosine similarity.
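Another common workaround (a general practice, not something taken from the lecture): multiply the cosine similarities by a scale, or temperature, before the softmax, so that a similarity of 1.0 can map to a probability arbitrarily close to 1.

```python
# Effect of scaling cosine similarities before the softmax (NumPy).
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

cos_sims = np.array([1.0, -1.0])     # best-case similarities in a 2-way task
print(softmax(cos_sims))             # ~[0.88, 0.12] without scaling
print(softmax(10.0 * cos_sims))      # ~[1.00, 0.00] with scale s = 10
```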
@felixschmid7849 3 years ago
Great explanations! Thanks!
@MrSupermonkeyman34 2 years ago
Does anyone know what the difference in accuracy is if you train the network as a Siamese network compared to training it in the standard way?
@zhalehmanbari6172 1 year ago
Fantastic 🌸
@santanubanerjee5479 1 month ago
What does it mean when the gradient propagates back to the CNN as well? What is changed in the CNN?
@santanubanerjee5479 1 month ago
I think I need to take another look at the CNN parameters!
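A tiny PyTorch illustration (a hypothetical toy network, not the lecture's code) of what "the gradient propagates back to the CNN" means: if the backbone is not frozen, backpropagation produces gradients for its convolution filters too, so an optimizer step changes the feature extractor itself, not just the classifier head.

```python
# Showing that an unfrozen CNN receives gradients during fine-tuning (toy example).
import torch
import torch.nn as nn
import torch.nn.functional as F

cnn = nn.Sequential(nn.Conv2d(3, 8, 3), nn.ReLU(),
                    nn.AdaptiveAvgPool2d(1), nn.Flatten())
head = nn.Linear(8, 5)

x = torch.randn(4, 3, 32, 32)        # 4 placeholder support images
y = torch.tensor([0, 1, 2, 3])       # their labels

loss = F.cross_entropy(head(cnn(x)), y)
loss.backward()

print(cnn[0].weight.grad is not None)   # True: the conv filters get gradients
# An optimizer over cnn.parameters() would then update those filters,
# i.e. the feature extractor itself adapts; freezing it keeps them fixed.
```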
@sonninh8987 3 years ago
Great explanation
@nacho7953 2 years ago
do you have any code example?
@serviofernandolimareina5365 2 years ago
Excellent!
@himalayasinghsheoran1255 3 years ago
Great explanation.
@8eck 2 years ago
Can we keep 1,000 classes as mean vectors of their images? So 1,000 mean vectors for 1,000 classes, with 1,000 images per class.
@EranM 2 years ago
Yes, you can.
@feidu11 2 years ago
Many thanks
@Amir-tg9nf 2 years ago
Thanks a lot
@EranM 2 years ago
Did anyone in here implement this?
@EranM 2 years ago
Sorry, but the fine-tuning approach just degraded my model's accuracy.