I have been learning from YouTube for quite a few years and have never seen a teacher like you. Hats off.
@NabidAlam360 Жыл бұрын
He explains things so precisely! The good thing about his teaching is that he does not cut the video short just to finish the topic; instead, he explains each thing with patience! Hats off!
@shivampradhan6101 Жыл бұрын
For the first time in 2 years of trying, the attention mechanism is clear to me now. Thanks!
@Ocean_77848 Жыл бұрын
Million-dollar lecture, bro, grab it... clear concepts in Hindi... a better lecture than the IIT and NIT teachers.
@WIN_13067 ай бұрын
also IIITs
@vishutanwar Жыл бұрын
I am planning to start your course from 2 Jan 2023. I just thought to check when you uploaded the last video, and this one is just 10 days old. Nice! I hope to complete this course in 3-4 months.
@Sam-gl1md Жыл бұрын
The best explanation I have seen so far for the attention mechanism. Simple and easy to understand 👌👌
@utkarshtripathi9118 Жыл бұрын
Sir, you are making awesome videos with an excellent way of teaching.
@Sandesh.Deshmukh Жыл бұрын
I have tried multiple resources to learn this difficult concept, but the way you have explained it is God level! Thanks for your efforts, Nitish Sir ❤❤ Super excited for the upcoming videos.
@kalyanikadu8996 Жыл бұрын
I am writing to kindly request that you consider creating videos on topics such as BERT, DistilBERT, and Transformers as soon as possible, because placement activities are actively going on. Also, please guide us on how to use the Hugging Face interface for different NLP use cases. I understand that creating content requires time and effort, but I believe your expertise would greatly enhance our understanding of these highly important and crucial topics. Thank you in advance; eagerly waiting for your future content.
@WIN_13067 ай бұрын
Hi ma'am, how and where are you giving placement interviews? I am a junior and I am confused about how to apply for internships and related opportunities.
@haseebmohammed3728 Жыл бұрын
This was just amazing, sir. I have watched so many videos to understand the "C" value; no one gave a clear explanation except you. Looking forward to the next video from you.
@PratyakshGautam-nc4mi5 ай бұрын
Best teacher I have ever seen in my life. This legend made a guy fall in love with maths who was scared of it during school. A BIG BIG BIG thank you, Nitish sir. I have a dream to meet you once in my life before I die ♥
@spandanmaity83523 ай бұрын
The best explanation of the Attention mechanism so far. 💯
@Shani_spamsАй бұрын
Thank you, sir, for making students' lives so easy. I had been trying to understand this for an hour, but the way you explained it so smoothly, in a pictorial way, I will never forget it now.
@ranaasad6887 Жыл бұрын
Hats off... such a clear, step-by-step approach. You connect concepts amazingly well. Great teaching.
@koushik7604 Жыл бұрын
it's a gem, honestly.
@aritradutta953811 ай бұрын
What a fabulous explanation that was. Mind-blowing. Thanks a ton for explaining it so clearly.
@rijulsharma814818 күн бұрын
Thanks a lot for this lucid explanation!
@sakshipote20 Жыл бұрын
Hello, sir! Pls keep uploading this playlist. Eagerly waiting for the next video!!!
@mohammadarif8057 Жыл бұрын
God bless you... man, what a session... thanks a lot!!
@princekhunt18 күн бұрын
Respect to this man! What effort!
@SambitSatapathy-e4b Жыл бұрын
Thank you for such precise and clear explanation videos. 🙏🙏
@sowmyak33269 ай бұрын
I have no words to express how grateful I am. You really inspire me. I would suggest you sell some goodies (t-shirts) so that we can buy them and show our love for you.
@hetparekh1556 Жыл бұрын
Amazing explanation of the topic
@technicalhouse982011 ай бұрын
Best Teacher Ever
@pyclassy Жыл бұрын
Hello Nitish, your explanations are really, really awesome. I need to understand Transformers, so I am waiting for that.
@anmolpandey5692Ай бұрын
What an insanely good thing, man 😵💫 Perfect!!! 😍
@gulamkibria3769 Жыл бұрын
You are a great teacher... more videos like this, please.
@nomannosher892811 ай бұрын
I first like the video, then watch it because I know the quality of the lecture.
@mentalgaming273911 ай бұрын
Well sir, as always the video is so perfect, but I still think I need more practice on it. Thanks for giving the best version of each topic. Love from Pakistan, Sir Nitish. 😍
@vijaypalmanit Жыл бұрын
First like then watch, because quality content is guaranteed
@shalinigarg8522 Жыл бұрын
All your videos are exceptionally excellent and informative... Please, sir, also make videos on GNNs in full detail, covering their architecture and types... It's a humble request to you 🙏
@aadityaadyotshrivastava203011 ай бұрын
One suggestion: please do not say in your video that this topic is tough to understand. The point is that when you are teaching, it doesn't seem tough to understand, and secondly, by saying "the topic is tough", it psychologically impacts the learners... I have not found any tough video so far. Your explanation is super!
@IshiShiv3 ай бұрын
Great video, amazingly explained.... Thank you!!!
@riyatiwari4767 Жыл бұрын
You mentioned in the NLP playlist that once deep learning is covered, you will cover Topic Modelling & NER as well. Please cover both these topics to complete your NLP playlist.
@ranaasad6887 Жыл бұрын
super excited to learn transformers😍😍
@prathamagarwal85825 ай бұрын
Sir, you have done a great job, but after this type of explanation, please make a video in which the code is implemented... and make regular projects for each kind of mechanism for better understanding.
@itsamankumar403 Жыл бұрын
Thank you so much, Sir. Please also post the Transformer lecture.
@guptariya43 Жыл бұрын
Eagerly waiting for the self-attention and multi-head attention videos now...
@sheetalsharma43669 ай бұрын
Amazing explanation. Thank you very much sir.
@RashidAli-jh8zu Жыл бұрын
Thank you sir ❤
@sridharnomula383910 ай бұрын
Great. I feel it's an end-to-end explanation.
@rockykumarverma9802 ай бұрын
Thank you so much sir 🙏🙏🙏
@atrijpaul4009 Жыл бұрын
Much awaited
@not_amanullah Жыл бұрын
Thanks
@KumR Жыл бұрын
Great one. Waiting for Transformers. BTW, can you also speak about tools like LLaMA, LangChain, RAG, LLM tuning, DPO, etc. as well... coming from the guru, it will help the community.
@riyatiwari4767 Жыл бұрын
Thank you so much for this video. Please provide the Transformer videos at least by Christmas; I have an interview lined up. It's a humble request.
@ALLABOUTUFC-ms8nt Жыл бұрын
Love you Sir.
@financegyaninstockmarket296010 ай бұрын
What a video! 😀 Thoroughly enjoyed it.
@snehal77118 ай бұрын
insightful & helpful...thanks!
@shaktisd Жыл бұрын
Very good explanation, please make self attention video also
@shaktisd Жыл бұрын
@Shuraim843 Really? That's news. Do you mind explaining in detail your understanding of self-attention and how LLMs use it to produce SOTA results?
@Watchtower-h4u11 ай бұрын
Doubt: at 31:39, instead of an ANN, why are we not using a cosine similarity score, since you said alpha is a score value?
@shyamjain8379 Жыл бұрын
Looking forward to finishing this, bro. Nice work.
@ariousvinx Жыл бұрын
Thanks a lot for this amazing video! I always wait for the next video of this series ❤. Can you tell us when the Transformers videos will start?
@mrityunjaykumar28939 ай бұрын
Hi @campusX, it's the best and cleanest explanation available on YouTube so far. One confusion here: don't you think that to create c_i, f(s_{i-1}, h_j), where f is a neural network, will produce e_ij, and then all the e_ij (for i = 1, j = 1 to 4) will pass through a softmax to generate the weights alpha_ij?
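For anyone following this thread, the standard Bahdanau-style attention equations that this comment seems to be describing are roughly the following (a reference sketch, not necessarily the exact notation used in the video):

```latex
% Alignment score from a small feed-forward network a(.)
e_{ij} = a(s_{i-1}, h_j)
% Softmax over the encoder time steps j = 1..T_x
\alpha_{ij} = \frac{\exp(e_{ij})}{\sum_{k=1}^{T_x} \exp(e_{ik})}
% Context vector: attention-weighted sum of encoder states
c_i = \sum_{j=1}^{T_x} \alpha_{ij}\, h_j
```

Here a(.) is the small alignment network and T_x is the number of encoder time steps.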
@ParthivShah9 ай бұрын
Thank You Sir.
@bibhutibaibhavbora8770 Жыл бұрын
Thoroughly enjoyed it ❤
@Amanullah-wy3ur7 ай бұрын
this is helpful 🖤🤗
@muhammadikram375 Жыл бұрын
Sir, please make a video about your achievements and your job: your job itself, your monthly income (earned online by selling your skills, not from social media, i.e. YouTube, etc.), and so on.
@ChronoChaos303 Жыл бұрын
You are great, sir.
@eyeonmystery Жыл бұрын
Sir, kindly complete the playlist, please.
@ayushmantomar21687 ай бұрын
Sir, great teaching; it would be even better if you could show some practicals.
@Shisuiii696 ай бұрын
Such education for free... what more could you want in life? 🥳
@paritoshkumar74227 ай бұрын
Thank you, Sir.
@HimanshuSharma-we5li Жыл бұрын
Make some videos for those who want to start their AI/ML startups.
@NarendraBME Жыл бұрын
Hello sir, please cover the topics from GANs, diffusion models as well.
@himeshsarkar325910 ай бұрын
he has a video on GAN.
@keshavpoddar9288 Жыл бұрын
can't wait for the Transformers video
@luson_18Ай бұрын
Do we use the attention mechanism during training?
@utkarshtripathi9118 Жыл бұрын
Sir Please Make videos on Generative AI and New AI Tools based on Generative AI
@gourabguha316711 ай бұрын
Sir, please upload the next MLOps videos; eagerly waiting, Sir!!
@GauravKumar-dx5fp Жыл бұрын
Sir when will the next video in this series come...?
@akshayshelke5779 Жыл бұрын
Please upload the next video soon; we are waiting for Transformers and BERT.
@Manishkumar-iw1cy Жыл бұрын
Worth it
@kalyanikadu8996 Жыл бұрын
Sir Please add next videos.
@muhammadaafaq6912 ай бұрын
Nitish ji has delivered the endgame.
@not_amanullah Жыл бұрын
❤
@dattatreyanh612111 ай бұрын
Make a video on the PointNet architecture.
@HARIS-q3n3 ай бұрын
Sir, I have a doubt. During validation, what if the decoder outputs a sequence shorter than the true output sequence? How is the loss calculated in such cases? Will categorical cross-entropy work?
@ankitgupta180611 ай бұрын
Thank you
@majidhabibkhan62510 ай бұрын
Sir, I am waiting for part 2 of the Attention Mechanism video.
@not_amanullah Жыл бұрын
Understood++
@lakshmims759011 ай бұрын
Can you make a separate video on LLMs?
@abromioitis Жыл бұрын
Sir, please also bring the implementation video for the encoder-decoder.
@saikatdaw6909 Жыл бұрын
When can we expect the next video in the series....Transformers
@teksinghayer5469 Жыл бұрын
When will the video on the Transformer come?
@naveenpoliasetty954 Жыл бұрын
Sir waiting for Transformers lecture
@anupamgevariya341 Жыл бұрын
Hi sir, I have a doubt: we are feeding input to our decoder as y1 and c1, but inside the decoder there is an RNN cell, so how can we send two inputs? Are we doing some operation before feeding them in, like a dot product of y1 and c1, so that we get the dimensions back?
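For what it's worth, in typical Bahdanau-style decoders the two inputs are usually concatenated rather than multiplied before going into the RNN cell. A minimal PyTorch-style sketch with made-up dimensions, assuming a GRU cell (an illustration, not necessarily the video's exact implementation):

```python
import torch
import torch.nn as nn

emb_dim, enc_hidden, dec_hidden = 32, 64, 64

# The decoder cell receives [y_{t-1} ; c_t] as a single concatenated vector.
cell = nn.GRUCell(input_size=emb_dim + enc_hidden, hidden_size=dec_hidden)

y_prev = torch.randn(1, emb_dim)     # embedding of the previous output token (y1)
c_t = torch.randn(1, enc_hidden)     # context vector for this time step (c1)
s_prev = torch.randn(1, dec_hidden)  # previous decoder hidden state

rnn_input = torch.cat([y_prev, c_t], dim=-1)  # concatenation, not a dot product
s_t = cell(rnn_input, s_prev)                 # new decoder state
```

The concatenation keeps both pieces of information intact; the cell's input size is simply set to the sum of the two dimensions.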
@sagarbhagwani7193 Жыл бұрын
Thank you so much, sir ❤
@not_amanullah Жыл бұрын
I didn't find a better explanation, even in English.
@eyeonmystery Жыл бұрын
Next video please, sir.
@amarendrakushwaha2206 Жыл бұрын
Awesome
@subhadwipmanna4130 Жыл бұрын
Thanks sir, all clear, but only one doubt; anyone, please answer. For i = 2, is there one ANN or four ANNs? Let's say s1 = [1,2,3,4] and h1 = [5,6,7,8]; so, to get alpha21, will the input to the ANN be [1,2,3,4,5,6,7,8], or a combination of all the h vectors?
@neilansh Жыл бұрын
I think there will be a single ANN for a particular Ci, and in total there will be 4 ANNs (for C1, C2, C3 and C4). Regarding the calculation of alpha21: you will provide s1 and h1, and alpha21 will be calculated. Similarly, for calculating alpha22 you will provide s1 and h2, and so on... (I can't guarantee that this explanation is correct.)
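A small sketch of the common formulation, in which one alignment network is shared across all (i, j) pairs and applied once per encoder state, with a softmax over the resulting scores. This reflects the standard Bahdanau design and is only an assumption about what the video implements; all values other than s1 and h1 are made up:

```python
import torch
import torch.nn as nn

dec_hidden, enc_hidden = 4, 4

# One feed-forward alignment network, reused for every (s_{i-1}, h_j) pair.
align = nn.Sequential(nn.Linear(dec_hidden + enc_hidden, 8), nn.Tanh(), nn.Linear(8, 1))

s1 = torch.tensor([[1., 2., 3., 4.]])   # previous decoder state (from the comment's example)
H = torch.tensor([[5., 6., 7., 8.],     # h1 (from the comment's example)
                  [1., 0., 2., 1.],     # h2 (made-up)
                  [0., 3., 1., 2.],     # h3 (made-up)
                  [2., 2., 0., 1.]])    # h4 (made-up)

# Input to the network for alpha_2j is the concatenation [s1 ; h_j].
scores = torch.cat([align(torch.cat([s1, h.unsqueeze(0)], dim=-1)) for h in H])  # e_21..e_24
alphas = torch.softmax(scores.squeeze(-1), dim=0)   # alpha_21..alpha_24, summing to 1
c2 = (alphas.unsqueeze(-1) * H).sum(dim=0)          # context vector for decoder step i = 2
```

With this layout, step i = 2 uses four forward passes of the same network (one per h_j), rather than four separate networks.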
@gouravkatha91103 ай бұрын
wow
@WIN_13067 ай бұрын
I can't understand how this video has 23k views but only 926 likes! Like, wtf.
@campusx-official7 ай бұрын
Well then, you go ahead and do it!
@mallemoinagurudarpanyadav4937 Жыл бұрын
❤
@vatsalshingala322511 ай бұрын
❤❤❤❤❤❤❤❤❤❤
@eyeonmystery Жыл бұрын
Sir, next video?
@debojitmandal8670 Жыл бұрын
Where are you passing only h? Your c is nothing but the sum of all the h vectors multiplied by their respective weights. So, for example, if h4 is a vector, say [1,2,3,4], then your c is not h4; instead it's a different vector: c1 = w1*h1 + w2*h2 + w3*h3 + w4*h4, which will be a completely different vector, not [1,2,3,4].
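A quick numeric sketch of the weighted sum this comment describes, with made-up h vectors and alpha values (the weights just need to sum to 1):

```python
import numpy as np

h1, h2, h3, h4 = (np.array(v, dtype=float) for v in
                  ([5, 6, 7, 8], [1, 0, 2, 1], [0, 3, 1, 2], [1, 2, 3, 4]))
alphas = [0.1, 0.2, 0.3, 0.4]   # attention weights for one decoder step; they sum to 1

c1 = alphas[0]*h1 + alphas[1]*h2 + alphas[2]*h3 + alphas[3]*h4
print(c1)  # a new vector, different from any single h_j
```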