This playlist is like a time machine. I’ve watched you grow your hair from black to white, and I’ve seen the content quality continuously improve video by video. Great work!
@animatrix1631 • 5 months ago
I feel the same, but I guess he's not that old.
@zerotohero1002 • 4 months ago
Courage comes at a price ❤
@sachink9102 • 4 months ago
Thank you, Nitish Ji. Eagerly waiting to attend your Transformers sessions. Please complete this series.
@RamandeepSingh_04 • 4 months ago
Another student added to the waiting list, asking for the next video. Thank you, sir.
@muhammadsheraz177 • 5 months ago
Please finish this playlist as early as possible.
@ayushrathore2570 • 5 months ago
This whole playlist is the best thing I have discovered on YouTube! Thank you so much, sir.
@nvnurav1892 • 4 months ago
Sir, one small suggestion: run speech-to-speech translation on your videos to convert them to English and upload them on Udemy/YouTube. It will help a lot of people who don't know Hindi, and your hard work will get much more attention. 🙂🙂 We are really lucky to be getting such rich content for free. God bless you.
@yashshekhar538 • 5 months ago
Respected Sir, your playlist is the best. Kindly increase the frequency of videos.
@sahil5124 • 5 months ago
This is a really important topic. Thank you so much. Please cover everything about the Transformer architecture.
@sharangkulkarni1759 • 3 months ago
Fantastic! Loved it, the way the padding zeroes were brought into the fold. Loved it.
@akeshagarwal794 • 5 months ago
Congratulations on building a 200k family, you deserve even more reach 🎉❤ We love you, sir ❤
@ShivamSuri-lz5it • 3 months ago
Excellent deep learning playlist, highly recommended!!
@PrathameshKhade-j2e • 5 months ago
Sir, try to complete this playlist as early as possible. You are the best teacher, and we want to learn the deep learning concepts from you.
@SBhupendraAdhikari • 3 months ago
Thanks a lot, sir. Really enjoying learning Transformers.
@udaysharma138 • 2 months ago
Thanks a lot, Nitish sir, best explanation.
@praneethkrishna6782 • a month ago
@campusx Hi Nitish, thanks a lot for the elaborate explanation. But I have a query: are the '0' values representing the padding tokens really the reason (or the only reason) stopping us from using Batch Normalization? That could be handled internally by not considering '0's while calculating the mean and standard deviation of z across features. On the other hand, I think this technique (Batch Norm) clubs together the embeddings of different sentences while calculating z, which seems a little odd to me, and that may be the real reason it isn't used. Please correct me if I am wrong here.
@AKSHATSHAW-tf3ow • 14 days ago
Same doubt. I think both reasons contribute here.
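The doubt in this thread can be checked on a toy example. The sketch below (made-up numbers in plain Python, not from the video) compares the per-feature statistics batch norm would use, a masked variant that skips the padded position as suggested above, and the per-token statistics layer norm uses, which never mix different sentences:

```python
import statistics

# Toy batch: 2 sentences x 3 tokens x 2 features; the last token of
# sentence 2 is all-zero padding.
batch = [
    [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]],   # sentence 1
    [[7.0, 8.0], [9.0, 10.0], [0.0, 0.0]],  # sentence 2 (padded)
]
pad_mask = [[1, 1, 1], [1, 1, 0]]  # 1 = real token, 0 = padding

# Batch norm: statistics per FEATURE, pooled over every token of every
# sentence, so the padding zero drags the mean down.
feat0 = [tok[0] for sent in batch for tok in sent]
naive_mean = statistics.mean(feat0)

# The masked variant proposed in the comment: drop padded positions first.
feat0_real = [tok[0]
              for sent, mrow in zip(batch, pad_mask)
              for tok, m in zip(sent, mrow) if m]
masked_mean = statistics.mean(feat0_real)

# Layer norm: statistics per TOKEN, across its own features only.
# No mixing of different sentences, padded or not.
def layer_norm_stats(token):
    mu = statistics.mean(token)
    var = statistics.pvariance(token)
    return mu, var

print(naive_mean)                    # pulled down by the padding zero
print(masked_mean)                   # padding excluded
print(layer_norm_stats(batch[0][0])) # depends on one token only
```

So the masking fix is mechanically possible; the second objection, that batch norm pools statistics across unrelated sentences, remains either way.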
@vinaykumar-xh5pi • 4 months ago
Please release the next video, very curious to complete this... loved your content, as always.
@ryannflin1285 • a month ago
Brother, I literally can't understand how I'm understanding this so well. How can someone teach this well? Love you, sir (from IITJ).
@AidenDsouza-ii8rb • 4 months ago
Your DL playlist is like a thrilling TV series - can't wait for the next episode! Any chance we could get a season finale soon? Keep up the awesome work!
@just__arif • 2 months ago
Top-quality content!
@Xanchs4593 • 4 months ago
Can you please explain what the "add" in the Add & Norm layer is?
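For anyone with the same question: the "add" is the residual (skip) connection, where the sublayer's input is added element-wise to its output before layer normalization. A minimal pure-Python sketch with toy vectors (illustrative only, not the video's numbers):

```python
import math

def layer_norm(x, eps=1e-5):
    # Normalize one token vector across its own features.
    mu = sum(x) / len(x)
    var = sum((v - mu) ** 2 for v in x) / len(x)
    return [(v - mu) / math.sqrt(var + eps) for v in x]

def add_and_norm(x, sublayer_out):
    # "Add": residual connection; "Norm": layer normalization.
    added = [a + b for a, b in zip(x, sublayer_out)]
    return layer_norm(added)

x = [1.0, 2.0, 3.0]          # token vector entering the sublayer
attn_out = [0.5, -0.5, 1.0]  # pretend output of self-attention
print(add_and_norm(x, attn_out))
```

The residual path lets gradients flow around the sublayer, which is why every attention and feed-forward block in the Transformer is wrapped this way.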
@ai_pie1000 • 5 months ago
Congratulations, brother, on the 200k-member family... 👏👏👏
@GanitSikho-xo2yx • 4 months ago
Well, I am waiting for your next video. It's a gem of learning!
@Fazalenglish • 3 months ago
I really like the way you explain things ❤
@taseer12 • 5 months ago
Sir, I can't describe your efforts. Love from Pakistan.
@saurabhbadole821 • 5 months ago
I am glad that I found this channel! Can't thank you enough, Nitish sir! One more request: if you could create one-shot revision videos for machine learning, deep learning, and natural language processing (NLP). 🤌
@WIN_1306 • 4 months ago
At 46:10, why is it zero? Since beta is added, won't that prevent it from becoming zero?
@dilippokhrel4009 • a month ago
Initially gamma is set to 1 and beta to 0, so at the start the value will be zero. But during training the values may become something other than zero.
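This reply matches the usual convention: gamma starts at 1 and beta at 0, so the affine step is initially the identity and a normalized zero stays zero, while a trained beta can later move it away. A tiny sketch (the "trained" beta value is hypothetical):

```python
gamma, beta = 1.0, 0.0       # standard initialization: identity transform

z = 0.0                      # normalized value at a padding position
out_init = gamma * z + beta  # at initialization this is still zero
print(out_init)              # 0.0

beta = 0.3                   # hypothetical value after some training steps
out_trained = gamma * z + beta
print(out_trained)           # no longer zero once beta has moved
```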
@Schrödinger3 • 4 months ago
Please complete this playlist and add the Transformers tutorials as soon as possible.
@shibrajdeb5177 • 5 months ago
Sir, please upload videos regularly. These videos help me a lot.
@ghousepasha4172 • 4 months ago
Please upload videos regularly, sir; we wait a long time for them.
@Shisuiii69 • 2 months ago
Question: Sir, what if the beta (B¹) for the padding vector becomes something other than 0? Since it keeps getting updated, the padding vector will no longer be zero. Won't that affect the model?
@slaypop-b5n • 29 days ago
Bro, did you find the answer? I had the same doubt.
@Ishant875 • 21 days ago
Same doubt
@slaypop-b5n • 14 days ago
@@Ishant875 Any updates, bro? Did you get the answer?
@AmitBiswas-hd3js • 4 months ago
Please cover this entire Transformer architecture as soon as possible
@1111Shahad • 5 months ago
Thank you, Nitish. Waiting for your next upload.
@ESHAANMISHRA-pr7dh • 4 months ago
Respected sir, I request you to please complete the playlist. I am really thankful for the amazing videos in it. I have recommended this playlist to a lot of my friends, and they loved it too. Thanks for providing such content for free 🙏🙏
@gurvgupta5515 • 5 months ago
Thanks for this video, sir. Can you also make a video on Rotary Positional Embeddings (RoPE), which are used in Llama as well as other LLMs for enhanced attention?
@AkashSingh-oz7qx • a month ago
Please also cover generative and diffusion models.
@muhammadsheraz177 • 5 months ago
Sir, can you kindly tell us when this playlist will be completed?
@shreeyagupta5720 • 5 months ago
Congratulations on 200k, sir 👏 🎉🍺
@bmp-zz9pu • 4 months ago
Sir, please finish this playlist!!!!!!!!!
@mayyutyagi • 5 months ago
Amazing series, full of knowledge...
@krisharora2959 • 4 months ago
The next video is awaited more than anything.
@arunkrishna1036 • 4 months ago
Sir, what if the beta value is updated during the learning process? Then it will get added to the padded zeros, making them non-zero in the further iterations.
@Shisuiii69 • 2 months ago
Same confusion, did you find the answer?
@physicskiduniya8054 • 4 months ago
Bhaiya! Awaiting your upcoming course videos. Please try to complete this playlist ASAP, bhaiya.
@znyd. • 5 months ago
Congrats on the 200k subs, love from Bangladesh ❤.
@darkpheonix6592 • 4 months ago
Please upload the remaining videos quickly.
@anonymousman3014 • 4 months ago
Sir, is the Transformer architecture section complete? I want to cover it ASAP. I have covered the topics up to the attention mechanism and want to finish the rest in one go, so please tell us, sir, and please upload all the videos as soon as possible. I want to learn a lot. Thanks for the amazing course at zero cost. God bless you.
@WIN_1306 • 4 months ago
Sir, can you tell us roughly how many topics are left, and which ones?
@princekhunt1 • 4 months ago
Sir, please complete this series.
@SulemanZeb. • 5 months ago
Please start the MLOps playlist; we are desperately waiting for it.
@SaurabhKumar-t4s • 3 months ago
At 46:04, if sigma4 is 0, then how do we divide by this value?
@Shisuiii69 • 2 months ago
I had the same confusion. I asked ChatGPT, and it said a small value very close to zero is added to the denominator, which is why we just write zero after normalization.
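That is indeed standard practice: implementations add a small epsilon inside the square root so the division is defined even when the variance is exactly zero, as it is for an all-zero padding row. A quick sketch (the eps value is a common default; the vector is a toy example):

```python
import math

def normalize(x, eps=1e-5):
    # Layer-norm style normalization with the usual epsilon guard.
    mu = sum(x) / len(x)
    var = sum((v - mu) ** 2 for v in x) / len(x)
    return [(v - mu) / math.sqrt(var + eps) for v in x]

pad_row = [0.0, 0.0, 0.0, 0.0]  # all-zero padding vector: variance is 0
print(normalize(pad_row))        # stays all zeros, no ZeroDivisionError
```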
@ishika7585 • 5 months ago
Kindly make a video on regex as well.
@WIN_1306 • 4 months ago
What is regex?
@himansuranjanmallick16 • 3 months ago
Thank you, sir................
@UCS_B_DebopamDey • a day ago
Thank you, sir.
@rajnishadhikari9280 • 5 months ago
Thanks for this amazing series.
@peace-it4rg • 4 months ago
Sir, my doubt was: if I use batch norm in the Transformer architecture, every value in the matrix has its own scale and bias factor, so because of the bias the zeros would go away anyway; then why layer norm? Since we compute ((x-u)/var)*lambda + bias, the bias by itself won't let the value stay zero. Please help, sir.
@RamandeepSingh_04 • 4 months ago
Still, it will be a very small number; it will affect the result and not represent the true picture of the feature in batch normalization.
@WIN_1306 • 4 months ago
@@RamandeepSingh_04 Compared to the tokens without padding it will be small, but sir still wrote zero, and it won't be exactly zero.
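The point being debated here can be checked numerically: even before gamma and beta enter, pooling a feature column across the batch standardizes the padded zero to a clearly nonzero value, and the zero also distorts the mean and variance seen by the real tokens. A toy sketch (made-up numbers):

```python
import math
import statistics

# One feature column pooled across a batch; the last entry is a padding zero.
feature_col = [1.0, 3.0, 5.0, 7.0, 9.0, 0.0]

mu = statistics.mean(feature_col)                 # dragged down by the zero
sigma = math.sqrt(statistics.pvariance(feature_col))

z_pad = (0.0 - mu) / sigma  # the padding position after standardization
print(round(z_pad, 4))      # clearly nonzero under batch-norm statistics
```

So under batch norm the padding position would not remain zero at all, which is part of why the video writes the layer-norm output of a padding row as zero instead.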
@technicalhouse9820 • 5 months ago
Sir, love you so much from Pakistan.
@intcvn • 4 months ago
Please complete it quickly, sir, waiting eagerly.
@rb4754 • 5 months ago
Congratulations on 200k subscribers!!!!!!!!!!!!!!!!!!
@sagarbhagwani7193 • 5 months ago
Thanks, sir; please complete this playlist ASAP.
@hassan_sid • 5 months ago
It would be great if you made a video on RoPE.
@WIN_1306 • 4 months ago
I am the 300th person to like this video. Sir, please upload the next videos, we are eagerly waiting.
@29_chothaniharsh62 • 5 months ago
Sir, can you please continue the 100 ML interview questions playlist?
@shubharuidas2624 • 5 months ago
Please also continue with the Vision Transformer.
@advaitdanade7538 • 5 months ago
Sir, please finish this playlist fast; placement season is nearby 😢
@arpitpathak7276 • 5 months ago
Thank you, sir. I was waiting for this video ❤
@SuperRia33 • 11 days ago
Thanks
@virajkaralay8844 • 5 months ago
Absolute banger video again. Appreciate the effort you're putting into Transformers. Can't wait for you to explain the entire Transformer architecture.
@virajkaralay8844 • 5 months ago
Also, congratulations on 200k subscribers. May you reach many more milestones.
@vinayakbhat9530 • 2 months ago
Excellent.
@not_amanullah • 5 months ago
This is helpful 🖤
@oden4013 • 4 months ago
Sir, please upload the next video; it's been almost a month.
@rose9466 • 5 months ago
Can you give an estimate of when this playlist will be completed?
@khatiwadaAnish • 3 months ago
Awesome 👍👍
@manojprasad6781 • 5 months ago
Waiting for the next video 💌
@not_amanullah • 5 months ago
Thanks ❤
@barryallen5243 • 5 months ago
Just ignoring the padded rows while performing batch normalization should also work, so I feel the padded zeros are not the only reason we use layer normalization instead of batch normalization.
@WIN_1306 • 4 months ago
How would you ignore the padded columns in batch normalisation?
@SANJAYTYAGI-bk6tx • 5 months ago
Sir, in batch normalization, in your example we have three means and three variances, along with the same number of betas and gammas, i.e. 3. But in layer normalization we have eight means and eight variances, yet still 3 betas and 3 gammas. That means the number of betas and gammas is the same in both batch and layer normalization. Is that correct? Please elaborate.
@campusx-official • 5 months ago
Yes
@WIN_1306 • 4 months ago
The mean and variance are used for normalization; beta and gamma are used for scaling and shifting.
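The shape bookkeeping in this exchange, written out explicitly under the example's assumed sizes (8 token positions, 3 features): batch norm keeps one mean/variance/gamma/beta per feature, while layer norm computes one mean/variance per token but still learns its gamma/beta per feature:

```python
num_tokens, num_features = 8, 3  # sizes assumed from the example above

# Batch norm: everything is per feature.
bn_stats  = {"means": num_features, "vars": num_features}
bn_params = {"gammas": num_features, "betas": num_features}

# Layer norm: statistics per token, learnable affine still per feature.
ln_stats  = {"means": num_tokens, "vars": num_tokens}
ln_params = {"gammas": num_features, "betas": num_features}

print(bn_stats, bn_params)  # 3 of each
print(ln_stats, ln_params)  # 8 means/vars, but still 3 gammas/betas
```

This is why the learnable parameter count matches across the two schemes even though the number of computed statistics differs.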
@teksinghayer5469 • 5 months ago
When will you code a Transformer from scratch in PyTorch?
@adarshsagar9817 • 5 months ago
Sir, please complete the NLP playlist.
@WIN_1306 • 4 months ago
Which one? How many videos does it have?
@vikassengupta8427 • 4 months ago
Sir, next video ❤❤
@gauravbhasin2625 • 5 months ago
Nitish, please relook at your covariate shift fundamentals... yes, you are partially correct, but the way you explained covariate shift is actually incorrect. (Example: imagine training a model to predict whether someone will buy a house based on features like income and credit score. If the model is trained on data from a specific city with a certain average income level, it might not perform well in a different city with a much higher average income. The distribution of "income" (a covariate) has shifted, and the model's understanding of its relationship to house buying needs to be adjusted.)
@WIN_1306 • 4 months ago
I guess the explanation sir gave and your explanation are the same, just with different examples of covariate shift.
@dharmendra_397 • 5 months ago
Very nice video.
@zerotohero1002 • 4 months ago
It has been a month, sir. Please upload; eagerly waiting 🥺🥺🥺
@harshgupta-w5y • 5 months ago
Please post the next video soon, sir.
@MrSat001 • 5 months ago
Great 👍
@turugasairam2886 • a month ago
Sir, why don't you translate it to English and upload it on a new channel, something like CampusX English? I am sure it would attract more audience and reach. I am sure you have already thought of this.
@titaniumgopal • 4 months ago
Sir, please update the PDF.
@aksholic2797 • 5 months ago
200k 🎉
@bmp-zz9pu • 5 months ago
A video after 2 weeks in this playlist... don't torture us like this; please work a little faster, sir.
@ghousepasha4172 • 4 months ago
Sir, please complete the playlist; I will pay 5000 for that.
@faizack • 3 months ago
😂😂😂🎉
@space_ace7710 • 5 months ago
Yeah!!
@not_amanullah • 4 months ago
🖤🤗
@DarkShadow00972 • 5 months ago
Bring some coding examples, bro.
@ashutoshpatidar3288 • 5 months ago
Please be a little faster!
@Amanullah-wy3ur • 4 months ago
This is helpful 🖤
@Amanullah-wy3ur • 4 months ago
Thanks ❤