CampusxBoost(consistency=True, hard_work=100%, quality=outstanding, future_ML=bright) Salute to you sir ❤
@abhasmalguri2905 1 year ago
Literally watching your 100th video in this series! Your content is really informative and addictive! The journey of learning from Day 1 to now has been full of excitement. Thank you for this wonderful contribution to the Data Science community!
@sandeepthukral3018 9 months ago
Hi bro, is this playlist enough to learn complete ML? Since I am also watching it from day 1.
@lakshaybhati1234 6 months ago
finally 100 videos completed
@RahulRaj-nw6rr 2 months ago
There is a slight logical mistake in sir's method of creating a new dataset after upsampling. The correct way is to reset the index of the new dataset after creating it by upsampling: if we do not reset the index, the next dataset will not contain the desired rows. You can check it too. So, each time after creating the new dataset, you need to run this code before processing it for the next decision stump: new_df = new_df.reset_index(drop=True)
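A minimal sketch of the fix this comment describes, on an illustrative toy dataframe (the column names and weight values here are made up for demonstration, not taken from the actual notebook):

```python
import pandas as pd

# Toy dataset with updated AdaBoost-style sample weights (illustrative values)
df = pd.DataFrame({
    "x": [1.0, 2.0, 3.0, 4.0, 5.0],
    "y": [1, 1, -1, -1, 1],
    "weight": [0.1, 0.1, 0.4, 0.3, 0.1],
})

# Upsample: draw rows with replacement, in proportion to their weights
new_df = df.sample(n=len(df), replace=True, weights=df["weight"], random_state=42)

# Without this, new_df keeps the duplicated index labels from df, so any
# later reference to "row 0", "row 2", etc. silently points at the wrong rows
new_df = new_df.reset_index(drop=True)

print(new_df.index.tolist())  # a clean 0..4 range again
```

The key point is the last step: `sample(replace=True)` preserves the original index labels of the drawn rows, so duplicates appear; `reset_index(drop=True)` restores a clean positional index before the next stump is trained.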
@core4032 2 years ago
great sir, I really enjoyed watching this series.
@charanpoojary4804 3 months ago
Great🤩
@AltafAnsari-tf9nl 1 year ago
Hats off to you, because I think you are the only one who has shown the algorithm working in a notebook from scratch.
@narendraparmar1631 11 months ago
Nice explanation, thanks!
@anaykhator6475 1 year ago
Why do we need to up-sample the dataset if we have already updated the weights of the correctly and incorrectly predicted samples? The incorrect ones would automatically have a greater say in the subsequent models, right? Also, if we up-sample the dataset and create a new one in each iteration, a few rows of data (the correctly classified ones) will keep disappearing from the dataset, and the dataset will be left with only a handful of copies of the incorrectly classified samples. Wouldn't the diminishing representation of all the rows in each iteration keep the model from learning the deeper trends/patterns in the data?
@sultanaparvin72 9 months ago
I have the same question
@sumanthpichika5295 1 year ago
Superb intuition
@abhinavkhandelwal-w4w 9 months ago
Hi Nitish, thank you for your playlist on Machine Learning. It is helping me a lot as I am switching my career from Project Management to Data Science. In this video, in the 2nd df you said that row 0 was wrongly classified, but it was row 2, as you said at 12:54. Yet at 13:22 you said it was the 0th row that was wrongly classified. In the upsampling we didn't get the 2nd row; we got only row 0. Need your clarification on this. Hope you will respond.
@abhranilpal9804 10 months ago
brilliant tutorial
@sachin2725 2 years ago
Hello Sir, XGBoost is not included in the playlist; could you please make a video on XGBoost?
@meenatyagi9740 11 months ago
But how can we extend this code to a multiclass classification problem, since the sign function is used only for the binary case?
@pankajbhatt8315 2 years ago
Amazing video
@aditya_01 3 years ago
Thank you very much sir, really nice video.
@rajgurubhosale8680 27 days ago
I have a question: why didn't he create a video on AdaBoostRegressor?
@kindaeasy9797 7 months ago
wow , besttt !!!
@just__arif 2 years ago
Thank you sir! It helps a lot.
@murumathi4307 3 years ago
Hi sir... please help with one question, sir. I have 1 input to apply but my output max is 10... is it possible, sir?
@akzork 2 years ago
Why did you train dt3 on second_df instead of third_df?
@mustafamarvat863 2 years ago
Are we going to use the updated weights in the second model? Or do we have to use the same weights as model 1?
@IqraKhan-xh2cp 4 months ago
Done with the 100th video
@parthshukla1025 3 years ago
100th video OP
@aditiarora2128 2 years ago
Sir, can we use the same code for an image dataset?
@yashjain6372 2 years ago
best sir
@pradeeptamohanty2457 5 months ago
Sir, there is one doubt: if the error is more than 0.5, then alpha is negative, so it will start decreasing the weights of the wrong samples and increasing the weights of the right ones.
@LonerFactsClub 3 months ago
If the error of the selected weak learner is more than 0.5, then it's worse than random guessing! We should adjust the weak learner so that it performs well, maybe by increasing the depth by one more level. According to my current knowledge, this is what I came up with. What are your views on this? 🙂
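Both observations above can be checked numerically with the standard AdaBoost model-weight formula, alpha = 0.5 * ln((1 - error) / error). This is a quick sketch of that formula, not code from the video:

```python
import math

def model_weight(error):
    """Standard AdaBoost alpha for a weak learner with the given weighted error."""
    return 0.5 * math.log((1 - error) / error)

# error < 0.5: positive alpha, so misclassified samples get upweighted
assert model_weight(0.3) > 0

# error > 0.5: alpha turns negative, so the weight updates flip direction,
# exactly as the first comment above describes
assert model_weight(0.7) < 0

# Note: a learner with error e, with its predictions flipped, has error 1 - e,
# so a "worse than random" stump still carries usable signal
assert math.isclose(model_weight(0.7), -model_weight(0.3))
```

In other words, a negative alpha is not nonsense: it amounts to trusting the opposite of what that stump predicts, which is why a learner with error 0.7 is as informative as one with error 0.3.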
@kislaykrishna5599 3 years ago
Sir, please upload videos on a daily basis, as I am following your classes and some blogs to learn data science 🙏
@a1x45h 3 years ago
His video quality is top notch. It takes time, man. Other YouTubers pushing a new video every day are below par and spammy; their quality is garbage. Nitish spent 2 hours explaining the first gradient descent video, and he also said he prepared for 2 days for that video. Let this man take his time. We can wait a bit.
@rajeevranjan5007 3 years ago
@@a1x45h Agree
@ankitbiswas8380 2 years ago
He is a human... not a machine.
@raj-nq8ke 1 year ago
Gold.
@MrYatinchamp 2 years ago
Sir, the additional term 0.00001 that you added in the denominator will make the whole model-weight term very big mathematically, won't it?
@Shobhitchoudhary321 1 year ago
Nope
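A quick numerical check of this thread, assuming the small constant is added as alpha = 0.5 * ln((1 - error) / (error + eps)), which is a common way to avoid division by zero (the exact placement in the video may differ):

```python
import math

EPS = 1e-5  # the small constant in question

def model_weight(error, eps=EPS):
    # eps in the denominator guards against error == 0
    return 0.5 * math.log((1 - error) / (error + eps))

# For ordinary error values, eps barely changes alpha at all
exact = 0.5 * math.log((1 - 0.3) / 0.3)
assert abs(model_weight(0.3) - exact) < 1e-4

# eps only matters when error == 0: instead of dividing by zero,
# alpha becomes large but finite (0.5 * ln(1e5) ≈ 5.76)
alpha_perfect = model_weight(0.0)
assert 5 < alpha_perfect < 6
```

So the reply above is right for the usual case: the term only dominates when the error itself is (near) zero, where a large alpha for a perfect stump is the intended behaviour.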
@singh6315 2 years ago
While training the third model you took values from the second model, which is why you got too many error values in the 3rd training. Anyway, the video was great.
@akzork 2 years ago
Yeah, I got confused because of that. Can you please explain what the right procedure is here?
@yashjain6372 2 years ago
That's correct. Random data from the 2nd gets passed to the third (upsampling the error points and downsampling the correct points).
@yashjain6372 2 years ago
Okay, got it. Error in line 369. Anyway, awesome video. The best.
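For readers following this thread, here is a hedged sketch of one consistent way to chain the datasets (df → second_df → third_df), so that each stump is trained on the dataframe produced by the previous round. The variable names and the reset-to-uniform weights inside each round are assumptions based on the upsampling approach described in the comments, not the notebook's exact code:

```python
import numpy as np
import pandas as pd
from sklearn.tree import DecisionTreeClassifier

def next_dataset(df, feature_cols, target_col, seed):
    """Fit a decision stump on df, compute AdaBoost-style weight updates,
    and return the upsampled dataframe to train the next stump on."""
    stump = DecisionTreeClassifier(max_depth=1, random_state=0)
    stump.fit(df[feature_cols], df[target_col])
    pred = stump.predict(df[feature_cols])

    # Weights start uniform each round: the upsampled dataframe itself
    # already encodes the previous round's emphasis on hard samples
    w = np.full(len(df), 1 / len(df))
    misclassified = (pred != df[target_col].to_numpy())
    error = w[misclassified].sum()
    alpha = 0.5 * np.log((1 - error) / (error + 1e-5))

    # Upweight wrong rows, downweight right ones, then normalise
    w = np.where(misclassified, w * np.exp(alpha), w * np.exp(-alpha))
    w = w / w.sum()

    # Sample with replacement by weight, then reset the index (see the
    # reset_index discussion earlier in this thread)
    new_df = df.sample(n=len(df), replace=True, weights=w, random_state=seed)
    return new_df.reset_index(drop=True)
```

With this helper, the chaining the commenters expected would read `second_df = next_dataset(df, ...)` followed by `third_df = next_dataset(second_df, ...)`, i.e. dt3 is trained on the dataframe derived from second_df, never on second_df reused twice.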