AdaBoost Algorithm | Code from Scratch

29,069 views

CampusX

1 day ago

Comments: 42
@Shisuiii69 1 year ago
CampusxBoost(consistency=True, hard_work=100%, quality=outstanding, future_ML=bright). Salute to you sir ❤
@abhasmalguri2905 1 year ago
Literally watching your 100th video in this series! Your content is really informative and addictive! The journey of learning from Day 1 to now has been full of excitement. Thank you for this wonderful contribution to the Data Science community!
@sandeepthukral3018 9 months ago
Hi bro, is this playlist enough to learn ML completely? Since I am also watching it from day 1.
@lakshaybhati1234 6 months ago
Finally, 100 videos completed!
@RahulRaj-nw6rr 2 months ago
There is a slight logical mistake in sir's method of creating a new dataset after upsampling. The correct way is to reset the index of the new dataset after creating it by upsampling, because if we do not reset the index, the next dataset will not contain the desired rows. You can check it too. So, each time after creating the new dataset, you need to run this line before processing it for the next decision stump: new_df = new_df.reset_index(drop=True)
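For readers following along, here is a minimal sketch of that fix in one place (the 'weight' column name is an assumption, not necessarily what the video's notebook uses; use whichever column holds the updated sample weights):

import pandas as pd

# One AdaBoost resampling step: draw rows with probability proportional
# to the updated sample weights, so misclassified rows get duplicated.
def resample(df: pd.DataFrame) -> pd.DataFrame:
    new_df = df.sample(n=len(df), replace=True, weights=df['weight'])
    # Without this reset, the duplicated index labels from upsampling make
    # later index-based lookups pull in the wrong rows.
    return new_df.reset_index(drop=True)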
@core4032 2 years ago
Great, sir. I really enjoyed watching this series.
@charanpoojary4804 3 months ago
Great🤩
@AltafAnsari-tf9nl 1 year ago
Hats off to you, because I think you are the only one who has shown the algorithm working in a notebook from scratch.
@narendraparmar1631 11 months ago
Nice explanation, thanks!
@anaykhator6475 1 year ago
Why do we need to up-sample the dataset if we have already updated the weights of the correctly and incorrectly predicted samples? The incorrect ones would automatically have a greater say in the subsequent samples, right? Also, if we up-sample the dataset and create a new one in each iteration, some rows of data (the correctly classified ones) will keep disappearing from the dataset, and it will be left with only a handful of copies of the incorrectly classified samples. Wouldn't the diminishing representation of all the rows in each iteration keep the model from learning the deeper trends/patterns in the data?
@sultanaparvin72 9 months ago
I have the same question
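For what it's worth, the upsampling shown in the video is one way to give misclassified rows more influence; scikit-learn's weak learners can also take the updated weights directly via sample_weight, in which case no row ever physically disappears. A small sketch (the data and weight values below are made up for illustration):

import numpy as np
from sklearn.tree import DecisionTreeClassifier

X = np.array([[1.0], [2.0], [3.0], [4.0]])
y = np.array([0, 0, 1, 1])
# Updated AdaBoost weights: the two middle rows were misclassified (assumed).
weights = np.array([0.1, 0.4, 0.4, 0.1])

# Passing sample_weight lets the stump weight its split criterion by these
# values instead of training on a resampled copy of the data.
stump = DecisionTreeClassifier(max_depth=1)
stump.fit(X, y, sample_weight=weights)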
@sumanthpichika5295 1 year ago
Superb intuition
@abhinavkhandelwal-w4w 9 months ago
Hi Nitish, thank you for your playlist on Machine Learning. It is helping me a lot as I am switching my career from Project Management to Data Science. In this video, for the 2nd df you said that row 0 was wrongly classified, but it was row 2, as you said at 12:54. But at 13:22 you said it was the 0th row that was wrongly classified. In the upsampling we didn't get the 2nd row, we only got row 0. Need your clarification on this. Hope you will respond.
@abhranilpal9804 10 months ago
Brilliant tutorial.
@sachin2725 2 years ago
Hello sir, XGBoost is not included in the playlist. Could you please make a video on XGBoost?
@meenatyagi9740 11 months ago
But how can we extend this code to a multiclass classification problem, since the sign function works only for the binary case?
@pankajbhatt8315 2 years ago
Amazing video
@aditya_01 3 years ago
Thank you very much sir, really nice video.
@rajgurubhosale8680 27 days ago
I have a question: why didn't he create a video on AdaBoostRegressor?
@kindaeasy9797 7 months ago
Wow, the best!!!
@just__arif 2 years ago
Thank you sir! It helps a lot.
@murumathi4307 3 years ago
Hi sir... please help with one question, sir. I have only 1 input but my output max is 10... is it possible, sir?
@akzork 2 years ago
Why did you train dt3 on second_df instead of third_df?
@mustafamarvat863 2 years ago
Are we going to use the updated weights in the second model, or do we use the same weights from model 1?
@IqraKhan-xh2cp 4 months ago
Done with the 100th video
@parthshukla1025 3 years ago
100th video OP
@aditiarora2128 2 years ago
Sir, can we use the same code for an image dataset?
@yashjain6372 2 years ago
Best, sir!
@pradeeptamohanty2457 5 months ago
Sir, there is one doubt: if the error is more than 0.5, then alpha is negative, so it will start decreasing the weights of the wrong ones and increasing the weights of the right ones.
@LonerFactsClub 3 months ago
If the error of the selected weak learner is more than 0.5, then it's worse than randomly guessing! We should adjust the weak learner so that it performs well, maybe by increasing the depth by one more level. According to my current knowledge, this is what I came up with. What are your views on this? 🙂
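A quick numeric check of both points above, using the standard AdaBoost model-weight formula alpha = 0.5 * ln((1 - error) / error) (the helper below is just an illustration):

import numpy as np

def alpha(err: float) -> float:
    # Standard AdaBoost "amount of say" for a weak learner with error err.
    return 0.5 * np.log((1 - err) / err)

print(alpha(0.1))  # ~ +1.10: better than chance, positive say
print(alpha(0.5))  #    0.00: no better than guessing, zero say
print(alpha(0.7))  # ~ -0.42: worse than chance, so the weight update flips,
                   # exactly as described in the comments above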
@kislaykrishna5599 3 years ago
Sir, please upload videos on a daily basis, as I am following your classes and some blogs to learn data science 🙏
@a1x45h 3 years ago
His quality of videos is top notch. It takes time, man. Other YouTubers pushing a new video every day are below par and spammy; their quality is garbage. Nitish spent 2 hours explaining the first gradient descent video, and he said that he prepared for 2 days for that video. Let this man take his time. We can wait a bit.
@rajeevranjan5007 3 years ago
@@a1x45h Agree
@ankitbiswas8380 2 years ago
He's a human... not a machine.
@raj-nq8ke 1 year ago
Gold.
@MrYatinchamp 2 years ago
Sir, won't the additional term 0.00001 that you added in the denominator make the whole model-weight term very big mathematically?
@Shobhitchoudhary321 1 year ago
Nope
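To expand on that: assuming the epsilon is added as (error + 0.00001) in the denominator, it barely changes alpha at ordinary error values; it only matters when the error is exactly zero, where it turns an undefined division into a large but finite say:

import numpy as np

def alpha(err: float, eps: float = 1e-5) -> float:
    # Epsilon guards against division by zero when a stump is perfect.
    return 0.5 * np.log((1 - err) / (err + eps))

print(alpha(0.3))  # ~0.4236, essentially identical to the un-guarded value
print(alpha(0.0))  # ~5.76: large but finite instead of an undefined result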
@singh6315 2 years ago
While training the third model you took values from the second model, which is why you got so many error values in the 3rd training. Anyway, the video was great.
@akzork 2 years ago
Yeah, I got confused because of that. Can you please explain what the right procedure is here?
@yashjain6372 2 years ago
That's correct. Random data from the 2nd gets passed to the third (upsampling the error points and downsampling the correct points).
@yashjain6372 2 years ago
Okay, got it. Error in line 369. Anyway, awesome video. The best!
@tanmaygupta8288 6 months ago
AdaBoost seems to be a lot more complicated.