Machine learning - Random forests

238,601 views

Nando de Freitas

A day ago

Comments: 80
@lwoltersyt · 8 years ago
I simply love all of Nando de Freitas's explanatory videos: clear, effective, and straight to the point.
@siddarthjay3787 · 9 years ago
Excellent video. Professors like him are the reason learning is fun. They throw exciting ideas at you as if they were nothing.
@meganmaloney192 · 9 years ago
Incredibly helpful! Thank you for posting.
@comadano · 12 years ago
Thanks for making these great lectures publicly available. I hope the remaining lecture videos get posted soon!
@ApiolJoe · 7 years ago
The resource that helped me the most in the least amount of time. Thanks for sharing.
@helenlundeberg · 9 years ago
Around 1:08:25, the professor talks about Bayesian optimization using a GP prior or something. What I'm not clear on is: what is the prior on?
@kellyli1920 · 10 years ago
This is great: so clear and easy to understand! Thank you so much!
@junfu8695 · 9 years ago
+Kelly Li It is.
@drbhojrajghimire3908 · 8 years ago
A very good classroom video: simple, interesting, and clearly explained.
@LuisFelipeZeni · 7 years ago
Excellent professor, thanks for sharing your knowledge with us.
@alefranc100 · 10 years ago
Thanks Nando ... this lecture is really good and useful!
@GoBlue7171 · 7 years ago
Wow, this is amazing. Such a clear and informative lecture.
@rodrigo100kk · 5 years ago
This course is amazing!
@leicaandrei · 8 years ago
Why is the bias very low at 40:20?
@shervin0 · 8 years ago
I'm not completely sure, but it could be because of the way the trees are created: they maximize information gain with simple decisions, and those simple, information-gain-driven splits don't produce very biased divisions. (As I said, I'm not sure if this is the reason! :) )
@antonosipov100 · 9 years ago
A very good introduction to random forests. Thank you!
@tonyperez8878 · 9 years ago
How do you compute the information gain? How is it related to entropy and mutual information?
@hyperzoanoid · 11 years ago
Do random forests need classification trees with more than one node to be effective?
@kaleeswaranm2679 · 7 years ago
Very good lecture, but I would suggest watching at 1.5x speed.
@cfelixcfelix6568 · 9 years ago
I didn't get the part with the three trees and the histograms. Doesn't each tree give a definite answer?
@AbhijeetSachdev · 9 years ago
+Cfelix Cfelix Of course it will, but for classification you just take the sign of the average value, which is exactly the same as a majority vote. For regression, the averaging actually matters.
@nicooteiza · 8 years ago
+Cfelix Cfelix Actually, not necessarily. In many real-life cases the data is not separable on the features, so even a very big tree could not give definite answers. Moreover, each individual tree may be pruned to a certain depth, so its answer is not a definite class but a vector of "probabilities" for each class. For example, in a 3-class problem each tree returns [p1, p2, p3] with p1 + p2 + p3 = 1, and the final prediction is obtained by adding all the trees' vectors element-wise and dividing by the number of trees.
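The element-wise averaging described in that reply can be sketched in a few lines of NumPy (the per-tree probability vectors here are made-up toy values, not output from any real forest):

```python
import numpy as np

# Hypothetical per-tree class-probability vectors for a 3-class problem.
# Each row is one tree's [p1, p2, p3] output for the same test point.
tree_probs = np.array([
    [0.7, 0.2, 0.1],
    [0.5, 0.4, 0.1],
    [0.6, 0.1, 0.3],
])

# Forest prediction: average the vectors element-wise, then take the argmax.
forest_probs = tree_probs.mean(axis=0)
predicted_class = int(np.argmax(forest_probs))

print(forest_probs, predicted_class)
```

Note that the averaged vector still sums to 1, so the forest's output can itself be read as a class distribution.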
@EmreOzanAlkan · 10 years ago
Thank you! It's really nice and easy to understand!
@jiansenxmu · 10 years ago
A perfect and amazing lecture on random forests, thank you very much!
@PrakashMatthew · 7 years ago
Thank you, @Nando de Freitas. Is the second part of the lecture, the one to be given by the grad student, available online?
@satter87henne · 6 years ago
I looked up the resource (Criminisi) and I wonder: is there an R package or a Python package where this specific algorithm is implemented? There are many R packages, and I know there is scikit-learn in Python. However, I want to make sure I use the more general model as outlined by Criminisi.
@fangliren · 11 years ago
I'm not sure I understand your question; with only one node (which would have to be the "parent" node, by definition), the tree wouldn't have any child nodes into which to sort the data based on the criterion in the parent node. The tree wouldn't do anything at all... Perhaps you are asking a different question?
@tadinglesby8997 · 8 years ago
Can one have a greater tree depth in the model (say D = 10) with only 8 features? Or does it have to be a maximum depth of 8, corresponding to the 8 features?
@ecemilgun9867 · 7 years ago
You can actually use a feature in more than one split. It sometimes causes overfitting but is sometimes needed: for instance, both deficiency and abundance of glucose in the blood indicate a disease.
@fatemehsaki9020 · 11 years ago
Really great lecture and teacher! Thanks so much.
@shubhamjha1 · 8 years ago
The next lecture (the one about Kinect and such) seems pretty interesting. Could I have a link to that?
@zenzafine4500 · 5 years ago
Did you get the link?
@zenzafine4500 · 5 years ago
I think it's this one: kzbin.info/www/bejne/l4nUenacfZmNoqM
@ravimadhavan1984 · 9 years ago
Great lecture!
@agnerraphael · 8 years ago
Thanks for publishing; a great help for my project.
@sameenatasneemshaikh8349 · 7 years ago
Thank you, sir, for the excellent lecture. Please share the other videos you mentioned, such as the Bayesian optimization lecture.
@PravinMaske1 · 9 years ago
Excellent video. Very helpful and easily understood.
@reneveloso · 9 years ago
Thank you! Great lecture! You have a very Brazilian name... :-)
@tpinto9 · 7 years ago
Portuguese?
@sheikhsaqib6323 · 10 years ago
Can anyone here please help me out? I am trying to classify accelerometer and gyroscope values from a robot using random forests. The classification labels are "walking" and "under perturbation" (using different forces). From what I understand, this is data that evolves over time as the robot walks. My question is: can I use the technique described in the video to do the classification, or do I need to do it a different way? (If so, please point me to the right material where I can read about how to classify such data.)
@aidaelkouri7050 · 7 years ago
Hi! Loved the video. I was wondering how the problem would work if you chose more than two features to look at. Would you then plot them in 3D? Thank you.
@RobMartin-gz3zk · 12 days ago
Yeah, if you wanted to visualise it you could plot it in 3D. But all the algorithm is doing is seeing how many points fall within a certain region: for example, if you have n features, you can pick one of them and count how many points are above or below a certain value for that feature. The algorithm itself doesn't need to plot anything to work.
@RobMartin-gz3zk · 12 days ago
Sorry, didn't realise this was 7 years ago lol.
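The point-counting idea from that reply can be sketched directly; the toy data, the chosen feature, and the threshold below are all made up for illustration:

```python
import numpy as np

# Hypothetical dataset: rows are points, columns are features.
X = np.array([
    [1.0, 5.0],
    [2.0, 3.0],
    [4.0, 8.0],
    [6.0, 1.0],
])

feature = 0      # pick one feature...
threshold = 3.0  # ...and a candidate split value for it

# A tree split just partitions points on either side of the threshold;
# no plotting is required, whatever the dimensionality.
above = int(np.sum(X[:, feature] > threshold))
below = int(np.sum(X[:, feature] <= threshold))

print(above, below)
```

The same counting works unchanged for n features: the split always looks at one feature at a time.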
@aadilraf · 7 years ago
Amazing lecture!
@rezayousufi4200 · 4 years ago
Super explanation!
@BoredFOMOape · 11 years ago
Thanks for posting! Great lecture.
@d0msch · 8 years ago
Thanks for publishing; helped me a lot :)
@deepakk1944 · 6 years ago
What is bagging?
@sourceschaudes9587 · 6 years ago
boosting
@theamazingjonad9716 · 6 years ago
Bagging is an ensemble technique whereby you randomly build k models from k subsets of the original data. The subsets are built using bootstrap resampling (sampling with replacement).
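A minimal sketch of that definition, using toy 1-D data and the sample mean as a stand-in "model" (all names and values here are illustrative, not from the lecture):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression targets; the "model" fitted to each bag is just its mean.
data = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
k = 10  # number of bootstrap models

predictions = []
for _ in range(k):
    # Bootstrap resample: draw len(data) points WITH replacement.
    sample = rng.choice(data, size=len(data), replace=True)
    predictions.append(sample.mean())  # "fit" one simple model per bag

# Bagged prediction: average the k models' outputs.
bagged = float(np.mean(predictions))
print(bagged)  # close to data.mean() == 3.5
```

A random forest is exactly this recipe with decision trees as the per-bag models, plus random feature subsampling at each split.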
@BangsterDK · 9 years ago
Amazing. Thank you so much!
@abhijeet24patil · 11 years ago
In step 2 of the random forest algorithm, "until the minimum node size n_min is reached": I didn't understand this line. When should we stop selecting m variables at random from the p variables?
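On the question above: the m-of-p feature sampling is repeated at every node, and the recursion stops splitting a node once it contains at most n_min points. A toy sketch of the stopping rule (the halving "split" here is a placeholder for the real best-split search, which is omitted):

```python
import numpy as np

def grow(indices, n_min):
    """Depth reached before the minimum-node-size rule stops the recursion."""
    if len(indices) <= n_min:
        return 0  # leaf: the node is too small to split further
    # A real tree would sample m of the p features here and search for the
    # best split; this sketch just halves the node to show when we stop.
    mid = len(indices) // 2
    return 1 + max(grow(indices[:mid], n_min), grow(indices[mid:], n_min))

depth = grow(np.arange(16), n_min=2)
print(depth)  # the recursion bottoms out at nodes of size 2
```

So n_min controls how fine-grained the leaves get: smaller n_min means deeper trees with lower bias and higher variance.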
@BudiMulyo · 7 years ago
Thank you! It's a very clearly presented video!
@ronthomas8331 · 9 years ago
Excellent lecture!
@aaronbrinker2613 · 7 years ago
Nando for President
@jihoonkim6819 · 7 years ago
Thank you for the nice lecture.
@AbhijeetSachdev · 9 years ago
Brilliant lecture :)
@fathiafaraj479 · 7 years ago
Hello sir, I am a graduate student in computer science. My research is on object detection using random forests and local binary patterns. I hope you can help with RF code; I am using MATLAB.
@andreiherasimau7800 · 6 years ago
Excellent video!
@R1CH4RDGZA · 6 years ago
"Loquacious": now that's one good obscure word, just off the cuff.
@alkodjdjd · 8 years ago
I sent you an email and got no reply. Thank you.
@KrishnaDN · 4 years ago
My dream is to work with a professor like him.
@yuzhou1 · 6 years ago
Great video; watch at 1.5x speed.
@jennifermew8386 · 8 years ago
Thank you
@张超-y3w · 7 years ago
Great!
@donnasara70 · 8 years ago
Thank you for speaking without an accent.
@melaxxl · 9 years ago
walla shaza.
@kikokimo2 · 7 years ago
Play at 2x speed!
@muneebaadil1898 · 8 years ago
Although I appreciate the knowledge being given away, I'm sorry: a VERY slow lecture, leading to a boring class.
@prathameshaware7433 · 7 years ago
Yeah, it was slow, but there is an option on YouTube to increase the playback speed, so you can use it next time :)
@dinofranceschelli3969 · 8 years ago
Excellent! Really easy to understand, thanks for sharing!
@naiden100 · 7 years ago
Thanks for a great lecture!