Active Learning | Tutorial on Active Learning from Theory to Practice | ICML

17,329 views

DSAI by Dr. Osbert Tay

A day ago

Comments: 9
@beauzeta1342 5 years ago
Scientific talk starts at 11:14
@jeroenritmeester73 4 years ago
Passive learning is like watching YouTube's recommended videos, sometimes learning something new. Active learning is watching lectures on new technologies, ensuring you encounter something you haven't seen before.
@frederikzilly7998 3 years ago
Thank you for the illustrative talk. Just one question: at minute 22:00 there is an example of a probability of y given x, p(y|x), potentially being below 0. Should it rather be a threshold t? I just stumbled on seeing that negative probability...
@DrOsbert 3 years ago
Hello from the Editor of AIP. As I understand it, the talk mostly refers to a generic binary classification problem, so when the score p(y|x) is below 0, instances are classified as negative / -1 / 0, and vice versa. Here the threshold t is 0, which is the default threshold and may not represent the optimal interpretation of the predicted probabilities. (From Machine Learning Mastery: machinelearningmastery.com/threshold-moving-for-imbalanced-classification/) This might be the case for a number of reasons, such as:
1. The predicted probabilities are not calibrated, e.g. those predicted by an SVM or decision tree.
2. The metric used to train the model differs from the metric used to evaluate the final model.
3. The class distribution is severely skewed.
4. The cost of one type of misclassification matters more than another.
Therefore, the 0 threshold can be replaced by any value depending on your dataset, domain and use case. I think that, for illustration purposes and to keep things digestible, the presenter made a hard threshold of > 0 for positive instances. However, I might be wrong; if you have further doubts, I can help you reach out to the author for further clarification. Disclaimer: the above comments are solely my personal opinion.
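A minimal sketch of what the reply above describes (my own illustration using scikit-learn's LinearSVC, not code from the talk): an SVM's raw decision score is a signed margin, so values below 0 are normal and simply mean "predicted negative"; the threshold t = 0 is just the default and can be moved.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import LinearSVC

# Toy binary classification problem (synthetic data).
X, y = make_classification(n_samples=200, n_features=5, random_state=0)
clf = LinearSVC(dual=False).fit(X, y)

# Signed decision scores: these are margins, not probabilities,
# so negative values are expected and violate no probability axiom.
scores = clf.decision_function(X)

# Default decision rule: threshold the score at t = 0.
pred_default = (scores > 0).astype(int)
assert (pred_default == clf.predict(X)).all()

# Threshold moving: lower t to label more instances positive,
# e.g. when false negatives are costlier than false positives.
t = -0.25
pred_moved = (scores > t).astype(int)
```

The "negative probability" on the slide is thus most plausibly a score of this kind; a calibrated probability would lie in [0, 1], with the analogous default threshold at 0.5.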
@manohariisc A year ago
@@DrOsbert p(y|x) becoming negative is problematic... are we not violating one of the probability axioms?
@zoeyx1998 3 years ago
Thanks! This helps a lot.
@surflaweb 4 years ago
only some cats can understand this
@BigAsciiHappyStar 3 years ago
8 Out Of 10 Cats Does Active Learning (please don't delete me) 😊
@jakebrowning2373 2 years ago
OMG