13.4.3 Feature Permutation Importance Code Examples (L13: Feature Selection)

8,973 views

Sebastian Raschka
1 day ago

Sebastian's books: sebastianrasch...
This video shows code examples for computing permutation importance in mlxtend and scikit-learn.
Permutation importance is a model-agnostic, versatile way of computing feature importance based on a machine learning classifier or regression model.
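As a rough sketch of the idea (using scikit-learn's `permutation_importance` on the wine dataset from `sklearn.datasets`; the classifier choice and split parameters here are illustrative, and the notebooks below may set things up differently):

```python
from sklearn.datasets import load_wine
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

# Load the wine data and hold out a test set
X, y = load_wine(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=123, stratify=y)

# Fit any model -- permutation importance is model-agnostic
clf = RandomForestClassifier(n_estimators=100, random_state=123)
clf.fit(X_train, y_train)

# Shuffle each feature column 10 times and measure the drop in test accuracy
result = permutation_importance(
    clf, X_test, y_test, n_repeats=10, random_state=123)

# Rank features by mean importance (mean accuracy drop over the repeats)
feature_names = load_wine().feature_names
for idx in result.importances_mean.argsort()[::-1]:
    print(f"{feature_names[idx]}: "
          f"{result.importances_mean[idx]:.3f} "
          f"+/- {result.importances_std[idx]:.3f}")
```

Note that the importance values are drops in the scoring metric, so a feature whose permutation barely changes the test accuracy gets an importance near zero.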
Code notebooks:
Wine data example: github.com/ras... learning-fs21/blob/main/13-feature-selection/05_permutation-importance.ipynb
Using a random feature as a control: github.com/ras...
Checking correlated features: github.com/ras...
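The random-feature control mentioned above can be sketched along these lines (an illustrative setup, not the notebook's exact code): append a column of pure noise, then treat its permutation importance as a baseline; real features that don't clearly beat the noise column are candidates for removal.

```python
import numpy as np
from sklearn.datasets import load_wine
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

rng = np.random.RandomState(123)
X, y = load_wine(return_X_y=True)

# Append a pure-noise column as a control feature (last column)
X_ctrl = np.hstack([X, rng.normal(size=(X.shape[0], 1))])

X_train, X_test, y_train, y_test = train_test_split(
    X_ctrl, y, test_size=0.3, random_state=123, stratify=y)

clf = RandomForestClassifier(n_estimators=100, random_state=123)
clf.fit(X_train, y_train)

result = permutation_importance(clf, X_test, y_test,
                                n_repeats=10, random_state=123)

# Importance of the noise column serves as a "no signal" baseline
noise_importance = result.importances_mean[-1]
print(f"noise-column importance: {noise_importance:.4f}")

# Real features whose importance does not exceed the noise baseline
weak = np.where(result.importances_mean[:-1] <= noise_importance)[0]
print("candidates for removal:", weak)
```

The correlated-features notebook explores a related caveat: when two features carry the same information, permuting one of them hurts the model less than expected, so both can look less important than they jointly are.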

Slides: sebastianrasch...
Random forest importance video: • 13.3.2 Decision Trees ...
-------
This video is part of my Introduction to Machine Learning course.
Next video: • 13.4.4 Sequential Feat...
The complete playlist: • Intro to Machine Learn...
A handy overview page with links to the materials: sebastianrasch...
-------
If you want to be notified about future videos, please consider subscribing to my channel: / sebastianraschka

Comments: 13
@lucarubinetti2523 · 1 year ago
Can you do some video about Shapley values for feature importance? Thanks a lot :)
@bezawitdagne5756 · 1 year ago
Thank you so very much 💙🙏
@andrewm4894 · 7 months ago
Great video, really useful explanations
@SebastianRaschka · 6 months ago
Glad you liked it
@ocamlmail · 1 year ago
Very useful, thank you!
@khaledsrrr · 1 year ago
Keep them coming ❤❤❤ I liked it
@liss9597 · 1 year ago
Very thankful for this video and the entire set of videos. At minute 7:08, X_test and y_test must be NumPy arrays, right? If yes, should I use X_test.values and y_test.values, or X_test.to_numpy() and y_test.to_numpy()? Thanks again!
@mariajuanasadventures3672 · 2 years ago
Thanks for the video
@SebastianRaschka · 2 years ago
Glad you liked it!
@sahanaseetharaman1440 · 2 years ago
Thanks for the video! In this case, two of the features are perfectly correlated. What if the correlation is less than |1|? Also, what happens in the case of categorical features? Suppose there is a categorical feature column with multiple categories, and we one-hot encode it; does it make sense to sum the resulting columns' importances to get the importance of that feature?
@monashaaban2337 · 1 year ago
Hi Sebastian Raschka, can you explain LDA with code, please?
@SebastianRaschka · 1 year ago
Coincidentally, I wrote about it here a few years back: sebastianraschka.com/Articles/2014_python_lda.html
@monashaaban2337 · 1 year ago
@@SebastianRaschka thank you.