Advanced Feature Engineering Tips and Tricks - Data Science Festival

13,032 views

Data Science Festival

Comments: 14
@jamescaldwell3207 3 years ago
This is one of the best videos, if not the best, I've seen on this topic so far. Most people are too focused on code to get deep into the what and how.
@tommyj0059 3 years ago
This is GOLD and please keep in mind this metal was created by very big stars.
@gouravkumar9011 3 months ago
This is amazing. I really like this.
@futureceltic00 2 months ago
Can we get a LinkedIn post or video update on whether these tips are still applicable today? Also, you highly encourage target mean encoding, but I think it inherently leaks some information into the training set. Am I wrong to assume that? Thanks, and nice video by the way.
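Target mean encoding can indeed leak the target if each row's encoding is computed on data that includes that row; the usual remedy is out-of-fold encoding. A minimal sketch in Python, where the column names and toy data are illustrative assumptions rather than anything from the talk:

import pandas as pd
from sklearn.model_selection import KFold

# Hypothetical toy data: a categorical column "city" and a binary target "y".
df = pd.DataFrame({
    "city": ["A", "A", "B", "B", "B", "C", "C", "A"],
    "y":    [1, 0, 1, 1, 0, 0, 1, 0],
})

# Out-of-fold target mean encoding: each row's encoding is computed only from
# rows in the *other* folds, so a row's own target never leaks into its feature.
global_mean = df["y"].mean()
df["city_te"] = global_mean  # fallback for categories unseen in the training folds

kf = KFold(n_splits=4, shuffle=True, random_state=0)
for train_idx, valid_idx in kf.split(df):
    fold_means = df.iloc[train_idx].groupby("city")["y"].mean()
    encoded = df.iloc[valid_idx]["city"].map(fold_means).fillna(global_mean)
    df.loc[df.index[valid_idx], "city_te"] = encoded.values

print(df)

At prediction time the encoding is typically computed once from the full training set; the out-of-fold scheme is only needed to build leak-free features for training rows.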
@iustingeorgevici5509 4 years ago
What do you think about Boruta, based on permuted random forests, as a help for feature selection? What about doing some "brute-force" feature engineering, then Boruta, and then connecting the most important features to the dependent variable with an interpretable model like a generalized additive model? Thank you.
@TScottClendaniel 4 years ago
Unfortunately, I know almost nothing about "boruta," so I can't help you on this one.
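For readers unfamiliar with it: Boruta adds shuffled "shadow" copies of every feature, fits a random forest, and keeps only the features whose importance consistently beats the shadows. A minimal sketch using the third-party BorutaPy package, with illustrative (assumed) data and parameters:

import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from boruta import BorutaPy  # third-party package: pip install Boruta

# Hypothetical data standing in for a "brute-force engineered" feature matrix.
X, y = make_classification(n_samples=500, n_features=30, n_informative=5,
                           random_state=0)

# Shallow trees are commonly used for the underlying forest.
rf = RandomForestClassifier(n_jobs=-1, max_depth=5, random_state=0)
selector = BorutaPy(rf, n_estimators="auto", random_state=0)
selector.fit(X, y)

print("Confirmed features:", np.where(selector.support_)[0])
X_selected = selector.transform(X)  # this reduced matrix could then feed a GAM, etc.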
@chrstfer2452 1 year ago
What's funny looking back at this now is that moment Google stepped back. That was when they first got BERT to a pre-RLHF, GPT-3 level of competence, but the rumor is that some execs got spooked and back-burnered it. About 2.5 years on, people started unironically and intentionally using Bing for the first time since they downloaded Chrome. I expect those execs got canned, but I haven't followed it closely.
@dinoscheidt 3 years ago
Awesome talk. Very well prepared. Thanks a lot from Berlin
@jonathanhexner 3 years ago
Great talk! Can you please elaborate on the feature selection method you typically use?
@shyamsundarramadoss3567 3 years ago
Thanks a lot for a great session. Also, can you please elaborate on this method of taking the leaves of a decision tree as a new feature? As far as I know, the leaves of a decision tree hold either labels (classification) or target numerical values (regression), while features can be categorical or numeric. I'm not sure whether this happens after converting the categorical features to some numeric form through one of the possible encoding or imputation methods. So, as I understand it, the feature vectors are converted together into a single vector after passing them through the fitted decision tree. Am I right, or is the method something different?
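The common way this technique is implemented is to pass each row through a fitted tree (or tree ensemble), record the index of the leaf it lands in, and one-hot encode those leaf indices as new features; the inputs must already be numeric (encoded/imputed) before the tree is fit. A minimal sketch with scikit-learn, where the data and model settings are illustrative assumptions:

import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import OneHotEncoder

# Hypothetical numeric data (categoricals would be encoded/imputed beforehand).
X, y = make_classification(n_samples=600, n_features=10, random_state=0)
X_tree, X_lin, y_tree, y_lin = train_test_split(X, y, test_size=0.5, random_state=0)

# 1. Fit a tree ensemble on one half of the data.
gbm = GradientBoostingClassifier(n_estimators=50, max_depth=3, random_state=0)
gbm.fit(X_tree, y_tree)

# 2. apply() returns, for each row, the index of the leaf it falls into in
#    every tree: an integer matrix of shape (n_samples, n_trees).
leaves = gbm.apply(X_lin)[:, :, 0]

# 3. One-hot encode those leaf indices and use them as features for a second,
#    simple model trained on the other half of the data.
encoder = OneHotEncoder(handle_unknown="ignore")
leaf_features = encoder.fit_transform(leaves)
clf = LogisticRegression(max_iter=1000)
clf.fit(leaf_features, y_lin)

Splitting the data so the tree ensemble and the downstream model are fit on different halves avoids the leaf features being overly tailored to the rows the trees were grown on.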
@iustingeorgevici5509 4 years ago
Another question: how can we take the output of unsupervised algorithms and feed it into supervised algorithms?
@TScottClendaniel 4 years ago
You can take the cluster assignment, as in the cluster ID, and use that as a new feature in the supervised algorithm. Thanks for asking!
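A minimal sketch of that idea with scikit-learn, assuming KMeans as the unsupervised step (the data and parameter choices are illustrative):

import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Hypothetical data; names and parameter choices are illustrative.
X, y = make_classification(n_samples=300, n_features=8, random_state=0)

# Unsupervised step: fit a clustering model and record each row's cluster ID.
kmeans = KMeans(n_clusters=5, n_init=10, random_state=0)
cluster_id = kmeans.fit_predict(X)

# Supervised step: append the cluster ID as an extra column (a new feature).
X_aug = np.column_stack([X, cluster_id])
clf = RandomForestClassifier(random_state=0)
clf.fit(X_aug, y)

Note that the cluster ID is categorical, so for linear models it is usually safer to one-hot encode it rather than treat it as an ordinal number; tree-based models can often consume the raw integer.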