Statistical Learning: 2.2 Dimensionality and Structured Models

26,189 views

Stanford Online

A day ago

Comments: 16
@MrWater2 10 months ago
Please improve the image quality; the material is extremely good.
@ashleyforte2885 1 year ago
Love the topological breakdown.
@Tsicloh 1 year ago
I feel like the approach in the book and the approach in these videos are totally different. 😅
@ashfaqmahmudshovon9097 5 months ago
It seems so...
@Sama-mn3hh 10 months ago
For the curse of dimensionality, can't we also use dimensionality reduction?
@brucepackard2729 1 year ago
Income follows a power law (Pareto distribution), though. What happens when you include a billionaire who didn't finish a university education as a data point? You could discard the billionaire as an outlier, but then you lose a lot of the representativeness of real life.
@Sama-mn3hh 10 months ago
In practice, you use the z-score to remove outliers from the dataset as part of data prep before fitting a model. If the model is focused on predicting the income of the working class, then that would be a solution. It depends on the scope of the project, I suppose.
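A minimal sketch of that kind of z-score filter in Python; the synthetic lognormal income data, the drop_zscore_outliers helper, and the 3-standard-deviation cutoff are illustrative assumptions, not anything from the lecture:

```python
import numpy as np
import pandas as pd

def drop_zscore_outliers(df: pd.DataFrame, column: str, threshold: float = 3.0) -> pd.DataFrame:
    """Keep rows whose value in `column` lies within `threshold` standard deviations of the mean."""
    z = (df[column] - df[column].mean()) / df[column].std()
    return df[z.abs() <= threshold]

# Toy data: 1,000 lognormal incomes plus one billionaire-scale observation.
rng = np.random.default_rng(0)
incomes = pd.DataFrame({"income": rng.lognormal(mean=10.8, sigma=0.5, size=1000)})
incomes.loc[0, "income"] = 2e9

filtered = drop_zscore_outliers(incomes, "income")
print(len(incomes), "->", len(filtered))  # the extreme value is dropped, the rest are kept
```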
@PaulMugo-r9c 1 year ago
Why do you use a circle when getting the nearest neighbors in 2 dimensions?
@mattetor6726 1 year ago
The first neighborhood just uses X1, ignoring X2. When you also take X2 into account, you start at the point (x1, x2) (= (0, 0) in this case) and spread out with a circle until 10% of the data points are covered.
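A quick NumPy sketch of that idea, growing a circle around a query point until it covers 10% of a toy dataset; the uniform data, the query point at (0, 0), and the 10% fraction are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(1000, 2))   # toy data: the two columns play the role of X1 and X2
query = np.array([0.0, 0.0])             # the point (x1, x2) at the center of the neighborhood

# Euclidean distance from the query point to every observation.
dist = np.linalg.norm(X - query, axis=1)

# Smallest radius whose circle around the query covers 10% of the points.
frac = 0.10
k = int(np.ceil(frac * len(X)))
radius = np.sort(dist)[k - 1]
neighborhood = X[dist <= radius]

print(f"radius covering {frac:.0%} of the data: {radius:.3f}")
print(f"points inside the circle: {len(neighborhood)}")
```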
@billsurrette6092 1 year ago
You don't have to use a circle. What you must do is define how you're going to measure "distance" when determining how far away the points are. When you use Euclidean distance you get a circle, but there are other distance measures: Minkowski, Manhattan, etc. These create other shapes, not circles. So the circle is a byproduct of the important thing: your chosen distance metric.
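As a rough illustration of how the metric changes the neighborhood's shape, this sketch counts how many points of a toy dataset fall within the same radius under a few metrics from SciPy; the data, the radius, and the metric choices are assumptions for illustration:

```python
import numpy as np
from scipy.spatial.distance import cdist

rng = np.random.default_rng(2)
X = rng.uniform(-1, 1, size=(2000, 2))   # toy 2-D data
center = np.array([[0.0, 0.0]])
r = 0.5

# The same radius covers a differently shaped region (and a different number of points)
# under each metric: Euclidean -> circle, Manhattan (cityblock) -> diamond,
# Minkowski with p=3 -> between a circle and a square, Chebyshev -> square.
for name, kwargs in [("euclidean", {}),
                     ("cityblock", {}),
                     ("minkowski", {"p": 3}),
                     ("chebyshev", {})]:
    d = cdist(X, center, metric=name, **kwargs).ravel()
    print(f"{name:>10}: {(d <= r).sum()} points within distance {r}")
```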
@PaulMugo-r9c 1 year ago
Thank you all for replying! I understand
@yulu-520c 4 months ago
Do you guys feel confused about everything?
@yulu-520c 4 months ago
What is p?
@yulu-520c 4 months ago
Oh, it's the number of variables. No problem now.
@anthronox4992 3 months ago
You should read the book, too. It's explained in more detail there (and more easily, imo).
@yulu-520c 3 months ago
@anthronox4992 Got it, thanks!