I am in love with this channel because of videos like this. This is my second video from this channel; I need to explore more. Kindly keep uploading videos with such amazing explanations. Kindly suggest a book, or else I will request you to write one.
@datahat642 3 years ago
For the third question, however, we missed an important point: considering x >= μ. Here we had |x - μ| >= 0, which has two parts: x - μ >= 0 when x >= μ, and -(x - μ) >= 0 when x < μ.
@AppliedAICourse 3 years ago
In our case x=mu as there is only one possible value for x. So x>= mu and mu>=x would simply translate to x=mu. So it would not change the math. In other cases, what you said is true.
@sonalnalawade6333 3 years ago
I have enrolled in your Applied ML course, and after studying stats from it I was able to answer correctly 🙏
@tagoreji2143 2 years ago
Thank you Sir
@rikinjain5309 3 years ago
This is a great video, please make it a series!
@rajbir_singh0517 3 years ago
Very in-depth, with nice intuition for the concept.
@suyashmisra7406 3 years ago
this channel is seriously underrated
@pratikaphale7547 3 years ago
Got the answer right thanks to the Applied AI course. All of these basic concepts are thoroughly covered in the course!!
@qaiserali6773 3 years ago
@Applied AI Course Thank you so much. Please make such short videos on interview questions on a regular basis.
@shivamjalotra7919 3 years ago
For the first question, can we just say that the % of values between [μ - σ, μ + σ] is just the integral of the PDF from μ - σ to μ + σ, which can be anywhere between 0% and 100%?
@AppliedAICourse 3 years ago
This argument would not work well if the question were about two standard deviations instead of one. Chebyshev's inequality gives a much better bound on the probability than just saying 0-100%.
@shivamjalotra7919 3 years ago
@@AppliedAICourse Got it thanks.
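To make the bound in this exchange concrete, here is a minimal Python sketch (the function name is my own) of the Chebyshev guarantee 1 - 1/k², which for k = 2 is already far tighter than the trivial 0-100% range:

```python
def chebyshev_lower_bound(k: float) -> float:
    """Lower bound on P(|X - mu| < k*sigma) from Chebyshev's inequality.

    Valid for any distribution with finite mean and variance, for k > 0.
    """
    if k <= 0:
        raise ValueError("k must be positive")
    # P(|X - mu| >= k*sigma) <= 1/k^2, so the complement is >= 1 - 1/k^2.
    return max(0.0, 1.0 - 1.0 / k ** 2)

# For k = 2, Chebyshev guarantees at least 75% of the mass within
# 2 sigma of the mean, for *any* distribution with finite variance.
print(chebyshev_lower_bound(2))  # 0.75
```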
@sanjeevkumar-iw2lz 3 years ago
Great👍
@thebackspace7345 3 years ago
I got the answer right😌
@omkiranmalepati1645 3 years ago
I'm trying to answer the second question. I'm thinking Chebyshev's inequality might fail for a Pareto distribution, since most of the data concentrates at lower x values.
@AppliedAICourse 3 years ago
The distribution need not be Pareto for us to be able to apply Chebyshev's inequality. In fact, there are some Pareto distributions which do not have a finite mean, and for those we may not be able to apply the inequality.
@omkiranmalepati1645 3 years ago
@@AppliedAICourse Yes, I meant we can't apply Chebyshev's for Pareto.
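Whether Chebyshev applies to a Pareto distribution depends on its tail index α: the mean is finite only for α > 1 and the variance only for α > 2. A small sketch using the standard Pareto moment formulas (the function name is my own):

```python
import math

def pareto_mean_var(alpha: float, xm: float = 1.0):
    """Mean and variance of a Pareto(alpha, xm) distribution.

    Chebyshev's inequality needs both to be finite, i.e. alpha > 2.
    """
    mean = alpha * xm / (alpha - 1) if alpha > 1 else math.inf
    var = (alpha * xm ** 2 / ((alpha - 1) ** 2 * (alpha - 2))
           if alpha > 2 else math.inf)
    return mean, var

print(pareto_mean_var(3.0))  # (1.5, 0.75): Chebyshev applies
print(pareto_mean_var(1.5))  # infinite variance: Chebyshev does not apply
print(pareto_mean_var(0.8))  # infinite mean as well
```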
@naveenvinayak1088 3 years ago
Whatever the distribution is, even if we don't know the distribution of the data, we can apply Chebyshev's inequality.
@abhishekkrishna9604 3 years ago
What if k is less than 1, like 0.5? In that case too, Chebyshev's will not hold true. k should be >= 1.
@AppliedAICourse 3 years ago
Then 1/k^2 would be greater than 1, which makes the inequality trivial, as all probabilities have to be less than or equal to 1. Hence the inequality holds whenever k is greater than 0.
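The point in this reply can be checked in a couple of lines: for k < 1 the bound 1/k² exceeds 1, so the inequality is vacuously true rather than violated (a quick sketch, using nothing beyond the inequality itself):

```python
# P(|X - mu| >= k*sigma) <= 1/k^2 holds for any k > 0; for k <= 1
# the right-hand side is >= 1, so the bound is trivially satisfied.
for k in (0.5, 1.0, 2.0):
    bound = 1.0 / k ** 2
    note = " (trivial: bound >= 1)" if bound >= 1 else ""
    print(f"k={k}: P(|X - mu| >= k*sigma) <= {bound:g}{note}")
```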
@goodgobikha1746 3 years ago
I think the answer to the last question is that it actually fails when we need to predict the output using the parameters, because Chebyshev's inequality is a non-parametric one.
@AppliedAICourse 3 years ago
But what are you trying to predict here? It is not clear. Typically, Chebyshev's inequality is not used for prediction in ML.
@prajwalsarpamale3375 3 years ago
68%
@AppliedAICourse 3 years ago
Please check the rest of the video for why this answer is true only if the underlying distribution is Gaussian. Ask yourself what happens if the underlying distribution is non-Gaussian, as we've done in the video.
@shubhamchoudhary5461 3 years ago
68% of the data lies within 1 standard deviation if the data is normally distributed. The empirical rule doesn't apply if the data is not normally distributed.
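The contrast in this comment can be made quantitative: for a Gaussian, the mass within k standard deviations follows from the error function, while Chebyshev gives a distribution-free lower bound. A minimal sketch (function names are my own):

```python
import math

def gaussian_within(k: float) -> float:
    """P(|X - mu| < k*sigma) for a Gaussian, via the error function."""
    return math.erf(k / math.sqrt(2))

def chebyshev_within(k: float) -> float:
    """Distribution-free lower bound on the same probability (k > 0)."""
    return max(0.0, 1.0 - 1.0 / k ** 2)

for k in (1, 2, 3):
    print(f"k={k}: Gaussian = {gaussian_within(k):.4f}, "
          f"any distribution >= {chebyshev_within(k):.4f}")
# For k=1 the Gaussian value is ~0.6827 (the "68%"), but Chebyshev
# guarantees nothing (>= 0): the 68% figure is Gaussian-specific.
```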
@surajshivakumar5124 3 years ago
Is this the same as Markov's inequality?
@AppliedAICourse 3 years ago
Yes, Chebyshev's inequality is a simple extension of Markov's inequality. You can derive Chebyshev from Markov's inequality in just 2 to 3 steps. You can find that in the Wikipedia article on Chebyshev's inequality.
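For readers who want those 2-3 steps spelled out, the standard derivation (a textbook sketch, not taken from the video) applies Markov's inequality to the non-negative variable (X - μ)²:

```latex
% Markov's inequality: P(Y \ge a) \le E[Y]/a for Y \ge 0, a > 0.
% Take Y = (X - \mu)^2 and a = k^2 \sigma^2:
P\big(|X - \mu| \ge k\sigma\big)
  = P\big((X - \mu)^2 \ge k^2\sigma^2\big)
  \le \frac{E\big[(X - \mu)^2\big]}{k^2\sigma^2}
  = \frac{\sigma^2}{k^2\sigma^2}
  = \frac{1}{k^2}
```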
@AppliedAICourse 3 years ago
Yes, please check this formal definition: en.m.wikipedia.org/wiki/Chebyshev%27s_inequality
@surajshivakumar5124 3 years ago
Can we use it for distributions like Pareto, zeta and the t-distribution (at 2 degrees of freedom), where either the mean or the std is not finite?
@AppliedAICourse 3 years ago
You cannot use this inequality if the mean or standard deviation is non-finite. That is clearly mentioned in the above link that I shared in the formal definition section.
@surajshivakumar5124 3 years ago
Thanks :)
@vijaycharankumararji256 3 years ago
I am looking for an offline course, sir. Is there any option?
@AppliedAICourse 3 years ago
Unfortunately, we don't have an offline course. All of our courses are completely online.
@SankarJankoti 3 years ago
Amazon data scientist interviews ask DSA questions.
@AppliedAICourse 3 years ago
Not much for data scientist roles. You are expected to know the basics of time and space complexity and recursion, along with data structures that are often used in machine learning, like hash tables and binary trees. For applied scientist roles, if you come from a CS background, you'll have dedicated rounds for data structures and algorithms, very similar to those of software engineers.
@JaswinKasi 3 years ago
What bullshit at 10:00? The proper interpretation is that the inequality holds for all k's; it means P(X - \mu > 0) \rightarrow 0. Please don't spread stupid things on the internet.
@AppliedAICourse 3 years ago
Please check Chebyshev's inequality's probabilistic statement here: en.wikipedia.org/wiki/Chebyshev%27s_inequality#Probabilistic_statement It is only valid when k > 0 and when we have non-zero variance for X.
@JaswinKasi 3 years ago
@@AppliedAICourse So say that. Don't put it in the equation and come up with stupid inequalities. It is like multiplying both sides of an equation by 0 and coming up with ridiculous equations like 3 = 4. Just mention that Chebyshev's inequality doesn't work for zero variance. Period.
@AppliedAICourse 3 years ago
I think that's what we tried to convey; maybe you misunderstood us. We are trying to show this as a boundary case where Chebyshev's inequality does not work. I think most other viewers got that point, as no one else raised an issue with this aspect.