Cross-Entropy Loss Log-likelihood Perspective

8,580 views

Matt Yedlin

A day ago

This video covers the cross-entropy loss from a log-likelihood perspective.
Licensed under Creative Commons Attribution-NonCommercial-ShareAlike (CC BY-NC-SA).
Authors: Matthew Yedlin, Mohammad Jafari
Department of Electrical and Computer Engineering, University of British Columbia.

Comments: 14
@vaibhavarora9408 4 years ago
Great lectures. Thank you Matt Yedlin.
@davidlearnforus 2 years ago
Hi, thank you so much! I am a self-learner without much formal background. Can you please explain how SUM p_i log q_i is entropy, given that it does not have a minus sign? If it were log(1/q_i) we would get a minus sign out of it, but it's not. I'm stuck there...
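A minimal numerical sketch of the sign question above, assuming NumPy and made-up distributions p and q: each log q_i is non-positive, so SUM p_i log q_i comes out negative on its own; the cross-entropy is conventionally written with the minus sign in front, -SUM p_i log q_i, which is the same quantity as SUM p_i log(1/q_i).

    import numpy as np

    # Hypothetical distributions, purely for illustration.
    p = np.array([0.7, 0.2, 0.1])   # true / observed probabilities
    q = np.array([0.6, 0.3, 0.1])   # predicted probabilities

    # Each log q_i <= 0, so this sum is negative by itself.
    sum_p_log_q = np.sum(p * np.log(q))

    # Cross-entropy carries the minus sign out front ...
    cross_entropy = -np.sum(p * np.log(q))

    # ... which is identical to writing log(1/q_i) inside the sum.
    cross_entropy_alt = np.sum(p * np.log(1.0 / q))

    print(sum_p_log_q)                       # negative number
    print(cross_entropy, cross_entropy_alt)  # equal, positive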
@AChadi-ug9pg 3 years ago
Muhammad is a good student
@bingbingsun6304 3 years ago
I want to know how you can write on the mirror and record it.
@mattyedlin7292 3 years ago
The camera sees the writing flipped through a mirror.
@peizhiyan2916 4 years ago
Nice
@garrettosborne4364 2 years ago
Can the old guy.
@yanbowang4020 4 years ago
Love your video, hope to see more of it.
@bingbingsun6304 3 years ago
Well explained using human language.
@yjy8 3 years ago
You just cleared my doubt about likelihood and log-likelihood which I had for the past year and a half. Thank you so much.
@victorialeigh2726 3 years ago
I really like that you guys hand-clean the clear board. Feels like being back in the classroom, plus the math teacher's small math talk!
@TylerMatthewHarris 3 years ago
thanks!
@jimbobur A year ago
*(EDIT: Solved it, see comment reply)* I don't follow how you go from the numerical example, where the likelihood is a product of the predicted probabilities q_i, each raised to the number of times the corresponding observed outcome occurs, to the algebraic expression of the likelihood where you take the product of q_i raised to N * p_i (or is that N_p_i? I'm a little unsure whether the p_i is a subscript of the N or multiplied by it).
@jimbobur A year ago
I worked it out. The answer is to remember that the number of times outcome i (with probability p_i) occurs, call it N_p_i, satisfies p_i = N_p_i / N, so N_p_i = N * p_i. Substituting this into the general form of the likelihood that follows from the numerical example, L = Π q_i ^ N_p_i, gives L = Π q_i ^ (N * p_i).
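A short numerical check of that rearrangement, assuming NumPy and invented counts: with N observations and outcome i occurring N_i = N * p_i times, the likelihood Π q_i^(N_i) equals Π q_i^(N * p_i), and its logarithm is N * SUM p_i log q_i, i.e. -N times the cross-entropy.

    import numpy as np

    # Hypothetical data: N = 10 observations over 3 outcomes.
    N = 10
    counts = np.array([7, 2, 1])       # N_i: times outcome i was observed
    p = counts / N                     # observed frequencies, p_i = N_i / N
    q = np.array([0.6, 0.3, 0.1])      # model's predicted probabilities

    # Likelihood written with raw counts vs. with N * p_i in the exponent.
    L_counts = np.prod(q ** counts)    # prod_i q_i ^ N_i
    L_freqs = np.prod(q ** (N * p))    # prod_i q_i ^ (N * p_i)

    # The log-likelihood collapses to N * sum_i p_i log q_i.
    log_L = N * np.sum(p * np.log(q))

    print(np.isclose(L_counts, L_freqs))        # True
    print(np.isclose(np.log(L_counts), log_L))  # True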