A question if I may, referring to the formula at 8:26: aren't you supposed to subtract from theta1 the learning rate divided by the size of the training set, times the sum of the whole range G15 to G20 (instead of just G15)? Thanks.
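For reference, a minimal sketch of the batch update this question describes, assuming a one-feature hypothesis h(x) = theta0 + theta1*x; the data and variable names are illustrative, not taken from the video's spreadsheet:

```python
# One batch gradient-descent step for simple linear regression.
x = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]   # feature column (stand-in data)
y = [2.1, 4.3, 6.2, 8.1, 9.9, 12.2]  # target column (stand-in data)
theta0, theta1 = 0.0, 0.0
alpha = 0.01                         # learning rate
m = len(x)                           # size of the training set

# The sum runs over ALL m examples (the analogue of summing the whole
# G15:G20 range rather than a single cell).
grad0 = sum((theta0 + theta1 * x[i]) - y[i] for i in range(m)) / m
grad1 = sum(((theta0 + theta1 * x[i]) - y[i]) * x[i] for i in range(m)) / m
theta0 -= alpha * grad0
theta1 -= alpha * grad1
print(theta0, theta1)
```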
@Debesta 1 year ago
What is the formula used to compute the squared error?
@TheRatdoctor 6 months ago
I think he has it wrong. The average of the sum of squares should be 68639.667 in the first iteration.
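A small sketch of the squared-error quantities being discussed, with illustrative numbers only; the 68639.667 figure comes from the commenter's own recalculation, not from this code:

```python
# Sum of squared errors and its average for a simple linear hypothesis.
x = [1.0, 2.0, 3.0]
y = [10.0, 20.0, 30.0]
theta0, theta1 = 0.0, 0.0
m = len(x)

squared_errors = [((theta0 + theta1 * x[i]) - y[i]) ** 2 for i in range(m)]
sum_sq = sum(squared_errors)   # sum of squared errors
mean_sq = sum_sq / m           # average of the sum of squares
print(sum_sq, mean_sq)
```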
@PrachiChaudhary-v8w 1 year ago
Calculating the gradient descent update requires the derivative of (h(theta) - y)^2. May I know why you didn't calculate the derivative, or is it just not used in practice?
@fiacobelli 1 year ago
It is baked into the algorithm.
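The "baked in" part can be made explicit: differentiating the squared-error cost with respect to theta1 yields the (h(theta) - y) * x term that already appears in the update rule. A sketch checking the analytic gradient against a numerical one (names and data are assumptions for illustration, not from the video):

```python
# The derivative of J(theta) = (1/2m) * sum (h(x) - y)^2 is the term used
# in the update, so no separate derivative step is needed.
x = [1.0, 2.0, 3.0, 4.0]
y = [3.0, 5.0, 7.0, 9.0]
theta0, theta1 = 0.5, 1.0
m = len(x)

def cost(t0, t1):
    return sum(((t0 + t1 * xi) - yi) ** 2 for xi, yi in zip(x, y)) / (2 * m)

# Analytic derivative: dJ/dtheta1 = (1/m) * sum (h(x) - y) * x
analytic = sum(((theta0 + theta1 * xi) - yi) * xi for xi, yi in zip(x, y)) / m

# Numerical check via a central finite difference
eps = 1e-6
numeric = (cost(theta0, theta1 + eps) - cost(theta0, theta1 - eps)) / (2 * eps)
print(analytic, numeric)  # the two values agree
```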
@AdityaKumar-sx6qk 10 months ago
Aren't we supposed to calculate theta1 and theta2 initially using the linear regression coefficient formula?
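For a single feature, the closed-form least-squares coefficients the commenter mentions can be computed directly, and gradient descent should converge to roughly the same values. A minimal sketch with illustrative data (not the video's):

```python
# Closed-form simple-linear-regression coefficients:
# theta1 = cov(x, y) / var(x), theta0 = mean(y) - theta1 * mean(x)
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.0, 4.1, 6.2, 7.9, 10.1]

mean_x = sum(x) / len(x)
mean_y = sum(y) / len(y)

theta1 = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y)) / \
         sum((xi - mean_x) ** 2 for xi in x)
theta0 = mean_y - theta1 * mean_x
print(theta0, theta1)
```

Gradient descent is usually shown instead of this formula because the iterative approach generalizes to models where no closed-form solution exists.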