Machine Learning - The Mathematics of Machine Learning | NerdML

1,826 views

NerdML

1 day ago

Comments: 16
@hoaxuan7074 3 years ago
You can swap what is adjusted in a neural network: use fixed dot products and adjustable (parametric) activation functions such as fi(x) = ai·x for x < 0, bi·x for x >= 0, i = 0 to m. The fast (Walsh) Hadamard transform can be used as a collection of fixed dot products. Such a net is then: transform, functions, transform, ..., transform. To stop the first transform from simply taking a spectrum of the input, you can apply a fixed, randomly chosen (or sub-random) pattern of sign flips to the input of the net. The cost per layer is n·log2(n) add/subtracts, n multiplies, and 2n parameters, where n is the width of the layer.
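A minimal sketch of that layer structure in NumPy, as I read the description above (the function names, the random initialisation, and the unnormalised Hadamard transform are my own illustration choices, not something specified in the comment):

```python
import numpy as np

def fwht(x):
    """In-place fast Walsh-Hadamard transform; len(x) must be a power of 2.
    Cost: n*log2(n) additions/subtractions, no multiplies."""
    h, n = 1, len(x)
    while h < n:
        for i in range(0, n, h * 2):
            for j in range(i, i + h):
                a, b = x[j], x[j + h]
                x[j], x[j + h] = a + b, a - b
        h *= 2
    return x

rng = np.random.default_rng(0)
n = 8                                        # layer width, power of 2
layers = 3
sign_flips = rng.choice([-1.0, 1.0], n)      # fixed random sign pattern for the input
# two adjustable parameters per element per layer -> 2n parameters per layer
a = rng.normal(size=(layers, n))
b = rng.normal(size=(layers, n))

def net(x):
    x = x * sign_flips                       # stop the first transform taking a plain spectrum
    for l in range(layers):
        x = fwht(x.copy())                   # fixed dot products
        x = np.where(x < 0, a[l] * x, b[l] * x)  # parametric activation, n multiplies
    return fwht(x.copy())                    # final transform

print(net(rng.normal(size=n)))
```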
@NerdML 3 years ago
Great understanding
@hoaxuan7074 3 years ago
There is an unwritten book on dot products that neural net researchers should read. In that book there is a chapter on the statistics of the dot product, where you learn that the variance equation for linear combinations of random variables applies, and often the central limit theorem (CLT) too. Then you can learn how to turn the dot product into a general associative memory with a weight vector. You need a locality sensitive hash with bipolar (+1, -1) outputs; this fits the dot product's preference for non-sparse inputs. To recall, dot the hash with the weight vector. To train, take the recall error and divide it by the number of dimensions, then add or subtract that value to each element of the weight vector according to the +1/-1 hash term, making the error zero. This adds a small amount of Gaussian noise to all the prior stored memories; to understand why, think about what would happen if you used a true hash algorithm together with the CLT. Anyway, you can remove all the noise by repeatedly storing all the training pairs a few times. A fixed, randomly chosen pattern of sign flips before the fast Hadamard transform gives a random projection (repeat for better quality), and applying binarization to the random projection gives a locality sensitive hash.
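A small sketch of that associative memory in NumPy, under my own assumptions about the details (scalar targets, an unnormalised Hadamard transform, and a single projection round rather than the repeated rounds suggested above):

```python
import numpy as np

def fwht(x):
    """In-place fast Walsh-Hadamard transform; len(x) must be a power of 2."""
    h, n = 1, len(x)
    while h < n:
        for i in range(0, n, h * 2):
            for j in range(i, i + h):
                a, b = x[j], x[j + h]
                x[j], x[j + h] = a + b, a - b
        h *= 2
    return x

n = 64
rng = np.random.default_rng(0)
flips = rng.choice([-1.0, 1.0], n)       # fixed random sign-flip pattern

def lsh(key):
    """Random projection (sign flips + Hadamard transform), binarized to +1/-1."""
    return np.sign(fwht(key * flips))

w = np.zeros(n)                          # weight vector of the associative memory

def recall(key):
    return lsh(key) @ w                  # dot the hash with the weight vector

def train(key, target):
    h = lsh(key)
    err = target - (h @ w)               # recall error
    w[:] = w + h * (err / n)             # add/subtract err/n per hash sign; error goes to zero

# store a few scalar memories; repeated passes wash out the added Gaussian noise
pairs = [(rng.normal(size=n), t) for t in (1.0, -2.0, 0.5)]
for _ in range(5):
    for key, target in pairs:
        train(key, target)

for key, target in pairs:
    print(round(recall(key), 3), "vs target", target)
```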
@NerdML 3 years ago
Can you just provide me the name of the book?
@hoaxuan7074 3 years ago
@@NerdML I am waiting to read that book too!!!! Who will write it? When will it be published? 😎
@NerdML 3 years ago
So you have already read this somewhere else... can you provide me that link? By the way, where are you from?
@hoaxuan7074 3 years ago
@@NerdML There is a blog page with some information on Fast Transform fixed filter bank neural nets. There does seem to be a foundational problem with neural net research, where the low-level 'mechanics' like the dot product have not been properly studied. So you have to find things out through your own effort.
@NerdML 3 years ago
Oh I see
@facelessguy2093 4 years ago
You are a genius, bro. Great video.
@NerdML 3 years ago
Thank you so much 😀
@babritbehera4087 4 years ago
Great work ....
@NerdML 4 years ago
Thanks man!!
@babritbehera4087 4 years ago
@@NerdML I do have a few steps, as a beginner, for getting into the coding part from scratch... to better understand the code flow as a starter. I prepared some very low-level programming for ML.
@NerdML 4 years ago
That's really great... you can reach out to me through rahulsainipusa@gmail.com
@babritbehera4087 4 years ago
@@NerdML sure...
@nikhilseth9001 4 years ago
Mathematics genius🤯