Machine Learning on FPGAs: Sigmoid Function and Exercises

3,730 views

Marco Winzker (Professor)

This video is part of the lecture series on implementing a small neural network on an FPGA. It explains the FPGA implementation of the sigmoid function as a ROM and gives ideas for exercises, so that you can experiment with the sigmoid implementation. A rough sketch of the ROM idea follows the links below.
Project Homepage: www.h-brs.de/fp...
Source Code available at: github.com/Mar...
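
For readers who want to try the exercises, here is a rough sketch of the sigmoid-as-ROM idea in Python. It is not the project's Octave or VHDL code; the table size, input range, and output word width are assumptions chosen only for illustration.

```python
import math

# Sketch: precompute the sigmoid as a lookup table, the way a ROM stores it.
# ADDR_BITS, OUT_BITS and X_RANGE are illustrative assumptions, not the
# values used in the lecture.
ADDR_BITS = 8          # ROM address width (assumed)
OUT_BITS = 8           # ROM output word width (assumed)
X_RANGE = 8.0          # addresses map to inputs in [-4.0, +4.0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

rom = []
for addr in range(2 ** ADDR_BITS):
    x = (addr - 2 ** (ADDR_BITS - 1)) * X_RANGE / 2 ** ADDR_BITS
    rom.append(round(sigmoid(x) * (2 ** OUT_BITS - 1)))   # quantize output

# In hardware, evaluating the sigmoid is then just one ROM read:
print(rom[200])        # stored value at address 200
```

A table like this can then be written into a VHDL constant or a memory initialization file.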

Comments: 9
@stevenzhang6585 · 4 years ago
It is a pleasure and honor to be a student of this Professor!
@Sadiqkhan-st4pt · 3 years ago
How are the floating-point values converted to fixed-point values? I know it is done by the Octave code, but can someone please explain the concept?
@Sadiqkhan-st4pt · 3 years ago
What is meant by 5 binary places of w1, w2, w3?
@marcowinzker3682 · 3 years ago
Please have a look at minute 5:40 in the video "Machine Learning on FPGAs: Circuit Architecture and FPGA Implementation", kzbin.info/www/bejne/h5jNkqqcZ5Wpbbs. The first floating-point coefficient is 0.90314. In binary, the digits in front of the binary point have values 1, 2, 4, 8, ... and the digits behind the binary point have values 1/2, 1/4, 1/8, ... We work with 5 digits behind the binary point, so we multiply by 2^5 = 32. In the video you see that 0.90314 multiplied by 32 gives 29. You calculate with this integer value and keep in mind that it contains a factor of 1/32.
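A minimal numeric sketch of this conversion in Python (the coefficient 0.90314 and the 5 binary places come from the video; the variable names are illustrative):

```python
# Fixed-point conversion with 5 binary places behind the binary point.
FRACTION_BITS = 5
SCALE = 2 ** FRACTION_BITS        # 32

w1_float = 0.90314                # coefficient from the video
w1_fixed = round(w1_float * SCALE)
print(w1_fixed)                   # 29, the integer used in the hardware

# 29 actually represents 29/32, so the stored value approximates 0.90314:
print(w1_fixed / SCALE)           # 0.90625
```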
@Sadiqkhan-st4pt · 3 years ago
@@marcowinzker3682 Thank you so much for your detailed response. I think I have to revise some basic concepts first. By the way, your videos are excellent; please make more in-depth tutorials on hardware accelerators for machine learning.
@Sadiqkhan-st4pt · 3 years ago
@@marcowinzker3682 If we increase the number of digits behind the binary point from 5 to, say, 7 digits, what will be the impact on the whole design? Since these values are handled as integers in the VHDL, I don't see what impact it would have on the hardware.
@marcowinzker3682 · 3 years ago
@@Sadiqkhan-st4pt Have a look at minute 3:15 of the video I mentioned earlier. If you increase the word width to 7 bits, the multiplier and adder will each have two more bits. After the adder you can again go to 14 bits, so the ROM can stay the same. It is a little more effort for a little more accuracy. Do not forget: at minute 8:15 the value sumAdress will have two more bits, so (17 downto 0), and the ROM gets value (17 downto 4).
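A rough Python sketch of the bit-slicing described here (the signal name sumAdress and the ranges (17 downto 0) and (17 downto 4) come from the video; the concrete input value is a made-up illustration):

```python
# With 7 fraction bits the weighted sum grows by two bits, giving an 18-bit
# value that corresponds to sumAdress(17 downto 0).
sum_address = 0b101101001011010110     # hypothetical 18-bit weighted sum

# The sigmoid ROM keeps its 14-bit address: it sees only the upper bits,
# sumAdress(17 downto 4), which is a right shift by 4.
rom_address = sum_address >> 4
assert 0 <= rom_address < 2 ** 14
print(rom_address)
```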
@hussamalmaqtry1419 · 4 years ago
Thanks
Related videos:
Machine Learning on FPGAs: Demonstration of HLS4ML Framework · 14:27 · Marco Winzker (Professor) · 12K views
Machine Learning on FPGAs: Circuit Architecture and FPGA Implementation · 10:59 · Marco Winzker (Professor) · 44K views
Activation Function | Neural Networks · 5:13 · First Principles of Computer Vision · 34K views
Complete Design-Flow for Video Lectures on Machine Learning with FPGAs · 34:24 · Marco Winzker (Professor) · 1.7K views
Deep Learning(CS7015): Lec 3.1 Sigmoid Neuron · 12:30 · NPTEL-NOC IITM · 74K views
Why do we use "e" in the Sigmoid? · 4:54 · ritvikmath · 10K views
Neural Networks on FPGA: Part 3: Activation Functions · 28:31 · Vipin Kizheppatt · 20K views
Machine Learning on FPGAs: Advanced VHDL Implementation · 13:52 · Marco Winzker (Professor) · 13K views
Machine Learning on FPGAs: Training the Neural Network · 6:59 · Marco Winzker (Professor) · 15K views
The Sigmoid Function Clearly Explained · 6:57 · Power H · 112K views
Sigmoid Neuron · 20:21 · IIT Madras - B.S. Degree Programme · 6K views