*🔥All Premium Courses Link of Engineering Funda🔥* docs.google.com/spreadsheets/d/1LeLxZPGiMB_ZDZggbZp3P7fK516pXYhVgZA__djNkWM/edit#gid=0
@EngineeringFunda2 жыл бұрын
⬇ *Premium Courses of Engineering Funda* ⬇
✅ *ARM Processor* - kzbin.info/aero/PLgwJf8NK-2e7nFEozQhZDZDSm09SwqbGP
✅ *Microprocessor 8085* - kzbin.info/aero/PLgwJf8NK-2e5vHwmowy_kGtjq9Ih0FzwN
✅ *Microprocessor 8086* - kzbin.info/aero/PLgwJf8NK-2e4oAeDid0hwuiol_RJdscrp
✅ *AVR Microcontroller* - kzbin.info/aero/PLgwJf8NK-2e55CdbY_WnY6pejPHoojCkJ
✅ *8051 Microcontroller* - kzbin.info/aero/PLgwJf8NK-2e49i6neo70aGtFLvKeZ3IQD
✅ *80386 & Pentium Processor* - kzbin.info/aero/PLgwJf8NK-2e7f4yPj6AbrUoburKwX0fFA
✅ *Embedded System* - kzbin.info/aero/PLgwJf8NK-2e5xvXygtghfi-tzyeACx7CO
✅ *VLSI* - kzbin.info/aero/PLgwJf8NK-2e6au9bX9P_bA3ywxqigCsaC
✅ *Digital Electronics* - kzbin.info/aero/PLgwJf8NK-2e7nYSG31YWEUfwgAp2uIOBY
✅ *Network Theory* - kzbin.info/aero/PLgwJf8NK-2e7AccPu8mUhhsJNol9uIKTJ
✅ *Control Engineering* - kzbin.info/aero/PLgwJf8NK-2e43et6qbo4IqYSJCv-6kN90
✅ *Electromagnetic Theory* - kzbin.info/aero/PLgwJf8NK-2e4I_YltJja47CwZJkzNWK89
✅ *Power Electronics* - kzbin.info/aero/PLgwJf8NK-2e5Hnu82T1CYLZ8kbZs4Jx8x
✅ *Basic Electronics* - kzbin.info/aero/PLgwJf8NK-2e5G05PTgyTTSVyzTOKRfmTn
✅ *Signal and System* - kzbin.info/aero/PLgwJf8NK-2e7VdLw7PebRTcZXb_4nKeVh
✅ *Optical Communication* - kzbin.info/aero/PLgwJf8NK-2e7CDIWsh61eItP9iRw1EIQc
✅ *Analog Communication* - kzbin.info/aero/PLgwJf8NK-2e7uyUYrpgUUQowmRuKxRdwp
✅ *Digital Communication* - kzbin.info/aero/PLgwJf8NK-2e5PngHbdEadEun5XPvnn00N
✅ *Antennas & Wave Propagation* - kzbin.info/aero/PLgwJf8NK-2e7tzLIDL4aXUbtRFY3ykmkT
✅ *Microwave Engineering* - kzbin.info/aero/PLgwJf8NK-2e6A4Mtxud6xPHE1UecxWsHW
✅ *RADAR Engineering* - kzbin.info/aero/PLgwJf8NK-2e4KmA52Jw3-JhDhFIDQZ9Bv
✅ *Audio Video System / TV* - kzbin.info/aero/PLgwJf8NK-2e7EJcPI0P_DMw49ufTYfuOz
✅ *Engineering Drawing / Graphics* - kzbin.info/aero/PLgwJf8NK-2e79xuABrIQeXYlGuuickEz7
✅ *Basic Mechanical Engineering* - kzbin.info/aero/PLgwJf8NK-2e7Fe4vAYDaL0bpseGNhc9on
✅ *Mechanics of Solid* - kzbin.info/aero/PLgwJf8NK-2e53xcLCS7ay2iLRolNxyxFk
✅ *Theory of Computation* - kzbin.info/aero/PLgwJf8NK-2e6GfXdwqWX5YmszV2KGv-yl
✅ *Java Programming* - kzbin.info/aero/PLgwJf8NK-2e5BeN1WTXg1ENPtkRR3SfCI
✅ *Python Programming* - kzbin.info/aero/PLgwJf8NK-2e5pY2eB-Lht2_CerQue0Xo4
✅ *Placement Test series on C* - kzbin.info/aero/PLgwJf8NK-2e5ovLgoJkv0Pn58UrucrTPt
✅ Please share it with your friends to support us. 👉 You can also support us by joining us: kzbin.info/door/dlnqMpRrMcClK2fT6z8EEwjoin
@sreenidhiranganayaki75264 жыл бұрын
Thank you sir, I hope your channel becomes famous and reaches 1 million subscribers.
@EngineeringFunda4 жыл бұрын
Your positive comments motivate me. Thanks and welcome 🙏
@manickam115 ай бұрын
"Degree of randomness" is the best-suited definition. Entropy becomes zero because of the coefficient. Good job. Nice lecture.
@abhishekjha_here3 жыл бұрын
Underrated channel. The best content on subjects from any engineering branch is available here, guided by our very own Hitesh Dholakiya Sir 🙏🏻🙏🏻🙏🏻 Thank you so much.
@EngineeringFunda3 жыл бұрын
Your positive comments motivate me. My goal is to create the largest community of engineers across the globe, so please help me by sharing this lecture series (playlist) with your friends on social media (WhatsApp, Telegram, etc.). Thanks and welcome 🙏
@abhishekjha_here3 жыл бұрын
@@EngineeringFunda definitely sir👍🏻
@davidfernandez4667 Жыл бұрын
Great work sir, you are the real hero
@EngineeringFunda Жыл бұрын
Your appreciation, care and shares matter a lot to me. #EngineeringLove. All the subject playlists of Engineering Funda are available in the comment section. Share them with your friends to support us. Your positive comments motivate me, and a person like me gets boosted by his students' feedback. Thanks and welcome 🙏
@yadhu5042Ай бұрын
THANK YOU, VERY HELPFUL
@EngineeringFundaАй бұрын
Your appreciation, care and shares matter a lot to me. #EngineeringLove. Let me give the details of a few exciting features of Engineering Funda: 1. A link to more than 30 courses is given in the comment section. 2. A detailed syllabus of the subject, along with its chapters, is given in the description of the videos.
@prateek6502-y4p5 жыл бұрын
Let me clarify something no KZbin professor will. So what are we waiting for? Let's enjoy the intuition part.

We know the expectation of a random variable X is E(X) = Σ x·P(X = x) over every value of x; it gives you the mean value of the random variable. What is a mean? Is it the same as an average? Say you flip a coin twice and I ask what the expectation (mean) of getting heads is; by this I want to know how many heads per experiment to expect. Assign the random variable as the number of heads in each of the 4 possible outcomes, apply the formula above, and you get E(X) = 1. The outcomes are HH, HT, TT, TH, with 2, 1, 0 and 1 heads respectively, so the value 1 is the mean number of heads for this particular experiment.

Now coming to entropy. Does entropy give the average of information? Not quite: it actually gives you a weighted sum of information. What is a weight? It is how likely a value is to occur; in the world of probability, a weight is simply a probability. You know the information of a symbol is inversely related to its probability, but why? Suppose that instead of tossing a coin twice you toss it just once: there are only 2 outcomes, head or tail, so the sample space has less randomness than tossing the coin multiple times, and information increases with randomness. As the number of outcomes increases, the probability of each outcome decreases, since the two are inversely proportional. So on the communication side, we write the information of each symbol s as log(1/p(s)). Now recall the formula E(log(1/p)) = Σ log(1/p(s))·p(s). Isn't that the expected (mean) value, i.e. the entropy? Yes it is: just substitute log(1/p(s)) for X in E(X), where p(s) is the probability of symbol s.

At last! Remember I told you entropy is just a weighted sum of information? Look at the formula for entropy: the information of each symbol is first multiplied by its weight (probability) and then added to the other weighted information terms. So, to conclude my biggest comment ever (more like a lecture): go for the intuition, don't just memorise stuff. At least try; then study will be fun. ;))
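The intuition in the comment above can be checked numerically. A minimal sketch in Python (the two-coin example follows the comment; the function name and probabilities are illustrative, not from the lecture):

```python
import math

def entropy(probs):
    """Shannon entropy H = sum(p * log2(1/p)), skipping zero-probability symbols."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

# Two tosses of a fair coin: outcomes HH, HT, TH, TT, each with probability 1/4.
probs = [0.25, 0.25, 0.25, 0.25]
print(entropy(probs))  # 2.0 bits: two fair coin flips carry 2 bits of information

# Expected number of heads E(X) = sum(x * P(X = x)), with X = number of heads
# taking values 2, 1, 0 with probabilities 1/4, 1/2, 1/4.
e_heads = 2 * 0.25 + 1 * 0.5 + 0 * 0.25
print(e_heads)  # 1.0
```

Substituting log2(1/p(s)) for x in the expectation formula gives exactly the entropy computed by `entropy()` above.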
@bhaveshamarsingh16565 жыл бұрын
Thank you mate. :-)
@nusratjahan27313 жыл бұрын
The way you described it was so easy to understand, sir. Thanks sir.
@EngineeringFunda3 жыл бұрын
Your positive comments motivate me. Teachers like me just want positive comments from students. Love from you guys means a lot to me. My goal is to create the largest community of engineers across the globe. Please help me by sharing this playlist with your friends.
@Yashwanth5 жыл бұрын
Thank you sir, a great channel.
@pavankumar-ml7yl3 жыл бұрын
Thank you, that was very helpful.
@EngineeringFunda3 жыл бұрын
Your positive comments motivate me. My goal is to create the largest community of engineers across the globe, so please help me by sharing this lecture series (playlist) with your friends on social media (WhatsApp, Telegram, etc.). Thanks and welcome 🙏
@Serafhunter2 жыл бұрын
Thanks for the great lecture. Please provide feedback on the following points. 1. At 3:14, when P = 0, we get 0 · (log(1) − log(0)). Since log(0) is undefined, won't this make the expression indeterminate instead of 0? Or am I missing something here? 2. At 1:05, instead of I1 + I2 + … + In, we need (I1 · P1 · n) + (I2 · P2 · n) + … + (In · Pn · n). The n cancels with the denominator and we get H = Σ pi log(1/pi).
@theflamecoreguy79297 ай бұрын
true
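On point 1 of the question above: the standard resolution is to take the limit p → 0⁺, where p·log(1/p) → 0, which is why the entropy formula adopts the convention that a zero-probability symbol contributes zero. A quick numeric check in Python (the helper name is illustrative):

```python
import math

def term(p):
    """One entropy term, p * log2(1/p); its p -> 0+ limit is 0."""
    return p * math.log2(1 / p)

# The term shrinks toward 0 as p approaches 0 from above, even though
# log2(1/p) itself blows up: the linear factor p wins the race.
for p in (0.1, 0.01, 1e-6, 1e-12):
    print(p, term(p))
```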
@shrutikhandare3354 Жыл бұрын
Nice explanation sir, your voice is the same as my teacher's 😄
@EngineeringFunda Жыл бұрын
You're welcome dear. All our courses are available in the comment section; they may be helpful to you in the future.
@MounikaReddy-f9j Жыл бұрын
Sir, could you please share the topic "properties of entropy", specifically the 3rd property (the function property)? Please share, sir.
@jatothkishore3 жыл бұрын
Nice explanation sir
@EngineeringFunda3 жыл бұрын
Your positive comments motivate me. Thanks and welcome 🙏
@opbnl74832 жыл бұрын
Well, thanks for teaching us 🙂
@EngineeringFunda2 жыл бұрын
Your positive comments motivate me. Teachers like me just want positive comments from students. Love from you guys means a lot to me. My goal is to create the largest community of engineers across the globe. Please help me by sharing this playlist with your friends.
@javid6594 жыл бұрын
very nice
@EngineeringFunda4 жыл бұрын
Your positive comments motivate me. Thanks and welcome 🙏
@kodieswaria5286 жыл бұрын
Nice video sir
@vaddiharinatha20935 жыл бұрын
Thank you so much for your work.
@virajadesai4462 Жыл бұрын
It was good, but I suggest you summarize all the points from the beginning.
@EngineeringFunda Жыл бұрын
Noted ✅
@kalpanaverma12692 жыл бұрын
Thank you ! 🙌
@EngineeringFunda2 жыл бұрын
Your positive comments motivate me. Teachers like me just want positive comments from students. Love from you guys means a lot to me. My goal is to create the largest community of engineers across the globe. Please help me by sharing this playlist with your friends.
@thisissharief76515 жыл бұрын
nice bro
@ajithsdevadiga16035 жыл бұрын
3rd property: entropy is a symmetric function of the probabilities.
@kabandajamilu90363 жыл бұрын
Thanks
@EngineeringFunda3 жыл бұрын
Your positive comments motivate me. My goal is to create the largest community of engineers across the globe, so please help me by sharing this lecture series (playlist) with your friends on social media (WhatsApp, Telegram, etc.). Thanks and welcome 🙏
@mamtakundal96176 жыл бұрын
Thank you sir
@Cloudbox-NaughtyBoi4 жыл бұрын
Love you girl ❤️
@shreyatiwari59343 жыл бұрын
Are there any other properties, or just these two? For a 10-mark question this is not enough, I guess.
@EngineeringFunda3 жыл бұрын
It is sufficient dear
@shreyatiwari59343 жыл бұрын
@@EngineeringFunda Thank you sir
@mayanksagar6336 жыл бұрын
THANK YOU SO MUCH SIR
@aniketrajapure82512 жыл бұрын
Sir, is the negative sign missing in the formula?
@0sumitujjwal05 жыл бұрын
Entropy is average information per symbol
@EngineeringFunda5 жыл бұрын
Correct
@prateek6502-y4p5 жыл бұрын
Not exactly; more precisely, entropy is the probability-weighted sum of the information of each symbol.
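The two statements in this thread actually agree: since the weights (probabilities) sum to 1, the probability-weighted sum of per-symbol information is the average information per symbol. A small sketch in Python (the 3-symbol source and its probabilities are assumptions for illustration):

```python
import math

# Hypothetical 3-symbol source (probabilities chosen for illustration).
probs = {"a": 0.5, "b": 0.25, "c": 0.25}

# Entropy as the probability-weighted sum of per-symbol information log2(1/p).
H = sum(p * math.log2(1 / p) for p in probs.values())

# In a long message of N symbols, symbol s appears about N*p(s) times, so the
# total information is sum(N * p * log2(1/p)); dividing by N gives the
# information *per symbol*, which is H again.
N = 1_000_000
per_symbol = sum(N * p * math.log2(1 / p) for p in probs.values()) / N
print(H, per_symbol)  # both 1.5 bits
```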
@abhisheksingh-hm1gk5 жыл бұрын
Sir, but what is a symbol?
@mrroy78142 жыл бұрын
Sir, you didn't cover all the entropy topics. Please make videos on conditional, relative, and differential entropy, etc.
@subramanyak61873 жыл бұрын
I am from one of the top colleges in Bangalore, sir. We don't have any teacher like you. A point worth mentioning for all Karnataka students: sir follows the VTU syllabus.
@atharvakale56256 жыл бұрын
Please improve the audio quality; there were random noises in between.