Convolution Intuition

38,741 views

Dr Peyam

4 years ago

In this video, I provide some intuition behind the concept of convolution, and show how the convolution of two functions is really the continuous analog of polynomial multiplication. Enjoy!
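To make the analogy concrete before the comments: here is a minimal sketch (my own illustration, not from the video) showing that multiplying two polynomials is the same as discretely convolving their coefficient lists. The helper function and the example polynomials are arbitrary choices, with coefficients stored lowest degree first.

```python
# Multiplying polynomials = convolving their coefficient lists.
# Coefficients are stored lowest degree first: [1, 2, 3] means 1 + 2x + 3x^2.

def convolve(a, b):
    """Discrete convolution: c[n] = sum over k of a[k] * b[n - k]."""
    c = [0] * (len(a) + len(b) - 1)
    for k, ak in enumerate(a):
        for j, bj in enumerate(b):
            c[k + j] += ak * bj          # terms of degree k and j contribute to degree k + j
    return c

p = [1, 2, 3]                            # 1 + 2x + 3x^2
q = [4, 5]                               # 4 + 5x
print(convolve(p, q))                    # [4, 13, 22, 15]  ->  4 + 13x + 22x^2 + 15x^3
```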

Comments: 145
@Keithfert490 4 years ago
Idk if this helped with my intuition, but it did kinda blow my mind.
@influential7693 2 years ago
"The result is not important; what's important is the process." - Dr. Peyam. Sir, you are extremely motivational to me.
@quantphobia2944 4 years ago
OMG, this is the simplest explanation of convolution I've ever come across, thank you so much!!!
@dougr.2398 4 years ago
General comment: Convolution can be thought of as a measure of self-similarity. The more self-similarity between and within the two functions, the larger the convolution integral's value. (There is the group theory connection.)
@drpeyam 4 years ago
Interesting!
@dougr.2398 4 years ago
Dr Peyam yes! That is why & how it applies to biology and music theory!!
@blackpenredpen 4 years ago
Wow! I didn't know about that! Very very cool!
@chobes1827 4 years ago
This intuition makes a lot of sense because for each x the convolution is essentially just an inner product between one function and another function that has been reflected and shifted.
@dougr.2398 4 years ago
Chobes 1827, so if the function and its shifted values are strongly correlated (or even equal), the convolution integral approaches the integral of the square of the function. The more dissimilar the shifted and non-shifted values are, the more the integral can be greater or lesser than the integral of the square.
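To see the "reflected and shifted inner product" reading of this thread in symbols, here is a minimal sketch (the arrays are arbitrary examples, not taken from the discussion): each output value of a discrete convolution is a dot product in which the two indices always sum to the output index, which is exactly what flipping and sliding one sequence achieves.

```python
import numpy as np

# (f*g)(x) = sum_k f[k] * g[x - k]: a dot product between f and a reflected,
# shifted copy of g. The arrays are arbitrary illustrative examples.
f = np.array([1.0, 2.0, 3.0])
g = np.array([0.5, 1.0, 0.25])
full = np.convolve(f, g)                 # output indices x = 0 .. len(f) + len(g) - 2

def conv_at(f, g, x):
    """Convolution value at index x, written as an explicit flip-and-slide dot product."""
    total = 0.0
    for k in range(len(f)):
        j = x - k                        # the two indices always satisfy k + j = x
        if 0 <= j < len(g):
            total += f[k] * g[j]
    return total

print(all(np.isclose(conv_at(f, g, x), full[x]) for x in range(len(full))))   # True
```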
@bballfancalmd2583 4 years ago
Dear Dr. Peyam, THANK YOU!! In engineering we're taught how to use convolution, but never learn where the hell it comes from. Your explanations are like a brain massage 💆‍♂️. Thank you, thank you! You know an explanation is good when it not only answers a question I hadn't even thought of, but also opens my mind to other ways of thinking about math. So much fun! Danke!!
@arteks2001 2 years ago
I loved this interpretation. Thank you, Dr. Peyam.
@MrCigarro50 4 years ago
Thanks for this video. For us statisticians, this is a very important result, for it is related to finding the distribution of the sum of two random variables. So, in general, I wish to express our appreciation for your efforts.
@stevenschilizzi4104 2 years ago
Brilliant explanation! Brilliant - makes it look so natural and so simple. Thanks heaps. I had been really curious about where it came from.
@area51xi 1 year ago
This arguably might be the most important video on youtube. I wanted to cry at the end from an epiphany.
@drpeyam 1 year ago
Thanks so much 🥹🥹
@yhamainjohn4157 4 years ago
One word in my mouth: Great! Bravo!
@ibrahinmenriquez3108 4 years ago
I can surely say that I am continuously happy to see you explaining these ideas. Thanks
@user-ed1tg9rj1e 4 years ago
Great video!!! It really helps build the intuition for convolution!
@gastonsolaril.237 4 years ago
Damn, this is amazing, brother. Though I'll need to watch this video like 2 or 3 more times to connect the dots. Keep up the good work! Really, you are one of the most interesting and useful KZbin channels I've been subscribed to.
@erayser 4 years ago
Thanks for the explanation! Convolution is quite intuitive to me now.
@apoorvvyas52 4 years ago
Great intuition. Please do more such videos.
@mnada72 3 years ago
That clarified convolution once and for all 💯💯
@bipuldas2060 3 years ago
Thank you. Finally understood the intuition behind this pop operation called convolution.
@dvixdvi7507 2 years ago
Awesome stuff - thank you for the clear explanation
@prettymuchanobody6562 1 year ago
I love your attitude, sir! I'm motivated just hearing you speak, let alone how well you explain the subject.
@klam77 4 years ago
very enjoyable! good stuff!
@mattetor6726 3 years ago
Thank you! The students you teach are very lucky :) And we are lucky to be able to watch your videos
@sciencewithali4916 4 years ago
Thank you so much for the baby-step explanation! It became completely intuitive thanks to the way you've presented it! We want more of this awesome content.
@Debg91 4 years ago
Very neat explanation, thanks! 🤗
@corydiehl764 4 years ago
Okay, I really am seeing what you did there, but I feel like what makes this really suggestive to me is looking at each power of x as a basis function. Wooooow, this is so much more abstract and interesting compared to the way I usually look at it, as a moving inner product.
@sheshankjoshi 6 months ago
This is wonderful. It does really make sense.
@kamirtharaj6801 4 years ago
Thanks man... finally understood why we need the convolution theorem.
@camilosuarez9724 3 years ago
Just beautiful! Thanks a lot!
@ronaktiwari7041 3 years ago
Subscribed! It was wonderful!
@vineetkotian5163 3 years ago
I wasn't really understanding convolution...just had a broad idea of it.... this video made my mind click😎🔥.. insane stuff
@ShubhayanKabir 4 years ago
You had me at "thanks for watching" 😍🤗
@gf4913 4 years ago
This was very useful, thank you so much
@monsieur_piyushsingh 1 year ago
You are so good!!!
@chuefroxz9408 4 years ago
Very helpful, sir! Thank you very much!
@LuisBorja1981 4 years ago
Dirty puns aside, really nice analogy. Never thought of it that way. As always, brilliant work.
@maestro_100 2 years ago
Wow!, Thank You Very Much Sir....This Is A Very Nice Point Of View!!!
@lambdamax 4 years ago
Hey Dr. Peyam. I had this issue in undergrad too! Thank you for the video. Out of curiosity, for convolutional neural networks, whenever they talk about the "window" in convolving images, would the "window" be analogous to getting the coefficient of a particular degree in this example?
@skkudj 4 years ago
Thanks for the good video - from Korea
@visualgebra 4 years ago
More interesting dear Professor
@corydiehl764 4 years ago
Now I'm really curious if this interpretation could be used to give a more intuitive interpretation of volterra series analysis. Which is my favorite analysis technique that I learned in electrical engineering
@DHAVALPATEL-bp6hv 3 years ago
Convolution is, for most mortals, a mathematical nightmare and absolutely non-intuitive. But this explanation makes it more obvious. So thumbs up!!!
@coolfreaks68 9 days ago
Convolution is the integral of *f(τ)·g(t−τ) dτ*. Here *(τ, τ+dτ)* is an infinitesimally small time interval over which we assume the values of *f(τ)* and *g(t−τ)* remain constant. *f(τ)* is the evolution of f(t) up to the time instant *τ*. *g(t−τ)* is the version of g(t) which came into existence at the time instant *τ*.
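As a rough numerical companion to the comment above (my own sketch, with arbitrarily chosen example signals), the integral of f(τ)·g(t−τ) dτ can be approximated by exactly the Riemann sum it suggests:

```python
import numpy as np

# Approximate (f*g)(t) = integral of f(tau) * g(t - tau) d tau by a Riemann sum.
# f and g below are arbitrary example signals, not from the video.
def f(tau):
    return np.where((tau >= 0) & (tau <= 1), 1.0, 0.0)    # unit box on [0, 1]

def g(tau):
    return np.exp(-tau) * (tau >= 0)                       # decaying exponential

def conv(t, dtau=1e-3, T=10.0):
    tau = np.arange(0.0, T, dtau)
    return np.sum(f(tau) * g(t - tau)) * dtau              # sum of f(tau) g(t - tau) dtau

# Closed form for these choices: 1 - e^(-t) for 0 <= t <= 1, (e - 1) e^(-t) for t > 1.
print(conv(0.5), 1 - np.exp(-0.5))
print(conv(2.0), (np.e - 1) * np.exp(-2.0))
```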
@user-mz6hc5cv8x 4 years ago
Thanks for the video, Dr Peyam! Can you show the Fourier and Laplace transforms of a convolution?
@polacy_w_strefie_komfortu 4 years ago
Very interesting. I wonder if we can draw other intuitions from polynomial functions and transfer them to general analytic functions. Anyway, an analytic function can be approximated locally by its Taylor series. But in this case the analogy seems to work not only locally but also over the whole range.
@sukursukur3617 3 years ago
:)
@dgrandlapinblanc 4 years ago
Thank you very much.
@blackpenredpen 4 years ago
So who is convolution? I still don’t get it.
@drpeyam 4 years ago
Oh, it’s just the integral of f(y) g(x-y) dy, a neat way of multiplying functions
@blackpenredpen 4 years ago
Dr Peyam, lol, I know. Notice I asked "who", since I remember my student asked me who convolution is, because it's taught after Laplace.
@dougr.2398 4 years ago
Self-similarity..... see my other postings in the comments, please!!
@blackpenredpen 4 years ago
I just did! Thank you! That is so cool!
@danialmoghaddam8698 1 year ago
Thank you so much, the best one I've found.
@DHAVALPATEL-bp6hv 3 years ago
Awesome !!!!
@alexdelarge1508 1 year ago
Sir, with your explanation, what was an esoteric formula now has a real picture behind it. Thank you very much!
@ranam 1 year ago
Brother, I know mechanical engineers can find resonance, but I had a deeper thought on it. Resonance is a slow accumulation of energy, built up in small installments, when frequencies match: strike a 50 Hz tuning fork and another 50 Hz tuning fork nearby starts vibrating at the same frequency. Striking harder only changes the amplitude; loudness is the human factor, the frequency stays the same. The harmonic is the fundamental frequency and the overtones are the frequencies that follow it. Since convolution scales, shifts, and stacks signals, it can be used to model resonance. When your ear drum vibrates, electrical signals are carried to the brain, like a tuning fork; the ear drum vibrates within the audible spectrum of 20 Hz to 20,000 Hz, so the words we speak resonate it and we make sense of them. The idea of resonance is "small effort but large destruction", like the Tacoma bridge, where the wind slowly accumulated energy until the bridge collapsed. So my question: why doesn't continuous exposure to speech, within the audible range and not loud (under 80 dB), slowly accumulate stress and energy in the ear drum and fracture or rupture it like a wine glass? (Assume the person is in a coma or brain dead, so when the sound irritates them they cannot move away.) I tried something related with my students: when they would not be quiet, I set my phone to a 14,000 Hz tone and they said it was irritating. So, finally: when the ear drum oscillates and we model the response with convolution, does the ear respond to the harmonics or to the overtones, and which of the two is the "trouble maker" that could be destructive with minimal effort? Sorry for the long story; if you were able to reach here you must be as curious as me 🙏🙏🙏🙏
@bat_man1138 3 years ago
Nice bro
@maxsch.6555 4 years ago
Thanks :)
@Handelsbilanzdefizit 4 years ago
2:35 You should handle less coefficients and more coffeeicents ^^
@linushs 3 years ago
Thank you
@leonardromano1491 4 years ago
That's cool and gives a quite natural vector product for vectors in R^n: (u*v)_i = Sum_{k=0}^{i} u_k v_(i-k).
@drpeyam 4 years ago
Coool!!!
@blurb8397 4 years ago
Hey Dr Peyam, can we perhaps see a more rigorous definition of what you mean by “continuous polynomials”, how functions can be described in terms of them, and how that leads to the convolution? I would also love to see how this connects to the view of convolution in terms of linear functionals, as Physics Videos By Eugene made an extensive video on that which at least I didn’t really understand... Anyhow, thanks a lot for this!
@drpeyam 4 years ago
There is no rigorous definition of continuous polynomials, they don’t exist
@blurb8397 4 years ago
@@drpeyam Couldn't we define them as an integral average? Like the definite integral from zero to n of a(t)·x^t dt, all of that divided by n to "cancel out" the "dt" part, if we look at it from a dimensional-analysis perspective like in physics.
@jaikumar848 4 years ago
Thanks a lot, Dr Peyam! Convolution is a really confusing topic for me... I would like to ask: is convolution useful for mathematicians? It is part of digital signal processing, as far as I know.
@drpeyam 4 years ago
So many applications! To get the distribution of the random variable X+Y, to solve Poisson’s equation, etc.
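As a small illustration of the first application (my own example, assuming two fair six-sided dice, not something worked out in the video): the distribution of a sum of independent random variables is the convolution of their distributions.

```python
import numpy as np

# Distribution of X + Y for independent X, Y = convolution of their distributions.
die = np.full(6, 1 / 6)                   # P(X = 1), ..., P(X = 6)
sum_dist = np.convolve(die, die)          # P(X + Y = 2), ..., P(X + Y = 12)

for total, p in zip(range(2, 13), sum_dist):
    print(total, round(p, 4))             # peaks at 7 with probability 6/36
```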
@klam77 4 years ago
@@drpeyam Here's "the" classic video on convolution from the engineering school perspective: kzbin.info/www/bejne/lafcnJhpq6tnhM0 - you will have to forgive the "cool 70s disco" look of the professor; it was indeed the 70s, so he looks the part (but Prof Oppenheim is/was the guru on signals and systems theory). This is immensely useful math. Immensely.
@sandorszabo2470 4 years ago
@@klam77 I agree with you. The "real" intuition of convolution comes from signals and systems, the discrete case.
@klam77 4 years ago
@@sandorszabo2470 Hello. But Prof Peyam's view is nearly the same: where he talks of convolution in terms of multiplying two polynomials, Prof Oppenheim talks about "linear time invariant" systems which produce polynomial sums as "outputs" of multiple inputs in the LTI context! Almost similar! But yes, the original intuition was from the engineering department side, historically.
@patryk_49 4 years ago
Wikipedia says the symbol from your thumbnail means something called "cross-correlation", and it's similar to convolution. I hope somewhere in the future you will make a video about that.
@Aaron-zi1hw 9 months ago
love you sir
@Handelsbilanzdefizit 4 years ago
When I transform a function f(x) into an infinite series, and also the function g(x), and then I create a convolution of these two power series in your discrete way, with sigma and indices: is the resulting series the same as if I transform the continuous version (f*g)(x) into a series?
@corydiehl764 4 years ago
That was my realization from the video too. Now that I think about it, I think that's the result of multiplying Taylor series, both centered about a point a.
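A small sanity check along these lines (my own sketch, using e^x as the example): convolving the Taylor coefficients of e^x with themselves is the Cauchy product, and it reproduces the Taylor coefficients of e^(2x).

```python
from math import factorial

# Taylor coefficients of exp(x): a_k = 1/k!.
N = 8
a = [1 / factorial(k) for k in range(N)]

# Cauchy product = discrete convolution of the coefficient sequences.
c = [sum(a[k] * a[n - k] for k in range(n + 1)) for n in range(N)]

# Taylor coefficients of exp(2x): 2^n / n!.
expected = [2**n / factorial(n) for n in range(N)]
print(all(abs(x - y) < 1e-12 for x, y in zip(c, expected)))   # True
```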
@krzysztoflesniak2674 2 years ago
Remark 1: This one is pretty nice: kzbin.info/www/bejne/h57GoIOPisuVgJo ["What is convolution? This is the easiest way to understand" by Discretised]. It is in terms of integration of processes with fading intensity, but it is amenable to economic interpretation as well.
Remark 2: This multiplication by gathering indices that sum up to a constant is crucial for the Cauchy product of two infinite series, instead of polynomials (Mertens' theorem).
Remark 3: This convolution is with respect to time. In image manipulation the convolution is with respect to space (a kind of weighted averaging over pixels). That "spatial convolution" in the continuous case leads to an integral transform. One of the functions under convolution is then called a kernel.
Just loose thoughts.
@secretstormborn 4 years ago
amazing
@omerrasimknacstudent5049 1 year ago
I understand that convolution is analogous to the multiplication of two polynomials. The intuition here is to express any signal f in terms of its impulses, just like the coefficients of a polynomial. It makes sense, thanks. But I still do not understand why we convolve a signal when it is filtered. We may multiply the signal with the filter point-wise.
@cactuslover2548 2 years ago
My mind went boom after this
@adambostanov4822 1 year ago
So what is the result of the convolution of those two polynomials?
@BootesVoidPointer 2 years ago
What is the intuition behind the differential dy appearing as we transition to the continuous case?
@krzysztoflesniak2674 2 years ago
It tells you to integrate with respect to y and keep x fixed (the resulting function is a function of x). Integration with respect to y is a continuous analog of summation over the index (also called y at the end of the presentation, to highlight the jump from the discrete to the continuous case).
@dougr.2398 4 years ago
My profs at The Cooper Union, 1967-1971, liked to say the variable integrated over is "integrated out"... which I hold is not accurate, as it has only vanished in appearance: the functions evaluated at each point of the "integrated out" variable contribute to the sum, as do the endpoints. The variable only vanishes explicitly. By the way, Dr. Tabrizian, what is the "f hat" you refer to in the Fourier transform description of the convolution? Please explain.
@drpeyam 4 years ago
Fourier transform
@dougr.2398 4 years ago
Dr Peyam thanks!
@Mau365PP 4 years ago
7:13 what do you mean by *f* and *g* as *"continuous polynomials"*?
@drpeyam 4 years ago
Think of a polynomial as an expression of the form sum a_n y^n and what I mean is an expression of the form sum a_x y^x where x ranges over the reals
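For anyone who wants that heuristic spelled out (my paraphrase of the video's argument, written loosely; the "continuous polynomial" is a formal device, not a rigorous object):

```latex
% Discrete case: p(y) = \sum_k a_k y^k and q(y) = \sum_k b_k y^k,
% and the coefficient of y^n in p(y) q(y) is \sum_k a_k b_{n-k}.
%
% Heuristic continuous analog (purely formal):
\[
  f(y) \;\text{``=''}\; \int a(s)\, y^{s}\, ds, \qquad
  g(y) \;\text{``=''}\; \int b(t)\, y^{t}\, dt,
\]
\[
  f(y)\, g(y) \;\text{``=''}\; \iint a(s)\, b(t)\, y^{s+t}\, ds\, dt .
\]
% Gathering all pairs with s + t = x (substitute t = x - s), the "coefficient"
% of y^x is
\[
  (a * b)(x) = \int a(s)\, b(x - s)\, ds ,
\]
% which is exactly the convolution integral.
```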
@prasadjayanti 2 years ago
It made sense to me in some way... I still want to know the advantage of "reflecting" and "shifting" a function and then multiplying it with another function. If we do not "reflect", then what? Shifting I can understand: we have to keep moving the window everywhere.
@ventriloquistmagician4735 3 years ago
brilliant
@Brono25 3 years ago
I could never find an explanation of why (graphically) you have to reflect one function, multiply both, and integrate. I see it's to keep the indices always summing to the same value?
@jonasdaverio9369 4 years ago
Is it called the convolution because it is convoluted?
@drpeyam 4 years ago
Hahaha, probably! But I’m thinking more of “interlacing” values of f and g
@dougr.2398 4 years ago
Convolution is a term that really might better be described as “self-similarity”. It even has application to music theory! (THERE is the Group Theory connection!!! And even Biology!!!)
@amirabbas_mehrdad 3 years ago
It was amazing, but at the moment you replaced the coefficients with the function itself, I didn't really understand how you did it. Is there anyone who can make it clear for me? Thanks.
@aneeshsrinivas9088 1 year ago
Do alternate notations for convolution exist? I hate that notation for convolution since I love using * to mean multiplication and do so quite frequently.
@drpeyam 1 year ago
I love *
@wuxi8773 3 years ago
This is math, simple and everything has to make sense.
@allyourcode 2 years ago
I feel that this definitely helped me. Not really sure why you began discussing the continuous convolution though; the whole polynomial discussion is perfectly applicable in the context of discrete convolution. Anyway, for whatever reason, motivating with polynomial multiplication somehow did it for me. Thanks!

I'm also finding it helpful in higher dimensions to think in terms of multiplying polynomials (the number of variables = the number of dimensions): to find the coefficient for x_1^n_1 * x_2^n_2, you multiply coefficients of the input polynomials where the exponents add up to n_1 and n_2. This kind of explains why you need to flip the "kernel" (in all dimensions) when you think of convolution as a "sliding dot product": when you flip the kernel, the coefficients that you need to multiply "pair up" (so that the exponents add up to n_i).

Also, I really like your sanity check: the two arguments MUST sum to x! Sounds gimmicky, but I'm pretty sure that will help me to remember.
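To illustrate the two views in the comment above (my own sketch; the arrays and helper names are arbitrary): a 2D convolution can be computed either by letting index pairs add up like exponents of a two-variable polynomial, or by sliding the flipped kernel over a zero-padded input and taking dot products. Both give the same result.

```python
import numpy as np

# View 1: like multiplying two-variable polynomials -- the output entry at (p, q)
# collects all products a[i, j] * k[m, n] with (i + m, j + n) = (p, q).
def conv2d(a, k):
    H, W = a.shape
    h, w = k.shape
    out = np.zeros((H + h - 1, W + w - 1))
    for i in range(H):
        for j in range(W):
            out[i:i + h, j:j + w] += a[i, j] * k        # index pairs add up, like exponents
    return out

# View 2: slide the *flipped* kernel over a zero-padded input, taking dot products.
def conv2d_sliding(a, k):
    h, w = k.shape
    kf = k[::-1, ::-1]                                  # flip in both dimensions
    ap = np.pad(a, ((h - 1, h - 1), (w - 1, w - 1)))    # zero-pad so the kernel can hang over
    out = np.empty((a.shape[0] + h - 1, a.shape[1] + w - 1))
    for p in range(out.shape[0]):
        for q in range(out.shape[1]):
            out[p, q] = np.sum(ap[p:p + h, q:q + w] * kf)
    return out

a = np.array([[1.0, 2.0], [3.0, 4.0]])
k = np.array([[0.0, 1.0], [2.0, 0.5]])
print(np.allclose(conv2d(a, k), conv2d_sliding(a, k)))  # True
```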
@Muteen.N 2 years ago
Wow
@matthewpilling9494 4 years ago
I like how you say "Fourier"
@drpeyam 4 years ago
It’s the French pronunciation:)
@timothyaugustine7093 3 years ago
Fouyay
@poutineausyropderable7108 4 years ago
Does this mean if you convolve a function with 1 you get a Taylor series?
@poutineausyropderable7108 4 years ago
That means you could get the Taylor series of sin^2(x); that would be useful in solving differential equations by solving for a Taylor series. You could also continue the values of sin x out to the infinities.
@poutineausyropderable7108 4 years ago
Oh, so I finally understood. f and g aren't time functions; they are the formulas for the elements of the Taylor series. sin x isn't f; f is i^(k-1)·(1/k!)·(k mod 2).
@luchisevera1808 4 years ago
My professor 7 years ago showed this by sliding a triangle into a rectangle until everything became convoluted
@mustafaadel8194 3 years ago
Actually you showed us the similarity between the two formulas; however, I didn't understand convolution from that similarity 😥
@mrflibble5717 2 years ago
I like your videos, but the whiteboard writing is not clear. It would be worthwhile to fix that because the content and presentation are good!
@austinfritzke9305 3 years ago
Was watching this at 1.5x and laughed out loud
@luisgarabito8805 11 months ago
Huh? 🤔 interesting.
@krishnamishra8598 3 years ago
Convolution in one word???? Please answer!!!
@yashovardhandubey5252 4 years ago
It's hard to believe that you can take time out of your schedule to answer KZbin comments.
@drpeyam 4 years ago
Thank you! :)
@SIVAPERUMAL-bl6qv 4 years ago
Why is convolution used?
@forgetfulfunctor2986 4 years ago
convolution is just multiplication in the group algebra!
@LemoUtan 4 years ago
Just what I was thinking! I only recently started reading up on group modules and thus getting my jaw slowly pulled down whilst watching this
@dougr.2398 4 years ago
Forgetful Functor, please explain or at least partially illuminate the group theory connection?
@LemoUtan 4 years ago
@@dougr.2398 If I may, this may help (straight to the examples in the wikipedia article about group rings): en.wikipedia.org/wiki/Group_ring#Examples
@vineetkotian5163 3 years ago
Sir, I can't seem to practice this subject the right way... I'm worried the question might get twisted in the exam and my brain will freeze.
@fedefex1 4 years ago
How can I write a continuous polynomial?
@drpeyam 4 years ago
With convolution :)
@dougr.2398 4 years ago
Dr Peyam what a convoluted reply!!! :)
@patryk_49 4 years ago
I think, analogously to a normal polynomial: P(x) = ∫ a(t)·x^t dt
@dougr.2398 4 years ago
Patryk49, what is a normal polynomial? Is there a correspondence to a normal subgroup?
@dougr.2398 4 years ago
Here’s one answer: mathworld.wolfram.com/NormalPolynomial.html
@dougr.2398 4 years ago
Vous avez un bon accent Français!
@drpeyam 4 years ago
Merci!
@f3ynman44 3 years ago
a_k·b_(x-k) looked like a Cauchy product. Is this a coincidence?
@gosuf7d762 4 years ago
If you replace x with e^(iθ) you see the convolution theorem.
@zhanggu2008 3 years ago
This is good. But it feels like a start, and the goal of a convolution is not explained. Why do so? Why use polynomial coefficients?
@dougr.2398 4 years ago
You were right to both hesitate and then ignore the possibility that you had misspelled “coefficients”. English is difficult because it is FULL of irregularities.... this is one instance of a violation of the rhyme “I before E (edited 12-12-2023) except after C or when sounding like “Eh” (“long” A) as in Neighbor and Weigh”. Had you bothered to worry about that during the lecture, it would have impeded progress and the continuity (smile) of the discussion.
@elmoreglidingclub3030 1 year ago
I do not take drugs. Never have. But now I feel like I’m on drugs. What’s the point of all this??