Hey all. Just a few clarifications I'd like to make in response to comments I've seen. It seems I've had to do this a lot lately, huh? Nothing gets past you guys :)

7:04 - Many have taken issue with the piecewise function g(x) being a satisfying example of a function that fails to equal its Taylor series because it's a "gluing" of two completely different functions, so it's natural to expect its Taylor series to behave incorrectly at the join. But the critical trait with this particular piecewise function that makes it different from most others is that the join is truly "seamless": despite being a "gluing" of two "different" functions, it's perfectly smooth (i.e. has derivatives of all orders) at the join point, which is not usually true of most piecewise functions you could construct. Because certainly if a function fails to be smooth at a point, its Taylor series will break there.

0:05 - I was aware that many calculators do not actually employ Taylor series directly to compute sin, cos, and e^x. My intent there was just to make it as clear as possible that I'm talking about computing these functions at a truly arbitrary input (like a calculator can). In my defense, that's why I used the word "might" in that line, but I was probably asking that word to do too much work, so if I could go back, I'd rewrite that line. Apologies for any confusion!
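For readers who want to poke at the 7:04 point numerically, here is a small sketch (assuming the g(x) in question is the usual textbook gluing, exp(-1/x²) for x > 0 and 0 otherwise; the finite-difference check is illustrative, not a proof):

```python
import math

def g(x):
    # The "seamless" glued function: identically 0 for x <= 0,
    # exp(-1/x^2) for x > 0. Smooth everywhere, including at the join.
    return math.exp(-1.0 / x**2) if x > 0 else 0.0

def nth_derivative_at_zero(f, n, h=0.05):
    # Central finite-difference estimate of f^(n)(0).
    return sum((-1)**(n - k) * math.comb(n, k) * f((k - n / 2) * h)
               for k in range(n + 1)) / h**n

# Every estimated derivative at the join comes out (numerically) zero,
# so the Taylor series of g at 0 is the zero series...
derivs = [nth_derivative_at_zero(g, n) for n in range(1, 5)]
# ...yet g(1) = e^(-1) != 0, so the series converges but not to g.
```

So the Taylor series at the join converges perfectly well; it just converges to the wrong function.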
@Kapomafioso 11 months ago
On a related note, a video on how these functions are actually implemented would be awesome. But that's more like computer engineering, so maybe not quite suitable for this channel. Anyway, I love your videos!
@patrolin 11 months ago
@@Kapomafioso they are implemented by polynomials - see Handmade Hero Day 440
@Sean-of9rs 11 months ago
Well, the first point isn't really an issue, as there is the famous Fabius function, which is smooth everywhere and analytic nowhere. I don't know how, but I trust the math that it is.
@gigantopithecus8254 10 months ago
@@Kapomafiososomtimes they use the agm
@typo691 10 months ago
Did you mean to write unsatisfying?
@maxwellhunt3732 11 months ago
I love Taylor's Theorem. It's one of those results that is so incredibly important, but is not at all obvious at first sight.
@anggalol 11 months ago
@@deltapi8859 Engineers use it a lot for approximation. Maybe you've already heard of sin(x) ≈ x. That is based on the Taylor series.
@leif1075 11 months ago
Isn't what he said around 6:40 not right? If you "contort" the function, it's no longer the same function anymore; it's no longer e^x except maybe in some small subsection. So isn't that wrong?
@joeyshi2114 11 months ago
@@leif1075 What do you mean? He wanted to look at a different function that agrees with e^x on a neighbourhood of points around x = 0. It illustrates that not all functions are analytic.
@ExplosiveBrohoof 11 months ago
Yeah, I was never taught it when I took calc in high school. I wonder if there's a nice visual proof of it somewhere on YouTube.
@sakshamsingh1778 11 months ago
@@ExplosiveBrohoof There is a YouTube video titled "geometric interpretation of sinx= ... ..." from the Mathemaniac YouTube channel; you should check it out.
@angelofdeth94 11 months ago
One interesting thing about analytic functions is they behave more like "infinite degree polynomials" than a general smooth function. Polynomials are very rigid. If you know the value at n+1 points of a degree-n polynomial, then you know the whole polynomial. So even though it might seem like you could express a lot of different shapes with a degree-4 polynomial, it only takes 5 points to completely pin it down. There's a theorem in complex analysis that says if you know the value of an analytic function at a sequence of points and at a limit point of the sequence, then you know the analytic function everywhere. For example, if you know the values at 1/n for every natural number n, and at 0, then you uniquely determine the analytic function. In retrospect, it's kind of "obvious" that analytic functions would act like infinite-degree polynomials, because that's basically what a power series is.
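A quick numerical check of the rigidity claim above (a sketch using NumPy; with 5 points and degree 4, the least-squares fit reduces to the unique interpolating polynomial):

```python
import numpy as np

# A degree-4 polynomial is pinned down by any 5 (distinct-x) points.
# Sample p(x) = 2x^4 - 3x^2 + 1 at 5 arbitrary points...
p = np.poly1d([2, 0, -3, 0, 1])
xs = np.array([-2.0, -1.0, 0.0, 1.5, 3.0])
ys = p(xs)

# ...and the unique degree-<=4 interpolant recovers the coefficients.
recovered = np.polyfit(xs, ys, 4)
```

Any other choice of 5 distinct sample points recovers the same coefficients, which is exactly the "5 points pin it down" statement.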
@cparks1000000 11 months ago
@@pyropulseIXXI Not sure what you're trying to say.
@nielskorpel8860 11 months ago
Nice that the theorem exists in the complex plane, but does it also exist on the real line? Complex derivatives are much more demanding objects, making complex definitions and theorems much stronger and narrower than their real analogs.
@scalesconfrey5739 11 months ago
@@pyropulseIXXI "That is, you could have a unique curve for any set of given points, yet still be able to draw literally any shape" Your statement is patently false. A unique curve is a unique set of points, and any shape is defined by its points. Even if what you meant was that you can define any region with said curve as a boundary, that still means that you can't draw those regions that have a different boundary using that curve.
@simenjorissen5357 10 months ago
Wow, that's actually a really cool result! What's the name of this theorem? However, aren't there some conditions that need to be placed on the sequence of points you're evaluating at? Because the way you phrased it, I could pick the sequence (aₙ)ₙ with aₙ=0 for all n; obviously the limit is also 0. So that would mean that knowing a function at 0 is enough to know the whole function.
@pavlopanasiuk7297 10 months ago
Also wondering about that theorem you mentioned. It has been heavily used in my QFT courses, yet I haven't had a satisfactory explanation of "extending a real-valued function to a complex-valued function uniquely and analytically".
@jacob_90s 11 months ago
I know this wasn't the primary point of the video, but I just wanted to note this because it's something I was very interested in when I first started programming, and I had a hard time learning it because every first-year calc student would just copy and paste the same damn explanation about Taylor series in every online forum.

Most programming math libraries DO NOT use infinite series or continued fractions to calculate elementary functions (the exceptions generally being arbitrary-precision libraries). The issue with them is that they are in general too slow, and oftentimes require the intermediate calculations to be computed at a greater precision than the final result needs to be.

Instead, when writing the math library, the developers will curve-fit either a polynomial or a rational function, which can compute the function within a certain range to the required level of precision. Additionally, identities are often used to reduce the input to a smaller range so that you don't have to try and compute the values for all possible floating-point values; the trig functions are probably the best example of this. sin and cos are defined for all x values from -infinity to +infinity, but since they just repeat, you can reduce the input value into the range -2pi to +2pi (depending upon the library, sometimes it will be reduced even further). Similar tricks can be used for exponential and logarithmic functions using the layout of floating-point numbers.

For anyone who wants to read up more on this, I would suggest
* Approximations for Digital Computers by Hastings (1955)
* Computer Approximations by Hart (1968)
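As a toy illustration of the two ingredients described above, here is a hedged sketch: a plain least-squares polynomial stands in for the minimax/Remez fits real libraries actually use, and the degree (11) and reduction range ([-pi, pi]) are arbitrary choices for the example.

```python
import math
import numpy as np

# Step 1: fit one polynomial to sin over a single reduced range.
xs = np.linspace(-math.pi, math.pi, 200)
coeffs = np.polyfit(xs, np.sin(xs), 11)   # least-squares fit on [-pi, pi]

def my_sin(x):
    # Step 2: range reduction via periodicity (IEEE remainder lands
    # in [-pi, pi]), then evaluate the fitted polynomial there.
    r = math.remainder(x, 2 * math.pi)
    return float(np.polyval(coeffs, r))

# Worst-case error over a wide interval, far outside the fitted range.
err = max(abs(my_sin(x) - math.sin(x)) for x in np.linspace(-50, 50, 1001))
```

Even this naive fit stays within about 1e-3 of the true sine everywhere, because the polynomial only ever has to be good on the small reduced range.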
@ryanpitasky487 11 months ago
CORDIC is another commonly used algorithm.
@Kapomafioso 11 months ago
Wouldn't it be enough to only consider the interval [0, pi/2] for trig functions? For sin, for example, if the value is between pi/2 and pi, the values are reflected. From pi to 2pi, the values are the negatives of those between 0 and pi. So the whole curve can be reconstructed from just the interval 0 to pi/2.
@johannbauer2863 10 months ago
If the square root operation has its own instruction, you can extend this further: you only need [0, pi/4] and can use sin(x)^2 + cos(x)^2 = 1 to fill in the rest. This was used, for example, by Mario 64 modders IIRC.
@TheFrewah 9 months ago
Fast inverse square root is pretty clever.
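For reference, here is that trick transcribed into Python (the magic constant 0x5F3759DF and the single Newton step are from the well-known Quake III routine; struct is used to reinterpret the float's bits as an integer and back):

```python
import struct

def fast_inv_sqrt(x):
    # Reinterpret the 32-bit float's bits as an unsigned integer.
    i = struct.unpack('<I', struct.pack('<f', x))[0]
    # The famous magic constant minus half the bit pattern gives a
    # surprisingly good first guess at 1/sqrt(x).
    i = 0x5F3759DF - (i >> 1)
    y = struct.unpack('<f', struct.pack('<I', i))[0]
    # One Newton-Raphson step refines the guess (error ~0.2%).
    y = y * (1.5 - 0.5 * x * y * y)
    return y

approx = fast_inv_sqrt(2.0)   # close to 1/sqrt(2)
```

The bit-level guess works because a float's exponent field is roughly its base-2 logarithm, so shifting and negating it approximates x^(-1/2).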
@samsamson6070 9 months ago
@@Kapomafioso just [0, pi/4] is enough! (Through reflection of that region of the circle over x = y)
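The reduction chain sketched in this thread can be written out explicitly. In this sketch, core_sin stands in for a small fitted polynomial that is only trusted on [0, pi/4] (here it just calls math.sin), and core_cos uses the sqrt identity mentioned above:

```python
import math

def core_sin(x):
    # Assumed accurate only on [0, pi/4]; stand-in for a fitted polynomial.
    assert 0 <= x <= math.pi / 4 + 1e-12
    return math.sin(x)

def core_cos(x):
    # cos >= 0 on [0, pi/4], so the sqrt identity is safe here.
    assert 0 <= x <= math.pi / 4 + 1e-12
    return math.sqrt(1 - core_sin(x)**2)

def full_sin(x):
    x = x % (2 * math.pi)             # periodicity: reduce to [0, 2pi)
    if x > math.pi:                   # sin(x) = -sin(x - pi)
        return -full_sin(x - math.pi)
    if x > math.pi / 2:               # reflection about pi/2
        return full_sin(math.pi - x)
    if x > math.pi / 4:               # sin(x) = cos(pi/2 - x)
        return core_cos(math.pi / 2 - x)
    return core_sin(x)
```

Every branch lands back inside [0, pi/4] after at most two steps, so the "core" routine really is all that needs to be accurate.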
@OwlonH.Christ 11 months ago
It's honestly wild how well he can explain these things using the visuals.
@billcook4768 11 months ago
The crazy thing about analytic functions is that if you know everything that is going on in a “small” region around a point, you understand the entire function.
@marekkryspin8712 11 months ago
@@pyropulseIXXI Imho @billcook4768 discusses something different. What is surprising is the amount of information needed for a complete description of an analytic function. Indeed, it is sufficient to know "only" all the derivatives at a point to determine the entire function potentially over the entire real line. This means that a countable amount of local information (focused at a single point) provides a complete description of a function defined on a potentially big domain.
@freyc1 11 months ago
It's so obvious it's only true for a very particular kind of function, as the video explains perfectly. "Pure intuition" is just hasty reasoning in that case, I'm afraid. @@pyropulseIXXI
@scalesconfrey5739 11 months ago
@@pyropulseIXXI "In fact, the moment I learned about derivatives and linear approximations, I instantly knew, via pure intuition, that if I took an 'infinite' amount of derivatives to 'approximate' the function, I would get the exact function but in polynomial infinite series form." In that case, how do you explain bump functions? The existence of functions which have derivatives of all orders at the origin and yet fail to be analytic shoots your "intuition" out of the water. That's why mathematics relies on proof to determine truth, rather than insight and assumption.
@Czeckie 10 months ago
It's not crazy. An analytic function is given by its Taylor series, which is described precisely by countably many numbers. Analytic functions are simple; that's why it works. What's crazy is that holomorphic = analytic. The usual proofs don't offer an intuitive reason for it.
@ossigaming8413 10 months ago
@@pyropulseIXXI Your intuition won't work for all analytic functions. Take for example ln(x): its Taylor series only converges on a small range of numbers.
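Concretely, the Taylor series of ln(1+x) about 0 (equivalently ln(x) about x = 1) has radius of convergence 1, which a few partial sums make obvious:

```python
import math

def log1p_partial(x, n):
    # Partial sum of ln(1+x) = x - x^2/2 + x^3/3 - ... up to the x^n term.
    return sum((-1)**(k + 1) * x**k / k for k in range(1, n + 1))

# Inside the radius of convergence (|x| < 1): rapid convergence.
inside = abs(log1p_partial(0.5, 60) - math.log(1.5))

# Outside (x = 2, i.e. ln(3)): the terms blow up and the sums diverge.
outside = abs(log1p_partial(2.0, 60) - math.log(3.0))
```

The function ln(3) is perfectly well defined; it's only the series centered at 1 that fails, because the singularity of ln at 0 sits at distance 1 from the center.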
@B_u_L_i 11 months ago
THANK YOU. When I first heard about the Taylor expansions of e, sin and cos, the fact that they can be described by a polynomial exactly was so confusing to me. Like it's so random. But it makes a lot more sense now.
@ciceron-6366 11 months ago
In fact it's not really a polynomial, because it has an infinite number of non-zero coefficients. But the infinite sum is equal to the function.
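The distinction is easy to see numerically: every truncation of the series for e^x is a genuine polynomial that only approximates e^x, with the error shrinking as terms are added:

```python
import math

def exp_partial(x, n):
    # Degree-n truncation of the series e^x = sum x^k / k!
    return sum(x**k / math.factorial(k) for k in range(n + 1))

# Errors of the degree-2, 5, 10, and 20 polynomial truncations at x = 2.
errors = [abs(exp_partial(2.0, n) - math.exp(2.0)) for n in (2, 5, 10, 20)]
```

Each truncation is an honest polynomial; only the full infinite sum equals e^x exactly.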
@B_u_L_i 11 months ago
@@ciceron-6366 I really, really don't give a damn.
@prod_EYES 7 months ago
@@B_u_L_i😭
@calmkat9032 11 months ago
This is my #1 favorite subject! All of calculus feels like a narrative, but none more so than Taylor series. The way it starts with something plain with approximating functions, to turning irrational, even transcendental, functions into these weird work-around ratios, it's just such a cool story! It even ends what I consider a years-long story arc in math. Since algebra 1, we learned about functions. Then we steered into the seemingly unrelated geometry. Then we alternate with algebra 2 and trigonometry. And it all comes together here at the end of calculus 2, when you turn sin(x) and cos(x) into plain algebra, and vice versa. And as a bonus, you learn that the Taylor series of cos(x) + i*sin(x) is the same as that of e^(ix). Meaning trigonometry, algebra, and calculus all meet here. Add the cornerstone of geometry, pi, by making x = pi, and boom: the one and only e^(i*pi) = -1.
@stephenbeck7222 11 months ago
In a sense, Calc 1 and 2 is an adventure in approximating functions. Tangent line approximations are learned early in the course, which is a first order technique. Euler’s method is typically introduced with basic differential equations (which may be reserved for the separate course of differential equations, but the AP Calc BC curriculum does cover it), which is iterating on the first order approach. Then Taylor series come along and extend the first order tangent lines into polynomials of however many degrees you’d care to find. Then you can take more advanced courses and blow it all up with Fourier transforms.
@scalesconfrey5739 11 months ago
@@pyropulseIXXI "People like you are so insecure; I went to UC Berkeley and double majored in physics and mathematics." And yet here you are, boasting about credentials that remain unproven and talking down to strangers on the internet for learning things that are "inherently obvious" as though that's what mathematics is about. What you are doing is called projection, because it is clear that *you* are insecure, otherwise you would not be grand-standing about your accomplishments like this. The sheer audacity of you to say that you "thought you would be around smart people" but they were all "oaf[s] that struggled", right before spouting incorrect information is palpable. "In fact, you can literally create any function with any arbitrary functions, provided you can choose the coefficients of those arbitrary functions and have an infinite amount of them." That statement is false. The fact you still believe that after presumably watching a video that gave clear counter-examples to that claim means you should try to get a refund for your degree...
@rla927 10 months ago
@pyropulseIXXI why do you feel the need to tell us all this? It has nothing to do with the math.
@jakeypowell1831 7 months ago
@@scalesconfrey5739Amazing comment, thank you ❤
@kingbeauregard 11 months ago
Taylor Expansions are great. For concept, I recommend this: a given function f(x) is actually built out of a bunch of polynomial terms (ax, bx^2, cx^3, etc) but it does not readily admit to what the coefficients a, b, c, etc are for the various terms. So we need to torture the function into confessing each coefficient. The method of torture that works is taking the derivative the appropriate number of times for a given polynomial term, and then setting x equal to zero. It's brutal and harrowing work, but it's also brutally efficient.
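The "interrogation" described above (differentiate the appropriate number of times, set x = 0, divide by the factorial) recovers the coefficient c_k = f^(k)(0)/k!. Here is a sketch on a polynomial, where differentiation can be done exactly on the coefficient list:

```python
from math import factorial

def deriv(coeffs):
    # coeffs[i] is the coefficient of x^i; return the coefficients of f'.
    return [i * c for i, c in enumerate(coeffs)][1:]

def taylor_coefficient(coeffs, k):
    # Torture step 1: differentiate k times.
    for _ in range(k):
        coeffs = deriv(coeffs)
    # Torture step 2: set x = 0 (only the constant term survives).
    value_at_zero = coeffs[0] if coeffs else 0
    # The confession: f^(k)(0) / k! is exactly the coefficient of x^k.
    return value_at_zero / factorial(k)

f = [1, 0, -3, 0, 2]   # f(x) = 1 - 3x^2 + 2x^4
confessed = [taylor_coefficient(f, k) for k in range(5)]
```

The polynomial confesses its own coefficients back, one per round of differentiation; for an analytic function, the same procedure yields its Taylor series term by term.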
@SphereS7 11 months ago
What I personally find surprising is the effectiveness of compactly supported smooth (or continuous) functions, which are not "good" functions if you have the naive idea that the "best" possible functions are the real-analytic ones. They're used to show all sorts of approximations in function spaces, they carry a topological vector space structure that is extremely non-trivial, and they have a dual that in the end is big enough to contain all kinds of weird "functions" people got as solutions of linear PDEs via heuristic methods. Such a beautiful theory. Also, a good showcase of how weird and different the world of complex calculus is.
@tylershepard4269 11 months ago
This is a great video. Sadly we don't use this method anymore; bit-shifting and a very accurate representation of log(2) is the efficient route. Extend this concept and add an extra register for complex numbers. It's been a while (5 years) since I've done any assembly on x86, but if I recall correctly these functions are built right into the hardware, essentially.
@trogdorbu 11 months ago
I'm not making the connection between this and bit-shifting, although I am familiar with the latter. Can you expand on this?
@nuke_clear 10 months ago
@@trogdorbu I am guessing it's about how calculators calculate values of e^x and other such functions at any x.
@dkosolobov 10 months ago
This method is more relevant than it seems: Intel made a mistake in their fsin function and programmers had to implement the sine by hand usually using the Taylor expansion and a few tricks. The backward compatibility prevents a simple patch to the issue. See the article "Intel Underestimates Error Bounds by 1.3 quintillion" that explains the problem.
@Smitology 6 months ago
@@trogdorbu I think it was a comment on the motivation of taylor series established at the beginning of the video, to compute the values of exponential and trigonometric functions digitally
@Grecks75 29 days ago
This is a must-watch for any Calculus student. Initially I thought _analytic_ would have to do with the existence of continuous derivatives at a point or in a small open neighborhood. But that is actually about being a _smooth_ function, and smooth and analytic is seemingly not the same (in the real numbers) as you have demonstrated excellently with the smooth counterexample. The function being piecewise or not is actually not important, it's just a distraction, don't mind the comments. The deep reason of why and where a Taylor series converges to the original function being related to the growth class of the n-th derivatives within the convergence radius was highly enlightening to me. Thank you.
@KarlWork-n3i 11 months ago
Brilliant video on the error term in Taylor's theorem. It's NEVER assumed that a Taylor series will converge for all x, though. IF it converges to F(x) at a point, the next question that's asked is on what interval (what neighborhood of x) it converges to F(x); that is, as you point out, IS there an open interval around x such that the series converges to F(x)? As you point out in the video, derivatives provide LOCAL information. Now Polya's theorem is a physical way of looking at complex functions using the DIV and CURL operators, and it can let you decide whether a function has a valid Taylor series (that means analytic) in some region/neighborhood using NON-LOCAL information. It's a physical (physics) way of doing Cauchy's theorem. Amazing really. Brilliant video, thank you.
@MisterTutor2010 10 months ago
If anyone tells you that math is boring, just shake it off.
@osmosisjohns5650 8 months ago
😂 I see what you did there
@PlushyguyYT 4 months ago
Yeah math it fun!
@PlushyguyYT 4 months ago
Yeah, math is fun!
@MagicGonads 11 months ago
The way it is shown to construct analytic functions from other analytic functions is a bit too vague. It is true that they form a ring, so addition, subtraction, and multiplication work. It's also true that composition works, however you have to carefully consider what happens to the domain where you can easily puncture it making it only piecewise-analytic on the original domain, and it's not as obvious as for the ring where we have the open intersection of the domains as the resulting domain. And specifically for division and inversion there are special conditions that need to be met and of course for inversion the domain totally changes.
@kevj8708 11 months ago
Just another casual banger video. Very quickly becoming one of my favorite YouTube channels (not just math). Keep it up, you're killing it unbelievably hard.
@Kram1032 11 months ago
another neat one (similar to the bump function) is the fabius function which has the property that its derivative is two rescaled copies of itself. Normally defined on the unit interval, it's also possible to extend it into a pseudoperiodic form that is positive or negative according to the Thue-Morse sequence. If you don't do so though, it's constantly zero for negative values and constantly one for any value beyond 1, and in between it takes rational values for any dyadic rational input.
@NoNTr1v1aL 11 months ago
Thought you were gonna dive into Schwartz's Theory of Distributions at the end there after you mentioned the bump function and its uses, then I remembered the video title and duration. Maybe it could be the topic of another video. Absolutely brilliant video! Can't wait for the next one.
@05degrees 11 months ago
Also, analytic functions are quite rigid; usually you can't arbitrarily define them on several intervals at once (like the case with exp, sin and cos, which automatically define themselves on all of ℝ!), but non-analytic functions allow more freedom. Though analytic functions are still not the worst when trying to have one that's _very much like_ zero, even if not exactly zero, outside a region: take the ever-present gaussian exp(−x²/2); for many purposes it's very much zero outside, say, x ∈ [−10; +10]. Taking a larger power of x will make it even better at this, though then we'll get a function that is less useful in many fields. Analytic functions are like "infinite-order polynomials" in a sense. Plain finite-order polynomials, on the other hand, are the top candidate for being the worst rigid class of functions that seems like a nice and large class at first. Has its upsides because of that, though.
@BrianGriffin83 1 day ago
I had several questions lingering about Taylor Series, and this video answered them all. Neat work! 👌
@Audio_noodle 11 months ago
Fantastic video! It filled the gaps in understanding I had with Taylor expansion, and kinda explained why Taylor series are such a powerful tool in physics :D
@JourneyThroughMath 9 months ago
As a teacher, I'm familiar with Taylor series, but I have never stopped to consider how they form. Thank you for this video!
@Procyon50 11 months ago
The fact that combinations of analytic functions are also analytic is so cool. This reminds me of how elements of groups stay in the group, when you multiply them together. Are these concepts related?
@epicwalrus1262 11 months ago
Yes, analytic functions on a given domain form a ring, which is a group with multiplication (but not necessarily division)
@schweinmachtbree1013 11 months ago
Things staying in a given set when applying an operation (your 2 examples - 1: the set of analytic functions mapping into itself when applying addition, subtraction, multiplication, composition, etc., and 2: the set of elements of a group mapping into itself when applying the group multiplication) is called "the set being _closed_ under the operation". With this terminology your 2 examples are "(the set of) analytic functions being closed under +, -, *, ∘, etc." and "(the underlying set of) a group being closed under the group multiplication".
@schweinmachtbree1013 11 months ago
@@epicwalrus1262 To clarify a little, a ring _R_ is an abelian group _A_ = ( _A_ ; +, 0, -) (0 and - being the identity element and inverse operation for +) together with an associative multiplication × distributing over it (that is, (a×b)×c = a×(b×c) for all a,b,c in _A_ and also a×(b+c) = a×b + a×c and (a+b)×c = a×c + b×c for all a,b,c in _A_ ). Depending on where rings are being used, × is sometimes required to have an identity element, denoted 1, such rings being called "rings with 1", "rings with identity", or "unital rings". An example of a unital ring is yours, analytic functions on a domain _D_ (or any subset of *C* ), for which the standard notation is _R_ = C^ω( _D_ ), with multiplicative identity being the constant function _f_ ( _x_ ) = 1, and an example of a non-unital ring being the analytic functions on a domain _D_ with compact support (using the precise definition of a "domain" in complex analysis: a non-empty connected open subset of *C* ), denoted _R_ = C^ω_K( _D_ ): now the constant function _f_ ( _x_ ) = 1 is excluded by definition (since the support of _f_ is all of _D_ , but _D_ is open so not compact) and _R_ has no multiplicative identity.
@drdca8263 11 months ago
@@epicwalrus1262a commutative group under addition, and closed under a multiplication operation, where that multiplication distributes over the addition, and is associative. Typically one also requires that there be a multiplicative identity, but I think some people don’t require that? But most people give a different name to the version of the idea without that requirement.
@jorgenharmse4752 9 months ago
@@epicwalrus1262: Extend to meromorphic functions, and then you can do division. (Weierstrass factorisation implies that the field of meromorphic functions on a connected open set 'is' the field of fractions of the ring of holomorphic functions.)
@ciCCapROSTi 10 months ago
Thanks mate, I was fascinated by Taylor series since the first semester of calculus, but forgot a lot since then. Good, concise, informative video.
@jimi02468 11 months ago
This channel is like 3b1b with a different voice and I love it
@jonathanbeeson8614 10 months ago
Just wanted to add my thanks and appreciation. My level of mathematical sophistication was well matched by your level of explanation !
@latarte3931 11 months ago
A gem amongst all mathematical channels, thank you for the insights
@some1rational 6 months ago
Damn I saved this in a playlist and put off watching it until now, I'm glad I got around to it. This is so amazing, as a math minor some of these concepts eluded me during university, particularly everything after 13:00 , but within 20 minutes you literally finally made me understand and truly appreciate analytic, holomorphic functions and their 'equivalence' in complex analysis
@ElchiKing 11 months ago
An addition to 12:00: this does not only hold for combining functions by composition, inversion, division, multiplication, addition, and subtraction, but I think also for "most" solutions to equations. Suppose we have some equation of the form f(x,y)=0, and let f be analytic if we fix either x or y. Suppose further that we have some solution (x0,y0) such that df/dy(x0,y0) is nonzero. Then the implicit function theorem tells us that we can locally describe the set of solutions around (x0,y0) as the graph of a function g, i.e. near (x0,y0), all points satisfying the equation f(x,y)=0 have the form (x,g(x)). Furthermore, the same theorem also tells us that g is differentiable near x0, and if f is analytic, then so is g (at least in a small neighborhood of x0).
@remekstepaniuk7820 9 months ago
I could listen to this guy explain all of mathematics to me. From axioms to complex functions, to topology and geometry ❤
@mrtthepianoman 11 months ago
Thank you for making this video! I learned the concepts of smooth and analytic in the context of complex analysis where they are equivalent. As a result, I have always had a hard time remembering what the distinction is. This makes it clear by outlining where they diverge in the real numbers. Well done!
@blitzkringe 11 months ago
Thanks, my struggle with the concept of complex analytic functions seemed almost hopeless until youtube recommended me this video
@МаксимСинцов-п9б 3 months ago
Thank you so much! I've never thought about it in such beautiful way!)
@samiswilf 3 months ago
Incredible explanation with incredible clarity
@mauisstepsis5524 8 months ago
This is the most insightful discussion of Taylor series I have seen. Thanks a lot!
@GhostyOcean 11 months ago
Complex analysis is probably my favorite subject to study. All the nasty things from real analysis get smoothed away.
@budderman3rd 11 months ago
Well complex is more complete than just reals.
@Tutor-i 11 months ago
What did you take first complex analysis or real? I can choose to take complex or real next year but don’t know which one to choose.
@ryanh7167 11 months ago
@@Tutor-i if you are in undergrad, your real analysis course and introductory complex analysis course (probably called something like "functions of complex variables") will be very different courses with a different focus. Introductory real analysis courses tend to focus on the basics of set theory, topology of metric spaces, and sequences/series of real numbers/vectors in metric spaces. Sometimes you'll get to derivatives and the beginnings of Riemann integration. Introductory complex functions courses tend to focus on the parts of complex analysis which can be handled with standard multivariable calculus. They'll walk you through the standard exponentials of complex functions, the basics of complex polynomials/the fundamental theorem of algebra, and then usually go towards talking about how to handle derivatives and integrals of well-behaved complex functions (functions who are equivalent to rotation and scaling in R2). Sorry for the novel, but I don't think you should really think of them as being in sequence for each other, because they tend to have a different purpose and focus.
@GhostyOcean 11 months ago
@@Tutor-i I took complex first, but actually I took it concurrently with my intro to proofs class. Guess you could say I was smart enough to still be in the top of the class while learning proofs
@6funnys 11 months ago
Real is definitely more fundamental and will change the way you think about math, but it still kind of depends on your institution. Where I go to school, they offered a functions of a complex variable course that was in between the levels of a calculus course and an analysis course - we did a fair mix of proofs and computations. I absolutely loved that class, and it definitely came before real in the difficulty progression. But if you’ve got a lot of room in your schedule next semester, real is pretty awesome.
@coaster1235 11 months ago
Would also be fun to learn about Padé approximants, compare the priorities of the two approximations (accuracy in an arbitrarily small neighborhood of a point vs. over an interval), and see why Padé does better at the latter.
@nikkatalnikov 11 months ago
Great video, thank you! Bump functions are really important in the study of weak solutions / weak derivatives, as test functions for distributions.
@Steindium 11 months ago
Awesome video. I always had my doubts with the Taylor series, so it's nice to see a video addressing them. In fact, coincidentally, I was just watching a video on Euler's identity and grunted when it was another proof using the Taylor expansion.
@wumbo_dot_net 11 months ago
I also struggled with them in school, something about the *why* was always missing. I also made a video about Taylor series recently if you’re interested!
@ericdculver 10 months ago
Great video! I have known about the examples of smooth but not analytic functions for a long time, but I did not know why they failed to be analytic. This was very illuminating.
@tangentfox4677 10 months ago
I love that you describe the bump function as useful because it's not too bumpy.
@darlingdarling2943 3 months ago
I really enjoyed your explanation of why some functions diverge from their Taylor series representation! However, even though a lot of functions don't make any logical sense when expressed as a Taylor series, they do have many uses with analytic continuation. Similar to how the Riemann zeta function doesn't really make sense for real parts less than or equal to 1, we can assign values that fit with the naturally bounded values we do have. As long as a function is not piecewise, or as long as it is infinitely differentiable for all x, as far as I'm aware it is still a validly accepted Taylor series within the ideas of analytic continuation.
@billgatesharmikropenls 11 months ago
This is quickly becoming my favorite channel
@anvayjain4100 7 months ago
Gonna watch every sponsored second cuz the rest of the video is worth it.
@yds6268 11 months ago
Most calculators or computers don't use Taylor's theorem for trig functions and the exponential. It's very inefficient.
@ryanpitasky487 11 months ago
CORDIC!
@justafanoftheguywithamoust5594 11 months ago
Then what do they use?
@fullfungo 11 months ago
@@justafanoftheguywithamoust5594 Some use lookup tables with further approximation techniques like Newton’s method.
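A sketch of that table-plus-Newton idea for square roots (the table granularity, input range, and iteration count here are arbitrary illustrative choices):

```python
import math

# Coarse 12-entry seed table covering [1, 4) in steps of 0.25.
TABLE = {i: math.sqrt(1 + i * 0.25) for i in range(12)}

def table_sqrt(x, iterations=3):
    assert 1 <= x < 4
    guess = TABLE[int((x - 1) / 0.25)]        # cheap lookup: rough answer
    for _ in range(iterations):
        guess = 0.5 * (guess + x / guess)     # Newton step for g^2 = x
    return guess
```

Because Newton's method converges quadratically, a seed that's already within a few percent reaches full double precision in only a handful of iterations; that's the whole appeal of combining a small table with a refinement step.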
@yds6268 11 months ago
@@ryanpitasky487 exactly, the CORDIC algorithm, amazing invention
@FredericoKlein 11 months ago
I remember getting really spooked about this in college, thinking that if you could know every derivative of a path, you could calculate it into the future, and that future information would somehow be hidden in the higher-order derivatives. I kinda forgot about this, but I think it has to do with my incomplete understanding of Taylor's theorem and the limitations of Taylor expansions.
@erikb.celsing4496 7 months ago
This video is completely AMAZING. I am so thankful you made it!!
@Jaylooker 11 months ago
Holomorphic functions under more conditions are automorphic forms like modular forms.
@MagicGonads 11 months ago
they don't call them conformal maps for nothing
@tap9095 11 months ago
One smooth but not analytic function that I like is the Fabius function. It has the properties that f(0)=0, f(1)=1, and all derivatives at 0 and 1 are 0. So it works like the smoothest possible step function. But it's not analytic, so you can't really compute values at arbitrary points. It also has a fun functional differential equation, f'(x)=2f(2x).
@cdenn01611 ай бұрын
As a PhD physicist, I greatly appreciate this point
@Calcprof10 ай бұрын
The numerical evaluation of transcendental functions is a fascinating field, and there are many subtleties. I particularly like the use of asymptotic but divergent series. Also, rational function approximations can be used and can converge "past" (on the other side of) singularities (poles).
@hqTheToaster10 ай бұрын
I have the same notion. I used Taylor Series to approximate how I figure characters should be scaled to each other in Dreams (a game) to mimic how characters are aligned in Smash Bros. Series from 'one canon height structure' to another, but eventually, I had to settle for making a table of possible canon non-Smash Bros. heights to convert to canon Smash Bros. heights simply because the measurements like to fudge each other. Great video!
@yanceyward36897 ай бұрын
An absolutely wonderful video of a concept I was wrestling with just a few weeks ago.
@hitarthk9 ай бұрын
It's so cute that you make sure people don't hate non analytic functions by showing their utility ❤
@Waffle_611 ай бұрын
please never stop making videos
@tryingintrovert123911 ай бұрын
We learnt to approximate the value of the sine function in our engineering programming class using Taylor series. It was a very interesting experience
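That classroom exercise can be sketched in a few lines (my own minimal version, not the commenter's actual assignment), accumulating each Taylor term from the previous one:

```python
import math

def taylor_sin(x, terms=10):
    """Partial sum of the Maclaurin series sin(x) = x - x^3/3! + x^5/5! - ..."""
    total, term = 0.0, x
    for k in range(terms):
        total += term
        # Each term equals the previous one times -x^2 / ((2k+2)(2k+3)),
        # which avoids recomputing powers and factorials from scratch.
        term *= -x * x / ((2 * k + 2) * (2 * k + 3))
    return total
```

For moderate x, ten terms already agree with the library sine to near machine precision.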
@yashdagade12405 ай бұрын
TYSM for this video. I have some how used the same lexicon of "local information" and how pointwise information shouldn't be able to travel to infinity! This is so good :D
@TheJara12311 ай бұрын
Again wonderful man!! Helps me out of my personal health pain!! Reminding to the wonderful world of math!! Please keep posting often....you have real unique gift to explain complex math concepts!!
@blueheartorangeheart376810 ай бұрын
I was just explaining to a student how the Taylor series worked, and I realized I had no idea WHY it worked. Then I remembered this video showing up on my timeline
@aidansnyder4225 ай бұрын
I first came across non-analytic functions in an introductory condensed matter class. I think we were investigating Cooper pairs, which give rise to superconductivity, and we were curious why a perturbation approach couldn't lead us to Cooper pairs directly. We were shown that the strength of the Cooper pair force was related to exp(-1/x**2), which we dubbed a "non-perturbative" function, i.e. one that couldn't be found through perturbation theory. The upshot of this strange quirk was that we had stumbled across a non-analytic function that exists in nature, which was deeply disturbing. Anyway, great video, thanks for this explanation
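A quick numerical way to see why exp(-1/x^2) defeats any expansion around 0 (a sketch of my own, not from the video): near 0 it vanishes faster than any power of x, so every Taylor coefficient there is zero.

```python
import math

def f(x):
    """exp(-1/x^2), extended by 0 at x = 0: smooth everywhere, not analytic at 0."""
    return math.exp(-1.0 / (x * x)) if x != 0.0 else 0.0

# Even divided by a high power of x, the value near 0 is astronomically small,
# so the Taylor series at 0 can only reproduce the zero function.
ratio = f(0.05) / 0.05 ** 20
```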
@MattMcIrvin11 ай бұрын
Bump functions are important in differential geometry--they're what allow us to define maps between any smooth manifold and a set of coordinate charts that are like Euclidean space (like a set of flat maps covering the geography of the round Earth), which don't disturb any of the derivatives of functions on the manifold. The bump functions define the cross-fade from one coordinate chart to another. That's useful for proving things.
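The cross-fade idea can be illustrated with a smooth step built from the classic e^(-1/s) seed (a minimal construction of my own, not code from the video or from any geometry library):

```python
import math

def bump_step(t):
    """Smooth step: exactly 0 for t <= 0, exactly 1 for t >= 1,
    and infinitely differentiable everywhere (including the joins)."""
    def g(s):
        return math.exp(-1.0 / s) if s > 0 else 0.0
    return g(t) / (g(t) + g(1.0 - t))

def blend(f0, f1, t):
    """Cross-fade two local descriptions f0 and f1 over the overlap [0, 1]."""
    w = bump_step(t)
    return (1 - w) * f0(t) + w * f1(t)
```

Because the step is flat to all orders at 0 and 1, the blend never disturbs any derivative of f0 or f1 outside the overlap, which is exactly what partitions of unity exploit.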
@dariomartinezmartinez54229 ай бұрын
I absolutely love your videos, I'm studying maths and I find your videos crystal clear, thank you very much for making such a good content!
@_unkown865211 ай бұрын
Hey morphocular! Huge fan here! I would recommend using a darker colour palette for your vids, because the pink here is a little aggressive 😅
@AlonAmit8 ай бұрын
The function exp(-1/x^2) is smooth and non-analytic without requiring any piecewise definition. The video makes it seem like the trouble with the non-analytic example is associated with its "piecewise" nature, but that isn't so; the claim that compositions of analytic functions are analytic is incorrect. Otherwise, very nice video :)
@that_guy469011 ай бұрын
Thank you for your video. It made me view the Taylor series from a new perspective
@1timoasif8 ай бұрын
Need a Complex Analysis video after this one 🙏
@whatitmeans11 ай бұрын
Nice video! Recently I found out about the existence of smooth bump functions and how they aren't analytic. But there is much more that could be told about the failure of Taylor expansions: I learned that, due to the Identity Theorem, no non-piecewise-defined power series can match a constant value on an interval of non-zero measure, which means that an analytic function simply cannot represent a phenomenon of finite duration (one with a finite extinction time). As an example, the differential equation x' = -sgn(x) sqrt(|x|), x(0) = 1 has the unique solution x(t) = 1/4 (1 - t/2 + |1 - t/2|)^2, which becomes exactly zero for t >= 2. No power series can approximate this simple solution for all t. This means that no 1st- or 2nd-order linear ODE, and no non-linear ODE with power-series solutions (like Bessel's and others), can represent a finite extinction time: to do so, the ODE must have a non-Lipschitz point in time where uniqueness can be broken (so it must admit singular solutions). This could have deep meaning in physics: how could you describe accurately what "time" means if your models don't even know when the clock stopped ticking? Think about it. More on MathStackExchange, tag [finite-duration]
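The claimed closed form is easy to sanity-check numerically (a sketch of my own; `residual` uses a central finite difference rather than an exact derivative):

```python
import math

def x_sol(t):
    """Claimed closed form: x(t) = 1/4 * (1 - t/2 + |1 - t/2|)^2."""
    u = 1.0 - t / 2.0
    return 0.25 * (u + abs(u)) ** 2

def residual(t, h=1e-6):
    """x'(t) + sgn(x) * sqrt(|x|): should be ~0 along the true solution."""
    dx = (x_sol(t + h) - x_sol(t - h)) / (2 * h)
    x = x_sol(t)
    return dx + math.copysign(math.sqrt(abs(x)), x)
```

The solution satisfies x(0) = 1, hits exactly zero at t = 2, and stays zero forever after, which no single power series in t can reproduce.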
@DrCorndog110 күн бұрын
I don't think I'd ever even heard the term "analytic function" until I took a complex variables class, where I used a textbook that used "analytic" in place of "holomorphic." For a long time I wasn't able to appreciate the difference in the definitions of these two equivalent terms, nor how profound their equivalence is.
@Ribulose15diphosphat2 ай бұрын
Speaking of complex numbers: Taylor series can be used to show the link between the exponential function and sine. This leads to Euler's formula, which shows that exp of an imaginary number gives a function that traces a circle.
@AlessandroZir8 ай бұрын
look, this is one of the best videos I've found about this very fundamental topic, which it seems most mathematicians and engineers are just incapable of explaining conceptually; but it gets confusing again at some point (4'13")! the procedure shouldn't work, ok! but the point is that it actually works;
@hiredfiredtired8 ай бұрын
The point is that it doesn't work for most functions, only a subset of functions
@AlessandroZir8 ай бұрын
@@hiredfiredtired yes, but for this small subset it works quite well, and it is important to state this clearly enough;
@tomkerruish298211 ай бұрын
I'd speculate that the reason analytic functions show up so much is because we're using them to solve (relatively simple) differential equations.
@Avighna9 ай бұрын
10:25 one small addition I'd make is that there exists some t that makes this equality true, but it's not necessarily true for all t values between x0 and x. The rest of the argument still stands: the error will approach 0 as n approaches infinity, because n! grows a lot faster than x^n, and e^t is independent of n
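That remainder behavior for e^x is easy to watch numerically (a sketch using the Lagrange bound just described; for x > 0 the worst-case e^t occurs at t = x):

```python
import math

def remainder_bound(x, n):
    """Lagrange bound for the degree-n Taylor polynomial of e^x about 0:
    |R_n(x)| <= e^max(x, 0) * |x|^(n+1) / (n+1)!"""
    return math.exp(max(x, 0.0)) * abs(x) ** (n + 1) / math.factorial(n + 1)
```

Because (n+1)! eventually outgrows any fixed |x|^(n+1), the bound collapses to 0 as n grows, and the true error of the partial sums stays underneath it.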
@patturnweaver11 ай бұрын
wonderful. i now see why analytic functions are such a big deal. makes it much easier to find an approximating function, for one thing. i also see the advantages of working in the complex domain: there, if a function has a first derivative on an open domain, then it has derivatives of all orders, and it is analytic.
@giacomocasartelli550310 ай бұрын
Great video, complete and sound explanation of a profound concept
@pfeilspitze11 ай бұрын
TBH, I was expecting something about how we *compute* them -- like whether using the taylor series in the obvious way with IEEE floating-point numbers actually converges to the best representable answer (with ±½ULP error). This video didn't really have anything about computing them at all.
@lifthras11r11 ай бұрын
That is however much harder to explain and arguably too computer-centric. I pretty much enjoyed the entire video even though I realized it's nothing to do with computation (and I had forgotten for a very long time that the Taylor expansion had the remainder term).
@Kapomafioso11 ай бұрын
Fun fact: sin, cos and exp are very closely related in an analytic sense: they're just linear combinations of each other for some well-chosen argument. So it is not surprising that all three have the same radius of convergence. On the other hand, analytic functions with poles will only have Taylor series with finite radius, at most the distance from the expansion point to the nearest pole. e^(-1/x) has a singularity at zero (an essential one: this function hits every possible complex value arbitrarily close to x = 0), so its Taylor series around x = 0 has zero radius of convergence (it only reproduces the zero value, but that is also undefined in the complex sense).
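The pole-distance point can be illustrated with 1/(1+x^2) (an example of my choosing, not from the comment): it is perfectly smooth on the whole real line, yet its poles at ±i in the complex plane cap the real radius of convergence at 1.

```python
def geom_partial(x, n):
    """Partial sums of the Taylor series 1/(1+x^2) = 1 - x^2 + x^4 - ...
    about 0, which is a geometric series in -x^2."""
    return sum((-1) ** k * x ** (2 * k) for k in range(n))

# Inside the radius (|x| < 1, the distance from 0 to the poles at +/-i)
# the partial sums converge; outside it the terms blow up, even though
# the function itself is finite and smooth there.
```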
@TheFrewah9 ай бұрын
The Mathologer channel has a good video showing how these series work. It's beautiful! Maybe 3Blue1Brown as well. The animations are beautiful and open source
@francescololiva582611 ай бұрын
Yes you're back with a new video! I'm going to watch it now, I know it will be great❤
@disgruntledtoons7 ай бұрын
"This shouldn't work!" Then introduces a function designed to not work. The Taylor expansion does require that the function and all derivatives be continuous from the estimation point to the point to be calculated, but with that condition satisfied it works.
@Kwauhn.11 ай бұрын
Of course complex functions make everything simpler! Their "complexity" is doing all the heavy lifting 😉
@TheIllerX7 ай бұрын
This information issue is part of a much larger general question about how much local information can be used to predict the behavior at other points. All information about an analytic function is contained at each point of the function. A function on the real line constructed by picking a random value at each argument would be the opposite: the information at each point says absolutely nothing about the value at other points. Most functions are in between those extreme cases. It would be interesting to quantify this information dependence between points in some general way.
@mustafizurrahman569911 ай бұрын
Superb mesmerising....cannot thank you more for such lucid explanation
@xTriplexS11 ай бұрын
Needed to write the algorithm for this today. It's nice that google is listening to everything 🙃
@dmytryk7887Ай бұрын
Around 12:05 you give a combination of analytic functions which you seem to say is also analytic, but the first term is x^2/sin(x). But this cannot be analytic, since it involves division by 0 (at infinitely many points, in fact)
@KipIngram7 ай бұрын
Calculators and computers generally do trig functions using the Cordic algorithms, which are somewhat more specialized than plain Taylor series, and involve some tabulated "special values."
@eriktempelman20976 ай бұрын
Thoroughly enjoyed this one ❤❤
@kingbeauregard11 ай бұрын
Here's what weirds me out. If you do the Taylor expansion of sin(x), you get a polynomial that neatly goes to 0 if you supply a value of x=0. Obvious enough; each term becomes 0 to some power. But what happens when you supply a value of pi or 2*pi? Then you have an infinite sequence of transcendental numbers raised to integer powers, and you add them up and it still equals zero. That's just nuts. It shouldn't happen. And yet, it totally does.
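That cancellation is easy to watch numerically (a sketch of my own; math.fsum keeps the floating-point rounding under control when the large terms cancel):

```python
import math

def sin_series_terms(x, n=20):
    """The first n terms of the Maclaurin series of sin evaluated at x."""
    return [(-1) ** k * x ** (2 * k + 1) / math.factorial(2 * k + 1)
            for k in range(n)]

# At x = pi the individual terms are sizable transcendental numbers
# (pi, -pi^3/6, pi^5/120, ...), yet their sum cancels down to sin(pi) = 0.
terms = sin_series_terms(math.pi)
total = math.fsum(terms)
```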
@kavinbala888511 ай бұрын
great video on taylor series, though this isn't the method computers use to calculate sin x. they use an approximation for sin x from 0 to pi/2, then just flip it into position for any other value of x
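A sketch of that range-reduction strategy (my own illustration: the core approximation here is a short Taylor polynomial for simplicity, whereas real math libraries use carefully tuned minimax polynomials):

```python
import math

def sin_range_reduced(x):
    """Fold any x into [0, pi/2] using sin's symmetries, then approximate there."""
    x = math.fmod(x, 2 * math.pi)      # periodicity: sin(x + 2pi) = sin(x)
    if x < 0:
        x += 2 * math.pi
    sign = 1.0
    if x > math.pi:                     # sin(x) = -sin(x - pi)
        x -= math.pi
        sign = -1.0
    if x > math.pi / 2:                 # sin(x) = sin(pi - x)
        x = math.pi - x
    # Core polynomial on [0, pi/2], where few terms suffice:
    total, term = 0.0, x
    for k in range(8):
        total += term
        term *= -x * x / ((2 * k + 2) * (2 * k + 3))
    return sign * total
```

Restricting the polynomial to [0, pi/2] is what makes a short expansion accurate for every input.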
@bowtangey68309 ай бұрын
Around 7:09, the e^(-1/x) function (with f(0)=0) used is not continuous at x=0 (with its natural domain (-infinity, infinity)), and so has no derivatives of any order there. As a student I was taught g(x)=e^(-1/x^2) (with g(0)=0), which does not have that defect.
I think many things are smooth but not analytic. For example, if I was stationary, and later accelerated, I don't think there would be an instant increase in acceleration. At the same time, my distance to the spot on the ground I started might start at 0 for some time and then increase later. It doesn't matter if you use the reference point as "spot on the ground" "center of Earth" or "Center of the Galaxy" since all that would do is add a velocity with some acceleration towards the center or the 1st and 2nd derivative to the position function. I think it doesn't make sense you can predict the position function out to infinity knowing only local information from one point of time.
@PolitictalDipsit10 ай бұрын
I just realized something: i think the Taylor expansion/Taylor series works on functions whose derivatives cycle in a loop. For example, differentiating sin x four times brings back sin x
@jussari796010 ай бұрын
Yes, this is true! Any such function is closely related to e^x (for example, sin x = (e^(ix) - e^(-ix))/(2i) ). Basically we want to solve the differential equation y^(n) = y ("the nth derivative of y is y"), and it turns out the solutions of that are linear combinations of functions of the form exp(ax), where a^n = 1 (i.e. a is an nth root of unity). And because the exponential function is analytic, so is any solution of the equation.
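A small check of that claim for n = 4 (a sketch of my own): the fourth roots of unity give the allowed exponents, and sin falls out of a combination of two of them.

```python
import cmath
import math

# The possible "a" in exp(a*x) solving y'''' = y are the fourth roots of unity.
roots = [cmath.exp(2j * cmath.pi * k / 4) for k in range(4)]  # 1, i, -1, -i

def sin_from_exponentials(x):
    """sin(x) = (e^(ix) - e^(-ix)) / (2i): a combination of exp(ax) with a = i, -i."""
    return (cmath.exp(1j * x) - cmath.exp(-1j * x)) / 2j
```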
@LegendLength11 ай бұрын
Glad to finally know what holomorphic means after seeing it so much on wiki!
@joaopedrodiniz706711 ай бұрын
Wow, and once again you succeed to amaze me. Congratulations on the amazing video!
@alessandrocalderoni814010 ай бұрын
I still don't understand why the factorials are put in the denominators of the expansion after the first 1-degree term. In the video it says it's done to account for the factor that pops up when differentiating that specific term (example from the video: "you divide by 2 factorial to account for the factor 2 that pops out when differentiating x^2"). I understand why factorials are used, but why should you account for that?
@maiamaiapapaya9 ай бұрын
idk what you're saying but I really like the music
@AllemandInstable11 ай бұрын
great video, would have loved to watch these when i was a student doing its first steps in analysis
@anmoldesai60229 ай бұрын
Hey! Love the video. I was also studying Laurent series and wanted to know if it solves exactly the problem you explained. Thanks!
@igxniisan699611 ай бұрын
You yourself said that the derivative of e^x is always the same function, and THAT is the reason why it works.
@f5673-t1h11 ай бұрын
It would've been really informative to show what some of these functions look like in the complex plane (using domain coloring), and how extremely "messy" they are.