#YAY . . . and thanks for another wonderful excursion through some of math's byways! A little extension of this subject: this method is sometimes called the "tangent method," because it uses the tangent to the graph to home in on a zero of the function. It requires finding the derivative of the function; but sometimes it's difficult or impossible to calculate the derivative. In such cases, there's what's sometimes called the "secant method."

In this method, you need 2 starting points, not just one (& it helps greatly if they bracket the target; that is, if one of them gives f(x) < 0 and the other gives f(x) > 0). Then you just evaluate the function itself at x₁ and x₂, draw a straight line (the "secant") between (x₁, f(x₁)) and (x₂, f(x₂)), and solve the equation of that line for y = 0. Call this new x "x₃", and now use x₂ and x₃ to find the next x, etc. In each iteration you discard the "older" of the two guesses you have, the newer one becomes the new "older" one, and the newly generated guess becomes the newer one.

The downsides are: (1) you have to keep track of 2 current guesses instead of just 1, and (2) convergence to the zero is a bit slower. The upsides are: (1) you have only 1 function evaluation each time, not 2, and (2) you don't need to take the derivative of your function.

Interesting exercise to try: bprp started by showing that there must be a zero between x=0 and x=1. Try applying the secant method, using these two points as your starters. (Note that there's an advantage to using the one that gets f(x) closer to 0 as your "2nd," or latest, guess.) Compare the rates of convergence of the two methods. Fred
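A minimal sketch of the secant method described above, assuming f(x) = x·e^x − 1 and the starting points x₁ = 0, x₂ = 1 suggested above; the helper name, tolerance, and iteration cap are arbitrary illustrative choices, not anything from the video:

```python
import math

def secant(f, x1, x2, tol=1e-12, max_iter=50):
    """Secant method: draw a line through the two latest guesses
    and take its x-intercept as the next guess."""
    f1, f2 = f(x1), f(x2)
    for _ in range(max_iter):
        # x-intercept of the line through (x1, f1) and (x2, f2)
        x3 = x2 - f2 * (x2 - x1) / (f2 - f1)
        if abs(x3 - x2) < tol:
            return x3
        # discard the older guess, keep the newer two
        x1, f1 = x2, f2
        x2, f2 = x3, f(x3)
    return x2

f = lambda x: x * math.exp(x) - 1
print(secant(f, 0.0, 1.0))   # ~0.5671432904097838
```

Only one new evaluation of f is needed per iteration, which is the upside Fred lists above.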
@blackpenredpen6 жыл бұрын
Thank you for the detailed explanation as always!!
@emmanueljosephcomargo30126 жыл бұрын
Did you just explain my numerical methods class lol
@marcushendriksen84154 жыл бұрын
There's also the more basic bisection method, which is slower but is guaranteed to converge.
@ffggddss4 жыл бұрын
@@marcushendriksen8415 Yes! In fact, in the days before home computers & handheld electronic scientific calculators, a method that was taught for this was basically the decimal form of bisection - you'd bracket the zero, say, with consecutive integers, n and n+1; interpolate based on f(x) at each of those; then bracket the new x-value with consecutive tenths; interpolate again; etc., to the desired accuracy. It had someone's name attached to it, which I can't recall. Fred
@ffggddss4 жыл бұрын
@@marcushendriksen8415 I just remembered the name (of the decimal version). It was called Horner's Method. Fred
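A minimal sketch of the plain bisection idea being discussed, assuming f(x) = x·e^x − 1 and the bracket [0, 1] from the video; the Horner-style decimal variant would narrow the bracket by tenths rather than by halving:

```python
import math

def bisect(f, a, b, tol=1e-12):
    """Bisection: halve the bracketing interval, keeping the half
    on which f changes sign."""
    fa = f(a)
    assert fa * f(b) < 0, "a and b must bracket a root"
    while b - a > tol:
        m = (a + b) / 2
        if fa * f(m) <= 0:
            b = m               # root is in [a, m]
        else:
            a, fa = m, f(m)     # root is in [m, b]
    return (a + b) / 2

f = lambda x: x * math.exp(x) - 1
print(bisect(f, 0.0, 1.0))   # ~0.567143...
```

Each step only gains one binary digit, which is why it is slower than Newton's method, but it cannot run away as long as the initial bracket is valid.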
@Abdega6 жыл бұрын
Wow, Newton sure knew a ton!
@blackpenredpen6 жыл бұрын
Definitely.
@jarjuicemachine4 жыл бұрын
Very underrated
@ffggddss3 жыл бұрын
@Abdega: I see what you did there! Fred
@evanparsons1233 жыл бұрын
And how!
@mzadro73 жыл бұрын
Dum dum, tch
@stevethecatcouch65326 жыл бұрын
Back in the stone age, the first useful program I wrote in FORTRAN was an implementation of the Newton method for finding the zeros of polynomials.
@blackpenredpen6 жыл бұрын
Steve the Cat Couch The Stone Age for me was "Visual Basic". And I wrote a tic tac toe program. Lol
@StephenMarkTurner6 жыл бұрын
Yup, Waterloo Fortran IV (Watfiv), about 1979.
@blackpenredpen6 жыл бұрын
I actually have never heard of that until you two mentioned it...
@StephenMarkTurner6 жыл бұрын
Yes, Watfor was Waterloo Fortran. The successor was Fortran 4 (IV), hence the 'fiv' pun in the name. Then came Fortran 77, I think
@ffggddss6 жыл бұрын
+ Stephen Turner: Yes, FORTRAN 77. Then FORTRAN 90. My computer teeth were cut on ALGOL 60. And as a kid, I watched my dad coding in FORTRAN II, using specially printed coding pads, because everything had to be column-controlled. I think he actually started out in FORTRAN I - or more likely, pre-FORTRAN, 1951 or earlier. I was a tot, so I don't recall. All I know is, his first computer experience was before there was magnetic core memory. Fred
@shangxu55576 жыл бұрын
So pretty much, you're saying, when faced with a hard equation, just go off on a tangent a bunch of times?
@yasu0main8856 жыл бұрын
shang xu more or less
@ffggddss6 жыл бұрын
Hah! Yes. But bprp didn't say it; Newton (and Raphson) did! Fred
@アヤミ5 жыл бұрын
Definitely yes lmao
@jarjuicemachine4 жыл бұрын
The function must be continuous
@taatuu256 жыл бұрын
"Is it really so hard?" "if Ω*e^Ω = 1, then Ω^(e^Ω) = 1/e" ..Yes, yes it is
@mokouf33 жыл бұрын
For Ω*e^Ω = 1, we can know:
1: e^Ω = 1/Ω
2: ln(Ω) + Ω = 0, so ln(Ω)/Ω = -1
Hence Ω^(e^Ω) = Ω^(1/Ω) = e^(ln(Ω)/Ω) = e^(-1) = 1/e.
This requires patience to prove but is not that hard.
This is really a great introduction to the Newton-Raphson method. And for anyone who knew about it already, it's also a great introduction to the Lambert W function. So it's a great video overall!
@abderrazekchaouachi64094 жыл бұрын
I am a retired academic inspector of maths in Tunisia. I have great admiration for your enthusiasm and courage. You can be an inspiration for every future mathematician
@u.v.s.55836 жыл бұрын
You are a great tutor and a blessing, man! Thank you, I wish you were around 20 years ago to enhance my calculus experience!
@blackpenredpen6 жыл бұрын
U.V. S. Thanks. I am a teacher : )
@LunizIsGlacey Жыл бұрын
13:52 x_3 was almost an Euler-Mascheroni jumpscare lol
@MouhibBayounes9 ай бұрын
xD i felt the same way 😂
@DashRevoTV6 жыл бұрын
"That's because...it's too boring. Let's use Newton's Method." Hahahahah exactly what my teacher told us.
@brahmandsaraswat8674 жыл бұрын
Finally found a channel whose narrator is a great teacher. You taught me numerical analysis in just 20 minutes, wow.
@xCorvus7x6 жыл бұрын
For the Newton method to work better, you can just analyse the function for its maxima and minima (since you can differentiate it, that should be possible). Then pick an x0 such that f(x0) is between a negative local minimum and a positive local maximum or, if there is only one local extremum, an x0 close to that. Of course, you can still end up in loops.
@Aruthicon6 жыл бұрын
He stares into our soul each time he looks up the values of the approximation to W(1). Also, "Start learning toady" - description
@blackpenredpen6 жыл бұрын
Tommy Thach lol thanks.
@Aruthicon6 жыл бұрын
That was a fast reply, thank *you*!
@blackpenredpen6 жыл бұрын
: )
@EpicFishStudio6 жыл бұрын
because the function is so kink. it makes mathematicians _lip smack_ *moist*
@danielfajardo9636 жыл бұрын
Great video! It would have been great to mention that Omega is transcendental :)
@u.v.s.55836 жыл бұрын
Not that it would be a great surprise.
@kamoroso946 жыл бұрын
I love this video! This is my favorite problem in math. I came across this problem in high school on my own and couldn't figure it out right away. Eventually I came up with the idea of approximating it with e^-e^-e^…. Once I found the approximation, I googled the number and found out it was called the Omega constant, which is really cool. And it mentioned the Lambert W function, but I didn't really understand it at the time. I just love this problem because it opened my eyes to the fact you can't really solve everything with regular algebra :p
@guest_informant6 жыл бұрын
Do you know about tying a goat to the edge of a circular field so that it can eat half the grass? From memory I think that only has an approximate solution as well.
@kamoroso946 жыл бұрын
@@guest_informant oo I have not! Now I wanna try it. So you'd be solving for the length of the rope and the radius of the field, or something?
@guest_informant6 жыл бұрын
+Kyle Amoroso I first came across this about 30 years ago. Googling turned up Wikipedia ("It was first published in 1748 in England, in the yearly publication The Ladies Diary: or, the Woman's Almanack."(!)), Wolfram, StackExchange and an xkcd discussion. There is no exact solution. This formulation is from the xkcd discussion: *You have a goat tied to the fence of a circular paddock. The paddock has a radius of 100 m. You want the goat to eat HALF of the grass.* *How long does the rope need to be?* Assume an even covering of grass, and don't worry about the goat's neck length etc.
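A hedged numerical sketch of this goat problem, assuming the standard circle-circle intersection (lens) area formula and solving for the rope length by simple bisection; the function names and the bracket are arbitrary illustrative choices:

```python
import math

R = 100.0  # paddock radius in metres, per the problem statement

def grazed_area(r):
    """Area of overlap between the paddock (radius R) and the goat's
    circle (radius r) whose centre sits on the fence, i.e. a distance
    d = R from the paddock's centre."""
    d = R
    a1 = r * r * math.acos((d * d + r * r - R * R) / (2 * d * r))
    a2 = R * R * math.acos((d * d + R * R - r * r) / (2 * d * R))
    a3 = 0.5 * math.sqrt((-d + r + R) * (d + r - R) * (d - r + R) * (d + r + R))
    return a1 + a2 - a3

def solve_rope():
    target = 0.5 * math.pi * R * R    # half the paddock's area
    lo, hi = 0.01, 2 * R              # rope-length bracket
    for _ in range(100):              # plain bisection
        mid = (lo + hi) / 2
        if grazed_area(mid) < target:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

print(solve_rope())   # ~115.87 m, i.e. roughly 1.1587 * R
```

The grazed area is monotone in the rope length, so bisection is enough; the ratio rope/radius comes out near 1.1587 regardless of R.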
@ffggddss6 жыл бұрын
+ Guest Informant: That sounds like it's related to a problem I ran across in astronomy. You have a solar eclipse that will be partial where you are. You know the apparent angular diameters of both Sun & Moon (they're generally *not* equal!), and you know, at maximum eclipse (or at any particular time during it), what fraction of the Sun's apparent *diameter* will be encroached on by the Moon. Assuming both bodies to be exactly spherical, what fraction of the Sun's apparent *area* will be covered? Fred
@efulmer8675 Жыл бұрын
Another Omega constant that's also really cool is Chaitin's constant, defined as the probability that a randomly constructed program will halt. Because of its nature and its relation to the Halting Problem it is uncomputable and 'non-guessable', but it is written with a capital Omega so that there is a nice symbol to work with.
@Davidamp6 жыл бұрын
16:12 This is just b r i l l i a n t
@gamebro63373 жыл бұрын
WOW, this is my first time hearing about the Omega constant... so cool~
@gideonmaxmerling2044 жыл бұрын
Fun fact: lowercase Ω is ω, so it kind of looks like a w.
@OzzlyOsborne6 жыл бұрын
My first assignment before beginning college had Newton's Method in it, and in our math program over the summer I ran into Lambert W's all the time, and I had no idea what they meant. And now I understand what both mean. Well done, thanks for the video.
@leonardromano14916 жыл бұрын
Simply do O(n+1) = exp(-O(n)) with O(0) close to the solution (for example, 1 works); this converges quite fast towards the correct solution. This method works for slowly changing functions, as O(n+1) and O(n) become equal as n -> infinity.
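A quick sketch of that fixed-point iteration, seeded at 1 as suggested; the iteration count is an arbitrary choice:

```python
import math

x = 1.0
for _ in range(50):
    x = math.exp(-x)      # Omega is the fixed point of x -> e^(-x)
print(x)                  # ~0.5671432904...
```

Each step multiplies the error by roughly Ω ≈ 0.567, so the convergence is linear, much slower than Newton's quadratic rate, but the update is a single exponential per step.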
@RaulPrisacariu9 ай бұрын
You can check OEIS sequences A370490 and A370491 to see how you can obtain an infinite series for the Omega constant using Whittaker's root series formula. I used Whittaker's root series formula to obtain infinite series for other mathematical constants (like 1/e, plastic number, Dottie number and ln 2). Sometimes Whittaker's root series formula is a useful alternative to Newton's method. 🙂
@sparshsharma5270 Жыл бұрын
Recently had the Newton-Raphson method in Approximations in my 4th semester. The method is easy but I never knew about the Omega constant. Thanks for that.
@ndeleonn2 жыл бұрын
Thanks for the video. I would point out two things: (1) this approximation method is the same as expanding the function in a Taylor series to linear order in (x - xg), where xg is your initial guess; (2) for this function the iteration is very robust, so that even a terrible initial guess xg will converge rapidly to the answer.
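For reference, a minimal sketch of the Newton iteration being discussed, assuming f(x) = x·e^x − 1 (so f'(x) = (x+1)·e^x); the starting guess of 1 follows the video, while the tolerance and iteration cap are arbitrary:

```python
import math

def newton(x, tol=1e-15, max_iter=20):
    """Newton's method for f(x) = x*e^x - 1, using f'(x) = (x+1)*e^x."""
    for _ in range(max_iter):
        f = x * math.exp(x) - 1
        fp = (x + 1) * math.exp(x)
        x_new = x - f / fp
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

print(newton(1.0))   # 0.5671432904097838  (the Omega constant)
```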
@pablojulianjimenezcano43626 жыл бұрын
I finally understand Newton's method, thanks you so much!!!! :) #Yay
@blackpenredpen6 жыл бұрын
Pablo Julián Jiménez Cano yayyyyy!!!!
@brahmandsaraswat8674 жыл бұрын
20:50 That's BRILLIANT
@ZackJRich6 жыл бұрын
Thank you!! For 3 years I have tried to find a way to solve the equation 3^x = x^2. I knew the answer because I used a graphing calculator, but I wanted to find a mathematical way of solving the problem, and I did using this method!! I asked my professors for help but they told me it can't be done and that these problems are solved by computers. So, thank you for the enlightenment
@ffggddss6 жыл бұрын
Hmmm, looks like it has only one real solution, and it's negative .... somewhere between -1 and -½. Your professors were right if your question was taken as, "How do you solve that equation symbolically?" Numerically, it's quite soluble, and Newton's Method works just fine; so does the "secant" method. Fred
@ZackJRich6 жыл бұрын
ffggddss I knew there was a solution because the graphs of x^2 and 3^x have an intersection. The question I asked my professors was simply "If x^2 equals 3^x, how do I find x?" Too bad they don't teach us these very useful and interesting methods in high school
@ffggddss6 жыл бұрын
Some places *do* teach these sorts of things. Too many do not. The way you asked, should have elicited, *at least,* something like, "There's no analytic method, but there are many numerical techniques to solve that." Fred
@aniketeuler64433 жыл бұрын
Really enjoyed as always 😃😃
@alexismignon7839 Жыл бұрын
I didn’t know this thing. Thanks for showing it.
@prashanthkumar03 жыл бұрын
😱😱😱😲😲wow...really genius ....love it... Newton was an amazing guy ....
@luisrosano35106 жыл бұрын
"When calculus meet analitic geometry and shake hands" Is I.V.T. the Bolzano´s teorem?
@eberthenrique18685 жыл бұрын
Bolzano's theorem is a specific case of the IVT
@emmanueljosephcomargo30126 жыл бұрын
This is the first lesson in my Numerical Methods course and I didn't learn it until today! Lol PS. I already finished the course lmao
@cameronspalding97926 жыл бұрын
Making zero the subject makes things so much more convenient: if we want to solve f(x)=0 we can apply Newton's method to f(x), or we can apply factorisation; the roots of f(x) are the intersections with the x-axis, and since |x|=0 iff x=0, the roots of f(x) are exactly the points where |f(x)|=0
@WisdomVendor16 жыл бұрын
if you are into functions, which means you have a feel for them and their general shape, then all you need do is graph a few points around a suspected zero of that function and a short bit of interpolation and you will have the zero. Newton's method is the most complicated way I've seen to get this accomplished.
@DutchMathematician6 жыл бұрын
+WisdomVendor1 If you interpolate between two points on the graph that bracket the true root, then you are essentially applying one iteration of the so-called "secant method" (which can be regarded as the Newton(-Raphson) method where the derivative is replaced/approximated by a finite difference). In my (humble) opinion, this does not lead to a formula that is (essentially) easier ... As a concept, I agree with you.
@sugarfrosted20056 жыл бұрын
This might be an OK way to introduce Householder methods in general.
@isobar58575 жыл бұрын
When I did maths, ages ago, I recall there was a way to determine legitimate starting values for the Newton-Raphson method that will always ensure convergence. I will have to look up my old notes.
@jilow9 ай бұрын
For any functions f and g, we can find where f(x) * g(x) = 1 by solving 1/f(x) = g(x), so in this case: 1/x = e^x.
ln(1/x) = ln(e^x) (natural log of both sides)
ln(1) - ln(x) = x (log properties)
-ln(x) = x (simplify)
Doesn't seem as helpful as I'd hoped-- but I did graph it and got x ≈ 0.567 :).
@ZipplyZane6 жыл бұрын
The omega constant always felt redundant to me when you can just use W(1). If we didn't have it, we could actually use omega(1), which looks more like a predefined function, as it uses a Greek letter, like pi(x) and gamma(x).
@Pete-Prolly5 жыл бұрын
Not tired of scratching my head and looking to YouTube for help... not even close. "Tutor guy"? I have "BPRP guy"!! 😀
@sugarfrosted20056 жыл бұрын
Not loving a computable number being called Omega. :3
@Gold1618036 жыл бұрын
I knew there would be a Chaitin reference somewhere in the comments :)
@RobinHillyard3 жыл бұрын
Just to add a little history, Newton’s buddy Halley (of the comet) derived an approximation method involving, additionally, the second derivative. Householder generalized these methods to include the third, fourth, etc. derivatives.
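A hedged sketch of Halley's iteration for this particular equation, assuming f(x) = x·e^x − 1, f'(x) = (x+1)·e^x and f''(x) = (x+2)·e^x; this is only an illustration of the second-derivative refinement mentioned, not code from the comment:

```python
import math

def halley(x, tol=1e-15, max_iter=10):
    """Halley's method (cubic convergence) for f(x) = x*e^x - 1."""
    for _ in range(max_iter):
        e = math.exp(x)
        f, fp, fpp = x * e - 1, (x + 1) * e, (x + 2) * e
        x_new = x - 2 * f * fp / (2 * fp * fp - f * fpp)
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

print(halley(1.0))   # ~0.5671432904097838, in only a few steps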
@emanuellandeholm56572 жыл бұрын
Nice, and pretty close to the Euler-Mascheroni constant.
@idrisShiningTimes3 жыл бұрын
I cannot really express how thankful I am to you. You explained this in such a beautiful manner that now I can work with Lambert W functions for reals very easily. Thank you BPRP o7
@Jenab76 жыл бұрын
Maybe you will find x(i) very close to a local maximum or local minimum of the function before you find the root. Then your x(i+1) goes.... whoosh, far away. If the problem is persistent because the root and the extremum are very close to each other, then you will need to use *Father Wiggly's Famous Reverse Interpolation Bisection Method.*

This is a combination of the bisection method, to find a small interval that contains the root, and then (after the root is tightly contained) a 2nd-degree Lagrange interpolating polynomial is used to estimate the value of the root even more accurately. I wrote an algorithm using this method to find the eccentric anomaly of a hyperbolic transfer orbit after being given the mean anomaly and the eccentricity.

*Father Wiggly's Famous Reverse Interpolation Bisection Method* works when Newton's Method, Danby's Method, and similar methods using higher-order Taylor series fail because of skittering off the roots of the derivatives. Father Wiggly was my cat for a long time. I named the method for him.
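One possible reading of that recipe (bracket by bisection, then fit a 2nd-degree Lagrange polynomial "in reverse", i.e. x as a function of y, and evaluate it at y = 0), sketched here for f(x) = x·e^x − 1; this is an illustrative guess at the idea, not Father Wiggly's actual algorithm, and all names and step counts are made up:

```python
import math

def wiggly_style(f, a, b, bisect_steps=8):
    """Bisection to tighten the bracket, then one reverse (inverse)
    quadratic interpolation through three bracketing points."""
    fa, fb = f(a), f(b)
    for _ in range(bisect_steps):          # tighten the bracket
        m = (a + b) / 2
        if fa * f(m) <= 0:
            b, fb = m, f(m)
        else:
            a, fa = m, f(m)
    m = (a + b) / 2
    xs, ys = [a, m, b], [fa, f(m), fb]
    # evaluate the Lagrange polynomial x(y) at y = 0
    x0 = 0.0
    for i in range(3):
        L = 1.0
        for j in range(3):
            if j != i:
                L *= (0.0 - ys[j]) / (ys[i] - ys[j])
        x0 += xs[i] * L
    return x0

f = lambda x: x * math.exp(x) - 1
print(wiggly_style(f, 0.0, 1.0))   # ~0.56714...
```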
@Pianissimo3113 жыл бұрын
13:52 you don't need to be so tense, you can read the value and show us, that's ok 😂😂
@terapode6 жыл бұрын
And as always a great video.
@blackpenredpen6 жыл бұрын
Roberto Miglioli thanks!!!
@nikitakipriyanov72604 жыл бұрын
Whoa, this is a solid foundation for exploring a whole big world of fractals, Newton basins!
@nikitakipriyanov72604 жыл бұрын
O.k., I've managed to put this Lambert W into my fractal rendering program. So, here it goes. The image was obtained by solving x exp(x) = 1 with Newton's method. The color is by root reached; the shading is: the more iterations were needed to converge, the darker the point. There is a whole bunch of different complex roots (the Lambert W is a multivalued function, and the equation in question in theory has an infinite number of roots), but I only had 6 base colors cycled, so different colors are certainly different roots, but the same color doesn't necessarily mean the same root. Nevertheless, big basins of the same color really do converge to the same single root. imgur.com/a/OwB2OPr
@kieronsultana32875 жыл бұрын
1 question: how can you have an inverse function (called "W" in the video) when the graph is not a function? A function is a one-to-one or many-to-one mapping. If the graph of the complicated expression does not follow these types of mapping, it is not a function => it does not have an inverse function. Thanks!
@unfetteredparacosmian5 жыл бұрын
There are really two real W branches, W_0 and W_(-1), but there's only one for positive x
@supercr33p3r72 жыл бұрын
Man, I was coming here to explain, but both of the explanations I had in mind were beaten to by you two.
@yaleng45976 жыл бұрын
W(1) Done
@ZipplyZane3 жыл бұрын
Is there any chance that W(x) can be written in terms of Ω? It otherwise seems odd to have a constant when you could just write W(1).
@risheraghavendira60424 жыл бұрын
Is this correct?? W(1) = omega
@blackpenredpen4 жыл бұрын
Yes!!
@abdallahamouda66336 жыл бұрын
Where did you get the t-shirt?
@markhughes79272 жыл бұрын
Fascinating!
@XCyclonusX5 жыл бұрын
Sounds like Lambert just wanted a constant named after himself so he co-opted the Omega constant.
@papsanlysenko52326 жыл бұрын
Can you do a video about the logarithmic integral function? Or the exponential integral function? Or both?
@henriquefraga44866 жыл бұрын
That was the function I got on my calculus exam *-*
@davidseed29394 жыл бұрын
Note that if you are comparing schemes which converge rapidly, you should take note of the number of operations in each iteration. The following scheme is obtained from a simple rearrangement of the original equation; it takes more iterations but fewer operations: x(n+1) = exp(-x(n)). Very easy to implement on a calculator: just press [+/-] and [exp] alternately a dozen times. Since we know the root is between 0 and 1, start at 0.5. Note that inverting the equation to give x(n+1) = -ln(x(n)) is unstable, and although this can be analysed (see "radius of convergence"), it's often easier just to try it, and if it doesn't converge then invert the equation. For equations which are difficult to differentiate (e.g. a complicated expression, or just a table of results that you interpolate with cubic splines), the chord method is best: start with two points either side of a root, find where the chord intersects the axis, and use that as a new point, replacing the old point that lies on the same side of the x-axis.
@ramzimay96695 жыл бұрын
Thank you. Good lecture
@vishalmishra30465 жыл бұрын
Use Newton's method after taking ln (natural log) of both sides => x + ln(x) = 0 => converges faster, with fewer and computationally cheaper iterations.
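A small sketch of that variant, assuming g(x) = x + ln(x) with g'(x) = 1 + 1/x; the starting value and iteration count are arbitrary, not from the comment:

```python
import math

x = 0.5
for _ in range(6):
    g, gp = x + math.log(x), 1 + 1 / x   # g(x) = x + ln x, g'(x) = 1 + 1/x
    x -= g / gp
print(x)   # ~0.5671432904097838
```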
@giorgiomicaglio6 жыл бұрын
Awesome video, blackpenredpen😍🔝🔝
@moonwatcher20014 жыл бұрын
Awesome, thanks!!!
@ThePeterDislikeShow3 жыл бұрын
I'd like to try and calculate this constant to lots of digits. What are some of the ways that have been used to calculate this? It doesn't look like it has a good power series or such.
@brunoamezcua31126 жыл бұрын
amazing video, as always
@DonSolaris6 жыл бұрын
OMG!! He shrunk the black ball! Is he a voodoo priest or something?
@brechtbollaert81876 жыл бұрын
where can you get the blackpenredpen shirts?
@CT-pi2gl2 жыл бұрын
I solved the question in the thumbnail image by expanding the function into x(1+x+x^2/2...)-1=0, and then using Newton's Method with only those first 3 terms. It converged very quickly!
@hklausen4 жыл бұрын
Finally I understand Newton's method. Thanks :-)
@rob8766 жыл бұрын
x[n] = exp(-x[n-1]) is another iterative scheme but doesn't converge as fast.
@mohamedelouajrachi663 жыл бұрын
It's important. Thanks
@adriftinsleepwakefulness70394 жыл бұрын
Could you please make a video explaining gradient descent and convex problems?
@Rundas694206 жыл бұрын
Nice one. Is it possible to get the formula for X sub n in explicit form? Would be nice.
@brunoalejandroandrades3546 жыл бұрын
Crystal-Math it'd be pretty cool, but I don't think you can; otherwise you would have a nice limit definition for omega, and I believe bprp would've pointed it out
@vitakyo9826 жыл бұрын
1/e^(1/e^(1/e^(1/e^(1/e^... It's an infinite nested exponential.
@xamzx92816 жыл бұрын
On x_3 I thought it was going to be γ, which equals 0.577
@alanturingtesla6 жыл бұрын
Well, this is great, with this W we can now actually find an inverse for all those exponential-polynomial mixes!
@alanturingtesla6 жыл бұрын
In terms of W, of course.
@woulzername6 жыл бұрын
!!! No supreme shirt?!!! great vid btw
@blackpenredpen6 жыл бұрын
lemon Danish that fell on the floor : )
@captaintwist70054 жыл бұрын
loved it
@swajag46535 жыл бұрын
You can just rewrite this as e^x = 1/x: the point of intersection between e^x and the hyperbola 1/x, with x in (0,1)
@EtherDais5 жыл бұрын
You can also represent Omega as an infinite nest of -Ln(-Ln(-Ln(...)))
@ignacioignacio1328 Жыл бұрын
I need to solve this: e^x - ln(x+3). Thanks!
@goodboyneon Жыл бұрын
Doraemon music in the background, brings back so many memories!
@Marcos-bo6vi6 жыл бұрын
Thanks, bro!
@gabeherophenom4066 жыл бұрын
x=W(1)
@SuperYoonHo2 жыл бұрын
Newton=New ton=Knew a Ton!
@abdulrahmanradwan61674 жыл бұрын
Thanx
@bagelnine93 ай бұрын
But why is there a constant that is equal to W(1)? Does it have a practical implication?
@arielfuxman88684 жыл бұрын
What about getting a second order approximation for the function using the second derivative and getting a faster converging approximation?
@ThePhenomBot6 жыл бұрын
Doraemon tune at the start ❤️
@nischay47606 жыл бұрын
Yea :D
@almaska823 жыл бұрын
You have to find the second derivative to use this formula. Its sign determines whether you need to start the approximation from the left or from the right.
@mostafaahmednasr6215 жыл бұрын
Is this Newton-Raphson approximation right?
@stevenwilson55564 жыл бұрын
This was a good explanation of how to get omega, but you didn't talk much about what this function is useful for. Please consider mentioning that in the future if you know why certain functions are helpful or useful for some application.
@karstenmeinders48446 жыл бұрын
I wonder if there is an algebraic expression for Omega, not only the approximation shown in the video. Great stuff nevertheless!
@blackpenredpen6 жыл бұрын
Karsten Meinders W(1)
@trangium6 жыл бұрын
How about a series, only using + - * /, roots, and logs?
@karel85876 жыл бұрын
Make a Newton's sums video please
@clivegoodman164 жыл бұрын
When I was at University, it was called the Newton-Raphson method. I wonder why Raphson is no longer mentioned.
@warrickdawes79006 жыл бұрын
It's also fun that the solution to x.ln(x)=1 is 1/W(1).
@davidhitchen53693 жыл бұрын
A quick and dirty way to solve for this is to solve x = exp(-x) numerically by iterating exp(-x) until it converges on a solution. Seed at 1. The first approximation is 1/e. After 22 iterations it's at 0.56714. It converges way slower than Newton, but if you have a spreadsheet you can set it up and calculate it in seconds.
@kentw31373 жыл бұрын
Amazing
@zackeriaeslynesjbrautccie42832 жыл бұрын
Love it
@markolazarevic42096 жыл бұрын
Can you do a video on how to use the Lambert W function to evaluate solutions for things like xe^x=n and x^x=n where n is different from 1?
@blackpenredpen6 жыл бұрын
Yea, you can check out my new videos for them!
@hipepleful3 жыл бұрын
Are there any other uses of the omega constant? Does it pop up in any math field?
@mattgsm6 жыл бұрын
Solve y*(e^y)=x next!
@blackpenredpen6 жыл бұрын
I will find time to do that.
@Minecraftster1487906 жыл бұрын
blackpenredpen I doubt that
@blackpenredpen6 жыл бұрын
Minecraftster148790 ok.
@DutchMathematician6 жыл бұрын
+Matt GSM Actually, this equation is easy to solve (in explicit form even) by using Lambert's W function. For real x ≥ -1, the function F(x)=x*e^x is strictly increasing and therefore has an inverse, which is (the real-valued, restricted) Lambert W function. Hence, W(x*e^x)=x by definition of the W function. If we apply the function W to both sides of the equation y*e^y=x, we therefore get the simple expression y=W(x) (assuming y ≥ -1 and x ≥ -1/e). Note that in the video, the special case x=1 was handled. I can imagine most readers will argue that Lambert's W function is not seen as an elementary function (it is implicitly defined by the relation W(x*e^x)=x) and that they might therefore not agree with the term "solved" (read as: "solved in terms of elementary functions"). But isn't the function ln(x) also implicitly defined? And how are exponential functions defined for irrational arguments? Both functions, however, are generally regarded as elementary functions. Just some food for thought...
@AlgyCuber6 жыл бұрын
y = W(x)
@channelbuattv Жыл бұрын
Surprisingly, the constant has an infinite representation e^-e^-e^-e^-e^-e^-e^-..., because it's a solution of x = e^-x, i.e. e^x = 1/x
@alexismandelias6 жыл бұрын
I'm Greek. I write capital Ω all the time. He makes them better...
@nischay47606 жыл бұрын
You don’t bother to make them good because you make them all the time.
@barryzeeberg367227 күн бұрын
I have a philosophical question. Consider sin(x) and W(x). I think of sin(x) as 'natural' and W(x) as 'artificial' or 'concocted' or perhaps 'ad hoc'. But I wonder if sin(x) and W(x) are both really on the same footing, and I am just biased because sin(x) is more familiar to me?