Memorising Graham’s Number Creates Black Holes | Entropy

5,118 views

mindmaster107


A day ago

Comments
@desicmanifold4025
@desicmanifold4025 1 year ago
As always, immensely impressed by your clarity of explanation and the utter charm you convey it with! The idea of numbers having mass/energy is a concept I'd never heard of in that context before; I'm never going to forget it now.
@skadoosh1729
@skadoosh1729 1 year ago
This is so underrated. Your voice is akin to 3b1b's actually
@TheSummoner
@TheSummoner 1 year ago
That's a great overview of all the different faces that entropy takes! Personally my favorite application of the notion of entropy is the so-called maximum entropy principle, used to decide which probability distribution best fits what is observed about a phenomenon.
@tcaDNAp
@tcaDNAp 1 year ago
It took me a year to get back to both SoME videos on this channel, but now I'm hooked on these deeper connections 🤝
@hedgehog3180
@hedgehog3180 9 months ago
Slight correction: Carnot did not have the concept of entropy, since he believed in the caloric theory of heat. Therefore, in his original description of the Carnot cycle the engine takes out as much heat, Q, from the hot reservoir as it returns to the cold reservoir. So he didn't believe that a heat engine does work by extracting heat from a heat difference, and the concept of efficiency, how much work the engine can extract from the heat, did not exist in his conception. He did sorta prefigure the idea of the second law of thermodynamics with his proof that no heat engine can be more efficient than the equivalent Carnot engine, but since he didn't conceive of heat as energy, he also didn't think that the heat the engine delivered to the cold reservoir was lost energy. The version of the Carnot cycle you have up, and the formula for the Carnot efficiency, were conceived by Clausius, who reconciled Carnot's work with the modern molecular theory of heat, coined the term entropy, and gave the most common formulation of the laws of thermodynamics, so he probably deserves a lot of the credit. One thing that's sorta neat is that Carnot described a heat engine as "something that interrupts the free fall of heat", which is very close to a more modern understanding of heat "falling" from a state of low entropy to a state of high entropy, with heat engines accelerating that fall by extracting work from the heat difference.
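For reference, the Carnot efficiency formula the comment refers to, in its modern (Clausius) form, with a purely hypothetical worked example:
η_Carnot = W / Q_hot = 1 - T_cold / T_hot
For instance, with T_hot = 500 K and T_cold = 300 K, η = 1 - 300/500 = 0.4, so at most 40% of the heat drawn from the hot reservoir can become work; the rest must be rejected to the cold reservoir.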
@matiasnovabaza8208
@matiasnovabaza8208 1 year ago
This year it happened to me that, looking for videos about entropy, I couldn't find anything that actually explains it, and I think you did an excellent job here. Thanks!
@mindmaster107
@mindmaster107 1 year ago
This is the first super comment I've ever got! Thank you very much! I don't make videos too often, but I make sure that each one is a treat worth waiting for.
@susulpone
@susulpone 1 year ago
I love your voice; it makes listening to these concepts so pleasant.
@mindmaster107
@mindmaster107 1 year ago
I appreciate the compliment :D I'm thinking of doing slightly different formats of videos, for instance a 1-hour, lightly edited talk on science history. It seems like you might enjoy that.
@fluffy_tail4365
@fluffy_tail4365 1 year ago
Another banger, you can explain certain concepts with an energy that is contagious.
@mindmaster107
@mindmaster107 1 year ago
I have the energy of a perpetual motion machine. Thank you so much for your comment!
@Jaylooker
@Jaylooker 1 year ago
Wick's rotation connects entropy to quantum mechanics by way of statistical mechanics. The prime number theorem can be defined using the offset integral Li(x) = ∫ li(z) dz. Notably, Li(x) bounded between 0 and 1 equals -ln 2, like the information content defined at 7:45 and like how probability (and information) were defined as S = k_B ln W at 9:22. Also, Chebyshev's functions for prime numbers are defined similarly to Shannon's entropy at 8:38. This suggests the primes follow some entropy law and randomness. Thermodynamics, and the dissipation it entails through entropy, have solutions that are described using Gaussians and Fourier series. These solutions generalize to harmonic analysis, automorphic functions, and automorphic forms such as modular forms, and provide a mathematical basis for entropy.
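For reference, the two entropy definitions those timestamps point to, in their standard textbook forms (paraphrased here, not quoted from the video):
S = k_B ln W   (Boltzmann entropy over W microstates, 9:22)
H = -Σ_i p_i log_2 p_i   (Shannon entropy of a distribution, 8:38), with -log_2 p being the information content of a single outcome of probability p.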
@sgtstull
@sgtstull 1 year ago
This channel is disgustingly underrated.
@mindmaster107
@mindmaster107 1 year ago
Thanks so much for the kind word!
@arminhrnjic1678
@arminhrnjic1678 1 year ago
One of the greatest entries for SoME3! Hope you win!
@mindmaster107
@mindmaster107 1 year ago
Thank you so much for the kind words! I feel like I have already watched and voted on 25 entries that are better than mine, but I'll keep my fingers crossed, as you never know.
@ClemoVernandez
@ClemoVernandez 1 year ago
Great to see you back uploading! :)
@mindmaster107
@mindmaster107 1 year ago
Good to see you comment how happy you are!
@momothx1
@momothx1 1 year ago
I wish I had you as a teacher! You have such a nice way of providing information, and you're passionate about what you do. Keep up the good work ❤
@mindmaster107
@mindmaster107 1 year ago
Thanks for the comment!
@EntropicalNature
@EntropicalNature 1 year ago
Great video. Just a note on the zeroth law (without trying to be pedantic): it's more a law which defines the notion of temperature, and its accompanying scale, without the explicit invocation of entropy. It can be stated as: "if a body C is in thermal equilibrium with two other bodies, A and B, then A and B are in thermal equilibrium with one another." It seems like a moot observation, but it ensures we can safely do our maths and calculate things like pressure, chemical potential or other nice thermodynamic properties between systems, and say something useful about them, IF the systems are in thermal equilibrium, i.e. have the same temperature.
@mindmaster107
@mindmaster107 1 year ago
Absolutely! It's part of why it's called the zeroth law, as it's often just assumed and moved on from.
@arbodox
@arbodox 1 year ago
Yay, another mindmaster upload! Great video as always, I appreciate your in-depth and clear explanations of physics topics. I'm looking forward to revisiting your videos when I begin my physics undergrad next month. :D
@mindmaster107
@mindmaster107 1 year ago
Good luck on your university journey! Remember to enjoy both the learning, and meeting amazing people along the way.
@pyrenn
@pyrenn 1 year ago
Same for me as well! On all the points
@dcs_0
@dcs_0 1 year ago
great video, it serves beautifully as a sequel to the Veritasium video on the same topic, diving more into the actual mathematics behind entropy after getting a feel for what it is, which is insanely satisfying lol
@mindmaster107
@mindmaster107 1 year ago
Glad you enjoyed it.
@docopoper
@docopoper 1 year ago
I love this channel.
@mindmaster107
@mindmaster107 1 year ago
I love you.
@docopoper
@docopoper 1 year ago
@@mindmaster107 Awwwww
@cycklist
@cycklist 1 year ago
Wonderful. Thank you.
@SuperMarioOddity
@SuperMarioOddity 1 year ago
This is some sci-fi shit
@omicrontheta38
@omicrontheta38 1 year ago
This video was incredible
@mindmaster107
@mindmaster107 1 year ago
I’m glad all my effort was worth it
@omicrontheta38
@omicrontheta38 1 year ago
@@mindmaster107 Entropy has been a big source of confusion for me for ages, and this video was perfect! I am really grateful for all your talent and effort with your videos.
@hedgehog3180
@hedgehog3180 9 months ago
You could totally make a video just about thermodynamics, because after studying it I've found that most popular understandings are slightly wrong in a way that critically skews perception. To give some examples:
The first law is often stated as "energy can neither be created nor destroyed, only transformed". This is not wrong, but it's basically the same idea as conservation of energy, which is just a basic physics thing, not really a thermodynamics thing. The formulation I prefer is "the internal energy of a system is equal to the heat added to it plus the work performed on it", or U = Q + W. I prefer this formulation because it actually makes a statement about work and its relationship to heat, and it clarifies the concept of internal energy as distinct from just heat. All of this is way more useful when doing thermodynamics.
The second law has a similar problem. The most popular formulation is "the entropy of the universe tends towards a maximum", or something similar, but this formulation kinda says nothing: what is entropy? (I know, wrong video to say that) And why does it increase? A much better formulation is "it is impossible to realize a reversible cyclic process where work is performed by extracting heat from a single reservoir that remains at the same temperature". This of course sounds like nonsense, but if you understand the Carnot cycle it basically boils down to saying "no engine can be more efficient than the equivalent reversible Carnot engine", and that of course means that a heat engine must deliver some amount of waste heat to the cold reservoir. Another formulation that is also somewhat common, and in my opinion pretty good, is "heat cannot flow from a cold body to a hot body without work being performed". You can see how this is equivalent to the other one I liked by a thought experiment where you have a Carnot engine and some magical substance that can transfer heat from a cold body to a hot body: what you end up with is the cold reservoir remaining at the same temperature while all of the heat energy of the hot reservoir gets turned into work.
Other than that, actually putting the Carnot cycle in its proper historical context is really interesting. Carnot was trying to improve steam engines, and if you just take the conclusions of the Carnot cycle you can explain basically all the technological developments of the steam engine. Firetubes in boilers are a way to raise the temperature of the hot reservoir; compound expansion engines are a way to let the steam undergo adiabatic expansion for as long as possible and thus get as close to its condensation temperature as possible; and the limiting case of an extremely high number of pistons is basically just a steam turbine, which is why they are so efficient. Some early steam engines had their pistons contained inside the boiler, but this obviously means there is direct contact between the hot and cold reservoirs, so it made the engine less efficient, even though it seems like a smart way to provide insulation. Superheated steam is another effort to make the engines as reversible as possible: the Carnot engine assumes an ideal working gas, and wet steam is very much not an ideal gas (which follows intuitively from the kinetic theory of heat), but superheating the steam makes it act more like an ideal gas.
Maybe I'm just saying all of this because I just wrote about it, but I think it could make for a good video; if I have time at some point I'd probably give it a shot myself.
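A minimal numerical sketch of the two formulations above, with made-up reservoir temperatures and heat input (the numbers are hypothetical, just to make the bookkeeping concrete):
    # First law over one full engine cycle: internal energy returns to its start (dU = 0),
    # so the work delivered equals heat taken in minus heat rejected: W_out = Q_hot - Q_cold.
    T_hot, T_cold = 600.0, 300.0    # hypothetical reservoir temperatures, kelvin
    Q_hot = 1000.0                  # hypothetical heat drawn from the hot reservoir per cycle, joules
    eta_max = 1.0 - T_cold / T_hot  # Carnot limit: no engine between these reservoirs does better
    W_out_max = eta_max * Q_hot     # best possible work per cycle
    Q_cold_min = Q_hot - W_out_max  # waste heat that must still be rejected (second law)
    print(eta_max, W_out_max, Q_cold_min)  # 0.5 500.0 500.0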
@mindmaster107
@mindmaster107 9 months ago
Genuinely, make that video! I made my videos because I found no one doing it at this level of understanding. If you want to take it to the next level, you have my full encouragement!
@hedgehog3180
@hedgehog3180 9 months ago
@@mindmaster107 Thanks!
@filker0
@filker0 1 year ago
Though it isn't comprehensive, I think this video is very good at explaining information entropy and how that relates to the heat death of the universe. Unfortunately, this itself contributes to the entropy of the universe, which as far as we know is a closed system, thus hastening the aforementioned heat death by some tiny amount...
@mindmaster107
@mindmaster107 1 year ago
Think about how daydreaming speeds up the end of the universe
@NiceGameInc
@NiceGameInc 1 year ago
If we already know the heat death of the universe will eventually come, why don't we stop right there on the spot and call it a day? In my humble opinion, entropy is still very poorly defined, and I won't go into detail here on why I think so.
@oosmanbeekawoo
@oosmanbeekawoo 1 year ago
@@NiceGameInc Hey, can I ask you to explain how? I know we don't have a definitive definition for entropy like we have for work. But the only reason I think entropy is poorly defined is that I can list all the conflicting definitions of entropy from textbook to textbook. I can't find an 'authoritative' explanation of why entropy is poorly defined. If you have something to say, please do so!
@whatitmeans
@whatitmeans 2 months ago
I think there is something not quite precise here: the Boltzmann entropy k_B ln(W) is not equal to the Shannon entropy, which is instead the average entropy per particle. By the way, there is a simpler intuitive definition of entropy in information theory: read the first 13 pages of the paper by R.V.L. Hartley, "Transmission of Information" (1928). I think Shannon took the idea from it.
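For reference (standard definitions, not taken from the video): Hartley's 1928 measure is H_0 = log_2 N bits for N equally likely messages, which Shannon later generalised to unequal probabilities; and for a system of N independent, identical particles the thermodynamic entropy relates to the per-particle Shannon entropy H (in nats) by S = N k_B H, which is the "average entropy per particle" distinction being drawn above.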
@mohamedmouh3949
@mohamedmouh3949 8 months ago
thank you sooo much 🤩🤩🤩
@lake5044
@lake5044 1 year ago
Can someone help me? I got curious about the maximum number of bits that can be stored in a 1 kg mass. I used the equations here at 15:11 (replacing ln(10) with ln(2)) and got that about 4456 TB is the maximum in one kg of mass. Hmm, is that correct?
@mindmaster107
@mindmaster107 1 year ago
Yep! Keep in mind that the maximum number of bits increases with mass squared, so it isn't a directly proportional relationship.
@mindmaster107
@mindmaster107 1 year ago
I feel like I should go into more detail here. Increasing the volume within which information can be stored itself increases entropy. You can imagine increasing the volume of a cloud of gas. Therefore, while it is correct that a 1 kg black hole will store that amount of data, any storage device larger than that could eclipse it. My minimum-mass argument holds up when considering larger numbers, where the black hole radius is larger than human scales, say Graham's number, whose black hole would be much larger than Earth. For 1 kg, the black hole is smaller than a proton, so we have a bit of wiggle room there.
@lake5044
@lake5044 1 year ago
@@mindmaster107 What do you mean by "storage devices larger than the volume of a 1 kg black hole would eclipse it"? Also, does the max information in a black hole depend on its mass? Its surface area? Or volume? Meaning, I assumed that the max info in a volume is the same as the info of a black hole of the same volume, but is that true?
@mindmaster107
@mindmaster107 1 year ago
@@lake5044 For a given volume, the maximum information that can be stored is absolutely that of a black hole of that size. For a given mass, because spreading out mass increases entropy, and 1 kg of gas can take up loads more space than a 1 kg black hole, this relationship isn't strictly valid. It still validates the fact that memorising numbers generates heat.
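A back-of-the-envelope check of the number discussed in this thread, using the Bekenstein-Hawking entropy of a 1 kg black hole rather than the exact expression shown at 15:11, so it only roughly reproduces the 4456 TB figure:
    import math
    # Bekenstein-Hawking entropy of a black hole of mass M: S = 4*pi*G*M^2*k_B / (hbar*c).
    # Dividing by k_B*ln(2) turns that entropy into a number of bits.
    G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
    hbar = 1.055e-34   # reduced Planck constant, J*s
    c = 2.998e8        # speed of light, m/s
    def max_bits(mass_kg):
        return 4 * math.pi * G * mass_kg**2 / (hbar * c * math.log(2))
    bits = max_bits(1.0)    # note the mass-squared dependence mentioned in the first reply
    print(bits)             # roughly 3.8e16 bits
    print(bits / 8 / 1e12)  # roughly 4.8e3 TB, the same ballpark as the figure in the question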
@2wr633
@2wr633 1 year ago
I don't get how the number of possible microstates can be maximised; isn't the number of possible microstates of a closed system a constant?
@mindmaster107
@mindmaster107 1 year ago
I didn't spend as much time as I should have on this part of the video, so I'll go into detail here. The number of microstates, for a given amount of energy and a closed system, should absolutely be a constant. However, we can sub-divide groups of microstates even further into microstates corresponding to a set energy, volume, etc. In the video, I assumed that since the two gases were being mixed, we could ignore all these other factors, since both gases shared the same conditions. This is absolutely NOT assumable in the general case, which is why the full derivation is much more thorny. You can find a video from Viascience on this, as his videos are a beautiful showcase of the mathematics.
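As a concrete instance of the mixing case mentioned in that reply (a standard ideal-gas result, not taken from the video): mixing n moles of two ideal gases at the same temperature and pressure, with mole fractions x_A and x_B, increases the entropy by ΔS_mix = -nR(x_A ln x_A + x_B ln x_B), so a 50/50 mix gives ΔS_mix = nR ln 2; the combined system has more accessible microstates even though each gas on its own was already at equilibrium.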
@hazimahmed8713
@hazimahmed8713 5 months ago
Why did you stop uploading? Your videos are very good.
@mindmaster107
@mindmaster107 5 months ago
I'm still around, just things are currently busy in my life. A video will come one day, don't worry :D
@OliBomby
@OliBomby 1 year ago
Entropy, thermodynamics, probabilities…to a common man like myself, these words mean less than dirt. But in the hands of a physician? Let’s just say, things can get interesting.
@mindmaster107
@mindmaster107 1 year ago
Physics isn't so great? Are you kidding me? When was the last time you saw a subject with such depth and mathematical elegance? Physics puts the game on another level, and we will be blessed if we ever see a subject with such potential, skill and passion for science again. Chemistry breaks records. Biology breaks records. Physics breaks the rules. You can keep your experiments. I prefer the magic.
@petevenuti7355
@petevenuti7355 1 year ago
​@@mindmaster107 breaks the rules? Don't you mean makes the rules?
@tcaDNAp
@tcaDNAp 1 year ago
Is this one more reason that it's impossible to know everything about anything? I hope UpAndAtom enjoys this video
@mindmaster107
@mindmaster107 1 year ago
This is one more reason :D
@HimanshuSingh-ej2tc
@HimanshuSingh-ej2tc 1 year ago
loved it
@BorisNVM
@BorisNVM 3 months ago
cool
@johanyim3097
@johanyim3097 1 year ago
Your math warning was 4 minutes and 17 seconds too late into the video
@mindmaster107
@mindmaster107 1 year ago
shhhhhhhhh