That is why I don't like order/disorder to describe entropy. It creates this confusion.
@mohammadsadeghashrafpoor6737 a day ago
Coooool
@nathanielibabao8528 a day ago
Great Visualization!
@davidgillies620 a day ago
The thing to remember is that entropy is measuring _distinguishable_ microstates.
@mahtpav6389 2 days ago
So the car is always at least... 2 steps ahead?
@MarbleScience 2 days ago
@mahtpav6389 No, every position is possible. Some are just very unlikely.
@seabassvn 2 days ago
The crystalline to liquid analogy is wrong. There is inherently more entropy in a liquid vs a crystal at the same temperature.
@MarbleScience 2 days ago
“Liquid” indeed has more entropy compared to “crystalline”. That is also what I am saying in the video! The reason why it has a higher entropy is that more of the possible microstates fall into the “liquid” category than into the “crystalline” category.
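The microstate-counting argument in this reply can be checked with a toy model. The model below is an illustrative assumption of mine, not something from the video: N two-state particles, where "crystalline" means all particles in the same state and "liquid" means anything mixed.

```python
from itertools import product
from math import log

# Toy model (my assumption): 10 two-state particles.
N = 10
microstates = list(product([0, 1], repeat=N))

crystal = [s for s in microstates if len(set(s)) == 1]  # perfectly ordered
liquid = [s for s in microstates if len(set(s)) > 1]    # everything else

print(len(crystal), len(liquid))  # 2 vs 1022 microstates
# Boltzmann entropy per macrostate, S = ln(Omega), with k_B = 1:
print(log(len(crystal)), log(len(liquid)))
```

The "liquid" macrostate wins simply because far more microstates fall into it, which is exactly the point of the reply above.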
@TheRealJoseramirez 2 days ago
In the great scheme of things, does it matter?
@michaeljames5936 2 days ago
Thank you. I agree completely. I always thought the idea of entropy privileged the observer as some arbiter of information.
@danstrayer111 2 days ago
This is the husband from Blacklist.
@scottferson170 3 days ago
I think this perfectly exemplifies the absurdity and self-torture of trying to use a single probability distribution to characterize knowledge. I assure you, MarbleScience, that changing from a crystal to a liquid is a physical change in reality, not merely a perception. If state changes of matter were not physically real, steam engines wouldn't work unless we were paying attention to them. There is a better way to express what is known. The starting 'distribution' in the third round is not three equal masses of one third at each of three locations, but rather three imprecise masses at each location. Each of these is an interval between zero and one corresponding to your ignorance about whether the marble is there. Same for the ending distribution in the second round. We might call such an interval the 'dunno' interval because we don't know whether it's a zero or a one. That's what's needed to represent our ignorance here. The result is called an imprecise probability distribution, which is essentially a set of many possible distributions. A single probability distribution cannot really characterize ignorance about the marble's location. And entropy is not a full description of all the uncertainty. Entropy is not a measure of how unspecific our descriptions are, but instead a characterization of physical reality. It is interpreted as the evenness of the distribution of the actual states of the system's components over their possible states. Some analysts suggest complementing Shannon entropy with a discrete or continuous version of the Hartley measure of ignorance to create a fuller description that includes both this evenness and our ignorance.
@Astronomator 2 days ago
This is a great comment. It's as if the video is asserting a "Bell's Theorem" approach to reality based on ignorance rather than on the non-existence of unmeasurable properties. Quantum mechanics is confusing enough. Let's not complicate entropy more than necessary by needlessly incorporating the mindset of the observer. Again, great comment, and I agree completely.
@MarbleScience 2 days ago
The reason why macroscopic processes evolve towards high entropy is just simple statistics. If, for example, one macrostate comprises 1,000,000 microstates and another just 2, it is simply very likely to end up in one of the 1,000,000 microstates of the first macrostate. It is the same reason why we usually don't win the lottery, and sure, it doesn't matter if you are watching the process or not. It is not exactly a steam engine, but I encourage you to watch my video about the entropy driven car: kzbin.info/www/bejne/hoG6dqCgo9amfJo
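The lottery statistics in this reply are easy to reproduce. A quick sketch (the macrostate sizes are taken from the comment; the sampling code is my own illustration):

```python
import random

random.seed(0)
big, small = 1_000_000, 2  # microstates in each macrostate (from the comment)
total = big + small
trials = 100_000

# Draw a microstate uniformly at random and check which macrostate it belongs to.
hits_small = sum(random.randrange(total) < small for _ in range(trials))
print(hits_small / trials)  # expected ~2e-6: the tiny macrostate almost never appears
```

No observer is involved anywhere in the sampling, which is the point of the reply.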
@Astronomator 2 days ago
@ That's a great video! Thank you for posting. You've gained a new subscriber! I hope your job search goes well.
@salvamarin8381 3 days ago
Thank you Wally!!!!!!!!!!!!!!!!!
@jareknowak8712 3 days ago
👍
@asifsiddiqui1347 4 days ago
Very nicely done. Brilliant work. Pls keep it up.. ❤❤❤
@WWld 4 days ago
Great!!
@sebastiandierks7919 4 days ago
Upright format? Did you try to hide the disorder in the rest of your room, or do you try to appeal to a younger audience? xD Anyway, great video :)
@MarbleScience 4 days ago
No, actually I wanted to make a short and then it turned out longer than expected 😄
@nirn_ 5 hours ago
@@MarbleScience a simple idea for 5 seconds -> a 4-minute video :D
@Zero-cc4yg 4 days ago
Great Video! :>
@cezarcatalin1406 4 days ago
One could guess where the ball is in the last example because the person knowing which cup holds the ball will tend to treat that cup differently from the other two: either move it around more than the others or less, but almost never in between. So, ironically, mixing the cups for long enough can reveal where the ball is even if you started with zero knowledge of its position - this is because your brain tends to leak information unintentionally. Quantum mechanics (as in the standard model) and classical thermodynamics are ironically incompatible because of something similar. QM is fundamentally "leaky" even with all the inherent wave-particle duality, which mathematically results in the Heisenberg uncertainty principle. This means that, at least according to our current understanding of QM, we should see negative changes in entropy all the time - which do happen in quantum systems; it's just that in large "classical systems" entropy sort of emerges out of thin air (technically called decoherence in QM terms).
@josephbrenner8248 4 days ago
It's all about the context.
@marius5361 4 days ago
You explain a subjective view of entropy. Physical entropy (thermodynamic entropy) is an objective quantity that doesn't depend on what an observer knows. Even if no one is watching, a gas expanding into a vacuum increases in entropy because there are more possible microstates available. If you gained knowledge (by looking at the marble), your personal uncertainty would decrease, but the physical entropy of the system wouldn't decrease unless an external energy input allowed it.
@MarbleScience 4 days ago
I would like to answer with a paragraph of E.T. Jaynes' 1996 paper about the Gibbs paradox: "There is a school of thought which militantly rejects all attempts to point out the close relation between entropy and information, claiming that such considerations have nothing to do with energy; or even that they would make entropy "subjective" and it could therefore have nothing to do with experimental facts at all. We would observe, however, that the number of fish that you can catch is an "objective experimental fact"; yet it depends on how much "subjective" information you have about the behavior of fish. If one is to condemn things that depend on human information, on the grounds that they are "subjective", it seems to us that one must condemn all science and all education; for in those fields, human information is all we have. We should rather condemn this misuse of the terms "subjective" and "objective", which are descriptive adjectives, not epithets. Science does indeed seek to describe what is "objectively real"; but our hypotheses about that will have no testable consequences unless it can also describe what human observers can see and know. It seems to us that this lesson should have been learned rather well from relativity theory. The amount of useful work that we can extract from any system depends - obviously and necessarily - on how much "subjective" information we have about its microstate, because that tells us which interactions will extract energy and which will not; this is not a paradox, but a platitude. If the entropy we ascribe to a macrostate did not represent some kind of human information about the underlying microstates, it could not perform its thermodynamic function of determining the amount of work that can be extracted reproducibly from that macrostate."
@marius5361 3 days ago
@@MarbleScience Jaynes makes a great point about entropy being linked to information, and that perspective is useful in some contexts. But in thermodynamics, entropy is still an objective property of physical systems. A gas expanding into a vacuum increases in entropy, no matter what we know about it. The connection to information is real, but entropy isn't just about our knowledge; it also reflects the physical reality of systems.
@imghoti 5 days ago
Holy shit! That's GENIUS! I've thought along those lines for a very long time but never heard (or even thought) it expressed so concisely. You need to run with this and write a book!
@iw9cpl 5 days ago
Thank you very much.
@leo.dupire 7 days ago
This is a phenomenal explanation.
@liviu-deacu 7 days ago
Congratulations on the didactic presentation! Very nice way to reveal the intended idea. Otherwise, it's about time you realized that entropy has to do with information, and as such, with life. Information that has to do with the theory of systems. Systems that always mean life. Of course, without living beings there is no such thing as entropy. Entropy only exists from the perspective of a complex system, of a living being. It is what I wrote to you once per email. This is why we need the BIOLOGICAL point of view to judge entropy. Physics itself would not exist without living beings capable of observing "non-living" matter, if there is such a thing.
@captheobbyist6434 8 days ago
So that means everything in the universe has infinitely high entropy? If you think about it, it's practically impossible to measure an object with 100% certainty. Edit: wait, THIS VIDEO ONLY HAS 70 LIKES AND 4 COMMENTS? This is criminally underrated! This video deserves more.
@OliviaSNava 8 days ago
Well, the thing is, we can never measure absolute entropy. It's always about a change in entropy. We don't know how much entropy a piece of wood has, but if we burn it, its entropy increases.
@MarbleScience 8 days ago
I wouldn't say that things in the universe necessarily have infinitely high entropy, but we will probably never know their absolute entropy values. That's why, outside of toy models, we only work with entropy differences, not with absolute values.
@PaulMurrayCanberra 3 days ago
Put 99 green balls and 1 blue ball into an urn, shake, and draw one. You don't 100% know what colour you are going to get, but that's not the same as not knowing anything about the colour you are going to get.
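This partial-knowledge point can be made quantitative with Shannon entropy. A small sketch of mine, using the 99-green/1-blue urn from the comment:

```python
from math import log2

def shannon_bits(probs):
    # Shannon entropy H = -sum(p_i * log2(p_i)), measured in bits
    return -sum(p * log2(p) for p in probs if p > 0)

print(shannon_bits([0.5, 0.5]))    # 1.0 bit: a fair coin flip, no prior knowledge
print(shannon_bits([0.99, 0.01]))  # ~0.08 bits: far from 100% certain, but you know a lot
```

The 99/1 urn is not at zero entropy, yet it is nowhere near the maximum either, which is exactly the distinction the comment draws.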
@StoryTime-w4s 8 days ago
I actually did 100% know that the ball was on the left side (your right side). No, I didn't break any thermodynamic laws, I simply heard the marble moving when you moved that cup 💀💀💀💀
@doyoureallyneedaname 8 days ago
Ok, so entropy is subjective then 😅
@MarbleScience 8 days ago
That's interesting, because in the last round I actually did not put a marble under any cup, exactly because I wanted to avoid the loud rolling noise 😄
@marcom.7685 9 days ago
Then, over time, as our measurement precision and capabilities increase, can we consider that for a given problem we might have negative entropy by measuring it better?
@MarbleScience 8 days ago
A measurement can decrease entropy in one part, but to get the complete picture, we also have to consider the measurement device itself. If the measurement apparatus in itself increases entropy more (e.g. powered by electricity) than entropy is reduced by the result of the measurement, the overall entropy change could still be positive. E.g. Landauer and Penrose also argue along these lines in the context of Maxwell's demon.
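For scale, the Landauer bound alluded to in this reply can be evaluated directly. The formula is standard; the choice of room temperature is my assumption:

```python
from math import log

k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # assumed room temperature, K

# Landauer's principle: erasing one bit of information dissipates
# at least k_B * T * ln(2) of energy as heat.
E_min = k_B * T * log(2)
print(E_min)  # ≈ 2.87e-21 J per bit erased
```

Tiny per bit, but strictly positive, which is why the bookkeeping in the reply above still comes out non-negative overall.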
@BrijeshMishra09 9 days ago
It's amazing how simple you made it to understand! The object shadow example was really intuitive. Could you please also share the soundtrack you're using? I absolutely loved it!
@fynngerding4101 9 days ago
Thanks for the intuitive explanation.
@Eastern1 9 days ago
Why does your statement resonate with Heisenberg's uncertainty principle?
@LaDimonideMaxwell 9 days ago
Awesome video!
@bhavyapandeyneetapandey4125 14 days ago
Love from India ❤❤
@LaDimonideMaxwell 15 days ago
Great video! Awesome visuals and clear explanations. I'm using it with my students from now on, thanks :)
@jonathanramirez2606 15 days ago
Love the example.
@OmarBadr-d3r 16 days ago
One of the best videos I've ever watched on YouTube.
@adamasadis6193 18 days ago
Is it feasible to determine light paths, in particular when the path finishes? I found this confusing.
@AlexDani.N 20 days ago
I have a math exam tomorrow and this is part of the exam. Thx man!
@MarbleScience 20 days ago
Good luck with your exam :)
@thomasneumann2103 20 days ago
value = energy x information lol
@MarbleScience 20 days ago
Why not also multiply with the number of atoms? Or with pi? Or any other constant?
@steinmar2 20 days ago
You get the energy by slowing down the wind afterwards btw (;
@akira_kom 21 days ago
Ok, so when he said we don't consume energy he means it isn't, like, poofed away. It just changes to a not-so-useful form. Entropy.
@MarbleScience 20 days ago
I really don't like this way to put it. I know it's popular, but it doesn't make any sense. We could do the same weird thing with any other quantity that stays constant. For example, the number of atoms stays constant, but now we could come along and say we are consuming valuable atoms and converting them into less valuable atoms. I would say that's a strange way to put it. It's still the same nitrogen atoms when wind is converted into heat. Now you say a valuable form of energy is converted into a less valuable form. I would say that's a strange way to put it. It is still the same kinetic energy of atoms when wind is converted into heat. Why not just acknowledge that the only difference is an entropic one? Why do people always project that difference into the realm of energy, where it does not belong?
@ronaldreeves421 21 days ago
I don't like this definition because it decouples it from information theory, making the concept of entropy less useful. It decouples it from the macrostate, which takes less information to describe. I think there is something that can reconcile these, where a maximum-information state could collapse, restarting the Big Bang.
@MarbleScience 21 days ago
I'm not sure if I understand what you are saying... Boltzmann's entropy formula S = k ln Ω is compatible with Shannon's entropy formula used in information theory. Shannon's formula turns into Boltzmann's formula if we make the assumption that all states are equally likely. What exactly do you think is not compatible?
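The reduction described in this reply can be verified numerically. A sketch of mine, with an arbitrary Ω:

```python
from math import log

k_B = 1.380649e-23  # Boltzmann constant, J/K

def gibbs_shannon(probs):
    # Gibbs/Shannon entropy: S = -k_B * sum(p_i * ln(p_i))
    return -k_B * sum(p * log(p) for p in probs if p > 0)

omega = 1000                   # arbitrary number of microstates (my choice)
uniform = [1 / omega] * omega  # all microstates equally likely
print(gibbs_shannon(uniform))  # matches Boltzmann's S = k_B ln(Omega)
print(k_B * log(omega))
```

With equal probabilities, every term contributes -k_B * (1/Ω) * ln(1/Ω), and the Ω terms sum to exactly k_B ln Ω, which is the compatibility claimed in the reply.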
@jareknowak8712 22 days ago
All the best in the New Year to you and all the viewers. Please don't leave us without your new videos for so long!!!
@blazel462 22 days ago
If they could monetize the wind like they have done with oil and electricity, it would be even more valuable.
@ShaheenGhiassy 21 days ago
Wind farms generate electricity and produce profits.
@TurboCodingMaikol 22 days ago
I loved the video, but why are you "Where's Wally"?
@martinsanchez-hw4fi 22 days ago
What do you use to make your animations?
@MarbleScience 22 days ago
I use Blender for my animations :)
@Dheeraj3612p 23 days ago
You look exactly like me and my maternal uncle.
@shradhayadav3695 29 days ago
Is that you, Ryan Eggold?
@freedomclublk a month ago
This is the best video about Monte Carlo simulation. Keep it up!
@youngidealist a month ago
What software created this?
@theofficialjizzy a month ago
Food for thought: the marble simulation is sort of similar to ray tracing. What could ray tracing be similar to that we currently don't understand? Maybe gravitational waves?
@A.Rustam a month ago
Fantastic video! This channel deserves immense success!