that marble fortune teller bit was amazing, and beautifully made
@MarbleScience 1 year ago
Thanks :)
@qookiemonsta2557 1 year ago
Your visualisations and examples are carefully thought out, which makes it a profound joy to watch your videos!
@sebastiandierks7919 1 year ago
Interesting to compare physicists with fortune tellers, of all things xD Certainly provocative, but of course the metaphor is well phrased and explained. Nice playful video on the fundamental idea of statistical mechanics.
@kat13man 1 year ago
Thank you for this brilliant visualization of entropy. You have truly done a great service in taking the time to show us how Boltzmann's definition of entropy works. And while entropy might be an artifact of how scientists looked at things, that does not diminish the incredible insight and expansion of our understanding of the universe that an equation for entropy gives science through the use of mathematics. Manipulation of the entropy equation reveals unseen secrets. Then there is Boltzmann's constant, the mathematical smoking gun that the Universe was designed. Can you explain why this constant keeps appearing over and over again? It is more than a mathematical anomaly, because we can measure it using a gas at a constant temperature, since pressure times volume is always the same. Entropy appears to be a way of looking at things that provides deep insight into the workings of the Universe. Also, in information theory, those marble configurations with low entropy are very valuable. Entropy goes everywhere.
@MarbleScience 1 year ago
Thanks! Actually there is nothing magical about Boltzmann's constant. It is just a matter of units. The, sort of "natural", unit of entropy is given by the chosen logarithm. If you choose the logarithm with base 2, then the resulting unit of entropy is the "bit", which is popular in computer science. If you choose the logarithm with base e, then the unit is the "nat". Now the problem is that, historically, we (kind of arbitrarily) chose units for energy, temperature and entropy before we understood the statistical nature behind them. So we have to multiply by Boltzmann's constant every time entropy pops up, to convert it into these historically grown units.
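A minimal sketch of this unit conversion in Python (the microstate count Ω here is just an illustrative number, C(18, 9), as if 9 marbles sat on an 18-site grid; any positive Ω works the same way):

```python
import math

OMEGA = math.comb(18, 9)  # illustrative microstate count: C(18, 9) = 48620

# The entropy is log(Omega); only the unit changes with the base of the logarithm.
S_bits = math.log2(OMEGA)   # in "bits" (base-2 logarithm)
S_nats = math.log(OMEGA)    # in "nats" (natural logarithm)

k_B = 1.380649e-23          # Boltzmann constant in J/K (exact in the 2019 SI)
S_SI = k_B * S_nats         # the same entropy converted to historical units, J/K

print(f"{S_bits:.2f} bit = {S_nats:.2f} nat = {S_SI:.3e} J/K")
```

Multiplying by k_B changes nothing conceptually; it only converts the dimensionless log of a microstate count into the joules-per-kelvin units that thermodynamics adopted before the statistical picture existed.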
@datre8256 1 year ago
Really nice video! I hadn't thought about entropy in this way.
@adelelopez1246 1 year ago
Excellent explanation! It's a difficult concept to understand correctly, and so it's really great seeing an accessible explanation of it that actually gets it right!
@RAyLV17 1 year ago
Didn't realize that I was already subbed to this channel. I just got a notification for this vid, and wow what a wonderful video this was! If I could, I'd subscribe again! :p
@MarbleScience 1 year ago
Uploading a video is always a bit of an anxious moment. Your comment is pure relief :) Thanks for the lovely comment!
@u.s.navy_pete4111 1 year ago
You are an admirable teacher and presenter. Keep it up!
@fanir33 10 months ago
This channel is pure gold
@jph8220 1 year ago
I happened upon your channel a couple of years ago when looking for teaching inspiration, and vaguely remember a video where you were considering funding or some kind of full-time job on the platform; I hope you found success! :) You clearly put a lot of work and effort into this and you're doing a phenomenal job. The fortune teller comparison is quite an amusing one that I think works well. If you're looking to appeal to a general audience for science outreach, the only thing I would say is maybe don't include the equation for entropy in the video and just emphasize that it's directly related to the number of microstates (kB and ln don't change anything on a conceptual level and IMO would take away from the video for the less mathematically-oriented who don't know what either of these things are).
@MarbleScience 1 year ago
Thanks :) I agree that kB and the logarithm are not really important here, but some might still prefer to see the concrete equation. The good thing about YouTube is that you get data to analyze these kinds of things. As a creator you get, e.g., a retention graph showing how much each part of the video is watched. If a lot of people were strongly confused by the formula and stopped watching, I would see a drop-off in retention after showing the equation. That is not the case. Actually, there is a small peak in retention from people going back to the formula to look at it a second time.
@jph8220 1 year ago
@@MarbleScience Cool, I don't create videos so I had no idea it had those sorts of analytics!
@crawkn 1 year ago
The study of entropy emerged from the study of the behavior of gases in enclosed spaces. It was useful in that context, but I had come to believe that the sorts of things people claimed entropy meant for the future of the universe exceeded the power of any principle of entropy to predict. However, the perspective that it is in fact not predicting anything very specific perhaps changes it from wrong to irrelevant.
@amarnamarpan 11 months ago
The definition of entropy depends on the intelligence of the observer. It's not just about probability; it is about computability. We only talk about probability when the numerous microstates are either hard to measure, or, even if we do measure them, it is still hard to use those measurements to predict the future.
@galaxyvita2045 1 year ago
I think one of the most interesting things about entropy is that we use it to define the arrow of time. All other physical laws are symmetric when it comes to going forward and backward in time. Entropy can be seen as a way to predict to which part of phase space a system will go over time. Another interesting thing is that there are multiple definitions of entropy depending on the field in which you are working; compare science, math, biology, ... They all have different definitions, which are all related, but we don't know the big underlying idea yet.
@Eastern1 1 year ago
Thank god you are back
@50secs 1 year ago
Beautiful video, wish you all the best. Your explanation really added another dimension to my understanding of generalisation through entropy.
@HyattPan 6 months ago
If one day we have WW3 or end up on a battlefield, I wish you were on my team. You seem to know your stuff and are sharp inside. And this video is the best introduction to entropy, with superb animation and a superb marble model box. You could just sell the marble box set; I believe everyone who watches this clip would want one. It can remind us how this world became what it is now. Thanks again.
@MarbleScience 6 months ago
Let's just hope it doesn't come to that :D You can 3d-print the box if you like: www.thingiverse.com/thing:6361546
@joshuat6124 1 year ago
Nice video! As someone with a PhD in atomistic simulation, it's nice to revisit these concepts :) I highly recommend the series 'Order and Disorder' by Jim Al-Khalili for people interested in the story of how energy and entropy emerged as concepts.
@PaulVirtual 2 months ago
Great video, but the opening question reinforces a misconception about entropy, namely that one can calculate entropy from just a microstate. The video later explains why this doesn't work, but the opening tease will embed itself in memory and lead many astray.
@kgblankinship 1 year ago
This is a very creative, insightful, and well presented description of physics.
@HuLi-iota 10 months ago
I wish you had a video explaining how you did this visualisation and how time-consuming it was. Respect!
@giefuser 1 year ago
What is the connection between entropy and potential energy?
@MarbleScience 1 year ago
I am thinking about making a video about that. Stay tuned :)
@blackskyy3274 1 year ago
This is an amazing video! Thank you :)
@srijantiwari8152 1 year ago
Exceptional video!!
@tecno_andre2752 1 year ago
a very epic video
@FleshgodImmolation 8 months ago
Amazing video! I would love to see a video on free energies
@MarbleScience 8 months ago
That's on my list ;)
@gabrielstahl5629 5 months ago
Great video, thanks a lot!
@adel_mdfkrjones 1 year ago
Bruh, can you please tell us how you do your animations? I'm currently doing a project on diffuse transmittance testing in optical phantoms and I'm struggling to find out how to simulate it.
@nickb863 8 months ago
Could you please explain non-ergodicity (systems that don't visit all possible microstates), as well as how chaos can turn to order or vice versa? Tsallis entropy?
@harikumarmuthu8819 11 months ago
Hey, can you explain phonons, and the difference between temperature and sound in terms of phonons, as both of them are different manifestations of the phonon?
@vishalbhoknal2729 11 months ago
Share some sources to dive deeper into this interpretation of entropy you explained.
@danielwalker5682 1 year ago
Wonderful explanations.
@Brombelade 1 year ago
When I saw your video, I was reminded that economists differentiate between microeconomics and macroeconomics. I wonder if we can draw an analogy to the microstates and macrostates you are talking about here. A fundamental issue in economics is whether macroeconomic variables (such as inflation/deflation, growth/recession, (un)employment) can be explained as emergent from microeconomic behaviors, i.e., the economic decisions of millions of individuals. The current state of the art in economics says it cannot. But I have always wondered if that is really true, or in other words, if our microeconomic models might be too simple to give rise to the macroeconomic effects we can observe. I wonder if in thermodynamics we can observe a similar disconnect between microphysics (e.g., the behavior of atoms) and macrophysics (such as entropy).
@MarbleScience 1 year ago
Interesting question! In thermodynamics / computational chemistry at least I think it is mainly a question of computational resources. With the current resources you can simulate a single protein on an atomistic level for a reasonable time. If you want to study how a complete cell works or how multiple cells interact with each other, you won't get far with an atomistic model. This however does not mean that it wouldn't work in principle if you had the resources.
@CristalMediumBlue 1 year ago
Really well explained! Thanks
@deepakkumar2078 11 months ago
What kind of knowledge is required to make these kinds of simulations? Please tell us the tools and techniques for making this type of illustration.
@ritviksharmaphysics 10 months ago
Thank you.
@ab-tu5wc 1 year ago
Cool video. I was wondering what you use to animate this stuff. Are there resources for making things like mathematical simulations in software like Blender/Maya, or is other software better suited for this kind of thing?
@MarbleScience 1 year ago
Thanks! I use Blender for my animations. This time I made quite heavy use of Blender's new geometry nodes and simulation nodes, e.g. to visualize the marble arrangements. There are not many tutorials specifically for mathematical stuff, but once you understand how it works, it is no problem to use it for anything you like. There are plenty of great general tutorials here on YouTube.
@ab-tu5wc 1 year ago
@@MarbleScience Thank you for the information! The visualizations are great as usual, keep up the great work.
@Lastonestanding7 1 year ago
I was always told that entropy is the arrow of time: it's the only thing that differentiates past from future. Since the concept of a system evolving into a more likely state has no physical meaning in itself, does time lose its meaning too?
@ahsaaah7247 11 months ago
Your way of explaining is such an addiction. Can you talk about enthalpy too?
@Roman_4x5 1 year ago
It would be nice to start from the definition of entropy in the first place ;)
@anirbanmandal3123 9 months ago
wonderful video
@lomash_irl 6 months ago
This video is like therapy for students who were not taught appropriately 😅
@richardogujawa-oldaccount1336 1 year ago
Love this guy
@izzygrandic 2 months ago
this is a great video
@eyesburning 1 year ago
Amazing video as always! Where can I get the marble box you had in this video? Did you 3D print it?
@MarbleScience 1 year ago
Thanks :) Yes, it is 3D printed, and I have uploaded the files to Thingiverse for you and other people who might find it useful: www.thingiverse.com/thing:6361546 The problem is that with a new account it apparently takes 24h until my upload becomes public. You will need some patience ;) sorry
@eyesburning 1 year ago
@@MarbleScience Amazing, thanks so much! I will check back in 24 hours. And where did you buy the orange marbles if I may ask?
@jareknowak8712 1 year ago
👍
@danielrhouck 1 year ago
You show two specific microstates and say they have the same entropy, which in a sense is true, but there are natural ways of analyzing the information. I don't just mean natural in terms of human psychology; I mean mathematically natural definitions. Computer science has an entire sub-field, information theory, which does let you talk about the entropy of individual bit sequences. There isn't quite a full definition, but there is one up to a constant, and it takes more information to specify a random arrangement than it does to specify "the left half".
@MarbleScience 1 year ago
I had quite a long and interesting discussion about that with @dv2915 here in the comments: kzbin.info/www/bejne/h5unZKGsf7GgmKM&lc=Ugw6x5x_bkd0TVExFyR4AaABAg Maybe you can add something to that :)
@danielrhouck 1 year ago
@@MarbleScience The link isn't working for me; it just takes me to the video, not to the specific comment. I blame YouTube's site.
@danielrhouck 1 year ago
I found it. You are great at explaining things and should have a wider audience but for now having a small audience makes it easy to find a specific thread.
@nicholasdepaola3740 11 months ago
Bro you good😮
@BabaBoee5198 1 year ago
Yay
@rockapedra1130 1 year ago
Same
@LuisAldamiz 1 year ago
So entropy is just about probability? It's weird that modern physics is so obsessed with chance, really.
@MarbleScience 1 year ago
It is, yes! Everything that involves a lot of atoms (that means basically everything that matters to us) is a question of statistics. For my taste we don't even acknowledge the importance of statistics nearly enough.
@LuisAldamiz 1 year ago
@@MarbleScience - Maybe. I wouldn't want to be interpreted as being against statistics as a technique, but making it (or probability) not just a method but the very pillars of modern science sounds to me like those neoplatonists who believe that math or information are real beyond actual reality (physics, nature). Almost certainly a related issue. As they say: "there are lies, damn lies, and then there are statistics". And then they also say that "the easiest person to deceive is yourself". My issue would be, anyhow, that maybe physicists are taking probability too seriously in many ways, including quantum mechanics. After all, it seems to imply that blond people are more entropic than brunette people, who are much more common globally, and I see no reason for that: rareness or peculiarity would be a better name than entropy, IMO.
@Vlow52 10 months ago
If you want the world to be precise and accurate, it will only disappoint you. Physicists tend to make a theory that correlates with the outcome and believe in it like it's reality, but it's always a narrow, relative view based on measures and conceptions that were also just believed to work. No matter how advanced and broad a theory could be, it's just a theory and can't explain the whole system, because it is a part of it.
@LuisAldamiz 10 months ago
@@Vlow52 - It's fine: experimental outcomes are evidence. What has been discovered is OK; the problem is how they want to make sense of it by slashing out General Relativity and not being humble about what we truly know re. Quantum Mechanics. The problem is not one of faith in experimental outcomes but of way too much faith in maths, and also a lack of interest in further advances. Plus probably some questionable Newtonianist leftovers, not so much in Relativity (often accused of being "classical" because of its lack of quantum granularity) but in Quantum Mechanics (which stubbornly retains Newtonian time and space, since Dirac failed at achieving unification and Schrödinger took over from there). Schrödinger's equation is neither dead nor alive, but nobody seems to dare to open the box.
@taktsing4969 9 months ago
You know, God does play dice, a really obsessed one.
@Dreadwinner 1 year ago
🤯
@ronaldreeves4212 21 days ago
I don't like this definition because it decouples entropy from information theory, making the concept of entropy less useful. It decouples it from the macrostate, which takes less information to describe. I think there is something that can reconcile these, where a maximum-information state could collapse, restarting the big bang.
@MarbleScience 21 days ago
I'm not sure I understand what you are saying... Boltzmann's entropy formula S = k ln Ω is compatible with Shannon's entropy formula used in information theory. Shannon's formula turns into Boltzmann's formula if we make the assumption that all states are equally likely. What exactly do you think is not compatible?
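A quick numerical sketch of that compatibility claim (the microstate count here is arbitrary, chosen just for illustration):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * ln p), in nats."""
    return -sum(p * math.log(p) for p in probs if p > 0)

omega = 1000                    # number of microstates
uniform = [1 / omega] * omega   # every microstate equally likely

# With equal probabilities, Shannon's formula collapses to ln(omega),
# i.e. Boltzmann's ln(Omega) up to the factor k_B:
assert abs(shannon_entropy(uniform) - math.log(omega)) < 1e-9

# Any non-uniform distribution over the same states has lower entropy:
skewed = [0.5] + [0.5 / (omega - 1)] * (omega - 1)
assert shannon_entropy(skewed) < math.log(omega)
```

So Boltzmann's formula is the special case of Shannon's where no microstate is preferred, which is the usual assumption for an isolated system.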
@dv2915 1 year ago
Not quite correct, in my opinion. Take the following definition. Entropy is the length of the message needed to precisely describe the state of the system. Now, one state is 'all nine balls are on the left side'. Another state is 'one ball on the left side in row 1, column 3, two balls on the left in row 2 columns 2 and 3, two balls on the left in row 3 columns 1 and 3, two balls on the right in row 1 columns 1 and 2, one ball on the right in row 2 column 3, one ball on the right in row 3 column 2'. Do these two states look like they have the same entropy?
@MarbleScience 1 year ago
Interesting question! Thank you! I think the problem with your definition is that it mainly measures how efficient a language is at describing something. Each language would need a different number of characters to describe a state, and we would end up with different entropy values for every language. Maybe another language has a simple word for the locations occupied by marbles in the second state, like our language has a word for "left". Also, now that you have watched the video, I can simply describe the second state as "the second state". That's a short statement. Is the entropy lower now?
@dv2915 1 year ago
@@MarbleScience I'd say not. The full message now has to include both the previous description of 'the second state' and your last statement. Programmatically, you can first declare two matrices of a certain size and then simply say that matrix one is filled with ones and matrix two is filled with zeros. Any other state will require providing 'snapshots' of both matrices. And that will lengthen the message, no matter what language you use.
@MarbleScience 1 year ago
@@dv2915 But then wouldn't the "full message" also need to include a definition of the language? We can't take it for granted that the recipient knows our language if we can't take it for granted that the recipient knows my video 😄
@dv2915 1 year ago
@@MarbleScience True, the full message would include, if not the definition of the language, then at least a dictionary of the terms. But still, the principle holds. Your example of 'the second state', for instance, would require a 'dictionary' of all possible states for this set of matrices and marbles.
@MarbleScience 1 year ago
@@dv2915 Typical programming languages might contain a built-in function to directly generate a matrix of all ones or all zeros (because their creators thought that might be useful). They don't have to contain a built-in function to generate every possible state directly. Now I could ship a programming language that has a built-in function to directly generate "the second state". Why would it be OK to have a built-in function that generates a matrix of all zeros without having functions for all other states, but not OK to have a built-in function that creates "the second state" without a complete "dictionary" of functions for every possible state?
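One way to make this thread's description-length idea concrete without arguing about any particular language is to use a general-purpose compressor as a rough, admittedly imperfect, proxy (Kolmogorov complexity itself is uncomputable, so this is only a sketch, and the grid size is an arbitrary choice):

```python
import random
import zlib

N = 10_000  # one byte per site: 1 = marble on the left half, 0 = not

all_left = bytes([1] * N)  # the highly regular "all marbles on the left" state
random.seed(0)
typical = bytes(random.getrandbits(1) for _ in range(N))  # a typical scrambled state

# Compressed size as a stand-in for description length:
# the regular state compresses to almost nothing, the scrambled one does not.
assert len(zlib.compress(all_left)) < len(zlib.compress(typical)) / 10
```

This supports the intuition that regular states have short descriptions for a fixed, general-purpose encoder, while also illustrating the counter-argument above: a compressor shipped with a dictionary entry for any particular scrambled state would shrink that state too.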
@kevon217 1 year ago
that damn bird!
@Dennis-hb8tw 1 year ago
3D printed marbles! yippie!
@HansLemurson 1 year ago
Soo...entropy is in the eye of the beholder?
@MarbleScience 1 year ago
In a sense, yes. However, if two people choose the same macroscopic variables to describe something, they will end up with the same entropy values. It does not depend so much on the observer as on the perspective they take. E.g., if we choose to count the number of marbles on the left side of the grid, the entropy is well defined for that way of looking at the system, and anyone who takes the same perspective will get the same values.
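A toy sketch of that perspective, simplifying the box so that each marble is only "left" or "right" (this ignores the grid positions, so the numbers are illustrative, not the video's exact counts):

```python
from math import comb, log

N = 9  # marbles; each one is simply on the left or on the right

# Macrostate: the number of marbles on the left.
# Microstates per macrostate: the ways to pick which marbles sit on the left.
for n_left in range(N + 1):
    omega = comb(N, n_left)
    entropy = log(omega)  # in nats; multiply by k_B for J/K
    print(f"{n_left} on the left: Omega = {omega:3d}, S = {entropy:.2f}")

# Every one of the 2^9 = 512 microstates belongs to exactly one macrostate:
assert sum(comb(N, k) for k in range(N + 1)) == 2**N
```

Once the macroscopic variable is fixed, the microstate count per macrostate, and hence the entropy, is fixed too; anyone counting "marbles on the left" gets the same table, peaking in the middle.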
@andracoisbored 10 months ago
entropy makes my brain hurt.
@jnrose2 8 months ago
TIME OUT !!! This rendition is a limited and not thorough depiction of Entropy as identified 300 years ago. The interpretation of comparing statistical microstates (static topological geometric options) obliterates and ignores the original property: the ever-diminishing ability to DO WORK (the capacity to induce 'actioned change', to produce the utility of INTERACTIONS; that is, transforms over time being less or more possible). This is a distinctly different rendition of "possibilities": DYNAMIC AVAILABLE OPTIONS (sequentially related) versus the characterization he presented, of mixed happenstances, with no importance placed on the mandated causalities involved. His version here allows any sequence; one state can randomly generate any of the thousands of others. Behavior PATTERNS were the entropy qualia identified during the Industrial Revolution, NOT indifferent state comparisons, as presented here!!! **** YES, the math notions presented here are okay. They are clinically 'correct'. It's just that those math correlations totally mislead on the nature of the PROCESSES generating 'unrecoverable' order and the continual lessening of abilities to diffuse PRODUCTION OF UTILITY functions. **** This video, and any similar ones, are dangerously misleading!! *** His math is correct, only INCOMPLETE, and depicts the wrong properties of GRADIENTED state changes, which have to do with WORK (per 1700s and later concerns) and, now, in the Information Age, with COMMUNICATION and induced BEHAVIOR CORRELATIONS (as more important than the thermodynamic!!!). IG visitor (Follower), there to here: jnrose2
@MarbleScience 8 months ago
Maybe "Order" (whatever that is) is not that unrecoverable after all. How else would the world have ever gotten into the state it currently is in? If entropy were a one-way road, how do you explain the existence of low-entropy states?
@jnrose2 8 months ago
@@MarbleScience Perfect question/notion! *** Entropy (as traditionally modeled) is -not- an absolute monolithic 'tendency' towards 'disorder'. [E.g., the intangible 'energy dissipation' gradient is always identified in small-sample circumstances and observations, -then- spoken of as some 'absolute universal running down': diffused, ineffective dispersion of energies through n-dimensional space (spacetime).] *** Obviously, different real states & architectures of "forms ~ functions" regularly, at all known levels of complication, -do- 'recover order' spontaneously. Although it is not a 'recovered' duplication of immediately prior states (which, if a process stream can be reversed at all, needs added -external- energies to do the task). ** Ordered new energy accretion groups are generated by a NON-statistical mechanism (the same process, in different enacting forms, at all levels of complexity production): Operational Function Potential in the universe. *** Micro-state comparisons are NOT the "mechanism" involved (even though the statistics math seems to be some kind of 'marker' -for- the mechanism). *** ALSO, the conventional fallback analysis to justify Complexity and Emergence is written everywhere as necessarily being some FOURTH LAW of thermodynamics. [That is a typical logic deduction, but looking deeper into the relations involved, that is wrong. No "4th law" is required (!). And especially, no statistical properties will describe negentropic order generation.] The explanation for local, regional (conditional) "complex order generation" has to do with several co-involved COMMUNICATION FACTORS between clusters of agents present in an 'events set' [many different architectures are possible; shared 'interaction properties' are what are universal despite 'differences' in architectural form]. **** Thermodynamics is only ONE FORM of the involved relations & properties.
(Which is why we see it so often and focus on entropy as a trade-off of energy states. The reality is, energies are only a subset of more ubiquitous communication aspects. E.g., fields of force are -not- defined as 'energies', but they interact and produce both entropic -and- negentropic outcomes. (!!!!!!)) *** *** Your conventional depiction of the statistical factors involved is not wrong. It is a sidebar situation-monitoring math. It is not a process-explanation math. That is a big distinction. -- I hope you understand. -- jnrose2 (IG)
@MarbleScience 8 months ago
"Your conventional depiction of the statistical factors involved is not wrong. It is a sidebar situation-monitoring math. It is not a process-explanation math." I kind of agree that on a fundamental level the statistics is not the reason why something happens. Let's take a classical lottery with balls in a rotating container as an example. Of course the actual reason for the numbers we draw lies in the physical interactions of these balls colliding with each other. However, this more exact physical approach quickly gets too complicated for us to follow in detail. That's why we come up with simplified models, and these simplified models are heavily governed by statistics. E.g., in the case of the lottery we typically ignore all the exact physics and simply assume that it is equally likely for every numbered ball to be drawn. "The explanation for local, regional (conditional) 'complex order generation' has to do with several co-involved COMMUNICATION FACTORS between clusters of agents present in an 'events set'" Honestly, this doesn't really tell me anything. To me it seems like there is a much simpler answer. Processes with a negative change in entropy have a small but non-zero chance of happening. The chance for all the matter required for the big bang to randomly come together is (for us) unimaginably small, but it is not zero! That means, if we wait for an infinite amount of time, it is actually guaranteed to happen at some point. Also, if the Universe is infinitely big, it is guaranteed to happen in some part of it. I think all the matter coming together randomly for something like a big bang only seems unlikely on the time and size scales that we are used to thinking about.
@jnrose2 8 months ago
Fine comments. Thank you. [Please bear with me; I don't know how to copy-paste YouTube comments.] I am glad you recognize the distinction between types of equations; most folks don't assess which equations do what. *** I want to follow up on your paragraph that starts "Honestly…". Your "simpler answer" again falls into the "non-explanation" type of equation. Similar to Eddington, who also intoned that if there is a non-zero probability, that suffices to account for negentropic complexity generation, simply on the criterion that 'if it is "allowed", then when complexity happens, it is justified'. Such a monitoring assessment does 'simply identify' that complexity (negentropy) "can" happen, not "how" it happens (per a universal performance operation, at every level of complication). **** Your context circumstance, the lifetime of the universe, does not map to the ongoing regularity of vast numbers of generated complexities forming constantly in all living systems, AND, cosmically: stars forming, molecules forming, solar systems forming, galaxies forming. These are all versions of easily & ubiquitously formed negentropic complexities. (!) *** The statistics "allowable" description does not cover the huge number of actual productions. ** As for my reference to factors of communication being involved, I do include a probability factor in that model. The probability of messaging between 2 (or more) entities forming an active negentropic structure requires that possible information or energy transfers-interactions MUST be "c > 0" in order to maintain linkage, coupling, complexity. All involved communications must sustain a "greater than zero" probability. 😮😲😃👍. ******* I hope that makes better sense to you. 🙏 -- jnrose2
@MarbleScience 8 months ago
"Your 'simpler answer' again falls into the 'non-explanation' type of equation." You are correct, but the more fundamental physical explanation would require us to know the exact state of gazillions of photons, electrons, atoms, etc., which is just impossible. The best we can do is a statistical assessment of the situation. "Your context circumstance, the lifetime of the universe, does not map to the ongoing regularity of vast numbers of generated complexities forming constantly in all living systems" I think there are two types of entropy reduction that we should differentiate: 1. A local reduction of entropy that is linked to a greater increase in entropy at a different location. 2. An overall entropy reduction without a linked increase at some other location. If I understand you correctly, you are talking about the first kind. E.g., if I eat something, I might grow some muscle tissue (which might be a reduction in entropy), but at the same time the entropy is increased by breaking up the carbohydrates I'm eating and distributing most of the carbon atoms as CO2 in the environment. Even if there might be a local entropy decrease, overall entropy is increasing. This kind of local entropy reduction at the expense of a greater increase elsewhere is not unlikely at all, and indeed it happens all the time. These are processes with an overall increase in entropy, after all. However, they cannot explain why the world would ever not be in its maximum state of entropy in the first place. At some point a process with an overall negative change in entropy must have occurred.