I do have a degree in computer science, and I find you again and again to be one of my best CS teachers on topics that were never discussed or were badly explained during my studies!
@zen1647 2 years ago
Great video! You're awesome!
@hyperduality2838 2 years ago
There is a 4th law of thermodynamics:- Equivalence, similarity = duality (isomorphism). An "equivalence gate" measures similarity, sameness or dualness. Homology is dual to co-homology -- topology. Increasing the number of dimensions or states is an entropic process -- co-homology. Decreasing the number of dimensions or states is a syntropic process -- homology. From a converging, convex (lens) or syntropic perspective everything looks divergent, concave or entropic -- the 2nd law of thermodynamics. All observers have a syntropic perspective according to the 2nd law. My syntropy is your entropy and your syntropy is my entropy -- duality. Syntropy (prediction, convergence) is dual to increasing entropy -- the 4th law of thermodynamics! Teleological physics (syntropy) is dual to non-teleological physics (entropy). Convergence (syntropy, homology) is dual to divergence (entropy, co-homology). The word entropy means "a tendency to diverge" or differentiate into new states, reductionism -- science. The word syntropy means "a tendency to converge" or integrate into a single whole state, holism -- religion. Differentiation is dual to integration, division is dual to unity, reductionism is dual to holism. Syntropy is dual to entropy. "Science without religion is lame, religion without science is blind" -- Einstein. Science is dual to religion -- the mind duality of Albert Einstein. Your mind uses information to converge or create optimized predictions -- a syntropic process. Making predictions to track targets, goals & objectives is a syntropic process -- teleological. Duality creates reality. "Always two there are" -- Yoda. The 4th law of thermodynamics is hardwired into mathematics.
@sherpajones 1 year ago
16:23 What if the logic gate operated by interpreting the interference pattern of light? If you only had one light source, there would be no pattern. If you had two, there would be a pattern. The presence or absence of a pattern can be your 0 or 1. This should easily be reversible.
@moth5799 1 year ago
@@hyperduality2838 Christ mate you really think quoting Yoda makes you look smart lmfao? There are 4 laws of thermodynamics but that's only because we call one of them the Zeroth law. Your one is just made up.
@hyperduality2838 1 year ago
@@moth5799 Subgroups are dual to subfields -- the Galois correspondence. The Galois correspondence in group theory is based upon duality. There are new laws of physics -- Yoda is correct. Energy is dual to matter -- Einstein. Dark energy is dual to dark matter. Energy is duality, duality is energy -- the 5th law of thermodynamics! Potential energy is dual to kinetic energy -- gravitational energy is dual. Energy is measured in Joules (duals, jewels) in physics.
@compuholic82 2 years ago
Fun fact: Reversible logic is really important in quantum computing. Since all state changes can be represented as unitary matrices, quantum gates are always reversible.
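To make the reversibility point concrete, here is a minimal NumPy sketch (an illustration of the standard CNOT gate, not anything from the video): because the gate is unitary, applying its conjugate transpose recovers the input state exactly.

```python
# A unitary gate is always reversible: U† U = I, so U† undoes U exactly.
import numpy as np

CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# Unitarity check: the conjugate transpose is an exact inverse.
assert np.allclose(CNOT.conj().T @ CNOT, np.eye(4))

state = np.array([0, 0, 1, 0], dtype=complex)  # basis state |10>
after = CNOT @ state                           # CNOT maps |10> to |11>
recovered = CNOT.conj().T @ after              # applying U† restores |10>
assert np.allclose(recovered, state)
print("input state recovered:", recovered.real)
```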
@hyperduality2838 2 years ago
There is a 4th law of thermodynamics:- Equivalence, similarity = duality (isomorphism). An "equivalence gate" measures similarity, sameness or dualness. Homology is dual to co-homology -- topology. Increasing the number of dimensions or states is an entropic process -- co-homology. Decreasing the number of dimensions or states is a syntropic process -- homology. From a converging, convex (lens) or syntropic perspective everything looks divergent, concave or entropic -- the 2nd law of thermodynamics. All observers have a syntropic perspective according to the 2nd law. My syntropy is your entropy and your syntropy is my entropy -- duality. Syntropy (prediction, convergence) is dual to increasing entropy -- the 4th law of thermodynamics! Teleological physics (syntropy) is dual to non-teleological physics (entropy). Convergence (syntropy, homology) is dual to divergence (entropy, co-homology). The word entropy means "a tendency to diverge" or differentiate into new states, reductionism -- science. The word syntropy means "a tendency to converge" or integrate into a single whole state, holism -- religion. Differentiation is dual to integration, division is dual to unity, reductionism is dual to holism. Syntropy is dual to entropy. "Science without religion is lame, religion without science is blind" -- Einstein. Science is dual to religion -- the mind duality of Albert Einstein. Your mind uses information to converge or create optimized predictions -- a syntropic process. Making predictions to track targets, goals & objectives is a syntropic process -- teleological. Duality creates reality. "Always two there are" -- Yoda. The 4th law of thermodynamics is hardwired into mathematics.
@laughingone3728 2 years ago
@@hyperduality2838 Nicely stated. Thanks for that.
@Snowflake_tv 2 years ago
Really?
@hyperduality2838 2 years ago
@@laughingone3728 You're welcome, it gets better:- There is also a 5th law of thermodynamics, energy is duality, duality is energy! Energy is dual to mass -- Einstein. Dark energy is dual to dark matter. Action is dual to reaction -- Sir Isaac Newton (the duality of force). Attraction is dual to repulsion, push is dual to pull -- forces are dual. If forces are dual then energy must be dual. Energy = force * distance. Electro is dual to magnetic -- Maxwell's equations Positive is dual to negative -- electric charge. North poles are dual to south poles -- magnetic fields. Electro-magnetic energy is dual. "May the force (duality) be with you" -- Jedi teaching. "The force (duality) is strong in this one" -- Jedi teaching. There are new laws of physics! Your mind creates or synthesizes syntropy! Thesis is dual to anti-thesis creates the converging thesis or synthesis -- the time independent Hegelian dialectic. Duality creates reality! Certainty is dual to uncertainty -- the Heisenberg certainty/uncertainty principle. Everything in physics is made from energy hence duality. Concepts are dual to percepts -- the mind duality of Immanuel Kant.
@hyperduality2838 2 years ago
@@Snowflake_tv There is also a 5th law of thermodynamics, see my next comment.
@domotheus 2 years ago
Good stuff! If you're interested there's an even weirder (and very theoretical) application of reversible computing called a "Szilard engine" where you can go back and forth between waste data and waste energy. Using the wasted bits of reversible computing you can theoretically extract energy out of a system that's at an equilibrium state, basically meaning you can convert energy into data and data into energy
@zdlax 2 years ago
mass = energy = information
@landspide 2 years ago
isn't this like Maxwell's demon?
@abdobelbida7170 2 years ago
How so?
@MasterHigure 2 years ago
@@abdobelbida7170 Maxwell's demon can separate a fluid into fast / slow atoms, or a fluid mix into its constituent parts. You can extract energy from the recombining. The cost is that the demon's knowledge of the state of the system becomes less and less useful as it does its work.
@lubricustheslippery5028 2 years ago
I am confused. I thought that information needs a physical representation according to information theory, and thus something like pure information should not be a thing, similar to how pure energy is not a thing. Or is it something like being able to convert between entropy and enthalpy? In chemistry you calculate with Gibbs free energy, which combines enthalpy and entropy, to see if a reaction can occur.
@demetrius235 2 years ago
I worked in the semiconductor industry (DRAM) for a few years and now one of the courses I teach is Thermodynamics. I had no idea about the Landauer limit so thanks for teaching me something new! Also, good work pointing out that a completely reversible process is not possible as there is always some energy loss (collisions in your billiard ball case). This was an excellent video!
@anntakamaki1960 2 years ago
Is it in Russia?
@demetrius235 2 years ago
@@anntakamaki1960 "it"?? I did not work in Russia and I have never been to Russia. I have no desire to set foot in Russia.
@mathslestan3323 1 year ago
@@demetrius235 Wow so much love for Russia 😑
@Nehmo 1 year ago
How can you be associated with semiconductors in any way and not know about the Landauer limit? Sorry to be critical, but it's rather basic. Now that you know about it, you will recognize encountering it again and again.
@katrinabryce 1 year ago
@@Nehmo Possibly because it is so small that for all practical purposes it is zero? The Landauer limit of a typical modern 35W CPU is about 35 nanowatts. And for DRAM it is actually 0W, because the whole point of RAM is that you get back out what you put in, so it is reversible.
@gotbread2 2 years ago
While the second law gives a mathematical justification for that energy loss, it does not give a deeper "why". The fundamental issue is information erasure itself: collapsing a state to a single value. Imagine the 2 bits getting reduced to 1 bit. This means we force one bit from having a variable state (either 0 or 1) to a fixed state. It can be any value, it does not matter, but it is now a constant and no longer a variable. This is where the loss occurs.

One helpful visual is a ball in a potential with 2 valleys (as a stand-in for a particle in a double-well potential). This ball can be in either of the 2 valleys initially; by definition we don't know which, else it would be a known constant and not a variable. Let's say we want to move this ball into the left valley, from either starting valley. The issue here is that whatever mechanism we come up with needs to work for both starting valleys, just as bit erasure must be able to set a 0 to a 0 but also a 1 to a 0. In the ball case, you can move it over to the other valley, but then it will have some speed left, which you need to dissipate in order for it to come to rest at the bottom and keep this state. This is exactly where the loss happens. You can add some kind of reversible damping to "catch" this energy, but then it won't work for the case where the ball was already in the correct valley. Whatever case you design for will always cause an energy loss for the other case, since you need to go from a state with potentially "some" kinetic energy to a state with "zero" kinetic energy, without knowing the direction of the motion. (This is similar to Maxwell's demon.)

Now how much energy do we need to dissipate? Also easy to see. In order to differentiate between the 2 bit states, there needs to be a potential barrier between them, high enough to prevent thermal motion from flipping the bit on its own. The energy you need to dissipate while "catching" the bit and bringing it to rest comes directly from the energy you need to expend to cross this barrier. Since the barrier height is temperature related (more temperature -> more thermal energy -> higher barrier needed to avoid flips), the energy loss is also temperature dependent. This is where the "T" in the equation comes from, and the Boltzmann constant is in a way mandatory to match the units. The last piece of the puzzle is the ln(2): we can either be satisfied with using the second law as a shortcut here, or derive the ln(2) directly from the "geometry" of this "information bit in 2 potential wells" problem.
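To put numbers on that temperature dependence, here is a small sketch (my own arithmetic, assuming room temperature; not part of the comment above) of the resulting bound E = k_B * T * ln(2) per erased bit:

```python
# Landauer bound: minimum dissipation per erased bit, E = k_B * T * ln(2).
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # assumed room temperature, K

E_bit = k_B * T * math.log(2)
print(f"Energy to erase one bit at {T:.0f} K: {E_bit:.3e} J")             # ~2.87e-21 J
print(f"Same energy in electronvolts: {E_bit / 1.602176634e-19:.4f} eV")  # ~0.018 eV
```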
@dtkedtyjrtyj 2 years ago
Wow. I actually think I understood some of that. It makes intuitive sense...?
@rewe3536 2 years ago
Thank you! The video makes it seem like it's just magic, it just happens.
@garyw.9628 2 years ago
Really nice analysis of why erasing information necessitates a loss of energy. Also very appropriate to mention Maxwell's demon, since his thought experiment cleverly demonstrated the important link between information and energy. But, in the derivation of the Landauer Limit, and of the Boltzmann constant itself, there seems to be the assumption of a system consisting of the atoms and molecules of a gas. What if the computing device consisted of something smaller than atoms like photons, or neutrinos or quarks ? Would the corresponding Landauer Limit then, by necessity, have a much lower value ?
@gotbread2 2 years ago
@@garyw.9628 We did not make any assumptions about what the system is made of. All we need to assume is "some element" (here a particle, but even a wave function works too), which can be in different states, and that the 2 states are separated by an energy barrier. We need this barrier in order to preserve the state (else it would evolve between the states over time), and the height of the barrier is based on the expected disturbances (thermal energy). Further, we assume that by crossing this barrier of potential energy, a certain kinetic energy is needed, which we eventually need to dissipate again. Notice how abstract that setup is: it makes no mention of the particle type, or even a particle at all (even a field configuration would work here). You are correct with the Boltzmann constant however. This carries some assumptions with it. Since this constant relates the temperature of a gas to its average kinetic energy, it all comes down to how you define "temperature" for your system. If you use different particles, or something different entirely, your definition of what a temperature is may change, thus changing the Boltzmann constant.
@fmeshna 1 year ago
Jade, your ability to explain complex quantitative concepts so clearly is exceptional. We need more teachers like you.
@bobdiclson4173 2 years ago
I love Jade's vibe of having a secret to share.
@vigilantcosmicpenguin8721 2 years ago
Yeah, that's a perfect description. The way she grins when she gets to the juicy part of the secret.
@Blue.star1 2 years ago
She looks like a meson
@PhysioAl1 2 years ago
You're right!
@sabouedcleek611 2 years ago
@@김민-o2k Looks a bit like Modified Newtonian dynamics, though I don't think √(1+(GM/(Rc²))²) satisfies the interpolation requirement in the overview of the wiki.
@communitycollegegenius9684 2 years ago
Too bad. No one would watch or take her seriously without her vibe. Science and humanity lose.
@Alestrix76 2 years ago
I was wondering about the "many million more" at 10:00 and did some math, as I thought this sounded a little too much. But it turns out it's in the right ballpark: a somewhat modern 64-bit x86 CPU has around 5*10^8 logic gates. Let's say with each cycle 1% of those gates flip and there are 2*10^9 cycles per second (2 GHz); then we end up with a Landauer floor of around 30 µW. Modern power-efficient x86 processors need roughly 10 W, which is a few hundred thousand times this. Not sure what the numbers are like for an ARM processor in a smartphone. Of course this is just ballpark math.
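Redoing that ballpark in code (same assumed figures as above; every number here is a rough guess, not a measurement):

```python
# Ballpark Landauer floor for a CPU: erasures per second times k_B * T * ln(2).
import math

k_B, T = 1.380649e-23, 300.0      # J/K, assumed room temperature
E_bit = k_B * T * math.log(2)     # ~2.87e-21 J per erased bit

gates = 5e8                       # assumed logic gate count
flip_fraction = 0.01              # assumed fraction flipped per cycle
clock_hz = 2e9                    # 2 GHz

landauer_power = gates * flip_fraction * clock_hz * E_bit
print(f"Landauer floor: {landauer_power * 1e6:.0f} microwatts")     # ~29 uW

real_power = 10.0                 # W, a power-efficient x86 CPU
print(f"Real CPU: ~{real_power / landauer_power:,.0f}x the floor")  # ~350,000x
```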
@greenaum 1 year ago
The latest AMD CPUs have 8.2*10^10 transistors, or 82 billion. A logic gate might have maybe 5 transistors, so you're off by a factor of about 100. And a lot more than 1% of the gates get flipped: CPUs are designed to use as much of the hardware as possible, all the time, to get more processing done. You don't want bits of the chip sitting around idle.

This is why you might have "4 cores, 8 threads". Each core runs two "threads". That is, if, say, the chip's multiplier unit is being used by one instruction, there might be a memory access it can make, using its memory-access hardware, for another instruction. It runs two instructions at once, but it's not two entire processors. Instead, it's two front-ends that work out which instructions can be run with the currently unused parts of the CPU. So it's like you get, say, 1.8 effective processors per core, with just the addition of a second front-end and a bit of logic to figure out what's compatible.

There's also pipelining, where an instruction might take, say, 5 stages of operations to complete. The 5 stages are built separately, and as an instruction leaves the first stage, a new one is brought in, so all 5 stages are busy, all the time, with 5 instructions. Then there's out-of-order execution and all sorts of other mad tricks to try and get processing done in almost no time at all. CPUs will have multiple adder units and other duplicated parts. It's not just having multiple cores; each core does multiple things. So they're busy, by design, to produce the most computational throughput for the silicon you buy. To have circuits sitting idle is wasteful, and indeed they analyse that at the factory: if a part isn't pulling its weight, they'll replace it with something else that does. It's all about getting the most processing possible done, because that's what they compete on, and what they set their prices by.

In power-saving chips, like for phones, the opposite is true. They try to switch off as many sub-units as possible while still providing just enough CPU power to do whatever you're doing at that moment. Entire cores will usually be shut down, but they can wake up quickly. Plus modern phones might have 4 high-power, fast cores and 4 slower, very low-power ones, designed with different techniques, switched on and off as the operating system decides it needs them.
@heartofdawn2341 2 years ago
The question then is, where does that second bit of output information go? How is it stored and used? If you simply discard it later, all that happens is that you push the source of the Landauer limit further downstream.
@JB52520 2 years ago
No one really knows because there's no design for a reversible computer yet. The billiard ball example shows how the balls might be returned to the correct place with a logic gate that doesn't expend energy, but it doesn't show where they're stored or how they'll return at the precise time (as far as I remember; it's been a while since I read about this). I'm just guessing, but a useful metaphor might be to picture a mechanical computer where each of the waste bits is stored in a tiny spring, such that computing would be like winding a clock. Once the result is obtained, the program runs in reverse to unwind the system and return the stored energy. (How it would actually work, I have no idea.) It's also like the difference between standard car brakes and regenerative braking. The former just radiates heat, and the latter runs one or more generators, storing energy to accelerate the car later. As far as I can tell, reversible computing doesn't have to be perfect, just like regenerative brakes. Even if a program can only run backward part way before releasing the remainder of its stored energy as heat, that's still better than releasing all of it, and it might be enough for a computer of the distant future to sidestep the Landauer limit.
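For what it's worth, one reversible primitive does exist on paper: the Fredkin (controlled-swap) gate, which the billiard-ball model implements. A tiny sketch (my own illustration, not from the video) showing that it is a bijection and its own inverse:

```python
# Fredkin (controlled-swap) gate: 3 bits in, 3 bits out, nothing erased.
from itertools import product

def fredkin(c, a, b):
    """If the control bit c is 1, swap a and b; otherwise pass them through."""
    return (c, b, a) if c else (c, a, b)

for bits in product((0, 1), repeat=3):
    once = fredkin(*bits)
    assert fredkin(*once) == bits   # self-inverse: applying it twice undoes it
    print(bits, "->", once)

# The 8 outputs are a permutation of the 8 inputs, so the map is bijective:
# no information is destroyed, hence no Landauer cost is forced by the logic.
```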
@erkinalp 2 years ago
@@JB52520 All quantum computers use reversible computational elements to prevent immediate collapses of superposition.
@glenncurry3041 2 years ago
@@erkinalp Qubits are not single binary bits.
@brandonklein1 2 years ago
@@erkinalp but you're still subject to the Landauer limit once you make a measurement.
@nbooth 2 years ago
Minor quibble but the second law doesn't imply that entropy can only increase, only that it is more likely to. You can always get lucky and have total entropy go down. I'm sure it happens all the time.
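To put a rough number on "getting lucky", here is a toy sketch (my own model of an idealized gas, not from the video): the chance that all N molecules of a box happen to sit in the left half at once, which would be a large spontaneous entropy drop, is 2^-N.

```python
# Probability of a spontaneous "all molecules in the left half" fluctuation.
for n in (10, 100, 1000):
    p = 0.5 ** n
    print(f"N = {n:4d}: P(all left) = {p:.3e}")
# N=10: ~1e-3, happens constantly; N=1000: ~1e-301, never in practice.
# Fluctuations matter for tiny systems and vanish at macroscopic scales.
```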
@nmarbletoe8210 2 years ago
indeed! And at the maximum entropy state, it is more likely that entropy will increase. Randomness is fun
@E4tHam 2 years ago
I'm currently pursuing a master's in VLSI, so thanks for introducing these concepts to people! The built-in impedances in metals and semiconductors will always overshadow the Landauer limit by several orders of magnitude, though. But this is an interesting thought experiment.
@adamnevraumont4027 2 years ago
always is a long time to make a promise for
@adivp7 2 years ago
@@adamnevraumont4027 Even if you eliminate metal impedances with superconduction, you definitely need semiconducting material for a transistor. And semiconductors will always produce heat.
@adamnevraumont4027 2 years ago
@@adivp7 which you then follow with a formal proof that all computation requires transistors made of semiconductors? No? Well then making a promise for "always" is beyond your pay grade. Always is a very long time. Always is not just 10 years, it is 100 years, it is 10000 years, it is 100,000,000 years, it is 10^16 years, it is 10^256 years, it is 10^10^10^10^10 years, it is G64 years, it is TREE3 years, it is BB(50) years. It is a really long time.
@dot32 2 years ago
@@adamnevraumont4027 lmao, it's physics. You need semiconductors for transistors. If you found something other than a transistor, you may not need semiconductors, but semiconductors are what transistors are afaik
@adivp7 2 years ago
@@adamnevraumont4027 Technically, none of those are "always", but fair point. What I meant to say is you need energy for switching. I can't see how you can have switching that doesn't use or release any energy.
@saggezza-artificiale 2 years ago
Very nice video. I'd just suggest clarifying that, in real use cases, we'll probably never get around Landauer's principle even if we develop perfect reversible computing. Reversible computing allows us to handle information without erasing it, but unless we rely on a device that can store an infinite quantity of information, sooner or later we'll need to delete some of it, and that can't be done in a reversible way.
@danielschein6845 2 years ago
Amazing to think about. I spent 10 years designing actual microprocessors and always thought of energy in terms of the electrical current flowing through the device.
@ralfbaechle 2 years ago
To be honest, the Landauer limit is so low that we could spend another century without reaching it. Figuratively speaking, that is. I've not estimated how long we'd actually need to reach the Landauer limit from where technology is now because, let's face it, such estimates are usually wrong :-) So it's perfectly OK to concentrate on all the other losses.
@DrewNorthup 2 years ago
And even that is an oversimplification… The need to understand the impact of both the resistance and the reactance escapes a good many people. Switching losses are so large compared to Landauer I'd not expect the latter to factor in meaningfully for quite some time.
@triffid0hunter 2 years ago
Sure, but the Landauer limit says that a modern CPU must use at least a nanowatt or so, and since they _actually_ use about a hundred watts, we've got a _long_ way to go before having to deal with the limit - Wikipedia's article says maybe 2080 if Koomey's law holds, although it doesn't mention which Koomey's law figure (there are two in the relevant article) was used to derive that figure.
@Skeptical_Numbat 2 years ago
@@ralfbaechle Perhaps at the scale of a single CPU in a home computer, but at the scale of the massive data-processing centers for internet servers, or the supercomputers used to simulate the climate/weather patterns of the entire Earth, the need to reduce the tremendous heat being generated (both in terms of efficiency and especially the cost of cooling these mega-systems) makes current efforts to overcome the Landauer limit financially viable. Just look at how much energy is estimated to be diverted from electrical power grids towards cryptocurrency farming, and try to estimate the vast amount of waste heat being generated (especially in hotter regions close to the equator, like Texas, where there's already a profound energy cost in maintaining functional computing environments) due to all this data processing. Any tweak to the design of computer processors which either overcomes or, more likely, reduces (given our current incomplete understanding of quantum physics) the Landauer limit is going to be a worthwhile achievement.
@stufarnham 1 year ago
This has become my favorite YouTube channel. These short, digestible discussions of deep topics are endlessly fascinating. I especially enjoy the discussions of paradoxes. Also, you are a great presenter: clear and engaging. Keep it up, please!❤
@pedromartins1474 2 years ago
This took me back to my statistical physics classes! Wonderfully explained! Thank you so much!
@thattimestampguy 2 years ago
0:55 Heat Production & Waste
1:55 Information has an energy cost ℹ️ The Landauer Limit
2:25 Logic Gate
More than one Logical Gate makes a Logical Network
3:50
4:04 4 Possible Combinations, 2 Possible Outputs: 00 01 10 11
6:37 Entropy Formula
8:57 Fewer Outputs Than Input States
10:26 Cooling The Computer 💻 🧊
11:37 Reversible Computers
12:07 Irreversible Operation, You can't determine the inputs from the output
13:36 It's Possible To Reverse
14:24
15:28 Billiard Balls
17:04
@leonstenutz6003 1 year ago
Awesome, thx!
@cykkm 2 years ago
There is a little big problem with the reversible billiard ball (RBB) computer (tangentially, the problem was noticed by Landauer himself in the original paper, but I'll translate it into the RBB language). Suppose you place ideal compressive springs at all outputs of the RBB logical circuit to _actually_ reverse the computation. This works indeed, but then you _don't know the result of the computation!_ You can adiabatically _uncompute_ any computation as long as you don't read its result. If you want to know the result, i.e. whether or not a ball popped out from an output and bounced back, you have no option but to touch the system. Even a single photon interacting with a ball, such that you can detect whether there was a ball bouncing off the spring or not, transfers momentum to the ball, breaking the time symmetry of the RBB. A reversible computation is possible, _as long as you uncompute it all without reading out the result!_ The act of reading a result must increase the computer's entropy, even if the computer is reversible. This was one of Landauer's main results. His paper connected Shannon and Boltzmann entropy so clearly.
@JB52520 2 years ago
I hadn't read that Landauer was working from a time symmetry perspective. If you're going to run the universe backwards, there's no need for a special computer. Heat will flow back into a normal computer, cooling it and pushing electricity back into the wall with perfect efficiency. If time symmetry must actually remain unbroken and that's not a joke, there's nothing clear about this concept. Can you have a system that's not influenced by the probabilistic nature of quantum effects? Even if that's not a problem, reversible computing couldn't give off photons, including infrared, because they'd have to turn around and hit the computer precisely where and when they left. Any irreversible interaction with the environment would also be forbidden. This means reversible computing would require impossibly perfect efficiency in perfect isolation, ignoring quantum effects and spontaneously traveling backward in time, while being theoretically guaranteed to produce no results. I don't know anything anymore. This explains nothing "so clearly". At least you easily understand the incomprehensible and see utility in the useless. Screw it, everyone else is awesome and I'm wrong about everything. The more I try to learn or think, the more I realize it's a miracle I ever learned to tie my shoes. Being born was a terrible idea. Hey, I finally understand something.
@michaelharrison1093 2 years ago
This is along the same lines as the argument that I made in a comment I submitted. Reversibility does not eliminate the change in entropy.
@mihailmilev9909 2 years ago
@@JB52520 lmao well fucking said dude
@mihailmilev9909 2 years ago
@@JB52520 one of my favorite comments ever
@mihailmilev9909 2 years ago
@@michaelharrison1093 what was the context? Who's in ur pfp btw, some professor?
@0ptikGhost 2 years ago
I love this video as it pertains to theoretical computation. Realizable computers today generally use electric current flowing through substrates that always have some level of impedance. The real reason our current-day computers generate heat has nothing to do with the Landauer limit but rather with the technology we use to build them. We don't stand a chance of getting anywhere near the Landauer limit, regardless of the temperature we run the computer at, unless we figure out how to make computers using superconductors.
@david203 2 years ago
Yes, that is today's understanding. But the theory itself doesn't require superconductors. And CMOS does actually use less energy than, say, TTL.
@TheDavidlloydjones 2 years ago
@@david203 You rather seem to miss Ghost's sensible point. Try reading it over again. Or look for E4tHam's sensible post, below in my feed, making Ghost's point in slightly different form.
@david203 2 years ago
@@TheDavidlloydjones I fully agree with Ghost's point, thanks. The current required to activate computer logic causes vastly larger heat than does the actual processing of information. That's why it's called a Limit.
@greenaum 1 year ago
Right. Charging and discharging the gates of umpty-billion transistors requires dumping the energy as heat. Resistors make heat, though CMOS is usually quite high-impedance. There's reversible logic, but frankly it looks like a pain in the arse and it seems like most of it happens only on paper and the rest might be that someone strings together a couple of logic gates, not an entire CPU. Even then it's probably made of snooker balls!
@JanStrojil 2 years ago
The fact that information contains energy always boggles my mind.
@goldenwarrior1186 2 years ago
It makes sense. There’s nothing to really think of (don’t mean to be mean)
@MyMy-tv7fd 2 years ago
That is because it does not; there is no necessary information in a logic gate switching. It could switch randomly, or just be set to oscillate. No information is involved in mere switching, but energy obviously is. Either this is clickbait, or she does not understand that information is the intentional switching of logic gates to produce a certain storage pattern, which may be volatile (RAM) or non-volatile (like an SSD).
@dominobuilder100 2 years ago
@@MyMy-tv7fd give an example then of any sort of information that does not contain energy
@MyMy-tv7fd 2 years ago
@@dominobuilder100 - no information whatsoever contains energy, it is conceptual. But there is always a physical substrate, whether it be the page of a book or a RAM stick. The change in the substrate could contain information, or just be random, but the change itself will always require energy; the information is the 'ghost in the machine'.
@paulthompson9668 2 years ago
@@goldenwarrior1186 Hindsight is always 20/20
@Dyslexic-Artist-Theory-on-Time 2 years ago
One way to think of 'information' is that the spontaneous absorption and emission of light photon ∆E=hf energy is forming potential photon energy into the kinetic energy of electrons. Kinetic Eₖ=½mv² energy is the energy of what is actually happening. This process forms an uncertain probabilistic future that is continuously coming into existence with the exchange of photon energy. The wave particle duality of light and matter in the form of electrons is forming a blank canvas for us (atoms) to interact with; we have waves over a period of time and particles as an uncertain future unfolds. The mathematics of quantum mechanics represents the physics of time, with classical physics representing processes over a 'period of time' as in Newton's differential equations. In this theory the mathematics of quantum mechanics represents geometry; the Planck constant ħ=h/2π is linked to 2π circular geometry representing a two-dimensional aspect of 4π spherical three-dimensional geometry. We have to square the wave function Ψ², representing the radius being squared r², because the process is relative to the two-dimensional spherical 4π surface. We then see 4π in Heisenberg's uncertainty principle ∆x∆pₓ≥h/4π, representing our probabilistic temporal three-dimensional life. The charge of the electron e² and the speed of light c² are both squared for the same geometrical reason. This process forms a continuous exchange of energy forming what we experience as the continuous passage of time.
@sachamm 2 years ago
The thought experiment I was given when learning about reversible computing referred to the elasticity of atomic bonds and how energy could be returned when a molecule returned to its original conformation.
@slevinchannel7589 2 years ago
Not many have the intellectual integrity to watch the harsh history coverage that Some-More-News did in the video 'Our Fake Thanksgiving'.
@david203 2 years ago
I don't see it. You would have to identify where the "atomic bonds" are in the logic circuits. Read Gotbread's analysis in another comment here. It makes more sense.
@ericmedlock 2 years ago
Great video! I learned a bunch of this in university a million years ago but you do a super job of simplifying a really complex set of concepts. Kudos!
@fios4528 2 years ago
Thank you for making this video. I've genuinely spent many a sleepless night thinking about the lifespan and logistics of a minimum energy computer for a future civilization that lives in a simulation.
@berniv7375 2 years ago
Could you possibly do a video about data banks and how they are taking over the world? I was astonished to learn how much energy is required just to take a digital photograph, and how much energy is required to cool down data banks. 🌱
@cate01a 2 years ago
if the universe is a simulation, why care for energy? like the people controlling the simulation could either supply it with nuclear or better energy that would last many universes lifetimes. or more likely is they'd speed up the simulation speed to like tree(10^^^^^^^^^^^^^^^^^^^^^^^^^10), so that maintenence is a non issue because they (might, havent done the maths, can you even calculate that?? no probably not since iirc we only know a COUPLE digits of tree(3) so fuck that bigass number) would have already simulated more universes than we could literally comprehend in less than a microsecond. though thatd take fucking insane technology, probably impossible for the laws of physics, so then the simulator controllers would need to exist is a much more advanced, cooler universe, but then the persons controlling THEM would need even IMMENSELY more fucking power, like holy shit, not even a trillion bajillion shit tonne quatrillion lifecycles of the entire universe/existence could even come CLOSE to the amount of damn energy needed for that to happen so uh i guess simulation is outta the question unless like god is real and on his impossible computer he's playing the sims10 or some shit, though yknow that makes zero fucking sense too
@anywallsocket 2 years ago
@@cate01a A^^A = A^A
@paulmichaelfreedman8334 2 years ago
Funny thing is, if it turns out we're bound to our solar system (FTL completely impossible for complexly arranged matter), at least part of the planet's population will choose to live in a simulation created by ourselves by means of VR immersion, with whatever technology is invented for this purpose in the future. Games like EVE, WoW and Elite Dangerous are examples of current-day escape to a universe in which much more is possible than in the real one. If you have seen the movie Surrogates or Ready Player One you'll catch my drift.
@cate01a 2 years ago
@@paulmichaelfreedman8334 faster than light travel for complex matter/humans ships plants etc should be possible (not feasible (yet)) by manipulating spacetime, which is possible - and by making the spacetime go faster than light rather than the matter, the matter is still travelling whilst not going against the laws of physics: kzbin.info/www/bejne/foaweJZunaqepsU&ab_channel=PBSSpaceTime haven't watched that specific vid but probably explains the same concept
@MarkusBohunovsky 2 years ago
Here is where this does not completely make sense to me. This assumes that a regular gate only outputs 1 value (the 1 or 0) from 2 inputs. But that isn't physically true at all. The gate outputs an almost infinite number of values; it just happens that we are only INTERESTED in 1 of them. So it isn't a PHYSICAL function of the gate to have 2 inputs and one output; it is a function of our INTERPRETATION.

And if you assign an energy value to that, then you are assigning an energy value to our INTERPRETATION. If we interpreted different outputs of the gate (for example, how much its temperature increases with certain inputs, or even what exact voltage it outputs, rather than just interpreting an output voltage over a certain limit as 1 and under a limit as 0, etc.), then the Landauer limit should change for the gate. That would mean that our interpretation changes its energy use, without ANY physical changes. That doesn't make sense, at least not in the classical physical sense.

Information is not something that exists physically per se (or at least not in the sense presented here); it depends on our interpretations. (There may be such a thing as information per se, without any interpretation or filtering, but that would have to be represented by a much more essential unit than a computer bit... and personally I have not read anything that indicates a good definition for such a thing even exists.)

Computers do NOT physically process zeroes and ones. They execute certain physical/electrical processes, which we then INTERPRET as zeroes or ones (and then as other, higher-level information based on these binary bits). So again, the explanation that a traditional gate decreases entropy because it has 2 inputs and 1 output doesn't make sense. The physical component has an almost infinite number of inputs and outputs. We, however, are only interested in 2 inputs and 1 output. We could just as well be interested in 10 of its outputs. It's a mere matter of interpretation and focus.
@davestopforth 2 years ago
I'm struggling with the definition of information here. Surely information is based on our perception of states and conditions. Of course, for information to exist requires a transfer of energy at some point, and the information literally exists because it is a state, but it doesn't become information until we begin to perceive it. For example, a particle travelling through space can have its direction and velocity determined, but the particle doesn't care; it just exists, and it just weighs X whilst travelling at Y towards Z. That's it. For that to become meaningful information requires some perception. Would it not be better to say it requires energy for the creation, transfer or manipulation of information?
@heinzerbrew 2 years ago
Yeah, it would have been nice if she had defined "pure information" because the information I know about doesn't need energy to simply exist. You are correct it is the creation, changing, and refreshing that requires energy. Not really sure how she doesn't get that.
@DrewNorthup 2 years ago
Information at rest does actually have an energy component, but she'd be here all week explaining it.
@compuholic82 2 years ago
@Dave "Would it not be better to say it requires energy for the creation, transfer or manipulation of information?" But that also means that information itself must be associated with an energy state, does it not? If the manipulation (i.e. change of state) requires or releases energy there must have been an energy level before the manipulation and an energy level after the manipulation. The difference in these energy levels is the amount of energy needed for the manipulation. In that way it is no different than any other measurement of energy. Take gravitational potential energy. If you drop a ball you can assign an energy level to the ball before and after the drop and the difference in energy levels is released as kinetic energy.
@commandershepard6189 2 years ago
Cool! Good video... Some people don't understand that 0 is an open circuit while 1 is a closed circuit, meaning 0 is the absence of a bit while 1 is a bit. In microchips, a 0 still has work or force being applied due to heat transfer from the nearby transistors and the opening of that transistor... this equates to energy loss in the application. Yeah, work applied but no work output. Cool stuff. The problem: we'll never get around thermodynamics.
@hoptanglishalive4156 2 years ago
Disinformation gives me the chills but information warms my soul. Like Prometheus giving the fire of knowledge to humanity, science educators like shining Jade are doing vital work.
@havenbastion 2 years ago
"Pure information" still exists as a pattern in a physical substrate and the movement of physical things is heat. There's no mystery unless you imagine pure information to be non-physical, which is an existential metaphysical category error.
@thelocalsage 2 years ago
yeah yeah yeah and really there are no exact circles in nature and technically pi is a rational number because there aren’t *really* uncountably many things to construct pi from and blah blah blah we’re dealing in mathematical abstraction here you don’t need to “well, actually” a really good piece of science communication by telling mathematicians they’re performing errors of metaphysics when it takes only a modicum of emotional intelligence to see these ideas are a platform for discussing purely mathematical systems
@pandoorloki1232 2 years ago
"unless you imagine pure information to be non-physical, which is an existential metaphysical category error." That is precisely backwards. Confusing information with physical implementations is a category mistake. P.S. The response is more of the same conceptually confused nonsense. Information is a mathematical abstraction ... it doesn't need a physical instantiation or a "physical need"--that confuses the colloquial meaning of information with the formal meaning.
@havenbastion 2 years ago
@@pandoorloki1232 The information always has a physical instantiation or it couldn't be information. In order for it to exist in any meaningful sense there must be a physical means by which it may be acquired and manipulated.
@thelocalsage 2 years ago
@@havenbastion a circle always has a physical instantiation or it couldn't be a circle. in order for it to exist in any meaningful sense there must be a physical means by which it may be acquired and manipulated.
@trewaldo 2 years ago
I went to watch this video with the preconceived notion about entropy's connection with energy and information. But this is different. The explanation made it better! Thanks, Jade. Cheers! 🥰🤓😍
@pandoorloki1232 2 years ago
Shannon entropy is not thermodynamic entropy--they are fundamentally different things.
@granduniversal 2 years ago
This approach to entropy is like making a database for an application. You make that based upon the business rules. When you are trying to find out the business rules, you will have to deal with a lot of uncertainty. That uncertainty is not unlike the challenges that surround trying to figure out chaos math. Anyway, the more uncertainty, the more entropy. And the natural state for most people to be in is that of ignorance.

When you are figuring out the business rules you will find varying interpretations, and varying ideas about what kind of a world those can go in. That part is not unlike how consciousness is not a broad spectrum thing, but rather about what we can focus on at any one point in time. In the end you hope to get to the truth. The truth is the only "state" that can answer all of the questions, logically lead to the end result desired for a particular circumstance.

This thing about getting at the business rules, it is important as a metaphor. It speaks about how you can try to do math between apples and oranges sometimes, but we want there to be an equivalency. Well, there will be, if you borrow from the outside. I don't mean using things like big data to help you find the business rules. That will most likely result in scope creep, as it points out to you more ways to make differentiations, flinging you into what could be before you understand what is. I mean wrestling with it by using as little information to deduce what you have to as possible. If there is an equivalency, then the correct answer should be surrounded by a lot of falsehood, that is interest in remaining ignorant, or potential other path traces, or heat. Because this is hard work, and nobody should go around saying it isn't. If you are right about knowledge, or wisdom, there will always be a lot of people going around telling you that you are wrong, in other words. That's no reason to give up.

The example also points out something about linguistics that I find fascinating. It only affects the consistency of one side of the two sides that should be active to gain effective communication. But we see from things like animal training that is really all that is necessary to achieve success, for the master. It is about reward. And reward is about what an object needs or wants, not what the master demands it needs or wants. Making databases, however, is about making it easier for everybody to do things repeatedly, with less error. Does this lesson tell us that is achievable with only one side being consistent? You know, with the system being built correctly? I guess we find that out when we use a good app versus a bad one?
@elroyfudbucker6806 2 years ago
Practically speaking, the heat generated within a microprocessor comes during the infinitesimally short time that the MOSFETs that make up the logic gates change state. When they are either conducting or not conducting current, they generate no heat. Multiply this extremely tiny amount of heat by the millions of MOSFETs that are changing state millions of times a second & you see the need to have a cooling fan on a heatsink mounted directly on the microprocessor & why it's not a good idea for your daughter to have her laptop on the bed.
@htomerif 2 years ago
Minor correction: what you said was exactly true 20 years ago. Right now, with the size of junctions in gates, a good chunk of the heat generated by processors is from quantum tunneling (i.e. leakage current intrinsic to the gate). I say "gate" because in modern processors you don't have individual FETs; you have compound devices that serve the purpose of a logic gate without actually having a recognizable FET. The only way to stop that kind of power drain is to kill the power entirely to large sections of a processor; otherwise they constantly drain power, flipping bits or not. I realize I used the word "gate" both meaning the insulated physical gate that controls current flow in a semiconductor device and meaning "logic gate" here. Hopefully it's relatively clear which is which.
@KirbyZhang 2 years ago
@@htomerif does this mean older FET processes can produce more power efficient chips?
@htomerif 2 years ago
@@KirbyZhang It's not so much about "older"; it's about what they were designed for. OP isn't wrong that FETs changing state *used* to be the primary power drain, or that it is *still* a significant contribution to TDP. It's just that CPUs have now been optimized for minimum power per computation, and that means gate leakage outweighs state changes.

Unfortunately, this gets complicated. You probably know that past a certain point (I don't know, about 100nm?) the process nodes have nothing to do with size anymore. For example, the 7 "nm" node can't make features that are 7 nanometers. Its minimum feature size depends on the type of feature, but it's up around 50 actual nanometers. It's kind of a big fat lie.

Ultimately, the answer to your question is: using current process technology, you can optimize for power efficiency by increasing gate sizes and increasing gate dielectric thickness, but it comes at the cost of more silicon real estate. They do use older processes and equipment for microcontrollers like TI's MSP430 series or Microchip's ATmega series, but it's older equipment that's been continuously developed for making these slow, extremely power-efficient processors.

I guess it really depends what you mean by "power efficient". If you mean "minimum standby TDP" then yes. If you mean "minimum power clocked at 100MHz, with a modern design" then also yes. If you mean "absolute minimum cost per floating point operation" then no. And all of those are important considerations: a rackmount server might be going all-out for 5 years straight, a desktop processor will probably spend most of its life at a much lower clock rate than maximum, and a mobile (i.e. phone) processor will probably spend a whole lot of its life in standby at or near absolute minimum power.

I hope that helped. It's hard to pack a whole lot of information into a small space that someone would theoretically read.
@TechnoMageB5 1 year ago
2 things:

1) With current electronic computers, there is no way even with reversible logic gates to make a net zero energy use computer. The architecture itself requires energy to operate, thus the laws of entropy cannot be circumvented. Quantum computing still needs to solve the architecture problem, but at least the computing itself could theoretically work.

2) This video reminds me of a discussion at work I had with a girl years ago, who was a college student working there at the time, about logic states. She was curious as to how computers "think". I started with a similar example, the light being on or off (while switching the light in the room on and off as an illustration), that by storing and comparing millions of on/off states, computers evaluate conditions for us, and that those "light switches" are represented by the presence or absence of voltage - more basic than this video. Next time I visit that location, she tells me that literally a few days after our talk, her professor starts talking about the same thing, and here she is in class smiling to herself the whole time thinking "I know where this is going..."
@jesuss.c.88692 жыл бұрын
Great video, Jade. Thank you for introducing such a complex topic in an easy and fun way. 👍
@gbear10052 жыл бұрын
Richard Feynman (the Nobel physicist / father of QED) gave talks on reversible computing and energy use
@NotHPotter2 жыл бұрын
Makes sense. In order to maintain a system of any kind of order (and thus store useful information), the system internally needs to expend energy to resist the natural entropy that seeks to degrade that information. Some kind of input is gonna be necessary to resist that decay.
@heinzerbrew2 жыл бұрын
We have lots of methods of storing information that don't require energy to maintain. Sure, eventually entropy will destroy those storage devices, but we don't operate on that time scale.
@NotHPotter2 жыл бұрын
@@heinzerbrew Those more stable methods are also a lot slower, although even early CDs and DVDs are approaching the point where they're no longer readable. Books weather and wear. Ultimately, it doesn't matter, though, because even if you're going to quibble over time scales, there is still some necessary effort made to preserve data in a useful state.
@robbeandredstone73442 жыл бұрын
9:40 For anyone wondering, that is approximately 75 eV, to put it into perspective.
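In case anyone wants to check the arithmetic, here is a minimal sketch (assuming the figure quoted at 9:40 is roughly 1.2e-17 J, since that is the value that converts to ~75 eV):

```python
import math

k_B = 1.380649e-23     # Boltzmann constant, J/K
eV  = 1.602176634e-19  # joules per electronvolt
T   = 300.0            # room temperature, K (assumed)

# Landauer bound: minimum heat to erase one bit of information
E_min = k_B * T * math.log(2)
print(f"{E_min:.3e} J = {E_min / eV:.4f} eV")  # ~2.87e-21 J = ~0.0179 eV

# The ~75 eV figure above corresponds to roughly 1.2e-17 J,
# i.e. a few thousand times the Landauer bound.
print(f"{75 * eV:.2e} J, ratio ~{75 * eV / E_min:.0f}x")
```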
@itsawonderfullife48022 жыл бұрын
Insightful video. The "reversible logic gate" (referred to near the end of the video) is simply using (controlled and confined) classical scattering to compute. And we already know that scattering is a reversible process, because it is directly dictated by the laws of mechanics (and conservation laws) such as Newton's 2nd law. And Newton's 2nd law is evidently time-reversible (as are other fundamental laws of nature), because it is expressed as a 2nd-order differential equation (involving acceleration) which is invariant under a time-reversal transformation (= keep positions and reverse velocities). The question then again goes back to this: given that the fundamental laws of nature are time-reversible, how come we have a thermodynamic arrow of time and irreversible macro processes (such as a traditional irreversible logic gate operating)? A common answer is that irreversibility (= the thermodynamic arrow of time) is an emergent feature of an ensemble of many particles (it's simply mathematics and probability). So the model "reversible logic gate" really solves nothing. It's just a toy model for controlled Newtonian scattering, which we have known about for hundreds of years. That does not tell us how to build computers which do not increase entropy.
@_kopcsi_2 жыл бұрын
The statement at 6:25 is wrong. Entropy CAN decrease, even in a thermodynamically closed system, but it has a very, very small chance. The entropy law is valid only globally (in the long run). This is a probabilistic description, which is not surprising knowing that entropy, information and probability are closely connected to each other.
@Mike5009122 жыл бұрын
1. Microprocessors do output energy by driving external ports, which is used to drive other subcircuits. 2. A logic gate isn't a fundamental building block; it is itself made up of transistors, a more fundamental building block.
@zscriptwriter8 ай бұрын
Back in college in 1984 I created a circuit gate simulator on my TI99 computer that allowed the user to place circuit gates onto the screen, connect them, input values, and then compute the output. Given enough memory, the simulator could reverse-lookup the initial values. Thank you Jade for reminding me how much fun I had in college. You are an awesome person with an unlimited imagination.
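In that spirit, here is a minimal sketch of the reverse-lookup idea (a hypothetical reconstruction, not the original TI99 program): brute-force every input pair and keep the ones consistent with the observed output.

```python
from itertools import product

GATES = {
    "AND":  lambda a, b: a & b,
    "OR":   lambda a, b: a | b,
    "XOR":  lambda a, b: a ^ b,
    "NAND": lambda a, b: 1 - (a & b),
}

def reverse_lookup(gate, output):
    """Return every input pair that could have produced `output`."""
    fn = GATES[gate]
    return [(a, b) for a, b in product((0, 1), repeat=2) if fn(a, b) == output]

print(reverse_lookup("NAND", 1))  # [(0, 0), (0, 1), (1, 0)] -- ambiguous: info was lost
print(reverse_lookup("NAND", 0))  # [(1, 1)] -- uniquely recoverable
```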
@IllIl2 жыл бұрын
Thank you very much for the video! This is by far the best explanation of the Landauer limit that I've heard; it actually makes sense to me now. One question I still have: this equivalence between information and entropy seems to be a purely theoretical limit that comes out of the math when we take the 2nd law as axiomatic and then "balance the entropy books" of a gate that has less information on the output than the input. But this seems to assume that "information" obeys the 2nd law. In reality, the information of a logic gate is something that a human interprets; the reality is just physical conductors and semiconductors. The physical reality of _any_ implementation of _any_ logic gate should always be sufficient to preserve the 2nd law. Or is that the point? That if we found the ultimate physical implementation of a logic gate, its waste heat would be equal to (or greater than) that limit?
@key_bounce2 жыл бұрын
One big problem: There are three-input, three-output versions of ALL the normal logic gates. These output the same inputs, just "switched", so you have the same number of input states as output states. (Checking google / wiki, one example is the Fredkin gate: if A is true, swap B and C; if A is false, leave B and C unchanged. This has been implemented in quantum computers; a 5-qubit computer can do a full add of two bits with no information loss. There may be other examples as well -- I thought there were but cannot find them.) As far as we can tell, the energy cost in this case depends entirely on the time factor -- since you care about the output on one line (the one that goes to the next gate), it takes time (and power) to change the state (voltage, current, or whatever is being used). Or put another way -- at some point, all your spare output lines have to hit a resistor and then the ground line, and that resistor has to generate heat (how much depends on R and I, which in turn means speed).
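A quick sketch of the Fredkin gate described above, checking that it is its own inverse and a bijection, so no input information is ever lost (Python, just to make the truth table concrete):

```python
from itertools import product

def fredkin(a, b, c):
    # Control bit a: if 1, swap b and c; if 0, pass everything through.
    return (a, c, b) if a else (a, b, c)

states = list(product((0, 1), repeat=3))
# Applying the gate twice recovers every input (self-inverse)...
assert all(fredkin(*fredkin(*s)) == s for s in states)
# ...and all 8 outputs are distinct, so the mapping is a bijection.
assert len({fredkin(*s) for s in states}) == 8
print("Fredkin gate is reversible")
```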
@atrus38232 жыл бұрын
Great video, as always! Based on my absolutely zero research, my first thoughts are that though the entropy change (assume lossless collisions) of the gate is 0, half of the output energy is not used in the final calculation. There would need to be a way in the system as a whole to make use of the waste balls, or else you're right back where you started.
@waylonbarrett34562 жыл бұрын
This has been considered. See the Szilard engine.
@atrus3823 Жыл бұрын
Thanks for the replies. Someday, I'll hopefully find time to follow up on this.
@LMProduction2 жыл бұрын
The Landauer limit can be circumvented by noting that the entropy can be dumped into other conserved quantities, not just energy. A paper from 2009 (and a few more papers in the years since) by J. Vaccaro and others showed that, in principle, you can transfer the k_B ln(2) of entropy per bit to an angular momentum reservoir, therefore inducing no energy loss. We're currently working on a qubit operation to show a single-bit transfer of thermal energy from the environment into a spin bath experimentally.
@douggale59622 жыл бұрын
The fundamental component of modern computers is the field-effect transistor. They are metal-oxide semiconductors, so they are known as MOS transistors. There are positive and negative constructions, and the two constructions are suited for pulling (voltage) up and pulling down respectively. Those two constructions are "complementary", so the fundamental component is, more specifically, the CMOS FET. You switch a FET by charging or discharging its gate layer. The power used by a processor is the dissipation due to the resistance of conductors transferring charge in and out of the gate. Holding a 1 or a 0 state does not consume power; only the changes dissipate significant power.
@abdobelbida71702 жыл бұрын
Is that what "switch losses" means?
@douggale59622 жыл бұрын
@@abdobelbida7170 Yes. Usually, switching losses primarily refer to the time when a transistor is partially on, and a voltage drop across the source and drain (the switched path) dissipates power. In a switching power supply with large currents, the partially-on pulse of losses is large. That's one huge transistor with one huge pulse of losses. In a CPU, there are no large currents going through any individual transistor, but an enormous number of gates being charged or discharged simultaneously, so an enormous number of small losses sum up to a large loss that would be correctly called switching loss. All of the power that goes into a CPU is lost, all of it becomes waste heat, eventually. Another way of looking at it is, CPUs use no power, other than the power they need for the losses, once they get going.
@esquilax55632 жыл бұрын
This is one of my favourite videos of yours. Most of them cover topics I have a fair bit of familiarity with, but your "where's the mystery?" intro made me realise I've barely thought about this at all
@danielclv972 жыл бұрын
I love the video, but I'd like a follow-up, maybe with Maxwell's demon. Like, if this can solve the need for energy consumption during information manipulation, what stops Maxwell's demon from breaking the laws of thermodynamics? Or is it still physically impossible to store information with no expense of energy? And if yes, then isn't the billiard-ball computer useless, in the sense that it will consume the same minimum amount of energy? You can't use information if you don't store it to interpret it, after all. You'll have to at least read the positions of the balls after they pass through the computer.
@geraldsnodd2 жыл бұрын
Maxwell's Demon is a cool topic in itself :) Check out Eugene Khutoryansky's video.
@christopherknight49082 жыл бұрын
If I'm not mistaken, Maxwell's demon has to expend energy to erase information. Would reversible computing eliminate this requirement? Perhaps information storage is just a different problem that would need to be solved, in addition to the energy usage of computing.
@_kopcsi_2 жыл бұрын
@@christopherknight4908 yes, information storage is totally different from computing, but interestingly the two problems have the same source: "Landauer's principle is a physical principle pertaining to the lower theoretical limit of energy consumption of computation. It holds that "any logically irreversible manipulation of information, such as the erasure of a bit or the merging of two computation paths, must be accompanied by a corresponding entropy increase in non-information-bearing degrees of freedom of the information-processing apparatus or its environment"." -- en.wikipedia.org/wiki/Landauer%27s_principle storage of information is static, manipulation of information is dynamic. these two have totally different nature. but when we erase information, we actually manipulate it. erasure is dynamic. that's why the same principle can provide a solution for the problem of Maxwell's demon and an explanation for the fundamental energy cost of irreversible logical operators.
@surya_112 жыл бұрын
14:48 So output zero needs to carry the gate's own input, i.e. we're looking for a function that looks like f(x, y) = (x, f(x, y)). Such a function would produce an output containing infinitely many x's, which makes reversible gates impossible even theoretically, because if f(x, y) = (x, g(x, y)) where f ≠ g, then g would itself form a gate and produce heat.
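For comparison, the construction usually cited in the literature avoids that regress: the gate is f(x, y) = (x, g(x, y)) where g(x, ·) is a bijection in y for each fixed x, so g is not itself an information-discarding gate. A minimal sketch of the CNOT case, g(x, y) = x XOR y:

```python
def cnot(x, y):
    # Pass x through; replace y with x XOR y.
    return (x, x ^ y)

# CNOT is its own inverse: no infinite regress, no lost information.
for x in (0, 1):
    for y in (0, 1):
        assert cnot(*cnot(x, y)) == (x, y)
print("CNOT is reversible")
```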
@Santor-2 жыл бұрын
Exactly. A computer built like this would serve up all the inputs as the output, which defeats its purpose.
@Zeero38462 жыл бұрын
I got so excited that I actually managed to guess your topic of reversible computing. I had only recently learned about it through some crazy crypto proposal about an energy-backed stablecoin, and it fundamentally relied on the idea of reversible computing. In particular, it suggested a way to transfer energy over the internet. A computation doesn't necessarily have to be located on a single computer; it could happen over the network, if you imagine the network as one giant computer. In performing the computation, the input side would generate heat, while the output side absorbed an equivalent amount, and you essentially have a new kind of heat pump. Of course, this sounds way too easy to be true, and that's because it is, but it's still definitely cool to think about. It's somewhat reminiscent of Nikola Tesla's wireless energy tower, except perhaps it's a little less scary, as it's not trying to transmit energy directly through some physical medium, like the earth and sky, but rather over our telecommunications infrastructure as data.
@vansf34332 жыл бұрын
Whenever current runs through a logic gate or gated latch, you have an "on" signal coming out of an output; that signal corresponds to 1. When there is no current through another output, the absence of a signal is interpreted as an "off" signal, or 0. And when you have a bunch of gated latches arranged in matrices to keep such bits of 1 and 0, you have a computer memory.
@landsgevaer2 жыл бұрын
After having spent so much time thinking about how to carefully explain an unintuitive topic and managing to produce such a wonderful video, it must be so disappointing reading so many of the comments here that don't show anything has landed. Fortunately, there are a few exceptions too. Rest assured that it is the YT audience that is lacking, not the producer... 😉
@En_theo2 жыл бұрын
Well, she kinda misunderstood some concepts herself, since she said @6:24 that entropy "never" goes down, which is false. It tends to increase, but there is still a possibility that it goes down. If the universe is eternal, then that possibility becomes a certainty, and you find yourself with renewed energy again (after a veryyyy long time though).
@landsgevaer2 жыл бұрын
@@En_theo That may be a theoretical note to make, but I wouldn't say that makes the statement "wrong" for this audience. We can hopefully agree that many comments are much worse... 😉
@En_theo2 жыл бұрын
@@landsgevaer People are entitled to their comments, as long as they don't pretend to be physicists bringing new knowledge. I'm not blaming her; it's a common misconception that entropy "always" increases. A simple sentence like "it tends to, but the details are for another video" would have been nice.
@itsawonderfullife48022 жыл бұрын
Great video as always. One little reminder though: at 7:53 you also have to consider the power plant (and include it in the "system") which makes the electricity used to produce the work (= energy for the refrigerator's compressor) that the refrigerator needs (to reverse and decrease the entropy of its interior), in order to say that the 2nd law is maintained overall and that the entropy of a closed system (most often) increases.
@AdrianBoyko2 жыл бұрын
What a great presentation of these topics. This is Carl Sagan level exposition!
@phatrickmoore2 жыл бұрын
Since at some point information will be extracted from any computing system, I believe the Landauer limit cannot be evaded. Reversibility seems possible if the system remains undisturbed, but otherwise the signal must at some point be either duplicated or wholly returned.
@compuholic822 жыл бұрын
Yes and no. You are right that, in some sense, it is not possible to evade the Landauer limit. A good way to see that is to look at mathematical calculations that are fundamentally not invertible. These are calculations that destroy information. To finish such a calculation you might need many computational steps. During the computation you can actually cheat the fundamental irreversibility of the calculation by using reversible logic. The cost of this is that you will need to add "waste bits" to the output which are not part of the actual answer. So you are actually deferring the fundamental irreversibility of the calculation till the end. After all, you are only interested in the answer and not the waste bits. And by discarding the waste bits you are then destroying information, and the Landauer limit will hit. But not all of these "waste bits" are necessarily uncorrelated. So even if you are discarding N bits of data, you might actually not be discarding N bits of information. And that means that if the computation was implemented in non-reversible logic, the computation was less efficient than it needed to be.
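As a toy illustration of that deferral (a sketch in the spirit of Bennett's uncomputing trick, not anyone's actual hardware): compute AND reversibly with a Toffoli gate, read out the answer, then run the gate backwards so the scratch bit returns to zero; only erasing the copied answer later incurs the Landauer cost.

```python
def toffoli(a, b, c):
    # (a, b, c) -> (a, b, c XOR (a AND b)); a Toffoli gate is its own inverse.
    return (a, b, c ^ (a & b))

def reversible_and(x, y):
    a, b, g = toffoli(x, y, 0)  # forward: scratch bit g now holds x AND y
    answer = g                  # copy the answer out
    a, b, g = toffoli(a, b, g)  # backward: uncompute, scratch bit back to 0
    assert (a, b, g) == (x, y, 0)
    return answer

print([reversible_and(x, y) for x in (0, 1) for y in (0, 1)])  # [0, 0, 0, 1]
```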
@phatrickmoore2 жыл бұрын
@@compuholic82 hi, thank you for your thought-provoking response! I would need to think about some examples of your arguments for mathematical irreversibility and the ability to cheat it with reversible logic. For one, I know there are both reversible and irreversible mathematical operations: a reversible one is adding a constant to a number; an irreversible one is taking the floor of a number (or any modulus-type operation). I also know that in any finite-memory digital system there is of course some inherent irreversibility due to precision losses. I'd have to think about whether any of this matters :) But! I think your thoughts about waste bits are a big part of this. Ultimately, it is believed that information is conserved at the quantum level, so there is no information lost, ever. The situation Landauer's limit applies to then seems to shift to the energy equivalence specifically related to the amount of extracted information. Then this becomes super interesting, because we would need to pin down exactly what extracted information means. Likely this would come down to the information that is causally linked to future events of the extracting system (that is, in the past light cone). I guess this is obvious, but the Landauer limit seems like a direct analog of the thermodynamic limits on the work you can extract from a system (in terms of energy). This may have already been stated in the video or somewhere!! But I think, in general, the Landauer limit is precisely about the limits of extraction of information.
@paulthompson96682 жыл бұрын
Hi Jade, this video seems like a wonderful lead-in to a video on how quantum computers will require less energy due to the reversibility of their gates.
@dodokgp2 жыл бұрын
Not unless the cryogenic operation is obviated.
@paulthompson96682 жыл бұрын
@@dodokgp So let's get cracking on quantum computing at room temperature!
@Dellvmnyam2 жыл бұрын
Thanks for the video, I learned about the Landauer principle from it. But the title is misleading: information does not give off heat by itself, the computation does, if it's irreversible. Just like a photon does not lose its energy by itself, only when colliding with other particles.
@micro27432 жыл бұрын
"Pure Information" (Data) can be stored on a Floppy Drive/Hard Drive/CD/DVD/Thumb Drive/SSD/etc... and it requires no energy and gives off no heat. Moving information requires Energy and produces Heat! Data stored in active memory must be refreshed and also requires Energy and produces Heat, and of coarse physical hard drives spin for a while even when they are idle. You covered logic gates, but it should also be noted that often computers are just moving data i.e. copying an image from a web server to your local hard drive, then to active memory, and eventually to your graphics card so you can see it on your screen.
@StefanNoack2 жыл бұрын
Well, the floppy media requires mass, and that is equivalent to energy ;-)
@Rithmy2 жыл бұрын
And the wear and tear of the floppy media can be said to equal energy loss. Almost like radiation in big.
@micro27432 жыл бұрын
@@StefanNoack A mass at rest is potential energy and I don't think it produces heat.
@micro27432 жыл бұрын
@@Rithmy I am talking about cold storage, and I know a floppy can last 20+ years. I don't think a CD-R/DVD-R lasts that long. Retail CDs/DVDs are usually pressed and degrade much more slowly. Since energy cannot be destroyed, you are correct: that magnetic field has to be going somewhere else as the floppy degrades, theoretically creating a small amount of heat. Would we even be able to measure it?
@Rithmy2 жыл бұрын
@@micro2743 Isn't mass alone creating heat by pressure?
@francescocolangelo69002 жыл бұрын
I see a problem here: in order to be reversible, you need to store all the data all the time (e.g. here, output + auxiliary input). That's in no way sustainable. And the moment you discard the extra information, you are generating the entropy that you were trying to save.
@jneal41542 жыл бұрын
Exactly. This was Charles Bennett's argument against this line of reasoning. You do not have infinite computational space or time. The Maxwell's demon thought experiment (when assuming the demon operates at zero energy cost) immediately violates the conservation of mass and energy. The mere act of storing or retrieving information requires energy.
@davidh.46492 жыл бұрын
So Jade ... does the Landauer limit apply to synapses in the human brain as well? 😁 Good thought-provoking video as always!
@gotbread22 жыл бұрын
It does. It's a fundamental consequence of information erasure.
@cmilkau2 жыл бұрын
If you want to use reversibility to eliminate the entropy going down, you have to make the gates produce their outputs by overwriting their inputs (e.g. like quantum computers do). If you have an extra output wire, and the gate changes it to the "correct" value, no matter what value it had before, that gate reduces entropy by 1 ("output wire has any value" to "output wire has a definite value"). The only way to compensate for that would be to "forget" (overwrite) one of the input values. The billiards ball machine does that, as the balls leave the input pipes. Quantum computers also do that, as the inputs can no longer be measured after they have been processed.
@bntagkas2 жыл бұрын
I'm just a stupid high-school dropout, but to me it seems all of these energies are really kinetic energy. Whether it moves a car, manipulates information or produces heat, if you zoom in you move atoms in a way that benefits you: you move atoms to move the car (kinetic), you move atoms/electrons/photons to manipulate information, you move atoms etc. to produce heat. So it seems to me all kinds are really one: kinetic.
@ThatJay28310 ай бұрын
6:30 So this would be important because, in a computer, entropy can go down: it can simply clear chunks of memory to zero, load programs, etc. So in order to decrease the entropy in a computer, an increase in entropy must happen elsewhere (e.g. by fissioning uranium atoms in a power plant) to counteract this decrease in entropy. That is the theoretical limit.
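For scale, here is a back-of-envelope sketch of the minimum heat released by zeroing memory at the Landauer bound (room temperature and 1 GiB are assumptions picked for illustration):

```python
import math

k_B, T = 1.380649e-23, 300.0   # J/K and kelvin (room temperature assumed)
bits = 8 * 2**30               # zeroing 1 GiB of memory

E = bits * k_B * T * math.log(2)
print(f"{E:.2e} J")  # ~2.5e-11 J -- utterly negligible; real hardware
                     # dissipates many orders of magnitude more per bit
```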
@MrAttilaRozgonyi2 жыл бұрын
I remember reading somewhere that information actually has weight. In other words, a hard drive full of data is measurably a tiny bit heavier than that same drive when erased. I wish I could remember the actual reason why. Maybe that could make an interesting video? All the best! ☺️☺️
@Snowflake_tv2 жыл бұрын
I'm interested in that, too.
@markricher20652 жыл бұрын
I was thinking the very same thing 🤔
@Snowflake_tv2 жыл бұрын
Is information then mass???
@MrAttilaRozgonyi2 жыл бұрын
I have a vague recollection that it *may* be limited to the older style HDD’s which have magnetic platters and it may not be the case in SSD drives… but that’s all I can remember. :) I’d love to know more about this.
@RobBCactive2 жыл бұрын
As HDD platters magnetise the coating when the drive heads set bits and sit in the Earth's magnetic field, it's possible the measured weight changes a little by adding a magnetic force but that is not gaining or losing mass.
@Kimwilliams452 жыл бұрын
Thank you. I had never heard about the Landauer limit even though I was a physics student. Very good explanation.
@rikardlalic72752 жыл бұрын
During our lifetime we are solving equations all the time, by acting, thinking, making decisions, thereby constantly and irreversibly reducing the set of available solutions. Is it this that creates the illusion of time flowing, and flowing in one fictional direction only, making up such a strange kind of awareness of the past, the present and the future? Does our time flow equal energy loss? What happens to the exhaust, the used, discarded (and lost) bits?
@mchammer50262 жыл бұрын
anyone notice that the truth table for the EQV+ gate at 13:40 makes absolutely no sense? it's just an exact copy of the inputs. in order to function anything like an XNOR gate there would need to be one column reading 1 0 0 1 from the top down, representing the output of the XNOR, and then a second column can be added to that to make it "reversible".
@bubbacat994010 ай бұрын
She decided to draw arrows from the inputs to the outputs instead of lining them up, for some reason, but if you match them up with what the arrows indicate, it does work.
@mchammer502610 ай бұрын
@@bubbacat9940 Ah of course you're right, how silly of me
@allanwrobel6607 Жыл бұрын
A fundamental concept explained so clearly. You have a rare talent; keep going with these.
@ciroguerra-lara6747 Жыл бұрын
Another way to look at the Landauer principle, from what I've read, is as the energy needed to erase one bit. I also read that thermodynamic and computational reversibility are not always equivalent. And the Landauer limit on a Maxwell demon implies that, since in reality the demon will eventually have to erase information (we do not have infinite resources to keep information), irreversibility and entropy increase eventually catch up with us.
@beepbop6697 Жыл бұрын
I had read that it is the LOSS of information that produces heat (or requires energy), which jibes with what this video shows: you can't reverse this particular logic operation, so information was lost when it computed the result, hence heat/energy is given off.
@jursamaj2 жыл бұрын
Problem: most of the irreversible gate changes occur from storing info in memory, not from computing that info. And that storage is the whole point of computation, so it's unavoidable.
@TheoWerewolf2 жыл бұрын
To be more precise: it's the cost of *destroying* information. You can read a bit of data with almost no energy (system inefficiencies and basics like particle interactions are the limit there - you can't violate entropy after all), but *writing information*, if you're changing state, consists of destroying the original state and THAT'S what's expensive.
@ZMacZ Жыл бұрын
The cost of information is based on the expenditure of keeping it intact as well as forming it. So the minimum energy for any current storage system is based on the smallest particle you can hold, which must then be stable as well. A photon is currently the smallest particle of information; in the future this may include quarks. The minimum expenditure for keeping a particle in its state varies with the sort of particle used. If you'd use a proton set (H2), it would stay stable at minimum cost while being supercooled to near-0K temps. With enough space and a lack of energy to apply heat, this would stay in position. Of course this is far from the Landauer limit, since the value there is much smaller, and only theoretically feasible. Last time I checked, photons would not want to stay in the same state when held, even at a temperature of 0K.
@ZMacZ Жыл бұрын
Economically lowering the Landauer limit by cooling can only be done in energy-deficient space, like space outside the solar system. On Earth you need really highly efficient insulation and a lot of cooling. Not economical. In fact, you can show that information, and with that energy, is best stored in information-deficient areas of space, since there the ambient energy is lowest, and with that the density of information already present disturbing yours is lowest. Also, following information = energy = mass, the lowest-mass particle indicates the practical minimum, nearing the lowest entropy of a single bit of information, which can never actually be reached, since the expense of keeping it that way is above zero.
@mmicoski2 жыл бұрын
Very nice explanation with the billiard balls: each input has to go somewhere, if not as output information, then as lost energy (heat).
@jolodojo2 жыл бұрын
Great video. What I do not understand, however, is that the wish of every physicist seems to be to make something irreversible reversible. It is like the quest for a perpetuum mobile all over again. In my mind this is the true waste of energy. Think what possibilities there would be if everybody started accepting that there is no reversible universe. There is no going back in time, only forward.
@sethrenville798 Жыл бұрын
Wonderful explanation! I would like to clarify two-ish things per my understanding. The first is that the 2nd law of thermodynamics isn't actually a law in the sense that it is a guarantee for all things. All it says is that systems generally tend towards disorder. Now, mathematically speaking, it covers all macroscopic systems, as the likelihood of a system, for example, of gas all of a sudden collecting itself into the corner of a room is so improbable that it would likely take billions of iterations of creation and destruction of something as complex as our universe before it ever occurs. I think the most straightforward way I could describe it is as follows: all computationally bound systems within the 3+1-dimensional spacetime we are embedded in tend to evolve in the most probabilistically likely manner, which appears macroscopically as a dissipation of usable energy as bits of information are generated, consuming tiny amounts of energy with each stroke of the universal computation that causes spontaneous symmetry breaking and the subsequent generation of physical reality from the higher-dimensional probabilistic wavefunction. Lastly, even if the computation within the computer can be made reversible in a meaningful way, there will still be a lower limit, as with that setup you are simply removing the energetic requirement that comes from erasing the state of the bits during computation; the generation of information, regardless of its reversibility, still requires energy, due to the collapse of probabilistic wave functions, even if those wave functions only contain the probability of one outcome. Upon further reflection, this seems to me potentially dangerous, as all energy and mass are particular ways of encoding information, and by calculating more outcomes without erasing the previous outcomes, the density of information within those isolated systems will begin to grow exponentially, which, as we know from the way mass behaving in such a manner creates black holes (mass just being another way of storing information), is liable to lead to a terrible outcome very quickly. If anyone thinks I am incorrect in this line of thinking and would like to point out why, please do, as I am not certain I have a full understanding of the implications of the informational model of entropy. As an aside, I believe this energy consumption is analogous to all of the potential outcomes that were not able to be actualized being used to generate the masses we interact with through the Higgs mechanism, which necessarily causes energy loss, with the concurrent decrease in frequency brought about through the interaction. That actually fairly well sums up the 2nd clarification I had: the irreversibility of computations stems from the paring down of probabilities into a specific actuality. The issue with trying to get around this is that it is the nature of this irreversibility that gives meaning to the computation, as a computation that takes our particular instance of existence and turns it back into multiple probabilistic outcomes is, as far as our experience is concerned, meaningless, since we can only experience discrete, specific instantiations of existence. Therefore, even if these calculations do occur, they will never be meaningful to us, and will likely never even be able to be studied in any meaningful way, if my understanding is correct.
Lastly, and this is pure conjecture, so please rebut with any questions, comments or concerns: it seems to me that these specific instances of reality are only generated because of their necessity in regards to conscious experience, as the calculations that collapse the wave functions cost energy and thus would seem to violate Newton's 1st law and/or the law of conservation of energy, unless there is some other force that necessitates this particular manifestation of existence. There could very well be some other explanation, but until I can either think up or come across one, this particular quirk of experience seems to point to Donald Hoffman's conscious agents theory holding even more water than I had originally supposed.
@kozmizm2 жыл бұрын
In classical computing some information is destroyed(lost) when doing a calculation/computation, but if you do things cleverly, you can create an odd symmetry where the calculation done forward in time is also essentially the same as the calculation done backward in time, also referred to as reversible computing, in which no information is lost, even though it requires that many more states are preserved, and the result is sort of like the same thing as the initial state in the first place, adding quite a lot of difficulty into the design of the machine. Yet, in a superconducting environment, this is 100% efficient, theoretically, leading to zero energy wasted. That is, if my understanding of all that is correct. Hence, it is theoretically possible to have a special quantum computer that does not waste any electricity. Instead of destroying information by starting with multiple pieces of information that combine down into fewer bits of information, we preserve all the information during the transformation using our special knowledge of physics, and the number of inputs equals the number of outputs and everything is able to be deterministically reversed. Even though we only care about a portion of those outputs, no energy was destroyed. Every bit of energy has been transformed but preserved without any destruction. It's hard to wrap your head around, but it seems to be a sound idea. Kudos to whoever came up with that idea! Essentially it preserves everything so there is basically zero entropy and therefore no loss. Even if there are minuscule amounts of entropy, the circuit is so efficient that the energy loss will be almost zero compared to classical computing. You can't do this using regular transistors. You have to build this symmetry into the physical design of the system itself, using the natural properties of nature to outsmart entropy. It's not a simple problem, but in quantum gates using quantum computing, it is theoretically possible, I'm using that word again, "theoretical". Honestly, from a purely theoretical point of view it makes sense. Just look at superconductors and we see that entropy can be overcome in certain circumstances. Now, imagine if we can do superconducting materials combined with form of super-symmetrical logic gates to overcome entropic waste. it's like taking superconducting to a meta level. It's the supersymmetry hypercube, man. Like, totally dude! Far out!
@sunnybeach40252 жыл бұрын
I find it especially captivating that you seem to specialize in language, yet ground this presentation in a strong physical foundation. I believe my time is best spent learning more about modern language and...
@antonystark92402 жыл бұрын
A reversible computer that is sufficiently free of energy loss can do computations that are driven entirely by thermal energy. You set up the program and wait. The computer will be driven back and forth through the program entirely by thermal energy. (Imagine a really slippery, nearly friction free, mechanical computer with reversible gates. Brownian motion will drive such a mechanical computer.) If you wait long enough, a halting program will eventually wander into the final state, and all you have to do is then recognize that this is the answer before the random thermal energy once again reverses the computation. The machine will wander back and forth through the program, sometimes getting back to the beginning and sometimes getting to the end, as long as heat energy is available to drive it. The entropy does not change, aside from the entropy needed to recognize the final state when it occurs.
@JohnStephenWeck2 жыл бұрын
Good video. A computation in nature implies changing an input value into a significantly different output value. Starting with and ending with 42, for instance, is always a useless operation. "Changing structures" means the same thing as "energy". So all computations in nature are forced to take energy (they must change some structure). This is why reversible computations can only exist in mathematics (a separate software universe from nature). Thanks for listening 😊
@richardleeskinneriii96402 жыл бұрын
This is interesting. In cybernetics, variety is the number of possible states in a system. The larger the variety your system has, the more entropy it has. It's this link between thermodynamics and cybernetics that I'm very interested in.
@MrKenny3682 жыл бұрын
The part where she talks about reversible computing and the ball computer has one fun connection to quantum computing. In quantum computing, you have to make all your operations reversible, because it uses real physical objects (electrons) as input/output, and it follows the laws of physics. You can't destroy energy or information, and that's exactly what you would do if you used a non-reversible logic gate: you would have to somehow delete some of the electron's energy/information from reality. So it behaves like the billiard-ball computer: you have to keep both balls'/electrons' spins as output, but use only the right one as input for another gate (the other one just kind of sits in the system). It could theoretically overcome the Landauer limit. But it also introduces another problem. The whole time you run a calculation, all of the electrons used have to stay part of the system/calculation. This means you have to keep a lot of electrons entangled and stable for that time. Any instability, heat, photon or cosmic ray passing by can flip one of those electrons. And because all of them have to be part of that operation for it to be reversible, any single spin flip will fuck it up. This greatly limits how many bits such a system can have. Just thought I'd put this here, since it kind of blurs the line between physics and computer science.
@JohnSmith-ut5th2 жыл бұрын
You made a logical leap from defeating the Landauer limit for a single gate to defeating it for a whole computer. Adding multiple layers of the gates you described *exponentially* increases information, unless you disregard information, which is irreversible. In 2001, I developed the mathematics of a reversible computer that only needs 1 additional bit of information per gate layer (unpublished), so only linear growth.
@runethorsen84232 жыл бұрын
"Pure information" doesn't give off heat ;) only when it is transmitted (no longer pure - but physical) does energy states change (mechanical energy with a measurable thermal output).
@jakobthomsen15959 ай бұрын
Nice explanation of reversible computers! An additional challenge in implementing them seems to be that the extra outputs can't simply be thrown away (that would produce heat again), so they must somehow be "recycled".
@helicalactual Жыл бұрын
How can you not model the system in all permutation states prior to the NAND operation and figure out which it was? It seems short-sighted to say that it's irreversible or "unknowable" when you can model both system states perfectly well. The bottom line is that there is information stored in the position operator, and given an "ideal" system, you could trace the "steps" of the other parts of the system and re-derive, not assume, where the constituent parts of the system are located, and so re-derive all the particles, including the one that obfuscated the system, the step before the NAND.
@MrDiaxus Жыл бұрын
If we defeat the Landauer limit by increasing the outputs to match the inputs, aren't we not only chasing our tails but also increasing the steps in the circle we're running in? The limit describes entropy in one form, which is a term that helps maintain energy conservation, and so on. Defeating it only changes how the entropy originates, as we need more "standard" forms of energy to maintain the larger devices, and when inefficiencies are added we are only increasing entropy by doing so. Added to this is the problem that we are now not condensing information (i.e. 2 inputs, 1 output), but are maintaining the storage of both the inputs and outputs, and thus the energy required to process and store that information increases. To my mind, in the end we are noticeably increasing the entropy and energy requirements while losing the problem-solving and data-compression capabilities, in a vain attempt to thwart something that is in no way a threat. Just my thoughts.
@vishalmishra30462 жыл бұрын
3:45 The largest prime under 1 billion is *999,999,937*. Computers are indeed amazing in their ability to answer unexpected questions with such low effort, cost and time.
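For anyone who wants to reproduce that, a one-liner sketch (assuming sympy is installed):

```python
from sympy import isprime, prevprime

p = prevprime(10**9)  # largest prime strictly below one billion
print(p, isprime(p))  # 999999937 True
```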
@KipIngram Жыл бұрын
But you see, changing the internal state of the information storage elements in a computer involves moving electrical charge, literally. You can think of it a bit like all the info elements are little capacitors. To get a high voltage (a "1") onto a capacitor, you have to charge it up. To do this, you take charge from the power supply's positive rail and route it onto the capacitor. Later, when you want to change that element to a "0", you have to get rid of that charge, so you discharge it to ground. So, for each 0 -> 1 -> 0 cycle, you've moved a sip of charge from power to ground. That is energy: Q*dV. Outside the processor you use some kind of power supply to pump the charge back up from ground to the positive voltage. So, there's no mystery about where the energy is being used.
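A rough sketch of that accounting, with made-up but plausible numbers (capacitance, voltage, activity factor and gate count are all assumptions, not measured values):

```python
C     = 1e-15  # effective capacitance per gate, ~1 fF (assumed)
V     = 1.0    # supply voltage in volts (assumed)
f     = 3e9    # 3 GHz clock (assumed)
alpha = 0.1    # fraction of gates switching each cycle (assumed)
N     = 1e9    # number of gates on the chip (assumed)

E_cycle   = C * V**2                   # energy per full 0 -> 1 -> 0 cycle
P_dynamic = alpha * N * C * V**2 * f   # classic dynamic-power estimate
print(f"{E_cycle:.1e} J per switch, {P_dynamic:.0f} W dynamic")  # ~1e-15 J, ~300 W
```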
@gsittly2 жыл бұрын
Channel: ♥️ Topics: ♥️♥️ Explanations: ♥️♥️♥️ Jade: ♥️♥️♥️♥️ Thank you for this great stuff you share with us 😊
@robmsmy Жыл бұрын
Jade, I discovered you recently. I have enjoyed your explanations of topics I'm interested in. One thing, though, about this one. Several times you worded explanations as "this" must happen to fulfill the law (i.e. 2nd L of T). "Because of the law, this must happen." That made me wince, because when we explain why something happens, we should give the mechanism, rather than saying that the "intention" of the agents acting is to fulfill a law. A law is a general summary of what seems to be the case. It's not a mechanism to make it be the case.
@kreynolds1123 Жыл бұрын
One way to look at the universe is that it's a perfect quantum computer performing quantum calculations on every particle interaction. Some might argue about the usefulness of this view, but I wonder what we might discover if we applied irreversible-vs-reversible computing concepts to it. What predictions might be made, and could they be tested?
@lensman75192 жыл бұрын
Great content, ty. Nice convergences with the Planck length, spooky connectivity, entropy and quantum computers. Thx for the headache.
@thearvie5 ай бұрын
Processors (and other electronic circuits) use energy to flip transistors on and off, "processing" the data: calculating stuff and so on. A transistor is a TRANSfer resISTOR that wastes a bit of energy as heat in order to switch from the on-state to the off-state and vice versa. So, in a very, very crude way, processors and other electronic circuits are just resistors that think (and convert electric energy into heat).
@williamlangley1610 Жыл бұрын
I think your explanations are very easily understood...thanks.
@AaronTheGerman2 жыл бұрын
Did you know that all logic gates on a quantum computer are reversible? In fact, every gate on a quantum computer can be represented by a unitary matrix which is applied to the state of the qubits, so it's really all just rotations
@jwjarro7311 ай бұрын
This is one of the most intuitive descriptions of entropy I've heard
@cmilkau2 жыл бұрын
Hmm, I don't think it matters whether the gate has more inputs or outputs. The signal at the inputs is still there when the output is calculated. If you think that the output is known after the calculation, but not before, entropy in the circuit always goes down by the number of outputs, no matter how many inputs. Usually, as stated earlier in the video, the Landauer energy is associated with changing one bit (e.g. a signal on a wire), or more precisely, *overwriting* it (throwing the previous value away).
@alansmithee4192 жыл бұрын
My understanding of entropy was that the more states a system can be in *that look nearly identical to its current state*, the higher its entropy? Like, some gas all shoved into the corner of a room has much lower entropy than that same gas spread out to fill the whole room, even though in both cases it's the same amount of gas and would require the same amount of data to determine the exact position of each molecule of said gas.
@johnwiltshire87632 жыл бұрын
There are some very important distinctions missing from this and that generates muddle and confusion. That Mr Claude Shannon has a lot to answer for. In his groundbreaking paper on information theory, he was persuaded to use the term "entropy" in a subtly different sense to how it relates to thermodynamics. Consequently, "Information Entropy" is not the same thing as "Thermodynamic Entropy". The former relates to uncertainty between a sender and a receiver connected via a communications channel, and the latter to the spreading out of energy concentrations. The other important distinction is between "Information" and "Information Processing". When your computer is switched off only the information in volatile RAM (Random Access Memory) is lost. Information on magnetic discs and Solid State Drives is retained without consuming power. Information processing requires physical devices that can switch from one state to the other and it is that physical switching that consumes energy and is subject to the second law of thermodynamics which does NOT apply to information. Shannon was concerned with the behavior of communications channels and the relationship between what the sender knows and what the receiver knows. However, there is a much simpler and more intuitive definition of "information" that dispels much of the mystery. If physical arrangement A is correlated with physical arrangement B then, in principle, something about arrangement A can be inferred by examining arrangement B. Shorthand for this is - A and B contain information about each other. Nothing physical passes from my brain to your brain when you read this so "information" is immaterial. The information is passed via a complex sequence of physical correlations each of which requires energy to establish. The physical components in the chain are subject to the second law of thermodynamics but the information is not.
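To make that distinction concrete: Shannon's H is just a property of a probability distribution over messages, with no heat involved. A minimal sketch:

```python
import math

def shannon_entropy(probs):
    """Information entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))    # 1.0 bit   -- fair coin, maximum uncertainty
print(shannon_entropy([0.99, 0.01]))  # ~0.08 bit -- receiver is almost certain
print(shannon_entropy([1.0]))         # 0.0 bit   -- no uncertainty at all
```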
@michaelgonzalez90582 жыл бұрын
That perception is the overall output input reveal all knowing