Intro to Information Theory | Digital Communication | Information Technology

182,203 views

Up and Atom

1 day ago

Comments: 503
@xSkyWeix 2 years ago
It is an old video and probably no one ever will read this. But I am always amazed at how much you, Jade, care for details and the cinematic side of your videos. Truly a creative treat each time. For me, all these little antics are the best part.
@kayakMike1000 1 year ago
I read this comment.
@billyalarie929 1 year ago
Not only did we read this but the very creator of this thing wanted to let YOU know that YOUR COMMENT is important. Even years after the fact.
@xSkyWeix 1 year ago
@@billyalarie929 Yeah, it always cracks a smile on my face when I get prompts about this comment :)
@joeheintz 1 year ago
that's all she cares about, because she doesn't understand the science side of any of her videos.
@stefangabor5985 5 months ago
No, people will read it.
@graciouscompetentdwarfrabbit 6 years ago
The fact that English is my second language and I read it in a blink (probably 3 or 4 blinks tbh, but I ain't counting my blinks) makes me pretty happy
@alexjordan8838 6 years ago
Gabriel M. English is my first language and I couldn’t even read it! Lol, keep it up!
@vishwasnegi5184 6 years ago
Nice sense of humor 😂
@graciouscompetentdwarfrabbit 6 years ago
Blinks are a good unit for sentences and possibly paragraphs. It's like using pats or 4-syllable words for measuring hug duration (btw my go-to hug-wise is 3 pats and 2 words for casual hugs; if that hug means a little more than usual, at least double the word count and DO NOT PAT, unless you think it's enough hugging)
@monad_tcp 6 years ago
yes, probably it depends on how many hours you spend reading
@chrisschock9283 5 years ago
Same :D
@edmundkemper1625 2 years ago
That "Least amount of Yes/No questions to ask to arrive at the answer" part is spot on, one of the most intuitive explanations of entropy!
@upandatom 6 years ago
Really so excited to be back guys :)
@piyushh8859 6 years ago
Up and Atom🙋🙋🙋🙋🙋 welcome back 🙌🙌🙌🙌🙌
@piyushh8859 6 years ago
Up and Atom you have forgotten to pin your comment 😅😅😅
@DrychronRed 3 years ago
The volume of the music is a little high in my view. It's hard for me to grok what you're saying and to process it a moment later. This is feedback meant to be constructive. Love your videos!
@mathew4181 2 years ago
Shannon's paper "A Mathematical Theory of Communication" defined how information is encoded, transmitted, and decoded. His most dramatic discovery was a remarkable parallel: the math that describes the uncertainty of noise is exactly the same as thermodynamic entropy. He called noise "information entropy."

Formally, heat entropy is the principle in thermodynamics that says the path from order to disorder is irreversible. Information entropy is similar, but it applies to data instead of heat. It's not just a pattern of disorder; it's also a number that measures the loss of information in a transmitted signal or message. A unit of information, the bit, measures the number of choices (the number of possible messages) symbols can carry. For example, an ASCII string of seven bits has 2^7 or 128 message choices.

Information entropy measures the uncertainty that noise imposed upon the original signal. Once you know how much information you started with, entropy tells you how much you've lost and can't get back. Information entropy is not reversible, because once a bit has been lost and becomes a question mark, it's impossible to get it back. Worse yet, the decoder doesn't report the question mark! It assigns it a 1 or 0. Half the time it will be right. But half the time it will be wrong, and you can't know which bits are the originals and which bits are only guesses. The question marks in the received signal are bits that have become lost in noise.

Noise equals uncertainty. When there's no noise and you receive a 1, you are 100 percent sure the transmitter sent a 1. The more noise there is, the less certain you are what the original signal was. If there's lots of noise, you're only 50 percent sure the transmitter sent a 1, because your decoder gets it wrong half the time. Nobody knows what was originally said.

Because signals (encoded messages) and noise are polar opposites, coded information can never come from noise. A broken machine makes a horrible squeal. But there's no encoder, so the squeal is not code. (It's not digital either. It's an analog sound wave.) An intelligent agent has to encode that squeal into digital symbols and interpret their meaning before it can be considered information.

Any randomness-based theory of evolution violates the laws of information entropy. Music doesn't get better when you scratch CDs. Organisms do not gain new features when their DNA mutates through damage or copying errors. Instead they get cystic fibrosis or some other birth defect, like legs instead of antennae growing out of a fruit fly's head. Natural selection can clear competition by killing off inferior rivals. But it can't work backward from a random mutation and undo the damage.

For many decades, the Neo-Darwinian Modern Synthesis has claimed that adding noise to a signal can occasionally improve its content. Beneficial random mutations, together with natural selection, were allegedly the key to everything. If this were actually the case, I would have to agree that Mother Nature would possess a truly amazing built-in tool of continuous improvement. How intriguing it was, then, to confirm that in computer science, networking, and telecommunications, the concept of adding noise to a signal to improve its contents simply does not exist at all, neither in theory nor in practice. Claude Shannon's work showed the exact opposite.
@bkuls 4 years ago
I have a master's in wireless and signal processing. And I'll tell you, you have better knowledge than most of my peers and so-called "engineers". Kudos to your channel!
@wrathofsocrus 6 years ago
This is really important for the hearing impaired as well. Having as much information as possible gives a much higher probability of someone with poor hearing understanding you. Saying words without context, in strange sequences, and without complete sentences will greatly reduce the chances that it will be understood correctly.
@amaurydecizancourt4741 2 years ago
Probably the best and clearest presentation of a subject that has been close to my heart and neurons for the last twenty years. Bravo and thank you.
@KnowingBetter 6 years ago
This was epic. You've really upped your game after that film course... I need to sign up for one of those. Glad you're back!
@upandatom 6 years ago
thank you! and yeah you should it was amazing. but i actually filmed that before the course lol
@johnshumate8112 4 years ago
Holy crap, Knowing Better's here. I love your videos KB, keep up the great work
@matt-g-recovers 3 years ago
@@upandatom You have great creative talent.
@fuckyoutubengoogle2 3 years ago
I was just saying "I think the channels [Up and Atom and Knowing Better] thoughtlessly praise each other for more clicks. I happened to have a real distaste for Knowing Better because I found out how deceptive his vids on Christopher Columbus are after watching really good rebuttals by the channel Bad Empanada. I commented about this under the Knowing Better vid but my comments were selectively deleted. They left some of my comments up but just a few out of context to make me look bad. I didn't care much about how I appear but the links and mentions of the rebuttal were removed and this is such an important topic having deadly serious real consequences."
@edmundkemper1625 2 years ago
An underrated comment!
@Giganfan2k1 3 years ago
Had a stroke that affected my language center. Took me a tick or two. So happy I could read it. Thanks!!! Sorry, have to do this P.S. As a person on the autism spectrum, I am really going to have to digest the last 1/4-1/3 of that for a while. My lexicon/big-word-making/vocabulary has sometimes alienated the people around me. I am almost paralyzed daily trying to express myself concisely. So instead of using five sentences on everything going on I fall back on: "I amble down the street." Or "I saunter down the street." Everyone asks, "Why can't you just walk down the street?" I say, "Because I don't walk correctly, because of joint problems. So I have to take breaks. When I do, I am taking mental inventory, or trying to be in the moment. So I could look disheveled to the casual onlooker. *I didn't want to say all that.* So I ambled down the street. Whereas a saunter might be something you do walking around a music festival."
@ecsodikas 4 years ago
I love how Mark looks exactly like a guy who knows a whole lot about words. ❤️
@noahmccann4438 6 years ago
Mark's walk through the history of writing was very enlightening. At first it seems nonsensical that early writers would omit things like spaces and punctuation, but framed in the context of a culture with a small number of writers it makes more sense. If you only write for, or consume from, a small group of people, you can afford very unique writing rules. We even see this today: when writing a note for yourself you may write illegibly or omit information because you expect you'll remember part of it anyway. As a software developer I've seen something very similar at play in my coding. Personal projects I work on won't follow standards as closely, and I'm more likely to come up with my own patterns loosely based on common ones. But at work, we try to stick to standards at multiple levels: standards in the team, in the project, and in the industry. That said, in both the writing and coding examples above, there are other things at play. I understand that early writing was very much driven by the medium being used, and I'm sure there were time constraints that encouraged conciseness over exhaustiveness (if you have to manually copy all texts, you probably don't want to do more work than needed, assuming the recipient will understand anyway).
@IdeaStudioBKK 6 years ago
Fantastic video. I am a massive fan of Shannon's work; it was really hammered into me in my undergrad days.
@upandatom 6 years ago
thank you!
@schmetterling4477 3 years ago
That is one of the best explanations I have ever seen. Excellent.
@Phrenotopia 6 years ago
NXW XLL THX CXMMXNTS SHXXLD BX XN THXS FXRMXT!!!
@upandatom 6 years ago
:)
@derlinclaire1778 6 years ago
Now all the comments should be in this format!
@NetAndyCz 6 years ago
I only wonder whether y is vowel or consonant...
@jerrygundecker743 5 years ago
HXRR-DX-HXRR-HXRR!!
@amsterdamdaydreams2420 5 years ago
@@jerrygundecker743 XKXY MXTX CHXLL
@Dixavd 6 years ago
I love the cat's turn at 7:32 and then the fact it's missing at 7:41
@upandatom 6 years ago
haha he's quick!
@fuzzmeister 1 year ago
Your work is simply brilliant. So helpful, put together with passion, and carefully curated for the education of your audience. Abundance - people like you are working hard to create it! 😍 - thank you!
@Jamie-st6of 4 years ago
you're absolutely correct, but i think it's worth mentioning that the more confusing redundancies in English are mostly inherited from French (and probably some form of Proto-Germanic, but i don't know much at all about the Germanic languages). the silent 'e' on the end of words actually comes from the French spelling system (a silent 'e' was added to make the final consonant pronounced rather than silent, and sometimes used to indicate grammatical gender). 'qu' clarifies the pronunciation of 'c' before certain vowels ('i' and 'e'), similar to the silent 'u' in 'guerilla' and the French 'ç' (cedilla is sort of the opposite of 'qu'). some digraphs also come from French, such as the 'au' in 'aura'. unlike everything else, most of English's vowel pronunciation weirdness is an indigenous phenomenon mostly developed by the English themselves. (if you're curious, look up 'The Great Vowel Shift') side note: it's somewhat odd to use the past tense to refer to scripts without upper/lower case and without vowels, given that multiple extant scripts have those properties (Hebrew, Arabic, Syriac, Aramaic though only a handful of villages still use Aramaic script, and arguably Chinese but it's so different it may as well not count).
@KuraSourTakanHour 5 years ago
Now I want to know the entropy of Chinese and other languages 😅
@testme2026 5 years ago
This is by far the best explanation ever, yet when you search for the topic it comes up well down the list.
@NMIC374 3 years ago
I love your videos!! My new favorite math, physics, scientific philosophy (etc.) YouTuber!!!!!
@xacharon 6 years ago
The coin flip demo in this video (and the use of "funky" and "fair" coins) reminded me of the "Quantum coin toss" video featuring you and PhysicsGirl. :)
@doodelay 5 years ago
I love your channel and just found you today! You're one of the best math and physics channels on YouTube because you cover a HUGE variety of topics and give yourself enough time to do so WHILE adding animations and music! This channel's an absolute gold mine :D
@zrodger2296 3 years ago
Just watched a documentary on Claude Shannon. Wanted to dig deeper. Best explanation of this I've seen yet!
@Ghost572 2 years ago
I think this is the first time I've binge-watched a YouTube channel; all the titles appeal to me. I'm surprised this didn't come into my recommended earlier.
@1495978707 5 years ago
4:25 4.7 is the base 2 logarithm of 26. Shannon entropy uses the base 2 logarithm to estimate the number of questions you have to ask because the way she divided the alphabet to ask questions is the most efficient way to get to an answer. Always divide up the possibilities into halves, and then the log tells you how many times you need to halve to get to knowing the answer.
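The halving scheme this comment describes can be sketched in a few lines of Python (an illustrative sketch, not code from the video):

```python
import math

def count_questions(target, options):
    # Count the "is it in the first half?" yes/no questions a binary
    # search needs to single out `target` among equally likely options.
    lo, hi = 0, len(options)
    idx = options.index(target)
    questions = 0
    while hi - lo > 1:
        mid = (lo + hi) // 2
        questions += 1
        if idx < mid:   # answer "yes": it's in the first half
            hi = mid
        else:           # answer "no": it's in the second half
            lo = mid
    return questions

letters = [chr(c) for c in range(ord('a'), ord('z') + 1)]
avg = sum(count_questions(t, letters) for t in letters) / len(letters)
print(round(avg, 2), round(math.log2(26), 2))  # average ≈ 4.77, log2(26) ≈ 4.7
```

The worst case is ceil(log2(26)) = 5 questions; the average sits just above log2(26) because 26 isn't a power of two.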
@ARTiculations 6 years ago
This video totally blew my mind and I am loving this collab so much ❤️
@gypsyjr1371 4 years ago
Thanks for your cool and educational videos! This is one I can possibly even understand fully. Long ago, when MOSFETs had first come on the scene, I graduated from West Point with enough mathematics for a BS, enough electrical engineering credits for a BS in that, and enough computer use (I was a part-time mainframe computer tech too) and credits for a CS degree, if only it had existed then. But being the Army, I got a BS in General Science (meaning strategy and tactics) instead. That's all they gave out. So over the years, I worked for the Department of Defense, for contractors to same, for the Navy, and for a black-box contractor to DoD who had a device (in 1980) that could sit in a van and read a computer screen inside a building it was parked next to. No longer classified, and long replaced by better technology. I wrote AI, telecommunications systems, video and voice storage in databases with advanced search abilities, and the first bank computer system which answered your calls and frustrated you to no end (not proud of that but it *was* and *is* a technology marvel). Among other things, I wrote code that allowed an encrypted message to be sent over the fledgling internet to a server, which would then relay it by satellite to the destination specified in the header of the encoded message. So this time, this video, I don't have to concentrate much to understand. :)
@TheJacklwilliams 2 years ago
@Gypsy JR, freaking wow. What amazes me is the number of people over the years who leap into tech chasing the money train. Which of course, we are all somewhat guilty of. However, I learned quite some time ago, having left the business for about 5 years, that the real pull for me is simple intellectual curiosity and a deep love for all things tech. You have had an incredible career. Mine has some significant highlights; however, if I could rewind, I'd have gone down the dev rabbit hole in a similar fashion. I've dabbled. The last chapter for me is dev. I'll be here until the lights go out. So, a question: out of all the experiences you've had, which did you enjoy the most?
@maccarrena 5 years ago
Cool topic and cool video. I recently read a paper about the entropy of the most commonly spoken languages; it stated that English has a lot of redundancy, the Asian languages had much higher entropy (around 3.5-4.5 bits), and I think French was the lowest among those chosen (around 1.5 bits).
@齐阳-b1b 5 years ago
Very helpful tutorial! Finally I understand entropy! Thanks!
@LouisHansell 3 years ago
Jade, I always enjoy your videos. You restore my wave function.
@danielbrockerttravel 8 months ago
I came up with similar ideas years ago on objective uncertainty and belief possibility. Unfortunately I had never heard of information theory. I would have benefitted a lot from seeing this video!
@lowhanlindsey 6 years ago
Your channel is the best thing on the internet!
@upandatom 6 years ago
aww thank you! n_n
@SuviTuuliAllan 6 years ago
I use Cosmic Microwave Background radiation for all my entropy needs!
@rakeshbhaipatel1714 2 years ago
That's a pretty good intuitive explanation of entropy. It would be great if you could make a video on mutual information.
@will4not 6 years ago
Oh my goodness: I came for an interesting video on understanding language. I never thought it would involve entropy. That’s so cool.
@upandatom 6 years ago
haha thanks! Glad you learnt something new :)
@jupahe6448 6 years ago
Even as a non-native speaker I find this very interesting; so glad you're back 😊 Greetings from Germany 🇩🇪
@upandatom 6 years ago
thank you so glad to be back! :)))
@otakuribo 6 years ago
*Hypothesis:* dubstep contains more entropy on average than other forms of electronic music, but oddly not as much as rap which has more redundant backbeats but lots and lots of lyrical information
@upandatom 6 years ago
this is an interesting theory
@acommenter 6 years ago
U always comes after a Q!? I'll go tell my Iraqi friends about that one!
@pierreabbat6157 6 years ago
Donnez-leur cinq coqs qataris. ("Give them five Qatari roosters.")
@alexv3357 6 years ago
For a moment, that joke went over my head and I thought you just misspelled Iroquois
@ptousig 5 years ago
Qat... a very useful scrabble word.
@srinivastatachar4951 5 years ago
Qantas doesn't think so...
@Salsuero 4 years ago
All that Qatari money is betting against this truth.
@deadman746 1 year ago
There are a couple of things I find particularly interesting. One is that, in general, it is not possible to fix the entropy of an utterance by itself. It is only possible if you know all the information used in encoding and decoding, which you don't. This is the same with thermodynamic entropy. Also, the psycholinguists noticed that diachronic changes in language tend to follow a surprisingly precise balance between redundancy and efficiency. As soon as a language gets 3% off optimal, it will organically adjust to correct, given a large enough community of speakers.
@celsorosajunior 6 years ago
Really cool! Any additional video about the importance of this subject for encoding digital information? It would be great!
@upandatom 6 years ago
a lot of people have been asking me about this so I'll keep it on my radar :)
@zincwing4475 4 years ago
Shannon, my hero as a teenager.
@JTheoryScience 5 years ago
'Information' was NOT spelled incorrectly, because I understood what you wrote, and the point of writing is to convey information to someone so they comprehend it. 100% totally perfect, no mistake here in my opinion. Bloody delightful to get another Australian/New Zealander onto YouTube science-ing.
@archeronaute5041 5 years ago
The physical entropy is also a measure of uncertainty. The entropy of the macroscopic state of a system can be defined by the Boltzmann equation S = k·ln(W), where W is the number of microscopic states possible for this macrostate. So the bigger the entropy, the more possible microstates there are, which means the less you know about a system when you only know its macrostate.
@burningsilicon149 2 years ago
I like to think of entropy as the total number of possible states, and information as something that reduces the number of possible states. Like if you flip 2 coins, before looking you have 4 possible states: TT, TH, HT, HH. If someone tells you the second one is a head, this reduces the possible states by a factor of 2: TH, HH. That amount of reduction is the information in the statement "the second coin is heads".
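That state-counting picture can be checked directly in Python (a quick illustration of the comment's example, not code from the video):

```python
import math
from itertools import product

# All equally likely outcomes of flipping two coins
states = set(product("HT", repeat=2))        # {TT, TH, HT, HH}

# Learning "the second coin is heads" keeps only the consistent states
after = {s for s in states if s[1] == "H"}   # {TH, HH}

# Information gained = log2 of the reduction factor in state count
info_bits = math.log2(len(states) / len(after))
print(info_bits)  # 1.0 bit
```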
@travisterry2200 5 years ago
This channel is way more my level than MindYourDecisions or even Numberphile. 👍
@awuuwa 2 years ago
the music at the end is brilliant
@thesentientneuron6550 6 years ago
3:56 I like that song. Gives me happy thoughts.
@MMarcuzzo 4 years ago
Have been watching your videos and really liking them. And surprisingly, I see something of my favourite subject: information theory!
@joshsatterwhite1571 6 years ago
Jesus, Jade, 25K subs already? You're blowing up, and you certainly deserve it.
@upandatom 6 years ago
thank you! but it's all thanks to my collaboration with physics girl
@PhingChov 6 years ago
Physics Girl brought us here, but you're the reason we stay and subscribe. Jade, keep up the quality work and we'll be back for more!
@PetraKann 5 years ago
How many followers would Jade get if she decided to only dub her voice or exact audio over the top of the video footage?
@danielchettiar5670 3 years ago
@@upandatom 2 years and 300k+ subs later.... You deserve it!
@hyperduality2838 4 years ago
Syntropy (prediction) is dual to increasing entropy -- the 4th law of thermodynamics! Repetition (redundancy) is dual to variation. Certainty is dual to uncertainty -- the Heisenberg certainty/uncertainty principle. Randomness (entropy) is dual to order (predictability). "Always two there are" -- Yoda.
@varshneydevansh 1 year ago
Beautifully explained
@petrskupa6292 4 years ago
You are great! Funky, clever, informative...
@leesweets4110 3 years ago
That's some good dubstep. Surprising to hear it in one of your videos.
@yuxiaoluo9414 5 years ago
Very good presentation; I really like the sample of a noisy environment.
@jasperh6618 6 years ago
There's an interesting link to be made between redundancies in written communication and the (efficiency of) transmission of bits in computers. In computers, sending a message once is cheap and fault sensitive while sending a message twice is expensive, so you want to encode the message in such a way you get some benefits of both worlds. I wonder how much redundant information in written communication can be left out until only the equivalent of "send a message once" remains
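One classic middle ground between "send once" and "send twice" is a single parity check bit. The sketch below is my own illustration of that idea (not a scheme from the comment or the video): it detects any single flipped bit at the cost of one extra bit instead of a full repeat.

```python
def add_parity(bits):
    # Append one check bit so the total number of 1s is even:
    # far cheaper than sending the message twice, yet any single
    # flipped bit is detected (though not corrected).
    return bits + [sum(bits) % 2]

def check_parity(bits):
    # A received block is consistent if its 1s count is still even.
    return sum(bits) % 2 == 0

msg = [1, 0, 1, 1]
sent = add_parity(msg)          # [1, 0, 1, 1, 1]
print(check_parity(sent))       # True: looks clean

corrupted = sent.copy()
corrupted[2] ^= 1               # noise flips one bit in transit
print(check_parity(corrupted))  # False: error detected
```

Correcting (not just detecting) errors takes more redundancy, e.g. Hamming codes, which is exactly the trade-off the comment is pointing at.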
@haitham3afef103 4 years ago
This video is rich in information. It should have a high entropy!
@laurenr8977 4 years ago
Thank you for this! Was the most accessible video on this that I've found.
@Lucky10279 4 years ago
Thermodynamic entropy is _not_ about disorder. It's about how many potential states a system can exist in. The more states, the more entropy. The second law of thermodynamics says that the entropy of the universe will always increase or stay the same; that's just saying that the total number of microstates (essentially, a microstate is the exact configuration of all the particles in a system) of the universe is increasing. Looking at it that way, it's a lot clearer how it's related to the kind of entropy she's talking about here.
@jamesmcpherson1590 1 month ago
I happen to recall from my UK travels that the helmet/mask behind Mark is the Sutton Hoo helmet at the British Museum.
@tinaburns1376 4 years ago
I LOVE THIS VIDEO. It made it so easy to understand entropy. It is also funny.
@glaucosaraiva363 6 years ago
You are headed in the right direction for huge success! Your videos are getting better and better... Congratulations from Brazil!
@mobile2 4 years ago
If I could have watched videos like yours when I was a secondary school student, I would have found physics more interesting. The channel capacity C = B*log2(1+S/N) given by the Shannon-Hartley theorem is famous in telecommunications. I am a cellular radio network engineer. I was afraid to study semiconductors and electromagnetism when I studied electronic engineering. The maths are very difficult (e.g. Maxwell's equations).
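The Shannon-Hartley formula this comment quotes is easy to evaluate; the channel numbers below are hypothetical, chosen only to show the shape of the law:

```python
import math

def channel_capacity(bandwidth_hz, snr_linear):
    # Shannon-Hartley theorem: C = B * log2(1 + S/N), in bits per second.
    # snr_linear is the signal-to-noise power ratio (not in dB).
    return bandwidth_hz * math.log2(1 + snr_linear)

# A hypothetical 1 MHz channel at 30 dB SNR (S/N = 1000)
c = channel_capacity(1e6, 1000)
print(round(c / 1e6, 2), "Mbit/s")  # ≈ 9.97 Mbit/s
```

Doubling the bandwidth doubles capacity, while doubling the SNR only adds about one bit per second per hertz, which is why bandwidth is so precious in cellular networks.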
@travcollier 5 years ago
The higher the entropy, the higher the *possible* information in the system. In fact, entropy = maximum possible information a system can encode. A high entropy sequence could be encoding information, or it could be random noise. BTW: Information is how much knowing the state of one thing reduces your uncertainty about another thing. Information is always "with respect to" something else. So another option is that you have a high entropy thing which encodes a lot of information about something you don't care about (or maybe don't even know exists).
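The "with respect to" idea in this comment is exactly mutual information; here is a minimal sketch in Python (my own illustration, not from the video):

```python
import math

def mutual_information(joint):
    # I(X;Y) = sum over (x, y) of p(x,y) * log2( p(x,y) / (p(x) * p(y)) ):
    # how much knowing Y reduces your uncertainty about X (and vice versa).
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0) + p
        py[y] = py.get(y, 0) + p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

# Y is an exact copy of a fair bit X: knowing Y removes the whole 1 bit of doubt
copy = {(0, 0): 0.5, (1, 1): 0.5}
# Y is an independent fair bit: high entropy, but it tells you nothing about X
indep = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}

print(mutual_information(copy))   # 1.0
print(mutual_information(indep))  # 0.0
```

The second case is the comment's point: a high-entropy source can carry zero information *about the thing you care about*.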
@AnandKumar-ql1sv 5 years ago
I don't comment on videos, but yours are worth commenting on... Loved the way you explained it...
@dannybaseball2444 2 years ago
Great teaching as always and I love the way this video anticipates the satisfaction of solving wordle.
@replynotificationsdisabled 5 years ago
I love her hand gestures.
@nosson77 8 months ago
Unlike many videos on entropy that are not strictly technical and like to use real-world situations, this video seems quite accurate. But I question the explanation for the messages to friends having higher entropy. Higher entropy means having more information per bit and therefore less chance of a misunderstanding, that is true. But the reason for misunderstandings in today's text messages is not just stripped-out redundancy; many times it's ambiguity. And that is actually a loss of information, so it lowers entropy.
@nolanjshettle 6 years ago
Jif is peanut butter. Graphics has a hard g. Idgaf what the original creator wanted to troll the internet with
@Russet_Mantle 11 months ago
4:56 that aggressive screen shake got me rolling
@IIIIIawesIIIII 6 years ago
She is really asking the right questions and gets the answers on point; I'm always impressed. Yet I have found that it's much easier for me to concentrate on a subject and actually remember stuff if the narrator is an ugly average guy.
@Hecatonicosachoron 6 years ago
It's a fun topic... entropy is the average information content of a macrostate.
@PixelPi 2 years ago
Jade, I couldn't figure it out. I'm autistic, so I suppose this confirms that Ænglish is my first language.
@cakeisamadeupdrug6134 6 years ago
7:32 It's 1am and I think everyone just woke up...
@upandatom 6 years ago
lol good
@SeanVoltaire 4 years ago
This is REALLY well done and does a GREAT job of distilling complex concepts into simple steps for laypeople :)
@Dragrath1 5 years ago
Rather than uncertainty, with the physics-based definition it is probably better to describe entropy as the number of states that can produce the same result, as that lets you more easily bridge the gap between information theory and statistical mechanics.
@jaydeepvipradas8606 5 years ago
English is very elaborate, and that's very important. Every minute expression in the brain can be represented correctly, e.g. "I feel sorry" vs. "I feel sad". The disadvantage of this is that compression is lost in many words. E.g. the word "condolences" holds a lot of compressed information about the situation, but many other words, being elaborate, lack the compression found in other languages. So the amount of information is much more than just the letters in a word. Along with redundancy, there is sometimes ambiguity; words like "test" and "taste" sound very similar. One important aspect of any word is what it triggers in the brain, e.g. the word "far" sounds really far away from us, like "faaar", and the word "near", said fast, really feels close to us. However, many other words lack this property: the word "up" triggers the correct emotion in the brain, but the word "height" does not. I think the best language would have the best of all three worlds: elaborate, triggering the correct emotion, and with compression that can convey a lot of the background situation.
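The redundancy-versus-compression trade-off the comment describes is easy to demonstrate: redundant, English-like text compresses dramatically, while high-entropy random bytes barely compress at all (a sketch using zlib; the repeated phrase is an arbitrary example).

```python
import os
import zlib

# Highly redundant, English-like input: the same phrase repeated 40 times.
redundant = b"the cat sat on the mat " * 40
# Incompressible input of the same length: random bytes.
random_bytes = os.urandom(len(redundant))

compressed_text = zlib.compress(redundant)
compressed_noise = zlib.compress(random_bytes)

print(len(redundant), len(compressed_text))      # large reduction
print(len(random_bytes), len(compressed_noise))  # essentially no reduction
```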
@solapowsj25 4 years ago
Get the last word that's 'missing'. Just like in free space. Light force at 'c' displaces the medium creating a vacuum during rectilinear propagation at 'c'. The Dalton atom shell does focus the energy toward its centre to a point where even force doesn't pass. The Planck length. Perhaps, regions with 'missing' matter are bound by imagined kinetic forces.. Magnetron? Neutron?
@shiblyahmed3720 4 years ago
Hey, I like your style and the way you talk!! Didn't I see someone just like you who talks about math & physics on YouTube?
@JasonSpano 6 years ago
Glad you're back! This was an awesome video. Thanks for posting! :)
@blijebij 3 years ago
Very interesting and explained with care and passion! Love it!
@Fcc1309 4 years ago
Hi, I just want to say that the physics definition of entropy isn't the chaos or disorder of a system; it is related to how many microstates have the same macro properties (like temperature and volume). Using this point of view of entropy, we have almost the same thing as in computer science. Cheers from Chile
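The two points of view really do line up: Boltzmann's S = k_B ln Ω and the information entropy log2 Ω differ only by a constant factor, so one bit corresponds to k_B ln 2 of thermodynamic entropy. A sketch (the system with Ω = 1024 microstates is invented for illustration):

```python
from math import log, log2

k_B = 1.380649e-23  # Boltzmann constant in J/K (exact in the 2019 SI)

omega = 1024  # toy system: 1024 equally likely microstates

H = log2(omega)       # information entropy, in bits
S = k_B * log(omega)  # thermodynamic entropy S = k_B * ln(omega), in J/K

# The ratio recovers the bit count: one bit = k_B * ln(2) joules per kelvin.
print(H, S / (k_B * log(2)))
```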
@informationtheoryvideodata2126 2 years ago
It is important to clarify that the number of questions we have to ask defines the entropy; it does not define the information in a general way. This distinction is fundamental, especially in relation to new developments in information theory defined by the Set Shaping Theory (SST).
@GenghisVern 5 years ago
So language is to be reduced to emoticons? That's diabolololical English is more resilient to entropy? love it!
@DMSG1981 6 years ago
I struggled really hard to understand entropy (physics) until I learned about information entropy. But then again, I'm a computer scientist. That might help....
@Ureallydontknow 4 years ago
Before we can say that the average entropy of English is 2.62 bits per character, some conditions must be met. For example, a message of length 1 does not have an average of 2.62 bits: for a 26-letter alphabet, the entropy at length 1 is between 4 and 5 bits. A 32-letter alphabet gives exactly 5 bits for a message of length one, if "q" without "u" is possible and white space is not in the alphabet.
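The length-1 numbers in this comment check out (a quick sketch; the 2.62 bits/character figure is the video's average for contextual English, not derived here):

```python
from math import log2

# One character drawn uniformly from a 26-letter alphabet:
print(log2(26))  # between 4 and 5 bits (about 4.70)

# A 32-symbol alphabet gives exactly 5 bits for a single character:
print(log2(32))  # 5.0

# The ~2.62 bits/character average only emerges over longer English text,
# where context makes later characters predictable.
```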
@HYPERC666 6 years ago
Great channel. It's in my daily dosage along with Sixty Symbols. Keep up the awesome work.
@juliocardenas4485 4 years ago
This is absolutely fabulous!!
@izzyhaze7347 4 years ago
Exactly what I needed to make my day
@yaminohotoke 5 years ago
Cool video! Lots of information... wonder what's the entropy of a YouTube video?!
@baharehshahpar8674 2 years ago
Amazing explanation thank you 🙏🏼🙏🏼
@klaasterpstra6119 3 years ago
Good explanation of a difficult concept!
@rodrigotravitzki7535 2 years ago
totally wonderful! just thinking about... would this "before and after the flip" be related to a Bayesian approach?
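One way to make the "before and after" concrete (a sketch; the Bayesian connection is that an observation updates the probabilities, and entropy measures the uncertainty that remains): the information gained from a fair coin flip equals the drop in entropy from before the flip to after it.

```python
from math import log2

def entropy(probs):
    """Shannon entropy, in bits, of a discrete distribution."""
    return -sum(p * log2(p) for p in probs if p > 0)

# Before the flip: two equally likely outcomes, 1 bit of uncertainty.
before = entropy([0.5, 0.5])

# After observing the flip: the outcome is certain, 0 bits remain.
after = entropy([1.0])

# Information gained by the observation = drop in entropy.
print(before - after)  # 1.0
```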
@doylesaylor 4 years ago
I think overall these are excellent posts on a lot of topics I like to hear you discuss. My "question" is: what is the physical representation of the word "question" itself? The theory is posed in terms of text and the logic of written information. There are strange physical concepts in such efforts, like characterizing gaps in steps as instantiations. Disentangling notations and/or scripts, we might find interesting approaches by finding the physical meaning of "question".
@originalveghead 4 years ago
Really well explained. Thanks.
@gamewarmaster 6 years ago
Yaay someone played Banjo-Tooie too! 0:53
@MikeD-tf7dk 3 years ago
Your stuff is amazing and your adorableness makes it less intimidating and more accessible. ;) (Hey, if it works for Leonard in Big Bang Theory why not? Except you’re real!) It’s great you’re out there helping reshape the still prevalent bias towards men in your field.
@lukebradley3193 5 years ago
There are other ways to compute the information for a 5-letter word, since information is tied to probability. If all 158,390 5-letter words are equally likely, the information in one is -log2(p) = -log2(1/158390) ≈ 17 bits. With more information, like about which words she is likely to choose, some words get less probable (more info) and some more probable (less info). If we know with high probability she'll choose one of ten hip, fun 5-letter words, the word could even carry less info than the coin flips.
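The numbers in this comment check out; a quick Python sketch (the word count 158,390 and the 2.62 bits/character average are taken from the comment and the video, not verified independently):

```python
from math import log2

# All 5-letter words assumed equally likely:
num_words = 158390
info_uniform = log2(num_words)  # = -log2(1 / num_words)
print(info_uniform)             # about 17.3 bits

# Versus the ~2.62 bits/character average for contextual English:
print(2.62 * 5)                 # about 13.1 bits for five characters
```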
@AvenMcM 6 years ago
Hurray! Great to see this, welcome back!
@empire-classfirenationbatt2691 6 years ago
I knew about this concept but I watched the video anyways because I knew I'd learn something new I didn't know before, it's been a while since I've seen one of your videos... and you're cute😂😂😍 Keep up the good work😂👍😝