Claude Shannon's Information Entropy (Physical Analogy)

140,156 views

Art of the Problem

10 years ago

Entropy is a measure of the uncertainty in a random variable (a message source). Claude Shannon defines the "bit" as the unit of entropy (the uncertainty of a fair coin flip). In this video, information entropy is introduced intuitively using bounce machines and yes/no questions.
Note: this analogy also applies to higher-order approximations: we simply create a machine for each state and average over all machines!
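For readers who want to check the numbers, here is a minimal sketch (not from the video itself; the machine-2 probabilities A 50%, B 12.5%, C 12.5%, D 25% are the ones shown on screen and quoted in the comments below) computing the entropy of both machines:

```python
import math

# Minimal sketch (assumed probabilities from the video): Shannon entropy
# H = sum over symbols of p * log2(1/p), i.e. the average number of
# yes/no questions needed per symbol.
machine_1 = {"A": 0.25, "B": 0.25, "C": 0.25, "D": 0.25}
machine_2 = {"A": 0.5, "B": 0.125, "C": 0.125, "D": 0.25}

def entropy(dist):
    """Average yes/no questions (bits) per symbol from the source."""
    return sum(p * math.log2(1 / p) for p in dist.values() if p > 0)

print(entropy(machine_1))  # 2.0 bits per symbol
print(entropy(machine_2))  # 1.75 bits per symbol
```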

Comments: 141
@ArtOfTheProblem 3 years ago
Link to series playlist: kzbin.info/aero/PLbg3ZX2pWlgKDVFNwn9B63UhYJVIerzHL
@iliyasone 5 months ago
Thank god I found this video again. This is still the best explanation of Shannon's formula on KZbin.
@ArtOfTheProblem 5 months ago
nobody finds this anymore, glad you did!
@realfuzzhead 10 years ago
To whoever runs ArtOfTheProblem, I want to personally thank you for all of the time you have spent making these videos. I came across one of them on a math forum a few months ago and I proceeded to consume every video you have made. Your videos led me to absolutely fall in love with information theory and persuaded me to pursue a major in computer science. Claude Shannon has replaced Richard Feynman as my favorite scientist. I can't thank you enough; these videos literally changed the path I'm taking through life.
@johnallard2429 7 years ago
He's the same guy; Khan Academy liked his videos and brought them onto the site.
@ArtOfTheProblem 5 years ago
stay tuned for more!
@ArtOfTheProblem 3 years ago
hey! just checking in, how did your major end up, where did it lead you?
@johnallard2429 3 years ago
@@ArtOfTheProblem Hey! Thanks so much for checking in. I ended up graduating w/ a BS in Computer Science from UC Santa Cruz and I'm currently four years into my career as a backend software engineer in Silicon Valley. I just left the first startup I joined out of college and joined a new one and I'm currently working on building a platform for training data management for enterprise ML applications. I really wanted to thank you again for making these videos, they altered the course of my studies and my career and, hence, my life. I still, to this day, read about information theory in my own time and still hold Claude Shannon up to be one of my favorite scientists. (I just noticed I'm posting from a new email address but it's me!)
@ArtOfTheProblem 3 years ago
@@johnallard2429 wow congrats. This note made my morning and is the reason I started this channel. You're probably way ahead of me now, i'm struggling to simplify my understanding of transformers at the moment for the next video :)
@mycityofsky 7 years ago
This is the most intuitive video I've ever seen about Shannon entropy, thank you!
@dangerlibya2010 7 years ago
I'm studying computer science and I wish all my professors explained things just like you!! That would've saved a lot of time!
@5ystemError 10 years ago
Thanks for uploading a new video!!! I just started watching these a few days ago and was bummed out when I saw you hadn't uploaded one in a little while. These are awesome.
@soumyadeeproy6611 4 years ago
Extremely intuitive explanation, so well done that I was left wondering why the heck I couldn't understand it before you explained it!
@ArtOfTheProblem 4 years ago
I'm thrilled to hear this video helped you. It was an original explanation of this concept that I don't think many are aware of.
@GuruKal 7 years ago
Definitely one of my favorite explanations of all time
@diego898 10 years ago
Thank you for continuing! These are fantastic!
@pasansamarakkody4053 a year ago
Simply great explanation. Thanks!
@319quang 8 years ago
These videos are extremely informative and very well done! Thank you for all your hard work! :)
@ArtOfTheProblem 8 years ago
+quang nguyen Thanks for the feedback! Our new series has begun.
@kc122 10 years ago
It's great to see a new video after a while :) thank you!
@ako969 6 years ago
This Shannon guy was way ahead of his time: the father of data compression at a time when there was no Internet, no computers, not even binary files.
@ako969 6 years ago
OK, not just data compression: all digital encoding. This guy rivals Alan Turing, if not surpasses him. (Essentially the first real computer guy in terms of 'digitizing'.)
@baseballsbetter 10 years ago
Great video! Glad to see you are back
@ncaralicea 7 years ago
Very nice intuitive explanation. Thank you!
@bartziengs9602 7 years ago
Very nice video and clear explanation!
@pezaventura 9 years ago
These videos are amazing!
@SirusDas 6 years ago
This is the best explanation on the INTERNET! Amazing, awesome, and thank you!
@emmamovsesyan a year ago
I consider Khan Academy one of the best things we have in this internet era.
@KenCubed 10 years ago
I am greatly enjoying these!
@JankoKandic 10 years ago
Good to have you back :)
@RimstarOrg 10 years ago
Very clear explanation. Thanks.
@spandanhetfield 6 years ago
To anyone wondering about this: why ask "AB vs. CD" first in the first case, but "A vs. BCD" first in the second? The reason is that this ensures both answers occur with 50% probability. If at every step you ask a question whose two answers are equally likely, the AVERAGE number of questions you'll have to ask (in other words, the average depth of that tree of questions and answers) is minimized. Why? You'll need a little basic computer science experience to think this one through, maybe a good reason to take a course on data structures! More interestingly, this is also the reason binary search is more efficient than splitting a sorted list into 3 parts. It's a simple proof; I encourage you to think it through :)
@PedroTricking 5 years ago
> If at every step you ask a question whose two answers are equally likely, the AVERAGE number of questions you'll have to ask (in other words, the average depth of that tree of questions and answers) is minimized. Why?
Why???? Proof please.
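Not a proof, but a quick numeric check of the claim (a sketch; the machine-2 probabilities and both question trees are the ones described in the video and the comment above):

```python
# Two question strategies for machine 2 (A: 0.5, B: 0.125, C: 0.125, D: 0.25).
# Each dict maps a symbol to how many yes/no questions identify it.
p = {"A": 0.5, "B": 0.125, "C": 0.125, "D": 0.25}

balanced_first = {"A": 2, "B": 2, "C": 2, "D": 2}  # ask "AB vs CD" first
skewed_first   = {"A": 1, "D": 2, "B": 3, "C": 3}  # ask "is it A?" first

def avg_questions(depth):
    # Expected number of questions = sum of p(symbol) * depth(symbol)
    return sum(p[s] * depth[s] for s in p)

print(avg_questions(balanced_first))  # 2.0
print(avg_questions(skewed_first))    # 1.75, matching the entropy
```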
@77Fortran 8 years ago
Incredible video, thank you! :)
@midevil656 10 years ago
Could never understand the information entropy unit in class, but you nailed it here. Thanks!
@ky-effect2717 3 months ago
great explanation
@niharikarastogi7169 7 years ago
I like the creativity u put in explaining anything. Simply wow.. :)
@ArtOfTheProblem 7 years ago
Thanks for your feedback , much appreciated
@6san6sei6 10 years ago
i simply love your videos.
@steventmhess 10 years ago
Beautiful work.
@chenphillis8612 6 years ago
very clear explanation! thank you
@jinnycello 7 years ago
Very easy to understand. THank you!
@stevodestructo 10 years ago
Love your videos. Thank you.
@tmstani23 5 years ago
Great video and super interesting topic. I think this definition of information is quite counter-intuitive relative to the everyday, common-sense use of the word "information". People often think of information as a source of knowledge or as having some inherent use. The common-sense definition of information is more akin to education and implies some utility, whereas this definition remains indifferent to utility and is closer to complexity, or an implied amount of substance. This makes sense, since entropy and information are increasing in the wild. It is fascinating that information can be shown to be entropic. I wonder if there is a limit to information, or if increasing entropy means the universe is infinite in possibilities? Maybe we can only observe an information state, but information itself is infinite.
@onecanina 10 years ago
Niiiiice!! Thank you, i was waiting for the longest time!
@completelymindfucked 10 years ago
first video since i subscribed. it was awesome
@kamalgurnani924 6 years ago
Very nice explanation! thanks for the help :)
@ThanadejR 3 years ago
Super clear explanation.
@ArtOfTheProblem 3 years ago
love all your feedback :)
@marcinkovalevskij5998 6 years ago
Thank you for all your hard work making these videos. Every video is made putting attention to the smallest detail. And that intro sound...
@ArtOfTheProblem 6 years ago
thanks so much, working on video now on Bitcoin i'm excited about
@sovansam2793 5 years ago
Thanks a lot, man... this clear explanation really got the point across for me.
@taylor8294 10 years ago
Great, great video
@jamesimmo 7 years ago
Very good video, thanks, but I see (from my UoM physics course notes) that the natural logarithm is used instead of the base-2 logarithm: is this choice arbitrary or does it vary to suit a given situation?
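For what it's worth, the choice of base only changes the unit, not the substance: base 2 measures entropy in bits, the natural log measures it in nats, and the two differ by a constant factor via the change-of-base rule:

```latex
\log_2 x = \frac{\ln x}{\ln 2}
\qquad\Longrightarrow\qquad
H_{\text{bits}} = \frac{H_{\text{nats}}}{\ln 2} \approx 1.443 \, H_{\text{nats}}
```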
@DarshanKalola 6 years ago
Great video!
@arefsmiley4320 7 years ago
it was great explanation. Thanks
@cy9987 4 years ago
Wow, such a well made video! You totally deserve more views and subs, my good sir!
@raghavdevgon5124 7 years ago
Thanks for such videos :)
@SashankDara 10 years ago
Just awesome !
@necrosudomi420thecuratorof4 11 months ago
Bro, I watch lots of vids and it's the first time I've seen a new approach that describes it this well. Pretty damn well articulated; thanks for sharing it.
@ArtOfTheProblem 11 months ago
nice glad you found this
@hl2mukkel 10 years ago
HE IS ALIVE! =D WOOHOO
@brunomartel4639 4 years ago
Awesomely explained! What's the song at the end? I love it, it's so optimistic!
@ArtOfTheProblem 4 years ago
Thanks! All the music is original for this series.
@javipdr19 10 years ago
Yay! New video :)
@dongzhiyang6550 7 years ago
Fantastic! Thank you!
@Chr0nalis 7 years ago
The beginning of the video is misleading and implies that the second machine is not random. It is in fact random and the difference between them is that the first generates a uniform distribution whereas the second doesn't.
@ravenimperium1756 7 years ago
Hmm, is it not so that the sampling (from the distribution) is random, but the actual values are not? For example, I can imagine a distribution that's just a single line at a specific value; random sampling from that distribution will then always give you the same value.
@JeffreyBos97 7 years ago
Thanks a lot, this was very useful!
@TableRaw 6 years ago
wow what a video :o thank you
@Qoooba95 7 years ago
It is very well explained indeed, but I still have a certain doubt. I like to think about entropy in terms of random variables (like the Wikipedia definition with a discrete random variable X, where the possible events are the values of X), but I fail to understand the following: how do you determine the entropy of an English word, for example? What is the random variable in that case? I see that for the set of English characters we could determine the entropy according to the formula presented in the video, as X would represent the character that appears as the message, each character having a certain probability of appearing, so we have a probability distribution. But I've seen people determine the entropy of a word by calculating the probabilities of the characters as appearance rates (ABA: 1/3 for B, 2/3 for A) and treating that as the probability distribution... If anyone could address that and shed some light, I would be extremely grateful! Thanks!
EDIT: I just realised that given a sequence of characters that appear one after another independently, the entropy of such a message, considered as a value of a random vector, would be the sum of the entropies of the random variables that make up the vector (is that right?), so it kind of makes more sense to me and seems natural (every additional character provides the same amount of information on average), but I'm still puzzled by the entropy of a given English word... Hope someone can respond. Also, an incredibly interesting topic! And this channel is just great!
@MN-sc9qs 5 years ago
Hi. Maybe this will help. What was presented has each random letter being statistically independent of the following letter. However, what you seem to be describing is the situation where the random letters are not statistically independent, which is true for the English language.
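One way to reconcile the two views (a sketch, not from the video): the "appearance rate" approach treats the word itself as an empirical distribution, so the random variable is a character drawn uniformly at random from the word:

```python
from collections import Counter
import math

def empirical_entropy(word):
    """Entropy of a character drawn uniformly at random from `word`,
    using the word's own appearance rates as the distribution."""
    counts = Counter(word)
    n = len(word)
    return sum((c / n) * math.log2(n / c) for c in counts.values())

print(empirical_entropy("ABA"))  # ~0.918 bits: p(A) = 2/3, p(B) = 1/3
```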
@Username-rm1rl 2 years ago
There's an error/typo at 1:26 where the 2nd question repeats "Is it B?"; it should read "Is it C?" or "Is it D?".
@anthonyvance7 8 years ago
Great video. Where can I find the cool music at the end?
@ArtOfTheProblem 8 years ago
+Anthony Vance Thanks! Cam posts the music here (cameronmichaelmurray.bandcamp.com/album/art-of-the-problem-volume-1-gambling-with-secrets), though that song isn't listed yet.
@anthonyvance7 8 years ago
+Art of the Problem Great, thanks. I love the work of your team. Inspiringly creative.
@ramanaraju7770 7 years ago
Very good
@mouseeatcats 4 years ago
Really helpful thanks
@ArtOfTheProblem 4 years ago
glad people are finding this video
@serkansandkcoglu3048 a year ago
At 3:49, for D, it should have been 0.25X2 instead of 0.25X4, right?
@elizabethjohnson7103 9 years ago
very nice.
@gaboqv 3 years ago
Another way to write Shannon's equation would be with log base 1/2 of p: that tells you how many times you'd have to divide in two to reach a probability equal to p, i.e., how many levels of the question tree we need to accommodate a symbol with that probability fairly.
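Indeed, by the change-of-base rule the two forms agree:

```latex
\log_{1/2} p \;=\; \frac{\log_2 p}{\log_2 \tfrac{1}{2}} \;=\; -\log_2 p \;=\; \log_2 \frac{1}{p}
```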
@Dzinuljak1 5 years ago
great!
@Scottastrong 2 years ago
There is a typo at 3:52: the last term in #bounces should be 0.25*2. With that change, #bounces = 1.75.
@Scottastrong 2 years ago
And great video BTW!
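For reference, the corrected arithmetic (probabilities and bounce counts as shown in the video):

```latex
\#\text{bounces} \;=\; 0.5\,(1) + 0.125\,(3) + 0.125\,(3) + 0.25\,(2) \;=\; 1.75
```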
@nm800 5 years ago
Please, could anybody explain the equivalence of the two formulas at 6:21? THANKS
@notmychairnotmyproblem 3 years ago
This is a year late, but by using negative exponent properties we can express 1/p as p^(-1). Then, by the power rule of logarithms, we can bring the -1 out in front of the expression (hence the negative sign at the front).
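Written out, the two formulas at 6:21 are equal step by step:

```latex
H \;=\; \sum_i p_i \log_2 \frac{1}{p_i}
  \;=\; \sum_i p_i \bigl(-\log_2 p_i\bigr)
  \;=\; -\sum_i p_i \log_2 p_i
```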
@paul1964uk 10 years ago
There is an arithmetical error at 3:47 (should be p_D*2)
@ArtOfTheProblem 10 years ago
shoot. good catch I'll add annotation
@chaithanyaks6553 9 years ago
I am in
@agarcia3391 8 years ago
That's because every question we ask doubles the number of outcomes we can distinguish (since we are asking yes/no questions). Look at the tree of machine 1: it always has 4 outcomes, so we have to ask log2(4) = 2 questions each time. However, in the tree of machine 2 you can stop with fewer questions: if the letter is 'A' you're down to 2 outcomes (the letter 'A' vs. the question "is it 'B'?"), so log2(2) = 1, just 1 question.
@ImmacHn 10 years ago
So entropy is good for encryption but bad for communication, right?
@nathanhart3240 8 years ago
+Immac (RockLeet) Good incite! Entropy is good for data encryption (because more guesses/questions are needed to crack it), but entropy is bad for data compression (because maximally compressed data, by definition, has the highest entropy for its size). Data compression is good for communication. Your statement is correct when referring to compression, but certain aspects of communication like bandwidth efficiency or error-correcting codes can improve with entropy. :-)
@nathanhart3240 8 years ago
+Immac (RockLeet) Sorry for misspelling "insight"
@resistancefighter888 8 years ago
I don't get it, if we need to ask 1.75 questions to guess the next symbol of machine 2, doesn't it mean it produces MORE information?
@YumekuiNeru 8 years ago
+resistancefighter888 I think of it as more information meaning more questions needed to sort through it
@resistancefighter888 8 years ago
Thanks a lot!
@kikones34 8 years ago
You're seeing it the other way around. Imagine that persons A and B each hold a certain amount of information. To get all the information from A, you have to ask them a minimum of 100 questions. To get all the information from B, you have to ask them just a single question. Does A have more or less information than B? It's not a very good analogy, but maybe it can make you understand better :D
@vimukthi.herath 6 years ago
"Which path would the other guard show me had I asked him for the path?" And both indicate the same path (the wrong one), which leaves you with the correct one.
@FernandoGonzalez-qg7em 8 years ago
Why is the number of outcomes given by 1/p? If the probability was .1 then the number of outcomes would be 10, and if p was .9 the number of outcomes would be 10/9? I didn't get that =/ But these videos are amazing, thank you for making such a good video!
@ArtOfTheProblem 8 years ago
+Fernando Gonzalez - use base 2 numbers in your examples to make it clearer. If p = 0.25 then # outcomes = 4 (1 in 4 chance = 25%), if p=0.125 then # outcomes = 8 (1 in 8 chance = 12.5%)...etc.
@johnhobson2106 6 years ago
Using the example in the video, would you be kind enough to explain what the eight outcomes are for C for example? It seems like there are still only 4 possible outcomes, and that to get 8 outcomes after dividing 1 by p, you would have to assume that all the outcomes are equally likely (ie. also p=.125).
@mappingtheshit 5 years ago
John Hobson Here they are, the 8 possibilities: 000, 001, 010, 100, 110, 101, 011, 111. I hope this helped you; otherwise, ask.
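In other words (using the video's notation): an outcome of probability p carries as much surprise as one of 1/p equally likely possibilities, and identifying one of n equally likely possibilities takes log2(n) yes/no questions. For p = 0.125 that is 1/p = 8 possibilities (the eight 3-bit strings above) and log2(8) = 3 questions:

```latex
\#\text{outcomes} = \frac{1}{p},
\qquad
\#\text{questions} = \log_2 \#\text{outcomes} = \log_2 \frac{1}{p}
```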
@mudarsaied6258 4 years ago
Hey, didn't we leave out the probability of C and D in the first machine? The first question was "Is it A or B?", but then you continued only with "yes" as the answer. What if it was "no"?
@bingeltube 5 years ago
Recommendable
@MauricioMartinez0707 6 years ago
You are a god thank you
@khkan a year ago
5:49 In your mind visualize the binary tree. Then you can know what the "#outcomes" variable means.
@profie24 6 years ago
Wonderful!
@profie24 6 years ago
When you ask yourself what entropy is in information theory and get the point instantly. Nice work!
@akshithbellare7568 4 years ago
Why does #outcomes equal 1/p?
@juliangoulette7600 8 years ago
I've heard somewhere that randomness is the most compact way to store information because it lacks pattern
@Variecs 8 years ago
+Julian Goulette that makes no sense
@aggbak1 7 years ago
thats dumb m8
@bobaldo2339 7 years ago
Well, if we define "entropy" as disorder, as in physics, then it would seem the more "entropy" the more "information". So, if physicists are concerned that it might be possible to "lose information" in a black hole (for instance - something they nearly all seem to find abhorrent) they are fearing disorder might become more ordered. If randomness is a sort of ultimate disorder, then it must produce the most "information". Retrieving that information is another subject.
@mappingtheshit 5 years ago
It's not clear how Markov chains are related to this. Why were Markov chains introduced?
@ArtOfTheProblem 5 years ago
check out next videos :)
@mappingtheshit 5 years ago
Art of the Problem thank you, Art. Videos in this series or others?
@ArtOfTheProblem 5 years ago
this series (next 2 videos)
@HSAdestroy 10 years ago
Why don't we ask machine 1's two questions for machine 2 as well?
@ptyamin6976 9 years ago
Because you want your series of questions to reflect the probabilities produced by each machine. Machine 1: 25% A, 25% B, 25% C, 25% D. BUT machine 2: 50% A, 12.5% B, 12.5% C, 25% D.
@amihartz 9 years ago
Because you know more about machine 2. So you would be asking more questions than you have to.
@heidyhazem7854 6 years ago
great (y)
@ShreyAroraev3 4 years ago
5:33 why?
@devincope6450 6 years ago
Wait... what about A and D, or A and C, and so on?
@notmychairnotmyproblem 3 years ago
2 years late, but I think that has to be a typo. I've been puzzled by that part too. Assuming you're talking about 1:25.
@degiatronglang6103 7 years ago
Great video, thank you. This makes me realize how stupid school is: it makes beautiful things seem so dull and complex.
@sitnarf7804 5 years ago
This is indeed art, but the tearing of the page at the end was painful. : )
@kevinmcinerney9552 6 years ago
At 3:50 it should be ... + 0.25 x 2 and not ... + 0.25 x 4.
@stephlrideout 9 years ago
I don't know about you, but I read #bounces as "hashtag bounces" and not "number of bounces". Oh look, youtube even made it a hashtag. Thanks, internet.
@britcruise2629 9 years ago
ahahah #internet
@michahcc 26 days ago
Stay in school!
@XenoContact 8 years ago
How the fuck do you ask 1.75 questions?
@ArtOfTheProblem 8 years ago
+XenoContact In the same way people have 2.5 children. It's just the average (expected) number of questions per symbol, hence the example at the end of this video with 175 questions / 100 symbols.
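A quick simulation of that average (a sketch; machine-2 probabilities and the question tree as in the video):

```python
import random

random.seed(0)
symbols = ["A", "B", "C", "D"]
probs = [0.5, 0.125, 0.125, 0.25]             # machine 2
questions = {"A": 1, "B": 3, "C": 3, "D": 2}  # questions to identify each symbol

n = 100_000
draws = random.choices(symbols, weights=probs, k=n)
avg = sum(questions[s] for s in draws) / n
print(avg)  # ~1.75 questions per symbol on average
```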
@afridgowun6623 6 years ago
This explained next to nothing to me. It may only be clear to those who have already been confused by it for years, not to someone like me who never thought about it and is just starting to grasp its meaning. Sorry, but the explanation is so fast that it couldn't hold my attention, and it isn't a work of art to me, because it left me confused.