We need a full network programming tutorial. I'm way too interested in this now.
@IQof2 7 months ago
I work in networking and it's a really interesting bridge for me
@FunkyELF 7 months ago
"First try" really is a moving target. Back in the day, "first try" meant it compiled and ran correctly, but now with IDEs and LSPs he's guaranteed zero compile errors.
@SnowDaemon 7 months ago
First Try on the 3rd go around, let's go
@fgf80 7 months ago
Huffman coding is the most efficient way to encode arbitrarily large streams of data if you just grow the word size to be arbitrarily large. Other compression algorithms exist because growing the word size has exponential cost in the size of the Huffman tree, so better results are possible even for quite large finite data streams if you can avoid the cost of transferring the encoding table itself. Limiting your encoding scheme to a fixed word size of 7 or 8 bits will hurt the potential for optimization down the line. I could imagine an optimal ASCII encoding involving a pre-shared Huffman tree baked into the compression and decompression tools, generated from a histogram of a really large body of input data, like the Library of Congress, with word sizes of several bytes.
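A minimal sketch of the pre-shared-tree idea in Python: both tools build the same code table from an agreed-upon reference histogram, so the table never has to travel with the data. The reference string here is a hypothetical stand-in for a real corpus like the one the comment imagines.

```python
import heapq
from collections import Counter

def huffman_code_lengths(freqs):
    """Build Huffman code lengths from a symbol -> count histogram."""
    # Heap entries: (weight, unique tiebreaker, {symbol: depth so far}).
    heap = [(w, i, {sym: 0}) for i, (sym, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        w1, _, d1 = heapq.heappop(heap)
        w2, _, d2 = heapq.heappop(heap)
        # Merging two subtrees pushes every symbol in them one level deeper.
        merged = {s: depth + 1 for s, depth in {**d1, **d2}.items()}
        heapq.heappush(heap, (w1 + w2, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

# "Pre-shared" histogram from a reference corpus (hypothetical stand-in
# for a histogram of a huge body of text).
reference = Counter("this is a reference corpus shared by both tools")
lengths = huffman_code_lengths(reference)
# The most frequent symbol (space) gets one of the shortest codes.
```

Because both sides derive `lengths` from the same histogram, only the compressed bits need to be transmitted, which is exactly the saving the comment describes.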
@dough-pizza 7 months ago
Poor grug no understand what this human say! Grug understand only simpler terms
@idf3da 7 months ago
Grug hear me, Huffman coding good for making data small. But if Grug make word size too big, tree get too big too. Other ways better for big data, no need big tree. If Grug use 7 or 8 bit words, it not best. Maybe best if Huffman tree already made from big data, like library of congress, and used by all tools.@@dough-pizza
@fgf80 7 months ago
@@dough-pizza if your data is big enough, you get better compression out of a Huffman code if each code represents, e.g., two bytes instead of one. The problem is that makes the Huffman tree 256x bigger, so you have to be compressing a lot more data for the better compression ratio to be worth the bigger table size. If you have ENOUGH data, it’s actually been proven you can’t (in general) outperform a Huffman coding scheme where each code represents a large enough number of bytes. I was saying Prime here is limiting his possibilities for improved compression ratios by designing it around having each code correspond to exactly one byte.
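The tradeoff in this comment can be illustrated without building a tree at all: Shannon entropy lower-bounds any symbol-by-symbol code, so comparing entropy per byte at 1-byte versus 2-byte symbols shows the headroom larger codes unlock. A sketch in Python, using artificial pair-correlated data as the example input:

```python
import math
from collections import Counter

def entropy_per_byte(data: bytes, word: int) -> float:
    """Shannon entropy in bits per byte when coding `word`-byte symbols."""
    chunks = [data[i:i + word] for i in range(0, len(data) - word + 1, word)]
    counts = Counter(chunks)
    total = len(chunks)
    h = -sum(c / total * math.log2(c / total) for c in counts.values())
    return h / word

data = b"ab" * 256  # strongly pair-correlated data
h1 = entropy_per_byte(data, 1)
# Single-byte symbols: 'a' and 'b' each occur 50% of the time,
# so no 1-byte code can beat 1 bit per byte.
h2 = entropy_per_byte(data, 2)
# Two-byte symbols: only "ab" ever occurs, so the entropy is 0 bits per byte.
```

The catch, as the comment says, is the table: 2-byte symbols mean up to 65,536 leaves instead of 256, so the data has to be large (or the table pre-shared) before the improved ratio pays for it.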
@the_duda 7 months ago
@@dough-pizza if data big huffman good
@dough-pizza 7 months ago
@@fgf80 Grug understood! If one Huffman code represent multiple bytes then compression good! Prime here use Huffman to represent one byte so compression bad 😞
@Pasyy 7 months ago
damn this is the first time I'm this early watching a skill-issue-agen video
@patrickrealdeal 7 months ago
anyone know what colorscheme this is?
@nasario.d5169 7 months ago
rose pine messed up by tmux
@sa-hq8jk 7 months ago
17:06
@Gennys 7 months ago
This is not what I need at the moment but I'm going to watch this video anyway because I love you and your content
@airkami 7 months ago
What is this channel and why does YouTube suggest it?
@pavlinggeorgiev 7 months ago
It's a mustache enthusiasts channel.
@SnowDaemon 7 months ago
its a ligma channel
@maxscott3349 7 months ago
dog butt
@tgirlshark 7 months ago
you just found the peak youtube channel
@felipesharkao 7 months ago
this is where we learn to use vim effectively, by the way