Language Models For Software Developers in 17 Minutes

158,390 views

Code to the Moon

1 year ago

A fast guide on how you can incorporate state-of-the-art language models into your application without relying on a pay-per-request service like OpenAI.
Code from the video: github.com/Me163/youtube/tree...
Hugging Face Course: huggingface.co/learn/nlp-cour...
Camera: Canon EOS R5 amzn.to/3CCrxzl
Monitor: Dell U4914DW 49in amzn.to/3MJV1jx
SSD for Video Editing: VectoTech Rapid 8TB amzn.to/3hXz9TM
Microphone 1: Rode NT1-A amzn.to/3vWM4gL
Microphone 2: Sennheiser 416 amzn.to/3Fkti60
Microphone Interface: Focusrite Clarett+ 2Pre amzn.to/3J5dy7S
Tripod: JOBY GorillaPod 5K amzn.to/3JaPxMA
Mouse: Razer DeathAdder amzn.to/3J9fYCf
Keyboard (sometimes): Keychron Q1 amzn.to/3YkJNrB
Computer: 2021 Macbook Pro amzn.to/3J7FXtW
Lens: Canon RF 24mm F1.8 Macro IS STM Lens amzn.to/3UUs1bB
Caffeine: High Brew Cold Brew Coffee amzn.to/3hXyx0q
More Caffeine: Monster Energy Juice, Pipeline Punch amzn.to/3Czmfox
Building A Second Brain book: amzn.to/3cIShWf

Comments: 144
@lebogang_nkoane · 1 year ago
Possibly the best tutorial on how to plug and play with the models. Much appreciated, I gained valuable insight. Respect!✊
@codetothemoon · 1 year ago
thanks, glad you got something out of it!
@denisblack9897 · 1 year ago
this
@mediahost2243 · 1 year ago
Agreed
@clivebird5729 · 1 year ago
Thank you. This was a very impressive, hype-free presentation. 10 out of 10.
@joerua470 · 1 year ago
Thanks for the best plug and play model video out there. Super short and right to the point.
@codetothemoon · 1 year ago
thanks for watching, glad you got something out of it!
@TheMzbac · 1 year ago
As always, very informative and clear. Explains the essentials.
@codetothemoon · 1 year ago
thanks, glad you liked it!
@AdrianMark · 1 year ago
Thanks so much. I just read through the PrivateGPT .py files, and I don't have any Python experience. This video helped me understand the reasoning behind that code.
@siddharthmanumusic · 1 year ago
Wow man, this is so good and simple! Didn't know it was this easy to get a model and run it!
@codetothemoon · 1 year ago
nice, really happy you got something out of the video!
@Mustafa-099 · 1 year ago
Wow man you have put so much effort into this tutorial! I love it, thank you :)
@codetothemoon · 1 year ago
I did, thanks for watching, glad you got something out of it!
@mu3076 · 1 year ago
*I just found myself a Goldmine, definitely subscribing*
@simonabunker · 1 year ago
This is a really great tutorial! The obvious follow-up would be fine-tuning an LLM with your own data. Or maybe using LangChain to do it?
@codetothemoon · 1 year ago
thanks, and I agree! I'd love to make videos on both of those topics.
@FabrizioPalmas · 11 months ago
@codetothemoon do it then 😁 (great video)
@subem81 · 1 year ago
This was really great! Would love to see something on how to LoRA train a model for text generation. Would be really cool!
@codetothemoon · 1 year ago
thanks, glad you got something out of it! Fine-tuning in general is definitely something I'd like to cover if I can, as it's likely something most productized uses of LLMs will benefit from...
@sbx1720 · 1 year ago
Great video, thanks!
@codetothemoon · 1 year ago
thanks for watching!
@chris.dillon · 11 months ago
Your REPL is a manual ML evaluation. Many devs don't like manual steps, and eventually we all realize that figuring out whether it works (like asking it the questions the base model flubbed) is manual testing. It's similar to functional testing: "I just coded something up, but how do I know it works?" We can quickly test it by hand, with the massive trade-off that you cannot possibly ask it every question. Put another way: ask it 10,000 questions every time you change something (code/scripts/model/parameters) and give it a score. Even then, the score isn't enough without context. The model itself isn't just bigger for the sake of being bigger; it should be improved and measured, and we gravitate toward better-performing models. In other words, we expect to consume software and models that are well-functioning and performant, and that requires automation. We've been doing this for a long time with plain old software. The new bit in ML is the data and the fact that the results aren't deterministic; that is different from plain software.
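A minimal sketch of the automated check described above: a fixed question set scored against expected answer substrings. The model name, questions, and expected answers are assumptions for illustration, not from the video.

```python
# Tiny evaluation-harness sketch; model, questions, and expected answers are illustrative assumptions.
from transformers import pipeline

generator = pipeline("text2text-generation", model="google/flan-t5-base")

# Hypothetical regression set: (question, substring we expect in a good answer).
eval_set = [
    ("What color is the sky on a clear day?", "blue"),
    ("What is a common pet for humans to have?", "dog"),
]

def score(gen, dataset):
    hits = 0
    for question, expected in dataset:
        answer = gen(question)[0]["generated_text"]
        hits += int(expected.lower() in answer.lower())
    return hits / len(dataset)

print(f"accuracy: {score(generator, eval_set):.2f}")  # rerun after every model/prompt/parameter change
```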
@powpowpony9920 · 11 months ago
Great tutorial!
@DreamsAPI · 1 year ago
Subscribed. Thank you
@codetothemoon · 1 year ago
fantastic, really happy to have you on board!
@chrispycryptic · 1 year ago
This is amazing! Thank you so much! It is not some "black box" that so many of these doomsayers bleat about incessantly, but rather the closest we can really get to a superpower for those who thoroughly understand these models!
@codetothemoon · 1 year ago
You're very welcome, glad you got something out of it! 😎
@otakarbeinhauer · 1 year ago
I'm not a doomsayer, but I agree with the "black box" tag that AI models get. I don't understand how this video proved to you that it is not a "black box". The guy making the video was often literally surprised by the answers he got from the language model, and when you're unable to predict the answer, that's basically a "black box" by definition. You probably think "black box" means we don't understand how the language models work. That is not correct. What the phrase means is that we don't know what those models will return. I hope I clarified things for you.
@ohorit · 9 months ago
Excellent!
@codetothemoon · 9 months ago
thank you, glad you liked it!
@ibriarmiroi · 1 year ago
Thank you for your content. I would like to ask you to make a video on the web app and the new developments in that regard.
@codetothemoon · 1 year ago
thanks for watching! I will definitely be making more content around Rust full stack web applications.
@FabianBarajas · 1 year ago
Dude, this is amazing
@codetothemoon · 1 year ago
Nice, glad you got something out of it!
@staystealth · 1 year ago
thank you so much for this
@codetothemoon · 1 year ago
thanks for watching, glad you got something out of it!
@normanlove222 · 1 year ago
I would love it if you did a video taking this one step further and showed how to connect to our local data using LlamaIndex.
@robbbieraphaelday999 · 1 year ago
Thank you!
@codetothemoon · 1 year ago
thanks for watching, glad you got something out of it!
@WillyKusimba · 1 year ago
Impressive!
@codetothemoon · 1 year ago
thanks, glad you liked it!
@s1v7 · 11 months ago
It's actually even scary how simple everything is.
@a_k__ · 1 year ago
I love this one
@codetothemoon · 1 year ago
glad you liked it!
@Metruzanca · 1 year ago
This is excellent. Just the other day I was wondering if I could take a markdown wiki (or Obsidian vault) and feed it to an LM to ask questions about it. Specifically, I wanted to take a messy wiki from work, give it to an LM, and then ask the LM questions. Then we can improve the wiki and get an even better chatbot that can answer questions. Then we could do something like feed it the Python and Django docs as a baseline.
@codetothemoon · 1 year ago
nice, yes I think what you describe is definitely achievable!
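One hedged way to wire that up, outside the scope of the video: chunk the wiki, embed the chunks, retrieve the closest one, and prepend it to the question as context. The model names, the sentence-transformers dependency, and the sample chunks below are assumptions.

```python
# Rough retrieve-then-ask sketch; models, chunking, and sample text are assumptions.
from sentence_transformers import SentenceTransformer, util
from transformers import pipeline

embedder = SentenceTransformer("all-MiniLM-L6-v2")                       # small embedding model
reader = pipeline("text2text-generation", model="google/flan-t5-base")   # answer generator

# Pretend these are paragraphs pulled from the markdown wiki / Obsidian vault.
chunks = [
    "Deployments are triggered by pushing a tag that starts with 'release-'.",
    "The staging database is refreshed from production every Sunday night.",
]
chunk_embeddings = embedder.encode(chunks, convert_to_tensor=True)

def ask(question: str) -> str:
    q_emb = embedder.encode(question, convert_to_tensor=True)
    best = util.cos_sim(q_emb, chunk_embeddings).argmax().item()  # index of the most similar chunk
    prompt = f"Answer using this context: {chunks[best]}\nQuestion: {question}"
    return reader(prompt)[0]["generated_text"]

print(ask("How do I trigger a deployment?"))
```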
@Christian-op1ss · 1 year ago
Hi, thanks for the nice video! A question: in the Rust video you refer to, you load libtorch/PyTorch. However, if you build that program in Rust and give it to someone, wouldn't they also have to install libtorch on their machine and set up their environment? And if so, do you know of a way to embed libtorch or PyTorch?
@codetothemoon · 1 year ago
Good question, I'm actually not sure!
@ChrisMeans · 11 months ago
Thanks!
@codetothemoon · 11 months ago
wow thank you so much for the support! much appreciated! really happy you got something out of the video.
@juanmadelaflor6406 · 1 year ago
Is this possible in a Node environment? Thanks.
@eyemazed · 1 year ago
Awesome tutorial. Would it be possible to make a video on how to take one of these models and fine-tune it further on our own corpus of content? Sort of make it learn additional things on top; is that too complicated?
@daviskipchirchir1357 · 1 year ago
Yeah this would be greattttt
@cybersamurai99 · 1 year ago
Search for PrivateGPT and you will find your answer!
@PeterAdiSaputro · 11 months ago
The diagram at 9:30 reminds me of when I tried to learn SVMs.
@kellybrower7634 · 1 year ago
I'm curious about your vim workflow. Nice colorings, etc. Also, how do you use the keyboard to open a penv terminal automatically sized side by side?
@codetothemoon · 1 year ago
thanks, actually it's Doom Emacs! To open the Python terminal you can just use M-x and then select "run python". I'm using the doom-monokai-pro theme.
@papa-pete · 1 year ago
Can I use TensorFlow instead of PyTorch?
@bruninhohenrri · 1 year ago
How can I load a list of "intents" for a language model to classify? Like when I send a question such as "Please turn on the lights", I would like the language model to classify the "turn_on" intent as true, with the entity "lights".
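One hedged way to approach this kind of intent classification, not covered in the video, is a zero-shot classification pipeline that scores a list of candidate intent labels against the input. The model and label names below are assumptions; entity extraction would be a separate step.

```python
# Zero-shot intent classification sketch; model choice and intent labels are assumptions.
from transformers import pipeline

classifier = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")

result = classifier(
    "Please turn on the lights",
    candidate_labels=["turn_on", "turn_off", "set_temperature"],
)
print(result["labels"][0], result["scores"][0])  # highest-scoring intent and its score

# Pulling out the entity ("lights") could be handled by a separate token-classification (NER) pipeline.
```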
@KlausSzyska · 1 year ago
Your example works nicely in a Jupyter notebook after making sure a working version of transformers (4.28.0.dev0) is installed.
@KlausSzyska · 1 year ago
Spoke too soon. The base model works nicely for the [blue] sky question, but the common-pet question gets the spokesman for the Associated Press involved:
Enter something: What is a common pet for humans to have?
["W: I'm sorry, but I'm not sure what happened. I'm not sure what happened."]
['h - h - h']
['spokesman for the Associated Press said the agency was unable to comment on the matter.']
['t t t t t t t t ... (repeats)']
[' ']
['i. i. i. i. i. ... (repeats)']
@KlausSzyska · 1 year ago
Interestingly, this bad (t5-large) model behavior only happens when run inside your for-loop structure. When the for loop calls a function instead, the large model gets the answers right, even for Apollo 13 when the question is more specific (i.e., the aborted landing).
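For reference, a minimal version of that question loop with the generation wrapped in a function; the exact structure of the video's code may differ, and the model name and parameters here are assumptions.

```python
# Question loop with generation factored into a function; details may differ from the video's code.
from transformers import pipeline

generator = pipeline("text2text-generation", model="google/flan-t5-large")

def answer(question: str, max_new_tokens: int = 100) -> str:
    return generator(question, max_new_tokens=max_new_tokens)[0]["generated_text"]

while True:
    question = input("Enter something: ")
    if not question:
        break
    print(answer(question))
```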
@autohmae · 1 year ago
I realized something: are the tokens also what we see in The Matrix digital rain? 🙂
@codetothemoon · 1 year ago
maybe!
@IroAppe · 1 year ago
I know this tutorial wanted to show how to save on expenses and run a model on your own machine. But ironically, the outcome actually showed me what we're paying OpenAI for. These models already get very basic things wrong, really. I don't know if there's much use for these models yet.
@normanlove222 · 1 year ago
The idea is to combine this with our local data; then it becomes powerful. We would do that via LlamaIndex. There just aren't too many videos out there on how to do this.
@dwivedys · 1 year ago
For running Transformers on a Windows machine, do I need PyTorch or TensorFlow or both installed?
@codetothemoon · 1 year ago
you'd definitely need one or the other (this particular example uses PyTorch, but I think you can just replace "pt" with "tf" for TensorFlow). Not sure about any of the specifics of getting it working on Windows, though.
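A hedged sketch of what that swap looks like when you call the tokenizer yourself (the video's exact code may differ); each line assumes the corresponding framework is installed.

```python
# "pt" vs "tf" when tokenizing; each call assumes the matching framework is installed.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("google/flan-t5-base")

pt_inputs = tokenizer("What color is the sky?", return_tensors="pt")  # PyTorch tensors
tf_inputs = tokenizer("What color is the sky?", return_tensors="tf")  # TensorFlow tensors

# The model class changes too: AutoModelForSeq2SeqLM (PyTorch) vs. TFAutoModelForSeq2SeqLM (TensorFlow).
```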
@occamsrazor1285 · 1 year ago
13:49 There were two: Apollo 8 and 10
@ma34529 · 1 year ago
I love the random Rust shoutout
@codetothemoon · 1 year ago
not entirely random - much of my audience comes from my Rust content 🦀 gotta build a bit of a bridge there!
@ma34529 · 1 year ago
@codetothemoon I'm a Rustacean, I loved it.
@oneproductivemusk1pm565 · 1 year ago
Isn't this the Fireship narrator? Sounds like him! Is this your new channel?
@itsmedheeraj · 1 year ago
Before this video, AI was a black box for me; now I know the basics of how it works.
@codetothemoon · 1 year ago
nice, really happy you got something out of the video!
@rembrandtharmenszoon5889 · 1 year ago
Hey, you use vim; that's kind of unexpected since the context is AI and stuff. That's great! 👍 But can I ask why?
@codetothemoon · 1 year ago
actually it's Doom Emacs! It has evil mode (vim bindings) enabled by default. I use the vim motions because I feel like they make me more productive, and I use Emacs mostly because of Org mode. If there were no Org mode, I might be using Neovim.
@shanukumarsharma9264 · 1 year ago
Subscribed. Just include a few practical projects. Most people just want to replicate. Show how to make useful things that might work in the real world...
@codetothemoon · 1 year ago
nice, very happy to have you here! I do plan to make more videos about practical applications of language models. Stay tuned...
@animeshsahu2803 · 1 year ago
Just a question, do these language models hold context for previous questions and replies?
@codetothemoon · 1 year ago
The way I'm using it in this video, the answer is no. There are models specifically trained for this task though; check this out: huggingface.co/models?pipeline_tag=conversational&sort=downloads
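To illustrate the stateless behavior and the simplest workaround, here is a sketch that carries the conversation history in the prompt itself; the model choice and prompt format are assumptions, and a model trained specifically for conversation will generally do better.

```python
# Carrying conversation history manually in the prompt; model and format are assumptions.
from transformers import pipeline

generator = pipeline("text2text-generation", model="google/flan-t5-base")
history = []

def chat(user_message: str) -> str:
    context = " ".join(history)  # fold prior turns into the prompt so the model can "remember" them
    reply = generator(f"{context} User: {user_message} Assistant:")[0]["generated_text"]
    history.append(f"User: {user_message} Assistant: {reply}")
    return reply

print(chat("My name is Sam."))
print(chat("What is my name?"))  # without the history, each call would be answered in isolation
```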
@animeshsahu2803 · 1 year ago
@codetothemoon hey thanks, appreciate you taking the time to reply.
@being-awe-some · 1 year ago
Is the Python IDE you're showing within Hugging Face?
@codetothemoon · 1 year ago
actually I'm using Doom Emacs in this video!
@valueray · 1 year ago
How can I add my own corpus to a base model and weight it? I have a big searchable database but want to make it "transformer searchable". Does such a solution exist? Please, someone point me in the right direction. Thank you.
@flipper71100 · 1 year ago
I wanted to clear up something about the example. When the tokenizer separated the word "undoubtedly" into two tokens, "_un" and "doubtedly", you mentioned it was mainly for efficiency. However, what I learned is that this is called a subword tokenization strategy, and it helps the model handle words not found in its training data. Isn't that the purpose?
@codetothemoon · 1 year ago
I think that's another aspect of it. But it's also my understanding (could be wrong) that it simplifies the model (and potentially improves performance) by having it learn the semantic meaning of the "root" of a word, instead of having to independently learn the semantic meaning of each slightly different flavor of a word.
@flipper71100 · 1 year ago
@codetothemoon I dug deep and found something interesting, and you were right about the performance point. Here is a summary of what I found: large language models typically have a fixed vocabulary size due to memory constraints. Subword tokenization enables the model to represent a larger number of unique words by encoding them as subword tokens, which helps it handle a wide range of words while keeping the vocabulary manageable. Another goal is to reduce the overall number of tokens in a text, making the input sequences shorter; this leads to more efficient memory utilization and faster computation during training and inference.
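A quick way to see the split for yourself; the exact subword pieces depend on the tokenizer's vocabulary, so the outputs shown in the comments are illustrative, not guaranteed.

```python
# Inspecting subword tokenization; exact pieces depend on the model's vocabulary.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("google/flan-t5-base")

print(tokenizer.tokenize("undoubtedly"))   # e.g. something like ['▁un', 'doubtedly']
print(tokenizer.tokenize("unhappiness"))   # the 'un' piece can be shared with other words
print(len(tokenizer))                      # the fixed vocabulary size mentioned above
```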
@VaibhavShewale · 1 year ago
So to use it, do we have to download the model first?
@codetothemoon · 1 year ago
yes, but this will be done automatically the first time you run the program (if written the way we do here). By default, the Transformers package completely abstracts away the process of downloading the model
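In practice that first run looks something like the sketch below; the weights are cached locally by the library (typically under ~/.cache/huggingface, though the location can vary by version and environment), so later runs skip the download.

```python
# The first call downloads and caches the model weights; subsequent runs load from the local cache.
from transformers import pipeline

generator = pipeline("text2text-generation", model="google/flan-t5-base")  # download happens here on first use
print(generator("What color is the sky?")[0]["generated_text"])
```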
@theLowestPointInMyLife · 1 year ago
What use are these models to your own app/site? Wouldn't you need to train them on your own data that's relevant to your site?
@codetothemoon · 1 year ago
They are pretty capable with no fine-tuning; it's very possible they might already do what you need without any additional training. But fine-tuning can definitely make them better. I'm up for making a video about that if folks are interested.
@theLowestPointInMyLife · 1 year ago
@codetothemoon I'm thinking of site functionality, so a user is asking a question about how to use the site in some way, or about some rules of the site. If it's just a general model, I don't see the point of every site having one.
@codetothemoon · 1 year ago
@theLowestPointInMyLife right - actually you can cover quite a bit of ground by having app-specific prompts that you interpolate user-specific values into. Using this approach you can generate app-specific output without doing any fine-tuning of the model itself. That may or may not be sufficient depending on what you're doing.
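A tiny sketch of that prompt-interpolation approach; the prompt wording, site name, and field names are made up for illustration.

```python
# App-specific prompt with user-specific values interpolated in; wording and names are illustrative.
from transformers import pipeline

generator = pipeline("text2text-generation", model="google/flan-t5-base")

PROMPT_TEMPLATE = (
    "You are the help assistant for ExampleApp. "
    "Site rule: {rule}\n"
    "User question: {question}\n"
    "Answer:"
)

prompt = PROMPT_TEMPLATE.format(
    rule="Accounts can be deleted from Settings > Privacy.",
    question="How do I delete my account?",
)
print(generator(prompt)[0]["generated_text"])
```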
@anuradhaamarasinghe9649 · 1 year ago
@codetothemoon Please create a video about fine-tuning if you can. Thanks for this one. Great stuff as always.
@structuralcoverage · 1 year ago
Same here
@edgeeffect · 1 year ago
Disappointed it didn't know about Apollo 10!
@jxn22 · 1 year ago
I: "In whose legion fights the most formidable armed combatant?" O: "british army" 🤦‍♂ Great video, CttM.
@codetothemoon · 1 year ago
thank you!
@ikemkrueger · 1 year ago
How can I fix errors the model makes, like the ones with the Apollo question?
@codetothemoon · 1 year ago
I'm willing to bet it could be fixed with fine-tuning, especially if you have some idea of the specific types of questions the model will be expected to answer accurately. The other option is to use larger flavors of the same model. You should be able to get what you're going for using one or both of those strategies.
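Trying a larger flavor of the same model family is typically a one-line change, at the cost of a bigger download and more memory; the exact checkpoints available should be verified on the Hugging Face Hub.

```python
# Swapping in a larger checkpoint of the same family; availability and sizes per the Hub.
from transformers import pipeline

small = pipeline("text2text-generation", model="google/flan-t5-base")
large = pipeline("text2text-generation", model="google/flan-t5-large")  # bigger, slower, often more accurate

question = "Which Apollo missions orbited the moon without landing on it?"
print(small(question)[0]["generated_text"])
print(large(question)[0]["generated_text"])
```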
@yash1152 · 1 year ago
2:24 tasks of NLP
@snk-js · 1 year ago
Did you feed the AI all of your channel's content, and now it spits out the whole video for you? Those edits are super cool, though.
@codetothemoon · 1 year ago
I wish! Definitely spent many hours editing this one 😎
@dinoscheidt · 1 year ago
🚨 Keep in mind that many open source models DO NOT have commercial licensing.
@codetothemoon · 1 year ago
great point! I think I originally mentioned this in the video but I may have left it out of the final cut. But the one we're using in the video (FLAN-T5) is Apache 2.0 and thus should be fine to use for commercial purposes.
@dinoscheidt · 1 year ago
@codetothemoon Yes, the ones you showed are safe. Just wanted to point it out… I ran into a start-up last week that proudly pitched their product and then was devastated when a superficial tech DD revealed that they break licenses left and right. Licenses have always been the annoying accounting part everyone dreads, but for ML models it's currently 10x as confusing, and that's probably the only reason the "you can actually run stuff locally" approach needs extra care in that department. Anyway: great video as always, Mr. Moon!
@jobilthomas64 · 11 months ago
🎉
@hotdog2c · 1 year ago
Have you found one model to be particularly good for coding?
@codetothemoon · 1 year ago
I haven't experimented with enough open source models to have a good answer to this, sorry!
@ToddDunning · 1 year ago
Typescript pleeze, or is that not cool this week?
@codetothemoon · 1 year ago
not sure if I'll make it to a TypeScript version, though I imagine such a thing would be immensely popular. I am likely going to make a chatbot webapp in 100% Rust - it might be fun to do a followup of the same webapp using something like SvelteKit - we'll see
@XzcutioneR2 · 1 year ago
How do I get the maximum number of tokens supported by each model? I'm trying to build a generic scraper that scrapes the given documentation of an app and finds all relevant REST APIs, webhooks, etc., along with the relevant details. I would then convert them into embeddings, insert them into a database, and search them semantically. What I'm struggling with right now is how to find the structure of the documentation. I think an LLM would help with that, but with most of them I feel I'll have a problem with the maximum number of tokens when I pass the HTML to them. I have access to GPT-4, but AFAIK the 32k model hasn't been released yet, and even if I had access to it, I'd run into cost constraints.
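For the first part of the question, the tokenizer usually reports a context limit, with the caveat that some tokenizers return a huge placeholder value here, so the model card remains the authoritative source.

```python
# Checking a model's maximum input length via its tokenizer; verify against the model card,
# since some tokenizers report a very large placeholder value instead of the real limit.
from transformers import AutoTokenizer

for name in ["google/flan-t5-base", "bert-base-uncased"]:
    tokenizer = AutoTokenizer.from_pretrained(name)
    print(name, tokenizer.model_max_length)
```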
@hubstrangers3450 · 1 year ago
Thank you. However, I believe most of the famous tech corps were aware of this space well before the 2017 paper (by roughly 5-7 years) and were masking this technology from the public, even though they used public datasets. The public should really seek accountability through a proper legal framework. If only politicians were skillful, knowledgeable, and talented enough, the least they could ask of these tech corps is to provide cloud platform workspaces at close to zero pricing.
@olsuhvlad · 1 year ago
35 But some man will say, How are the dead raised up? and with what body do they come? 36 Thou fool, that which thou sowest is not quickened, except it die: 37 And that which thou sowest, thou sowest not that body that shall be, but bare grain, it may chance of wheat, or of some other grain: 38 But God giveth it a body as it hath pleased him, and to every seed his own body. 39 All flesh is not the same flesh: but there is one kind of flesh of men, another flesh of beasts, another of fishes, and another of birds. 40 There are also celestial bodies, and bodies terrestrial: but the glory of the celestial is one, and the glory of the terrestrial is another. 41 There is one glory of the sun, and another glory of the moon, and another glory of the stars: for one star differeth from another star in glory. 42 So also is the resurrection of the dead. It is sown in corruption; it is raised in incorruption: 43 It is sown in dishonour; it is raised in glory: it is sown in weakness; it is raised in power: 44 It is sown a natural body; it is raised a spiritual body. There is a natural body, and there is a spiritual body. 45 And so it is written, The first man Adam was made a living soul; the last Adam was made a quickening spirit. 46 Howbeit that was not first which is spiritual, but that which is natural; and afterward that which is spiritual. 47 The first man is of the earth, earthy: the second man is the Lord from heaven. 48 As is the earthy, such are they also that are earthy: and as is the heavenly, such are they also that are heavenly. 49 And as we have borne the image of the earthy, we shall also bear the image of the heavenly. 50 Now this I say, brethren, that flesh and blood cannot inherit the kingdom of God; neither doth corruption inherit incorruption. 51 Behold, I shew you a mystery; We shall not all sleep, but we shall all be changed, 52 In a moment, in the twinkling of an eye, at the last trump: for the trumpet shall sound, and the dead shall be raised incorruptible, and we shall be changed. 53 For this corruptible must put on incorruption, and this mortal must put on immortality. 54 So when this corruptible shall have put on incorruption, and this mortal shall have put on immortality, then shall be brought to pass the saying that is written, Death is swallowed up in victory. 55 O death, where is thy sting? O grave, where is thy victory? 56 The sting of death is sin; and the strength of sin is the law. 57 But thanks be to God, which giveth us the victory through our Lord Jesus Christ. (1Co.15:35-57)
@codetothemoon · 1 year ago
do you think LLMs could be used to create a more accessible translation of the Bible?
@jereziah · 1 year ago
** and * are called 'splat' operators when they're used this way: kzbin.info/www/bejne/qn2Uq5-qjdp7hZo
@codetothemoon · 1 year ago
thanks for pointing this out!
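For anyone unfamiliar with the term, a quick standalone illustration of those operators in plain Python, using a toy function rather than anything from the video:

```python
# * unpacks positional arguments, ** unpacks keyword arguments.
def generate(prompt, max_new_tokens=50, temperature=1.0):
    return f"{prompt!r} (max_new_tokens={max_new_tokens}, temperature={temperature})"

args = ("What color is the sky?",)
kwargs = {"max_new_tokens": 20, "temperature": 0.7}

print(generate(*args, **kwargs))
# same as: generate("What color is the sky?", max_new_tokens=20, temperature=0.7)
```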
@Jay-kb7if · 1 year ago
I suspect tokenization stems from cognitive psychology and morphemes. If a token is a meaning unit, then a morpheme is the smallest possible meaning unit. E.g., un + happy: "un" changes the meaning of the word.
@codetothemoon · 1 year ago
I suspect this is true! it's interesting how advances in language models might wind up providing more hints about how the human brain works...
@Jay-kb7if · 1 year ago
@codetothemoon Yep! I lecture in cognitive psychology and there are a lot more parallels. I did a class with a Hugging Face GPT model where you could change the token length to illustrate meaningful sentence production/comprehension (i.e., Wernicke's aphasia). I am fairly certain that an interdisciplinary field of academia will spawn in the next few years that marries machine learning and cognitive psychology. I'm sure there's the odd journal here and there, but I think it'll take the place of conventional cognitive psychology.
@BlackIcexxi · 1 year ago
What in the hell is that line numbering scheme? Lollll. Great video though!
@anon_y_mousse · 1 year ago
Interesting, but let's see you do it from scratch in C.
@alurma · 1 year ago
Awesome video, thanks, this helps! Not a fan of the clickbait title, though.
@codetothemoon · 1 year ago
Thanks for the feedback! What do you think a better title would be?
@burnere633 · 1 year ago
@John *flan*-t5
@burnere633 · 1 year ago
I thought it was less clickbait and more tongue-in-cheek (in retrospect -- I didn't have the context until I watched the video).
@codetothemoon · 1 year ago
@John-wh9oh flan is an amazing dessert 🍮!
@poulticegeist · 1 year ago
@John-wh9oh flan is the name of the model primarily used in this video, which also happens to be the name of a dessert.
@browaruspierogus2182 · 1 year ago
They teach how to import pretrained models but never show how to train models and create a system of neural network layers the way a normal dev would. All that AI stuff is just a monopoly on computation patterns.
@codetothemoon · 1 year ago
pretrained models are an incredible resource, as the vast majority of developers don't have the resources or budget to perform such trainings. I'm not sure what you mean by "create a system of neural network layers to use as a normal dev would", as the models in this video are built upon neural networks
@browaruspierogus2182 · 1 year ago
@codetothemoon To show how to build NN matrices with predictors for specific models like shape recognition or sound recognition.
@DrCHE42 · 11 months ago
Great presentation, but the presenter needs, IMHO, to speak more slowly
@codetothemoon · 11 months ago
thanks and thanks for the feedback!
@mail2toan · 1 year ago
I took it a little too far. It's totally me. 😂
@astroganov · 1 year ago
Thanks for the video 👍. I just can't stop wondering how such an impossibly poor, slow, and inconvenient programming language, with a development environment from the past century, became the standard for machine learning tasks... It doesn't even have method name autocomplete, it doesn't show the full list of an object's available methods, no parameter lists, no types, just nothing. 💩 Crazy. return_tensors="pt" 🤦
@robertlawson4295 · 1 year ago
Just a bit of food for thought... all CPU chips ONLY understand how to manipulate bits and bytes in registers. EVERY computer language is simply a way to create those bit patterns to feed the hardware chip. Each language chooses how much abstraction to use and how the abstractions are interpreted. Some do it this way and some do it that way, but the end result in ALL cases is a sequence of machine (binary) codes that the assembly language for the chip can write directly with NO abstractions, except perhaps some mnemonics (memory aids) to avoid memorizing all the binary values associated with each machine instruction.
Programming, paradoxically, is much easier in assembly language than in all others because you've gotten rid of all the abstractions and interpretations you have to wrap your head around. Of course, you still have the abstractions needed for solving the problem you are tackling, but you don't have abstractions within abstractions. That's what makes computer programming challenging. Just sayin'... 😉
@codetothemoon · 1 year ago
I've never encountered a problem that was easier to solve in assembly than a higher level language. The abstractions of higher level languages are there to reduce our cognitive load, and they are incredibly effective at doing so
@robertlawson4295 · 1 year ago
@codetothemoon Absolutely, I agree. Abstracting things is important in many fields, not just programming. I was principally interested in peeling the onion to show why something like programming CAN be made more difficult because of abstractions, but of course we have to understand that creating abstractions is an art in itself. It can be done well or done poorly. The effectiveness of using assembly language also depends on the programmer having a rock-solid understanding of the underlying electronics hardware. As an electronics engineer, I have had to work with many chips that didn't have compilers available, so all coding had to be done in assembly and carefully mapped out with only the most rudimentary abstractions. This is also true of the microcode that exists internally in every chip to interpret the binary machine-code instructions themselves.
@sabin97 · 1 year ago
Just came here for the thumbnail. We're not "coders"; a trained monkey can code. We are programmers. If you don't understand the difference, it's the same as the difference between writing and typing. Yeah... typing/coding is just hitting the keys in order to form the words; writing/programming is thinking. Not even gonna watch the video. The thumbnail earned it a dislike.
@TheOleHermit · 1 year ago
I've been self-learning Arduino & Python on YT for several projects over the past few years, but I have absolutely no idea what is being said in this video. Must be a different language than English. 🤷‍♂ I'll circle back around in another couple of years, after gaining some more smarts.