Great explanation... Can you please create one on GitHub: what is GitHub, how it works, examples, etc.?
@francoislepron23014 сағат бұрын
This is a really handsome step-by-step tutorial. You mentioned that we can use the GUI package that we want. Did you try with PySide instead of Tkinter?
@sumangautam40165 сағат бұрын
Can you do one on LLM application development in the cloud, such as Azure, and deploying it as a container?
@PythonSimplified10 минут бұрын
Check out my NVIDIA NIM tutorial with Llama 3.1, it's pretty close to what you're looking for (as it is also Docker-based): ⭐ NVIDIA NIM Beginners Tutorial: kzbin.info/www/bejne/rHabYYaCoJJnhq8 I'm using NVIDIA Launch Pad in terms of cloud, but Azure has fully integrated NIM microservices. I hope it helps! 🙂
@e-vdКүн бұрын
Awesome tutorials, thank you 👍🏻
@PythonSimplified9 минут бұрын
Thank you! Glad you liked it :)
@adilivni8688Күн бұрын
Hi. When I convert a Python file to an exe and I want to add a logo, should I put a line of code for the logo in my Python file?
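A minimal sketch of what this could look like, assuming the exe is built with PyInstaller and the GUI uses Tkinter; my_app.py and logo.ico are placeholder names. The exe's file icon is set at build time with PyInstaller's --icon flag, while the window icon is set inside the Python code itself:

```python
# Hedged sketch: two different icons are involved.
#
# 1) The exe file icon is set at build time, not in the Python code:
#        pyinstaller --onefile --icon=logo.ico my_app.py
#
# 2) The window/taskbar icon is set inside the Python file:
import tkinter as tk

root = tk.Tk()
root.title("My App")
root.iconbitmap("logo.ico")   # .ico file shown in the title bar (Windows)
root.mainloop()
```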
@actsimsКүн бұрын
Your enthusiasm is contagious... I very much need it when I am deep down tired of staring at the monitor for hours... Ahhwww
@razikhan3799Күн бұрын
I'm seeing you after a long time and you've become even more beautiful. What library did you use?
@BitsgeekКүн бұрын
Yo Yo Yo Thank you 😊🎉
@TravisBlake-ew5bx2 күн бұрын
You are the reason I subscribed to this channel. You truly have beautiful eyes.
@Bikslol2 күн бұрын
Hi there, you're great! Greetings from Madagascar!
@PythonSimplified2 күн бұрын
Thank you! Greetings from Vancouver! :)
@hauntedhand51402 күн бұрын
Did you go to college to learn all this?
@PythonSimplified2 күн бұрын
I just finished uni, but they don't really teach it there! This type of stuff you learn on your own :)
@joelleon22562 күн бұрын
Does HasData have a better way to scrape Facebook galleries without using Selenium? I've been having some issues lately.
@bicycleninja16852 күн бұрын
can’t wait
@Viewpoint3142 күн бұрын
There's been a lot of hype around the new Chinese AI, Deepseek, with some saying it’s better than ChatGPT. I’ve tested it myself, and while it provides good answers, I didn’t find them significantly different from what ChatGPT offers. However, the biggest advantage of ChatGPT-and why I continue using it-is its memory. For someone like me, who is working on long-term AI projects and a book series, having an AI that remembers past conversations is a game-changer. It allows me to build on ideas over time without having to start from scratch every day. Imagine working with a team member who forgets everything after 24 hours-how useful would that be? A good collaborator remembers past discussions, learns from them, and helps refine ideas. Sure, competition in AI is great, and improvements from different companies will benefit everyone, but memory is what makes an AI truly valuable for productivity and collaboration. Right now, ChatGPT still leads in that area, making it the best choice for serious projects that require continuity and deeper engagement. Just my two cents, but I’d love to hear what others think. Does AI memory matter to you?
@headhunterz1352 күн бұрын
UT''T'T UT''T'T
@Tobs_2 күн бұрын
watched this stream at 1000x speed, made perfect sense.
@diananancy13 күн бұрын
Are you using a smartphone here?
@christianmontagx84613 күн бұрын
Do I always have to reload the models from the Internet when I start the Docker container? Is there no way to cache the models?
@PythonSimplified3 күн бұрын
You could, but you'd have to store it on your host system. Check out a comment below by @webuildyourcloud regarding mounting volumes 🙂
@YuyangGE3 күн бұрын
UnicodeDecodeError at /myapp/myview/ 'utf-8' codec can't decode byte 0xff in position 0: invalid start byte meow meow I am stuck here, help me the kitten
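Byte 0xff at position 0 usually means a binary file (an image, or UTF-16 text with a byte-order mark) is being read as UTF-8 text. A minimal sketch of one possible fix, assuming the view reads a file from disk; the path is a placeholder:

```python
# Hedged sketch: open the data in binary mode instead of text mode,
# and only decode bytes that really are UTF-8.
path = "uploaded.bin"   # placeholder path

with open(path, "rb") as f:          # "rb" avoids the UnicodeDecodeError on open/read
    raw = f.read()

try:
    text = raw.decode("utf-8")       # decode only if the data is actually UTF-8
except UnicodeDecodeError:
    text = raw.decode("utf-8", errors="replace")  # or handle it as binary data instead

print(text[:100])
```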
@webuildyourcloud3 күн бұрын
Hmm, interesting video, until you started about the file sizes. You do not put files of that size into a container; it goes against all basic containerization principles. It's more effective (and simpler) to first create a volume, then let the container mount the volume, and then download the model into the container.
@PythonSimplified3 күн бұрын
Aren't volumes located on the host system? The purpose of this video was to address the safety concerns that people have with Deepseek. If you store it on your system for efficiency purposes, some of these concerns will remain.
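For anyone who wants to try the volume approach discussed above, here is a minimal sketch using a named Docker volume so the model is downloaded once and reused across container restarts. The image name and the cache path inside the container are placeholders; check your own image's documentation for the real model directory. Note that a named volume does still live on the host, under Docker's own storage area.

```python
# Hedged sketch: run the container with a named Docker volume mounted over the
# model cache directory, so the weights are downloaded once and reused.
# "my-llm-image" and "/root/.cache/models" are placeholders.
import subprocess

# Create the named volume (a no-op if it already exists).
subprocess.run(["docker", "volume", "create", "model-cache"], check=True)

# Start the container with the volume mounted at the model cache path.
subprocess.run([
    "docker", "run", "--rm", "--gpus", "all",
    "-v", "model-cache:/root/.cache/models",   # named volume -> cache dir in container
    "-p", "8000:8000",
    "my-llm-image",
], check=True)
```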
@tundebabalola48993 күн бұрын
Ever since I watched your video on the pandas library, whenever I'm looking to understand anything in Python, I search for your video on that topic first. I saw a lot of negative and distracting comments here, and I thought I just needed to share this. Thank you very much, Ma.
@theycallmemrglass3 күн бұрын
Yep, agree with others here, that was a bloody awesome breakdown video. And love the prose you have in your storytelling!
@sooo0kie3 күн бұрын
You are pretty.
@HylkeHellinga-r7q3 күн бұрын
How do I customize the installer process, or create my own installer, in order to install other non-Python dependencies that my Python dependencies depend on? And how can I, for example, do installations based on hardware detection? For example, installing the NVIDIA CUDA Toolkit if the user has an NVIDIA card and doesn't yet have the toolkit installed, so that the installer could install PyTorch with CUDA support? And how do I create installers for different operating systems, not just Windows? I need installers for macOS, Linux, Android, iOS, etc. These are all questions I would need to somehow find a solution to. Thanks for the explanation on how to create a basic Windows installer, but I need a more in-depth explanation. Do you recommend any sources I can look at to do what I'm trying to accomplish?
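For the hardware-detection part specifically, here is a minimal sketch of the idea in Python rather than a full installer: it checks whether nvidia-smi is available and picks the matching PyTorch wheel index. The index URLs are the publicly documented PyTorch ones; treat the CUDA version as a placeholder. A real installer (for example Inno Setup or NSIS on Windows, or native packages on macOS and Linux) would wrap logic like this.

```python
# Hedged sketch: detect an NVIDIA GPU and install the matching PyTorch build.
# Adjust the CUDA index to whatever your project actually targets.
import shutil
import subprocess
import sys

def has_nvidia_gpu() -> bool:
    """Rough check: nvidia-smi is on PATH and runs successfully."""
    if shutil.which("nvidia-smi") is None:
        return False
    return subprocess.run(["nvidia-smi"], capture_output=True).returncode == 0

if has_nvidia_gpu():
    index = "https://download.pytorch.org/whl/cu121"   # CUDA build
else:
    index = "https://download.pytorch.org/whl/cpu"     # CPU-only build

subprocess.run(
    [sys.executable, "-m", "pip", "install", "torch", "--index-url", index],
    check=True,
)
```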
@ItsMeTheUser3 күн бұрын
You are amazing. Thanks Mariya!
@tcbquick3 күн бұрын
How about a video on setting up YouTube video streaming, sharing multiple device streams: terminals, VNC, VLC, OSs...
@tcbquick3 күн бұрын
You should get a few Raspberry Pis and hook yourself up with the ability to run multiple cameras and YouTube streams and, and, and... for cheap!
@csowm5je3 күн бұрын
Desktop NVIDIA 3080: I didn't download the 1.5B, but it only takes a couple of seconds with the 7B model. With Ollama, it starts streaming the response right away.
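For reference, a minimal sketch of that streaming behaviour, assuming the ollama Python package is installed, the Ollama server is running, and the deepseek-r1:7b tag has already been pulled:

```python
# Hedged sketch: stream tokens from a local Ollama server as they are generated.
# Assumes `pip install ollama`, a running Ollama instance, and
# `ollama pull deepseek-r1:7b` done beforehand.
import ollama

stream = ollama.chat(
    model="deepseek-r1:7b",
    messages=[{"role": "user", "content": "What is the biggest fish on Earth?"}],
    stream=True,                      # yields chunks instead of one final response
)

for chunk in stream:
    print(chunk["message"]["content"], end="", flush=True)
print()
```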
@ISCCS14 күн бұрын
The most beautiful part of this is seeing your face and sweet smile. I'll be waiting for your news.
@WilsonMar14 күн бұрын
What does NIM stand for? Is that an acronym or a made-up word?
@PythonSimplified3 күн бұрын
Initially, before it was public, it was NVIDIA Inference Microservices. Now, they call it NVIDIA NIM, so it's not an acronym anymore! 😃 (Sharing some insider information here, as I started working on this video before NIM was officially announced 😜)
@antor444 күн бұрын
Hi Mariya, I think you are wrong in criticizing DeepSeek or the news hype around it. Apart from that, your video is very good and funny as always. Focusing solely on the large R1 model, in addition to being free on the DeepSeek website, the cost per million tokens for the paid API is much lower than what OpenAI charges. These models also have open licenses, no royalties for use, and published documentation on how they work. All of this has been achieved in a model that offers results that compete with the best models from multinational companies. Additionally, there are quantization models of the R1 model that can be run on GPUs with over 128 GB of VRAM. It is worth noting that ARM computers share all RAM with the GPU, and in some new processor models or x86 APUs, the integrated GPU can use large amounts of system RAM. Apple computers based on ARM processors have the great advantage that both the GPU and the ANE chip accelerator for AI share RAM with the CPU. These computers and laptops can optionally be purchased with many Gigabytes of RAM, up to 128 GB for laptops and up to 256 GB for the most powerful computers. This means that free versions of the largest AI models could be trained or run on these devices, which in certain scenarios may be a better option than the most powerful Nvidia enterprise graphics cards, such as the Nvidia A100 and H100 with a maximum of 80 GB of VRAM. As for the versions of the smaller models for common users, I am not sure if they are better than others of similar sizes. In any case, a slightly larger model can make a big difference in the quality of responses. Normal users can run very good quantization models on GPUs like the 4090, although nothing compares to the large model DeepSeek R1.
@dixalex024 күн бұрын
Didn't watch the video, but I agree with your sentiment. The AI big players tried to force the AI revolution into a paid-subscription-for-life model. You can see that it was solely profit driven with how ChatGPT acted: starting off as a non-profit, deceptive naming conventions, obfuscation of source training material, obfuscation of the algorithm, a $200/month price tag for their optimal model. Devin too, at $500/month, tried to abuse the hype train. I'm all for smaller players coming on the scene to shake up the meta. I reject the hype train and media wave that is currently going on. It unnecessarily politicizes this new technology when in the grand scheme of things it's just a revolution/advent similar to Google/Netflix/the internet/browsers: profound enough to make positive change for society, but no need to go so money hungry (let it grow and adapt organically); if there is a monetization method, it will come forth. Not everyone has to be forced through a capitalism lens.
@PythonSimplified3 күн бұрын
You know I'm not a big fan of OpenAI! (gently speaking! I think I even got in trouble for it last year 😅) Just to clarify though, my criticism was not towards Deepseek, but rather the media! (It was 1:30 hours in, stream-wise, so most folks didn't even see it!) Instead of focusing on the fact that they open sourced it and improved the training efficiency, the media focused on falsely claiming that Deepseek does not require extensive GPU processing power. However, it does! Based on my analysis of the distilled models, it looks like inference requires MORE processing power than their non-distilled equivalents! 1.5B Qwen responds much faster than Deepseek 1.5B Qwen! And it makes sense, because producing the thinking tokens in addition to the response itself is very costly! The fact that it trains faster is fantastic! I love it! But training is not the only process when it comes to deep learning! So in my humble opinion, there has been a crazy misrepresentation of facts, and I don't blame it on Deepseek! Blame it on the media. I think the model is pretty cool, I just don't trust Deepseek enough to use their API or apps. I prefer my nice and cozy Docker container, even if it means that I can only run the distilled versions 😜
@PythonSimplified3 күн бұрын
@dixalex02 There are already plenty of open source players in the field! They may not be as sophisticated, but they're good enough! (I even showed how to use some of them on this channel 😀) From my perspective, the open source nature is an incredible advantage! But the revolution is more in terms of training efficiency and the detailed responses that include the logical process the model goes through to obtain an answer. But like everything in life, there are benefits and downsides! And that's exactly what I show in this stream! 🙂 I am 100% against ChatGPT subscription-style services, and I never supported it or gave it a platform. I've been vocal about my issues with them right from the get-go... so I'm not even sure why you guys bring them up in my context 😅😅😅
@antor443 күн бұрын
@PythonSimplified Generally, media news about technology has limited quality, at least in my country, Spain. However, in the case of DeepSeek, it is not so untrustworthy. Some comparisons suggest that even the small DeepSeek models, the so-called distilled models based on Qwen and Llama, are very good, maybe except for coding. Distrust of China is unjustified when the models are open-source and can be run locally or by any company. At most, they may be criticized for avoiding or censoring issues that politically affect China, such as the Tiananmen Square incident. In any case, anyone can retrain or fine-tune these models with their own data.
@jonconnor66973 күн бұрын
@antor44 You don't get to define what is right or wrong. Criticizing the hype was a necessity, especially since no one has seen the "open source" training method that was used!!
@cdmsvt4 күн бұрын
Will you marry me? ❤
@Василий-й1у4 күн бұрын
This is pure schizo-rambling 🤣🤣 Your question:hello what is the biggest fish on the earth? Okay, so I need to figure out what the biggest fish on the Earth is. I remember hearing about it somewhere, maybe in a school biology class or something. Let me think about this step by step. First, I know that big fish are often in the ocean because they live there. But wait, some fish are in the deep ocean, and others might be in the sea, which is the part above water. So maybe the biggest fish is in the deep ocean. I think there's a specific name for the largest fish. I've heard it referred to as the "Great White Shark," but I'm not sure if that's a fish or a shark. Let me check my memory. Oh yeah, the Great White Shark is a type of shark, not a fish. So the question is about the fish, not the shark. So, the next step is to figure out which fish is the biggest. I recall that the Blue whale is the largest mammal on Earth, but I'm not sure if there's a bigger fish. I think the Blue whale is a mammal, so maybe there's a fish that's bigger than the Blue whale? But that doesn't make sense because the Blue whale is already the largest mammal. Wait, maybe it's a different type of fish. I remember hearing about the Great White Shark, but again, that's a shark. So, perhaps the largest fish isn't a shark but something else. I'm trying to remember if there's a fish bigger than the Great White Shark. I think the Great White Shark is the largest shark, but maybe there's a larger fish. Let me think about other fish. There's the Blue Marlin, which is a fish, but I don't think it's the largest. The Blue whale is bigger, but that's a mammal. Wait, maybe there's a fish that's bigger than the Blue whale. I think I've heard that the Great White Shark is the largest fish, but I'm not sure. Let me try to visualize. The Great White Shark is a shark, so it's an animal with a lot of teeth and fins. It's known for being a predator. But is there a fish that can be bigger than the Great White Shark? I'm not sure. Maybe the Great White Shark is the largest fish, but I'm not certain. I should double-check this information. I think the Great White Shark is the largest fish, but I'm not entirely sure. Maybe there's another fish that's bigger. Let me try to recall. There's the Blue whale, but that's a mammal. The Great White Shark is a shark, so it's an animal, not a fish. So, the largest fish would be the Great White Shark, but that's a shark, not a fish. Wait, but the question is about the biggest fish on the Earth. So, perhaps the Great White Shark is the largest fish, but it's a shark. So, maybe there's another fish that's bigger. I'm not sure. I think I've heard that the Great White Shark is the largest fish, but I'm not certain. I should probably look this up to confirm. But since I'm trying to figure it out, I'll go with what I remember. The Great White Shark is the largest fish on Earth, but it's a shark. So, maybe the answer is the Great White Shark, but it's a shark, not a fish. Wait, the question specifically asks for the biggest fish, so maybe it's the Great White Shark, but it's a shark. So, perhaps the answer is that there isn't a biggest fish on the Earth because the Great White Shark is a shark, and there are other larger creatures, but that doesn't make sense. I'm getting confused here. Let me try to break it down. Fish are smaller than sharks, so the Great White Shark is a larger animal than a fish. So, the biggest fish would be the largest fish, which I think is the Blue whale, but that's a mammal, not a fish.
Wait, no, the Blue whale is a mammal, so the largest fish would be the largest fish, which is the Great White Shark, but that's a shark. So, maybe the answer is that there isn't a biggest fish on the Earth because the Great White Shark is a shark, and there are other larger creatures, but that doesn't make sense. I'm not sure. I think I need to look up the answer to confirm. But since I'm trying to figure it out, I'll go with what I remember. The Great White Shark is the largest fish, but it's a shark. So, perhaps the answer is that the Great White Shark is the largest fish on Earth, but it's a shark, not a fish. Wait, but the question is about the biggest fish, so maybe it's the Great White Shark, but it's a shark. So, perhaps the answer is that the Great White Shark is the largest fish on Earth, but it's a shark. So, maybe the answer is that there isn't a biggest fish on the Earth because the Great White Shark is a shark, and there are other larger creatures, but that doesn't make sense. I'm getting stuck here. I think I'll have to conclude that the Great White Shark is the largest fish on Earth, but it's a shark, not a fish. So, the answer is that the Great White Shark is the largest fish on Earth, but it's a shark, so there isn't a biggest fish on the Earth because the Great White Shark is a shark, not a fish. The Great White Shark is the largest fish on Earth, but it is classified as a shark, not a fish. Therefore, there isn't a biggest fish on the Earth because the Great White Shark is a shark, and the largest fish, if it were a mammal, would be the Blue Whale. However, since the question specifically asks for a fish, the answer is that there isn't a biggest fish on the Earth because the Great White Shark is a shark, not a fish.
@Василий-й1у4 күн бұрын
And for some reason it says it studied at school.
@besthobbit4 күн бұрын
Ty for this. The IBM Data Analyst course on Coursera did an absolute shit job of it. You made it so easy to understand. Helps to have a real human speaking too, instead of a robot.
@anabelberumen4 күн бұрын
Ok sweetie, I'm going to the gym right now, but when I come back I'll do this.
@harisjaved13794 күн бұрын
Hehe nice! If you are nefarious you can still have Docker access your local files lol. Create a temp directory in your Dockerfile, use the Streamlit file selector lol. I just did this today for a project at work.
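For anyone curious what that trick looks like, a minimal sketch of a Streamlit app running inside the container that copies a user-selected host file into a temp directory; the directory name is a placeholder:

```python
# Hedged sketch: a tiny Streamlit app running inside the container.
# The browser runs on the host, so st.file_uploader effectively copies
# a host file into the container's filesystem.
import os
import tempfile

import streamlit as st

UPLOAD_DIR = os.path.join(tempfile.gettempdir(), "incoming")  # e.g. /tmp/incoming
os.makedirs(UPLOAD_DIR, exist_ok=True)

uploaded = st.file_uploader("Pick a file from your machine")
if uploaded is not None:
    target = os.path.join(UPLOAD_DIR, uploaded.name)
    with open(target, "wb") as f:
        f.write(uploaded.getbuffer())                          # write the bytes into the container
    st.success(f"Saved to {target} inside the container")
```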
@Phil-W4 күн бұрын
If you are worried about your AI escaping the container you need to be running Podman, not Docker.
@qooldeluxx4 күн бұрын
i like ur other mic better
@sephiroth78184 күн бұрын
@3:00 calling collect used to get expensive @4:30 looks like it still does
@deepak50744 күн бұрын
Awesome, Maria!
@deavea4 күн бұрын
A week! A freaking week wasted on trying to understand how it works! Thank you!
@PythonSimplified4 күн бұрын
I had a similar experience when I learned it the first time!!!! 😅 I just took a pen and paper, and traced a tiny input through the entire algorithm - and that's how this tutorial was born! 🙂
@abnlog1044 күн бұрын
Hi. Please tell me the name of this microphone, and its model name?