YOUR CODE! AT SCALE! Amazon SageMaker Script Mode

  10,866 views

mikegchambers

A day ago

Comments: 34
@davidk7212 · a month ago
Extremely helpful, thank you.
@KellyDiversified · 2 years ago
Outstanding, Mike - thank you!
@mikegchambers · 2 years ago
Glad you enjoyed it!
@goodwilrv · a month ago
Hi Mike, how will this endpoint be used by other applications outside AWS? A production mobile application, for example?
@mikegchambers · 16 days ago
For that architecture you would want to place it behind an API. Typically you would use the API Gateway and a Lambda function.
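A minimal sketch of what the Lambda side of that pattern could look like (the endpoint name, the CSV payload format, and the API Gateway proxy-integration event shape are assumptions, not something shown in the video):

```python
# Hypothetical Lambda handler that forwards an API Gateway request
# to a SageMaker endpoint. Endpoint name and CSV content type are
# placeholders for illustration.
import json
import boto3

runtime = boto3.client("sagemaker-runtime")

def lambda_handler(event, context):
    # With API Gateway proxy integration, the request body arrives as a string
    payload = event["body"]

    response = runtime.invoke_endpoint(
        EndpointName="my-script-mode-endpoint",  # hypothetical endpoint name
        ContentType="text/csv",
        Body=payload,
    )

    result = response["Body"].read().decode("utf-8")
    return {"statusCode": 200, "body": json.dumps({"prediction": result})}
```

API Gateway would then expose this function over HTTPS, so the mobile app never calls SageMaker directly.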
@AIwithMahmudul · a year ago
Very good one, straight to the point. Thanks for making it
@satishb9975 · 9 months ago
Thank you very much, this is really awesome.
@saivinilpratap9208 · 2 years ago
Thanks for the video, Mike! It was insightful and really helpful. It would be even better if you could take a model that was built locally, convert it into a SageMaker-compatible Python script, and record the process as you go.
@mikegchambers · 2 years ago
Yeah, you can do that: hosting models for inference only. I’ll keep it in mind for future videos.
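For context, hosting an already-trained model for inference only looks roughly like this (a hedged sketch: the S3 path, role ARN, and framework version are placeholders, and it assumes the model was saved with joblib and packaged as model.tar.gz):

```python
# Hypothetical sketch: hosting a locally trained scikit-learn model for
# inference only. Assumes the model was saved with joblib, packaged as
# model.tar.gz, and uploaded to S3. All names are placeholders.
from sagemaker.sklearn.model import SKLearnModel

model = SKLearnModel(
    model_data="s3://my-bucket/models/model.tar.gz",          # placeholder path
    role="arn:aws:iam::123456789012:role/SageMakerRole",      # placeholder role
    entry_point="inference.py",   # provides model_fn() to load the model
    framework_version="1.2-1",
)

predictor = model.deploy(initial_instance_count=1, instance_type="ml.m5.large")
```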
@DeepakSingh-ji3zo · 2 years ago
@mikegchambers Excellent content! I was lost in the Abalone code and you saved me, thanks again. Can you please let me know how I can use this endpoint to create an inference pipeline (independent of this notebook)?
@hariscolic5215 · 2 years ago
Nice video! I'd be interested to see how to deploy an endpoint with a custom inference script for the input and output handlers, if you've got time on your hands!
@mikegchambers · 2 years ago
Sounds like a plan! :)
@kscolina · 2 years ago
@mikegchambers Yes, please.
@mikegchambers · 2 years ago
@kscolina So I’ve been working on a demo project. Can I confirm what is wanted in terms of input and output handlers? Are we talking about a pipeline model with data pre-processing?
@kscolina · 2 years ago
@mikegchambers That I'm not sure of yet. By the way, I raised a question in a separate reply. :)
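The thread never settled on a concrete example, but for the SageMaker framework containers the input and output handlers are usually plain functions defined in the inference script. If the video's scikit-learn setup is the target, the hooks are typically input_fn/output_fn (the TensorFlow Serving container uses input_handler/output_handler instead). A rough sketch, assuming JSON requests and a model saved with joblib; the payload shape is an assumption:

```python
# Hypothetical inference.py sketch for the SKLearn container, showing the
# handler hooks discussed in this thread. JSON in/out is an assumption.
import os
import json
import joblib
import numpy as np

def model_fn(model_dir):
    # Load the model that the training job saved into model_dir
    return joblib.load(os.path.join(model_dir, "model.joblib"))

def input_fn(request_body, request_content_type):
    # Deserialize the incoming request into a NumPy array
    if request_content_type == "application/json":
        return np.array(json.loads(request_body)["instances"])
    raise ValueError(f"Unsupported content type: {request_content_type}")

def predict_fn(input_data, model):
    return model.predict(input_data)

def output_fn(prediction, accept):
    # Serialize the prediction back to the caller
    return json.dumps({"predictions": prediction.tolist()})
```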
@of7104 · 7 months ago
@mikegchambers Great video - just wondering if there would be any additional adaptations to make if you were training a deep learning model with e.g. PyTorch/fastai?
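There's no reply in the thread, but the script-mode pattern largely carries over: the training script stays an ordinary PyTorch script and the notebook swaps the estimator class. A hedged sketch (versions, role, instance type, and S3 path are placeholders):

```python
# Hypothetical sketch: the same script-mode pattern with the PyTorch
# estimator instead of SKLearn. All values are placeholders.
from sagemaker.pytorch import PyTorch

estimator = PyTorch(
    entry_point="train.py",        # your PyTorch training script
    role="arn:aws:iam::123456789012:role/SageMakerRole",  # placeholder role
    framework_version="2.1",       # placeholder PyTorch version
    py_version="py310",
    instance_count=1,
    instance_type="ml.g4dn.xlarge",  # GPU instance for deep learning
)

estimator.fit({"train": "s3://my-bucket/train/"})  # placeholder S3 path
```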
@maryamr.aliabadi6101 · a year ago
Thanks for your great video. I assume that the code in this Jupyter notebook cannot all run on a local host and that we need to run it on a SageMaker notebook. Is that right? And where should the Python script be located?
@KhaledAbdelfattah-f1s · 8 months ago
In Canvas, is a Python training script (including the algorithm used to train the model) created that I can download, other than the model notebook?
@KhaledAbdelfattah-f1s · 8 months ago
Does this endpoint need API Gateway, and maybe Lambda, so I can run inference from outside the AWS world?
@karangupta_DE · 8 months ago
Great video. SM_MODEL_DIR, SM_CHANNEL_TRAIN, and SM_CHANNEL_TEST - do these have default /opt/ml locations defined in the SageMaker Python SDK? And once we pass the S3 bucket location, is the data from the S3 bucket automatically pulled into the SageMaker training job containers?
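As far as I understand it, yes on both counts: inside the managed containers SM_MODEL_DIR points at /opt/ml/model and each SM_CHANNEL_* variable points at /opt/ml/input/data/<channel>, and the data passed to fit() is copied into those channel directories before the script starts. A sketch of how a training script typically picks them up:

```python
# Hypothetical top of a script-mode training script, showing how the
# environment variables map to the /opt/ml paths inside the container.
import argparse
import os

if __name__ == "__main__":
    parser = argparse.ArgumentParser()
    # Defaults come from the variables SageMaker sets in the container,
    # which point at /opt/ml/model and /opt/ml/input/data/<channel>.
    parser.add_argument("--model-dir", default=os.environ.get("SM_MODEL_DIR"))
    parser.add_argument("--train", default=os.environ.get("SM_CHANNEL_TRAIN"))
    parser.add_argument("--test", default=os.environ.get("SM_CHANNEL_TEST"))
    args = parser.parse_args()

    # By the time the script runs, the S3 data passed to estimator.fit()
    # has already been copied into args.train and args.test.
```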
@sndrstpnv8419 · a year ago
How do you use VS Code: 1) locally on your own computer, with the computation running remotely on AWS SageMaker - and if so, how did you connect to SageMaker? Or 2) VS Code running on AWS - and if so, how did you set it up?
@keerthang5557 · a year ago
Hello Mike, I need additional Python modules which I wish to place in a requirements.txt file - will that be picked up by the SKLearn container so the modules get installed?
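To the best of my knowledge, yes: if the entry point is passed together with a source_dir, the framework containers install a requirements.txt found in that directory when the job starts. A sketch with placeholder names:

```python
# Hypothetical layout and estimator call. When the script is passed via
# source_dir, the framework container installs requirements.txt on startup.
#
# src/
# ├── train.py
# └── requirements.txt   (extra modules, e.g. "imbalanced-learn")
from sagemaker.sklearn.estimator import SKLearn

estimator = SKLearn(
    entry_point="train.py",
    source_dir="src",              # requirements.txt lives here
    role="arn:aws:iam::123456789012:role/SageMakerRole",  # placeholder role
    framework_version="1.2-1",
    instance_type="ml.m5.large",
)
```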
@Kmysiak1 · a year ago
This is awesome! I've been struggling with the SageMaker SDK, and this lets me use pure Python and its open-source ML packages on top of the compute resources of AWS. I like to separate my ML pipeline into dedicated notebooks (i.e. processing, training, tuning, etc.). Can we use multiple scripts? You've earned my subscription.
@mehuljan26 · a year ago
Great content! How do you think we can perform distributed GPU training with PyTorch/TensorFlow in script mode?
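One possible direction, offered as an assumption rather than anything covered in the video: recent versions of the SageMaker Python SDK accept a distribution argument on the framework estimators, so a distributed run builds on the PyTorch estimator shape shown earlier, roughly like this:

```python
# Hypothetical sketch: adding distributed training to a PyTorch estimator.
# The distribution argument, versions, and instance types are assumptions
# based on the SageMaker Python SDK, not values from the video.
from sagemaker.pytorch import PyTorch

estimator = PyTorch(
    entry_point="train_ddp.py",    # a script written for torch.distributed
    role="arn:aws:iam::123456789012:role/SageMakerRole",  # placeholder role
    framework_version="2.1",
    py_version="py310",
    instance_count=2,                        # two GPU nodes
    instance_type="ml.g5.12xlarge",          # multi-GPU instances
    distribution={"torch_distributed": {"enabled": True}},  # launch via torchrun
)
```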
@u123123123123123 · 2 years ago
Great! As an MLOps engineer trying to persuade data scientists to use SageMaker, I found this useful. Basically, they can use the same notebook to pass different hyperparameters and data to generate different training jobs - am I right?
@mikegchambers · 2 years ago
Absolutely. You can use notebooks as you normally would, use the SageMaker SDK to run training jobs at scale, and use many more tools like AutoML, Data Wrangler, and so on. So much power, yet in a familiar interface.
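A minimal sketch of that "same notebook, many jobs" idea, using the SKLearn estimator as an example with placeholder names and paths (the hyperparameters dict is passed to the script as command-line arguments):

```python
# Hypothetical sketch: each loop iteration launches a separate managed
# training job with different hyperparameters. All names are placeholders.
from sagemaker.sklearn.estimator import SKLearn

for max_depth in [3, 5, 10]:
    estimator = SKLearn(
        entry_point="train.py",
        role="arn:aws:iam::123456789012:role/SageMakerRole",  # placeholder role
        framework_version="1.2-1",
        instance_type="ml.m5.large",
        hyperparameters={"max-depth": max_depth},  # arrives as --max-depth in the script
    )
    estimator.fit({"train": "s3://my-bucket/train/"}, wait=False)  # jobs run in parallel
```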
@Kmysiak1 · a year ago
@mikegchambers Do you mind doing a video on AutoML, Data Wrangler, etc.? I swear you could explain ML to my grandmother lol
@mikegchambers · a year ago
@Kmysiak1 On the way!
@GS-gi9bc · a year ago
Hi Mike. Very informative presentation. I need to create a model trained only on mainframe code artifacts (COBOL, JCL, DB2, etc.). Over the first 4 months of this year I built a full set of GPT prompts, scripts and templates that generate all variations of full program code for my industry. The biggest drawback preventing companies from adopting the LLM approach is that public models don't give them secure protection of their code and data. If someone could guide me on how to create a locally hosted model that can be interrogated by language, token or template, we could make a lot of money. The model doesn't need to be trained for email replies, Excel formulas, document summaries, etc. It needs to absorb our entire code base and add it to any working model that has some level of intelligent COBOL/mainframe code-generating prowess. Is there some way to co-opt the ChatGPT-4 code base for COBOL, SQL and JCL and add it to our code base on a local machine? I think in 2 years this will be the standard method of project development. Some companies may soon be overrun by those willing to be the first movers in this arena.
@hejden · 2 years ago
It seems unnecessarily complicated that SageMaker demands we put the training code in a separate script; it would be easier if we could just put it into the notebook with everything else. It also makes it difficult to monitor and debug the actual training script when it's implemented separately and run as a monolith. Why is this required, and could you just as well put everything in the notebook?
@mikegchambers · 2 years ago
Hey. I hear your frustration, and I have some thoughts here. This method is all about running the code at scale, and specifically not running it inside the notebook itself. In other words, we are using the notebook not for ML code, but to orchestrate other infrastructure to run our ML code. So when it comes to debugging, you would want to do most of that debugging earlier in the process, with the ML code (probably in a notebook somewhere), and once you're happy with it, move to the method described here. As for debugging the 'at-scale' production deployment, there are ways to do this that I didn't cover in this video, but I think I should in a future video. I hope that helps put things into perspective. I appreciate you raising the point, and I'll see how I can clarify it in the future.
@hejden · 2 years ago
@mikegchambers Thank you for the response :) When you say "in a notebook somewhere", are you talking about somewhere in AWS SageMaker? I would like to use SageMaker both for development (running on smaller datasets, checking that the model is correctly set up, monitoring convergence, etc.) and then later maybe for large-scale training. Where do I turn for the former?
@mikegchambers · 2 years ago
@hejden Yes, absolutely. You can spin up a notebook in SageMaker Notebooks, SageMaker Studio, or even SageMaker Studio Lab (for free) and run the ML code in the notebook 'locally' (local to the notebook server). When you're happy, you can 'migrate' the code to production scale as shown here. It's basically the setup I run through in the video: I show how the ML code works in the notebook, then get it working in the container using SageMaker. Maybe what I could clarify is that in this video I use the same notebook to explore the solution and then get it working in SageMaker managed containers. There is no need to have both of these steps in the same notebook, and in many real-world scenarios you probably wouldn't. Steps:
- Get your ML code working. This could be done on your own machine, or on a notebook server like SageMaker Notebooks, Studio, or Studio Lab. The code should include methods to load data, and to serialise and deserialise the model for storage.
- Transfer the code into a .py file with the necessary function hooks that SageMaker will be looking for across the lifecycle of the ML tasks (load data, save model, etc.).
- Create some SageMaker code to orchestrate getting your .py file into a managed SageMaker container (see the sketch after this reply). This code can run anywhere you have access to the AWS SDKs: your own machine, an EC2 instance, a SageMaker Notebook or Studio (but probably not Studio Lab at this time).
- Run your orchestration code and SageMaker will handle the rest.
As a preference, I run all code in SageMaker when I can. I don't like local development and dealing with dependencies etc., and it sounds like this is your preference too. Make sense? (I'm typing this on my phone, fingers crossed there are not too many typos!)
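For reference, a rough sketch of that orchestration step (the role ARN, S3 paths, versions, and instance types are placeholders, not values from the video):

```python
# Hypothetical orchestration code: hand the .py script to a managed
# SKLearn container, train on S3 data, then deploy an endpoint.
# Role, paths, versions, and instance types are placeholders.
from sagemaker.sklearn.estimator import SKLearn

estimator = SKLearn(
    entry_point="train.py",   # the script with the SageMaker function hooks
    role="arn:aws:iam::123456789012:role/SageMakerRole",
    framework_version="1.2-1",
    instance_type="ml.m5.large",
)

# Launch the managed training job; SageMaker copies the S3 data into the container
estimator.fit({"train": "s3://my-bucket/train/", "test": "s3://my-bucket/test/"})

# Host the trained model behind a real-time endpoint
predictor = estimator.deploy(initial_instance_count=1, instance_type="ml.m5.large")
```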