Why I often use AWS Lambda and serverless architecture

  13,102 views

Web Dev Cody

1 year ago

📘 T3 Stack Tutorial: 1017897100294.gumroad.com/l/j...
🤖 SaaS I'm Building: www.icongeneratorai.com/
💬 Discord: / discord
🔔 Newsletter: newsletter.webdevcody.com/
📁 GitHub: github.com/webdevcody
📺 Twitch: / webdevcody
🤖 Website: webdevcody.com
🐦 Twitter: / webdevcody

Comments: 62
@vytasgavelis
@vytasgavelis 2 months ago
Would be keen to see the complete production setup you're running. There are so many ways to deploy Lambdas: SST, CDK on its own, Terraform, etc. All of these have different drawbacks, especially around the local development experience, and it's hard to find resources online about what real companies are actually doing.
@INKILU
@INKILU 1 year ago
Enjoying the aws vids 👍🏼
@w9914420
@w9914420 1 year ago
Hi Cody, many thanks for your insights into Lambda. It would be cool to see an example of how you would use Puppeteer to generate PDFs from HTML (with CSS styling) 😊
@baracka448
@baracka448 1 year ago
8:30 Lambda deployed as a Docker image supports up to 10GB, so that would help with the 250MB limit.
@SeibertSwirl
@SeibertSwirl 1 year ago
Good job babe!!!! I’m finally first again! But most of all I’m so proud of you and all this work you’ve done ❤
@WebDevCody
@WebDevCody 1 year ago
Awww thanks babe!
@TwenTV
@TwenTV 1 year ago
You can also get around the 250MB constraint by wrapping your Lambda in a Docker image and creating your Lambda from an ECR image instead. It requires a bit more on the development side with versioning, but it allows up to 10GB of dependencies.
@WebDevCody
@WebDevCody 1 year ago
Any clue if that slows down cold starts?
@constponf7403
@constponf7403 1 year ago
@@WebDevCody Not too much, since AWS provides specific Docker base images. I'm currently deploying every Lambda as a Docker image because it simplifies the deployment process.
@uome2k7
@uome2k7 1 year ago
Why would you want to do that? Lambdas should be short-running (there's a time limit on how long they can run) and stateless. Anything that needs more than 250MB throws up a lot of red flags because it sounds like it's doing way too much. Wrapping them in a Docker image means you have to add the registry lookup, download the image and all its dependencies, and then start that image up, even if it's Docker running inside Docker. Being able to run a Lambda inside Docker for local testing makes sense, but I think keeping it within the base restrictions should be the goal. Remember you're paying for compute cycles and memory on AWS, so you want to be as small and as fast as you can get.
@TwenTV
@TwenTV 1 year ago
@@uome2k7 It's 250MB for dependencies. There are many cases where you have a quick and useful Lambda, but the library requirements just surpass the 250MB allowed in layers :)
@WebDevCody
@WebDevCody 1 year ago
@@uome2k7 There are many use cases. For example, I need a Lambda to generate a PDF for my users. In order to generate a PDF you need a bunch of binaries which use a LOT of space: generating the PDF takes maybe 1 second, but a Chromium binary can take like 80MB by itself, not to mention all your required node_modules. I'd also say using Docker as a way to standardize your Lambdas is actually very useful. I've wasted hours debugging issues that work locally but fail on Lambda because of incorrect setup of the binary paths, etc.
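For anyone wondering what that kind of PDF Lambda looks like, here is a rough sketch assuming the commonly used chrome-aws-lambda + puppeteer-core pairing. The event shape and the `html` field are made up for illustration, and this isn't necessarily the exact setup from the video:

```ts
import chromium from "chrome-aws-lambda";
import type { APIGatewayProxyHandler } from "aws-lambda";

// chrome-aws-lambda bundles a Lambda-compatible Chromium and re-exports
// puppeteer-core, which is exactly the kind of heavyweight dependency that
// eats into the 250MB deployment limit being discussed above.
export const handler: APIGatewayProxyHandler = async (event) => {
  const { html } = JSON.parse(event.body ?? "{}"); // hypothetical payload shape

  const browser = await chromium.puppeteer.launch({
    args: chromium.args,
    defaultViewport: chromium.defaultViewport,
    executablePath: await chromium.executablePath,
    headless: chromium.headless,
  });

  try {
    const page = await browser.newPage();
    await page.setContent(html, { waitUntil: "networkidle0" });
    const pdf = await page.pdf({ format: "A4", printBackground: true });

    return {
      statusCode: 200,
      headers: { "Content-Type": "application/pdf" },
      body: pdf.toString("base64"),
      isBase64Encoded: true,
    };
  } finally {
    await browser.close();
  }
};
```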
@Pyrospower
@Pyrospower 1 year ago
thanks for the interesting video!
@B1TCH35K1LL3R
@B1TCH35K1LL3R 1 year ago
Serverless works fine when there are budget limitations, but another good approach might be to have your Express/Nest API web server as a container and deploy it using a management service such as ECS, or even better, Kubernetes. However, as I mentioned, that won't apply to every project (especially if there are budget restrictions).
@habong17359
@habong17359 1 year ago
If you get a chance, can you walk through your CI/CD for Lambda functions? Great video btw!
@driden1987
@driden1987 1 year ago
I've been using Serverless Stack (SST) lately, it's pretty awesome
@male3399
@male3399 4 months ago
Do you use lambda with SST?
@driden1987
@driden1987 4 months ago
@@male3399 Yes
@jatinhemnani1029
@jatinhemnani1029 1 year ago
We can get an API URL without API Gateway, right? Directly using the function URL.
@corygrewohl8180
@corygrewohl8180 1 year ago
So one thing I've wondered about for a while: does Lambda/serverless architecture replace a normal backend built in Express, Django, etc.? I'm beginning to work on a web portal for an app startup idea, and I'm generally just confused about what the best way to build a backend is. In the past I've used Lambda for a lot, but I'm wondering if building an Express API is better, or if companies generally use both. I guess I'm just confused on where each comes into play. Thanks for the help!
@WebDevCody
@WebDevCody 1 year ago
I typically have a single Express app that I wrap and deploy to a single Lambda function. There is a library called aws-serverless-express you can use to wrap your Express app and deploy it to a Lambda; it's super easy to use. You can also use existing tools such as the Serverless Framework to host an Express app directly on Lambda. Honestly, deploying a Django app to Heroku or some other container host works perfectly fine; I'm just a big fan of only paying for what you use. Some people deploy a separate Lambda for each endpoint, but that causes deployments to take forever.
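For reference, here is a minimal sketch of that wrapping pattern using aws-serverless-express; the route and middleware are illustrative placeholders, not code from the video:

```ts
import express from "express";
import { createServer, proxy } from "aws-serverless-express";
import type { APIGatewayProxyEvent, Context } from "aws-lambda";

const app = express();
app.use(express.json());

// Ordinary Express routes; the whole API lives behind this one Lambda.
app.get("/health", (_req, res) => res.json({ ok: true }));

// Created once per container and reused across warm invocations.
const server = createServer(app);

export const handler = (event: APIGatewayProxyEvent, context: Context) =>
  proxy(server, event, context);
```

Pointing a greedy API Gateway route (e.g. ANY /{proxy+}) at this single handler is what keeps the whole API as one deployable unit.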
@corygrewohl8180
@corygrewohl8180 1 year ago
@@WebDevCody Oh, that's actually really cool that you can wrap it and still reap the benefits of Lambda. I guess I'm just wondering what the benefit of deploying it to a server is then.
@saman6199
@saman6199 1 year ago
Hey Cody, would it be possible to take a very small Express app and run it on Lambda to show us how to configure it? It would be appreciated.
@cringelord511
@cringelord511 1 year ago
Are they similar to Azure Functions?
1 year ago
What about when you have your Lambda backend connected to a database and there is a spike in the number of users (number of concurrent Lambda executions at the same time) but your DB has a maximum number of connections? How do you handle that?
@uome2k7
@uome2k7 1 year ago
You have to scale everything based on expected demand, ideally with some extra margin, but none of these should have unlimited growth allowances because nobody has unlimited pockets.
@WebDevCody
@WebDevCody 1 year ago
You'd either need to scale or configure your DB to accept more concurrent connections, or you need to put your database behind a connection-pooling service or gateway that limits how many connections can be made, with your Lambdas invoking it via REST. Lambdas do stay warm, so you can potentially keep a database connection open between requests as long as you put the connection outside of the handler's scope at the module level. But like Joe mentioned, there is no silver bullet; you need to re-analyze your system when you hit certain levels of scale. If you choose to just use PlanetScale from the start, you'll more than likely get that scaling in the future, for a price tag.
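A sketch of that warm-connection trick, assuming node-postgres; the pool options and the DATABASE_URL env var are illustrative, not Cody's actual setup:

```ts
import { Pool } from "pg";

// Created at module scope, outside the handler, so warm invocations of the
// same container reuse this connection instead of opening a new one per request.
// max: 1 caps each concurrent container at a single DB connection.
const pool = new Pool({
  connectionString: process.env.DATABASE_URL, // assumed env var
  max: 1,
});

export const handler = async () => {
  const { rows } = await pool.query("select now() as server_time");
  return { statusCode: 200, body: JSON.stringify(rows[0]) };
};
```

Under a real spike each concurrent container still holds its own connection, which is why the RDS Proxy and stateless-protocol suggestions in the replies below are the longer-term fix.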
@yuhanna_kapali
@yuhanna_kapali 1 year ago
Using RDS Proxy to connect to the Lambda will help with the number of connections.
@moodyhamoudi
@moodyhamoudi 10 months ago
HTTP or WebSocket based connection models support serverless by design. There's no reason to try to finagle a stateful connection model into working with a stateless compute platform; you'd only be band-aiding a gunshot wound, and you will hit a wall if you intend on going past ~100 concurrent users. Firestore is probably the most well-established solution in the category, but there are a few reasonable providers for both SQL and NoSQL, like AWS Aurora, PlanetScale (not actually stateless, but it can handle a ton of connections), DynamoDB, and MongoDB Atlas (only if you're using the new Data API), to name a few. Sorry for venting; this exact issue nearly ended me.
@rustystrings0908
@rustystrings0908 1 year ago
Can you show your setup for how you deploy this stuff to Lambda specifically? Do you use the AWS SDK or are you writing shell scripts to do it?
@WebDevCody
@WebDevCody 1 year ago
Usually I'd use the Serverless Framework. I can try making a video on it if I get time.
@chris94kennedy
@chris94kennedy 1 year ago
@@WebDevCody I'd also like to see that Cody. Thanks as always
@johndebord7802
@johndebord7802 1 year ago
Is it best practice to just have one Lambda function for your application? Or one Lambda function per DynamoDB table? Or multiple Lambda functions for multiple REST-like requests? Kind of confused about this.
@WebDevCody
@WebDevCody 1 year ago
I typically do one Lambda for my entire API. You can deploy each Lambda separately, but it'll take a while to deploy.
@johndebord7802
@johndebord7802 1 year ago
@@WebDevCody I suppose having one Lambda would be best because it would minimize the possibility of cold starts as well. But I'm almost sure that cold starts will be a relic of the past someday.
@WebDevCody
@WebDevCody 1 year ago
@@johndebord7802 Yeah, much higher chance stuff will be warm, but it does mean every endpoint will need to be configured exactly the same. So if one endpoint requires more memory, you need to increase it for all endpoints. If one endpoint needs a higher timeout, you need to increase it for all. It's just trade-offs.
@digitnomad
@digitnomad 1 year ago
Hi Cody, when AppSync alone can do the whole job, why bother with API Gateway + Lambda?
@WebDevCody
@WebDevCody 1 year ago
I don't use GraphQL.
@xorlop
@xorlop 1 year ago
I am very interested to hear your thoughts on CF Workers. You can even create them programmatically on an enterprise plan. CF Workers also have environments, so you can do a test deploy. There is built-in monitoring now, too. I have some basic experience with Lambda, but not much. The CF edge also doesn't have cold starts, I think (I could be very wrong about this!). I don't know about limits, but I do know there are ways to change them to be unbound, I think. I only use it for work, so I don't know about pricing.
@WebDevCody
@WebDevCody 1 year ago
I've never worked with CF workers, so if I do maybe work with them I'll make a video
@roach_iam
@roach_iam 1 year ago
Curious, what did you guys decide to do about the PDF issue?
@WebDevCody
@WebDevCody 1 year ago
We managed to get Puppeteer working on Lambdas by making sure the deployed .zip had as little as possible in it. If we hit limits again, I think we'll need to move to running Docker containers on Lambda.
@Xmasparol
@Xmasparol 1 year ago
Lambda containerization is good in some ways. I get pip issues with psycopg2 in Python, and layers don't work. I googled, ChatGPT didn't help, and I got to the point of calling AWS support.
@dandogamer
@dandogamer 1 year ago
I've worked with serverless and AWS for 3 years and never had any fun developing on it. Debugging was always painful, documentation is poor, and you end up having to buy into a ton of other services just to have a functioning yet complicated system. Not to mention the developer experience sucks: issues always crop up on AWS but not locally, or you accidentally forget a permission and have to wait 10+ minutes each time for the build to upload. If you're a small company that needs to move fast, I would stop and consider whether it's worth slowing down your team for the sake of penny-pinching. On the other hand, if you are a larger team, I would probably adopt a platform engineering approach to help your developers so they don't have to worry about all the intricacies.
@WebDevCody
@WebDevCody 1 year ago
I agree with everything you said 😂 AWS can turn into a convoluted nightmare.
@Harish-rz4gv
@Harish-rz4gv 1 year ago
When do you start the project series?
@WebDevCody
@WebDevCody 1 year ago
Soon
@andriisukhariev
@andriisukhariev 1 year ago
Cool thanks
@michaelscofield2469
@michaelscofield2469 1 year ago
Please make a project series.
@WebDevCody
@WebDevCody 1 year ago
Starting tomorrow
@devippo
@devippo 1 year ago
I think there are many devs who love maintaining and fiddling with AWS. To be honest, I just want the app to work for the client ASAP.
@WebDevCody
@WebDevCody 1 year ago
Absolutely
@illiakhomenko6405
@illiakhomenko6405 8 months ago
Ligma-lambda is now forever in my mind😂😂😂😂
@WebDevCody
@WebDevCody 8 months ago
🤣
@Grahamaan27
@Grahamaan27 6 months ago
OK, but the competition to serverless is not EC2, it's ECS or containers. Why compare against VMs?
@rohangodha6725
@rohangodha6725 1 year ago
leaked env vars unlucky 💀
@WebDevCody
@WebDevCody 1 year ago
Nothing on that db
@freshhorizonswithjakub
@freshhorizonswithjakub 1 year ago
roll it to 69 right? I see what you did there.
@WebDevCody
@WebDevCody 1 year ago
By accident, but it works out
@Chris-se3nc
@Chris-se3nc 1 year ago
What a sticky mess and terrible local dev experience. I'll stick to Kubernetes. Multi-cloud out of the box.
@Grahamaan27
@Grahamaan27 6 months ago
Every benefit I heard is already native to ECS on AWS. Logging, auto scaling, and metrics are all supported out of the box. Lambda has so much overhead and duplicated effort; if you like being inefficient but stupidly simple, I see the appeal. But if you're running a production enterprise environment, I can't imagine you wouldn't want to save money and execution time by using autoscaled container services.