Comments: 106
@FelixYu · 3 years ago
Video on send emails in lambda: kzbin.info/www/bejne/o32QZYObqqaojcU Video on save data to a database: kzbin.info/www/bejne/i6WYdJ6Jr5uBiJI
@IRFANSAMS · 2 years ago
Can you make a video on credit card fraud detection and the entire transformation, from capturing streaming data and historical data to uploading it to S3 and making predictions in SageMaker?
@nandyalasurparaju2834 · 2 years ago
I learned a lot of real-time use cases of AWS services from your channel. Thanks for your efforts, and please keep it up, Felix.
@FelixYu · 2 years ago
Glad that u found my tutorials helpful!!
@saltdomeguy · 3 years ago
Glad I found you. I've been looking for some walk-thru projects. Keep em coming.
@totachava6235 · 2 years ago
Thank you for writing the code step by step and explaining it, it's really helpful for beginners :)
@FelixYu · 2 years ago
Glad that u found it helpful :)
@rahuldeo5813 · 3 years ago
This is a great tutorial. Previously I just had a basic idea about AWS; with this tutorial I learned the Kinesis data streaming part very easily, which will help me with my upcoming interviews. Thanks a lot :) Felix.
@FelixYu · 3 years ago
You are welcome!! Best of luck with ur upcoming interview :)
@sentinelese6717 · 2 years ago
Great stuff and awesome explanation. Thoroughly enjoyed.
One thing that would improve the content is briefly explaining each concept, as if hovering over it in VS Code where it shows a summary - e.g., say "stringing", then give a small summary of it, because not everyone will know these things.
@reeturajchatterjee · 3 years ago
You saved my Day! Thanks a lot Felix.
@FelixYu · 3 years ago
np..Glad that it helped :)
@techdesk915 · 3 years ago
Nice content Felix :) Love to see more of these projects, it helps a lot :)
@FelixYu · 3 years ago
Thank you thank you :)
@bindureddy6148 · 1 year ago
Hi Felix Yu, your videos are clean, precise, and very informative. Can you make a video on Kafka?
@Namaryop · 2 years ago
Good quality video, keep up the good work!
@FelixYu · 2 years ago
Thank you!! :)
@BlackSk8ter100 · 3 years ago
Good stuff Felix. Keep going.
@FelixYu · 3 years ago
Thank you thank you :)
@mehmetkaya4330 · 1 year ago
Thanks a lot for another great tutorial!!
@FelixYu · 1 year ago
Glad you liked it!
@AS-zw4lk · 3 years ago
Thanks for your content. It is really helpful.
@rashigupta5995 · 1 year ago
Thank you Felix !
@kristofferhaukaasberg6258 · 3 years ago
This is great! Thanks so much, Felix! I was looking for a simple tutorial that scratches the basics of all these fragments in AWS, and also in Javascript - so this was spot on. I'm building a service that posts messages to a websocket handler (Lambda function) and I want to save the information coming from the websocket (basically health data from an IoT device). From there, AWS Kinesis seems like a good idea to collect all incoming data from these devices and then consume the data into, e.g., DynamoDB. BTW: Are you open to freelance work on small AWS-based projects?
@FelixYu · 3 years ago
Thanks man!! I appreciate the offer but I don’t think I have time for freelance work atm!!
@anandchiluka7611 · 2 years ago
Nice video, thanks for the detailed explanation
@FelixYu · 2 years ago
Glad that it’s helpful!!
@Karthik_I_Am_ · 2 years ago
I'm glad that I found you; your videos really help me and many like me. My request would be AWS PySpark with big data pipelines and AWS DynamoDB. Please consider this, and thanks in advance. Much appreciated!
@sagar1689 · 2 years ago
Thank you for this video. Very helpful 👍
@FelixYu · 2 years ago
Glad that it’s helpful!!
@abelcarvajalgil6705 · 2 years ago
great demo!!
@FelixYu · 2 years ago
Glad that u found it helpful 👍
@rajibmahato2608 · 2 years ago
awesome sir
@muditmishra9908 · 1 year ago
Hi Felix, in a real-time system, is it common to get data into S3 first and then produce to the Kinesis data stream using Lambda? Can you please make a video on the different types of producers and which one to choose? I want to know how to decide on a producer when designing a real-time system with latency of, say, milliseconds, and also for near real time with latency of, say, minutes to an hour. Thanks
@lavakumar5181 · 2 years ago
Very clear explanation... please try to do the lambda functions in Python too
@FelixYu · 2 years ago
Thank you!!
@pohboonxin4137 · 3 years ago
Hi Felix, it is a nice and clear video. One question: can this be applied to other types of source data, such as PDF or JPG? Can I add AWS Textract in between to read the text from a document or image rather than just a text file?
@FelixYu · 3 years ago
yes..in 5:17 of the video, if you leave the Suffix field blank, u can upload any file types and it will trigger the producer lambda
@pohboonxin4137 · 3 years ago
@@FelixYu Yes, it will trigger the producer lambda. Could you show something like using Textract to convert other formats of source data into CSV?
@_Documentation · 2 years ago
💯
@monuramadutta454 · 2 years ago
Good work
@nainaarabha9186 · 1 year ago
Can we send the data directly from Lambda functions to consumers? If yes, what are the cons?
@kpashupati · 3 years ago
I am a great fan of your AWS videos. Kindly help me understand how one lambda can call another lambda and also update 2 different database tables, like when a product is purchased by a customer and the product inventory is also updated. In Node.js please 😊
@FelixYu · 3 years ago
u have to like all my videos before i answer this question......just joking lol. there are 3 ways u can enable one lambda to call another lambda:
1. use a "middle man" like kinesis or sqs. lambda #1 constructs a data object, sends it to kinesis/sqs, and that triggers lambda #2. lambda #2 then extracts the data object and does whatever u want with it
2. attach an alb (application load balancer) to lambda #2 so it acts as an API, and then lambda #1 calls the alb to invoke lambda #2
3. lambda #1 calls lambda #2 directly using the invoke method. here is a reference for that (www.sqlshack.com/calling-an-aws-lambda-function-from-another-lambda-function)
i personally like the first approach the most, but all 3 would work!! for ur second question, u can just perform 2 dynamo requests (one per table). just change the dynamoTableName in the request params (e.g., github.com/felixyu9/serverless-api-tutorial-code/blob/main/index.js#L80-L83)
@kpashupati · 3 years ago
@@FelixYu Thanks a lot... I have successfully implemented it. Actually I was trying to build a ticket reservation system: while booking the ticket it also subtracts the seat from the main table and keeps the booked seat number in the users table. Thanks again.. I have just 1 month of experience in AWS.
@FelixYu · 3 years ago
Nicee..glad that u figured it out and good luck with the aws exploration 👍
@rashigupta5995 · 1 year ago
Can you share the difference between using Apache Kafka and Kinesis?
@frankyan8135 · 2 years ago
thank you
@FelixYu · 2 years ago
Glad that u found it helpful
@igordatsenko5283 · 2 years ago
thanks!
@FelixYu · 2 years ago
glad that u found it helpful 👍
@Overbound · 2 years ago
thanks
@FelixYu · 2 years ago
👍
@ganeshnayak4179 · 3 years ago
Can you please make a video with Python instead of Node.js? That said, the content was quite clear. Thanks Felix for the awesome content.
@FelixYu · 3 years ago
I can write the code in python and then push it to GitHub when I get a chance. All other setups will be the same!!
@evanserickson · 2 years ago
Could you also use step functions and put it all in one?
@2mahender · 3 years ago
Can you do one video on Athena, Glue, and EMR?
@RedCloudServices · 2 years ago
Is there a Node.js function that connects to a remote JMS ConnectionFactory endpoint for use inside the VPC?
@samyaktjain698 · 2 years ago
Hi Felix, I need a producer for clickstream data; what changes do we need in the lambda code? Please correct me if I am wrong.
@dotan2211 · 3 years ago
Hi Felix, I have a question: can we modify the data received from an S3 bucket or API Gateway, then save it to DynamoDB using a Lambda function?
@FelixYu · 3 years ago
yes u certainly could..for example, in the consumer 1 lambda, u can read in the data as json/strings/numbers, make changes to it (e.g., numeric calculation, string concatenation, etc.), and then save this new data to dynamodb
@weilichen2961 · 2 years ago
Hi Felix, love your videos. Is it possible to use Python code in the lambda? Thanks
@FelixYu · 2 years ago
Yea u can write the lambda in python as well
@2mahender · 3 years ago
Nice! Hi sir, can these lambda functions be useful for video and audio files also?
@FelixYu · 2 years ago
i think so. u just needa find a lib that handles video files and add the logic in the lambda code
@khandoor7228 · 3 years ago
Felix loving your channel, great content! Can we do more with Lambda functions? They seem to be an important link to many other services in AWS. For these 2 consumers maybe send an email with 1 and save to dynamo with the other?
@FelixYu · 3 years ago
tyty 😄 and yessir, we can certainly do more with lambda functions!! lambda has gained a lot of popularity in recent years becuz it is lightweight, easy to set up, and great for horizontal scaling. u just reminded me that i already have videos for sending emails in lambda and saving data to a dynamo database:
send emails in lambda: kzbin.info/www/bejne/o32QZYObqqaojcU
save data to database: kzbin.info/www/bejne/i6WYdJ6Jr5uBiJI
@darshithaindupally1515 · 2 years ago
I am getting errors like "Cannot read property '0' of undefined" and "event.Records is not iterable"; can you help me fix them?
@nishantshah4718 · 2 years ago
Where does CloudFront come into the picture?
@pradeepsharma9035 · 3 years ago
Hi Felix, what if I have S3 in one AWS account and Kinesis in another, and I have added a cross-account policy on the lambda as well? What should I do in the lambda to send the data to Kinesis in the other AWS account?
@FelixYu · 3 years ago
if u have the producer lambda in account A and wanna write data to a kinesis stream in account B, u will needa configure 2 roles:
1. in account B, create a role that has write access to the kinesis stream (e.g., PutRecord, PutRecords, etc.). let's call it kinesis-role-in-account-B.
2. in account A, create a role (let's call it lambda-role-in-account-A) and use this role to assume the kinesis-role-in-account-B that was created in step #1. attach this role to the lambda and it will be able to send data to kinesis in account B!!
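As a rough sketch of step #1, kinesis-role-in-account-B would need a trust policy that lets the account A role assume it; the account ID 111111111111 here is a placeholder, not a real value:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::111111111111:role/lambda-role-in-account-A"
      },
      "Action": "sts:AssumeRole"
    }
  ]
}
```

The lambda in account A then calls sts:AssumeRole at runtime to obtain temporary credentials for its Kinesis client.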
@pranayshukla9980 · 2 years ago
My use case is Amazon Connect data -> kinesis data stream -> delivery stream lambda -> se. Can you help me with the lambda here?
@parikshithshivaprakash5523 · 2 years ago
This is the same as streaming files from S3, right?
@kavyap3184 · 3 years ago
Hello Felix, thanks for the content. Is it possible to send a JSON file (source in JSON format) from S3 to Kinesis?
@FelixYu · 3 years ago
yes, just change txt to json in the s3 trigger suffix 5:18 of the video and u can upload json files as the data source (e.g., test_file.json) (or just leave that field blank so it will take all file types). let me know if that works
@kavyap3184 · 3 years ago
@@FelixYu Hi Felix, thanks for the quick response. I have tried changing the suffix of the file in S3, but this does not work either.
@FelixYu · 3 years ago
@@kavyap3184 gotcha....let's try a few things here:
1. when u have the txt suffix set up and upload a txt file (the exact setup i have in the video), does everything work fine?
2. after u change txt to json and upload a json file, does it trigger the producer lambda? if not, it might be a permission issue.
3. if it triggers the producer lambda, let's console.log out dataString and see how it looks (9:29 of the video)
@kavyap3184 · 3 years ago
@@FelixYu Thanks for the suggestion. It turns out my file is too big to process. Also, can we see the records in the AWS data stream? The data stream's monitoring shows no data even though I see the data in the console.
@FelixYu · 3 years ago
Yea if ur file is big, u needa increase the producer lambda timeout and memory....and I don't think u can look at the raw data from the aws console. Under monitoring u can see metrics like incoming data in bytes, get record counts, etc
@sormagupta9170 · 2 years ago
I am getting the error "stream/kinesis-stream because no identity-based policy allows the kinesis:PutRecord action" while posting data to the kinesis stream
@FelixYu · 2 years ago
Make sure u have the kinesis policy for ur iam role (1:55 of the video)
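For reference, a minimal identity-based policy for the lambda's IAM role might look like the sketch below; the region and account ID are placeholders, and the stream name is taken from the error message above:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["kinesis:PutRecord", "kinesis:PutRecords"],
      "Resource": "arn:aws:kinesis:us-east-1:111111111111:stream/kinesis-stream"
    }
  ]
}
```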
@Roop-tk2rb · 3 years ago
Please advise where can we find the source code for this example
@FelixYu · 3 years ago
sorry i dont think i kept the code after i uploaded the video :(( try to pause the video and copy the code that way. let me know if u run into any problems!!
@victoriabressan4557 · 2 years ago
I want to stream to an event bus, but I couldn't find a way yet...
@austinchettiar6784 · 2 years ago
Can i get the same code in python pls?
@subhamchakraborty6822 · 3 years ago
Hi, is it possible to generate a real-time drill rig data stream with AWS Kinesis?
@FelixYu · 3 years ago
Yes, u would have to build a data producer that processes the drill rig data and then sends it to kinesis though
@subhamchakraborty6822 · 3 years ago
@@FelixYu Thanks! Is it possible to build a data producer via AWS?
@FelixYu · 3 years ago
@@subhamchakraborty6822 i guess a better question is - are u able to connect aws with ur drill rig system cuz idk how u have ur drill rig system set up and where u store the data
@subhamchakraborty6822 · 3 years ago
@@FelixYu I am planning to build the drill rig via MATLAB Simulink. But I don't know if it's possible to connect the system with AWS.
@FelixYu · 3 years ago
@@subhamchakraborty6822 the easiest way i can think of is..after matlab saves the data to ur machine, u can write a script/cron job/program to upload the data to aws s3 and have that trigger a lambda function to send the data to kinesis
@priyaavhad3688 · 1 year ago
Hi, I need the producer code, can you please help?
@mohammedshihab8577 · 3 years ago
The video was very easy to understand. Would you share your email ID? I have some queries related to the IoT service.
@saswanthikaschannel7708 · 3 years ago
I am getting an error like "cannot read property '0' of undefined"
@FelixYu · 3 years ago
its prob caused by a typo....make sure it is event.Records[0]
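For anyone hitting the same thing, here is a small defensive sketch (not the video's exact code) that reads the bucket and key from an S3 trigger event and fails with a clearer message when event.Records is missing:

```javascript
// Defensive parsing of an S3 trigger event in a producer lambda.
// "Cannot read property '0' of undefined" usually means event.Records is
// undefined, e.g. a typo like event.records (it is capital-R Records) or a
// test invocation whose payload is not a real S3 event.
function parseS3Event(event) {
  if (!event || !Array.isArray(event.Records) || event.Records.length === 0) {
    throw new Error('Not an S3 trigger event: event.Records is missing or empty');
  }
  const record = event.Records[0];
  return {
    bucket: record.s3.bucket.name,
    // S3 URL-encodes object keys and turns spaces into "+".
    key: decodeURIComponent(record.s3.object.key.replace(/\+/g, ' ')),
  };
}
```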
@suresh1088 · 2 years ago
Link for the code?
@asifhossain1874 · 1 year ago
Please upload the code in Python here, sir.
@abstractionGod · 6 months ago
Oh nvm lol
@ALDARPRATHAMESH · 6 months ago
There's an error that says "The specific log group: /aws/lambda/consumer1 does not exist in this account or region."
@randomvideos6117 · 3 years ago
Consumer1 - send to Elasticsearch
@stevebalu · 3 years ago
Got "The specific log group: /aws/lambda/Producer does not exist in this account or region" errors. Can't see the triggered logs.
@FelixYu · 3 years ago
two things to check here:
1. make sure the cloudwatch console u are viewing is in the same region as the lambda function
2. make sure the iam role the lambda is using has the CloudWatchLogsFullAccess policy attached to it