This is literally the best solution walk-thru I have watched on YT. Clear, instructive, and it actually works. You are a hero!
@knandi73 2 years ago
There is a CSV file on the local computer, and it is uploaded to AWS S3. If any changes are made to that CSV file on the local computer, the changes should be reflected in AWS S3 automatically using an AWS Lambda function. What are the steps to achieve this?
@renukasrivastava1167 10 months ago
Thank you for such a simple and good explanation
@AshokSharma-yv6mw 2 years ago
Good tutorial. However, the first 15 lines of Python/Boto3 code for the Lambda trigger are not readable. Please share.
@AjishPrabhakar 1 year ago
But this can easily fail when uploading large files, e.g. a file size over 500 GB: the Lambda runtime execution timeout will be hit.
@balajikubendran9120 4 years ago
Hi @Prabhakar, I need to unzip a zip file inside a sub-folder of the bucket. Is it possible to extract the zip file into its own sub-folder? Can you please advise?
@PawanKumar-gl4yw 2 years ago
Great explanation. I would like to know: when Lambda copies data from the source bucket to the target bucket, where does it store the data? And if the data is, let's say, 1 TB, how would Lambda work?
@franklinbulmez4989 1 year ago
How can I copy only the file itself, and not the whole prefix (folder path) it resides under?
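One common approach, as a sketch: strip the prefix from the object key before using it as the destination key (the key shown here is just a hypothetical example):

```python
# Keep only the final path component of the S3 object key, dropping its prefix.
object_key = "reports/2023/summary.csv"   # hypothetical key with a prefix
file_name = object_key.rsplit("/", 1)[-1]
print(file_name)  # summary.csv
```

In the copy call you would then pass Key=file_name for the target bucket while still building CopySource from the full original key.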
@shivamgarg4958 1 year ago
Can you create a Lambda function to compress images using Python?
@Videos-rj1ek 2 years ago
Can we see the logs of this copy event, i.e. Lambda copying? You put print statements; do they publish to CloudWatch?
@shreyashmakadia8951 3 years ago
Error when creating the trigger: "Unable to validate the following destination configurations"
@satyamKumar-mr5gc 1 year ago
How can I run this program for a long time?
@mohammadanas6755 1 year ago
Sir, I want to transfer a file from one AWS S3 bucket to a different AWS S3 bucket using a bash script.
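A minimal sketch of such a script, assuming the AWS CLI is installed and configured (the bucket and key names are hypothetical placeholders):

```shell
#!/bin/sh
# Copy a single object between two S3 buckets with the AWS CLI.
SRC_BUCKET="my-source-bucket"   # hypothetical
DST_BUCKET="my-target-bucket"   # hypothetical
KEY="data/report.csv"           # hypothetical

CMD="aws s3 cp s3://$SRC_BUCKET/$KEY s3://$DST_BUCKET/$KEY"
echo "$CMD"
# Uncomment to actually perform the copy (requires AWS credentials):
# $CMD
```

For mirroring a whole prefix rather than one object, `aws s3 sync` can be used in place of `cp`.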
@kudlamolka1429 2 years ago
Is the source_bucket name obtained from the trigger event?
@baluchittela3016 2 years ago
Thanks for your clear explanation. I followed your steps as you said, but I am getting an error while running the Lambda function. Could you please help me ASAP?

Error:
{
  "errorMessage": "module 'urllib' has no attribute 'unquote_plus'",
  "errorType": "AttributeError",
  "requestId": "0cb2294e-a023-4ab2-8395-05f70689e10f",
  "stackTrace": [
    "  File \"/var/task/lambda_function.py\", line 17, in lambda_handler\n    object_key = urllib.unquote_plus(event['Records'][0]['s3']['object']['key'])"
  ]
}
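For anyone hitting this: in Python 3, unquote_plus moved into the urllib.parse module, so the tutorial's Python 2 line needs a small change. A sketch with a made-up encoded key:

```python
from urllib.parse import unquote_plus

# S3 event keys arrive URL-encoded; '+' and '%20' both decode to spaces.
encoded_key = "my+folder/file%20name.csv"  # hypothetical example key
object_key = unquote_plus(encoded_key)
print(object_key)  # my folder/file name.csv
```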
@shreerangaraju1013 2 years ago
Where's the event JSON for this?
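For reference, the S3 put notification that Lambda passes to the handler looks roughly like this (abridged; the bucket name and key are placeholders):

```json
{
  "Records": [
    {
      "eventSource": "aws:s3",
      "eventName": "ObjectCreated:Put",
      "s3": {
        "bucket": { "name": "my-source-bucket" },
        "object": { "key": "data/report.csv", "size": 1024 }
      }
    }
  ]
}
```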
@dianaan2080 3 years ago
I am getting KeyError: 'Records'. What should I do?
@hardikmaghrola 2 years ago
Create an AWS Lambda function to count the number of words in a text file. The general requirements are as follows:
- Use the AWS Management Console to develop a Lambda function in Python and to create its required resources.
- Report the word count in an email using an Amazon Simple Notification Service (SNS) topic. Optionally, also send the result in an SMS (text) message.
- Format the response message as follows: The word count in the textFileName file is nnn. Replace textFileName with the name of the file.
- Specify the email subject line as: Word Count Result
- Automatically trigger the function when the text file is uploaded to an Amazon S3 bucket.
- Test the function by uploading several text files with different word counts to the S3 bucket.
- Forward the email produced by one of your tests to your instructor along with a screenshot of your Lambda function.
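A sketch of just the counting and formatting core of such a handler (the S3 read and SNS publish are omitted; the message template follows the assignment's wording, and the file name is a made-up example):

```python
def count_words(text):
    """Count whitespace-separated words in a text file's contents."""
    return len(text.split())

def format_message(file_name, text):
    """Build the notification body in the assignment's required format."""
    return "The word count in the {} file is {}.".format(file_name, count_words(text))

print(format_message("notes.txt", "hello lambda world"))
# The word count in the notes.txt file is 3.
```

Inside the real handler you would read the uploaded object's body, pass it through these helpers, and publish the result to the SNS topic.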
@lakshmisharon5756 2 years ago
I tried the code, but it's not working for me and I don't know why. The file is not getting copied to the target bucket. Can anyone help?
@NamasteErwin 1 year ago
Hi, that's great info and thanks for the tutorial. I have a question, and if it can be answered it will solve my problem. I have a custom app integrated with AWS EventBridge, and I want events to be targeted outside of AWS; the target we are using is Google Cloud Storage. Would a similar Python script solve my problem?
@shivagyaneshwar1106 2 years ago
I am getting this error:
{
  "errorMessage": "'Records'",
  "errorType": "KeyError",
  "stackTrace": [
    "  File \"/var/task/lambda_function.py\", line 17, in lambda_handler\n    source_bucket = event['Records'][0]['s3']['bucket']['name']"
  ]
}
@shwetabari9980 1 year ago
Did you solve this error?
@mejiger 2 years ago
Nice one, but Python 2.7 is not supported on AWS anymore, and the code is not working for me on Python 3+.
@carlosperal5163 2 years ago
Same
@TonySpark-er2hj 1 year ago
@@carlosperal5163

```python
from __future__ import print_function
import json
import boto3
import urllib.parse

"""Code snippet for copying objects from the AWS source S3 bucket to the target
S3 bucket as soon as objects are uploaded to the source S3 bucket.
@author: Prabhakar G
"""

print("*" * 80)
print("Initializing..")
print("*" * 80)

s3 = boto3.client('s3')

def lambda_handler(event, context):
    source_bucket = event['Records'][0]['s3']['bucket']['name']
    # In Python 3, unquote_plus lives in urllib.parse (it was urllib.unquote_plus in Python 2)
    object_key = urllib.parse.unquote_plus(event['Records'][0]['s3']['object']['key'])
    target_bucket = 'techhub-output-data-andy'
    copy_source = {'Bucket': source_bucket, 'Key': object_key}

    print("Source bucket : ", source_bucket)
    print("Target bucket : ", target_bucket)
    print("Log Stream name: ", context.log_stream_name)
    print("Log Group name: ", context.log_group_name)
    print("Request ID: ", context.aws_request_id)
    print("Mem. limits(MB): ", context.memory_limit_in_mb)

    try:
        print("Using waiter to wait for object to persist through the s3 service")
        waiter = s3.get_waiter('object_exists')
        waiter.wait(Bucket=source_bucket, Key=object_key)
        response = s3.copy_object(Bucket=target_bucket, Key=object_key, CopySource=copy_source)
        return response
    except Exception as err:
        print("Error - " + str(err))
        raise
```

This works for me with the newer version of Python (well, 3.7 anyway :) cheers
@poppadoesitpropa 3 years ago
Great demo. Any chance there is an AWS Lambda to copy from S3 to FSx for Windows?
@jatin_khera 3 years ago
I have one doubt: if the bucket contains multiple objects and a file in one particular folder is overwritten, will that be reflected in the new bucket as well?
@sunnysandeep202 4 years ago
Sir, I want to add data to and fetch data from PostgreSQL through C# using Lambda. Kindly help me here.
@MrElsocio 3 years ago
This is pretty awesome. Thanks! We can also do the same with CRR or SRR within S3. But a very helpful video for understanding Lambda. Thanks again :).
@abelrozario2757 3 years ago
Thank you 👍🏻. Can we do a similar copy using a different AWS account for the input S3 bucket?
@dodokwak 3 years ago
Thank you. I also use two buckets (a destination and an incoming one) plus a Lambda function for resizing images. On the server side I use django-storages and its AWS_S3_CUSTOM_DOMAIN constant, which points to the bucket with resized images. The buckets and their objects have public access. Everything works almost well, but I've got a strange bug: a 404 error when trying to get an image for the first time, which turns into 200 OK after a refresh. Has anybody else had the same issue?
@vishwarajgupta1963 3 years ago
Hi Sir, do you teach as well? I am looking for Lambda coaching.
@eminedogan3125 2 years ago
Great video, Thank you for the clear explanation!
@syedahmadzada3166 2 years ago
Your video is really helpful, but the code keeps giving me an issue on line 16.
@bikramchandradas4120 4 years ago
Can anybody help me create a website? In the website's dashboard I have to put AWS start/stop options, to let users manage their own VPS server.
@sunnysandeep202 4 years ago
Great article. It helped me a lot.
@technologyhub1503 4 years ago
Thanks for your motivational words!!
@abhishekroxz 2 years ago
What if the data size is huge, say 10 TB? Can we transfer the entire data within 15 minutes?
@technologyhub1503 2 years ago
The Lambda example above demonstrates Lambda's capabilities for performing operations on an S3 bucket. For huge data, around 10 TB+, we can perform the transfer between buckets using one of the following options:
1. Cross-region replication or same-region replication
2. S3 Batch Operations
3. S3DistCp with Amazon EMR
4. AWS DataSync
@eladlevi47 2 years ago
Does someone have a manual for the same procedure, but with Python version 3.x?
Great video! Can you upload the text to Secrets Manager instead of another S3 bucket?
@divyanshjha7672 1 year ago
It didn't work at all :(
@kalyanijagtap4448 3 years ago
Great video sir, you have explained it very well.
@multitaskprueba1 2 years ago
Fantastic video! Thank you! You are a genius!
@sunnysandeep202 4 years ago
Can I delete an object from the destination bucket as soon as the object with the same name is deleted from the source bucket, using a Lambda function? If yes, how can I do that?
@technologyhub1503 4 years ago
Yes, we can tweak the Lambda function code as per our requirements. We can delete an object, copy an object, use the copied object's data to insert into MySQL, PostgreSQL, or DynamoDB, and we can even use the data for an Alexa training data set, etc. If we want to delete an object from the destination bucket as soon as the object with the same name is deleted from the source bucket:
1. Apply the Lambda function on the source bucket with a DELETE event.
2. As soon as you delete a file from the source bucket, first cross-verify whether the same object/file exists in the destination bucket, then run a code snippet to delete it from the destination bucket: s3.delete_object(Bucket=bucket, Key=destination_object_key)
Please let me know if you need any help on the same.
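A minimal sketch of such a delete handler (the target bucket name is a hypothetical placeholder; the S3 client is passed in as a parameter, so in Lambda you would supply boto3.client('s3')):

```python
import urllib.parse

TARGET_BUCKET = "my-target-bucket"  # hypothetical name

def handle_delete(event, s3_client, target_bucket=TARGET_BUCKET):
    """Mirror a source-bucket deletion into the target bucket."""
    # S3 event keys arrive URL-encoded, so decode before reuse.
    key = urllib.parse.unquote_plus(event["Records"][0]["s3"]["object"]["key"])
    s3_client.delete_object(Bucket=target_bucket, Key=key)
    return key
```

You would wire this to the source bucket's ObjectRemoved events and, inside lambda_handler, call handle_delete(event, boto3.client('s3')).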
@whathowwhywhenandhere9168 3 years ago
While running the above code I am getting this error, please somebody help:

Response
{
  "errorMessage": "'Records'",
  "errorType": "KeyError",
  "stackTrace": [
    "  File \"/var/task/lambda_function.py\", line 17, in lambda_handler\n    source_bucket = event['Records'][0]['s3']['bucket']['name']"
  ]
}
@jaimearielchitaybautista6719 3 years ago
I have the same error
@saipranav3153 3 years ago
Upload a file to S3 and let the trigger execute the function. If I am not wrong, I guess you used the Test button to execute the Lambda, which doesn't pass an S3 event.
@shaikfarheen8906 2 years ago
I am getting the same thing. Please give me a solution.
@smdmatheen2245 2 years ago
Did anyone fix the error?
@ramyahello 4 years ago
Good video. Please upload a video on what to do if you want to filter by prefix and suffix, like sending TXT files to one bucket and JPG files to another.
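As a sketch, the handler could pick the target bucket from the key's extension (the bucket names here are hypothetical; note that S3 event notifications can also filter by prefix/suffix natively, letting you attach one Lambda per file type):

```python
import os

# Hypothetical target buckets keyed by file extension.
ROUTES = {".txt": "my-txt-bucket", ".jpg": "my-jpg-bucket"}

def target_bucket_for(key, default="my-misc-bucket"):
    """Choose a destination bucket based on the object key's extension."""
    _, ext = os.path.splitext(key.lower())
    return ROUTES.get(ext, default)

print(target_bucket_for("logs/2023/readme.TXT"))  # my-txt-bucket
```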
@gridofmemories 4 years ago
I followed the video, but on uploading to my source bucket the file is not being copied to the target bucket.
@davidcloes9048 3 years ago
My uploaded file was also not copying to the target bucket. I had inadvertently not attached the AmazonS3FullAccess policy to the role I had created. I only noticed because I had also neglected to add the AWSLambdaBasicExecutionRole policy to the role, so monitoring wasn't working either. Attached them both and voilà! The file was copied to the 2nd bucket.
@sekmer009 3 years ago
@@davidcloes9048 I followed all the steps, but it still didn't copy to the destination. Can you help, please? Is there anything to set in permissions or to enable on the S3 bucket? One more observation: I don't see the Enable checkbox while creating the trigger. Won't this work on a basic AWS user login?
@RaamVersion2O 3 years ago
You should keep the runtime at Python 2.7; only then will you get it working.
@dianaan2080 3 years ago
I am getting KeyError: 'Records'.
@kudaykumar1261 3 years ago
Thank you so much sir... it really works.
@PraveenKumar-ic5zo 1 year ago
Nice Video.
@nithinbhandari3075 3 years ago
Nice Video. Thanks.
@gandheshiva8943 4 years ago
Thank you!
@krishnamurali8522 4 years ago
Super
@abhilashak1628 2 years ago
Hi, the solution did not work for me. Can you help me? Can you share your mail id so that I can send you the error details?