AWS Lambda and Snowflake integration + automation using Snowpipe | ETL | Lambda Deployment Package

8,267 views

Knowledge Amplifier

A day ago

Comments: 38
@paracha3 3 years ago
Very nice. Good you did not give up and got it to work finally.
@KnowledgeAmplifier1 3 years ago
Yeah thanks paracha3! Happy Learning :-)
@observatoirelibredesbanque1743 3 years ago
This is exactly what I was looking for to complete an assignment. Thank you for the good work. #Stayblessed
@KnowledgeAmplifier1 3 years ago
Glad it was helpful Observatoire Libre des Banques Africaines! Happy Learning :-)
@nabarunchakraborti3700 2 years ago
Very nice demo. Keep making good and informative videos.
@KnowledgeAmplifier1 2 years ago
Thank you nabarun chakraborti for your kind words! Happy Learning :-)
@subramanyams3742 A year ago
Why do we use boto3.client at the beginning, while the second part uses boto3.resource? Could you please clarify? Thanks.
@nadianizam6101 A year ago
Excellent Explanation
@KnowledgeAmplifier1 A year ago
Thank you nadia nizam! Happy Learning
@aishlaxmi8201 2 years ago
Could you please explain the Snowflake architecture using Lambda functions, for interview purposes?
@varnanthirugnanasambandan559 3 years ago
You are always rocking. Proud of you.
@KnowledgeAmplifier1 3 years ago
Thank You Sir :-)
@hghar8964 2 years ago
Great video, but I am having trouble using WSL on my PC. Is there a way to create the zip file with the Lambda Python code and all its dependencies without WSL?
@KnowledgeAmplifier1 2 years ago
Hello H ghar, you can create a Lambda layer using EC2 if you want an alternative to WSL or a deployment zip. For that, you can refer to this video -- kzbin.info/www/bejne/ZoKXqoltfcdqjNU Hope this will be helpful! Happy Learning :-)
@hghar8964 2 years ago
@@KnowledgeAmplifier1 I did this task using Lambda layers, but I am getting this error when I test my function:

Test Event Name: lamdatestevent

Response:
{
  "errorMessage": "'Records'",
  "errorType": "KeyError",
  "stackTrace": [
    " File \"/var/task/lambda_function.py\", line 6, in lambda_handler
    s3_file_key = event['Records'][0]['s3']['object']['key']; "
  ]
}

What am I doing wrong?
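A likely cause, not confirmed in the thread: this KeyError usually means the function was invoked from the Lambda console "Test" button with a generic test event (e.g. {"key1": "value1"}), which has no "Records" key; the handler only works when it receives a real S3 event notification. A minimal sketch of a guard plus a hand-built S3-style test event (bucket and key names below are placeholders):

import urllib.parse

def lambda_handler(event, context):
    # Guard so console test events without "Records" do not raise KeyError.
    if 'Records' not in event:
        print('Not an S3 event notification, skipping:', event)
        return {'status': 'skipped'}
    s3_file_key = urllib.parse.unquote_plus(event['Records'][0]['s3']['object']['key'])
    return {'status': 'ok', 'key': s3_file_key}

sample_event = {
    'Records': [
        {'s3': {'bucket': {'name': 'my-demo-bucket'},
                'object': {'key': 'data/orders.csv'}}}
    ]
}
print(lambda_handler(sample_event, None))

Alternatively, build the console test event from the S3 Put template, or simply upload a file to the source bucket and check the CloudWatch logs.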
@adithyabulusu8812 2 years ago
Could you please help me with how to split a large file (6 GB) from one S3 bucket into multiple files while transferring it to another S3 bucket, and then move it to Snowflake? S3 (source with large file) -> Lambda function (split and move) -> S3 (destination) -> Snowpipe -> Snowflake
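One possible approach, sketched here with placeholder bucket names and chunk size (not something covered in the video): stream the source object line by line with boto3 and write fixed-size parts to the destination bucket, where Snowpipe can pick each part up. For a 6 GB file, Lambda's 15-minute timeout and memory limits need checking; AWS Glue or Step Functions may be a better fit if the split cannot finish in time.

import boto3

s3 = boto3.client('s3')

def split_object(source_bucket, source_key, target_bucket, lines_per_part=500000):
    # Stream the large object instead of loading all 6 GB into memory.
    body = s3.get_object(Bucket=source_bucket, Key=source_key)['Body']
    part, buffer = 0, []
    for line in body.iter_lines():
        buffer.append(line)
        if len(buffer) >= lines_per_part:
            part += 1
            s3.put_object(Bucket=target_bucket,
                          Key=f'{source_key}.part{part:04d}',
                          Body=b'\n'.join(buffer))
            buffer = []
    if buffer:  # flush the final partial chunk
        part += 1
        s3.put_object(Bucket=target_bucket,
                      Key=f'{source_key}.part{part:04d}',
                      Body=b'\n'.join(buffer))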
@ravikreddy7470 2 years ago
What's the difference between the S3 client and resource?
@KnowledgeAmplifier1 2 years ago
Hello Ravi K Reddy, you can refer to this -- stackoverflow.com/questions/42809096/difference-in-boto3-between-resource-client-and-session
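In short: boto3.client('s3') is the low-level interface whose methods map one-to-one to S3 API operations and return plain dicts, while boto3.resource('s3') wraps the same API in higher-level Python objects. A small sketch (bucket and key names are placeholders):

import boto3

# Low-level client: API-shaped calls, dict responses.
s3_client = boto3.client('s3')
resp = s3_client.get_object(Bucket='demo-bucket', Key='data/file.csv')
data = resp['Body'].read()

# Higher-level resource: objects with attributes and convenience methods.
s3_resource = boto3.resource('s3')
obj = s3_resource.Object('demo-bucket', 'data/file.csv')
data = obj.get()['Body'].read()

Both reach the same API; a script often mixes them simply because some operations are more convenient on one than the other.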
@ravikreddy7470 2 years ago
@@KnowledgeAmplifier1 Thank you so much!! This helps.
@conjoguam 3 years ago
Thanks for the tutorial. Do you know how to handle updates and deletes in Snowflake tables? I know streams are used for this, but I can't find an example from a real application, like from a database or AWS S3 to Snowflake, that handles updates and deletes. I am trying to push application data from Postgres (where data can be inserted, updated and deleted) into Snowflake.
@danishshadab5201 3 years ago
Here are the steps. You need Snowpipe, a stream and a task to set it up.
1. Consider a table t1_raw. The table will be loaded using Snowpipe.
2. Create a stream t1_stream on top of table t1_raw. As soon as there are new records in t1_raw, the stream t1_stream will hold those records. For example, if t1_raw has 1000 records as of today and you load 20 new records into t1_raw, those 20 new records will also be in t1_stream.
3. Now you have another table t2_modelled. Use the values from t1_stream to update t2_modelled. If you want to automate the process, use a Task to update the records in t2_modelled. Pseudocode is below; note the stream-has-data function.

CREATE TASK mytask1
  WAREHOUSE = mywh
  SCHEDULE = '5 minute'
  WHEN SYSTEM$STREAM_HAS_DATA('t1_stream')
AS
  INSERT INTO t2_modelled(id, name)
  SELECT id, name
  FROM t1_stream
  WHERE METADATA$ACTION = 'INSERT';
@adithyabulusu8812 2 years ago
Hi, I tried the below steps to copy a file from one S3 bucket to another, but when I finally upload the file to the source bucket, it does not get copied to the destination bucket.

Step 1: Created two S3 buckets: Source_bucket and Target_bucket.
Step 2: Created a role (Lambda based) with "s3 full access".
Step 3: Created a Lambda function with the below parameters:
  Runtime: Python 3.8
  Execution Role: lamdbas3role (which I created newly)
Step 4: Created a trigger with the below parameters:
  Location: S3
  Bucket Name: source_bucket
  Event Type: All objects event type
  Finally enabled "Recursive Invocation" and clicked "Add".
Step 5: Clicked on Code and entered the below code:

import json
import boto3
import urllib.parse

print('Loading function')
s3 = boto3.client('s3')

def lambda_handler(event, context):
    # TODO implement
    bucket = event['Records'][0]['s3']['bucket']['name']
    key = urllib.parse.unquote_plus(event['Records'][0]['s3']['object']['key'], encoding='utf-8')
    target_bucket = 'lambdatargetbucket2022'
    copy_source = {'Bucket': source_bucket, 'Key': object_key}
    print("Source bucket : ", bucket)
    print("Target bucket : ", target_bucket)
    print("Log Stream name: ", context.log_stream_name)
    print("Log Group name: ", context.log_group_name)
    print("Request ID: ", context.aws_request_id)
    print("Mem. limits(MB): ", context.memory_limit_in_mb)
    try:
        print("Using waiter to wait for object to persist through s3 service")
        waiter = s3.get_waiter('object_exists')
        waiter.wait(Bucket=source_bucket, Key=object_key)
        s3.copy_object(Bucket=target_bucket, Key=object_key, CopySource=copy_source)
        return response['ContentType']
    except Exception as err:
        print("Error -" + str(err))
        return e

Step 6: When I finally save and test the code, I get the below error:

{
  "errorMessage": "'Records'",
  "errorType": "KeyError",
  "stackTrace": [
    " File \"/var/task/lambda_function.py\", line 11, in lambda_handler
    bucket = event['Records'][0]['s3']['bucket']['name'] "
  ]
}

Function Logs:
START RequestId: 0e41efd6-9a71-4348-9d09-f63d6fc5723e Version: $LATEST
[ERROR] KeyError: 'Records'
Traceback (most recent call last):
  File "/var/task/lambda_function.py", line 11, in lambda_handler
    bucket = event['Records'][0]['s3']['bucket']['name']
END RequestId: 0e41efd6-9a71-4348-9d09-f63d6fc5723e
REPORT RequestId: 0e41efd6-9a71-4348-9d09-f63d6fc5723e Duration: 1.97 ms Billed Duration: 2 ms Memory Size: 128 MB Max Memory Used: 68 MB Init Duration: 405.42 ms

And also the file is not being copied to the other (target) S3 bucket. Can you please help me find where I made the mistake?
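Two things stand out in the code above, offered as a likely diagnosis rather than a confirmed answer. First, the KeyError: 'Records' appears because the console "Test" button was used with a generic test event that has no Records key; uploading a file to the source bucket (or using the S3 Put test-event template) sends the real event shape. Second, even with a real S3 trigger the handler would still fail: it assigns bucket and key but later refers to source_bucket and object_key, which are never defined, and it returns response['ContentType'] where response is also never defined. A corrected minimal sketch along the same lines:

import boto3
import urllib.parse

s3 = boto3.client('s3')
TARGET_BUCKET = 'lambdatargetbucket2022'

def lambda_handler(event, context):
    if 'Records' not in event:
        # Generic console test events land here instead of raising KeyError.
        return {'status': 'skipped', 'reason': 'not an S3 event notification'}

    source_bucket = event['Records'][0]['s3']['bucket']['name']
    object_key = urllib.parse.unquote_plus(
        event['Records'][0]['s3']['object']['key'], encoding='utf-8')

    # Wait until the object is visible, then copy it to the target bucket.
    s3.get_waiter('object_exists').wait(Bucket=source_bucket, Key=object_key)
    s3.copy_object(Bucket=TARGET_BUCKET, Key=object_key,
                   CopySource={'Bucket': source_bucket, 'Key': object_key})
    return {'status': 'copied', 'key': object_key}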
@hanuman1414 A year ago
Love you man❤
@Buzzingfact 2 years ago
Thanks for the amazing tutorials
@KnowledgeAmplifier1 2 years ago
Glad you like them BRIGHT SPARK! Happy Learning :-)
@sukumarreddy4150 2 years ago
Thanks a lot, bro. Your content really has a lot of substance and many people benefit from it. May I know the playlist name for the AWS and Snowflake related material?
@KnowledgeAmplifier1 2 years ago
Hello Sukumar, you can check this playlist for Data Engineering with AWS & Snowflake -- kzbin.info/aero/PLjfRmoYoxpNopPjdACgS5XTfdjyBcuGku Hope this will be helpful! Happy Learning :-)
@Ferruccio_Guicciardi 2 years ago
Thanks a lot!
@KnowledgeAmplifier1 2 years ago
You are welcome Ferruccio Guicciardi! Happy Learning :-)
@kuldeepkumarsahu6618 3 years ago
Sir, is there any way we can ask you for help on a different concept? If so, will you please provide your email ID for the same? Thank you.
@raddy9215 2 years ago
Sir, a separate playlist for Snowflake would help us.
@KnowledgeAmplifier1 2 years ago
Hello Raddy, please check the link below -- doc.clickup.com/37466271/d/h/13qc4z-104/d4346819bd8d510
@raddy9215 2 years ago
Thank you so much, sir.
@WannaBeFamous-i9w 3 years ago
Hardcoding the access_key_id and secret_access_key is a very bad practice. I know this is just a demo, and it is OK for a demo, but in an actual project this should be avoided. Instead, use an AWS role and grant the necessary permissions.
@KnowledgeAmplifier1 3 years ago
Hello Subhamay, very valid point. I used hardcoded keys only for the demo; it is always better to use AWS Secrets Manager or KMS for storing secret credentials, or to use IAM for creating the external stage. I showed it this way just for the demo, as too many concepts might confuse and deviate from the original topic. If you want to build a secure system with KMS, you can refer to this video -- kzbin.info/www/bejne/o3PSqXuKlp2rY9k and for Snowflake stage creation using an AWS assume role, you can refer to this video -- kzbin.info/www/bejne/g5vIaHR7pd2mgqM Happy Learning :-)
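For reference, one common way to avoid hardcoding credentials inside the Lambda code is to read them from AWS Secrets Manager at runtime; the secret name and region below are placeholders:

import json
import boto3

def get_snowflake_credentials(secret_name='snowflake/etl-user', region='us-east-1'):
    # Fetch credentials at runtime instead of embedding them in the deployment package.
    client = boto3.client('secretsmanager', region_name=region)
    secret = client.get_secret_value(SecretId=secret_name)
    return json.loads(secret['SecretString'])  # e.g. {"user": "...", "password": "..."}

The Lambda's execution role then only needs secretsmanager:GetSecretValue on that secret, and no keys ever sit in the code or environment.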
@Jk-ko4kv A year ago
Couldn't stop laughing when you said "I'm showing this and will delete it later" 🤣🤣🤣
@KnowledgeAmplifier1 A year ago
😄
@middleclassmelodies99 2 years ago
Topics are advanced but explanation is like idiotic