Very nice. Good that you didn't give up and finally got it to work.
@KnowledgeAmplifier1 · 3 years ago
Yeah thanks paracha3! Happy Learning :-)
@observatoirelibredesbanque1743 · 3 years ago
This is exactly what I was looking for to complete an assignment. Thank you for the good work. #Stayblessed
@KnowledgeAmplifier1 · 3 years ago
Glad it was helpful Observatoire Libre des Banques Africaines! Happy Learning :-)
@nabarunchakraborti3700 · 2 years ago
Very nice demo. Keep making good and informative videos.
@KnowledgeAmplifier1 · 2 years ago
Thank you nabarun chakraborti for your kind words! Happy Learning :-)
@subramanyams3742 · 1 year ago
Why do we use boto3.client in the beginning, and boto3.resource for the second part? Could you please clarify? Thanks.
@nadianizam6101 · 1 year ago
Excellent Explanation
@KnowledgeAmplifier1 · 1 year ago
Thank you nadia nizam! Happy Learning
@aishlaxmi8201 · 2 years ago
Could you please explain Snowflake architecture along with Lambda functions, for interview purposes?
@varnanthirugnanasambandan559 · 3 years ago
You are always rocking. Proud of you.
@KnowledgeAmplifier1 · 3 years ago
Thank You Sir :-)
@hghar8964 · 2 years ago
Great video, but I am having trouble using WSL on my PC. Is there a way to create the zip file with the Lambda Python code and all its dependencies without WSL?
@KnowledgeAmplifier1 · 2 years ago
Hello H ghar, you can create a Lambda layer using EC2 if you want an alternative to WSL for building the deployment zip. For that, you can refer to this video -- kzbin.info/www/bejne/ZoKXqoltfcdqjNU Hope this will be helpful! Happy Learning :-)
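Another route that needs neither WSL nor EC2 is to build the zip with local Python tooling. A rough sketch (not from the video), assuming Python and pip are installed on the PC; the package and file names are illustrative:

```python
# Rough sketch, assuming Python and pip are installed locally (no WSL/EC2):
# pull Linux-compatible wheels into a staging folder, add the handler, zip it.
import shutil
import subprocess
import sys
from pathlib import Path

build_dir = Path("build")  # staging folder for handler code + dependencies
build_dir.mkdir(exist_ok=True)

# Install Linux wheels so the package runs on Lambda even when built on Windows
subprocess.check_call([
    sys.executable, "-m", "pip", "install",
    "snowflake-connector-python",          # example dependency
    "--target", str(build_dir),
    "--platform", "manylinux2014_x86_64",  # Lambda's Linux platform tag
    "--only-binary=:all:",                 # required when --platform is set
    "--python-version", "3.9",             # match the Lambda runtime (illustrative)
])

# Put the handler module next to its dependencies and zip everything.
# (For a Lambda *layer*, libraries go under a top-level python/ directory
# inside the zip instead of the zip root.)
shutil.copy("lambda_function.py", build_dir / "lambda_function.py")
shutil.make_archive("build", "zip", root_dir=str(build_dir))
print("Created build.zip")
```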
@hghar8964 · 2 years ago
@@KnowledgeAmplifier1 I did this task using Lambda layers, but I am getting this error when I test my function:

Test event name: lamdatestevent

Response:
```
{
  "errorMessage": "'Records'",
  "errorType": "KeyError",
  "stackTrace": [
    "  File \"/var/task/lambda_function.py\", line 6, in lambda_handler\n    s3_file_key = event['Records'][0]['s3']['object']['key'];\n"
  ]
}
```

What am I doing wrong?
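The likely cause, for context: the Lambda console's default test event has no `Records` key, so a handler like this can only run against an event shaped like a real S3 notification. A minimal sketch, with hypothetical bucket and key names:

```python
# Minimal S3-shaped test event; the console's default "hello world" event
# has no "Records" key, which is exactly what raises this KeyError.
s3_test_event = {
    "Records": [
        {
            "s3": {
                "bucket": {"name": "my-source-bucket"},      # hypothetical
                "object": {"key": "folder/sample_file.csv"}  # hypothetical
            }
        }
    ]
}

# The handler line from the stack trace then resolves correctly:
s3_file_key = s3_test_event["Records"][0]["s3"]["object"]["key"]
print(s3_file_key)  # folder/sample_file.csv
```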
@adithyabulusu8812 · 2 years ago
Could you please help me with how to split a large file (6 GB) from one S3 bucket into multiple files while transferring to another S3 bucket, and then move them to Snowflake? S3 (source with the large file) -> Lambda function (split and move) -> S3 (destination) -> Snowpipe -> Snowflake.
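A rough sketch of the split step only (not from the video), assuming a line-based format such as CSV and hypothetical bucket names. It streams the object rather than loading 6 GB into Lambda's memory, though the 15-minute Lambda timeout still needs checking at this size; Snowpipe then picks up the chunk files from the destination bucket as usual:

```python
import boto3

s3 = boto3.client("s3")

SOURCE_BUCKET = "my-source-bucket"       # hypothetical
DEST_BUCKET = "my-destination-bucket"    # hypothetical
LINES_PER_CHUNK = 1_000_000              # tune so each chunk stays small

def split_object(key: str) -> None:
    """Stream one large line-based object and write it out in smaller chunks."""
    body = s3.get_object(Bucket=SOURCE_BUCKET, Key=key)["Body"]
    buffer, part = [], 0
    for line in body.iter_lines():       # streamed; avoids holding 6 GB in memory
        buffer.append(line)
        if len(buffer) >= LINES_PER_CHUNK:
            _flush(key, part, buffer)
            buffer, part = [], part + 1
    if buffer:                           # last partial chunk
        _flush(key, part, buffer)

def _flush(key: str, part: int, lines: list) -> None:
    chunk_key = f"chunks/{key}.part{part:04d}.csv"
    s3.put_object(Bucket=DEST_BUCKET, Key=chunk_key, Body=b"\n".join(lines))
```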
@ravikreddy7470 · 2 years ago
What's the difference between an S3 client and a resource?
@KnowledgeAmplifier1 · 2 years ago
Hello Ravi K Reddy, you can refer to this -- stackoverflow.com/questions/42809096/difference-in-boto3-between-resource-client-and-session
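In short, boto3.client is the low-level interface that mirrors the S3 API and returns plain dicts, while boto3.resource wraps the same operations in higher-level Python objects. A minimal sketch, assuming default credentials and a hypothetical bucket name:

```python
import boto3

# Low-level client: explicit API operations, dict responses
client = boto3.client("s3")
response = client.list_objects_v2(Bucket="my-bucket")  # hypothetical bucket
for item in response.get("Contents", []):
    print(item["Key"])

# High-level resource: object-oriented wrappers over the same API
s3 = boto3.resource("s3")
bucket = s3.Bucket("my-bucket")
for obj in bucket.objects.all():
    print(obj.key)
```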
@ravikreddy7470 · 2 years ago
@@KnowledgeAmplifier1 Thank you so much!! This helps.
@conjoguam · 3 years ago
Thanks for the tutorial. Do you know how to handle updates and deletes in Snowflake tables? I know streams are used to do it, but I can't find an example from a real application, like from a database or AWS S3 to Snowflake, that handles updates and deletes. I am trying to push application data from Postgres (where data can be inserted, updated, and deleted) into Snowflake.
@danishshadab5201 · 3 years ago
Here are the steps. You need Snowpipe, a stream, and a task to set it up.
1. Consider a table t1_raw. The table will be loaded using Snowpipe.
2. Create a stream t1_stream on top of table t1_raw. What it will do is: as soon as there is a new record in t1_raw, the stream t1_stream will have that record. For example, if t1_raw has 1000 records as of today and you load 20 new records into t1_raw, those 20 new records will also be in t1_stream.
3. Now you have another table t2_modelled. Use the values from t1_stream to update the table t2_modelled. And in case you want to automate the process, use a task to update the records in t2_modelled. Pseudocode is below; note the stream-has-data function.

```sql
CREATE TASK mytask1
  WAREHOUSE = mywh
  SCHEDULE = '5 minute'
WHEN SYSTEM$STREAM_HAS_DATA('t1_stream')
AS
  INSERT INTO t2_modelled (id, name)
  SELECT id, name
  FROM t1_stream
  WHERE METADATA$ACTION = 'INSERT';
```
@adithyabulusu8812 · 2 years ago
Hi, I tried the below steps to copy a file from one S3 bucket to another S3 bucket, but when I finally upload the file to the source bucket, it is not getting copied to the destination bucket.

Step 1: Created two S3 buckets: source_bucket and target_bucket.
Step 2: Created a role (Lambda-based) with "S3 full access".
Step 3: Created a Lambda function with the below parameters:
Runtime: Python 3.8
Execution role: lamdbas3role (which I created newly)
Step 4: Created a trigger with the below parameters:
Location: S3
Bucket name: source_bucket
Event type: all object events
And finally enabled "Recursive Invocation" and then clicked on "Add".
Step 5: Clicked on Code and entered the below code:

```python
import json
import boto3
import urllib.parse

print('Loading function')
s3 = boto3.client('s3')

def lambda_handler(event, context):
    # TODO implement
    bucket = event['Records'][0]['s3']['bucket']['name']
    key = urllib.parse.unquote_plus(event['Records'][0]['s3']['object']['key'], encoding='utf-8')
    target_bucket = 'lambdatargetbucket2022'
    copy_source = {'Bucket': source_bucket, 'Key': object_key}
    print("Source bucket : ", bucket)
    print("Target bucket : ", target_bucket)
    print("Log Stream name: ", context.log_stream_name)
    print("Log Group name: ", context.log_group_name)
    print("Request ID: ", context.aws_request_id)
    print("Mem. limits(MB): ", context.memory_limit_in_mb)
    try:
        print("Using waiter to waiting for object to persist through s3 service")
        waiter = s3.get_waiter('object_exists')
        waiter.wait(Bucket=source_bucket, Key=object_key)
        s3.copy_object(Bucket=target_bucket, Key=object_key, CopySource=copy_source)
        return response['ContentType']
    except Exception as err:
        print("Error -" + str(err))
        return e
```

Step 6: When I finally save and test the code, I get the below error:

```
{
  "errorMessage": "'Records'",
  "errorType": "KeyError",
  "stackTrace": [
    "  File \"/var/task/lambda_function.py\", line 11, in lambda_handler\n    bucket = event['Records'][0]['s3']['bucket']['name']\n"
  ]
}
```

Function logs:

```
START RequestId: 0e41efd6-9a71-4348-9d09-f63d6fc5723e Version: $LATEST
[ERROR] KeyError: 'Records'
Traceback (most recent call last):
  File "/var/task/lambda_function.py", line 11, in lambda_handler
    bucket = event['Records'][0]['s3']['bucket']['name']
END RequestId: 0e41efd6-9a71-4348-9d09-f63d6fc5723e
REPORT RequestId: 0e41efd6-9a71-4348-9d09-f63d6fc5723e  Duration: 1.97 ms  Billed Duration: 2 ms  Memory Size: 128 MB  Max Memory Used: 68 MB  Init Duration: 405.42 ms
```

And also the file is not copying to the other (target) S3 bucket. Can you please help me find where I made the mistake?
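For reference, a hedged sketch of the likely fixes: in the handler above, `source_bucket`, `object_key`, `response`, and `e` are never defined (the event values were assigned to `bucket` and `key`), and the KeyError comes from testing with the console's default event, which has no `Records` key. A corrected version under the same assumptions:

```python
import urllib.parse
import boto3

s3 = boto3.client('s3')

def lambda_handler(event, context):
    # These keys exist only in real S3 event notifications (or test events
    # shaped like them); the default console test event raises KeyError here.
    source_bucket = event['Records'][0]['s3']['bucket']['name']
    object_key = urllib.parse.unquote_plus(
        event['Records'][0]['s3']['object']['key'], encoding='utf-8')

    target_bucket = 'lambdatargetbucket2022'  # destination bucket from the question
    copy_source = {'Bucket': source_bucket, 'Key': object_key}

    # Wait until the object is visible, then copy it to the target bucket
    waiter = s3.get_waiter('object_exists')
    waiter.wait(Bucket=source_bucket, Key=object_key)
    response = s3.copy_object(Bucket=target_bucket, Key=object_key,
                              CopySource=copy_source)
    return response['ResponseMetadata']['HTTPStatusCode']
```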
@hanuman1414 · 1 year ago
Love you man❤
@Buzzingfact · 2 years ago
Thanks for the amazing tutorials
@KnowledgeAmplifier1 · 2 years ago
Glad you like them BRIGHT SPARK! Happy Learning :-)
@sukumarreddy4150 · 2 years ago
Thanks a lot, bro. Your content really has a lot of useful stuff, and many people will benefit from it. May I know the playlist name for the AWS and Snowflake related content?
@KnowledgeAmplifier1 · 2 years ago
Hello Sukumar, you can check this playlist for Data Engineering with AWS & Snowflake -- kzbin.info/aero/PLjfRmoYoxpNopPjdACgS5XTfdjyBcuGku Hope this will be helpful! Happy Learning :-)
@Ferruccio_Guicciardi · 2 years ago
Thanks a lot !
@KnowledgeAmplifier1 · 2 years ago
You are welcome Ferruccio Guicciardi! Happy Learning :-)
@kuldeepkumarsahu6618 · 3 years ago
Sir, is there any way we can ask you for help on a different concept? Will you please provide an email ID for the same? Thank you.
@raddy9215 · 2 years ago
Sir, a separate playlist for Snowflake would be helpful for us.
@KnowledgeAmplifier1 · 2 years ago
Hello Raddy, please check the below link -- doc.clickup.com/37466271/d/h/13qc4z-104/d4346819bd8d510
@raddy9215 · 2 years ago
Thank you so much, sir.
@WannaBeFamous-i9w · 3 years ago
Hardcoding the access_key_id and secret_access_key is a very bad practice. I know this is just a demo, and it is OK for a demo, but in an actual project this should be avoided. Instead, use an AWS role and grant the necessary permissions.
@KnowledgeAmplifier1 · 3 years ago
Hello Subhamay, very valid point. I used hardcoded keys just for the demo; it is always better to use AWS Secrets Manager or KMS for storing secret credentials, or to use IAM for creating the external stage. I showed it this way only because too many concepts at once might confuse and deviate from the original topic. If you want to build a secure system with KMS, you can refer to this video -- kzbin.info/www/bejne/o3PSqXuKlp2rY9k and for Snowflake stage creation using an AWS assume role, you can refer to this video -- kzbin.info/www/bejne/g5vIaHR7pd2mgqM Happy Learning :-)
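A minimal sketch of that safer pattern, assuming a hypothetical secret named snowflake/credentials already exists in AWS Secrets Manager; inside Lambda, a plain boto3 client with no keys picks up temporary credentials from the execution role automatically:

```python
import json
import boto3

def get_snowflake_credentials() -> dict:
    """Fetch credentials at runtime instead of hardcoding them in source."""
    client = boto3.client("secretsmanager")
    secret = client.get_secret_value(SecretId="snowflake/credentials")  # hypothetical name
    return json.loads(secret["SecretString"])  # e.g. {"user": "...", "password": "..."}

# For S3 access inside Lambda, rely on the execution role: no keys needed.
s3 = boto3.client("s3")
```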
@Jk-ko4kv · 1 year ago
Can't stop laughing when you said "I'm showing this and will delete it later" 🤣🤣🤣
@KnowledgeAmplifier1 · 1 year ago
😄
@middleclassmelodies99 · 2 years ago
Topics are advanced, but the explanation is like idiotic.