Can we also load incremental data automatically by scheduling the Glue job?
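Yes, this is typically done with a scheduled Glue trigger plus job bookmarks, which make each run pick up only data added since the previous run. Below is a minimal sketch of building the trigger parameters; the job name `s3-to-rds-job` and the cron schedule are hypothetical, and the actual `create_trigger` call (commented out) requires boto3 and configured AWS credentials.

```python
def schedule_glue_job(job_name, cron_expression):
    """Build the parameters for a scheduled Glue trigger that runs `job_name`.

    Enabling job bookmarks (--job-bookmark-option: job-bookmark-enable)
    makes each scheduled run process only the data added since the last
    run, i.e. an incremental load.
    """
    return {
        "Name": f"{job_name}-schedule",          # hypothetical trigger name
        "Type": "SCHEDULED",
        "Schedule": cron_expression,             # e.g. daily at 02:00 UTC
        "Actions": [{
            "JobName": job_name,
            "Arguments": {"--job-bookmark-option": "job-bookmark-enable"},
        }],
        "StartOnCreation": True,
    }

params = schedule_glue_job("s3-to-rds-job", "cron(0 2 * * ? *)")

# To actually create the trigger (requires boto3 and AWS credentials):
# import boto3
# boto3.client("glue").create_trigger(**params)
```

The same schedule can also be configured in the Glue console under the job's "Schedules" tab without any code.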
@durgakolapkar5099 (2 years ago)
Very informative video. Thank you very much for sharing :)
@ragook3 (a year ago)
Why did you create an S3 endpoint to access RDS, which is in a private subnet? Shouldn't it be an RDS endpoint instead?
@aishwaryawalkar5950 (2 years ago)
Hey, can I do this with SQL Server on EC2? I mean, S3 to a SQL instance on EC2 using Glue.
@somaguttadamodharreddy8435 (2 years ago)
Can we do an update operation in a MySQL DB with Glue?
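Glue's JDBC sink only appends rows, so updates are usually done by issuing an upsert from the Glue script itself (e.g. with PyMySQL), relying on MySQL's `INSERT ... ON DUPLICATE KEY UPDATE`. A minimal sketch of building such a statement; the table and column names are hypothetical:

```python
def build_mysql_upsert(table, columns, key_columns):
    """Build an INSERT ... ON DUPLICATE KEY UPDATE statement so that rows
    whose primary/unique key already exists are updated instead of
    duplicated. `key_columns` are excluded from the UPDATE clause."""
    col_list = ", ".join(columns)
    placeholders = ", ".join(["%s"] * len(columns))
    updates = ", ".join(
        f"{c} = VALUES({c})" for c in columns if c not in key_columns
    )
    return (
        f"INSERT INTO {table} ({col_list}) VALUES ({placeholders}) "
        f"ON DUPLICATE KEY UPDATE {updates}"
    )

# Hypothetical table: upsert customers keyed on `id`.
sql = build_mysql_upsert("customers", ["id", "name", "email"], {"id"})
# In the Glue job you would then run this for each row (or in batches)
# with a MySQL driver such as PyMySQL: cursor.executemany(sql, rows)
```

An alternative is writing to a staging table with the regular Glue sink and running the update as a SQL `postactions` step.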
@Arvindkumar-mb8yj (2 years ago)
Can you also load this type of CSV? There is a CSV file with some comment lines in the first 2-3 rows before the header starts, and at the end there is one comment line stating the total number of records in the file. How do you crawl such a CSV file and load it into Snowflake?
@vishnuMSify (a year ago)
Amazing, thank you very much
@durgarasane-kolapkar1842 (2 years ago)
In our case, files are loaded into S3 from an on-prem file system. We then have to check file integrity (MD5 checksum and row-count comparison between the on-prem file system and S3). The files that pass this integrity check have to move from S3 to Postgres. Can you please suggest a way to do this?
@kshitijbansal3672 (a year ago)
Hey, did you find a solution for this?
@joaovitor12full (2 years ago)
Thank you very much!!!
@sunjith7177 (2 years ago)
How can we load data from S3 to Amazon Aurora PostgreSQL?
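Besides a Glue job with a JDBC connection (the same approach as in the video, just pointed at the Aurora endpoint), Aurora PostgreSQL can import directly from S3 via its `aws_s3` extension (`CREATE EXTENSION aws_s3;` plus an IAM role attached to the cluster). A sketch that builds the import statement; the table, bucket, and key names are hypothetical:

```python
def build_s3_import_sql(table, columns, bucket, key, region):
    """Build the aws_s3.table_import_from_s3 call provided by Aurora
    PostgreSQL (and RDS for PostgreSQL). Assumes a CSV with a header row;
    the cluster must have the aws_s3 extension and an S3-access IAM role."""
    return (
        "SELECT aws_s3.table_import_from_s3("
        f"'{table}', '{columns}', '(format csv, header true)', "
        f"aws_commons.create_s3_uri('{bucket}', '{key}', '{region}'))"
    )

sql = build_s3_import_sql(
    "orders", "id,amount", "my-bucket", "data/orders.csv", "us-east-1"
)
# Execute against the cluster with any PostgreSQL driver, e.g. psycopg2:
# cur.execute(sql)
```

This keeps the data path inside AWS and avoids pulling the file through an intermediate job at all.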
@hannagirma2849 (a year ago)
Good question, please share the info if you find out. Thanks!