Hi Sanjay - why is the storage integration step not done here? Was it already done before?
@jignesh1001 (10 months ago)
Nice video! 👍
@akshaychakre2845 (10 months ago)
Hi Sanjay, nice demonstration. I am facing an issue: files only get loaded when I manually refresh the pipe, so it seems auto-ingest is not working. Please suggest.
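For readers hitting this same auto-ingest problem, a minimal troubleshooting sketch. The pipe name `my_snowpipe` is a placeholder; the usual root cause is that the bucket's event notification was never wired to the pipe's queue.

```sql
-- A few checks that often explain "works only on manual refresh":
SHOW PIPES;                                  -- confirm AUTO_INGEST = true
SELECT SYSTEM$PIPE_STATUS('my_snowpipe');    -- check executionState and
                                             -- notificationChannelName

-- The notificationChannelName returned above (an SQS ARN for S3)
-- must be set as the bucket's event-notification target; otherwise
-- new files never trigger the pipe and only a manual refresh
-- picks them up:
ALTER PIPE my_snowpipe REFRESH;
```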
@saumyakapoor6772 (11 months ago)
Thank you for the video. If I have an on-prem Oracle database as the source, the data changes quite frequently (inserts, updates, and deletes), and the client currently requires near-real-time insights, do you recommend using Snowpipe to ingest data from Azure (ADLS in this case) into Snowflake tables? And what load mechanism, tool, or technology should be used for ingesting the data into Azure from the source database? Thanks in advance :)
@NdKe-j3k (11 months ago)
What if the file size is greater than 50 MB? Will the pipeline still run?
@pink_bear7773 (1 year ago)
Hey Sanjay, really great video, well explained. Thank you.
@rajnava944 (1 year ago)
Thank you. Very useful.
@deepthikotipalli (1 year ago)
Hi, how can we find out the name of the Snowpipe that is loading the data?
@pvijayabhaskar1520 (1 year ago)
Sir, is learning SQL enough for Snowflake, or do I need Python as well?
@ashu60071 (1 year ago)
Hi sir, I cannot join the WhatsApp group. Please assist 🙏.
@rangamharsha (1 year ago)
I am from a non-IT background and planning to move into the IT industry. Which is the best technology to choose: 1) Snowflake, SQL; 2) Power BI, Power Apps, Power Automate, SQL; or 3) Power BI, (SSIS/ADF), SQL? I am confused between these, please guide me...
@isrivmrao5764 (1 year ago)
Unstructured data should be uploaded to Snowflake and that's it, using PySpark.
@isrivmrao5764 (1 year ago)
Many tables are needed, and the Snowflake destination tables should be created separately again.
@AlAtv-rc4or (1 year ago)
Your video no longer reflects the latest changes and cannot be used to set up Snowpipe; you should create a new one.
@shashankm2859 (1 year ago)
How can I configure a Snowpipe to pick up the same filename from an S3 bucket when the file is refreshed and re-uploaded?
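Worth noting for this question: Snowpipe keeps load metadata and skips a path it has already loaded, so re-uploading the same key is normally ignored. One workaround is renaming each upload (e.g. a timestamp suffix); another is a scheduled task that re-copies with FORCE, sketched below with made-up task, stage, and table names.

```sql
-- Hypothetical task that force-reloads a re-uploaded file on a
-- schedule, bypassing Snowpipe's already-loaded check:
CREATE OR REPLACE TASK reload_refreshed_file
  WAREHOUSE = my_wh
  SCHEDULE = '5 MINUTE'
AS
  COPY INTO my_table
  FROM @my_s3_stage/data.csv
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
  FORCE = TRUE;   -- reload even if this file was loaded before

ALTER TASK reload_refreshed_file RESUME;  -- tasks start suspended
```

Note that FORCE = TRUE reloads the whole file each run, so the target table should be truncated first or deduplicated downstream.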
@ShubhamKumar-rm3qe (1 year ago)
Materialized views are faster to query than normal views because their results are precomputed and stored.
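To illustrate the difference this comment points at, a small sketch (the `orders` table and its columns are assumed for the example):

```sql
-- A regular view re-runs its query on every access:
CREATE OR REPLACE VIEW order_totals_v AS
  SELECT customer_id, SUM(amount) AS total
  FROM orders
  GROUP BY customer_id;

-- A materialized view stores the precomputed result, which
-- Snowflake keeps up to date as the base table changes:
CREATE OR REPLACE MATERIALIZED VIEW order_totals_mv AS
  SELECT customer_id, SUM(amount) AS total
  FROM orders
  GROUP BY customer_id;
```

The trade-off is background maintenance cost, and Snowflake materialized views carry restrictions (single base table, no joins), so they are not a drop-in replacement for every view.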
@hussainshaik5365 (1 year ago)
Hello sir, can you please tell me how to connect Databricks to Snowflake?
@אלכסנחשונוב-פ5ע (1 year ago)
Hi Sanjay, I have a small question. I am trying to build a DWH pipeline. I have 2 JSON source files in an S3 bucket, and I created a pipe from S3 into my staging schema, but I am not sure how to continue from there. Is there a way to create a pipeline between two schemas/tables, or should I create a task + procedure that runs every few minutes to load the data into prod? Thanks, Alex
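One common pattern for the staging-to-prod step Alex asks about is a scheduled task that merges new staging rows into the prod table; no separate procedure is required for a single statement. Schema, table, and column names below are made up for illustration.

```sql
-- Hypothetical task that moves data from staging to prod every
-- few minutes via an upsert:
CREATE OR REPLACE TASK staging_to_prod
  WAREHOUSE = my_wh
  SCHEDULE = '5 MINUTE'
AS
  MERGE INTO prod.events AS t
  USING staging.events AS s
    ON t.event_id = s.event_id
  WHEN MATCHED THEN
    UPDATE SET t.payload = s.payload
  WHEN NOT MATCHED THEN
    INSERT (event_id, payload) VALUES (s.event_id, s.payload);

ALTER TASK staging_to_prod RESUME;  -- tasks are created suspended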
@chetangowda2403 (1 year ago)
I have a scenario where two accounts are in the same region but on different providers. If we are sharing tables, and in the consumer account we want to run DML on those tables after creating a database from the share, would that be possible?
@anjalikp2680 (1 year ago)
Hi, so this is for migrating an on-premise DB to Snowflake. We can use a Talend job to load the daily S3 files to SF, right? Do you have any videos on that?
@anjalikp2680 (1 year ago)
So informative.
@neelamkushwaha1703 (2 years ago)
I have tried the steps as mentioned, but data is still not loading into the table.
@kushalappabe1993 (2 years ago)
I couldn't get the files from S3 to Snowflake in spite of following your configuration process.
@praffulgupta809 (2 years ago)
Can anyone help me out? Please.
@praffulgupta809 (2 years ago)
Can I get these codes? Please.
@praffulgupta809 (2 years ago)
I did the same, but unfortunately my data is not coming into the table. Can you help me out?
@sri689 (2 years ago)
Sanjay, I want the SF notes. Please share the link.
@sri689 (2 years ago)
Hi Sanjay, your videos are excellent.
@eladed7434 (2 years ago)
Wouldn't it be faster and more efficient to cut out the Lambda in the middle and stream directly from API Gateway to Kinesis? Or what about using a Lambda to put records directly onto Firehose?
@anandmathad5678 (2 years ago)
Super, sir, very informative. Maybe you could slow down a bit when explaining things; it helps beginners understand better.
@raosree6955 (2 years ago)
Excellent.
@satishyalalla3842 (2 years ago)
How do I convert a SQL query into a SnowSQL query without errors?
@appasahebraghobaanure8611 (2 years ago)
Very good, Sanjay. 🙏
@raymondbyczko (2 years ago)
Your video was informative. Thank you!
@stevemiller123 (2 years ago)
For those following along, don't forget to include SKIP_HEADER = 1.
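The tip above in context: without SKIP_HEADER = 1, the CSV header row gets loaded as a data row. A sketch with placeholder format, stage, and table names:

```sql
-- File format that skips the first (header) row of each CSV:
CREATE OR REPLACE FILE FORMAT my_csv_format
  TYPE = CSV
  FIELD_DELIMITER = ','
  SKIP_HEADER = 1;

-- Used by the load (the same file format works in a pipe's
-- COPY statement):
COPY INTO my_table
FROM @my_s3_stage
FILE_FORMAT = (FORMAT_NAME = my_csv_format);
```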
@anjanashetty482 (2 years ago)
Do we have to build a data model within Power BI if our reporting source is a Snowflake warehouse?
@kbkonatham1701 (2 years ago)
Good end-to-end demo with details. Thanks for your efforts.
@humanityislife5931 (2 years ago)
Thank you, sir.
@tonydoberman21 (2 years ago)
Nice, comprehensive video.
@AGurau (2 years ago)
Where are the previous videos? I am new to Talend and this is too advanced for me. I can't find this "code" thing in the component. On top of that, I need to incorporate into the email's text the content of some stored variable (or the results of running SQL Server stored procedures). I'd like a video that addresses this. Thanks.
@prasadsd1094 (2 years ago)
Hi Sanjay, can you show connecting SSIS to Snowflake using the ODBC driver?
@thomsondcruz (2 years ago)
Great video. Typically in projects we use an Azure SF storage integration instead of a SAS key when creating a stage.
@raghavsengar (1 year ago)
Yup, if an integration is there, then better to use it rather than SAS keys, which are temporary.
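A sketch of the storage-integration approach these two comments recommend; the tenant ID, storage account, container, and object names are placeholders:

```sql
-- Integration holding the cloud-side trust, created once by an
-- admin, so no SAS key ever lands in a stage definition:
CREATE OR REPLACE STORAGE INTEGRATION azure_int
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = 'AZURE'
  ENABLED = TRUE
  AZURE_TENANT_ID = '<tenant-id>'
  STORAGE_ALLOWED_LOCATIONS =
    ('azure://myaccount.blob.core.windows.net/mycontainer/');

-- The stage references the integration instead of credentials:
CREATE OR REPLACE STAGE my_azure_stage
  URL = 'azure://myaccount.blob.core.windows.net/mycontainer/'
  STORAGE_INTEGRATION = azure_int;
```

After creating the integration, DESC STORAGE INTEGRATION azure_int shows the consent URL and app name to grant Snowflake access on the Azure side.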
@iamincognito5765 (2 years ago)
@Sanjay Kattimani bro, awesome explanation... 👌 Using this explanation I have created a project on my resume based on my current ETL project's business, with additional transformation mapping rules... I can now start marketing my resume for Snowflake positions... Thank you so much...