Snowflake Time Travel
13:48
5 years ago
Snowflake Data Sharing
15:29
5 years ago
Snowflake Secure Views
9:39
5 years ago
SnowSql - Snowflake's query language
12:02
Connecting Power BI to SnowFlake
9:00
Introduction to Talend
12:31
7 years ago
Comments
@jansivarun 3 months ago
Hi Sir, I have configured everything you showed in this video. The pipe is in RUNNING status, but the data was not loaded into the table.
@mohindersingh0711 3 months ago
Thank you so much, Sanjay, for this awesome video.
@RonuRaj2 5 months ago
Do we have a Snowflake WhatsApp group?
@jk17 5 months ago
Can you send the WhatsApp group link?
@MarQKis 5 months ago
What if there's a requirement to archive files after they've been processed?
@user-np5fw6hz4o 3 months ago
Did you get an answer for this?
@tysonandkai1 6 months ago
Hey, can you please guide me on how to load data from SQL Server to Snowflake using Matillion?
@freakzisback 8 months ago
I'm getting the error below: [ODBC Source [2]] Error: SQLSTATE: HY000, Message: [Cloudera][Hardy] (35) Error from server: error code: '1' error message: 'Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.tez.TezTask'.;
@jignesh1001 9 months ago
Hi Sanjay - why is a storage integration not set up here? Was it already done before?
@jignesh1001 10 months ago
Nice 👍 video!!
@akshaychakre2845 10 months ago
Hi Sanjay, nice demonstration. I am facing an issue: files are getting loaded only when I manually refresh the pipe. It seems auto-ingest is not working. Please suggest.
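For an auto-ingest pipe that only loads on manual refresh, one common check is whether the S3 bucket's event notification actually targets the pipe's queue. A minimal sketch in Snowflake SQL, assuming a hypothetical pipe named `my_pipe`:

```sql
-- Check the pipe's execution state and any pending notification errors
SELECT SYSTEM$PIPE_STATUS('my_db.my_schema.my_pipe');

-- The notification_channel column shows the SQS queue ARN that the
-- S3 bucket's object-create event notification must point at
SHOW PIPES LIKE 'my_pipe';
```

If the ARN configured in the bucket's event notification does not match `notification_channel`, new files never trigger the pipe, even though `ALTER PIPE ... REFRESH` still picks them up manually.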
@saumyakapoor6772 11 months ago
Thank you for the video. I have an on-prem Oracle database as the source, the data changes very frequently (inserts, updates, and deletes), and the client's current requirement is near-real-time data insights. Do you recommend using Snowpipe to ingest data from Azure (ADLS in this case) into Snowflake tables? And what load mechanism, tool, or technology should be used for data ingestion into Azure from the source database? Thanks in advance :)
@NdKe-j3k 11 months ago
What if the file size is greater than 50 MB? Will the pipeline run?
@pink_bear7773 1 year ago
Hey Sanjay, really great video, well explained. Thank you.
@rajnava944 1 year ago
Thank you. Very useful.
@deepthikotipalli 1 year ago
Hi, how can we know the name of the Snowpipe while loading?
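One way to see which pipe loaded which file is the `COPY_HISTORY` table function, which includes a `PIPE_NAME` column. A sketch, assuming a hypothetical target table `MY_TABLE`:

```sql
-- Load history for the last 24 hours, including the pipe that did each load
SELECT file_name, pipe_name, last_load_time, status
FROM TABLE(INFORMATION_SCHEMA.COPY_HISTORY(
       TABLE_NAME => 'MY_TABLE',
       START_TIME => DATEADD(hour, -24, CURRENT_TIMESTAMP())));
```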
@pvijayabhaskar1520 1 year ago
Sir, is SQL enough to learn Snowflake, or should I learn Python as well?
@ashu60071 1 year ago
Hi Sir, I cannot join the WhatsApp group. Please assist 🙏.
@rangamharsha 1 year ago
I am from a non-IT background and planning to move into the IT industry. Which is the best technology to choose: 1) Snowflake, SQL; 2) Power BI, Power Apps, Power Automate, SQL; or 3) Power BI, (SSIS/ADF), SQL? I am confused between these, please guide me.
@isrivmrao5764 1 year ago
Unstructured data should just be uploaded to Snowflake using PySpark, and that's it.
@isrivmrao5764 1 year ago
Many tables are needed, and the Snowflake destination tables should be created separately again.
@AlAtv-rc4or 1 year ago
Your video no longer reflects the latest changes and cannot be used to set up Snowpipe; you should create a new one.
@shashankm2859 1 year ago
How can I configure a Snowpipe to pick up the same filename from an S3 bucket when the file is refreshed and re-uploaded?
@ShubhamKumar-rm3qe 1 year ago
Materialized views are faster than normal views.
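They trade storage and background maintenance cost for query speed: Snowflake keeps the precomputed results current automatically (an Enterprise Edition feature). A minimal sketch with hypothetical table and column names:

```sql
-- Precompute an aggregate once; queries against the view read
-- stored results instead of rescanning the base table each time
CREATE MATERIALIZED VIEW daily_sales_mv AS
SELECT order_date, SUM(amount) AS total_amount
FROM orders
GROUP BY order_date;
```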
@hussainshaik5365 1 year ago
Hello sir, can you please tell me how to connect Databricks to Snowflake?
@אלכסנחשונוב-פ5ע 1 year ago
Hi Sanjay, I have a small question. I am trying to build a DWH pipeline. I have 2 JSON source files in an S3 bucket, and I created a pipe from S3 to my staging schema, but I am not sure how to continue from here. Is there a way to create a pipeline between two schemas/tables, or should I create a task + procedure that runs every few minutes to load the data into prod? Thanks, Alex
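One common pattern for the second hop (staging to prod) is a scheduled task. A rough sketch, with all object and column names hypothetical:

```sql
-- Move newly staged rows into the production table every 5 minutes
CREATE TASK stg_to_prod
  WAREHOUSE = my_wh
  SCHEDULE = '5 MINUTE'
AS
  INSERT INTO prod.orders
  SELECT * FROM staging.orders_raw r
  WHERE r.load_ts > (SELECT COALESCE(MAX(load_ts), '1970-01-01'::TIMESTAMP)
                     FROM prod.orders);

-- Tasks are created suspended; resume to start the schedule
ALTER TASK stg_to_prod RESUME;
```

A stream on the staging table, consumed by the task, is the more robust variant when the source churns heavily, since it tracks exactly which rows are new.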
@chetangowda2403 1 year ago
I have a scenario where two accounts are in the same region but with different providers. If we are sharing tables, and in the consumer account we want to do DML on the tables after creating a database from the share, would that be possible?
@anjalikp2680 1 year ago
Hi, so this is for migrating an on-premise DB to Snowflake. We can use a Talend job to load daily S3 files to SF, right? Do you have any videos on that?
@anjalikp2680 1 year ago
So informative.
@neelamkushwaha1703 2 years ago
I have tried the steps as mentioned, but the data is still not loading into the table.
@kushalappabe1993 2 years ago
I couldn't get the files from S3 to Snowflake in spite of following your configuration process.
@praffulgupta809 2 years ago
Can anyone help me out? Please.
@praffulgupta809 2 years ago
Can I get this code? Please.
@praffulgupta809 2 years ago
I did the same, but unfortunately my data is not arriving in the table. Can you help me out?
@sri689 2 years ago
Sanjay, I want the SF notes; please share the link.
@sri689 2 years ago
Hi Sanjay, your videos are excellent.
@eladed7434 2 years ago
Wouldn't it be faster and more efficient to cut out the Lambda in the middle and stream directly from API Gateway to Kinesis? Or what about using a Lambda to put records directly onto Firehose?
@anandmathad5678 2 years ago
Super, sir, very informative. Maybe you could slow down a bit when explaining things; it helps beginners understand better.
@raosree6955 2 years ago
Excellent
@satishyalalla3842 2 years ago
How can I convert a SQL query to a SnowSQL query without errors?
@appasahebraghobaanure8611 2 years ago
Very good, Sanjay. 🙏
@raymondbyczko 2 years ago
Your video was informative. Thank you!
@stevemiller123 2 years ago
For those following along: don't forget to include SKIP_HEADER=1.
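In Snowflake that option lives on the file format used by the pipe's COPY statement, so the header row of each CSV is not loaded as data. A sketch with hypothetical names:

```sql
CREATE OR REPLACE FILE FORMAT my_csv_format
  TYPE = 'CSV'
  SKIP_HEADER = 1;  -- ignore the first (header) line of each file

CREATE OR REPLACE PIPE my_pipe AUTO_INGEST = TRUE AS
  COPY INTO my_table
  FROM @my_stage
  FILE_FORMAT = (FORMAT_NAME = 'my_csv_format');
```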
@anjanashetty482 2 years ago
Do we have to build a data model within Power BI if our source for reporting is a Snowflake warehouse?
@kbkonatham1701 2 years ago
Good end-to-end demo with full details. Thanks for your efforts.
@humanityislife5931 2 years ago
Thank you, sir.
@tonydoberman21 2 years ago
Nice comprehensive video
@AGurau 2 years ago
Where are the previous videos? I am new to Talend and this is too advanced for me. I can't find this "code" thing in the component. More than this, I need to incorporate into the email's text the contents of some stored variable (or the results of running SQL Server stored procedures). I'd like a video that addresses this. Thanks.
@prasadsd1094 2 years ago
Hi Sanjay, can you show connecting SSIS to Snowflake using the ODBC driver?
@thomsondcruz 2 years ago
Great video. Typically in projects we use an Azure SF storage integration instead of a SAS key while creating a stage.
@raghavsengar 1 year ago
Yup, if an integration is there, then better to use it rather than SAS keys, which are temporary.
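For reference, a storage integration delegates authentication to an Azure service principal, so no SAS token ever appears in the stage definition. A sketch with hypothetical names and placeholders left unfilled:

```sql
CREATE STORAGE INTEGRATION azure_int
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = 'AZURE'
  ENABLED = TRUE
  AZURE_TENANT_ID = '<tenant-id>'
  STORAGE_ALLOWED_LOCATIONS = ('azure://<account>.blob.core.windows.net/<container>/');

-- The stage references the integration instead of embedding a SAS credential
CREATE STAGE my_azure_stage
  URL = 'azure://<account>.blob.core.windows.net/<container>/'
  STORAGE_INTEGRATION = azure_int;
```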
@iamincognito5765 2 years ago
@Sanjay Kattimani Bro, awesome explanation... 👌 Using this explanation I have created a project on my resume based on my current ETL project's business logic with additional transformation mapping rules... I can now start marketing my resume for Snowflake positions... Thank you so much...