Wonderful explanation. You are helping a huge number of people without them spending money on course training. Your effort is immeasurable.
@ketanrehpade9150 · 2 years ago
This is my favourite topic, so easily explained, and that too in 30 minutes with all the demos. Excellent... Thanks!
@DataEngineering · 2 years ago
Thank you 🙏 for watching my video and your word of appreciation really means a lot to me.
@nishavkrishnan4271 · 2 years ago
I just finished going through Chapter-1 to Chapter-14. The Time Travel one was so cool, and this chapter was interesting with all the hands-on tutorials. Thank you so much!
@DataEngineering · 2 years ago
Glad you enjoyed it.
@peterscalise1176 · 2 years ago
Just finished chapters 1-14...GREAT TUTORIALS! THANK YOU!!!! 😀
@DataEngineering · 2 years ago
Glad you like them!
@neerajtiwari1695 · 2 years ago
I have seen all 14 of your videos. Now I am waiting for the rest of the videos you mentioned in the course curriculum. Thanks again for your wonderful video content.
@DataEngineering · 2 years ago
Ch-15 is out: kzbin.info/www/bejne/mXLcfmCEn56KhcU. Hope you like it and learn from it.
@moh.7777 · 1 year ago
Thanks for your excellent videos. Can you please share the SQL scripts used for the time travel tutorial?
@SowmyaAS-x7t · 1 month ago
Excellent
@DataEngineering · 1 month ago
Thank you! Cheers!
@tejaswinerella5223 · 1 year ago
Hi, Thanks for your excellent videos. Can you please share the SQL scripts used for the time travel tutorial?
@vedantshirodkar · 1 year ago
Nicely explained. Thank you
@DataEngineering · 1 year ago
Glad it was helpful! And yes, if you want to manage Snowflake more programmatically, you can watch my paid content. Many folks don't know the power of Snowpark. These two videos will help you broaden your knowledge; they are available at a discounted price for a limited time (one for JSON and one for CSV), and they can automatically create DDL and DML and also run the COPY command.
1. www.udemy.com/course/snowpark-python-ingest-json-data-automatically-in-snowflake/?couponCode=SPECIAL50
2. www.udemy.com/course/automatic-data-ingestion-using-snowflake-snowpark-python-api/?couponCode=SPECIAL35
@rociogonzalez3577 · 1 year ago
Great tutorial :) Would you be able to share the SQL scripts used? I'm not able to see them in the link you provided. Thanks :)
@alertforfalsecase2299 · 2 years ago
Excellent explanation. Really enjoyed it a lot.
@DataEngineering · 2 years ago
Glad it was helpful! I have already published other knowledge series and Snowflake certification videos; if you are interested, you can refer to them.
🌐 Snowflake Complete Guide Playlist ➥ bit.ly/3iNTVGI
🌐 SnowPro Guide ➥ bit.ly/35S7Rcb
🌐 Snowflake SQL Series Playlist ➥ bit.ly/3AH6kCq
🌐 SnowPro Question Dump (300 questions) ➥ bit.ly/2ZLQm9E
@RK-wf7re · 2 years ago
Nice content and your efforts are superb. Eagerly waiting for the 15th part; when can we expect it?
@DataEngineering · 2 years ago
Will upload soon
@geoffreyhibon2651 · 2 years ago
Thanks a lot for your videos :)
@DataEngineering · 2 years ago
thanks a lot .. good to know that these videos are adding value...
@arumughamthiagarajan4910 · 1 year ago
The sessions are great and worth spending time on. Can you please share the link for the sample tests?
@DataEngineering · 1 year ago
Here is the entire playlist; look for Ch-14 in it (kzbin.info/aero/PLba2xJ7yxHB5X2CMe7qZZu-V4LxNE1HbF).
@ramum4684 · 2 years ago
Excellent hard work I can see from this team in making us experts in Snowflake. Now I would like to get all the SQL scripts used in all the videos for practice. I would be thankful if I could get all the scripts, including the Python scripts, for practice and complete understanding. Thanks in advance, Ramu M
@DataEngineering · 2 years ago
Yes, sure. Let me see how I can make all of them available.
@tanayamandal111 · 2 years ago
Another excellent video. Thank you so much. Please make a video on the advanced Snowflake certifications if possible. I am planning to go for another certification after SnowPro, but there is very little information about them on the internet.
@DataEngineering · 2 years ago
You are welcome... It is on my list, but it takes time as I want to make sure I cover as much as possible for my audience. Let me share my quick experience from my advanced certification:
1. Questions are very descriptive and it takes time to read them.
2. You will have a time crunch, as two of the options will look exactly the same.
3. Lots of scenario-based questions from every topic.
4. 3-4 questions will come from the basics, so make sure you read the basics too.
5. Lots of questions on data loading, micro-partitions, stored procedures (especially JavaScript), transactions and performance, and most of them are scenario-based.
6. You will not get time to go back and review, so don't jump around; go in sequence.
7. Read the documentation and the new features as well, like masking, API calls etc.
4-6 weeks of preparation should be enough, and since you already have the SnowPro certification, you can appear for the Advanced Data Engineer or Architect exam. I will surely make a detailed video, but you will have to wait.
@tanayamandal111 · 2 years ago
@DataEngineering thank you so much for your inputs. Your videos helped a lot in clearing SnowPro.
@arumughamthiagarajan4910 · 1 year ago
I have completed the SnowPro Core certification. These videos are great and helped a lot. I am planning to do the Snowflake Architect certification. Are there separate videos for that?
@DataEngineering · 1 year ago
Great to hear! I have not yet published anything for the Architect level, but will do it soon.
@dt0229 · 2 years ago
great content, thank you!
@DataEngineering · 2 years ago
Welcome!
@srinivasp6579 · 1 year ago
Thank you for the detailed tutorial, I really appreciate it. It looks like the SQL scripts link is not working; it gives the error: "You don't have permission to access the resource."
@dhirajgrover8664 · 2 years ago
All your videos are great. Where can I find the scripts used in your videos?
@DataEngineering · 2 years ago
I am working on it.. will update you soon.
@udayrajd · 1 month ago
Is time travel achieved through immutable micro-partitions?
@harshaaaditya2430 · 2 years ago
Hi, I have a question regarding data_retention_time_in_days: is this attribute inheritable? If a database is created with this attribute set to, say, 10 days, would any schema or table created within the database without this attribute defined get the value 10, or would it get the default value of 1?
@DataEngineering · 2 years ago
Yes, you are right.
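A quick way to verify that inheritance yourself (a minimal sketch; the database, schema and table names below are made up, and a retention period above 1 day needs Enterprise edition or higher):
create database demo_retention_db data_retention_time_in_days = 10;  -- retention set only at database level
create schema demo_retention_db.s1;                                  -- no retention specified here
create table demo_retention_db.s1.t1 (id number);                    -- no retention specified here either
-- The table should report 10 (inherited from the database), not the account default of 1:
show parameters like 'DATA_RETENTION_TIME_IN_DAYS' in table demo_retention_db.s1.t1;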
@niveditharaokulakarni4193 · 2 years ago
Hi, thanks for making such detailed videos; they are really helpful in understanding Snowflake. I have a question: I created a table on day 1, and from day 1 onwards it has been receiving updates. I can recover old data for up to 90 days using time travel. What if the updates happened beyond 90 days (or beyond the fail-safe stage)? How can I recover the data after fail-safe? And why is it only 90 days? Please explain. Thanks in advance.
@DataEngineering · 2 years ago
Thank you 🙏 for watching my video; your words of appreciation really mean a lot to me. Time travel costs a lot for a churning table (if enabled for 90 days), and if it were kept forever it would run up a huge bill on your Snowflake account. If you need to store historical data for a longer period, time travel is probably not the right feature.
@niveditharaokulakarni4193 · 2 years ago
Thank you for taking the time to respond. But what happens to the old micro-partitions of a table that has received updates, and to those micro-partitions whose retention period and fail-safe have completed? Thank you.
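On the follow-up about old micro-partitions: when rows are updated, the superseded micro-partitions are kept as time travel storage, then as fail-safe storage, and are purged once fail-safe ends. A sketch of how to see that storage (it assumes access to the SNOWFLAKE.ACCOUNT_USAGE share, which lags a little, and the table name is just an example):
select table_catalog, table_schema, table_name,
       active_bytes, time_travel_bytes, failsafe_bytes   -- retained old partitions show up in the last two columns
from snowflake.account_usage.table_storage_metrics
where table_name = 'ORDER_DETAILS'                        -- example table name
order by time_travel_bytes desc;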
@govardhanyadav2684 Жыл бұрын
Thanks for nice content , Would love to see more ! I have a question, I had set time travel of 4 days at schema level but inside table , I have changed to 10 days . If schema gets dropped beyond time travel period, still can I get the data for table which is having time travel period of 10 days? Thanks !
@akashsharma4769 Жыл бұрын
Hi govardhan yadav, This was a good question and it helped me to clear my concept as I was also confused. I would like to share my learnings with you. You have defined a schema with a time travel retention period of 2 days. This means any tables using this schema will by default keep historical data for 2 days. However, for a specific table that uses this schema, you have overridden the retention period and set it to 10 days instead. In this case, the table-level setting takes precedence. So even though the schema retention is 2 days, that table will keep historical data for 10 days. The schema-level retention period acts as a default that can be overridden at the table level. But any other tables using the schema and not overriding will still follow the 2 day retention.
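A small sketch of that precedence (hypothetical object names; the retention values are just for illustration, and anything above 1 day needs Enterprise edition or higher):
create schema demo_tt_schema data_retention_time_in_days = 2;                         -- schema-level default
create table demo_tt_schema.t_default (id number);                                    -- nothing specified: inherits 2 days
create table demo_tt_schema.t_override (id number) data_retention_time_in_days = 10;  -- table-level override wins
show parameters like 'DATA_RETENTION_TIME_IN_DAYS' in table demo_tt_schema.t_default;   -- expect 2
show parameters like 'DATA_RETENTION_TIME_IN_DAYS' in table demo_tt_schema.t_override;  -- expect 10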
@swethakalidoss6578 · 1 year ago
Will fail-safe still be enabled on a standard Snowflake account with only 1 day of time travel, since the Standard edition has just 1 day of time travel?
@DataEngineering · 1 year ago
Fail-safe is a table-level feature, not an edition-level feature... so yes.
@kannanarjun6569 · 1 year ago
Do I need to take the statement ID every time while updating or deleting rows?
@narenkrishh7412 · 1 year ago
Hi bro, please help us by providing the SQL scripts; the link is not working.
@sharathbunty891 · 1 year ago
We reduced the retention period from 3 days to 2, so the changes made on the first day were lost. Isn't it possible to bring those changes back using fail-safe?
@DataEngineering · 1 year ago
Yes, it can be done, but it is too much effort unless the data is very critical to you and your team. If it is a permanent table and you have lost data due to the time travel configuration, you can contact the Snowflake team and they will recover it. I assume you are not using the free trial edition and that your company has a contract with Snowflake.
@ankitsoni5286 · 1 year ago
Hi, it would be great if you could mention which of your quiz videos we should go through after finishing each video in this series.
@DataEngineering · 1 year ago
From Ch-11 onwards they map one to one...
@rohitmanderwad6034 · 1 year ago
What happens to time travel if a table is renamed? What if I alter the table and add a new column, or run CREATE OR REPLACE TABLE?
@DataEngineering · 1 year ago
Good question, but I would suggest trying it out; it is easy to test, and please share the behaviour.
@rohitmanderwad6034 · 1 year ago
I would be more than happy to try it out, but I use an individual personal account and unfortunately I have consumed all the available credits. The billing has emptied my pockets 😅
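For anyone who does have credits to spare, a sketch of the experiment suggested above (table and column names are made up, and the behaviour noted in the comments is my expectation to verify, not a confirmed result):
create or replace table tt_demo (id number, val varchar) data_retention_time_in_days = 1;
insert into tt_demo values (1, 'a');
set qid = last_query_id();                                   -- remember the insert's query id
update tt_demo set val = 'b' where id = 1;
-- 1) Rename: the history should stay with the table object.
alter table tt_demo rename to tt_demo_renamed;
select * from tt_demo_renamed before (statement => $qid);    -- expected: no rows (state before the insert)
-- 2) Adding a column should keep the history; older versions simply lack the new column.
alter table tt_demo_renamed add column note varchar;
-- 3) CREATE OR REPLACE drops and recreates the table, so AT/BEFORE on the new table
--    should not see the old data; the old version would only be reachable via UNDROP
--    (after renaming the new table out of the way).
create or replace table tt_demo_renamed (id number);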
@gokulnavamaniphotosbygokul1072 · 2 years ago
I have a question: while creating a table, by default it is being created as a transient table. Is there any way to change it to a permanent table?
@DataEngineering · 2 years ago
The default table type is permanent; you might be making a mistake. Could you share the DDL script? That will help me understand the problem. I generally use transient tables in my SQL as they cost less compared to permanent tables.
@gokulnavamaniphotosbygokul1072 · 2 years ago
@DataEngineering Here's the same table script that you used in this tutorial:
CREATE OR REPLACE TABLE POPWAREHOUSE.ORDER_DETAILS (
  O_ORDERKEY NUMBER(38),
  O_CUSTKEY NUMBER(38,0),
  O_ORDERSTATUS VARCHAR(1),
  O_TOTALPRICE NUMBER(12,2),
  O_ORDERDATE DATE,
  O_ORDERPRIORITY VARCHAR(15),
  O_CLERK VARCHAR(15),
  O_SHIPPRIORITY NUMBER(38,0),
  O_COMMENT VARCHAR(79)
) data_retention_time_in_days = 50;
Actually, I'm using a free trial account (Enterprise edition). Could that be the reason?
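One way to check what is actually being created (a sketch; it assumes POPWAREHOUSE is a schema in the current database, as the two-part name in the DDL suggests). The "kind" column of SHOW TABLES says TABLE for a permanent table and TRANSIENT for a transient one, and if the schema itself was created as transient, every table in it is transient by default:
show tables like 'ORDER_DETAILS' in schema POPWAREHOUSE;    -- check the "kind" column
show schemas like 'POPWAREHOUSE';                           -- the "options" column shows TRANSIENT for a transient schema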
@asilbek_sanoqulov · 1 year ago
Guys, can anybody share the scripts? The site given by the author says not everyone can have access.
@VenkataVamsiVardhineni · 4 months ago
Unable to open the SQL scripts.
@raghumajji2326 · 2 years ago
Hi, I have one question on time travel. Assume we have created a table with a time travel period of 90 days and no transactions happened in between; if I drop the table on the 92nd day, will I be able to undrop it?
@DataEngineering · 2 years ago
Yes. The 90 days is a moving window; it is not fixed from the day the table is created. So if you created the table on 01-Jan and dropped it on 02-Apr (31 days in Jan, 28 days in Feb and 31 days in Mar = 90 days), the recovery window starts from 03-Jan: any transaction done between 01-Jan and 02-Jan will be lost, but everything from 02-Apr minus 90 days onwards can be recovered. I hope this is clear.
@raghumajji2326 · 2 years ago
Thank you so much... your videos are awesome. Thank you so much for your efforts in helping others learn.
@DataEngineering · 2 years ago
@raghumajji2326 thanks..
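For reference, a minimal sketch of that drop-and-undrop flow (hypothetical table name; a 90-day retention period needs Enterprise edition or higher):
create table orders_demo (id number) data_retention_time_in_days = 90;
-- ... later, the table is dropped ...
drop table orders_demo;
show tables history like 'ORDERS_DEMO';   -- the dropped table is still listed, with a dropped_on timestamp
undrop table orders_demo;                 -- works while the drop is still inside the retention window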
@ushakiran3870 · 2 years ago
How does Matillion work?
@DataEngineering · 2 years ago
It is a big topic; do you have any specific requirement or learning expectation? Matillion has its own learning university.