Amazing demo, very clear and well-detailed, step-by-step - rare quality content! Thank you!
@shatrughanbtech-ek4mp A year ago
Exactly the presentation I was expecting. I have seen many videos on how to load data from S3 to Snowflake❄️, but not a single video on how to integrate AWS with Snowflake. 🙏🏼🙏🏼👍 Thank you so much.
@KSRDatavizon A year ago
Thank you so much!
@vishalcrazy5121 3 months ago
Very, very helpful for getting real exposure.
@KSRDatavizon 3 months ago
Thanks for your kind words! We're glad our content is helpful. If you found it valuable, please consider subscribing to our channel for more similar content. It helps us keep creating videos that you'll enjoy. 😊
Thank you so much. Please subscribe to our channel; it motivates us a lot.
@jeydrn276 A year ago
Absolutely amazing presentation; this is exactly what I was looking for until I get access to our account. Thanks very much!
@KSRDatavizon A year ago
Glad it was helpful!
@srinivaskotagiri1113 A year ago
Can I get this from the very beginning of Snowflake? I have experience as a DBA.
@anishyapaul5189 8 months ago
Good demo and explanation. Thank you!
@KSRDatavizon 8 months ago
Glad it was helpful! Please subscribe to our channel for regular updates.
@DeltaJes-co8yu 9 days ago
A beautiful piece of work spoiled by the thousands of "OKAY"s. My head exploded hearing the continuous OKAY, and I quit.
@shavy223 A year ago
Thank you for the clear explanation with a step-by-step process. I would love to learn more from you.
@KSRDatavizon A year ago
Thanks a lot
@balajikomma541 A year ago
Kindly make a project video on "Snowflake Integration with Azure".
@KSRDatavizon A year ago
Sure, will do shortly. Please subscribe to our channel; it motivates us.
@reachinganwesha123 A year ago
Excellent.. sooo helpful
@KSRDatavizon A year ago
Glad it was helpful!
@asmitjais3776 A year ago
Very helpful, thanks.
@KSRDatavizon A year ago
Most welcome
@dilipkumarreddyjanumpally6231 A year ago
Thanks for your video; it's helpful for doing the integration end to end.
@KSRDatavizon A year ago
You are welcome. Please subscribe to our channel for regular updates.
@ravishankarrallabhandi531 A year ago
Thanks for explaining it with an end-to-end project. I have a question... Once we create an integration on an S3 bucket and create a stage on top of it, how do we identify and load only the delta files into the Snowflake table? For example, on Day 1 we have 10 files, and on Day 2 we have 10 new files. How can we manage to load only the new 10 files into the Snowflake table? Thank you.
@KSRDatavizon A month ago
We can maintain a metadata table that captures the timestamp of each data load; this is helpful for tracking which files have already been loaded.
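A minimal Snowflake SQL sketch of that metadata-table idea (all object names are hypothetical placeholders, not taken from the video). Note that COPY itself already skips files it has previously loaded into the same table, so the audit table is mainly for tracking and reporting:

```sql
-- Hypothetical names; adjust stage, table, and format to your setup.
CREATE TABLE IF NOT EXISTS load_audit (
    file_name STRING,
    loaded_at TIMESTAMP_NTZ DEFAULT CURRENT_TIMESTAMP()
);

-- COPY keeps per-table load metadata and skips files it has already
-- loaded, so on Day 2 only the 10 new files are picked up here:
COPY INTO customer_info
FROM @project_aws_stage
FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);

-- Record what was just loaded into the tracking/metadata table:
INSERT INTO load_audit (file_name)
SELECT file_name
FROM TABLE(INFORMATION_SCHEMA.COPY_HISTORY(
       TABLE_NAME => 'CUSTOMER_INFO',
       START_TIME => DATEADD('hour', -1, CURRENT_TIMESTAMP())));
```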
@JimnyGirl 11 months ago
very nice
@KSRDatavizon 11 months ago
Thanks
@VenkatAnnamraju A year ago
Very detailed explanation, very good
@KSRDatavizon A year ago
Glad it was helpful!
@nadianizam6101 7 months ago
Well-explained video. Please make more videos on Snowflake.
@KSRDatavizon 7 months ago
Thank you for your feedback! I'm glad you found the video helpful. Please subscribe to our channel for regular updates; it motivates us a lot.
@rk-ej9ep A year ago
This is awesome bro..
@KSRDatavizon A year ago
Thank you! Please subscribe to our channel for regular updates. All the best!
@amrendrakumarsingh5911 8 months ago
Well explained
@KSRDatavizon 8 months ago
Thank you so much. Please subscribe to our channel; it motivates us a lot.
@Larry21924 9 months ago
This content is absolutely astonishing. I recently read a similar book, and it was an absolute marvel. "Mastering AWS: A Software Engineers Guide" by Nathan Vale
@SidharthanPV A year ago
Watch it at 1.25x speed.
@KSRDatavizon A year ago
thank you
@SRaavi 10 months ago
Superb explanation. Can you please provide the JSON script used for the policy creation?
@KSRDatavizon 2 months ago
Thank you for your comment; let me check and get the JSON files. Please subscribe to our channel for regular updates; it motivates us a lot 🙏🏼
@lavanyap3021 A year ago
Excellent explanation, thank you.
@KSRDatavizon A year ago
Glad it was helpful! Please subscribe to our channel; it motivates us.
@mohammedvahid5099 A year ago
Please make more projects and interview-session programs; we need them to practice.
@KSRDatavizon A year ago
Sure, we will upload them as soon as possible.
@dev4128 5 months ago
Thank you so much
@KSRDatavizon 2 months ago
You're most welcome. Please subscribe to our channel for regular updates.
@SwatiXtia 5 months ago
Hi, excellent project; however, it would be nice if you provided the policy script.
@KSRDatavizon 2 months ago
Sure, we will include it in future sessions.
@samyakG-dr4vu A year ago
Very informative video.
@KSRDatavizon A year ago
Glad it was helpful!
@mounikamallikanti9522 A year ago
Good morning sir, very good explanation. I am Mounika. I have learnt Snowflake at a private institute, but I have a career gap. Now I want to improve my skills, and for that I would like to work on real-time workshops on Snowflake. So sir, is there any freelance work or real-time workshop on Snowflake?
@KSRDatavizon 3 months ago
Good morning Mounika! I'm glad the explanation was helpful. Here's some information to help you improve your Snowflake skills and find real-time workshops.

1. Freelance projects:
- Freelance platforms: Look for Snowflake projects on platforms like Upwork, Freelancer.com, and Arc.dev. These platforms connect freelancers with businesses needing help on various projects, including Snowflake development.
- Direct connections: Network with other data professionals or companies using Snowflake. They might have freelance opportunities for someone with your skills.

2. Real-time workshops:
- Snowflake training: While Snowflake doesn't offer public real-time workshops, they do provide a comprehensive training platform with on-demand courses and labs. You can explore these at learn.snowflake.com/en/courses.
- Third-party platforms: Some companies offer real-time or live online workshops on Snowflake. Explore platforms like Udemy, Coursera, or Pluralsight for such workshops.
- Meetups and events: Stay updated on local data meetups or conferences that might feature live workshops or sessions on Snowflake. Resources for finding events: Meetup.com (www.meetup.com/), Eventbrite (www.eventbrite.com/), Data Science Central (www.datasciencecentral.com/).

Tips for finding opportunities:
- Focus on your skills: Highlight your strengths in Snowflake, including specific tools, tasks, or functionalities you're proficient in.
- Build a portfolio: Consider creating a portfolio showcasing your Snowflake projects, even if they were personal projects.
- Network actively: Connect with data professionals on LinkedIn or online forums to stay updated on opportunities.

Additional resources:
- Snowflake documentation: docs.snowflake.com/
- Snowflake certification: Consider pursuing the Snowflake Data Analyst Professional certification to validate your skills.

By combining freelance work, real-time workshops, and self-directed learning, you can effectively improve your Snowflake expertise and bridge your career gap.
@shashankm2859 A year ago
What is the difference between an integration object and a pipe? When should we use each?
@sairevanthmuppana A year ago
A Snowflake integration object is used to establish the relationship between Snowflake and external cloud storage. If you want to load data files from external storage, first of all we need to create an integration object, and in that integration object we define the location of the files and the role (which is used for accessing the files). Snowpipe, on the other hand, is used for continuous data loading; sometimes in real time we get data every hour, or in micro-batches within a certain window. Once the data is available in the storage location, the cloud vendor sends a notification to Snowflake, and once Snowflake receives the notification it loads the data.
@KSRDatavizon 3 months ago
Integration Objects vs. Pipes in Dataflow

Integration objects and pipes are fundamental components in dataflow systems, each serving distinct purposes. While they might appear similar at first glance, understanding their differences is crucial for effective data processing.

Integration objects:
- Purpose: Integration objects are designed to connect dataflow systems to external sources or destinations. They act as bridges between the internal processing of data and the external world.
- Functionality: Reading data from external sources (e.g., files, databases, APIs), writing data to external destinations, and transforming data to match the requirements of the external system.
- Types: File readers/writers for various formats (e.g., CSV, JSON, XML), database connectors (e.g., MySQL, PostgreSQL, SQL Server), API connectors (e.g., REST, SOAP), and message queues (e.g., RabbitMQ, Kafka).

Pipes:
- Purpose: Pipes define the flow of data within a dataflow system. They represent the logical connections between different processing stages or components.
- Functionality: Transmitting data from one component to another, applying transformations (e.g., filtering, mapping, aggregation), and controlling the flow of data (e.g., branching, looping).
- Types: Data transformation pipes (e.g., map, filter, reduce), data flow control pipes (e.g., branch, loop), and data storage pipes (e.g., buffer).

When to use which:
- Integration objects: Use them when you need to interact with external systems or sources. They are essential for connecting your dataflow system to the outside world.
- Pipes: Use them to define the internal logic and flow of data within your dataflow system. They are crucial for transforming, filtering, and controlling the data as it moves through the pipeline.

In summary, integration objects are the gateways to the external world, while pipes are the building blocks of the internal dataflow process. By understanding their roles and differences, you can effectively design and implement robust dataflow systems.
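In Snowflake terms (as described in the earlier reply), the two objects look roughly like this; a minimal sketch with hypothetical names, bucket, and role ARN:

```sql
-- Storage integration: the trust relationship between Snowflake and S3.
CREATE STORAGE INTEGRATION s3_int
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = 'S3'
  ENABLED = TRUE
  STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::123456789012:role/snowflake_access_role'
  STORAGE_ALLOWED_LOCATIONS = ('s3://my-bucket/data/');

-- External stage built on top of the integration.
CREATE STAGE project_aws_stage
  URL = 's3://my-bucket/data/'
  STORAGE_INTEGRATION = s3_int
  FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);

-- Snowpipe for continuous loading: AUTO_INGEST = TRUE makes the pipe
-- load files as S3 event notifications arrive.
CREATE PIPE customer_pipe AUTO_INGEST = TRUE AS
  COPY INTO customer_info FROM @project_aws_stage;
```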
@ChandraS-c9h A year ago
Can you please explain the control framework for a Snowflake project?
@KSRDatavizon A year ago
Sure, we will make a separate video on the same; here is a short explanation. The "control framework" in the context of a Snowflake project typically refers to a set of guidelines, best practices, and processes that help manage and maintain the data architecture, data pipelines, and overall data management within a Snowflake data warehouse environment. It ensures consistency, reliability, and scalability in your data operations. Key elements:
- Governance: Define data standards, ownership, and access rules.
- Environments: Separate environments (dev, test, prod) with proper security.
- Data model: Design logical/physical models for data organization.
- ETL/ELT: Implement structured data extraction, loading, and transformation.
- Version control: Manage code changes and deployments.
- Testing: Ensure data quality with automated validation.
- Monitoring: Watch query performance and resource utilization.
- Backup/recovery: Plan for data protection and disaster scenarios.
- Documentation: Maintain architecture and process documentation.
- Improvement: Continuously enhance based on feedback and lessons learned.
@OnlineGuru001 11 months ago
Do I need any prior Snowflake/AWS knowledge to start this?
@KSRDatavizon 11 months ago
No need; we teach everything from scratch.
@yayi3434 7 months ago
Good one! Which notepad/editor are you using here?
@KSRDatavizon 2 months ago
Let me check. Thank you! Please subscribe to our channel for regular updates.
@pvijayabhaskar1520 A year ago
Sir, my Snowflake account is already created on AWS, and I want to access the data files stored in an Amazon S3 bucket from Snowflake tables.
@KSRDatavizon 8 months ago
That's great to hear! Integrating Amazon S3 with Snowflake can really streamline your data storage and processing.
@aniruddhyadav3698 10 months ago
Can we get a job with SQL & Snowflake certification? Please reply, sir.
@KSRDatavizon 9 months ago
Adding additional tools like cloud, dbt, and Python would help you in getting a job.
@ranjith335 A year ago
While creating the external stage, I am facing the below error: Failure using stage area. Cause: [Access Denied (Status Code: 403; Error Code: AccessDenied)]
@pkjangra1 A year ago
I got the same error. I have ensured that the policy is correct, but I still get the same error. Did you find any solution?
@augustineopokujuniorantwi881 7 months ago
Try this: GRANT USAGE ON STAGE project_aws_stage TO ROLE accountadmin; (change the stage name to match yours).
@sudhac5841 A year ago
What happens if the stream gets dropped?
@juliestudy1475 10 months ago
Excellent! Just curious why you label your chart 1, 2 & 3 starting from Snowflake, and yet you proceed through your steps the other way round, i.e. 3, 2 & 1?
@KSRDatavizon A month ago
Great observation! This reversal of numbering is common in certain technical or architectural diagrams to reflect a "flow" of steps in the process, but it can indeed create some confusion. Here's why it often happens:
- Diagram sequence vs. execution flow: Diagrams may label components or stages in a logical hierarchy (e.g., the final data warehouse like Snowflake may be labeled as step 1 because it's the final destination of your data). However, the steps are executed in reverse, starting from the raw data source (which might be labeled step 3, since it's the initial point of data ingestion).
- Final destination first: Often, the final stage (e.g., where data is loaded, such as Snowflake) is labeled as "1" to emphasize the end goal of the process. This approach is common in project plans or workflow charts where the target is highlighted first. The flow, however, starts at the origin (S3 or data lake), hence the actual process runs 3 → 2 → 1.
- Hierarchical presentation: The final step is presented as the most important, so it may be visually or logically numbered first. This helps to convey that all preceding steps aim toward achieving this final goal.
@ITKaksha A year ago
Is the pipeline concept different from what we saw in this video?
@KSRDatavizon A month ago
It's the same.
@vedavathit4614 A year ago
How can I change the date format from yyyy-mm-dd to dd-mm-yyyy in Snowflake with the DATE data type?
@KSRDatavizon A year ago
Use the TO_CHAR() function.
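For example (table and column names are hypothetical): a DATE value has no display format of its own, so TO_CHAR is used to render it as dd-mm-yyyy text.

```sql
-- Render a DATE column as dd-mm-yyyy text:
SELECT TO_CHAR(order_date, 'DD-MM-YYYY') AS order_date_ddmmyyyy
FROM orders;

-- Going the other way, from dd-mm-yyyy text back to a DATE:
SELECT TO_DATE('25-12-2023', 'DD-MM-YYYY');
```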
@mohammadarsalanadil1057 A year ago
Where can we get the sample files that we are storing in AWS? Can you please help?
First of all, thank you for the clear explanation... If this data is in Avro, how do we load it? At least, can you explain the Avro file format, please?
@KSRDatavizon 8 months ago
Thank you for your compliment; sure, we will make better sessions in the future with all these updates.
@meghanb-pc6rk 10 months ago
Hi, when we run it for all the files, in what order will they be loaded into the table? How can I set my files to be loaded based on timestamp? For example, it's a full load and I did not run my process today, so the next day there will be two files in my S3 location. I want to load only the file that has the latest timestamp. How can we achieve that? Thanks for your answer in advance.
@KSRDatavizon A month ago
To load only the file with the latest timestamp from your S3 bucket, you can follow these steps using AWS SDKs like Boto3 in Python, or similar mechanisms if you're working in another language or platform. Here's a general approach with Boto3:
1. List all the files in the S3 bucket location: use the list_objects_v2() function to list all the files in the specified S3 path.
2. Sort files by timestamp: from the listing result, extract the timestamps (typically from the file name or metadata) and sort the files accordingly.
3. Pick the file with the latest timestamp: once sorted, pick the most recent file.
4. Load the file: perform the desired operation with the latest file.
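If you prefer to stay inside Snowflake instead of using Boto3, a rough alternative sketch (stage, table, and file names are hypothetical) is to inspect the stage listing and copy only the newest file:

```sql
-- 1) Inspect the stage; LIST shows each file's last_modified timestamp:
LIST @project_aws_stage;

-- 2) Copy only the file you identified as the newest:
COPY INTO customer_info
FROM @project_aws_stage
FILES = ('customer_info_2024_06_02.csv')
FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);

-- 2b) Or, if the newest file carries its date in the name, select it
--     with a pattern instead of an explicit file list:
COPY INTO customer_info
FROM @project_aws_stage
PATTERN = '.*2024_06_02.*[.]csv'
FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);
```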
@shahidawan4072 A year ago
Hi, is any new training course for Snowflake available?
@KSRDatavizon A year ago
Yes, we are starting a new batch from 23rd Nov; on the 18th we have a master class. Please enroll from our website, Datavizon.com.
@cansener9802 7 months ago
Why not add the code that you used in the video to the description?
@KSRDatavizon 2 months ago
Will check and add the code to the description.
@havishataneja6371 A year ago
Hello Sir, I have created the stage integration the same way, but I am getting an error asking me to check the AWS role: "Snowflake Error assuming AWS_ROLE. Please verify the role and external Id are configured correctly in your AWS policy." Everything is verified, yet I get this error when I run LIST @stage.
@KSRDatavizon A month ago
Hi! The error you're seeing typically occurs if there's a mismatch between the AWS role configuration and Snowflake's integration setup. Make sure:
- The AWS role has the correct trust relationship with Snowflake.
- The External ID is correctly set in both Snowflake and AWS.
- Your AWS policy allows the necessary S3 bucket actions.
Double-check the trust policy in AWS IAM and ensure the role ARN is correct. If everything seems fine and the error persists, try re-creating the integration. Let me know how it goes!
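One concrete check that usually helps here (integration and stage names below are hypothetical): read back the values Snowflake generated for the integration and make sure they are exactly what was pasted into the AWS trust policy.

```sql
-- Shows STORAGE_AWS_IAM_USER_ARN and STORAGE_AWS_EXTERNAL_ID; these exact
-- values must appear in the IAM role's trust relationship in AWS:
DESC INTEGRATION s3_int;

-- After updating the trust policy, re-test the stage:
LIST @project_aws_stage;
```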
@rakshithdyavarishetty9657 A year ago
Failure using stage area. Cause: [Access Denied (Status Code: 403; Error Code: AccessDenied)] — How do I resolve this error? I have made the data public in AWS S3 and can view it there, but Snowflake still gives me this error.
@augustineopokujuniorantwi881 7 months ago
Try this: GRANT USAGE ON STAGE project_aws_stage TO ROLE accountadmin; (change the stage name to match yours).
@KSRDatavizon A month ago
Please verify the access and grant the proper permissions.
@jawwadkhan7699 11 months ago
Can you please provide the text file of the policy you created under the IAM role?
@KSRDatavizon A month ago
Sure, will check and update it in the description.
@harinimmagadda14 A year ago
What are the frameworks in Snowflake?
@KSRDatavizon A year ago
Key components in Snowflake:
- Virtual Warehouses: Compute resources for query processing.
- Database: Organizes data.
- Schema: Contains tables and objects.
- Table: Stores data (structured, semi-structured, external).
- Stage: Loads data from external sources.
- Warehouse: Executes queries.
- Materialized Views: Precomputed data summaries.
- Streams: Capture and replicate changes.
- Tasks: Automate operations.
- Security: Role-based access, encryption.
- Data Sharing: Secure data sharing between accounts.
- Data Marketplace: Access to third-party data and services.
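A minimal sketch showing how a few of these components are created (all names are hypothetical placeholders):

```sql
-- Compute, database, schema, and a table:
CREATE WAREHOUSE demo_wh WITH WAREHOUSE_SIZE = 'XSMALL' AUTO_SUSPEND = 60;
CREATE DATABASE demo_db;
CREATE SCHEMA demo_db.raw;

CREATE TABLE demo_db.raw.customer_info (
    id INT,
    name STRING,
    updated_at TIMESTAMP_NTZ
);

-- Stream: captures inserts/updates/deletes on the table.
CREATE STREAM demo_db.raw.customer_stream
  ON TABLE demo_db.raw.customer_info;

-- Task: runs a statement on a schedule, e.g. moving stream changes
-- into a history table.
CREATE TABLE demo_db.raw.customer_history LIKE demo_db.raw.customer_info;

CREATE TASK demo_db.raw.refresh_history
  WAREHOUSE = demo_wh
  SCHEDULE = '60 MINUTE'
AS
  INSERT INTO demo_db.raw.customer_history
  SELECT id, name, updated_at FROM demo_db.raw.customer_stream;
```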
@bhargavnagineni9322 A year ago
Do you provide any training?
@KSRDatavizon A year ago
Please reach us at 9916961234 for more details.
@neethisurya9755 A year ago
Is this an ETL testing project, or something else?
@KSRDatavizon A year ago
It's a Snowflake one.
@veer5827 A year ago
I explained the same project in an interview, but my profile got rejected.
@KSRDatavizon A year ago
Presentation skills also matter. Usually you will not get rejected because of the project; you need to be confident while explaining it.
@GeekCoders 10 months ago
This is not a project; this is just one task.
@gangabhavanipolimetle3090 A year ago
How can I join your training course?
@KSRDatavizon A year ago
Please call +91-9926961234 / 8527506810.
@ramadevigarnepudi-xh8iu A year ago
How much is this Snowflake project?
@KSRDatavizon A year ago
Did you mean the course price, sir?
@PriyaGupta-qj1zb 8 months ago
Can you please provide the customer_info CSV file?
@KSRDatavizon 2 months ago
Sure, will add it shortly.
@baludhary6823 A year ago
I am getting an Access Denied 403 error code.
@KSRDatavizon A year ago
Can you try recreating it, please?
@augustineopokujuniorantwi881 7 months ago
Try this: GRANT USAGE ON STAGE project_aws_stage TO ROLE accountadmin; (change the stage name to match yours).