Comments
@shreyas9309 · 19 days ago
Thanks a lot, man! <3
@user-em3gw8on5i · 23 days ago
Thanks for the video. I have a question: is it possible to restrict updating a project's name through a lien restriction? TIA
@joyhodling · 1 month ago
This is very helpful. Could you point me to a video, or create one, on how to safely upgrade Data Fusion when there are pipelines connected?
@UTubeAcount1000 · 1 month ago
Hi admin, very nice explanation. Do you have a free coupon/discount voucher for SnowPro Core Certification registration?
@deepthimurali962 · 1 month ago
Do you know how to fetch the deleted bytes for all buckets in a GCP project?
@AnantPradhan-y7m · 1 month ago
Couldn't understand. Complicated...
@rakshapadiyar · 1 month ago
Apart from Skills Boost, did you go through any YouTube channels/Udemy courses?
@shamilak1 · 1 month ago
Please share the head_usa_names file.
@patriciodiaz2377 · 1 month ago
Thank you very much for your explanation! Pretty well explained; greetings from México.
@shwetapandey2308 · 1 month ago
It is helping me a lot; thank you for making it simple for us.
@jakrac2790 · 2 months ago
Thanks for focusing on the documentation and the detailed breakdown of exam topics. I've just passed the exam, and I recommend anybody who is starting to read the documentation carefully, as they will test your deep knowledge and understanding of the topics.
@UddhavParab · 2 months ago
How do I send pipeline alerts, e.g. send emails when the pipeline fails? It's not sending when I try; can you please help?
@sotos47 · 2 months ago
Why do I get "module not found" when pressing the run button?
@prasannakumar7097 · 2 months ago
Can you please explain how to write a DataFrame to BigQuery?
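A minimal sketch in reply, assuming the `google-cloud-bigquery` and `pandas` packages and a placeholder table ID (`my_project.my_dataset.my_table`); the upload call needs GCP credentials, so it is shown commented out:

```python
import pandas as pd

# Build (or receive) the DataFrame to upload.
df = pd.DataFrame({"name": ["alice", "bob"], "score": [10, 20]})

# The actual load uses the BigQuery client library; it requires credentials
# and a real table ID, so it is left commented out here:
# from google.cloud import bigquery
# client = bigquery.Client()
# job = client.load_table_from_dataframe(df, "my_project.my_dataset.my_table")
# job.result()  # block until the load job finishes
```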
@nicstruebel3391 · 2 months ago
Doesn't work for me.
@cloudaianalytics6242 · 2 months ago
What's the error?
@figh761 · 3 months ago
Is anyone using Snowflake?
@salmansayyad4522 · 3 months ago
Thanks a lot, bro; interesting content.
@user-gm7yt9dd2u · 3 months ago
Dude, he's just marketing himself; he's making these videos to earn money.
@ashraf_isb · 3 months ago
Thanks, man!
@Ar001-hb6qn · 3 months ago
I am unable to create a GCP Composer environment. After around 45 minutes it shows the error "Some of the GKE pods failed to become healthy". I have configured the settings and given the necessary access. I am using composer-2.7.0-airflow-2.7.3, but it fails to create the environment. Can you please help with this? Thanks.
@heenachhabra2977 · 3 months ago
This is a single Dataflow pipeline, right? How is it different from a Cloud Composer-orchestrated one?
@YugarajTamang · 3 months ago
Hello bro, I have defined a schema in BigQuery, and I have a DataFrame without column names that has millions of rows. I am unable to upload that DataFrame to BigQuery. Can you make a video on that or help me out? Thanks.
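One hedged approach to the headerless-DataFrame question above (the column names here are placeholders for whatever schema is defined in BigQuery): assign the schema's column names before loading, in the same order as the table definition.

```python
import pandas as pd

# Rows arrive with no header; positions must match the BigQuery column order.
df = pd.DataFrame([[1, "alice"], [2, "bob"]])
df.columns = ["id", "name"]  # names copied from the BigQuery table schema

# With names attached, the frame can be loaded as usual (needs credentials):
# from google.cloud import bigquery
# bigquery.Client().load_table_from_dataframe(
#     df, "my_project.my_dataset.my_table"
# ).result()
```

For millions of rows, reading and loading in chunks (e.g. `pd.read_csv(..., chunksize=...)`) keeps memory bounded.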
@MonicaPatil-so3ml · 4 months ago
You explained how to recover an object from a bucket. Can you also explain, with a demo, whether it's possible to recover an entire bucket if it's deleted accidentally?
@cloudaianalytics6242 · 4 months ago
No buckets are deleted permanently. To avoid accidental deletion, we enforce bucket retention policies on the Cloud Storage bucket. Hope it helps.
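A sketch of enforcing such a retention policy with the `google-cloud-storage` client (the bucket name is a placeholder, and the client calls need credentials, so they are commented out):

```python
def retention_seconds(days: int) -> int:
    # Cloud Storage retention periods are specified in seconds.
    return days * 24 * 60 * 60

# from google.cloud import storage
# bucket = storage.Client().get_bucket("my-bucket")
# bucket.retention_period = retention_seconds(30)
# bucket.patch()  # objects now cannot be deleted for 30 days after upload
```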
@emmanuelihetu9848 · 4 months ago
Thank you so much.
@riyanshigupta950 · 4 months ago
Amazing content! Thanks.
@sagarsitap3540 · 4 months ago
What does your source data file look like in GCS? Can we make it streaming?
@venkatvlogs07 · 4 months ago
Too hurried; I'm not able to understand it, as you are switching tabs and doing everything without mentioning where you are writing the code. The course should be designed so that even a beginner can understand it. Please make a point-to-point explanation video so that everyone can understand it. Thanks in advance ❤
@ushasribhogaraju8895 · 4 months ago
Thanks for your videos; I find them helpful. I could get a message published by a Python script to Pub/Sub written to the data column of a BigQuery table simply by creating a subscription on the same topic that writes to BigQuery, without using Dataflow. Since Pub/Sub is schemaless, it receives whatever schema the Python script publishes. My question is: is there a way to update a BigQuery table using the same schema received in Pub/Sub?
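One way to keep a BigQuery-writing subscription aligned with the table, sketched with hypothetical column names (`name`, `score`): filter each payload down to the columns the table actually defines before inserting.

```python
import json

# Hypothetical BigQuery schema: name STRING, score INT64.
ALLOWED = {"name", "score"}

def to_bq_row(message_bytes: bytes) -> dict:
    """Decode a Pub/Sub payload and drop fields the table doesn't define."""
    record = json.loads(message_bytes.decode("utf-8"))
    return {k: v for k, v in record.items() if k in ALLOWED}

row = to_bq_row(b'{"name": "Ada", "score": 7, "extra": true}')
# row == {"name": "Ada", "score": 7}
```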
@NaveenPB-yg4vw · 4 months ago
Hi, can you please paste the Python code here?
@webnoxtechnicalsupport8005 · 4 months ago
When you run `ls` you have some files and folders, right? I only have this README-cloudshell.txt.
@ainvondegraff5233 · 4 months ago
Awesome explanation; I really wanted to know this. If I migrate the Control-M workload automation tool to GCP, how will I connect Control-M to Pub/Sub?
@AnjaneyuluPonnam · 5 months ago
Excellent job; you are doing great, sir!
@zzzmd11 · 5 months ago
Hi, thanks for the great, informative video. Can you explain the flow if the data source is a REST API? Can we configure Dataflow to extract from a REST API into BigQuery without involving Cloud Functions or Apache Beam scripts? Thanks a lot in advance.
@honeylokesh2340 · 5 months ago
How do I enroll in your training?
@ayseguldalgic · 5 months ago
Thanks a lot; I needed this tutorial very much.
@figh761 · 5 months ago
What is your background, sir? I am a data warehouse developer without much knowledge of data science. How do I learn data science, ML, and Vertex AI? Could you please share all the training documents?
@v5q211 · 5 months ago
Is there any syllabus? Because I don't think everything needs to be studied from the documentation.
@Rajdeep6452 · 5 months ago
Hey bro, thanks for the video. I have an ETL process running on a VM using Docker and Kafka, and the data gets stored in BigQuery as soon as I run the producer and consumer manually. I wanted to use Cloud Composer to automate this (so the ETL process starts automatically whenever I log in to my VM), but I couldn't. Can you tell me if it's possible to do this with Dataflow? I am having trouble setting it up.
@nilawarsakshi2431 · 5 months ago
Can you provide the coupon you mentioned at the end of the video? I am interested in doing the GCP Professional Data Engineer certification.
@varshasony2352 · 5 months ago
You mentioned that it can be completed in 10 days, but the description says 6 months of hands-on experience. Can you explain this, please? I am working to complete the course in a month's time.
@cloudaianalytics6242 · 5 months ago
It's an expectation set by the Snowflake team, but if you have already worked on other DW services like BigQuery, Synapse Analytics, or Redshift, it will be very easy to pick up Snowflake and clear it on your very first attempt.
@PujaKiPyaariDuniya · 4 months ago
@@cloudaianalytics6242 I have worked on T-SQL and have no prior working knowledge of Snowflake. I am learning on my own from the Snowflake website's learning track. Will I be able to crack the exam in one month?
@SK-rl3wu · 3 months ago
@@cloudaianalytics6242 Is 6 months of hands-on experience required to attempt this test? I have knowledge of data warehouses, but I have not worked on DW services like BigQuery, Synapse Analytics, or Redshift.
@pournimaambikar5857 · 6 months ago
I am getting the below error while trying to run a Dataflow job: `import apache_beam as beam` fails with ModuleNotFoundError: No module named 'apache_beam', on both the Cloud SDK and Cloud Shell, whereas apache_beam is installed.
@RajDas-uy2ro · 6 months ago
pip install apache-beam[gcp]
@cloudaianalytics6242 · 5 months ago
pip install apache-beam[gcp], or try creating a virtual environment in Cloud Shell, installing Apache Beam there, and running your Dataflow jobs from it.
@anurak1166 · 6 months ago
Awesome. Can you provide a URL for the source code, please?
@Alfred_vinci · 6 months ago
This is really good for me. Can I get your email so I can mail you if I need help with something?
@cloudaianalytics6242 · 5 months ago
@sulemanshaikh731 · 6 months ago
Very informative. I appreciate your efforts.
@cloudaianalytics6242 · 5 months ago
Thanks a lot.
@brjkumar · 6 months ago
Thanks, bro, nice info.
@brjkumar · 6 months ago
Thanks for the BigLake explanation.
@archanajain99 · 6 months ago
Hi, I need to create a GCP Dataflow pipeline using Java. The pipeline should take a file in a GCS bucket as input and write the data into Bigtable. How do I create it? Please help.
@cloudaianalytics6242 · 6 months ago
You can use a predefined template to do it.
@archanajain99 · 6 months ago
@@cloudaianalytics6242 I mean, I didn't understand, and I'm not able to create a cloud account because it is asking for charges.
@archanajain99 · 6 months ago
@@cloudaianalytics6242 But I am still not able to create an account on Google Cloud; they are asking for charges.
@archanajain99 · 6 months ago
@@cloudaianalytics6242 I'm not able to enable the Bigtable data, where it is asking me to pay. You have also created that documentation; how can I create it?
@sanjaynayak2784 · 7 months ago
Is there any way we can update a side-input value in the main collection based on some matching key?
@sanjaynayak2784 · 7 months ago
How do I add a side-input column value to the main PCollection based on some lookup key?
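The lookup-key questions above map to Beam side inputs. A hedged sketch: the Beam wiring (which needs `apache-beam` installed) is shown in comments, the per-element join logic is plain Python, and all collection and field names are hypothetical.

```python
def enrich(record: dict, lookup: dict) -> dict:
    """Add a looked-up column to a main-collection record, keyed on 'id'."""
    out = dict(record)
    out["category"] = lookup.get(record["id"], "unknown")
    return out

# Beam wiring sketch (assumes `pip install apache-beam`):
# import apache_beam as beam
# with beam.Pipeline() as p:
#     lookup = p | "MakeLookup" >> beam.Create([("a", "fruit")])
#     main = p | "MakeMain" >> beam.Create([{"id": "a", "qty": 3}])
#     enriched = main | beam.Map(enrich, lookup=beam.pvalue.AsDict(lookup))

row = enrich({"id": "a", "qty": 3}, {"a": "fruit"})
```

`beam.pvalue.AsDict` materializes the side-input PCollection as a dict for every element of the main PCollection, which is what the key-based lookup needs.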