Proud to see one among us (software engineers) achieving their goals through dedication and commitment. Congratulations on everything you have achieved.
@Raju__p-v4d 9 months ago
I have been looking for this content for a long time. Thank God I found this, very useful.
@DataFinancialMarkets 8 months ago
You're the best, brother. The information was really helpful, I appreciate it a lot. Greetings from Argentina, Buenos Aires.
@techtrapture 8 months ago
Thank you brother ❤️🔥
@amritapattnaik3345 10 months ago
I loved all your videos. Keep posting 😇🙂🙃
@techtrapture 10 months ago
Thanks
@hunterajones 9 months ago
Did the schema originally fail because the header row wouldn't parse as an integer? With the header removed, the original schema would work, right? Also, is there a way to automate header row removal? I need to auto-load a CSV like this daily, but it will always have a header row that needs removing. Thanks for the video!!
@guptajipriyank 2 months ago
Same question about header removal... I need to add data daily.
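One way to automate this, assuming the video's setup with the Google-provided "Text Files on Cloud Storage to BigQuery" template: filter the header line inside the UDF itself, since Google's template UDF docs state an element is dropped when the function returns undefined. A minimal sketch; the column names are placeholders for whatever your file contains:

    // udf.js -- sketch; adjust column names to match your schema JSON
    function transform(line) {
      // Drop the header row: returning undefined filters the element out.
      if (line.indexOf('id,') === 0) {
        return undefined;
      }
      var values = line.split(',');
      return JSON.stringify({
        id: values[0],
        name: values[1],
        salary: values[2]
      });
    }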
@sampyedits3540 2 months ago
Successfully completed this project! Thanks!
@techtrapture 2 months ago
Cheers🎊
@faroozrimaaz7092 a year ago
Your videos are informative... keep going
@noolusireesha205 4 months ago
Sir, I have followed the same process you showed in the video, but I'm getting the error "java.lang.RuntimeException: Failed to serialize json to table row". Could you please reply with a solution?
@vignesh004 3 months ago
I'm getting the same error too.
@nitinhbk 8 months ago
Could you please let me know what cost GCP showed for this activity?
@mayankmisra7064 21 days ago
I am facing some issues when I run the job. Can you please suggest some solutions for the error I am posting here? Error: org.apache.beam.sdk.util.UserCodeException: java.lang.RuntimeException: Failed to serialize json to table row: function transform(line)
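For anyone hitting this: the template throws "Failed to serialize json to table row" when the JSON string returned by transform() can't be mapped onto the table, typically because the header row is being parsed as data, a row has the wrong number of columns, or the JSON keys don't exactly match the field names in the schema file. A defensive sketch (column names are placeholders):

    function transform(line) {
      var values = line.split(',');
      // Skip the header and malformed rows instead of failing the job.
      if (values.length !== 3 || values[0] === 'id') {
        return undefined;
      }
      // Keys must exactly match the "name" entries in the schema JSON.
      return JSON.stringify({ id: values[0], name: values[1], salary: values[2] });
    }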
@python_code08 6 months ago
Can we add this project to a resume as a mini-project?
@arerahul 11 months ago
Insightful video. Just a question: can't we write the data load job in a Cloud Function rather than using Dataflow? Also, how do we create a delete job, so that the data is deleted whenever the file is deleted from GCS?
@techtrapture 11 months ago
Yes, we can write everything in Python and put it in a Cloud Function or Composer. On the second question: you need to add something that identifies which data was loaded by your file, so your code can delete only that data.
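Sketching out that suggestion: tag every row with its source file at load time (e.g. a source_file column), then have a second Cloud Function, triggered by the GCS object-delete event, remove those rows. All project, table, and column names below are illustrative:

    # main.py -- sketch of a delete-on-file-delete Cloud Function
    from google.cloud import bigquery

    client = bigquery.Client()

    def on_file_deleted(event, context):
        """Triggered by a google.storage.object.delete event."""
        file_name = event["name"]
        query = """
            DELETE FROM `my_project.my_dataset.my_table`
            WHERE source_file = @file_name
        """
        job_config = bigquery.QueryJobConfig(
            query_parameters=[
                bigquery.ScalarQueryParameter("file_name", "STRING", file_name)
            ]
        )
        client.query(query, job_config=job_config).result()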
@sampyedits3540 2 months ago
I got asked something about the CSV format while creating the Dataflow job; I wrote "default", but now there's no data in my table.
@sampyedits3540 2 months ago
Never mind, it's working now.
@vinnakollurakesh8481 11 months ago
Hi sir, can you help me pull data from the Kinaxis RapidResponse API to GCS? Any related documentation or videos would be helpful, thanks.
@nitinhbk 8 months ago
Thank you. Really helpful session.
@subhashs5275 7 months ago
Which location was used as the template path in the Python file?
@GURUSINGH-d1c 10 months ago
Very good video. Where can I get more Cloud Function templates?
@srikarfarmacy 6 months ago
Thank you for the video. I have one doubt: if my CSV file has a header, do I still need the JSON schema file?
@techtrapture 6 months ago
Yes, the Dataflow job asks for the JSON schema file as a mandatory input.
@srikarfarmacy 6 months ago
@@techtrapture Thank you for your prompt response. Could you suggest a solution for this? Every day my bucket automatically receives uploads of data that contain headers, organized by date.
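For reference, the schema file the classic "Text Files on Cloud Storage to BigQuery" template expects looks roughly like the sketch below (field names and types are placeholders and must match what the UDF returns); the header rows themselves can be dropped in the UDF, as discussed above:

    {
      "BigQuery Schema": [
        { "name": "id",     "type": "INTEGER" },
        { "name": "name",   "type": "STRING"  },
        { "name": "salary", "type": "FLOAT"   }
      ]
    }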
@zzzmd11 9 months ago
Hi, thanks for the great, informative video. Can you explain the flow when the data source is a REST API? Can Dataflow be configured to extract from a REST API into BigQuery without involving Cloud Functions or Apache Beam scripts? Thanks a lot in advance.
@mulshiwaters5312 6 months ago
This is exactly what I need, except instead of a trigger I would like to use a scheduler with a set interval, like daily or weekly. How can I achieve this? Cloud Composer? Workflows? Cloud Scheduler?
@techtrapture 6 months ago
In Cloud Scheduler you can use a cron expression to specify the date and time at which the job should be triggered.
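For example, a Cloud Scheduler HTTP job can launch the classic template directly against the Dataflow REST API. The sketch below assumes a daily 6 AM run; all project, bucket, and service-account names are placeholders, and the flag and parameter names should be verified against the current docs:

    gcloud scheduler jobs create http daily-csv-load \
      --schedule="0 6 * * *" \
      --time-zone="Asia/Kolkata" \
      --http-method=POST \
      --uri="https://dataflow.googleapis.com/v1b3/projects/MY_PROJECT/locations/us-central1/templates:launch?gcsPath=gs://dataflow-templates/latest/GCS_Text_to_BigQuery" \
      --oauth-service-account-email="scheduler-sa@MY_PROJECT.iam.gserviceaccount.com" \
      --message-body='{"jobName": "daily-csv-load", "parameters": {"inputFilePattern": "gs://my-bucket/input/*.csv", "JSONPath": "gs://my-bucket/schema.json", "javascriptTextTransformGcsPath": "gs://my-bucket/udf.js", "javascriptTextTransformFunctionName": "transform", "outputTable": "my_project:my_dataset.my_table", "bigQueryLoadingTemporaryDirectory": "gs://my-bucket/tmp"}}'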
@mulshiwaters5312 6 months ago
@@techtrapture Thanks, appreciate your help on this!
@earthlydope 7 months ago
There's a catch here: we need to create the BigQuery table schema and the UDF.js file every time before uploading a new flat file into the system.
@pramodasarath6733 7 months ago
Do we have to select a CSV file for loading from Storage to BigQuery, or a text file?
@techtrapture 7 months ago
Yes, a CSV file.
@ayush10_08 a year ago
Hello sir, I have watched a lot of your videos about Cloud Functions and Dataflow. I have one question: as a GCP data engineer, who is responsible for writing the code for Dataflow or Data Fusion?
@techtrapture a year ago
Data Fusion is a code-free ETL tool. But in general, a data engineer is responsible for writing all the code for the data pipeline.
@ayush10_08 a year ago
@@techtrapture Meaning that knowing only data-related services is not enough; we have to learn coding?
@techtrapture a year ago
@@ayush10_08 Yes, for a data engineer role we need coding.
@Makkar-b3v 5 months ago
You could do away with Dataflow here. A simple Python job using load_table_from_uri with schema auto-detect enabled, run from the trigger function, would do the work.
@techtrapture 5 months ago
Yes, a single Python job would definitely work. This project is for learning the different services in GCP.
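A minimal sketch of that alternative, assuming a Cloud Function triggered on object finalize (project, dataset, and table names are placeholders); note that skip_leading_rows also takes care of the header-row question raised above:

    # main.py -- sketch: load a new GCS CSV into BigQuery without Dataflow
    from google.cloud import bigquery

    def load_csv_to_bq(event, context):
        """Triggered by a google.storage.object.finalize event."""
        uri = f"gs://{event['bucket']}/{event['name']}"
        client = bigquery.Client()
        job_config = bigquery.LoadJobConfig(
            source_format=bigquery.SourceFormat.CSV,
            autodetect=True,        # infer the schema from the file
            skip_leading_rows=1,    # drop the header row, no UDF needed
            write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
        )
        client.load_table_from_uri(
            uri, "my_project.my_dataset.my_table", job_config=job_config
        ).result()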
@SnehaNitishGCPAC a year ago
I am not able to find the source code on GitHub. Would you please share the direct link?
@techtrapture a year ago
Here is the source code: github.com/vishal-bulbule/automate-gcs-to-bq
@swarnavo9 2 months ago
Where is the code, buddy? Couldn't find it on your GitHub :(