Thank you, it helped me debug another issue I am trying to solve.
@bhawnabedi9627 4 years ago
You are really my favourite YouTuber.
@WafaStudies 4 years ago
Thank you 🙂
@nithishkumar7262 4 years ago
Appreciate your work. Keep it up 👍
@WafaStudies 4 years ago
Thank you, Nitish Raju 🙂
@avaniheriss4588 4 years ago
Thank you for all the hard work. Please show how to store this JSON output in Blob Storage. Thank you.
@kelechie6836 3 years ago
Thank you Wafa!
@WafaStudies 3 years ago
Welcome 😀
@mohammadyunus9020 3 years ago
wonderful ❤️
@WafaStudies 3 years ago
Thank you 😊
@BillSeipel 2 years ago
The BEST
@WafaStudies 2 years ago
Thank you ☺️
@baladenmark 4 years ago
Fantastic tutorial. Thank you. How do I write the failed output of a copy activity into a file?
@JL-qc5gq 2 years ago
I want to pass the JSON output of the Web activity into a Databricks activity. I need the whole JSON output, since Databricks will process the whole output. How do we do that in ADF?
@abhinavprakash5383 2 years ago
Great video!
@WafaStudies 2 years ago
Thank you 😊
@rohithnamani9614 3 years ago
Thanks for the video. I have a use case: I'm trying to load JSON (simple format) from an API into a single column (data type VARIANT) in a SQL table through a Copy activity in ADF. When I try that, the keys get split and a column-count mismatch error appears. Is there any solution for this use case?
@diveshs 4 years ago
Very good content, nice videos. I am looking for more such videos. How can we log pipeline errors to a blob storage file?
@BlueTik 2 years ago
Excellent
@WafaStudies 2 years ago
Thank you ☺️
@aliaksandr2336 1 year ago
How do I parse without a ForEach activity? I run Get Metadata inside the ForEach, and Get Metadata always returns one file.
@hrpproductions53 2 years ago
But how do you handle it if it's an object instead of an array?
@ts4175 4 months ago
Can you make a video on the Copy Data activity? Copying JSON from one activity to another, please.
@dharmveerchaudhary7060 3 years ago
Please give any idea how to kill an infinite loop created inside the Until activity in Azure Data Factory.
@WafaStudies 3 years ago
The Until loop will break when the defined condition becomes true. If you are talking about manually killing it, then cancel the pipeline execution.
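For reference, this is roughly how an Until activity's exit condition looks in pipeline JSON. A minimal sketch, assuming a boolean pipeline variable named `Done` that some inner activity eventually sets to true; the activity names and the one-hour timeout are illustrative, not from the video:

```json
{
  "name": "UntilDone",
  "type": "Until",
  "typeProperties": {
    "expression": {
      "value": "@equals(variables('Done'), true)",
      "type": "Expression"
    },
    "timeout": "0.01:00:00",
    "activities": [
      {
        "name": "Wait30s",
        "type": "Wait",
        "typeProperties": { "waitTimeInSeconds": 30 }
      }
    ]
  }
}
```

The `timeout` property (d.hh:mm:ss format) is the safety net here: even if the condition never becomes true, the loop stops after the given duration, so an "infinite" loop cannot run forever.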
@rahulparmar208 2 years ago
The childItems array that you get from the Get Metadata activity... I want to store all the children data in MongoDB. How can I do that?
@srinivaselluri9267 3 years ago
Thanks a lot, brother, for your tutorials. I have a question: how do I handle special characters in JSON (source) field names? The pipeline is failing with an error saying the input JSON format has special characters, and I am calling the JSON data using an API (GET). Please advise. E.g. {"customer (name) " : "abc"}
@Rafian1924 4 years ago
Hello techies, I am generating JSON using Teradata SQL code, and I want to send this data to a web application using an API call. How should I go about it? I am completely new to ADF and have just started exploring it, so any help will be of great value.
@harikakiran18 4 years ago
Good video!
@surendrasuri2513 3 years ago
I am unable to read the Web activity JSON response; I'm getting "property selection not supported on values of type String".
@sivasudarsan2373 3 years ago
Hi, I want ADF to extract the content of a text file (from storage) and, based on the value, trigger another pipeline.
@WafaStudies 3 years ago
Use the Execute Pipeline activity to trigger another pipeline.
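A minimal sketch of that pattern in pipeline JSON, assuming a Lookup activity named `LookupFileContent` has already read the file and the relevant value surfaces as `firstRow.Prop_0`; all names and the 'RUN' value are illustrative, not from the video:

```json
{
  "name": "IfValueMatches",
  "type": "IfCondition",
  "dependsOn": [
    { "activity": "LookupFileContent", "dependencyConditions": [ "Succeeded" ] }
  ],
  "typeProperties": {
    "expression": {
      "value": "@equals(activity('LookupFileContent').output.firstRow.Prop_0, 'RUN')",
      "type": "Expression"
    },
    "ifTrueActivities": [
      {
        "name": "RunChildPipeline",
        "type": "ExecutePipeline",
        "typeProperties": {
          "pipeline": { "referenceName": "ChildPipeline", "type": "PipelineReference" },
          "waitOnCompletion": true
        }
      }
    ]
  }
}
```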
@realcontent2008 5 months ago
I have stored all expressions in one config table to make everything dynamic. Here you are fetching only one type of variable, but I have thousands of types of variables. How do I fetch each of them into a pipeline variable? Does anybody have that logic? I don't want to use Filter activity and Set Variable multiple times.
@zfold4702 1 year ago
@WafaStudies, are you available to do one ADF POC?
@premgcp 2 years ago
Hello guru gaaru, how do I loop through this JSON for ingestion? I tried using the Lookup activity but am unable to access the child items. Can you please help, anna? I want to use a Copy activity and pass the items below, inside tablelist, to ingest data.
{
  "tablelist": [
    { "tblname": "rbf", "secretkey": "xyz", "container": "migration", "cpath": "dev01" },
    { "tblname": "abcd", "secretkey": "abcd", "container": "migration", "cpath": "dev02/" }
  ]
}
@आरंभ-भ5ल 4 years ago
Please add a video on upsert in ADF.
@WafaStudies 4 years ago
Sure
@babluprajapat794 3 years ago
I have only one JSON file in Azure Blob Storage, and it contains millions of JSON objects. My question is: how do I read that file and extract and upload millions of individual JSON documents instead of one? Any idea?
@sathishkumark8386 4 years ago
Thanks
@bhawnabedi9627 4 years ago
Please do tell us about yourself as well; we would love to get to know you.
@WafaStudies 4 years ago
Sure, I will plan for it.
@michelepotenzateixeira3209 4 years ago
The videos are very good, but as I'm from Brazil and my English isn't good, I need subtitles. Would that be possible? Thanks.
@WafaStudies 4 years ago
Thank you. I will try to add subtitles.
@michelealves692 4 years ago
Need captions...
@venkatramana2667 3 years ago
Hi bro, I have 100 tables. When I create datasets, is it mandatory to create 100 datasets? Also, I want to talk with you personally; please provide your mobile number.
@sibagayatri 3 years ago
Do you want to copy data from 100 tables or do something else?