Fantastic content. I combined some of your videos to do a fairly complex task and I'm so happy it worked! Thanks heaps!
@jayong2370 2 years ago
Thank you! This is exactly what I was trying to configure.
@piraviperumal2544 2 years ago
Hi Brother, not sure whether the Azure team fixed it or not, but @replace(item().name,'.txt','') is working fine. I guess you missed the @ sign before the replace function in your attempt.
@bitips 1 year ago
Thanks for sharing this knowledge. It is fantastic!
@TechBrothersIT 1 year ago
Glad you liked it!
@devops840 2 years ago
Hi Sir, I am able to insert the data using dynamic CSV files. Could you please help me with upserting the data?
@ayushikankane530 5 months ago
If the CSV file has some columns with a JSON structure, how do I proceed?
@TheMadPiggy 3 years ago
Works like a charm; however, the auto-created tables' columns are all nvarchar(MAX). Not the best for database size, nor for usability. Any way around this?
@TechBrothersIT 3 years ago
I noticed that too; the data type is nvarchar(MAX). You might want to treat the auto-created tables as staging tables: create final tables with the correct data types and a stored procedure that loads data from these staging tables into your destination tables. If you have already created the tables with the correct data types, then you will be fine too.
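A rough T-SQL sketch of that staging-to-final pattern (the table, column, and procedure names here are only placeholders, not the ones from the video):

CREATE TABLE dbo.Emp_Final
(
    EmpId    INT          NOT NULL,
    EmpName  VARCHAR(100) NULL,
    HireDate DATE         NULL
);
GO

CREATE PROCEDURE dbo.LoadEmpFromStaging
AS
BEGIN
    -- cast the nvarchar(MAX) staging columns into the strongly typed final table
    INSERT INTO dbo.Emp_Final (EmpId, EmpName, HireDate)
    SELECT CAST(EmpId AS INT),
           CAST(EmpName AS VARCHAR(100)),
           CAST(HireDate AS DATE)
    FROM dbo.Emp_Staging;
END;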
@marcuslawson-composer2892 2 years ago
Very helpful video. Thank you!
@TechBrothersIT 2 years ago
You are welcome
@neftalimich 3 years ago
Thank you very much, it was really helpful.
@TechBrothersIT 3 years ago
Glad to hear that!
@insane2093 8 months ago
Small query, sir: once the table is created, if new data or new files arrive with a suffix change (like a date change), will it create a new table again or insert the data into the already created table, since you are using the auto-create option? Thank you in advance.
@viswaanand4578 2 years ago
Hi, I can see my CSV files in SSMS, but I cannot see them in table format; they still appear as CSV. Did I miss anything?
@purushothamnaidu5544 3 years ago
Sir, can you show once how to load the files available in a blob container into multiple existing tables in an Azure SQL database? That would be really helpful to me.
@ambatiprasanth4292 1 year ago
Brother, I was looking for the same... Did you figure out how to do it?
@rohitsethi5696 1 year ago
Hi, I'm Rohit. Can we use the Copy Data activity with CSV files? If not, why?
@williamtenhoven8405 1 year ago
Hi, thanks for this! One question: suppose I wanted to convert the CSV files to Parquet files, how would I proceed? I used the concat/replace approach, but the target Parquet files seem to be corrupted: "The file 'Emp1' may not render correctly as it contains an unrecognized extension." @concat(replace(item().name,'csv','parquet')) does not work either... Any suggestions? Thanks
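For reference, the sink file-name expression I would try next (an untested guess) keeps the dots so the extension is swapped cleanly:

@replace(item().name, '.csv', '.parquet')

I also suspect the sink dataset itself has to be a Parquet dataset rather than a delimited-text one, since renaming the file alone would not convert its content.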
@tomasoon 1 year ago
Great tutorial. I have a question: if I run the pipeline and there's a new CSV file in the bucket with the same schema as the others, will this method append the data to the existing table with the same schema, or will it create another one?
@uditbhargava8762 2 years ago
Sir, can we use the split() function to remove .txt?
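Something like the following is what I had in mind (an untested sketch, assuming the file name contains a single dot):

@split(item().name, '.')[0]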
@kiranreddy9103 2 years ago
Hi, if the file names are like emp1, emp2, emp3, etc., how can we write an expression to remove the numbers with REPLACE? Could you help us?
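The only idea I have so far is nesting one replace per digit, which feels clumsy (untested sketch, assuming only the digits 1-3 appear in the names):

@replace(replace(replace(item().name, '1', ''), '2', ''), '3', '')

or, if the prefix is always 'emp', taking a fixed-length prefix with @substring(item().name, 0, 3).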
@Eraldi2323 2 years ago
Hi TechBrothers, thanks for this very useful video. I had a question: I am trying to truncate the tables with the following, @{concat('truncate table',item().name)}, but it is not working for me and gives an error. Please advise. Thank you.
@niranjanchinnu8295 2 years ago
I tried this today as well. My implementation idea is to truncate and then insert into the tables. For that I truncated the table with TRUNCATE TABLE [SCHEMA_NAME].@{item().name}. After this step, if the table already exists it gets truncated. Otherwise, try pointing a fail-output line to the same block that your success output points to: if the table doesn't exist, the flow goes into the fail branch and executes it, and if the table is present it gets truncated and gives you the appropriate results.
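On the error in the original expression above, one possible cause (just a guess) is the missing space after 'truncate table', which would generate a statement like 'truncate tableEmp1'. A sketch with the space added:

@{concat('truncate table ', item().name)}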
@gebakjes1099 3 years ago
Thanks! Really helpful!
@TechBrothersIT 3 years ago
Glad it helped!
@boatengappiah2116 3 years ago
Great videos. I however don't see any video on SharePoint with ADF. Do you have a video or can you make one? Thank you
@TechBrothersIT 3 years ago
Hoping to have one soon. Working on many videos and scenarios. Thanks for the feedback.
@vishal-xf6ev 3 years ago
Hi Brother, great video & thanks for sharing :-)
@TechBrothersIT 3 years ago
My pleasure
@Deezubwun 2 years ago
Hi. This was a great help to me. One issue I am having is that the data fails to load due to multiple data type errors (such as String to DATETIME). As the data in the CSV is exported as strings, do you have a way of mapping the formatting of each problem field, bearing in mind the columns may be named something different?
@SRINIVASP-fx5kz 2 years ago
Excellent video, super!
@kirubababu7127 3 years ago
How to do this with an HTTP server?
@maartencastsbroad 2 years ago
Great video, exactly what I needed!
@harshanakularatna 3 years ago
You are awesome. Keep it up!
@thyagaraj1124 2 years ago
Is it possible to load different source files into existing tables in SQL Server when the source file names do not match the existing table names?
@TechBrothersIT 2 years ago
Hi, yes, that is possible, but you have to provide some type of source-to-destination mapping. If the file names are different, you can group them on the source side while the destination table stays the same.
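One possible way to hold that grouping (only a sketch; the names are placeholders) is a small control table the pipeline can look up to decide which destination table each file goes to:

CREATE TABLE dbo.FileToTableMap
(
    FileNamePattern VARCHAR(200) NOT NULL,  -- e.g. 'emp%.csv'
    TargetTable     VARCHAR(200) NOT NULL   -- e.g. 'dbo.Employee'
);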
@vijaysagar5984 2 years ago
Hi Bro, is there any workaround for CSV files that have multiple header rows, so we can merge them into one header? The source is FTP; some files are good and some files have multiple headers.
@TechBrothersIT 2 years ago
One of the ways could be to load the data without header information into a staging table, then remove the bad header rows and only use the clean data.
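A rough sketch of that cleanup step (the table and column names are placeholders; it assumes the staging table was loaded with no header row, so stray header lines repeat the column text in the first field):

DELETE FROM dbo.Emp_Staging
WHERE Column1 = 'EmpId';   -- drop any row that is really a repeated header line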