Incremental Data Loading from an ERP System into Azure Data Lake Using Azure Data Factory

  13,579 views

Muditha SQLBI

Created By: Muditha Pelpola
LinkedIn: / muditha-pelpola
This is my 5th YouTube video, and this time I decided to cover a Data Engineering topic.
In this session I show how to push incremental data into Azure Data Lake using Azure Data Factory.
I chose an ERP system (Infor M3) that I have worked with for more than 5 years. The business problem was to push M3 ERP tables into the data lake
with a folder structure of "M3 File Name / Year / Month / Date" (based on the last modified date of the M3 records). The challenge was to create
this folder structure automatically, driven by the M3 last modified date, without hardcoding values in Data Factory.
I solved it using Data Factory expressions, and this video shares that experience with the data community.
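The dynamic folder structure described above can be sketched in plain Python. In ADF itself this would be a dynamic expression built from `formatDateTime` and `concat`; the table name `MITMAS` and the exact expression shown in the comment are illustrative assumptions, not taken from the video:

```python
from datetime import datetime

def build_lake_path(table_name: str, last_modified: datetime) -> str:
    """Build the lake folder path <table>/<yyyy>/<MM>/<dd>, mirroring an
    ADF dynamic expression along the lines of:
    @concat(item().TableName, '/', formatDateTime(item().LastModified, 'yyyy/MM/dd'))
    """
    return f"{table_name}/{last_modified:%Y/%m/%d}"

print(build_lake_path("MITMAS", datetime(2021, 3, 5)))  # MITMAS/2021/03/05
```

Because the path is derived from each record batch's last modified date, no year/month/day values need to be hardcoded in the pipeline.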

Comments: 12
@thenarathnegadamithchandan5226 · 4 years ago
Good one! Thanks for sharing. 👌
@damithchandanathenarathnag6408 · 4 years ago
Good one!
@vemarajulasya9585 · 3 years ago
Really good content, thanks for the video. Please upload more step-by-step videos on loading data from a particular source into a data warehouse!
@anujgupta8686 · 4 years ago
Please upload more videos on Azure Data Factory incremental loading using merge, and on how to use dynamic expressions.
@khana04 · 3 years ago
What if I have multiple tables under my Copy activity using a ForEach loop, and each table has a different column for identifying the delta? Is that possible, or does it have to be one table copy at a time?
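A common pattern for the multi-table scenario asked about here (a sketch under assumed table and column names, not something shown in the video) is a metadata-driven control list: each ForEach iteration looks up that table's own delta column and watermark, so tables with different change-tracking columns can share one pipeline:

```python
from datetime import datetime

# Hypothetical control metadata: one entry per source table,
# each with its own delta (watermark) column and last-loaded value.
control = [
    {"table": "MITMAS", "delta_column": "LMDT", "watermark": datetime(2021, 1, 1)},
    {"table": "OOHEAD", "delta_column": "ChangedAt", "watermark": datetime(2021, 2, 1)},
]

def incremental_query(entry: dict) -> str:
    """Build the per-table delta query that one ForEach iteration would run."""
    return (
        f"SELECT * FROM {entry['table']} "
        f"WHERE {entry['delta_column']} > '{entry['watermark']:%Y-%m-%d}'"
    )

for entry in control:
    print(incremental_query(entry))
```

In ADF the control list would typically live in a Lookup-activity source (e.g. a small SQL control table), with the query built via a dynamic expression inside the ForEach.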
@dileepkanumuri · 3 years ago
Nice video! I have a doubt: will ADF handle incremental changes on the source DB, such as deletes and updates to records?
@anthonberg3274 · 3 years ago
Great video! Is the code for the demo M3 database something you would be willing to share? Thanks!
@anujgupta8686 · 4 years ago
If daily data arrives in my Blob Storage and I need to insert it into my SQL tables, how do I do that automatically? I have 10 CSVs, and each CSV contains the details for one table, i.e. 10 CSVs and 10 tables in my Azure SQL DB that need to be inserted.
@learnnrelearn7553 · 3 years ago
You can create an Azure Function that is triggered when your storage receives data. The Azure Function invokes the Azure Data Factory pipeline, which can perform the delta load and archive the CSV file. Hope this answer helps; please let me know.
@gauravmodi1061 · 4 years ago
If I want all the incremental data in a single file: say I have done a full load of a table "users", and now I want the incremental data written into that same file rather than into separate date-wise files. Is that possible?
@learnnrelearn7553 · 3 years ago
Yes, it is possible by choosing a static file as the sink and merging the data into it using 'Union'. Please let me know if this helps.
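The "static file as sink, merge via union" approach described in this reply can be sketched in plain Python (file I/O omitted; row shapes and the `id` key are illustrative assumptions). On a key collision, the newer delta row replaces the previously loaded row:

```python
def merge_incremental(existing: list[dict], delta: list[dict], key: str = "id") -> list[dict]:
    """Union existing rows with delta rows; when a key appears in both,
    the delta row wins. Mirrors merging an incremental extract into a
    single static sink file instead of writing date-wise files."""
    merged = {row[key]: row for row in existing}
    merged.update({row[key]: row for row in delta})
    return sorted(merged.values(), key=lambda r: r[key])

full_load = [{"id": 1, "name": "Ann"}, {"id": 2, "name": "Bob"}]
delta_load = [{"id": 2, "name": "Bobby"}, {"id": 3, "name": "Cara"}]
print(merge_incremental(full_load, delta_load))
```

In an ADF Mapping Data Flow the same effect comes from a Union transformation (plus a de-duplication step such as a window/aggregate keyed on the business key) before writing to the single sink file.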
@Vigneshvaraa · 3 years ago
Hi Gaurav Modi, I am looking for exactly this requirement. Is it possible to load the updated incremental changes into the same file?