Sorry for my voice! I was sick when recording this episode, but I didn't want to leave you without any video this week :(
@AlexGonsales5 жыл бұрын
No need to apologize, nobody should complain about receiving top-notch content for free! Good job!
@AdamMarczakYT5 жыл бұрын
Thanks man!! :)
@jeffh5664 жыл бұрын
Don't apologize, you did the best explanation of tumbling windows on KZbin and in the MS docs, good job!
@joejoe5704 жыл бұрын
I would have complained even though I have no right to receive this high-quality content at my doorstep every week. I've just gotten spoiled :D
@amusam73253 жыл бұрын
You are so humble
@jeffh5665 жыл бұрын
The best videos on Azure Data Factory, even compared with some paid ones!
@AdamMarczakYT5 жыл бұрын
Boom! That's awesome to hear!
@dabay2005 жыл бұрын
Another superb video; Microsoft's own documentation is not clear, but these videos explain things way better.
@AdamMarczakYT5 жыл бұрын
Thanks Dinal! :)
@AlexGonsales5 жыл бұрын
I'm enjoying all your videos. What I like most is how you cut to the chase with good examples that are easy to follow and learn from. Awesome content, keep it up!
@AdamMarczakYT5 жыл бұрын
Thank you so much :)
@agnorpettersen Жыл бұрын
Super good teacher! I followed along with your first video and built a copy activity in a pipeline. Now I will try parameters and triggers. Very fun and inspiring!
@bharatruparel94245 жыл бұрын
Hello all, one thing that I ran into which might trip people up: before creating the event trigger, make sure that you 'Register' the Event Grid service. Otherwise, you will get an error like I did. The error message is quite clear though, so you should not have any problems even if you forget to do so.
@AdamMarczakYT5 жыл бұрын
I could have sworn I mentioned this during the video. Nonetheless, good tip Bharat. You can register Event Grid by going to Subscriptions > your subscription > Resource providers blade > search for EventGrid and hit Register. Thanks for letting everyone know, I appreciate it. :)
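For anyone who prefers scripting this over clicking through the portal, a minimal sketch of the same registration with the Python azure-mgmt-resource SDK might look like this (the subscription ID is a placeholder; assumes azure-identity and azure-mgmt-resource are installed and you are already signed in, e.g. via az login):

# Register the Microsoft.EventGrid resource provider on a subscription.
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

subscription_id = "<your-subscription-id>"  # placeholder
client = ResourceManagementClient(DefaultAzureCredential(), subscription_id)

client.providers.register("Microsoft.EventGrid")

# Registration is asynchronous; re-check until the state reads "Registered".
print(client.providers.get("Microsoft.EventGrid").registration_state)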
@seb63024 жыл бұрын
I also had this issue!
@AArora813 жыл бұрын
Perfect introduction! Easy to understand. I'm watching my 2nd video of your channel and already a big fan! Thanks for sharing.
@AdamMarczakYT3 жыл бұрын
Awesome, thank you!
@ttcorner7132 ай бұрын
I am a big fan of your Azure tutorials. Could you please make an Azure Kubernetes tutorial series? Your simple way of explaining things is just awesome.
@dixitmca3 жыл бұрын
Really appreciate all your efforts. After watching more than 10 of your videos I have successfully implemented a small ADF project in my organization, and it is because of your videos, so I'm really obliged, Adam... keep posting such nice videos. Thank you so much!
@94346675 жыл бұрын
All the videos are awesome. Demos are really helpful. Duration is also perfect. Please keep teaching us sir🙏
@AdamMarczakYT5 жыл бұрын
Will do, thank you :) Always nice to hear people enjoy my content!
@NitishKumar-ms8lq3 жыл бұрын
Hi Adam, wonderful video, and you have explained everything extremely well. This is the first video of yours I've watched, and now I have decided to watch all your videos (even on the topics I had already covered).
@AdamMarczakYT3 жыл бұрын
Awesome, thank you!
@_indrid_cold_4 жыл бұрын
Fantastic content; really the best there is out there. Thank you. I noticed that you can also trigger pipelines through Logic Apps now.
@AdamMarczakYT4 жыл бұрын
Thanks! :) Appreciated.
@HinesBrad3 жыл бұрын
Yet another excellent, well thought out video Adam. Well done.
@dheerajlenka26074 жыл бұрын
Best Video I have watched on ADF till date!
@AdamMarczakYT4 жыл бұрын
Loving this comment ;) Thanks.
@dheerajlenka26074 жыл бұрын
Adam Marczak - Azure for Everyone Btw, thanks mate for this video, I learnt from it and did a demo for my team today!!
@mpatel2114 жыл бұрын
Dude, you are the best!!! Please keep making more videos. I learned more from you than from Microsoft's own training videos.
@AdamMarczakYT4 жыл бұрын
I appreciate that! Thanks ;)
@ninatuttle65483 жыл бұрын
Hi Adam! Thank you for such great tutorials, very helpful and clear. Can you please explain why I have a problem signing in to my data factory in the 4th demo with the Logic App? At 18:59 in the video, Azure requires signing in to create a connection to ADF. I try to sign in with my admin name for the application, and it gives me the error message "Failed with error: 'The browser is closed.'. Please sign in again."
@AdamMarczakYT3 жыл бұрын
Hard to say. Maybe try doing everything in incognito mode in the browser. Thanks for watching :)
@johnfromireland75513 жыл бұрын
Also, try clearing your cache for, perhaps, the last month.
@ninatuttle65483 жыл бұрын
@@johnfromireland7551 thank you
@abhijeetzagade33493 жыл бұрын
Thanks a lot Adam for this video series
@AdamMarczakYT3 жыл бұрын
My pleasure!
@swativish5 жыл бұрын
Your videos are very helpful and informative. These are way better than the Microsoft Documentation...thanks
@AdamMarczakYT5 жыл бұрын
Thank you :) glad you enjoyed them!
@cmaz_77193 жыл бұрын
Hello and thanks for the content. I have a question about the trigger on event: is it possible to delay the trigger by 5 minutes after the event? In case I upload many files to the storage, I would like to wait until the last one is loaded. Many thanks
@monicawtavares3 жыл бұрын
good question Corrado, I have the same problem..
@adityarajora72193 жыл бұрын
Hi Adam. I got an error while publishing the pipeline for the storage event trigger: "Failed to get subscription status on storage events for event trigger". I'm using the 12-month free trial.
@AdamMarczakYT3 жыл бұрын
Check whether you have the EventGrid resource provider registered on your subscription: docs.microsoft.com/en-us/azure/azure-resource-manager/management/resource-providers-and-types?WT.mc_id=AZ-MVP-5003556
@adityarajora72193 жыл бұрын
@@AdamMarczakYT Thank you so much, Adam.
@mustafakamal59453 жыл бұрын
Thanks again for the wonderful video! While testing event triggers I ran into the error saying EventGrid is not registered, but when I checked my subscription I can see it is registered. What could be the problem? Please guide.
@krishnamoorthyramachandran44422 жыл бұрын
Is there any way to adjust the tumbling window trigger according to US daylight saving time?
@monicawtavares3 жыл бұрын
Very nice video! Thanks! I have one question: is it possible to use an event-based trigger to run only Monday to Saturday? I don't want it to run on Sundays. Thanks!
@henrychoi38098 ай бұрын
Thank you very much for the video Adam. I have a pipeline that combines multiple CSV files into one CSV file. I added the event trigger for when the files are uploaded to the blob storage. However, it fires the trigger multiple times. What do I need to do to make the trigger fire only once? Thank you very much in advance.
@Extream9174 жыл бұрын
How do you handle multiple files in event-based triggers? I.e. in your demo example, if you have files called demo1.csv, demo2.csv, ... how does it work with an event-based trigger?
@AdamMarczakYT4 жыл бұрын
Those will raise two separate events and create two separate Logic App runs. So if you need custom logic based on your files, then I guess Azure Logic Apps is the best way to implement that, but it really depends. There are quite a few patterns around this written up on blogs.
@Thedavidk11 ай бұрын
Don't forget to register your Event Grid. I just lost a day and a half wondering why I could not get the data flows to work 😞
@AHMEDALDAFAAE13 жыл бұрын
WOW! Such an amazing video
@AdamMarczakYT3 жыл бұрын
Thank you so much!
@Mgiusto2 жыл бұрын
Adam, do you have any videos that show updating Azure Table Storage within a pipeline? I have a Logic App where I move a file from an FTP location to Blob Storage, and during that I also write a record to Table Storage with FileName, Date, and Status columns. At the end of my Logic App I create a pipeline run to import the file contents from Blob Storage into an Azure database table. After this I would like to update the Status column in Table Storage, matching on FileName, to mark that the file has been processed. Was hoping you have something that covers this.
@arulmouzhiezhilarasan85184 жыл бұрын
Bellissimo! Thanks Adam! Have practiced and tried all triggers!
@AdamMarczakYT4 жыл бұрын
Wonderful! Thanks for stopping by!
@nayeemuddinmoinuddin21864 жыл бұрын
Hi Adam, I have a requirement to run a pipeline at 2 PM Central Time (US) daily, but scheduling is done in UTC, which is 7 PM or 8 PM UTC depending on daylight saving. The requirement is to make it dynamic so that I don't need to manually change the schedule in production.
@AdamMarczakYT4 жыл бұрын
Hey! Currently only UTC is supported (doc ref: docs.microsoft.com/en-us/azure/data-factory/how-to-create-schedule-trigger#schema-overview ). You can use Logic Apps instead; the schedule trigger there accepts a time zone parameter.
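For reference, a daily schedule trigger definition pinned to UTC looks roughly like the sketch below (trigger, pipeline and times are placeholders). Because the run time is fixed in UTC, 2 PM Central drifts between 19:00 and 20:00 UTC across daylight saving changes, which is exactly the problem described above:

{
  "name": "DailyTrigger",
  "properties": {
    "type": "ScheduleTrigger",
    "typeProperties": {
      "recurrence": {
        "frequency": "Day",
        "interval": 1,
        "startTime": "2021-01-01T19:00:00Z",
        "timeZone": "UTC"
      }
    },
    "pipelines": [
      { "pipelineReference": { "referenceName": "MyPipeline", "type": "PipelineReference" } }
    ]
  }
}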
@nayeemuddinmoinuddin21864 жыл бұрын
@@AdamMarczakYT - Thanks a lot for quick response.
@vigneshnatarajan5005 Жыл бұрын
Hi Sir, with schedule triggers, what if I need to run Monday to Friday @ 6 AM, Saturday @ 8 AM and Sunday @ 4 PM for a single pipeline? Do I need to create 3 separate triggers and attach them to the single pipeline, or can I have 3 schedules in the trigger JSON code?
@bhanugavini15444 жыл бұрын
Hi Adam, it was a very good intro to triggers in ADF. What if I want to schedule a trigger only on selected dates in a year... let's say I want to schedule a trigger to run on specified dates in Feb, March, June, July, Oct and Nov... how can we achieve that?
@AdamMarczakYT4 жыл бұрын
No out-of-the-box feature handles this. You can make a simple Logic App on a daily schedule, with a small condition checking whether the current day is on the list of scheduled run dates. If it is, then trigger ADF.
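The same idea sketched in Python rather than a Logic App (for example inside a daily timer job); the dates, resource names and pipeline name below are made-up placeholders, and the call at the end is the documented Pipelines - Create Run REST endpoint:

# Run the pipeline only when today is on an explicit list of dates.
from datetime import date
import requests
from azure.identity import DefaultAzureCredential

RUN_DATES = {date(2021, 2, 15), date(2021, 6, 30), date(2021, 11, 10)}  # hypothetical run dates

if date.today() in RUN_DATES:
    sub, rg, factory, pipeline = "<sub-id>", "<resource-group>", "<factory>", "<pipeline>"
    token = DefaultAzureCredential().get_token("https://management.azure.com/.default").token
    url = (f"https://management.azure.com/subscriptions/{sub}/resourceGroups/{rg}"
           f"/providers/Microsoft.DataFactory/factories/{factory}"
           f"/pipelines/{pipeline}/createRun?api-version=2018-06-01")
    run = requests.post(url, json={}, headers={"Authorization": f"Bearer {token}"})
    run.raise_for_status()
    print("Started pipeline run", run.json()["runId"])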
@varuntyagi2282 жыл бұрын
Can we parameterize the storage account name and container name while creating the storage event trigger? If we can't, then how do we switch the names for QA and production automatically?
@rafaeldejesusbarcelovergar32933 жыл бұрын
Really good video Adam, thanks for sharing. In my case, when I test with 10 files at the same time and look at the Monitor tab, ADF executes the pipeline sequentially. I have increased the pipeline's "Concurrency" property, but it didn't work. Any suggestions?
@AdamMarczakYT3 жыл бұрын
Thanks! That is weird; if you use an event trigger it should trigger multiple parallel flows.
@rafaeldejesusbarcelovergar32933 жыл бұрын
@@AdamMarczakYT thanks for answering. In my case, the problem was the number of files I used in the POC. I started with 10; later I decided to use 100, which is when I got parallel execution. I found it is the behaviour of the "Event Grid system topic" behind the scenes.
@sandrojorgeoliveira1752 жыл бұрын
Awesome, Mark! Thank you!
@venkatx55 жыл бұрын
What's the Event Grid part? Do you mean the triggers are processed by Event Grid behind the scenes, or did you create an Event Grid yourself?
@AdamMarczakYT5 жыл бұрын
Hey, Event Grid is behind the scenes. You don't provision it or configure it manually; it gets done when you create the trigger. You only need to ensure that the resource provider for Event Grid is registered at the subscription level.
@henrytigro4 жыл бұрын
Very nice explanatory video! Thanks, first of all! Question: is there a way to create a dynamic event-based trigger? Meaning the trigger would not be hooked up to a certain static path within a blob container but a dynamic one. For example, with datasets you can use parameters. Can we make a trigger look inside a container whose path is specified through a parameter? Or maybe can we put a regex in the path?
@AdamMarczakYT4 жыл бұрын
Thanks! I'm not sure what you're asking is possible, because triggers are evaluated without the context of the pipeline, as they are the ones that invoke the pipeline later on. But I personally never researched the topic that much. I usually trigger based on a proper filter and then call ADF. If I need more control, then I use Logic Apps which call Data Factory.
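One documented capability that partially covers this: the trigger's path filter itself stays static, but the blob that actually fired the event can be handed to the pipeline by mapping trigger metadata to pipeline parameters when the trigger is attached, for example (parameter names are placeholders):

sourceFolder: @triggerBody().folderPath
sourceFile:   @triggerBody().fileName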
@henrytigro4 жыл бұрын
@@AdamMarczakYT thanks for the answer! I understand what you mean, triggers shouldn't have the context of the pipeline. I was wondering if there was a way to parameterize the folder the trigger looks into. Just for reference, this is what I mean: stackoverflow.com/questions/62873941/using-parameters-to-locate-file-during-trigger-creation-in-azure-data-factory Thank you anyway for your answer here!
@thalesdefaria2 жыл бұрын
@@henrytigro Hey Enrico, I wonder if you managed to resolve this case. I'm looking at a similar case, where my path is based on a date (yyyy/mm/dd), so I need to feed that into the path to be triggered.
@saikatsengupta66754 жыл бұрын
Hi Adam, what if with the event-based trigger your file got copied but it's corrupted or incomplete? How would you handle such a scenario to ensure the pipeline is not triggered? Any suggestions?
@AdamMarczakYT4 жыл бұрын
Hey, files should not be corrupted after the load. Maybe you made a small mistake? Maybe you missed the extension? Files should always copy correctly. I've never seen issues like you describe.
@GG-uz8us4 жыл бұрын
Really good content. Thank you. I have a quick question: if my pipeline has a date parameter, how do I pass the current date from a scheduled trigger to the pipeline parameter?
@AdamMarczakYT4 жыл бұрын
Check out my ADF parametrization video and use an expression :)
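Concretely, when attaching the schedule trigger you can map its scheduled run time onto the pipeline's date parameter with an expression along these lines (the parameter name here is just a placeholder):

runDate: @formatDateTime(trigger().scheduledTime, 'yyyy-MM-dd')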
@AxL28AxL4 жыл бұрын
Is it possible to update trigger settings in CI/CD? Like in dev I want it every week, while in production I want it every day?
@AdamMarczakYT4 жыл бұрын
Yes, you can implement CI/CD for ADF and update any element as you wish: docs.microsoft.com/en-us/azure/data-factory/continuous-integration-deployment?WT.mc_id=AZ-MVP-5003556 Because ADF triggers are just objects, they can be updated via ARM templates: docs.microsoft.com/en-us/azure/templates/microsoft.datafactory/2018-06-01/factories/triggers?WT.mc_id=AZ-MVP-5003556
@AxL28AxL4 жыл бұрын
@@AdamMarczakYT but every time I run the publish, the ARM template changes.
@fazeelliaqat8193 Жыл бұрын
Can I set the trigger recurrence time from a DevOps library variable group?
@mahendramanchekar2 жыл бұрын
Thanks. I am learning a lot from you.
@ayyoubsghiourielidrissi28972 жыл бұрын
Is it possible to do the same with GitHub: copy files from GitHub to Azure Blob Storage?
@NeumsFor93 жыл бұрын
Is there a video on custom event triggers?
@AdamMarczakYT3 жыл бұрын
Not yet, it's still in preview, so I typically wait before I create a tutorial on those.
@srikkar4 жыл бұрын
Hi Adam, thanks for the tutorial. While creating parameters for the pipeline, I don't see the parameters section in the current Azure version. I'm trying to get a CSV from blob storage and store it in a SQL database using an event trigger.
@AdamMarczakYT4 жыл бұрын
Thanks for watching. Make sure to 'unclick' the selected action by clicking on the blank space of the pipeline so that the parameters appear below. If you have an action selected, then the panels shown are for that action.
@nicolemwanaidi14884 жыл бұрын
Did I just see you use the same linked service for input and output? I mean I know you said they can be shared but I thought of it differently
@AdamMarczakYT4 жыл бұрын
A linked service is just a connection definition. It can be used for both inputs and outputs if you have the right permissions.
@chen55764 жыл бұрын
Hi Adam, really good content as usual! May I ask how to handle NA values when upserting a blob CSV file to a SQL DB? I am using ADF for ETL, and some data frames contain missing values (in some extreme cases, columns are all NA). When I upsert new data frames to SQL it keeps throwing error messages: {errorCode": "2200", ErrorCode=UserErrorInvalidDataValue,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Column 'CHP_Kpa' contains an invalid value 'NA'.,} Also, could we pass parameterized content to an R script as input? For instance, I need to run an R script on each CSV file inside a blob container, and I am using the Batch service. Recalling your Lookup video on getting a list of CSV files in a container, could we use those lookup results as a targetFileList in my R code? If you have any ideas on this, it would be much appreciated!!!
@AdamMarczakYT4 жыл бұрын
You can use the checkbox in Data Factory to treat NA values as nulls. This should help. Otherwise you would need to fix your data model. You can also use Mapping Data Flows (I have a video on this) to fix the data before uploading. If you run the R script via Databricks, then yes, you can pass parameters to Databricks from Data Factory: docs.microsoft.com/bs-latn-ba/azure/data-factory/transform-data-using-databricks-notebook Thanks for tuning in!
@naveenshindhe28933 жыл бұрын
Really good video Adam, thanks for sharing. I've got one question: in the source we have given "select a1 from DEV_EDW.tablename" (DEV_EDW is for development). For test it should be "select a1 from QA_EDW.tablename". For this we will be creating a parameter; should we also use a trigger for this? Please reply.
@bmk-v7n4 жыл бұрын
Can Event Grid be used for files in Cosmos DB, meaning when the stream file is created in Cosmos, trigger the pipeline in ADF?
@AdamMarczakYT4 жыл бұрын
But Cosmos DB isn't a file hosting service. It's a NoSQL database; it does have attachment features, but that's not its main feature and it should not be used as such. Can you elaborate on your question?
@kainatkhan24594 жыл бұрын
Can we automatically trigger an event in a storage queue when the storage account is updated????
@AdamMarczakYT4 жыл бұрын
Yep. Use Event Grid to do this. Go to the blob storage account > Events on the left menu > new subscription > endpoint type Storage Queue. docs.microsoft.com/en-us/azure/event-grid/blob-event-quickstart-portal?WT.mc_id=AZ-MVP-5003556 — like here, just with a different endpoint type (queue instead of webhook).
@harmonyliu82394 жыл бұрын
Great videos! One question: so far all the data ingestion using ADF has been moving data within Azure. How do I move data from a public data source, in my case NOAA's public numerical weather forecast products, into Azure? The data will have to be saved on an hourly basis when new forecasts are available. Is ADF still the correct tool for this?
@AdamMarczakYT4 жыл бұрын
Thanks. I don't know NOAA, but assuming it's just some web API, you can try the HTTP connector as the data source for a copy activity. Reference: docs.microsoft.com/en-us/azure/data-factory/connector-http If this turns out to be too complex to achieve, my advice would be to use Logic Apps to pull the data to blob storage and then trigger ADF to process it.
@harmonyliu82394 жыл бұрын
@@AdamMarczakYT Thank you very much Adam! I will take a look at this. Looking forward to more videos on the different functions of the ADF!!!
@vikasgupta68884 жыл бұрын
Awesome, thanks for this session. Any suggestions for when I want to run a pipeline only after some other pipelines have finished? (Like pipeline C should start only after A and B are done.) All pipelines are separate/individual, so I can also run them individually on demand. How can we make an event-based trigger for this? Please suggest.
@AdamMarczakYT4 жыл бұрын
You can just make a pipeline that executes the other pipelines in the order you need. Use the Execute Pipeline activity.
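A rough sketch of what that parent pipeline's activities could look like, with placeholder pipeline names A, B and C; "Run C" waits for both A and B via its dependsOn conditions:

{
  "activities": [
    { "name": "Run A", "type": "ExecutePipeline",
      "typeProperties": { "pipeline": { "referenceName": "A", "type": "PipelineReference" }, "waitOnCompletion": true } },
    { "name": "Run B", "type": "ExecutePipeline",
      "typeProperties": { "pipeline": { "referenceName": "B", "type": "PipelineReference" }, "waitOnCompletion": true } },
    { "name": "Run C", "type": "ExecutePipeline",
      "dependsOn": [
        { "activity": "Run A", "dependencyConditions": [ "Succeeded" ] },
        { "activity": "Run B", "dependencyConditions": [ "Succeeded" ] }
      ],
      "typeProperties": { "pipeline": { "referenceName": "C", "type": "PipelineReference" }, "waitOnCompletion": true } }
  ]
}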
@KheireddineAzzez-l3g11 ай бұрын
Hi, how do I do that in the Power BI data lake?
@ChikeEmeka3 жыл бұрын
I receive a daily email with 15K records in a CSV from a contractor. Which trigger method should I use to pick up that file from the email? The first option is a Logic App, but I'm not sure how to develop one. What trigger method would you advise? I am thinking of Power Automate, but I'm not sure which approach will work with which trigger method. Thanks
@johnfromireland75513 жыл бұрын
Yes, you can use Power Automate to get the email attachments, as the email arrives, and save them to your OneDrive or SharePoint Library. No need for Logic Apps or ADF.
@anoopdkulkarni4 жыл бұрын
Hi Adam, your videos are very helpful, thanks. I have a question: I have an ADF pipeline which accepts parameters, so if I want to schedule a trigger which passes these parameters to the pipeline, how should I achieve this?
@AdamMarczakYT4 жыл бұрын
When you create a schedule (trigger) for a pipeline that has parameters defined, it will ask you to provide the parameter values. If you need to pass parameters on a schedule, maybe try invoking it from a Logic App. Thanks!
@anoopdkulkarni4 жыл бұрын
@@AdamMarczakYT thanks.. will try and get back. I tried editing the trigger JSON and adding them manually, but it does not trigger as scheduled. Let me know if you try this. Thanks much.
@SivanandhamP4 жыл бұрын
@Adam, thank you. This is a really great tutorial. I have a couple of questions: 1. Is there any way to hold the Logic App's Create a Pipeline Run call until the Data Factory pipeline operation completes? 2. Can we get some value back to the Logic App from the Data Factory pipeline call? Any links or sample reference docs would be very helpful. Thanks in advance.
@AdamMarczakYT4 жыл бұрын
Hey, thanks. For 1, you can use an Until loop with a Delay action and Get a Pipeline Run to check the status. Run that Until loop until the status changes from running. For 2, Get a Pipeline Run returns info about the pipeline run; I'm not sure if the message property contains what you need. If not, you can use the REST API for Data Factory, which can probably do more than the connector.
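The equivalent polling loop outside Logic Apps, sketched in Python against the documented Pipeline Runs - Get endpoint (resource names are placeholders, and run_id is assumed to come from an earlier Create Run call such as the one the Logic App makes):

import time
import requests
from azure.identity import DefaultAzureCredential

sub, rg, factory = "<sub-id>", "<resource-group>", "<factory>"
run_id = "<run-id-from-create-run>"  # placeholder
token = DefaultAzureCredential().get_token("https://management.azure.com/.default").token
url = (f"https://management.azure.com/subscriptions/{sub}/resourceGroups/{rg}"
       f"/providers/Microsoft.DataFactory/factories/{factory}"
       f"/pipelineRuns/{run_id}?api-version=2018-06-01")

# Poll until the run leaves the in-progress states, mirroring the Until loop above.
while True:
    status = requests.get(url, headers={"Authorization": f"Bearer {token}"}).json()["status"]
    if status not in ("Queued", "InProgress"):
        break
    time.sleep(30)
print("Pipeline run finished with status:", status)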
@SivanandhamP4 жыл бұрын
@@AdamMarczakYT thank you so much for your response. Could you please explain the last statement about the REST API for Data Factory? As far as I know, Data Factory can consume a REST API; I'm not sure how to expose one. My use case is: I have customer data in SAP HANA and I want to expose the data via a REST API. My current approach: #1. Connect the SAP HANA DB in Data Factory and create a pipeline to move the required rows to storage. #2. Invoke the pipeline from a Logic App; once the pipeline has executed, I'll read the data from storage. I'm sure this is not a good way. Please provide a solution suggestion for my use case above.
@AdamMarczakYT4 жыл бұрын
Here is the reference for the ADF REST API: docs.microsoft.com/en-us/rest/api/datafactory/v2 You can control ADF by calling any of those methods.
@SivanandhamP4 жыл бұрын
@@AdamMarczakYT Thanks for your quick response. I tried invoking the pipeline REST API (docs.microsoft.com/en-us/rest/api/datafactory/pipelines/createrun); it returns only the pipeline run ID, but I wanted to return a couple of DB rows as output.
@SivanandhamP4 жыл бұрын
@@AdamMarczakYT Is there a way to set parameter values dynamically? E.g. copy the data from a variable to the parameters.
@mihasedej4 жыл бұрын
Adam, thank you for your videos. Is there any recommendation on how to trigger a pipeline from local SSIS?
@AdamMarczakYT4 жыл бұрын
Thanks for watching. For ADF, use the REST API to trigger pipelines: docs.microsoft.com/en-us/rest/api/datafactory/pipelines/createrun
@sorontar14 жыл бұрын
Hi! Is it possible to trigger a pipeline when a message lands in a queue?
@AdamMarczakYT4 жыл бұрын
Yes, but not directly from Data Factory. Use Logic Apps, as it is an enterprise integration service designed for those kinds of scenarios.
@sorontar14 жыл бұрын
@@AdamMarczakYT Thanks! Moving on to the logic apps video then :)
@balanm85705 жыл бұрын
Hey Adam, this was an awesome video!!! Keep posting videos like this. I have a project requirement where I need to get documents (unstructured data like .pdf, .ppt, .docx, .xlsx, etc.) from a SharePoint Online site on an ongoing basis. Our solution was to use Logic Apps to get the SharePoint content. We have two scenarios for getting the documents from SharePoint: 1. First-time extract - where we get all the SharePoint documents into Azure Blob Storage. For this we planned to use a Logic App with an HTTP trigger that gets invoked from ADF as a one-time activity. 2. Incremental extract - where we need to get only the new documents and the documents whose content has been updated in SharePoint. Do you have any suggestions for implementing the 2nd scenario? Thanks for your support.
@AdamMarczakYT5 жыл бұрын
Hey, glad you enjoyed it. For the second scenario, just use the SharePoint "file added or modified (properties only)" trigger. It will trigger the Logic App every time there is a new or modified file on SharePoint.
@balanm85705 жыл бұрын
@@AdamMarczakYT Thanks Adam for the tips. I will try that and let you know if any help is needed. Again, thanks for your timely support.
@vasuthota61795 жыл бұрын
Does ADF support file-created event triggers in Data Lake Store?
@AdamMarczakYT5 жыл бұрын
Gen2 is supported, Gen1 is not. This is related to Event Grid, which doesn't work with Gen1.
@svdfxd5 жыл бұрын
Hi Adam, awesome video on Azure Data Factory triggers. I have a use case where I am copying data from Google BigQuery using ADF. Can we use an event-based trigger to start the ADF copy-data pipeline when a table is created in BigQuery? Is that possible? - Sam
@AdamMarczakYT5 жыл бұрын
Hey, thanks Sam! I'm no expert on BigQuery, but I know ADF doesn't support events outside of blob storage at this time. What you could do, on the other hand, is create a small Logic App and query BigQuery's REST API for this (cloud.google.com/bigquery/docs/reference/rest/), and whenever you see a new dataset, trigger the ADF pipeline with the name of the table. Also, if you just want to always copy all tables from BigQuery via ADF, you can use the Lookup activity to list the tables from BigQuery and just loop over them and copy them all. I don't know BigQuery thoroughly though, so I'm just throwing out ideas.
@svdfxd5 жыл бұрын
@@AdamMarczakYT Thanks Adam for your prompt response. Let me explore the Logic App. I tried the Lookup activity, but in my case a table gets created every day. Basically, I need to query the Google Analytics data from the ga_sessions_ table into a temp table on the BigQuery side and then copy that temp table to Azure.
@AdamMarczakYT5 жыл бұрын
I did exactly this in the past by writing a small Function App in Azure which consumed data daily from this reporting API: developers.google.com/analytics/devguides/reporting/core/v4/
@chaudhari11115 жыл бұрын
Simply wonderful work.. thanks..
@AdamMarczakYT5 жыл бұрын
Cheers mate!
@swativish4 жыл бұрын
Hi Adam, thanks for your videos, they are very useful. I have a simple copy activity pipeline for which I have set up an event-based trigger (when a blob is updated, the pipeline is executed). It works seamlessly when I upload a file into the blob container manually. But when I use a Logic App to extract the attachment and upload the file to blob storage, the copy activity fails with: Operation on target Copy data1 failed: ErrorCode=UserErrorInvalidColumnMappingColumnNotFound,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Column 'Employee_ID' specified in column mapping cannot be found in source data.,Source=Microsoft.DataTransfer.ClientLibrary,' The same CSV file returns no error when I upload it manually, but fails when it is uploaded via the Logic App.
@AdamMarczakYT4 жыл бұрын
Are you sure it's exactly the same? The error suggests your column mapping is incorrect. Also try removing all the mappings and creating them again; maybe you missed something. I've seen this happen when I was doing training for others. The best course of action was to quickly do everything over; it's too hard to find the issue sometimes.
@venkat.k43924 жыл бұрын
Thank you for a great share.
@AdamMarczakYT4 жыл бұрын
Thank you! Glad you enjoyed it :)
@seb63024 жыл бұрын
Very cool video!!
@AdamMarczakYT4 жыл бұрын
Thanks Seb! :)
@alfredsfutterkiste75343 жыл бұрын
Very good.
@christinaconstantinou30944 жыл бұрын
Great video. For anyone getting stuck publishing the event trigger, I had to manually register the Event Grid resource provider. Here's the doc from Microsoft: docs.microsoft.com/en-us/azure/azure-resource-manager/templates/error-register-resource-provider
@AdamMarczakYT4 жыл бұрын
Thanks Christina. I think I mentioned this in the video, but considering I've already gotten similar comments/questions a few times, I see it should have been stressed more. Thank you for sharing with the community :)
@johnfromireland75513 жыл бұрын
@@AdamMarczakYT I had exactly the same problem. What is the difference between an "Event Grid System Topic" and an "Event Grid Topic"? It was the system one that had to be created to enable me to publish my ADF trigger.
@ajay123684 жыл бұрын
I wish I could hit 1000 likes... thank you so much!!
@jeffersonbabuyo32703 жыл бұрын
subscribed!
@AdamMarczakYT3 жыл бұрын
Welcome, thanks! :)
@pigrebanto4 жыл бұрын
The most interesting trigger, the tumbling window, you did NOT explain.. why?
@AdamMarczakYT4 жыл бұрын
I explained the tumbling window trigger at 1:35. I did not show a live demo because it is too similar to the timer trigger demo; the only difference is concurrency control. Feel free to check the docs for more examples: docs.microsoft.com/en-us/azure/data-factory/tumbling-window-trigger-dependency
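For anyone curious what it looks like, a tumbling window trigger definition is close to the schedule one but adds fixed-size, stateful windows plus concurrency and retry control; a rough sketch (names and values are placeholders), with the window bounds passed to the pipeline:

{
  "name": "HourlyTumblingTrigger",
  "properties": {
    "type": "TumblingWindowTrigger",
    "typeProperties": {
      "frequency": "Hour",
      "interval": 1,
      "startTime": "2021-01-01T00:00:00Z",
      "delay": "00:05:00",
      "maxConcurrency": 2,
      "retryPolicy": { "count": 2, "intervalInSeconds": 30 }
    },
    "pipeline": {
      "pipelineReference": { "referenceName": "MyPipeline", "type": "PipelineReference" },
      "parameters": {
        "windowStart": "@trigger().outputs.windowStartTime",
        "windowEnd": "@trigger().outputs.windowEndTime"
      }
    }
  }
}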
@pigrebanto4 жыл бұрын
@@AdamMarczakYT OK, but tumbling windows are different by definition: they have state. I was keen to see an explanation of it. I am wondering why you did not explain it, because you normally give clear explanations of topics. Thanks anyway.
@sjitghosh4 жыл бұрын
excellent
@AdamMarczakYT4 жыл бұрын
Thank you! Cheers!
@BijouBakson2 жыл бұрын
Thank you
@snackymcgoo15394 жыл бұрын
MS Azure is perhaps the worst product I have ever seen. Over-engineered, super slow, thousands of things to click, thousands of "X"s to click just to get rid of the stupid popups built into every single action, thousands of re-hovers over something with a popup description where UNDER THE DESCRIPTION POPUP sits the three-dot ellipsis ... that you are trying to click but can't, because when you hovered over it the FUCKING POPUP gets in the way. Just terrible. I mean an UNBELIEVABLY bad product. By the way, the event trigger doesn't work: it throws an error and a window pops up with the error message, but it only stays there a few seconds, and as I try to highlight and select the text so I can copy/paste it to search for a solution, then, AND ONLY THEN, does the popup, which normally stays WAY TOO LONG, decide to disappear before I can copy/paste it into a search. I mean this product is complete and utter fucking shit.