Hi @azurecontentannu6399, I have a random file in txt format, but it is unstructured and has no delimiter. How do I append this file to the existing file in the destination? The wildcard merge option is not available for txt or other formats.
@amlansharma5429 · 3 days ago
Hi Annu, can you please make your next video on the tumbling window trigger and provide a real-time use case? Thanks in advance!
@amlansharma5429 · 3 days ago
For anyone facing an error: first go to the IAM tab of your ADF and add a Data Factory Contributor role assignment to the system-assigned managed identity, then publish the pipelines and run.
@pavankumar-ni3my · 3 days ago
Why are we not using "Filter by Last Modified" in the Copy Data activity alone to do this?
@SandipanSarkar-c8v · 4 days ago
Finished watching
@manipatel3855 · 4 days ago
4
@shibswain6679 · 4 days ago
Hi madam, by using only the Copy Data activity we can also copy our required format by providing .json or .csv in the wildcard file path. Can we do that?
@azurecontentannu6399 · 4 days ago
@@shibswain6679 Yes, that's right.
@azurelearner4055 · 4 days ago
Thanks for starting the SQL series!
@SandipanSarkar-c8v · 5 days ago
Finished watching
@azharaktherk2432 · 5 days ago
Honestly, this seemed tough to me, but it was a great explanation, Annu. Well done!
@prasadmuthyala9625 · 5 days ago
Please make more videos on this, and an Azure interview questions series as well.
@vikashkumargupta78 · 5 days ago
In ADF, how do I pass dynamic variables and insert pipeline details into a pipeline log table in SSMS? Please provide a link.
@@azurecontentannu6399 I have this table:

CREATE TABLE [bolt_stage].[Pipeline_Log_Table] (
    [ID] INT IDENTITY(1,1) NOT NULL PRIMARY KEY,   -- ID as the primary key
    [PipelineName] VARCHAR(50) NULL,
    [ExecutionStartDateTime] DATETIME NULL,
    [ExecutionEndDateTime] DATETIME NULL,
    [Status] VARCHAR(50) NOT NULL,                 -- always provided: Success, InProgress, Failed
    [CreatedDate] DATETIME NULL DEFAULT GETDATE(), -- automatically set the creation date
    [PL_RunId] VARCHAR(100) NULL,
    [PL_TriggerTime] DATETIME NULL,
    [PL_TriggerType] VARCHAR(20) NULL,
    [PL_ExecutionTimeInSec] INT NULL,              -- changed to INT for better performance and accuracy
    [ErrorMessage] NVARCHAR(MAX) NULL
);

Then I write a stored procedure (or a Lookup), but how can I pass the dynamic variables or parameters without any errors and insert into this SQL table? Can you show me how, please?
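A minimal sketch of a logging procedure for the table above — the procedure name and exact parameter list are illustrative, not something shown in the video:

```sql
-- Hypothetical logging procedure; adjust names and types to your table.
CREATE PROCEDURE [bolt_stage].[usp_InsertPipelineLog]
    @PipelineName           VARCHAR(50),
    @ExecutionStartDateTime DATETIME,
    @ExecutionEndDateTime   DATETIME,
    @Status                 VARCHAR(50),
    @PL_RunId               VARCHAR(100),
    @PL_TriggerTime         DATETIME,
    @PL_TriggerType         VARCHAR(20),
    @PL_ExecutionTimeInSec  INT,
    @ErrorMessage           NVARCHAR(MAX) = NULL
AS
BEGIN
    SET NOCOUNT ON;
    INSERT INTO [bolt_stage].[Pipeline_Log_Table]
        (PipelineName, ExecutionStartDateTime, ExecutionEndDateTime, [Status],
         PL_RunId, PL_TriggerTime, PL_TriggerType, PL_ExecutionTimeInSec, ErrorMessage)
    VALUES
        (@PipelineName, @ExecutionStartDateTime, @ExecutionEndDateTime, @Status,
         @PL_RunId, @PL_TriggerTime, @PL_TriggerType, @PL_ExecutionTimeInSec, @ErrorMessage);
END
```

In ADF, a Stored Procedure activity can then supply the values from system variables such as @pipeline().Pipeline, @pipeline().RunId, @pipeline().TriggerType, and @pipeline().TriggerTime.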
@vikashkumargupta78 · 5 days ago
@@azurecontentannu6399 I capture all the column data in dynamic variables in ADF; can you please show me how, as requested?
@pavankumar-ni3my · 5 days ago
This video is helpful. However, I'd like to know if we can do it the other way around (the SFTP client is the Azure storage account and the SFTP server is a local machine) to get files from the cloud to on-prem.
@ravipatisrikanth8331 · 5 days ago
Instead of this, can I use the wildcard file path for this scenario?
@@ravipatisrikanth8331 Right, that's the easiest way.
@ravipatisrikanth8331 · 5 days ago
Thank you for the reply.
@manu77564 · 6 days ago
Thank you... I hope this will continue daily.
@citizenearth3324 · 6 days ago
Subscribed and loved it, thanks!
@iSmartSai3 · 7 days ago
I liked your video even before watching it... that is the trust I have in your videos, Annu. Please keep uploading videos regularly. Thank you.
@kavadianil · 7 days ago
Hi... one doubt. Assume you have more than 3 transactions for each customer ID. In that case, how do you find the latest two transaction dates? Could you tell me the query for that, if you know it?
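One common way to do this, sketched against a hypothetical Transactions(customer_id, transaction_date) table, is ROW_NUMBER():

```sql
-- Latest two transaction dates per customer (table/column names are illustrative).
WITH ranked AS (
    SELECT customer_id,
           transaction_date,
           ROW_NUMBER() OVER (PARTITION BY customer_id
                              ORDER BY transaction_date DESC) AS rn
    FROM Transactions
)
SELECT customer_id, transaction_date
FROM ranked
WHERE rn <= 2;
```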
@Sunil999G · 7 days ago
Nice explanation. Looking forward to more such SQL scenario questions
@mayankpatni5639 · 7 days ago
I have done a BCom and want to enter the IT industry. How do I do an MTech in computer science after a BCom? Should I do an MTech or an MCA to get a job in the IT industry after a BCom?
@kalki1168 · 7 days ago
Hi ma'am, it will count a subfolder as a file, right? How do we go inside each subfolder and fetch the count of files?
@jeevaraj815 · 10 days ago
Hi Annu, you are doing a great job; thanks for your videos. I need one help from you: please create a video on copying delta files from one environment to another.
@matiascesarano5751 · 11 days ago
Hi! I want to add an additional column, but with a parameter, like this: @json(string(pipeline().parameters.additional_fiel)). How should the parameter additional_fiel be defined? I tried this: {"name": "snapshot","value": {"value": "@formatDateTime(utcNow(),'yyyy-MM-dd')","type": "Expression"}} but I get "The value of property 'additionalColumns' is in unexpected type 'IList`1'." Any suggestions?
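That error usually indicates that additionalColumns received a single object where it expects a list of name/value pairs. A hedged sketch of the shape the parameter string could take before it is passed through @json() — the column name and value here are illustrative:

```json
[
  { "name": "snapshot", "value": "2024-01-01" }
]
```

With the parameter defined as a plain string holding that array, @json(pipeline().parameters.additional_fiel) should deserialize to the list type the property expects. The date itself would need to be supplied when the pipeline is invoked (e.g. via @formatDateTime(utcNow(),'yyyy-MM-dd')), since expressions are not evaluated inside a parameter's default value.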
@matiascesarano5751 · 5 days ago
@azurecontentannu6399
@chethan4160 · 12 days ago
How do I copy 3 tables from on-prem to Azure SQL with one Copy activity?
@user-kakatya · 12 days ago
Will you also cover DataFrames in this playlist, and the use cases for where it is best to use RDDs vs DataFrames? I actually thought DataFrames are used more due to their ease of use, depending on the data type of course. Thanks for this series so far! A helpful and easy explanation for the difficult concept of RDDs. Following for more!
@azurecontentannu6399 · 12 days ago
@@user-kakatya Thank you. Yes, DataFrames are used more often.
@MrinaalRaj · 13 days ago
A good playlist to go over... so as to gain some real-life scenarios.
@azurecontentannu6399 · 13 days ago
@@MrinaalRaj Thanks!
@TheBastard1900 · 13 days ago
It would additionally be helpful if you shared the links you mention in your videos in the video descriptions.
@tusharkaushik79 · 14 days ago
The topics are covered nicely.
@azurecontentannu6399 · 14 days ago
@@tusharkaushik79 Thank you!
@tusharkaushik79 · 14 days ago
Please provide the PPTs as well, if possible.
@zahidalam7831 · 15 days ago
I have one doubt... we can use either pipeline parameterization or dataset parameterization, but I can see you showed both ways in this video?
@azurecontentannu6399 · 15 days ago
@@zahidalam7831 Yes, we can use both.
@zahidalam7831 · 15 days ago
@azurecontentannu6399 Thanks for your quick response. Your way of teaching is really different. Keep going!
@azurecontentannu6399 · 15 days ago
@@zahidalam7831 Thank you!
@zahidalam7831 · 15 days ago
@@azurecontentannu6399 Is there also another way, like using a variable?
@azurelearner4055 · 16 days ago
Thanks for sharing the video; please share more videos on expressions!
@mayankgupta4873 · 17 days ago
Will it also send a notification if a pipeline run is cancelled?
@azharaktherk2432 · 18 days ago
Excellent explanation, Annu. This seems tricky, but your explanation made it simple.
@mohammadabdelrahman4153 · 19 days ago
You're creating way too many datasets???
@azurecontentannu6399 · 19 days ago
@@mohammadabdelrahman4153 No: one points to the container level, another to the folder level, and the other to the file level.
@tkmaliren · 19 days ago
The video is very nice and illustrative. I want to know how you can customize the email, e.g. include the pipeline name, timestamp, status, etc. as part of the email body. I tried binding query columns such as {pipeline_name} and {status} in the details, but still no luck. Can you help?
@azurecontentannu6399 · 19 days ago
@@tkmaliren Use the Logic App approach.
@listen_learn_earn · 19 days ago
Backslash: \
@listen_learn_earn · 20 days ago
Default: ,
@AkashBisht-of7tw · 20 days ago
Hi Annu, just wanted to ask: for this use case, why didn't you use preserve hierarchy instead of Get Metadata? Retrieving the filename wasn't required, in my opinion. Could you clarify?
@azurecontentannu6399 · 19 days ago
@@AkashBisht-of7tw Yes, you can directly use a single Copy activity with the wildcard file path option.
@azurecontentannu6399 · 19 days ago
Here is the detailed video about wildcards: kzbin.info/www/bejne/sKjIqmWQi8qFhcksi=IX9s5NI29x-ui3cN
@AkashBisht-of7tw · 19 days ago
@@azurecontentannu6399 Thank you
@kiransonkamble8012 · 24 days ago
Thanks for explaining that in such simple terms. I finally understand it now!
@sanjeevreddy3691 · 24 days ago
How do you optimize a pipeline when copying large files in ADF? Could you refer me to any video related to this?
@anjireddyrachamallu7175 · 25 days ago
Hi Annu, thanks for the video. Is it possible to automate this process?
@azurecontentannu6399 · 24 days ago
@@anjireddyrachamallu7175 Use ADF instead.
@AnjaliMeena-j3p · 27 days ago
Great content!
@azurecontentannu6399 · 27 days ago
@@AnjaliMeena-j3p Thank you!
@ManjuBasavantapura · 28 days ago
What if there are two folders, FolderA and FolderB, instead of another folder?
@kovaisampath63 · 29 days ago
Well-organized content and easy to follow, thank you.
@sanjayr3597 · a month ago
Good video. Is there any way we can log the error data in a table and fail the pipeline at the end of the run?
@Sagar_सागर · a month ago
How do I copy all files from one blob storage folder to another without copying the subfolders and their contents?
@azurecontentannu6399 · a month ago
@@Sagar_सागर Set recursive = false in the Copy activity's source settings.
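As a sketch, the copy source's store settings with recursion disabled might look like the following (property names follow the Blob Storage read settings; treat the exact shape as illustrative):

```json
"storeSettings": {
  "type": "AzureBlobStorageReadSettings",
  "recursive": false,
  "wildcardFileName": "*"
}
```

With recursive set to false, only the files directly inside the source folder are copied.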
@azharaktherk2432 · a month ago
Please go slower; it would be much more helpful for learners.
@sanjeevreddy3691 · a month ago
We can set constraints on table columns, which helps in identifying bad records. How do we use fault tolerance in the case of a file system, since we don't have any constraints or rules?
@sivaram9654 · a month ago
I've gone through every video in the playlist and practiced them. They were extraordinarily helpful as well as insightful. I'm looking forward to more videos on ADF. Thank you so much!
@azurecontentannu6399 · a month ago
@@sivaram9654 Thank you so much!
@arunsai7257 · a month ago
I have a doubt: if we have 5 columns and 2 new columns are added, will the pipeline fail or not? If not, how?