awesome vid, man! Thanks for the detailed information
@Tales-from-the-Field 2 months ago
Thank you for watching @phillipdataengineer!
@bidata985 8 months ago
Thanks for the detailed explanation. I have a question: how can I capture the number of records copied during a Copy activity in a Data Pipeline in MS Fabric?
@shanthanpaladi231 2 months ago
How can I parameterize the database name in a notebook from another notebook that uses Spark SQL?
@bumdinh9911 3 months ago
Sir, can we pass notebook parameters to a Lookup activity in a Data Pipeline?
@Tales-from-the-Field 3 months ago
Hi @bumdinh9911, per Bradley: "Hello! That is a great question; sadly, I could not find a way to do it at this time. The closest way I found was to trigger the Job Scheduler API for Microsoft Fabric, so you could run a job from a pipeline, but there are limitations, and one listed was passing a parameter via the API. Now, just to play the other side: while it is not possible directly, you could write the notebook information to a DW or to an Azure SQL Database table, and then do a Lookup operation to get the data. So we could accomplish the task of getting data from a notebook back into a pipeline, but it is not as straightforward as I was hoping. If that changes, or I should say 'when that changes', I promise you I will make a video on that!"
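To make that workaround a little more concrete, here is a minimal sketch of the notebook side, assuming a Fabric notebook with a default Lakehouse attached; the table and column names (notebook_run_output, rows_processed, etc.) are hypothetical placeholders. The pipeline's Lookup activity would then query this table to bring the values back into the pipeline.

```python
# Minimal sketch of the workaround: persist notebook results to a table
# so a pipeline Lookup activity can read them afterwards.
# Assumes a Fabric notebook with a default Lakehouse attached; table and
# column names below are hypothetical examples.
from pyspark.sql import Row, SparkSession

spark = SparkSession.builder.getOrCreate()  # already provided in a Fabric notebook

result = Row(run_id="example-run-001", rows_processed=1234, status="Succeeded")
spark.createDataFrame([result]) \
    .write.mode("overwrite") \
    .saveAsTable("notebook_run_output")

# In the pipeline, a Lookup activity pointed at this table (for example,
# SELECT run_id, rows_processed, status FROM notebook_run_output)
# makes these values available to later activities.
```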
@abeerahmed5634 5 months ago
I want to use the output of Notebook 1 in another notebook (it uses SMTP to send mail, and the mail should include that output). How do I do it?
@Tales-from-the-Field 5 months ago
Hi @abeerahmed5634, could we get a little more information? Are you building text from Lakehouse fields, or are you plugging in numbers? You may not be able to get too specific, but we're trying to understand what parameters we need to define to send from Notebook1 into the child notebook.
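In the meantime, here is a minimal sketch of one common pattern for this, assuming Fabric notebooks where the built-in mssparkutils utilities are available; the notebook name "Notebook1", the parameters, and the mail details are hypothetical placeholders. Notebook1 hands its result back with an exit value, and the mail-sending notebook runs it, captures that value, and puts it in the message body.

```python
# Minimal sketch, assuming Fabric notebooks with the built-in notebook utilities.
# Notebook name, parameters, addresses, and SMTP settings are hypothetical placeholders.
from notebookutils import mssparkutils  # pre-loaded in Fabric notebooks; import shown for clarity
import smtplib
from email.message import EmailMessage

# In Notebook1, the last cell would hand its result back to the caller:
#   mssparkutils.notebook.exit("Processed 1,234 rows")

# In the mail-sending notebook, run Notebook1 and capture that exit value.
output = mssparkutils.notebook.run("Notebook1", 600, {"run_date": "2024-06-01"})

msg = EmailMessage()
msg["Subject"] = "Notebook1 result"
msg["From"] = "sender@example.com"       # placeholder address
msg["To"] = "recipient@example.com"      # placeholder address
msg.set_content(f"Notebook1 finished with result: {output}")

with smtplib.SMTP("smtp.example.com", 587) as server:  # placeholder SMTP server
    server.starttls()
    server.login("sender@example.com", "app-password")  # placeholder credentials
    server.send_message(msg)
```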