While creating an Iceberg table in Snowflake, it needs METADATA_FILE_PATH, right? I tried the same command and got an error: "Iceberg table DEPT_ICEBERG creation lacks the required METADATA_FILE_PATH table option for the external catalog integration ICEBERGCATALOGINT". Can you explain how you created unity_catalog_dev? I'd appreciate more info on this. Thank you.
@wallco26 (1 day ago)
Do you need a profiles.yml file saved somewhere in Databricks when using Workflows, or does defining the cluster within the UI take the place of the .yml file?
@laobaba (2 days ago)
In the very last step of the last notebook, when I go to run the demo, I get the following error: "py4j.security.Py4JSecurityException: Method public java.lang.String com.databricks.backend.common.rpc.CommandContext.toJson() is not whitelisted on class class com.databricks.backend.common.rpc.CommandContext". How do you fix this?
@dioppapis (2 days ago)
Thank you
@hheungsu (2 days ago)
Awesome!!
@laobaba (3 days ago)
Please show a complete beginning-to-end demo of this, and quit skipping parts! I don't have a document already chopped into chunks; show me how to go from a folder of PDFs to talking to an LLM with RAG, without skipping steps that force me to search around for how to do part of what you're demoing!
@laobaba (3 days ago)
I can't tell you how many of these presentations I've watched where I'm 90% of the way through and get angry because I just want them to show me the steps in the Databricks UI. I'm a noob here, begging you to walk me through the process like I'm five and show me how to do RAG against my PDFs. Instead, I've now wasted hours trying to find a video that shows me how to do it rather than talking about the pros, cons, etc. This video even says "And how", but it doesn't show you how!
@LorenzSinger (3 days ago)
That's really cool. Could you add a link to an example notebook? Thanks!
@niteshgupta9697 (4 days ago)
Such engaging delivery!
@Shubham-y3k (5 days ago)
Can I get the PPT?
@ajithvenugopal007 (5 days ago)
Interesting feature on the BYOL side. Where can I read more about it, and how do I enable the feature for my Databricks workspace?
@jasonkhaihoang781 (5 days ago)
Can we test individual DLT models in notebooks now, or do we need to run the full pipeline to see and validate the results? Thanks.
@marcoferraro17 (4 days ago)
Pretty useful!
@ernestoflores3873 (7 days ago)
Hi, nice video! Is the PowerPoint available somewhere?
@thusharmohan96 (8 days ago)
I'm being told that the freq argument needs to be passed because the current index has none, even though I did pass the frequency argument. How do I solve this?
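This reads like the usual pandas/statsmodels time-series complaint: the DatetimeIndex exists, but its freq attribute is None, so the model cannot infer the step size even if a freq argument was passed elsewhere. A minimal sketch of the typical fix (the dates and values here are invented for illustration):

```python
import pandas as pd

# A daily series whose DatetimeIndex has no frequency attached
idx = pd.to_datetime(["2024-01-01", "2024-01-02", "2024-01-03", "2024-01-04"])
s = pd.Series([10.0, 11.0, 12.0, 13.0], index=idx)
print(s.index.freq)  # None -> this is what triggers the error

# Fix 1: infer the frequency from the index and set it explicitly
s.index.freq = pd.infer_freq(s.index)  # "D" for daily data

# Fix 2: conform the series to a fixed frequency (gaps become NaN)
s = s.asfreq("D")
```

The key point is to set the frequency on the index itself (or call `asfreq`) before fitting; passing `freq=` only to the model call is often not enough.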
@Inceptionxg (10 days ago)
After Jensen, I thought I'd see what 2.0 actually means. Muaadh Rilwan
@andydataguy (10 days ago)
I almost didn't click on this video. If you had put the speaker's name in the title, I would have clicked instantly and watched it months ago.
@sharathreddygogula9926 (10 days ago)
Great! That is just what I was looking for. Thanks :)
@narasimhanmb4703 (11 days ago)
Hey Databricks, your videos are very informative. But at your company's scale, do you really need the revenue from YouTube ads? You're raking in money with tech, vision, talent, and customers. You could help those interested in learning your platform by turning off YouTube ads on your channel.
@unstoppabforce (11 days ago)
woow
@sammail96 (11 days ago)
Another very easy option is to use a Snowflake JDBC connection.
@pavankumar-ni3my (13 days ago)
Not useful at all
@ignaciomorenobasanez3821 (13 days ago)
Dear Moez Ali, you blew my mind with this demo. Absolutely incredible. Thanks for sharing such great information with the Supply Chain Data Science community!
@nicky_rads (14 days ago)
Interesting. Is it common for companies to have both Snowflake and Databricks?
@mikevladi (14 days ago)
SHA-1 creates a hash ("secure" in its name is a misnomer; it is just longer than an MD5 hash), which should not be considered a replacement for encryption. Python supports the HMAC algorithm through its hmac module, which lets you mix in a secret key for authentication-grade security. Otherwise the presentation is thorough. Thank you!
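To illustrate the point about hmac, a small stdlib-only sketch (the key and message are placeholders): a plain SHA-1 digest can be recomputed by anyone who sees the message, while an HMAC tag requires the shared secret, giving message authentication rather than encryption.

```python
import hashlib
import hmac

secret = b"my-secret-key"            # placeholder; keep real keys out of source
message = b"payload to authenticate"

# Plain SHA-1: anyone can recompute this, so it proves nothing about the sender
digest = hashlib.sha1(message).hexdigest()

# HMAC-SHA256: requires the secret key, so it authenticates the message
tag = hmac.new(secret, message, hashlib.sha256).hexdigest()

# Always verify with compare_digest to avoid timing attacks
expected = hmac.new(secret, message, hashlib.sha256).hexdigest()
assert hmac.compare_digest(tag, expected)
```

Note that HMAC authenticates but does not hide the message; for confidentiality you still need actual encryption (e.g. an AEAD cipher).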
@carrieliu6969 (14 days ago)
This is very helpful, thank you!
@nagusameta366 (14 days ago)
How do I calculate the optimal numPartitions for repartition or coalesce on a DataFrame?
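There is no single formula, but a widely used heuristic is to target roughly 128 MB of data per partition while keeping at least 2x the total executor cores so the cluster stays busy. A plain-Python sketch of that heuristic (the 128 MB target and the core count are assumptions to tune, not Spark-mandated values):

```python
import math

def suggest_num_partitions(total_bytes: int,
                           target_partition_bytes: int = 128 * 1024**2,
                           total_cores: int = 8) -> int:
    """Rule of thumb: ~128 MB per partition, and at least 2x the core
    count so every core has work; take whichever bound is larger."""
    by_size = math.ceil(total_bytes / target_partition_bytes)
    by_cores = 2 * total_cores
    return max(by_size, by_cores)

# A 10 GB DataFrame on an 8-core cluster: size-driven, 80 partitions
print(suggest_num_partitions(10 * 1024**3))  # -> 80
```

As for which method to use: `df.repartition(n)` does a full shuffle and can increase or decrease the partition count, while `df.coalesce(n)` only merges existing partitions without a shuffle, making it the cheaper choice when reducing.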
@sujaa1000 (14 days ago)
Thank you, it was great!
@vam8775 (14 days ago)
Now Amazon is moving to Ray... just when I got started with Spark 😢
@jasonkhaihoang781 (15 days ago)
Thanks for the useful demo. One question, please: I assume the DAB databricks.yml files are created in the same project as the notebooks, dbt models, and AI/ML models, correct? But the DAB code can be maintained separately from the data application code. Also, shouldn't the deployment (CD) happen after we hit the "Approve Pull Request" button? I see that the deployment happens right after the unit tests/validation pass.
@jasonkhaihoang781 (15 days ago)
So my understanding is that we put the databricks.yml file and the resources folder for the DAB inside the same code repo where we develop the dbt models, notebooks, and AI/ML models. Is that correct? We would not put the DAB project in a separate repo? I ask because I see that notebooks need to be referenced with a relative path ("../"). Thanks :)
@GenBollywood-i9y (15 days ago)
CTOs have doubts and find it difficult; freshers are confident and find it easy. :-)
@amit1agrawal (15 days ago)
Very nice explanation. My hunt for the internals of Delta Lake ends here.
@jasonkhaihoang781 (16 days ago)
Can we use CDF (Change Data Feed) as the source for APPLY CHANGES INTO with streaming tables? :)
(16 days ago)
Anyone else getting the error raise ValueError("Must specify a chain Type in config")? It looks like the LangChain version I'm working with doesn't like "stuff".
@HaoyuWang-t1i (13 days ago)
Got the same error when serving the model. It worked fine a month ago, but I'm not sure how to fix it now.
(13 days ago)
@HaoyuWang-t1i I fixed it by installing the exact same LangChain Python package versions he used.
@ser1ification (16 days ago)
I'm getting permission issues (access denied) all the time.
@malaka123456 (17 days ago)
Great demo!
@magicgoku (18 days ago)
Thanks! Looks awesome
@dahof2789 (18 days ago)
Also seeing more migrations from Unity to Infa CGDC. Phase 1 is a catalog of catalogs; Phase 2 is the EOL of Unity.
@dahof2789 (18 days ago)
I see a lot of Databricks in use. As of Nov '24 it's pretty much either Databricks or Snowflake, but I see more and more Databricks.
@Ahmdrahbi (18 days ago)
Love it!
@raghav6516 (19 days ago)
How does this model work? How is the data stored, and is there any retention period for this Genie?
@golddata5151 (19 days ago)
When will I ever see a real demo on Databricks?? This is just a bunch of slides!!
@joanpareja3313 (19 days ago)
This was brilliant! Great presentation guys!!
@dpx89 (19 days ago)
"Delta is open"? Hahaha. Do people really believe that? It's an OSS project that Databricks maintains and controls...
@dragondove6197 (19 days ago)
For the Java comparison, I don't think it's fair, because Java's standard library doesn't provide a pre-configured HTTP client. It should be compared with a library like `hutool-http`, which provides something like `String resText = HttpUtil.get(url)`. Still, Scala has the `.sc` scripting format and great tools like `scala-cli`, which make scripting much easier. (And I like the Scala worksheet; it's like a simpler version of Jupyter.) For the Python comparison, your libraries are more comfortable than their counterparts in Python's ecosystem, and Python's lambda is awful. (I also hate Python's variable scoping and global variables, which make things complex.)