Hello Soucy, I would like to learn more about your course and would appreciate the opportunity to discuss it with you. Could we schedule a time to talk about it?
@BurhanUYGUN-j1v • 2 months ago
How do you get the Databricks API secret key?
@BurhanUYGUN-j1v • 2 months ago
How do you get the Databricks API secrets location?
@okube-ai • 2 months ago
Good question! Assuming you are talking about the Databricks API secrets required for the second serving endpoint, you need to provide a token associated with a user that has the EXECUTE privilege on the model you plan to use. If it's for your own testing, you can simply create a token for your own user, but if you plan to give access to multiple users, or even external users, it's good practice to create a dedicated identity (most likely a service principal) for that purpose and generate a token for it.

Once you have a token, you can input it as plain text in the serving endpoint configuration, but best practice is to store it as a Databricks secret and reference it using this notation: {{secrets/secret_scope/secret_key}}.

Mind you, you might also have to configure environment variables in your first serving endpoint to ensure the model can communicate with your LLM of choice. If you are using an OpenAI API key, or if you are using langchain.chat_models.ChatDatabricks, you will need to set DATABRICKS_HOST and DATABRICKS_TOKEN as environment variables (unless you hardcode them in your model, which is generally not recommended). I hope that helps!
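[Editor's note] A minimal sketch of what the reply above describes, assuming a recent Databricks CLI and a serving endpoint configured via a plain config dictionary. The scope name ("llm-scope"), secret key ("api-token"), model name, and workspace URL are all placeholders, not values from the thread:

```python
# Storing the token as a Databricks secret first (Databricks CLI, assumed
# recent syntax; run in a terminal, not in this script):
#   databricks secrets create-scope llm-scope
#   databricks secrets put-secret llm-scope api-token

# Hypothetical serving endpoint configuration. The {{secrets/<scope>/<key>}}
# placeholder is resolved by Databricks at runtime, so the raw token never
# appears in the config itself.
endpoint_config = {
    "served_entities": [
        {
            "entity_name": "my_catalog.my_schema.my_model",  # placeholder model
            "entity_version": "1",
            "workload_size": "Small",
            "scale_to_zero_enabled": True,
            "environment_vars": {
                # Needed so ChatDatabricks inside the model can reach the
                # workspace; values here are illustrative placeholders.
                "DATABRICKS_HOST": "https://my-workspace.cloud.databricks.com",
                "DATABRICKS_TOKEN": "{{secrets/llm-scope/api-token}}",
            },
        }
    ]
}
```

The key point is the last line: the endpoint references the secret by scope and key rather than embedding the token as plain text.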
@BurhanUYGUN-j1v • 2 months ago
@@okube-ai Thank you for the swift response! Would you be able to provide more information on how to store the Databricks API token as a Databricks secret? Thanks! I am using a notebook with langchain.chat_models.ChatDatabricks to create the first serving endpoint. Based on my understanding, DATABRICKS_HOST and DATABRICKS_TOKEN are not needed while creating the endpoint through the notebook. Let me know if we have a common understanding here :)