- 1542 Views
- 4 replies
- 1 kudos
I renamed our service principal in Terraform, which forces a replacement: the old service principal is removed and a new principal with the same permissions is recreated. The Terraform apply succeeds, but when I try to run dbt that creates tab...
Latest Reply
This is also true for removing groups before unassigning them (removing and unassigning in Terraform):
Error: cannot update grants: Could not find principal with name <My Group Name>
- 4973 Views
- 8 replies
- 4 kudos
Hi Community, I have successfully run a job through the API but need to be able to pass parameters (configuration) to the DLT workflow via the API. I have tried passing JSON in this format: {
"full_refresh": "true",
"configuration": [
...
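A minimal sketch of what the start-update request body can look like. Assumptions: the Pipelines API 2.0 start-update endpoint (`POST /api/2.0/pipelines/{pipeline_id}/updates`), where `full_refresh` is a boolean rather than the string `"true"`; pipeline-wide `configuration` key/value pairs generally live on the pipeline settings themselves, not on the per-update request.

```python
import json

# Sketch of a start-update body for POST /api/2.0/pipelines/{pipeline_id}/updates.
# Note: `full_refresh` is a JSON boolean, not the string "true".
def build_start_update_payload(full_refresh=False, refresh_selection=None):
    """Build the JSON body for starting a DLT pipeline update."""
    payload = {"full_refresh": full_refresh}
    if refresh_selection:
        # Optionally limit the refresh to specific tables.
        payload["refresh_selection"] = refresh_selection
    return payload

body = json.dumps(build_start_update_payload(full_refresh=True))
```

To change the pipeline's `configuration` map, the usual route is editing the pipeline spec (e.g. `PUT /api/2.0/pipelines/{pipeline_id}`) before starting the update.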
- 1543 Views
- 1 replies
- 0 kudos
I am running this code:
curl -X --request GET -H "Authorization: Bearer <databricks token>" "https://adb-1817728758721967.7.azuredatabricks.net/api/2.0/clusters/list"
And I am getting this error:
2024-01-17T13:21:41.4245092Z </head>
2024-01-17T13:21:41.4...
Latest Reply
Hi, Could you please renew the token and confirm?
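As a side note, an HTML body in the response usually means the request was redirected to a login page, which points at an expired or malformed token. A minimal Python sketch of the same call (the host is the one from the question; the token is a placeholder):

```python
import json
import urllib.request

# Sketch: list clusters via the REST API. Host/token below are placeholders;
# the token must be a valid, unexpired PAT.
HOST = "https://adb-1817728758721967.7.azuredatabricks.net"

def build_request(host, token):
    """Build the authenticated GET request for /api/2.0/clusters/list."""
    return urllib.request.Request(
        f"{host}/api/2.0/clusters/list",
        headers={"Authorization": f"Bearer {token}"},
    )

def list_clusters(host, token):
    with urllib.request.urlopen(build_request(host, token)) as resp:
        return json.loads(resp.read())
```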
by ccs • New Contributor II
- 1754 Views
- 5 replies
- 2 kudos
On the IP access lists feature (IP access lists - Azure Databricks | Microsoft Docs), what we observe is that if your IP is not on the access list, you cannot modify the list via the API, since you are not in a trusted location. What if I specify only 1 IP s...
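A small sketch of the usual safeguard: always include your own egress IP before enabling an ALLOW list, so you don't lock yourself out of the API. The field names follow the IP Access Lists API (`/api/2.0/ip-access-lists`); the label and addresses are hypothetical.

```python
# Sketch: build a create-list body for POST /api/2.0/ip-access-lists,
# making sure the caller's own IP is included to avoid self-lockout.
def build_allow_list(label, ip_addresses, my_ip):
    if my_ip not in ip_addresses:
        ip_addresses = ip_addresses + [my_ip]  # keep your own access intact
    return {
        "label": label,
        "list_type": "ALLOW",
        "ip_addresses": ip_addresses,
    }
```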
Latest Reply
Hey @Chun Sing Chan Hope all is well! Just wanted to check in if you were able to resolve your issue and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Th...
- 2095 Views
- 4 replies
- 0 kudos
I am trying to write a process that will programmatically update the “run_as_user_name” parameter for all jobs in an Azure Databricks workspace, using PowerShell to interact with the Jobs API. I have been trying to do this with a test job without suc...
Latest Reply
The solution you've submitted addresses a different topic (permission to run the job; the job still runs as the user in the run_as_user_name field). Here is an example of changing "run_as_user_name". Docs: https://docs.databricks.com/api/azure/workspace/job...
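A hedged sketch of what the update call can look like. Assumption: in Jobs API 2.1 the writable field is `run_as` inside the job settings (with either `user_name` or `service_principal_name`), while `run_as_user_name` is the read-only echo in responses; the job ID and user below are placeholders.

```python
import json

# Sketch: change a job's run-as identity via POST /api/2.1/jobs/update.
# Exactly one of user_name / service_principal_name should be set.
def build_update_run_as(job_id, user_name=None, service_principal_name=None):
    run_as = (
        {"user_name": user_name}
        if user_name
        else {"service_principal_name": service_principal_name}
    )
    return {"job_id": job_id, "new_settings": {"run_as": run_as}}

body = json.dumps(build_update_run_as(123, user_name="someone@example.com"))
```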
- 4840 Views
- 3 replies
- 0 kudos
Problem: I'm unable to authenticate against the https://accounts.cloud.databricks.com endpoint even though I'm an account admin. I need it to assign account-level groups to workspaces via the workspace assignment API (https://api-docs.databricks.com/re...
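One common pitfall worth noting: the account endpoint takes account-level credentials (account admin login or an account-scoped OAuth token), not a workspace PAT. A sketch of the URL shape for the workspace assignment call, with all IDs as placeholders:

```python
# Sketch: workspace permission assignment URL on the accounts endpoint.
# account_id / workspace_id / principal_id are placeholders; the request
# itself must be authenticated with account-level credentials, not a
# workspace PAT.
BASE = "https://accounts.cloud.databricks.com"

def assignment_url(account_id, workspace_id, principal_id):
    return (
        f"{BASE}/api/2.0/accounts/{account_id}"
        f"/workspaces/{workspace_id}"
        f"/permissionassignments/principals/{principal_id}"
    )
```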
Latest Reply
Hi @lasse l, Thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your ...
- 10317 Views
- 2 replies
- 3 kudos
Access to Databricks APIs requires the user to authenticate. This usually means creating a PAT (Personal Access Token). Conveniently, a token is readily available to you when you are using a Databricks notebook.
databricksURL = dbutils.notebook....
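For reference, a sketch of the commonly used (but unofficial, and therefore subject to change) pattern for reading the workspace URL and an ephemeral token from the notebook context. `dbutils` only exists inside a Databricks notebook, so the lookup is kept behind a function; the header builder is plain Python.

```python
# Sketch: pull API credentials from the notebook context (unofficial internal
# API; works only inside a Databricks notebook, and may change between releases).
def notebook_api_credentials(dbutils):
    ctx = dbutils.notebook.entry_point.getDbutils().notebook().getContext()
    return ctx.apiUrl().get(), ctx.apiToken().get()

def auth_header(token):
    """Build the bearer-token header used by all Databricks REST calls."""
    return {"Authorization": f"Bearer {token}"}
```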
by jch • New Contributor III
- 789 Views
- 2 replies
- 0 kudos
suggestions). This one looks perfect: kaggle kernels output rsrishav/starter-youtube-trending-video-dataset -p /path/to/dest but I'm not using the CLI, I'm using a Databricks notebook. I tried using this code but it doesn't work. data_path = 'rsrishav/youtu...
Latest Reply
Hi @jch, The error message you received, "Permission 'kernels.get' was denied," suggests that you don't have the necessary permissions to access or view the Kaggle kernel. This error typically occurs when you try to access a kernel that you don't ha...
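One thing worth checking alongside permissions: the Kaggle client reads credentials from `~/.kaggle/kaggle.json` or from the `KAGGLE_USERNAME`/`KAGGLE_KEY` environment variables, and a "denied" error can also mean those aren't visible to the notebook process. A sketch (username/key are placeholders; the `kaggle` package must be installed on the cluster):

```python
import os

# Sketch: make Kaggle credentials visible to the notebook process via env vars,
# then mirror the CLI's `kaggle kernels output` with the Python client.
def set_kaggle_credentials(username, key):
    os.environ["KAGGLE_USERNAME"] = username
    os.environ["KAGGLE_KEY"] = key

def download_kernel_output(kernel_ref, dest):
    # Requires `pip install kaggle`; import is deferred so the helper above
    # can be used even where the package is absent.
    from kaggle.api.kaggle_api_extended import KaggleApi
    api = KaggleApi()
    api.authenticate()
    api.kernels_output(kernel_ref, path=dest)
```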
- 1531 Views
- 5 replies
- 2 kudos
After creating the delta pipeline, I would like to get details from the DLT maintenance job automatically created by Databricks, like the scheduled time when the DLT maintenance tasks will be executed. However, it seems the Jobs API 2.1 doesn't cover ...
Latest Reply
Hi @Debayan Mukherjee, Actually the Databricks Jobs API documentation has not been fixed yet. The parameter `job_type` should be included in the list endpoint request documentation. Please do this in order to avoid unnecessary questions here in the ...
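Based on the reply above, a sketch of what such a filtered list call could look like. The `job_type` parameter name comes from this thread (not from the published docs at the time of writing), and the `"PIPELINE"` value is a guess for illustration only.

```python
import urllib.parse

# Sketch: GET /api/2.1/jobs/list with a `job_type` query parameter.
# The parameter name is taken from the thread above; the value is hypothetical.
def jobs_list_url(host, job_type):
    query = urllib.parse.urlencode({"job_type": job_type})
    return f"{host}/api/2.1/jobs/list?{query}"

url = jobs_list_url("https://adb-example.azuredatabricks.net", "PIPELINE")
```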
by PawelK • New Contributor II
- 2205 Views
- 4 replies
- 1 kudos
Hello, I'm looking for a way of defining a notification destination using the API or the Pulumi/Terraform providers. However, I cannot find it anywhere. Could you please help and advise if I'm missing something, or if it's not available at the moment? And if it's no...
Latest Reply
This issue seems to point to the lack of a public API being the culprit behind the lack of a resource for Terraform.
- 20579 Views
- 4 replies
- 5 kudos
Hi, I have to send thousands of API calls from a Databricks notebook to an API to retrieve some data. Right now, I am using a sequential approach with the Python requests package. As the performance is no longer acceptable, I need to send my API c...
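The usual fix for this kind of I/O-bound workload is to fan the calls out over a thread pool instead of a sequential loop. A minimal sketch; `fetch` stands in for a single `requests.get(...)` call, and `max_workers` should be tuned to what the target API's rate limits tolerate:

```python
from concurrent.futures import ThreadPoolExecutor

# Sketch: run many I/O-bound calls concurrently. Results come back in the
# same order as the inputs.
def fetch_all(items, fetch, max_workers=32):
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(fetch, items))

# Example usage against a real API (not executed here):
# results = fetch_all(urls, lambda u: requests.get(u, timeout=30).json())
```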
Latest Reply
Hi @Paul Poco Hope all is well! Just wanted to check in if you were able to resolve your issue and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Thanks!
by Chinu • New Contributor III
- 450 Views
- 1 replies
- 1 kudos
Hi, do you have an API endpoint to call to get the Databricks status for AWS? Thanks,
Latest Reply
@Chinu Lee, there is a webhook/Slack integration that can be used to fetch status: https://docs.databricks.com/resources/status.html#webhook. Are you specifically looking for your account workspace, or the level above it?
by AnuVat • New Contributor III
- 15685 Views
- 7 replies
- 12 kudos
Hi, I am working on an ML project and I need to access the data in tables hosted in my Databricks cluster through a notebook that I am running locally. This has been very easy while running the notebooks in Databricks, but I cannot figure out how to do ...
Latest Reply
We can use APIs and pyodbc to achieve this. Go through the official Databricks documentation; it covers accessing data from outside the Databricks environment.
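A sketch of the pyodbc route, assuming the Simba Spark ODBC driver is installed locally. The host, HTTP path, and token are placeholders you'd take from the cluster's JDBC/ODBC tab; `AuthMech=3` with `UID=token` is the usual PAT-based setup.

```python
# Sketch: build a pyodbc connection string for a Databricks cluster using the
# Simba Spark ODBC driver and a personal access token (all values placeholders).
def odbc_connection_string(host, http_path, token):
    return (
        "Driver=Simba Spark ODBC Driver;"
        f"Host={host};Port=443;HTTPPath={http_path};"
        "SSL=1;ThriftTransport=2;AuthMech=3;"
        f"UID=token;PWD={token}"
    )

# Example usage (not executed here; requires the driver and pyodbc installed):
# import pyodbc
# conn = pyodbc.connect(odbc_connection_string(host, http_path, pat), autocommit=True)
```

An alternative worth considering is the `databricks-sql-connector` package, which avoids the ODBC driver install entirely.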
- 1546 Views
- 1 replies
- 1 kudos
Hello, I have a Databricks account on Azure, and the goal is to compare different image tagging services from Azure, GCP, and AWS via the corresponding API calls, with a Python notebook. I have problems with GCP Vision API calls, specifically with credentials...
Latest Reply
Ok, here is a trick: in my case, the file with GCP credentials is stored in the notebook workspace storage, which is not visible via os.environ(). So the solution is to read the content of this file and save it to the cluster storage attached to the no...
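The trick described above can be sketched as: materialize the credentials JSON to a file the driver process can read, then point `GOOGLE_APPLICATION_CREDENTIALS` (the env var the Google client libraries look for) at it.

```python
import os
import tempfile

# Sketch: write the credentials JSON string to a local file on the cluster and
# expose it via the env var the Google client libraries read.
def install_gcp_credentials(credentials_json: str) -> str:
    fd, path = tempfile.mkstemp(suffix=".json")
    with os.fdopen(fd, "w") as f:
        f.write(credentials_json)
    os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = path
    return path
```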
by GuMart • New Contributor III
- 927 Views
- 2 replies
- 1 kudos
Hi, is it possible to set the RETRY_ON_FAILURE property for DLT pipelines through the API? I'm not finding it in the docs (although it seems to exist in a response payload). https://docs.databricks.com/delta-live-tables/api-guide.html
Latest Reply
Hi @Suteja Kanuri, Thank you so much for the quick and complete answer! Regards,