- 13270 Views
- 10 replies
- 4 kudos
Hi Community, I have successfully run a job through the API, but I need to be able to pass parameters (configuration) to the DLT workflow via the API. I have tried passing JSON in this format: {
"full_refresh": "true",
"configuration": [
...
Latest Reply
You cannot pass parameters from a Databricks job to a DLT pipeline. At least not yet. You can see from the DLT REST API that there is no option for it to accept any parameters. But there is a workaround. With the assumption tha...
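The workaround being described is typically done by pushing key/value pairs into the pipeline's `configuration` through the Pipelines API before triggering an update, then reading them inside the pipeline code with spark.conf.get(). A minimal sketch, where the host, token, pipeline ID, and parameter name/value are all placeholders:

```python
# Sketch: inject parameters into a DLT pipeline via its `configuration`
# before triggering a run. Host, token, pipeline_id, and the parameter
# name/value are placeholders.
import requests

host = "https://<workspace-url>"
headers = {"Authorization": "Bearer <token>"}
pipeline_id = "<pipeline-id>"

# Fetch the current spec so the edit preserves all existing settings.
spec = requests.get(f"{host}/api/2.0/pipelines/{pipeline_id}",
                    headers=headers).json()["spec"]

# Merge in the parameter; DLT code reads it with spark.conf.get("my_param").
spec["configuration"] = {**spec.get("configuration", {}), "my_param": "my_value"}
requests.put(f"{host}/api/2.0/pipelines/{pipeline_id}",
             headers=headers, json=spec)

# Start an update (set full_refresh=True for a full refresh).
requests.post(f"{host}/api/2.0/pipelines/{pipeline_id}/updates",
              headers=headers, json={"full_refresh": False})
```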
9 More Replies
- 20574 Views
- 3 replies
- 3 kudos
Access to Databricks APIs requires the user to authenticate. This usually means creating a personal access token (PAT). Conveniently, a token is readily available to you when you are using a Databricks notebook. databricksURL = dbutils.notebook....
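For reference, the pattern the post is describing looks like this inside a notebook (the usual context getters exposed through dbutils):

```python
# Works only inside a Databricks notebook: pull the workspace URL and an
# ephemeral API token from the notebook context instead of minting a PAT.
ctx = dbutils.notebook.entry_point.getDbutils().notebook().getContext()
databricksURL = ctx.apiUrl().getOrElse(None)
myToken = ctx.apiToken().getOrElse(None)
```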
Latest Reply
@User16752245312 You can use a Databricks secret scope to manage sensitive data such as personal access tokens (PATs) securely. Storing your token in a secret scope ensures you don't hard-code credentials in your notebook, making it more secure. For mo...
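A one-line sketch of that suggestion, with hypothetical scope and key names:

```python
# Read a PAT that was stored once via the Secrets API or CLI; nothing
# sensitive is hard-coded in the notebook. Scope/key names are placeholders.
token = dbutils.secrets.get(scope="my-scope", key="databricks-pat")
```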
2 More Replies
by
ccs
• New Contributor II
- 4319 Views
- 6 replies
- 2 kudos
On this feature, IP access lists (IP access lists - Azure Databricks | Microsoft Docs), what we observe is that if your IP is not on the access list, you cannot modify the list via the API, since you are not in a trusted location. What if I specify only 1 IP s...
Latest Reply
Curious to learn if somebody has also figured out a way to solve the above, as we've encountered this situation and are now locked out...
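One defensive pattern, sketched below under the assumption that you still have API access from an allowed IP: always include the address you are calling from before replacing an allow list, since the update itself can otherwise lock you out. The host, token, list ID, and the IP-echo service are placeholders.

```python
# Sketch: update an IP access list without locking yourself out by always
# keeping your current egress IP in the allow list.
import requests

host = "https://<workspace-url>"
headers = {"Authorization": "Bearer <token>"}

# Any IP-echo service works here; this one is just an example.
my_ip = requests.get("https://ifconfig.me/ip", timeout=10).text.strip()

payload = {
    "label": "office",
    "list_type": "ALLOW",
    "ip_addresses": ["203.0.113.0/24", my_ip],  # keep your own IP included
    "enabled": True,
}
requests.put(f"{host}/api/2.0/ip-access-lists/<list-id>",
             headers=headers, json=payload)
```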
5 More Replies
- 73041 Views
- 5 replies
- 5 kudos
Hi, I have to send thousands of API calls from a Databricks notebook to an API to retrieve some data. Right now, I am using a sequential approach with the Python requests package. As the performance is not acceptable anymore, I need to send my API c...
Latest Reply
Hey @Paul_Poco, what about using ProcessPoolExecutor or ThreadPoolExecutor from the concurrent.futures module? Have you tried them?
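A minimal sketch of that suggestion: since the calls are I/O-bound, ThreadPoolExecutor is usually the better fit than a process pool. The URLs and worker count are illustrative.

```python
# Fan thousands of HTTP calls out over a thread pool instead of a loop.
from concurrent.futures import ThreadPoolExecutor, as_completed

import requests

urls = [f"https://api.example.com/items/{i}" for i in range(1000)]  # placeholder

def fetch(url):
    resp = requests.get(url, timeout=30)
    resp.raise_for_status()
    return resp.json()

results = []
with ThreadPoolExecutor(max_workers=32) as pool:
    futures = {pool.submit(fetch, u): u for u in urls}
    for fut in as_completed(futures):
        results.append(fut.result())
```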
4 More Replies
by
hanish
• New Contributor II
- 3627 Views
- 5 replies
- 2 kudos
We are using the jobs/runs/submit API of Databricks to create and trigger a one-time run with new_cluster and existing_cluster configurations. We would like to check if there is a provision to pass "job_clusters" in this API to reuse the same cluster across...
Latest Reply
Hi, any update on the above-mentioned issue? We are unable to submit a one-time job run (api/2.0 or 2.1 jobs/runs/submit) with a shared job cluster, or with one new cluster used for all tasks in the job.
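If runs/submit will not take job_clusters in your workspace, one workaround is to create a regular job whose tasks share a job cluster and trigger it with run-now; a sketch with placeholder paths and node types:

```python
# Sketch: emulate a one-time run with a shared job cluster via
# jobs/create + jobs/run-now (Jobs API 2.1).
import requests

host = "https://<workspace-url>"
headers = {"Authorization": "Bearer <token>"}

job = {
    "name": "one-time-shared-cluster-run",
    "job_clusters": [{
        "job_cluster_key": "shared",
        "new_cluster": {
            "spark_version": "13.3.x-scala2.12",  # placeholder
            "node_type_id": "<node-type>",
            "num_workers": 2,
        },
    }],
    "tasks": [
        {"task_key": "t1", "job_cluster_key": "shared",
         "notebook_task": {"notebook_path": "/path/one"}},
        {"task_key": "t2", "job_cluster_key": "shared",
         "depends_on": [{"task_key": "t1"}],
         "notebook_task": {"notebook_path": "/path/two"}},
    ],
}

job_id = requests.post(f"{host}/api/2.1/jobs/create",
                       headers=headers, json=job).json()["job_id"]
requests.post(f"{host}/api/2.1/jobs/run-now",
              headers=headers, json={"job_id": job_id})
```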
4 More Replies
- 10502 Views
- 4 replies
- 0 kudos
Problem: I'm unable to authenticate against the https://accounts.cloud.databricks.com endpoint even though I'm an account admin. I need it to assign account-level groups to workspaces via the workspace assignment API (https://api-docs.databricks.com/re...
Latest Reply
From this doc: To automate Databricks account-level functionality, you cannot use Databricks personal access tokens. Instead, you must use either OAuth tokens for Databricks account admin users or service principals. For more information, see: Use a s...
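A sketch of that guidance for a service principal (OAuth M2M): exchange the client ID/secret for a token at the account's OIDC endpoint, then call account-level APIs with it. The account ID and credentials are placeholders.

```python
# Mint an account-level OAuth token for a service principal, then use it.
import requests

account_id = "<account-id>"
token_url = (f"https://accounts.cloud.databricks.com"
             f"/oidc/accounts/{account_id}/v1/token")

resp = requests.post(
    token_url,
    auth=("<client-id>", "<client-secret>"),
    data={"grant_type": "client_credentials", "scope": "all-apis"},
)
access_token = resp.json()["access_token"]

# Example account-level call: list the account's workspaces.
workspaces = requests.get(
    f"https://accounts.cloud.databricks.com"
    f"/api/2.0/accounts/{account_id}/workspaces",
    headers={"Authorization": f"Bearer {access_token}"},
).json()
```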
3 More Replies
- 8547 Views
- 4 replies
- 1 kudos
I renamed our service principal in Terraform, which forces a replacement where the old service principal is removed and a new principal with the same permissions is recreated. The Terraform apply succeeds, but when I try to run dbt that creates tab...
Latest Reply
This is also true for removing groups before unassigning them (removing and unassigning in Terraform):
│ Error: cannot update grants: Could not find principal with name <My Group Name>
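Since the grants still reference the old principal, one recovery path is re-applying them to the recreated principal through the Unity Catalog permissions API; a hedged sketch with a placeholder securable and privilege:

```python
# Sketch: re-grant a privilege to the recreated principal on one table.
import requests

host = "https://<workspace-url>"
headers = {"Authorization": "Bearer <token>"}

requests.patch(
    f"{host}/api/2.1/unity-catalog/permissions/table/main.default.my_table",
    headers=headers,
    json={"changes": [{"principal": "<My Group Name>", "add": ["SELECT"]}]},
)
```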
3 More Replies
- 2641 Views
- 1 reply
- 0 kudos
I am running this code: curl -X --request GET -H "Authorization: Bearer <databricks token>" "https://adb-1817728758721967.7.azuredatabricks.net/api/2.0/clusters/list" And I am getting this error: 2024-01-17T13:21:41.4245092Z </head> 2024-01-17T13:21:41.4...
Latest Reply
Hi, Could you please renew the token and confirm?
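Two things worth noting: in the posted command, `-X` is shorthand for `--request`, so `-X --request GET` duplicates the flag and passes the wrong method; and an HTML response body from this endpoint usually means the token was rejected and the request landed on a login page. The same call in Python, with the token as a placeholder:

```python
# List clusters with a clean request; raise if the token is rejected.
import requests

resp = requests.get(
    "https://adb-1817728758721967.7.azuredatabricks.net/api/2.0/clusters/list",
    headers={"Authorization": "Bearer <databricks-token>"},
)
resp.raise_for_status()
for cluster in resp.json().get("clusters", []):
    print(cluster["cluster_id"], cluster["cluster_name"])
```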
- 5330 Views
- 4 replies
- 0 kudos
I am trying to write a process that will programmatically update the “run_as_user_name” parameter for all jobs in an Azure Databricks workspace, using PowerShell to interact with the Jobs API. I have been trying to do this with a test job without suc...
Latest Reply
The solution you've submitted is a solution for a different topic (permission to run the job; the job still runs as the user in the run_as_user_name field). Here is an example of changing "run_as_user_name". Docs: https://docs.databricks.com/api/azure/workspace/job...
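A sketch of such an update, assuming your workspace's Jobs API 2.1 exposes the run_as field; the job ID and user are placeholders:

```python
# Partially update a job so it runs as a different user (Jobs API 2.1).
import requests

host = "https://<workspace-url>"
headers = {"Authorization": "Bearer <token>"}

requests.post(
    f"{host}/api/2.1/jobs/update",
    headers=headers,
    json={"job_id": 123,  # placeholder
          "new_settings": {"run_as": {"user_name": "someone@example.com"}}},
)
```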
3 More Replies
by
jch
• New Contributor III
- 1959 Views
- 1 reply
- 0 kudos
suggestions). This one looks perfect: kaggle kernels output rsrishav/starter-youtube-trending-video-dataset -p /path/to/dest, but I'm not using the CLI, I'm using a Databricks notebook. I tried using this code but it doesn't work: data_path = 'rsrishav/youtu...
Latest Reply
Hi @jch, great to meet you, and thanks for your question! Let's see if your peers in the community have an answer. Thanks.
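For what it's worth, the kaggle Python package offers a notebook-friendly equivalent of the CLI; a sketch, assuming the Kaggle credentials live in a secret scope (the scope/key names and destination path are placeholders):

```python
# Download a Kaggle dataset from a notebook without the CLI.
import os

# The kaggle package reads these on import/authenticate, so set them first.
os.environ["KAGGLE_USERNAME"] = dbutils.secrets.get("my-scope", "kaggle-user")
os.environ["KAGGLE_KEY"] = dbutils.secrets.get("my-scope", "kaggle-key")

from kaggle.api.kaggle_api_extended import KaggleApi

api = KaggleApi()
api.authenticate()
api.dataset_download_files("rsrishav/youtube-trending-video-dataset",
                           path="/tmp/kaggle", unzip=True)
```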
- 3853 Views
- 5 replies
- 2 kudos
After creating the Delta pipeline, I would like to get details of the DLT maintenance job automatically created by Databricks, like the scheduled time when the DLT maintenance tasks will be executed. However, it seems the Jobs API 2.1 doesn't cover ...
Latest Reply
Hi @Debayan Mukherjee, actually the Databricks Jobs API documentation has not been fixed yet. The parameter `job_type` should be included in the list endpoint request documentation. Please do this in order to avoid unnecessary questions here in the ...
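Going by the reply, the filter would be used like the sketch below; note that `job_type` and its accepted values are undocumented, so both the parameter's behavior and the placeholder value are unverified assumptions.

```python
# Unverified sketch: list jobs filtered by the undocumented job_type
# parameter the reply mentions; the value to pass is a placeholder.
import requests

host = "https://<workspace-url>"
headers = {"Authorization": "Bearer <token>"}

resp = requests.get(f"{host}/api/2.1/jobs/list", headers=headers,
                    params={"job_type": "<job-type-value>"})
for job in resp.json().get("jobs", []):
    print(job["job_id"], job["settings"].get("schedule"))
```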
4 More Replies
by
PawelK
• New Contributor II
- 4733 Views
- 4 replies
- 1 kudos
Hello, I'm looking for a way of defining a notification destination using the API or the Pulumi/Terraform providers. However, I cannot find it anywhere. Could you please help and advise if I'm missing something or if it's not available at the moment? And if it's no...
Latest Reply
This issue seems to point to the lack of a public API being the culprit behind the lack of a resource for Terraform.
3 More Replies
by
Chinu
• New Contributor III
- 1075 Views
- 1 reply
- 1 kudos
Hi, do you have an API endpoint to call to get the Databricks status for AWS? Thanks,
Latest Reply
@Chinu Lee, there is a webhook/Slack integration that can be used to fetch status: https://docs.databricks.com/resources/status.html#webhook. Are you specifically looking for your account workspace, or the one above?
by
AnuVat
• New Contributor III
- 42189 Views
- 7 replies
- 13 kudos
Hi, I am working on an ML project and I need to access the data in tables hosted in my Databricks cluster through a notebook that I am running locally. This has been very easy while running the notebooks in Databricks, but I cannot figure out how to do ...
Latest Reply
We can use APIs and pyodbc to achieve this. Go through the official Databricks documentation; it covers accessing data from outside the Databricks environment.
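Besides pyodbc, the databricks-sql-connector package (pip install databricks-sql-connector) is a common way to do this from a local environment; a sketch with placeholder connection details:

```python
# Query workspace tables from a local environment over the SQL connector.
from databricks import sql

with sql.connect(
    server_hostname="<workspace-host>",
    http_path="<cluster-or-warehouse-http-path>",
    access_token="<pat>",
) as conn:
    with conn.cursor() as cursor:
        cursor.execute("SELECT * FROM main.default.my_table LIMIT 10")
        for row in cursor.fetchall():
            print(row)
```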
6 More Replies
- 3259 Views
- 1 reply
- 1 kudos
Hello, I have a Databricks account on Azure, and the goal is to compare different image tagging services from Azure, GCP, and AWS via corresponding API calls, with a Python notebook. I have problems with GCP Vision API calls, specifically with credentials...
Latest Reply
OK, here is a trick: in my case, the file with the GCP credentials is stored in the notebook workspace storage, which is not visible to the os.environ() command. So the solution is to read the content of this file and save it to the cluster storage attached to the no...
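A sketch of that trick, assuming workspace files are visible under /Workspace on your runtime; the paths are placeholders:

```python
# Copy the GCP credentials file to cluster-local storage and point the
# standard env var at it so GCP client libraries can find it.
import os
import shutil

src = "/Workspace/Users/me@example.com/gcp-credentials.json"  # placeholder
dst = "/tmp/gcp-credentials.json"
shutil.copy(src, dst)

os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = dst
# From here, e.g. google.cloud.vision clients pick the file up automatically.
```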