- 1526 Views
- 2 replies
- 0 kudos
Hello, we have a Databricks account & workspace, provided by AWS, with SSO enabled. Is there any way to access the Databricks workspace API (jobs/clusters, etc.) using a token retrieved from the Identity Provider? We can access the Databricks workspace API with A...
Latest Reply
Hey Costin and Anonymous user, have you managed to get this working? Do you have examples by any chance? I'm also trying something similar but I haven't been able to make it work.> authenticate and access the Databricks REST API by setting the Autho...
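For what it's worth, a minimal sketch of the approach the quoted reply describes, assuming you already hold a valid bearer token from your identity provider. The host and token values are placeholders, not values from this thread:

```python
# Minimal sketch: calling the Databricks REST API with a bearer token obtained
# from an external identity provider. Host and token below are placeholders.

def build_request(host: str, endpoint: str, token: str):
    """Build the URL and Authorization header for a Databricks REST API call."""
    url = f"https://{host}/api/2.0/{endpoint}"
    headers = {"Authorization": f"Bearer {token}"}
    return url, headers

if __name__ == "__main__":
    import requests  # third-party; pip install requests

    url, headers = build_request(
        "dbc-a1b2c3d4-e5f6.cloud.databricks.com",  # placeholder AWS workspace host
        "clusters/list",
        "<token-from-identity-provider>",          # placeholder token
    )
    resp = requests.get(url, headers=headers)
    print(resp.status_code)
```

Whether the workspace accepts the token depends on how SSO is configured, so treat this as a starting point rather than a confirmed recipe.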
1 More Replies
- 2496 Views
- 4 replies
- 0 kudos
Hello, I'm trying to execute a Databricks notebook from Python source code but getting an error. Source code below:
------------------
from databricks_api import DatabricksAPI
# Create a Databricks API client
api = DatabricksAPI(host='databrick_host', tok...
Latest Reply
The error you are encountering indicates that there is an issue with establishing a connection to the Databricks host specified in your code. Specifically, the error message "getaddrinfo failed" suggests that the hostname or IP address you provided f...
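As a hedged illustration of the fix the reply points at: the host must be the workspace's real DNS name, not the placeholder string 'databrick_host', and some clients expect a bare hostname without the scheme. The hostname in the comment below is made up:

```python
# Sketch: 'databrick_host' is a placeholder, so DNS resolution fails with
# "getaddrinfo failed". Pass the real workspace hostname instead.

def normalize_host(host: str) -> str:
    """Strip an optional scheme so clients that want a bare hostname get one."""
    for prefix in ("https://", "http://"):
        if host.startswith(prefix):
            return host[len(prefix):]
    return host

# Usage (placeholder values):
# from databricks_api import DatabricksAPI
# api = DatabricksAPI(
#     host=normalize_host("https://dbc-a1b2c3d4-e5f6.cloud.databricks.com"),
#     token="<personal-access-token>",
# )
```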
3 More Replies
- 12083 Views
- 2 replies
- 3 kudos
Access to Databricks APIs requires the user to authenticate. This usually means creating a PAT (Personal Access Token). Conveniently, a token is readily available to you when you are using a Databricks notebook.
databricksURL = dbutils.notebook....
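A sketch of how that in-notebook token is typically retrieved. The context accessors below are undocumented internals that circulate in community posts, so treat the names as an assumption; the first helper only works where dbutils exists:

```python
# Sketch: retrieving the workspace URL and a short-lived API token from the
# notebook context. The entry_point accessors are undocumented internals --
# an assumption based on community usage, not an official API.

def get_context_credentials(dbutils):
    """Only works inside a Databricks notebook, where `dbutils` is defined."""
    ctx = dbutils.notebook.entry_point.getDbutils().notebook().getContext()
    return ctx.apiUrl().getOrElse(None), ctx.apiToken().getOrElse(None)

def api_endpoint(base_url: str, path: str) -> str:
    """Join the context URL with a REST API path."""
    return f"{base_url.rstrip('/')}/api/2.0/{path.lstrip('/')}"
```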
- 14965 Views
- 31 replies
- 15 kudos
Hello community! I would like to update a repo from within my Azure DevOps release pipeline. In the pipeline I generate a token using an AAD Service Principal, as recommended, and I set up the Databricks API using that token. When I pass the databricks re...
Latest Reply
A traditional PAT may have a long lifespan, but the new SP feature uses an AAD token, which should have a much shorter lifespan (maybe around one hour); this could be a limiting factor. However, I haven't tested this yet, so these are merely hypotheses. Neve...
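For context, a sketch of how such a short-lived AAD token is usually requested for a service principal via the client-credentials flow. Tenant and client values are placeholders, and the scope GUID is the commonly cited Azure Databricks resource ID, so verify it against current Azure documentation:

```python
# Sketch: building the AAD client-credentials token request for Azure Databricks.
# Tenant/client values are placeholders; the scope GUID is the commonly cited
# Azure Databricks resource ID -- verify against current Azure documentation.

def aad_token_request(tenant_id: str, client_id: str, client_secret: str):
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    data = {
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": "2ff814a6-3304-4ab8-85cb-cd0e6f879c1d/.default",
    }
    return url, data

if __name__ == "__main__":
    import requests  # third-party; pip install requests

    url, data = aad_token_request("<tenant-id>", "<client-id>", "<client-secret>")
    token = requests.post(url, data=data).json().get("access_token")
```

The returned token is then passed as a Bearer token to the Databricks REST API, the same way a PAT would be.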
30 More Replies
- 1250 Views
- 2 replies
- 0 kudos
Hey everyone! I'm trying to access a table row using the Databricks API. Using Insomnia or Postman to test, the error is the same: { "error_code": "ENDPOINT_NOT_FOUND", "message": "Endpoint not found for /2.0/sql/statements/"}. Below is my request: (for ...
Latest Reply
Hi @Denis Macedo Thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers ...
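Not from the thread, but a hedged sketch of what a SQL Statement Execution API request typically looks like. Note the path has no trailing slash and a SQL warehouse ID is required; all values below are placeholders, so check the current REST reference for your workspace:

```python
# Sketch of a SQL Statement Execution API request. An ENDPOINT_NOT_FOUND for
# "/2.0/sql/statements/" can be caused by a trailing slash or by a workspace
# where the endpoint isn't available. All values below are placeholders.

def sql_statement_request(host: str, warehouse_id: str, statement: str):
    url = f"https://{host}/api/2.0/sql/statements"  # no trailing slash
    payload = {
        "warehouse_id": warehouse_id,
        "statement": statement,
        "wait_timeout": "30s",
    }
    return url, payload

if __name__ == "__main__":
    import requests  # third-party; pip install requests

    url, payload = sql_statement_request(
        "dbc-a1b2c3d4-e5f6.cloud.databricks.com",  # placeholder host
        "<warehouse-id>",
        "SELECT 1",
    )
    resp = requests.post(url, json=payload,
                         headers={"Authorization": "Bearer <token>"})
```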
1 More Replies
- 3767 Views
- 3 replies
- 6 kudos
dbutils.notebook.help only lists the "run" and "exit" methods. I could only find references to dbutils.notebook.entry_point spread across the web, but there does not seem to be official Databricks API documentation covering its complete APIs anywhere. Can so...
Latest Reply
Hi @James kuo Hope all is well! Just wanted to check in: were you able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Thanks!
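Since entry_point is undocumented, one pragmatic option (a sketch, not an official technique) is to introspect it at runtime from a notebook and see what it exposes:

```python
# Sketch: dbutils.notebook.entry_point has no official documentation, so one
# option is to introspect it at runtime from a notebook.

def public_members(obj) -> list:
    """List the non-underscore attributes of any object."""
    return sorted(name for name in dir(obj) if not name.startswith("_"))

# In a Databricks notebook you would run, e.g.:
# print(public_members(dbutils.notebook.entry_point))
```

Keep in mind anything discovered this way is an internal interface and may change without notice.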
2 More Replies
by swetha • New Contributor III
- 1813 Views
- 3 replies
- 4 kudos
I have created a job. Inside the job I have created tasks which are independent; I have used the concept of concurrent futures to exhibit parallelism, and in each task there are a couple of notebooks running (which are independent). Each notebook running ha...
Latest Reply
Hi @swetha kadiyala Hope all is well! Just wanted to check in: were you able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Th...
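For readers landing here, a minimal sketch of the concurrent-futures pattern the question describes. The notebook paths and the dbutils call are placeholders, and dbutils only exists inside a Databricks notebook:

```python
# Sketch: running independent notebooks in parallel with concurrent.futures.
# The notebook paths and the dbutils.notebook.run call are placeholders.
from concurrent.futures import ThreadPoolExecutor

def run_all(run_notebook, paths, timeout_seconds=600):
    """Run independent tasks in parallel threads; results in input order."""
    with ThreadPoolExecutor(max_workers=max(len(paths), 1)) as pool:
        futures = [pool.submit(run_notebook, p, timeout_seconds) for p in paths]
        return [f.result() for f in futures]

# Inside a Databricks notebook:
# results = run_all(lambda path, t: dbutils.notebook.run(path, t),
#                   ["/Jobs/nb_a", "/Jobs/nb_b"])
```

Calling `f.result()` re-raises any exception from a notebook run in the driver, which is usually the behavior you want for failure reporting.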
2 More Replies
- 3251 Views
- 4 replies
- 2 kudos
I am trying to create Databricks Jobs and Delta Live Table (DLT) pipelines by using the Databricks API. I would like to have the JSON code of the Jobs and DLT pipelines in the repository (to configure the code per environment) and execute the Databricks API by passing...
Latest Reply
Hi Jose, yes, it answered my question. I am indeed using a JSON file to create Jobs and pipelines. Thanks.
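A hedged sketch of that JSON-file-driven flow: the file path, host, and job spec are placeholders, and the 2.1 Jobs create endpoint is assumed, so check the current Jobs API reference:

```python
# Sketch: load a job spec kept in the repository as JSON and send it to the
# Jobs create endpoint. Path, host, and spec contents are placeholders.
import json

def load_spec(path: str) -> dict:
    with open(path, "r", encoding="utf-8") as f:
        return json.load(f)

def jobs_create_request(host: str, spec: dict):
    return f"https://{host}/api/2.1/jobs/create", spec

if __name__ == "__main__":
    import requests  # third-party; pip install requests

    url, body = jobs_create_request(
        "dbc-a1b2c3d4-e5f6.cloud.databricks.com",  # placeholder host
        load_spec("jobs/my_job.json"),             # placeholder repo path
    )
    resp = requests.post(url, json=body,
                         headers={"Authorization": "Bearer <token>"})
```

Keeping one spec file per environment (or templating a shared one) makes this easy to wire into a CI/CD pipeline.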
3 More Replies
- 1077 Views
- 3 replies
- 0 kudos
Is there any example that contains at least one widget that works with the Databricks SQL Create Dashboard API? I tried the following simple dashboard:{
"name": "Delta View",
"dashboard_filters_enabled": false,
"widgets": [
{
...
Latest Reply
@Valentin Rosca, right now Databricks also does not recommend creating new widgets via the queries and dashboards API (https://docs.databricks.com/sql/api/queries-dashboards.html#operation/sql-analytics-create-dashboard). Also, copying a dashboard fr...
2 More Replies
- 1242 Views
- 3 replies
- 2 kudos
Hello, I'm building a Databricks connector to allow users to issue commands/SQL from a web app. In general, I think the REST API is okay to work with, though it's pretty tedious to write wrapper code for each API call. [Q1] Is there an official (or semi-off...
Latest Reply
I don't know if I fully understand DBX; it sounds like a job client to manage jobs and deployment, and I don't see Node.js support for this project yet. My question was about how to "stream" query results back from Databricks in a Node.js application, curr...
2 More Replies
- 1256 Views
- 1 reply
- 1 kudos
I want to test out different APIs directly from a Databricks notebook instead of using Postman or CURL. Is this possible?
Latest Reply
If your question is about using the Databricks API from within a Databricks notebook, then the answer is yes, of course: you can definitely orchestrate anything and invoke the REST API from a Python notebook using the `requests` library already bake...
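A minimal sketch of that pattern, combining the notebook-context credentials with `requests`. The context accessors are undocumented internals, so treat them as an assumption; the commented usage only works inside a Databricks notebook:

```python
# Sketch: a tiny client for calling the REST API from inside a notebook with
# `requests`. The notebook-context accessors are undocumented internals, so
# treat them as an assumption; the endpoint shown is just an example.

class NotebookApiClient:
    def __init__(self, base_url: str, token: str):
        self.base_url = base_url.rstrip("/")
        self.headers = {"Authorization": f"Bearer {token}"}

    def url(self, path: str) -> str:
        return f"{self.base_url}/api/2.0/{path.lstrip('/')}"

# Inside a Databricks notebook:
# import requests
# ctx = dbutils.notebook.entry_point.getDbutils().notebook().getContext()
# client = NotebookApiClient(ctx.apiUrl().getOrElse(None),
#                            ctx.apiToken().getOrElse(None))
# clusters = requests.get(client.url("clusters/list"),
#                         headers=client.headers).json()
```

This avoids Postman/cURL entirely, since the notebook already has a valid short-lived token for its own workspace.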