Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

costi9992
by New Contributor III
  • 1526 Views
  • 2 replies
  • 0 kudos

Access Databricks API using IDP token

Hello, we have a Databricks account and workspace, provided by AWS, with SSO enabled. Is there any way to access the Databricks workspace API (jobs/clusters, etc.) using a token retrieved from an identity provider? We can access the Databricks workspace API with A...

Latest Reply
fpopa
New Contributor II
  • 0 kudos

Hey Costin and Anonymous user, have you managed to get this working? Do you have examples by any chance? I'm also trying something similar but I haven't been able to make it work.> authenticate and access the Databricks REST API by setting the Autho...

1 More Replies
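A minimal sketch of the approach the thread is asking about: calling the workspace REST API with a bearer token in the Authorization header. Whether an IdP-issued token is accepted depends on how token federation is configured for the account; the workspace URL and token below are hypothetical placeholders.

```python
def build_request(workspace_url, endpoint, token):
    """Assemble the URL and headers for a Databricks REST API call.

    The token may be a PAT or, where OAuth/SSO token federation is
    configured, an access token issued by the identity provider.
    """
    url = f"{workspace_url.rstrip('/')}/api/{endpoint.lstrip('/')}"
    headers = {"Authorization": f"Bearer {token}"}
    return url, headers

# Hypothetical workspace URL and token, for illustration only.
url, headers = build_request(
    "https://example.cloud.databricks.com", "2.1/jobs/list", "idp-issued-token"
)
# A GET to `url` with these headers (e.g. via the requests library)
# lists jobs if the token is accepted by the workspace.
```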
shiv4050
by New Contributor
  • 2496 Views
  • 4 replies
  • 0 kudos

Execute a Databricks notebook from Python source code

Hello, I'm trying to execute a Databricks notebook from Python source code but am getting an error. Source code below:------------------from databricks_api import DatabricksAPI   # Create a Databricks API client api = DatabricksAPI(host='databrick_host', tok...

Latest Reply
sewl
New Contributor II
  • 0 kudos

The error you are encountering indicates that there is an issue with establishing a connection to the Databricks host specified in your code. Specifically, the error message "getaddrinfo failed" suggests that the hostname or IP address you provided f...

3 More Replies
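As the reply notes, `'databrick_host'` is a placeholder rather than a resolvable hostname, which is what makes DNS lookup ("getaddrinfo") fail. A sketch of triggering a notebook as a one-time run through the Jobs 2.1 `runs/submit` endpoint, with the workspace URL, notebook path, cluster ID, and token all hypothetical:

```python
import json
from urllib import request

def notebook_run_payload(notebook_path, cluster_id):
    """Build the body for a one-time notebook run
    (Jobs API 2.1, POST /api/2.1/jobs/runs/submit)."""
    return {
        "run_name": "api-triggered-run",
        "tasks": [{
            "task_key": "run_notebook",
            "existing_cluster_id": cluster_id,
            "notebook_task": {"notebook_path": notebook_path},
        }],
    }

# Hypothetical values; the host must be the full workspace URL,
# e.g. https://adb-1234567890123456.7.azuredatabricks.net
payload = notebook_run_payload("/Users/someone@example.com/my_notebook",
                               "0123-456789-abcdefgh")
req = request.Request(
    "https://example.cloud.databricks.com/api/2.1/jobs/runs/submit",
    data=json.dumps(payload).encode(),
    headers={"Authorization": "Bearer <token>",
             "Content-Type": "application/json"},
)
# request.urlopen(req) would submit the run; it is left out here
# because the host and token above are placeholders.
```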
User16752245312
by New Contributor III
  • 12083 Views
  • 2 replies
  • 3 kudos

How can I make Databricks API calls from notebook?

Access to Databricks APIs requires the user to authenticate. This usually means creating a PAT (Personal Access Token). Conveniently, a token is readily available to you when you are using a Databricks notebook. databricksURL = dbutils.notebook....

Latest Reply
519320
New Contributor II
  • 3 kudos

hmmm.... no resolution yet? 

1 More Replies
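The excerpt above relies on the notebook context that `dbutils` exposes, which carries the workspace API URL and an ephemeral token. A sketch of extracting both from the context JSON; the `extraContext` field names are an assumption based on observed context documents and may change:

```python
import json

def context_api_credentials(context_json):
    """Extract the workspace API URL and ephemeral API token from the
    notebook-context JSON. Inside a notebook you would obtain it via:

        ctx = dbutils.notebook.entry_point.getDbutils().notebook().getContext()
        url, token = context_api_credentials(ctx.toJson())

    Field names ("extraContext", "api_url", "api_token") are an
    assumption; verify against your workspace's actual context.
    """
    extra = json.loads(context_json)["extraContext"]
    return extra["api_url"], extra["api_token"]

# Illustrative context document (not real credentials).
sample = json.dumps({"extraContext": {
    "api_url": "https://example.cloud.databricks.com",
    "api_token": "dapi-example",
}})
url, token = context_api_credentials(sample)
```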
pantelis_mare
by Contributor III
  • 14965 Views
  • 31 replies
  • 15 kudos

Resolved! Repos configuration for Azure Service Principal

Hello community! I would like to update a repo from within my Azure DevOps release pipeline. In the pipeline I generate a token using an AAD Service Principal, as recommended, and I set up the databricks api using that token. When I pass the databricks re...

Latest Reply
xiangzhu
Contributor II
  • 15 kudos

A traditional PAT may have a long lifespan, but the new SP feature uses an AAD token, which should have a much shorter lifespan, maybe around one hour; this could be a limiting factor. However, I haven't tested this yet, so these are merely hypotheses. Neve...

30 More Replies
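For reference, the token the pipeline generates comes from AAD's client-credentials flow. A sketch of building that token request; the tenant, client ID, and secret are hypothetical, while the Azure Databricks resource ID is the fixed value documented by Microsoft. As the reply above notes, these tokens expire quickly (roughly an hour), so pipelines must fetch a fresh one per run:

```python
from urllib.parse import urlencode

# AAD application ID of the Azure Databricks resource; this value is
# documented by Microsoft and is the same for all tenants.
DATABRICKS_RESOURCE_ID = "2ff814a6-3304-4ab8-85cb-cd0e6f879c1d"

def aad_token_request(tenant_id, client_id, client_secret):
    """Build the client-credentials request that exchanges a service
    principal's secret for an AAD access token usable with Databricks."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    body = urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": f"{DATABRICKS_RESOURCE_ID}/.default",
    })
    return url, body

# Hypothetical tenant and app registration values.
url, body = aad_token_request("my-tenant-id", "my-app-id", "my-secret")
# POST `body` to `url` with Content-Type application/x-www-form-urlencoded;
# the access_token in the JSON response goes in the Authorization header.
```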
DenisMcd
by New Contributor
  • 1250 Views
  • 2 replies
  • 0 kudos

"Endpoint not found for /2.0/sql/statements/"

Hey everyone! I'm trying to access table rows using the Databricks API. Using Insomnia or Postman to test, the error is the same: { "error_code": "ENDPOINT_NOT_FOUND", "message": "Endpoint not found for /2.0/sql/statements/"}. Below is my request: (for ...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

Hi @Denis Macedo, thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers ...

1 More Replies
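One common cause of ENDPOINT_NOT_FOUND here is the request path: the SQL Statement Execution API lives under the `/api` prefix without a trailing slash, and it also has to be available in the workspace. A sketch of a correctly formed request, with the workspace URL and warehouse ID as hypothetical placeholders:

```python
def sql_statement_request(workspace_url, warehouse_id, statement):
    """Build a request for the SQL Statement Execution API. Note the
    /api prefix and the absence of a trailing slash; a path like
    /2.0/sql/statements/ (as in the post) is not routed."""
    url = f"{workspace_url.rstrip('/')}/api/2.0/sql/statements"
    payload = {
        "warehouse_id": warehouse_id,  # a SQL warehouse, not a cluster id
        "statement": statement,
        "wait_timeout": "30s",
    }
    return url, payload

# Hypothetical workspace and warehouse id.
url, payload = sql_statement_request(
    "https://example.cloud.databricks.com", "abc123def456", "SELECT 1"
)
```

POSTing `payload` as JSON to `url` with a bearer token should return the statement result or a pollable statement ID, assuming the API is enabled for the workspace.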
JamesKuo
by New Contributor III
  • 3767 Views
  • 3 replies
  • 6 kudos

Resolved! Where can I find API documentation to dbutils.notebook.entry_point?

dbutils.notebook.help only lists the "run" and "exit" methods. I could only find references to dbutils.notebook.entry_point spread across the web, but there does not seem to be official Databricks API documentation covering its complete APIs anywhere. Can so...

Latest Reply
Anonymous
Not applicable
  • 6 kudos

Hi @James kuo, hope all is well! Just wanted to check in on whether you were able to resolve your issue; if so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Thanks!

2 More Replies
swetha
by New Contributor III
  • 1813 Views
  • 3 replies
  • 4 kudos

Resolved! Retrieving the job-id's of a notebook running inside tasks

I have created a job. Inside the job I have created independent tasks, and I have used concurrent futures to achieve parallelism. In each task a couple of independent notebooks are running. Each running notebook ha...

Latest Reply
Anonymous
Not applicable
  • 4 kudos

Hi @swetha kadiyala, hope all is well! Just wanted to check in on whether you were able to resolve your issue; if so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Th...

2 More Replies
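One way to recover the job and run IDs from inside each running notebook is the tags in the notebook context. A sketch of parsing them out of the context JSON; the tag names (`jobId`, `runId`) are an assumption based on observed context documents, so verify against your own context:

```python
import json

def current_job_ids(context_json):
    """Pull the jobId/runId tags from the notebook-context JSON.
    Inside a notebook you would obtain the JSON via:

        ctx = dbutils.notebook.entry_point.getDbutils().notebook().getContext()
        ids = current_job_ids(ctx.toJson())
    """
    tags = json.loads(context_json).get("tags", {})
    return {key: tags.get(key) for key in ("jobId", "runId")}

# Illustrative context document, not output from a real run.
sample = json.dumps({"tags": {"jobId": "123", "runId": "456", "other": "x"}})
ids = current_job_ids(sample)
```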
Deepak_Goldwyn
by New Contributor III
  • 3251 Views
  • 4 replies
  • 2 kudos

Resolved! Create Jobs and Pipelines in Workflows using API

I am trying to create Databricks Jobs and Delta Live Tables (DLT) pipelines using the Databricks API. I would like to keep the JSON definitions of the Jobs and DLT pipelines in the repository (to configure the code per environment) and execute the Databricks API by passing...

Latest Reply
Deepak_Goldwyn
New Contributor III
  • 2 kudos

Hi Jose, yes, it answered my question. I am indeed using JSON files to create Jobs and pipelines. Thanks.

3 More Replies
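The pattern the thread settles on can be sketched as: keep the job/pipeline definition as JSON in the repo, apply per-environment overrides, then POST it to the API. The file name, field values, and override keys below are illustrative; the targets would be `/api/2.1/jobs/create` for jobs and `/api/2.0/pipelines` for DLT pipelines:

```python
import json
import tempfile
from pathlib import Path

def load_job_spec(path, overrides):
    """Read a job (or DLT pipeline) definition JSON from the repo and
    apply per-environment overrides before sending it to the API."""
    spec = json.loads(Path(path).read_text())
    spec.update(overrides)
    return spec

# Illustrative job definition standing in for a file in the repo.
demo = {"name": "nightly-load", "max_concurrent_runs": 1}
with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as f:
    json.dump(demo, f)

# Per-environment override, e.g. a dev-specific job name.
spec = load_job_spec(f.name, {"name": "nightly-load-dev"})
# POST json.dumps(spec) to https://<workspace>/api/2.1/jobs/create
```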
Valentin1
by New Contributor III
  • 1077 Views
  • 3 replies
  • 0 kudos

docs.databricks.com

Is there any example that contains at least one widget and works with the Databricks SQL Create Dashboard API? I tried the following simple dashboard:{ "name": "Delta View", "dashboard_filters_enabled": false, "widgets": [ { ...

Latest Reply
Debayan
Esteemed Contributor III
  • 0 kudos

@Valentin Rosca, right now Databricks also does not recommend creating new widgets via the Queries and Dashboards API (https://docs.databricks.com/sql/api/queries-dashboards.html#operation/sql-analytics-create-dashboard). Also, copying a dashboard fr...

2 More Replies
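For anyone still experimenting despite the caveat in the reply: in the legacy API, widgets are typically attached to an already-created dashboard through a separate widgets endpoint rather than inline in the create-dashboard body. A sketch of such a widget payload; the endpoint path and field names are assumptions from the legacy schema, and the IDs are placeholders, so treat this as exploratory only:

```python
def widget_payload(dashboard_id, visualization_id):
    """Body for attaching one widget to an existing dashboard via the
    legacy endpoint (assumed: POST /api/2.0/preview/sql/widgets).
    The visualization must already exist on a saved query."""
    return {
        "dashboard_id": dashboard_id,
        "visualization_id": visualization_id,
        "width": 1,
        "options": {"position": {"col": 0, "row": 0, "sizeX": 3, "sizeY": 3}},
    }

# Placeholder IDs for illustration.
payload = widget_payload("dashboard-id-here", "visualization-id-here")
```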
shawncao
by New Contributor II
  • 1242 Views
  • 3 replies
  • 2 kudos

best practice of using data bricks API

Hello, I'm building a Databricks connector to allow users to issue commands/SQL from a web app. In general, I think the REST API is okay to work with, though it's pretty tedious to write wrapper code for each API call. [Q1] Is there an official (or semi-off...

Latest Reply
shawncao
New Contributor II
  • 2 kudos

I don't know if I fully understand DBX; it sounds like a job client for managing jobs and deployments, and I don't see NodeJS support for this project yet. My question was about how to "stream" query results back from Databricks in a NodeJS application; curr...

2 More Replies
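The "streaming" the poster wants usually reduces to fetching the result set in batches rather than materializing it all at once. A Python sketch of the pattern (DB-API drivers such as databricks-sql-connector expose `fetchmany`; a NodeJS driver would do the same with its own chunked-fetch API). The cursor below is a stand-in, not a real connection:

```python
def stream_rows(cursor, batch_size=1000):
    """Yield rows batch by batch from a DB-API style cursor, so a web
    app can forward results incrementally instead of buffering the
    whole result set in memory."""
    while True:
        batch = cursor.fetchmany(batch_size)
        if not batch:
            return
        yield from batch

# Stand-in cursor for illustration; a real one would come from e.g.
# databricks.sql.connect(...).cursor() after executing a query.
class FakeCursor:
    def __init__(self, rows):
        self.rows = rows

    def fetchmany(self, n):
        out, self.rows = self.rows[:n], self.rows[n:]
        return out

rows = list(stream_rows(FakeCursor(list(range(10))), batch_size=4))
```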
tj-cycyota
by New Contributor III
  • 1256 Views
  • 1 reply
  • 1 kudos

Can you use the Databricks API from a notebook?

I want to test out different APIs directly from a Databricks notebook instead of using Postman or CURL. Is this possible?

Latest Reply
Mooune_DBU
Valued Contributor
  • 1 kudos

If your question is about using the Databricks API from within a Databricks notebook, then the answer is yes, of course: you can orchestrate anything and invoke the REST API from a Python notebook using the `requests` library already bake...
