Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

Matt101122
by Contributor
  • 2547 Views
  • 4 replies
  • 4 kudos

Resolved! Why are calls to /api/2.1/jobs/create taking 30-60 seconds to complete?

We are having an issue with calls to /api/2.1/jobs/create taking between 30 and 60 seconds to complete. Calls to /api/2.1/jobs/list complete as expected in about 6 seconds so the issue seems to be only with the job create API. This seems to only be h...

Latest Reply
Anonymous
Not applicable
  • 4 kudos

Hi @Matthew Dalesio​, hope all is well! Just wanted to check in: were you able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Th...

3 More Replies
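Not a fix for the slowness itself, but a minimal sketch of how to confirm the latency gap between the two endpoints from a script. It assumes PAT auth against the Jobs 2.1 API; the workspace URL, token, and cluster id are placeholders.

```python
import json
import time
import urllib.request

def build_request(host, token, endpoint, payload=None):
    """Build an authenticated Jobs 2.1 API request (POST when a payload is given)."""
    data = json.dumps(payload).encode() if payload is not None else None
    return urllib.request.Request(
        f"{host}/api/2.1{endpoint}",
        data=data,
        headers={"Authorization": f"Bearer {token}"},
        method="POST" if payload is not None else "GET",
    )

def timed_call(req):
    """Return (HTTP status, elapsed seconds) for one API call."""
    start = time.monotonic()
    with urllib.request.urlopen(req) as resp:
        resp.read()
        return resp.status, time.monotonic() - start

if __name__ == "__main__":
    host, token = "https://<workspace>.azuredatabricks.net", "<pat>"  # placeholders
    print(timed_call(build_request(host, token, "/jobs/list")))
    print(timed_call(build_request(host, token, "/jobs/create", {
        "name": "latency-probe",
        "tasks": [{"task_key": "noop",
                   "notebook_task": {"notebook_path": "/Shared/noop"},
                   "existing_cluster_id": "<cluster-id>"}],
    })))
```

Comparing the two timings against a trivial job payload helps rule out payload size or cluster-spec validation as the cause before opening a support ticket.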
bl12
by New Contributor II
  • 2802 Views
  • 2 replies
  • 1 kudos

Resolved! Use API to Clone Dashboards with Widgets in Databricks SQL?

Hi, I manually created a template dashboard with one widget. I wanted to clone this dashboard using the Create Dashboard API; however, I don't know what to put for the widget object. What I did was use the Retrieve API on the template dashboard, and th...

Latest Reply
Wout
Contributor
  • 1 kudos

@Akash Bhat​ how do I clone a dashboard across workspaces? Being able to deploy dashboards (with widgets!) through the API is essential to set up proper Data Engineering workflows.

1 More Replies
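One approach people use with the legacy DBSQL preview endpoints is to retrieve the template dashboard, then re-create each widget against the clone via `POST /api/2.0/preview/sql/widgets`. The field names below mirror what the retrieve call returns and should be treated as assumptions, not a documented contract:

```python
def widget_create_payload(widget, target_dashboard_id):
    """Turn a widget object from the retrieve response into a create payload."""
    payload = {
        "dashboard_id": target_dashboard_id,
        "options": widget.get("options", {}),
        "width": widget.get("width", 1),
    }
    if widget.get("visualization"):
        # Visualization widgets reference an existing visualization by id.
        payload["visualization_id"] = widget["visualization"]["id"]
    else:
        # Text widgets carry their markdown text directly.
        payload["text"] = widget.get("text", "")
    return payload
```

For cross-workspace cloning the referenced visualizations (and their underlying queries) would also have to be re-created in the target workspace first, so the ids can be remapped before building each widget payload.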
Dave_B_
by New Contributor III
  • 3451 Views
  • 2 replies
  • 2 kudos

Git Integration Configuration via Command Line or API

I have an Azure service principal that is used for our CI/CD pipelines. We do not have access to the Databricks UI via user logins. Our GitHub repos also require SSO PATs. How can I configure the Git integration for the service principal so that I ca...

Latest Reply
" src="" />
This widget could not be displayed.
This widget could not be displayed.
This widget could not be displayed.
  • 2 kudos

This widget could not be displayed.
I have an Azure service principle that is used for our CI/CD pipelines. We do not have access to the Databricks UI via user logins. Our github repos also require SSO PATs. How can I configure the git integration for the service principal so that I ca...

This widget could not be displayed.
  • 2 kudos
This widget could not be displayed.
1 More Replies
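Since the reply above didn't render, here is a sketch of the usual API-only route: authenticate to the workspace as the service principal (e.g. with an Azure AD token) and register the SSO-authorized GitHub PAT through the Git Credentials API. Names in angle brackets are placeholders:

```python
import json
import urllib.request

def git_credential_request(host, sp_token, git_pat, git_username):
    """Build POST /api/2.0/git-credentials for the calling principal."""
    body = {
        "git_provider": "gitHub",          # provider name as the API expects it
        "git_username": git_username,
        "personal_access_token": git_pat,  # the SSO-authorized GitHub PAT
    }
    return urllib.request.Request(
        f"{host}/api/2.0/git-credentials",
        data=json.dumps(body).encode(),
        headers={"Authorization": f"Bearer {sp_token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
```

Because the credential is stored for whichever principal makes the call, running this with the service principal's token configures Git integration for the service principal itself, with no UI login needed.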
hafeez
by New Contributor III
  • 3645 Views
  • 4 replies
  • 7 kudos

Azure Databricks SCIM API GA Availability plans?

When is the Azure SCIM API 2.0 going to General Availability? Currently I see it is in public preview. Also, are there any security concerns with using the preview APIs in our current production? Reference: https://learn.microsoft.com/en-us/az...

Latest Reply
Anonymous
Not applicable
  • 7 kudos

Hi @Sultan Mohideen Hafeez Aboobacker Mohammed Shahul​ Hope all is well! Just wanted to check in if you were able to resolve your issue and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more h...

3 More Replies
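For reference while the question is pending, the preview SCIM endpoints already follow the standard SCIM 2.0 shapes. A minimal sketch of listing users, assuming a PAT with admin rights (workspace URL and token are placeholders):

```python
import urllib.parse
import urllib.request

def scim_users_request(host, token, filter_expr=None):
    """Build GET /api/2.0/preview/scim/v2/Users, optionally with a SCIM filter."""
    url = f"{host}/api/2.0/preview/scim/v2/Users"
    if filter_expr:
        # SCIM filter syntax, e.g. userName eq "someone@example.com"
        url += "?" + urllib.parse.urlencode({"filter": filter_expr})
    return urllib.request.Request(
        url,
        headers={"Authorization": f"Bearer {token}",
                 "Accept": "application/scim+json"},
    )
```

The practical security note for preview APIs is the usual one: the contract can change without the usual deprecation guarantees, so pin your automation to the documented request/response fields and watch the release notes.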
Karthikeyan1
by New Contributor II
  • 1099 Views
  • 1 replies
  • 2 kudos

How to get the Databricks default pricing through an API or any endpoint (if available)?

Attaching screenshots FYI from the official site. I've checked with Inspect, but no API calls are made specifically for this default pricing data. Is there any endpoint available to scrape this?

image.png
Latest Reply
Karthikeyan1
New Contributor II
  • 2 kudos

@All Users Group​ Any views on this??

Nath
by New Contributor II
  • 2167 Views
  • 3 replies
  • 2 kudos

Resolved! Error with multiple FeatureLookup calls outside databricks

I access the Databricks Feature Store outside Databricks with databricks-connect from my IDE, PyCharm. The problem occurs only outside Databricks, not with a notebook inside Databricks. I use the FeatureLookup mechanism to pull data from Feature Store tables in my cus...

Latest Reply
shan_chandra
Databricks Employee
  • 2 kudos

Also, please refer to the KB below for an additional resolution: https://learn.microsoft.com/en-us/azure/databricks/kb/dev-tools/dbconnect-protoserializer-stackoverflow

2 More Replies
RantoB
by Valued Contributor
  • 7819 Views
  • 10 replies
  • 8 kudos

Resolved! Import notebook with python script using API

Hi, I would like to import a Python notebook to my Databricks workspace from my local machine using a Python script. I managed to create the folder, but then I get a status code 400 when I try to import a file: create_folder = requests.post( '{}/api/...

Latest Reply
RantoB
Valued Contributor
  • 8 kudos

Hi, thanks for your answer. Actually, both your code and mine are working. However, I cannot write in the Repos directory, which is reserved (but I can create subdirectories...). Thanks to your code I got an error message which helped me to understand. Wi...

9 More Replies
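A 400 on this call is often a body-formatting issue: `POST /api/2.0/workspace/import` expects base64-encoded content plus format/language fields, not the raw file bytes. A sketch of building the body (the target path is whatever you choose, keeping in mind the thread's finding that the top level of /Repos is reserved):

```python
import base64

def import_payload(path, source, language="PYTHON", fmt="SOURCE", overwrite=True):
    """Build the JSON body for POST /api/2.0/workspace/import."""
    return {
        "path": path,
        "format": fmt,
        "language": language,
        "overwrite": overwrite,
        # The API requires base64-encoded content, not the raw file text.
        "content": base64.b64encode(source.encode()).decode(),
    }
```

Sending a non-base64 `content` or omitting `format` is a common way to get exactly the opaque 400 described above, so it's worth checking before anything else.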
Shubhamgoyal
by New Contributor
  • 1849 Views
  • 2 replies
  • 1 kudos

Access Databricks SQL databases using rest API

Hi all, we want to read/write data to Databricks SQL using Power Apps. I have been looking for documentation around accessing databases in Databricks SQL via REST API. I'd appreciate your help on this.

Latest Reply
byrdman
New Contributor III
  • 1 kudos

With the Databricks API you can start a workflow job. Build the job to ingest your data into tables.

1 More Replies
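A more direct option than a workflow job is the SQL Statement Execution API (`POST /api/2.0/sql/statements`), which runs a statement against a SQL warehouse and returns results over REST, so Power Apps can reach it via a custom connector. A sketch of the request body (the warehouse id is a placeholder):

```python
def sql_statement_payload(warehouse_id, statement, wait_timeout="30s"):
    """Body for POST /api/2.0/sql/statements against a SQL warehouse."""
    return {
        "warehouse_id": warehouse_id,
        "statement": statement,
        "wait_timeout": wait_timeout,  # how long to block for a synchronous result
    }
```

With `wait_timeout` set, short queries come back synchronously in the response; longer ones return a statement id to poll, which maps naturally onto a low-code client's request model.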
talha
by New Contributor III
  • 1726 Views
  • 3 replies
  • 2 kudos

How to fetch runs by cluster id via the API

Task to achieve: we have a cluster id and want to fetch all runs against it. Currently I have to get all the runs, iterate through them, and filter out the runs with the required cluster id. Similarly, what if I want to fetch all the runs that are active?

Latest Reply
Vidula
Honored Contributor
  • 2 kudos

Hi @Muhammad Talha Jamil​ Hope all is well! Just wanted to check in if you were able to resolve your issue and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from y...

2 More Replies
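As far as I know the Jobs API offers no server-side cluster filter, so the iterate-and-filter approach in the question is essentially what's available; active runs can at least be narrowed server-side with `active_only=true` on `/api/2.1/jobs/runs/list`. A sketch of the client-side filter over one response page:

```python
def runs_for_cluster(runs_page, cluster_id):
    """Keep runs whose cluster_instance matches the given cluster id."""
    return [
        r for r in runs_page.get("runs", [])
        if r.get("cluster_instance", {}).get("cluster_id") == cluster_id
    ]
```

Pages are paginated (the response carries `has_more`), so in practice you loop pages, applying this filter to each, until `has_more` is false.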
data_boy_2022
by New Contributor III
  • 2698 Views
  • 2 replies
  • 1 kudos

Resolved! What are the options to offer a low latency API for small tables derived from big tables?

I have a big dataset which gets divided into smaller datasets. For some of these smaller datasets I'd like to offer a low-latency API (*** ms) to query them. Big dataset: 1B entries. Smaller dataset: 1M entries. What's the best way to do it? I thought ab...

Latest Reply
Vidula
Honored Contributor
  • 1 kudos

Hi @Jan R​, does @Tian Tan​'s response answer your question? If yes, would you be happy to mark it as best so that other members can find the solution more quickly? We'd love to hear from you. Thanks!

1 More Replies
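One common pattern for the "small derived table, millisecond lookups" case: periodically export the 1M-row table out of the lakehouse into a serving store keyed by the lookup column (an in-process dict, Redis, or DynamoDB), and answer the API from that store rather than from a SQL warehouse. A minimal in-process sketch of the serving side:

```python
def build_index(rows, key_column):
    """Index the exported rows of the small derived table by the API's lookup key."""
    return {row[key_column]: row for row in rows}

def lookup(index, key):
    """Constant-time lookup served by the API layer, no warehouse round trip."""
    return index.get(key)
```

The trade-off is freshness: the index is only as current as the last export, so the export job's schedule effectively becomes the API's staleness bound.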
bl12
by New Contributor II
  • 2658 Views
  • 2 replies
  • 2 kudos

Resolved! Any ways to power a Databricks SQL dashboard widget with a dynamic query?

Hi, I'm using Databricks SQL and I need to power the same widget in a dashboard with a dynamic query. Are there any recommended solutions for this? For more context, I'm building a feature that allows people to see the size of something. That size is...

Latest Reply
AmanSehgal
Honored Contributor III
  • 2 kudos

I believe Redash isn't built that way within Databricks; it's still very limited in its capabilities. I have two solutions for you. I haven't tried either, but see if one works for you: use Preset with DBSQL, or try a hack - read below. I'm assuming you have one wi...

1 More Replies
ThomasKastl
by Contributor
  • 5148 Views
  • 4 replies
  • 4 kudos

Calling Databricks API from Databricks notebook

A similar question has already been added, but the reply is very confusing to me. Basically, for automated jobs, I want to log the following information from inside a Python notebook that runs in the job: - What is the cluster configuration (most im...

Latest Reply
jose_gonzalez
Databricks Employee
  • 4 kudos

Hi @Thomas Kastl​, just a friendly follow-up. Did any of the responses help you resolve your question? If so, please mark one as best. Otherwise, please let us know if you still need help.

3 More Replies
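For the cluster-configuration part of the question, one workable route is to call the Clusters API from inside the job's notebook. The request builder below is plain stdlib; the commented context calls for obtaining the host and a short-lived token at runtime are assumptions about the notebook environment and should be verified on your runtime version:

```python
import urllib.parse
import urllib.request

def cluster_get_request(host, token, cluster_id):
    """Build GET /api/2.0/clusters/get?cluster_id=... with bearer auth."""
    qs = urllib.parse.urlencode({"cluster_id": cluster_id})
    return urllib.request.Request(
        f"{host}/api/2.0/clusters/get?{qs}",
        headers={"Authorization": f"Bearer {token}"},
    )

# Inside a notebook (assumed runtime helpers, not plain-Python APIs):
# ctx = dbutils.notebook.entry_point.getDbutils().notebook().getContext()
# host = "https://" + ctx.browserHostName().get()
# token = ctx.apiToken().get()
# cluster_id = spark.conf.get("spark.databricks.clusterUsageTags.clusterId")
```

The response includes node types, autoscaling settings, and Spark conf, which covers most of what a job would want to log about its own cluster.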
shawncao
by New Contributor II
  • 1949 Views
  • 3 replies
  • 2 kudos

Best practices for using the Databricks API

Hello, I'm building a Databricks connector to allow users to issue commands/SQL from a web app. In general, I think the REST API is okay to work with, though it's pretty tedious to write wrapper code for each API call. [Q1] Is there an official (or semi-off...

Latest Reply
shawncao
New Contributor II
  • 2 kudos

I don't know if I fully understand DBX; it sounds like a job client for managing jobs and deployments, and I don't see Node.js support for this project yet. My question was about how to "stream" query results back from Databricks in a Node.js application; curr...

2 More Replies
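On [Q1]: official SDKs now cover much of this ground (the `databricks-sdk` package for Python, and for the Node.js streaming case the Databricks SQL Driver for Node.js, `@databricks/sql`, is the usual pointer). Until one of those fits, a thin generic wrapper removes most of the per-endpoint boilerplate; a sketch:

```python
class DatabricksAPI:
    """Tiny helper so each endpoint doesn't need hand-written wrapper code."""

    def __init__(self, host, token):
        self.host = host.rstrip("/")
        self.headers = {"Authorization": f"Bearer {token}"}

    def url(self, endpoint, version="2.1"):
        """Absolute URL for an API endpoint, e.g. url('/jobs/list')."""
        return f"{self.host}/api/{version}{endpoint}"
```

Centralizing host normalization, auth headers, and version prefixes in one place means each new endpoint is a one-line call rather than another copy of the same request scaffolding.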
MarcJustice
by New Contributor
  • 1543 Views
  • 2 replies
  • 3 kudos

Is the promise of a data lake simply about data science, data analytics, and data quality, or can it also be an integral part of core transaction processing?

Upfront, I want to let you know that I'm not a veteran data jockey, so I apologize if this topic has been covered already or is simply too basic or narrow for this community. That said, I do need help so please feel free to point me in another direc...

Latest Reply
Aashita
Databricks Employee
  • 3 kudos

@Marc Barnett​ , Databricks’ Lakehouse architecture is the ideal data architecture for data-driven organizations. It combines the best qualities of data warehouses and data lakes to provide a single solution for all major data workloads and supports ...

1 More Replies
Sree_Patllola
by New Contributor
  • 1404 Views
  • 0 replies
  • 0 kudos

I am in the process of connecting to an X vendor and pulling back the data needed from that vendor.

For that we have shared our Azure IP address (no VPN or corporate IP address available as of now - still in the initial stages of the project) with the X vendor, which is now whitelisted. Now I am trying to set up the X vendor API in Databricks to look up into...
