Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

User15787040559
by New Contributor III
  • 2482 Views
  • 2 replies
  • 0 kudos

[Attachment: MicrosoftTeams-image]

ERROR Max retries exceeded with url: /api/2.0/jobs/runs/get?run_id= Failed to establish a new connection. This error can happen when exceeding the rate limits for all REST API calls, as documented here. In the image shown, for example, we're using the Jobs...
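A minimal sketch of one way to back off and retry when the Jobs API rate-limits calls like this runs/get request (HTTP 429); the workspace URL, token, and retry counts are placeholder assumptions, not values from the thread:

import time
import requests

# Placeholder workspace URL and token -- replace with your own.
HOST = "https://<databricks-instance>"
TOKEN = "<personal-access-token>"

def get_run(run_id, max_retries=5):
    """Poll /api/2.0/jobs/runs/get, backing off when rate-limited."""
    url = f"{HOST}/api/2.0/jobs/runs/get"
    headers = {"Authorization": f"Bearer {TOKEN}"}
    for attempt in range(max_retries):
        resp = requests.get(url, headers=headers, params={"run_id": run_id})
        if resp.status_code == 429:          # rate limited -- wait and try again
            time.sleep(2 ** attempt)
            continue
        resp.raise_for_status()
        return resp.json()
    raise RuntimeError("Still rate-limited after retries")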

Latest Reply
User16764241763
Honored Contributor
  • 0 kudos

Hi @Carlos Morillo, are you facing this issue consistently, or when you run a lot of jobs? We are internally tracking a similar issue. Could you please file a support request with Microsoft Support? Databricks and MSFT will collaborate and provide upd...

1 More Replies
sparkstreaming
by New Contributor III
  • 6208 Views
  • 7 replies
  • 6 kudos

Resolved! Rest API invocation for databricks notebook fails while invoking from ADF pipeline

In the current implementation, a streaming Databricks notebook needs to be started based on the configuration passed. Since the rest of the Databricks notebooks are being invoked using ADF, it was decided to use ADF for starting these notebooks. Since t...

Latest Reply
Kaniz_Fatma
Community Manager
  • 6 kudos

Hi @Prasanth KP, just a friendly follow-up. Do you still need help, or did @Hubert Dudek (Customer)'s and @Werner Stinckens's responses help you find the solution? Please let us know.

6 More Replies
shawncao
by New Contributor II
  • 3674 Views
  • 0 replies
  • 0 kudos

REST api to execute SQL query and read output

Hi there, I'm using these two APIs to execute SQL statements and read the output back when execution finishes. However, it seems it always returns only 1000 rows even though I need all the results (millions of rows). Is there a solution for this? execute SQL: htt...
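The two API links in the post are truncated; as one possible route to more than the first page of results, the SQL Statement Execution API returns output in chunks that can be paged through. A rough sketch, with endpoint paths and field names taken from that API (treat them as assumptions to verify against the docs):

import time
import requests

HOST = "https://<databricks-instance>"      # placeholder
TOKEN = "<personal-access-token>"           # placeholder
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

def fetch_all_rows(warehouse_id, statement):
    """Submit a statement, wait for it to finish, then page through every result chunk."""
    submit = requests.post(
        f"{HOST}/api/2.0/sql/statements",
        headers=HEADERS,
        json={"warehouse_id": warehouse_id, "statement": statement, "wait_timeout": "0s"},
    ).json()
    statement_id = submit["statement_id"]

    # Poll until the statement reaches a terminal state.
    while True:
        status = requests.get(
            f"{HOST}/api/2.0/sql/statements/{statement_id}", headers=HEADERS
        ).json()
        if status["status"]["state"] in ("SUCCEEDED", "FAILED", "CANCELED"):
            break
        time.sleep(2)
    if status["status"]["state"] != "SUCCEEDED":
        raise RuntimeError(f"Statement ended in state {status['status']['state']}")

    # Walk every chunk instead of stopping at the first page of rows.
    rows, chunk_index = [], 0
    total_chunks = status["manifest"]["total_chunk_count"]
    while chunk_index < total_chunks:
        chunk = requests.get(
            f"{HOST}/api/2.0/sql/statements/{statement_id}/result/chunks/{chunk_index}",
            headers=HEADERS,
        ).json()
        rows.extend(chunk.get("data_array", []))
        chunk_index += 1
    return rows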

hrushi2000
by New Contributor
  • 679 Views
  • 1 reply
  • 0 kudos

Machine learning is enabling computers to tackle tasks that have, until now, only been carried out by people. From driving cars to translati...

Machine learning is enabling computers to tackle tasks that have, until now, only been carried out by people. From driving cars to translating speech, machine learning is driving an explosion in the capabilities of computing – serv...

Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 0 kudos

@[Kaniz Fatma]​ @[Vartika]​ SPAM

gibbona1
by New Contributor II
  • 3797 Views
  • 5 replies
  • 1 kudos

Resolved! Correct setup and format for calling REST API for image classification

I trained a basic image classification model on MNIST using Tensorflow, logging the experiment run with MLflow.
Model: "my_sequential"
_________________________________________________________________
Layer (type)                 Output Shape              ...

[Attachment: mnist_model_error]
Latest Reply
Atanu
Esteemed Contributor
  • 1 kudos

@Anthony Gibbons, maybe this GitHub issue could work with your use case - https://github.com/mlflow/mlflow/issues/1661
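The linked issue discusses the request format; as a rough illustration, a served MLflow/TensorFlow model's /invocations endpoint can be called with a TF-serving-style "instances" payload. The URL, token, and payload keys below are assumptions that depend on the MLflow and serving version in use:

import json
import requests
import numpy as np

# Placeholder endpoint and token -- the real URL comes from the model serving UI.
URL = "https://<databricks-instance>/model/<model-name>/<version>/invocations"
TOKEN = "<personal-access-token>"

# One 28x28 grayscale MNIST image, scaled to [0, 1], shaped (1, 28, 28).
image = np.random.rand(1, 28, 28).astype("float32")

payload = {"instances": image.tolist()}     # TF-serving-style tensor input
resp = requests.post(
    URL,
    headers={"Authorization": f"Bearer {TOKEN}", "Content-Type": "application/json"},
    data=json.dumps(payload),
)
print(resp.json())                          # class scores returned by the model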

4 More Replies
Nilave
by New Contributor III
  • 4128 Views
  • 4 replies
  • 2 kudos

Resolved! Solution for API hosted on Databricks

I'm using Azure Databricks Python notebooks. We are preparing a front end to display the Databricks tables via an API that queries the tables. Is there a solution from Databricks to host callable APIs for querying its tables and sending the result as a response to the fro...
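One common pattern (an assumption here, not taken from the thread) is a small backend service that queries a Databricks SQL endpoint through the databricks-sql-connector package and returns JSON to the front end; the hostname, HTTP path, token, and table name below are placeholders:

# Requires: pip install databricks-sql-connector
from databricks import sql

# Placeholder connection details -- taken from the SQL endpoint's connection tab.
conn = sql.connect(
    server_hostname="adb-1234567890123456.7.azuredatabricks.net",
    http_path="/sql/1.0/endpoints/<endpoint-id>",
    access_token="<personal-access-token>",
)

def query_table(limit=100):
    """Return rows from a Delta table as a list of dicts for a JSON API response."""
    with conn.cursor() as cursor:
        cursor.execute(f"SELECT * FROM my_catalog.my_schema.my_table LIMIT {limit}")
        columns = [c[0] for c in cursor.description]
        return [dict(zip(columns, row)) for row in cursor.fetchall()]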

Latest Reply
Nilave
New Contributor III
  • 2 kudos

@Prabakar Ammeappin, thanks for the link. I was also wondering, for a web page front end, would it be more effective to query from a SQL database or from Azure Databricks tables? If from an Azure SQL database, is there any efficient way to sync the tables from Az...

3 More Replies
Junee
by New Contributor III
  • 4766 Views
  • 7 replies
  • 3 kudos

Resolved! What happens to the clusters whose jobs are canceled or terminated due to failures? (Jobs triggered through Jobs API 2.1 using runs/submit)

I am using the Databricks Jobs API 2.1 to trigger and run my jobs. The "jobs/runs/submit" API helps in starting the cluster, as well as creating the job and running it. This API works great for normal jobs as it also cleans up the cluster once the job is finished suc...
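For context, a minimal sketch of the kind of runs/submit call being discussed; the new_cluster created for the run is terminated by Databricks once the run completes, fails, or is canceled. The workspace URL, token, notebook path, and cluster settings are placeholder assumptions:

import requests

HOST = "https://<databricks-instance>"      # placeholder
TOKEN = "<personal-access-token>"           # placeholder

payload = {
    "run_name": "one-off-notebook-run",
    "new_cluster": {                         # job cluster created for this run only
        "spark_version": "9.1.x-scala2.12",
        "node_type_id": "Standard_DS3_v2",
        "num_workers": 2,
    },
    "notebook_task": {"notebook_path": "/Repos/project/my_notebook"},
}

resp = requests.post(
    f"{HOST}/api/2.1/jobs/runs/submit",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=payload,
)
run_id = resp.json()["run_id"]               # use this with runs/get or runs/cancel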

Latest Reply
User16871418122
Contributor III
  • 3 kudos

@Junee, Anytime! It is crisply mentioned in the doc too. https://docs.databricks.com/clusters/index.html

6 More Replies
Kaniz_Fatma
by Community Manager
  • 1446 Views
  • 1 reply
  • 1 kudos
Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 1 kudos

Basically, all that is needed is to create an API token in Databricks and then use the Jobs API as described here: https://docs.databricks.com/dev-tools/api/latest/jobs.html
The following endpoints are available:
POST https://<databricks-instance>/api/2.1/jobs/crea...
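A minimal sketch of using an API token with two of those endpoints, jobs/create and jobs/run-now; the cluster id, notebook path, and job name are placeholders:

import requests

HOST = "https://<databricks-instance>"      # placeholder
HEADERS = {"Authorization": "Bearer <api-token>"}

# Create a job that runs a notebook on an existing cluster.
job = requests.post(
    f"{HOST}/api/2.1/jobs/create",
    headers=HEADERS,
    json={
        "name": "example-job",
        "tasks": [{
            "task_key": "main",
            "existing_cluster_id": "1234-567890-abcde123",
            "notebook_task": {"notebook_path": "/Users/me/my_notebook"},
        }],
    },
).json()

# Trigger it immediately and keep the run id for polling.
run = requests.post(
    f"{HOST}/api/2.1/jobs/run-now",
    headers=HEADERS,
    json={"job_id": job["job_id"]},
).json()
print(run["run_id"])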

amichel
by New Contributor III
  • 5135 Views
  • 3 replies
  • 2 kudos

Resolved! Is there a way to refresh tokens issued on behalf of service principal?

I want to be able to refresh tokens generated on behalf of a service principal via the Token Management API, just like with any other service where OAuth is used and a refresh-token endpoint is available. Allowing indefinite or very long expiration for acc...

Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 2 kudos

A refresh option would be useful. In Azure, you could use Azure Automation to make a "refresh" script:
  • delete the token if it still exists
  • create a new token via "databricks tokens create"
  • put it into Azure Key Vault with an expiration date
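A rough sketch of that rotate-and-store approach using the Token Management API's on-behalf-of endpoint (the Key Vault step is omitted); the endpoint paths, response fields, and rotation comment are assumptions to verify against the Token Management API docs:

import requests

HOST = "https://<databricks-instance>"      # placeholder
HEADERS = {"Authorization": "Bearer <admin-api-token>"}
SP_APPLICATION_ID = "<service-principal-application-id>"
ROTATION_COMMENT = "sp-token-rotated-by-automation"

def rotate_sp_token(lifetime_seconds=86400):
    """Delete previously rotated tokens (matched by comment), then mint a fresh one."""
    tokens = requests.get(
        f"{HOST}/api/2.0/token-management/tokens", headers=HEADERS
    ).json().get("token_infos", [])
    for t in tokens:
        if t.get("comment") == ROTATION_COMMENT:
            requests.delete(
                f"{HOST}/api/2.0/token-management/tokens/{t['token_id']}",
                headers=HEADERS,
            )

    new = requests.post(
        f"{HOST}/api/2.0/token-management/on-behalf-of/tokens",
        headers=HEADERS,
        json={
            "application_id": SP_APPLICATION_ID,
            "lifetime_seconds": lifetime_seconds,
            "comment": ROTATION_COMMENT,
        },
    ).json()
    return new["token_value"]                # store in Key Vault together with its expiry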

2 More Replies
Mohit_m
by Valued Contributor II
  • 1883 Views
  • 5 replies
  • 2 kudos

Which REST API to use in order to list the groups that a specific user belongs to?

Which REST API to use in order to list the groups that a specific user belongs to?
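The accepted answer isn't shown in this excerpt; one possible approach (an assumption, not confirmed by the thread) is the SCIM Users API, whose user records include a groups attribute:

import requests

HOST = "https://<databricks-instance>"      # placeholder
HEADERS = {"Authorization": "Bearer <api-token>"}

def groups_for_user(user_name):
    """Look up a user via the SCIM API and return the display names of their groups."""
    resp = requests.get(
        f"{HOST}/api/2.0/preview/scim/v2/Users",
        headers=HEADERS,
        params={"filter": f'userName eq "{user_name}"'},
    ).json()
    resources = resp.get("Resources", [])
    if not resources:
        return []
    return [g.get("display") for g in resources[0].get("groups", [])]

print(groups_for_user("someone@example.com"))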

Latest Reply
jose_gonzalez
Moderator
  • 2 kudos

@Mohit Miglani, make sure to select the best answer so the post will be moved to the top and will help in case more users have this question in the future.

4 More Replies
Manoj
by Contributor II
  • 8821 Views
  • 4 replies
  • 8 kudos

Resolved! Is there a way to submit multiple queries to a Databricks SQL endpoint using the REST API?

Is there a way to submit multiple queries to a Databricks SQL endpoint using the REST API?

Latest Reply
BilalAslamDbrx
Honored Contributor III
  • 8 kudos

@Manoj Kumar Rayalla, DBSQL currently limits execution to 10 concurrent queries per cluster, so there could be some queuing with 30 concurrent queries. You may want to turn on multi-cluster load balancing to horizontally scale with 1 more cluster for...

3 More Replies
Braxx
by Contributor II
  • 1928 Views
  • 1 reply
  • 3 kudos

Retry api request if fails

I have a simple API request to query a table and retrieve data, which is then loaded into a dataframe. It may happen that it fails for different reasons. How can I retry it, let's say, 5 times when any kind of error takes place? Here is the API request: d...

Latest Reply
Manoj
Contributor II
  • 3 kudos

@Bartosz Wachocki, use a timeout, a retry interval, recursion and exception handling. Pseudo code below:
timeout = 300
def exec_query(query, timeout):
    try:
        df = spark.createDataFrame(sf.bulk.MyTable.query(query))
    except:
        if timeout > 0:
            sleep(60)
            exec_que...
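The snippet above is cut off by the preview; a self-contained version of the same idea (shrink the timeout budget, sleep, recurse) might look like the following, where spark and sf.bulk.MyTable.query come from the original snippet and are assumed to already exist in the notebook:

from time import sleep

RETRY_INTERVAL = 60                          # seconds between attempts
TIMEOUT = 300                                # total retry budget in seconds

def exec_query(query, timeout=TIMEOUT):
    """Run the query, retrying every RETRY_INTERVAL seconds until the budget is spent."""
    try:
        # spark and sf are assumed to be defined in the notebook, as in the original snippet.
        return spark.createDataFrame(sf.bulk.MyTable.query(query))
    except Exception:
        if timeout > 0:
            sleep(RETRY_INTERVAL)
            return exec_query(query, timeout - RETRY_INTERVAL)
        raise                                # budget exhausted -- surface the error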

NAS
by New Contributor III
  • 2011 Views
  • 2 replies
  • 0 kudos

Set tags for an MLFlow Experiment using Python?

There is this REST API: https://www.mlflow.org/docs/latest/rest-api.html#set-experiment-tag Can I do the same from Python's MLflow API?

Latest Reply
NAS
New Contributor III
  • 0 kudos

Someone answered first on Stack Overflow. Here it is:
from mlflow.tracking import MlflowClient

# Create an experiment with a name that is unique and case sensitive.
client = MlflowClient()
experiment_id = client.create_experiment("Social NLP Experime...
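The excerpt is truncated before the tagging call itself; the MlflowClient counterpart of that REST endpoint is set_experiment_tag, roughly as below (the experiment name and tag values are just examples):

from mlflow.tracking import MlflowClient

client = MlflowClient()
experiment_id = client.create_experiment("Social NLP Experiments")  # example name
client.set_experiment_tag(experiment_id, "nlp.framework", "Spark NLP")    # example tag

# Verify the tag landed on the experiment.
print(client.get_experiment(experiment_id).tags)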

1 More Replies
User16856693631
by New Contributor II
  • 1506 Views
  • 2 replies
  • 0 kudos

Can you create Clusters via a REST API?

Yes, you can. See here: https://docs.databricks.com/dev-tools/api/latest/clusters.html
The JSON payload would look as follows:
{
  "cluster_name": "my-cluster",
  "spark_version": "7.3.x-scala2.12",
  "node_type_id": "i3.xlarge",
  "spark_conf": {
    ...
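A short sketch of posting such a payload to the Clusters API with a personal access token; the workspace URL, token, spark_conf, and num_workers values (the original payload is truncated) are placeholder assumptions:

import requests

HOST = "https://<databricks-instance>"      # placeholder
TOKEN = "<personal-access-token>"           # placeholder

payload = {
    "cluster_name": "my-cluster",
    "spark_version": "7.3.x-scala2.12",
    "node_type_id": "i3.xlarge",
    "spark_conf": {"spark.speculation": "true"},   # example conf; the original is truncated
    "num_workers": 2,
}

resp = requests.post(
    f"{HOST}/api/2.0/clusters/create",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=payload,
)
print(resp.json()["cluster_id"])             # id of the newly created cluster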

Latest Reply
ManishPatil
New Contributor II
  • 0 kudos

One can create a cluster (or clusters) using the Clusters API @ https://docs.databricks.com/dev-tools/api/latest/clusters.html#create However, REST API 2.0 doesn't provide certain features like "Enable Table Access Control", which were introduced after REST API ...

1 More Replies
User16790091296
by Contributor II
  • 3025 Views
  • 3 replies
  • 0 kudos
Latest Reply
Mooune_DBU
Valued Contributor
  • 0 kudos

By doing a `GET` call using the cluster id:
curl --netrc -X GET \
  https://dbc-a1b2345c-d6e7.cloud.databricks.com/api/2.0/clusters/get \
  --data '{ "cluster_id": "1234-567890-myclustID" }' \
  | jq .
The response JSON will have a `state` tag which will look...

2 More Replies