Data Engineering

Forum Posts

Hubert-Dudek
by Esteemed Contributor III
  • 1546 Views
  • 5 replies
  • 18 kudos

Resolved! Azure: Permanently purge cluster logs

Is there any way to purge logs via API instead of clicking daily that option:

image.png
Latest Reply
Kaniz
Community Manager
  • 18 kudos

Hi @Hubert Dudek, just a friendly follow-up. Do you still need help, or did @Prabakar Ammeappin's response help you find the solution? Please let us know.

4 More Replies
Anonymous
by Not applicable
  • 1008 Views
  • 3 replies
  • 2 kudos

Resolved! Jobs API keeps saying the job is running

I have a library that waits until the job reaches the "TERMINATED" / "SKIPPED" state before continuing. It polls the Jobs API. Unfortunately, I'm experiencing cases where the job is terminated in the GUI but the API still keeps saying "RUNNING". There i...
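A minimal sketch of such a polling loop, using only the standard library. The endpoint path `/api/2.0/jobs/runs/get` comes from the Databricks Jobs API docs; the host, token, and run ID are placeholders, and the exact set of terminal states should be checked against your API version:

```python
import json
import time
import urllib.request

# Life-cycle states after which a run will not change again (assumed set).
TERMINAL_STATES = {"TERMINATED", "SKIPPED", "INTERNAL_ERROR"}

def is_terminal(life_cycle_state):
    # A run is finished once its life_cycle_state reaches a terminal value.
    return life_cycle_state in TERMINAL_STATES

def wait_for_run(host, token, run_id, poll_seconds=30):
    # Poll /api/2.0/jobs/runs/get until the run reaches a terminal state.
    url = f"{host}/api/2.0/jobs/runs/get?run_id={run_id}"
    while True:
        req = urllib.request.Request(
            url, headers={"Authorization": f"Bearer {token}"}
        )
        with urllib.request.urlopen(req) as resp:
            state = json.load(resp)["state"]
        if is_terminal(state["life_cycle_state"]):
            return state
        time.sleep(poll_seconds)
```

If the GUI and the API disagree, comparing the `state` object returned here against what the GUI shows (run ID included) is a reasonable first debugging step.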

Latest Reply
Prabakar
Esteemed Contributor III
  • 2 kudos

@Alessio Palma, could you please provide the API that you are using? Could you also share some sample output and logs? That would give us more information to work with.

2 More Replies
gideonvos
by New Contributor
  • 412 Views
  • 0 replies
  • 0 kudos

Databricks workspace API metadata

Hi, the API works great. However, when listing workspaces via API it would be great to also be able to get back extra metadata, for example, last modification date. Is this possible?

JakeP
by New Contributor III
  • 1179 Views
  • 3 replies
  • 1 kudos

Resolved! Is there a way to create a path under /Repos via API?

Trying to use Repos API to automate creation and updates to repos under paths not specific to a user, i.e. /Repos/Admin/<repo-name>. It seems that creating a repo via POST to /api/2.0/repos will fail if you don't include a path, and will also fail i...

Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 1 kudos

Try the Workspace API (https://docs.databricks.com/dev-tools/api/latest/workspace.html#mkdirs):

curl --netrc --request POST \
  https://dbc-a1b2345c-d6e7.cloud.databricks.com/api/2.0/workspace/mkdirs \
  --header 'Accept: application/json' \
  --dat...
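The same two-step flow (create the parent folder, then the repo) can be sketched in Python. The endpoint paths come from the Workspace and Repos API docs; the host, token, and `provider` value are placeholders to adapt:

```python
import json
import urllib.request

def _post(host, token, endpoint, payload):
    # Minimal helper for authenticated JSON POSTs to the Databricks REST API.
    req = urllib.request.Request(
        f"{host}{endpoint}",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    return urllib.request.urlopen(req)

def repo_payloads(path, git_url):
    # Build the two payloads: the parent folder for /api/2.0/workspace/mkdirs,
    # then the repo itself for /api/2.0/repos.
    parent = path.rsplit("/", 1)[0]
    return (
        {"path": parent},
        {"url": git_url, "provider": "gitHub", "path": path},
    )
```

Creating `/Repos/Admin` first via mkdirs, then POSTing the repo payload with an explicit `path`, matches the workaround described in the reply.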

2 More Replies
DavideCagnoni
by Contributor
  • 2759 Views
  • 8 replies
  • 3 kudos

Resolved! How to force pandas_on_spark plots to use all dataframe data?

When I load a table as a `pandas_on_spark` dataframe, and try to e.g. scatterplot two columns, what I obtain is a subset of the desired points. For example, if I try to plot two columns from a table with 1000000 rows, I only see some of the data - i...
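This is usually down to pandas-on-spark sampling the data before plotting. A hedged sketch of raising the cap; the option name is taken from the pyspark.pandas documentation, but both the name and a sensible value should be verified against your runtime version:

```python
# pandas-on-spark plots sample/cap the data by default; the option below
# raises that cap. The exact option name is an assumption to verify.
PLOT_ROW_CAP_OPTION = "plotting.max_rows"

try:
    import pyspark.pandas as ps
    ps.set_option(PLOT_ROW_CAP_OPTION, 1_000_000)  # plot up to 1M rows
except Exception:
    ps = None  # pyspark not installed here; shown for illustration only
```

Note that pushing the cap to the full table size can make plotting slow or exhaust driver memory, which is why the sampling exists in the first place.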

Latest Reply
Kaniz
Community Manager
  • 3 kudos

Hi @Davide Cagnoni, the Ideas Portal lets you influence the Databricks product roadmap by providing feedback directly to the product team. Use the Ideas Portal to: enter feature requests; view, comment, and vote up other users' requests; monitor the p...

7 More Replies
Sandesh87
by New Contributor III
  • 1756 Views
  • 2 replies
  • 2 kudos

Resolved! create a dataframe with all the responses from the api requests within foreachPartition

I am trying to execute an API call to get an object (JSON) from Amazon S3, and I am using foreachPartition to execute multiple calls in parallel:

df.rdd.foreachPartition(partition => {
  // Initialize list buffer
  var buffer_accounts1 = new ListBuffer[St...
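One common pitfall here: `foreachPartition` returns nothing, so results collected inside it never reach the driver; `mapPartitions`, which yields rows back, is the usual alternative. A hedged Python sketch of that pattern, where `fetch_objects` is a hypothetical stand-in for the real S3/API call:

```python
def fetch_objects(partition):
    # Stand-in for the per-row API/S3 call; yields one result per input row
    # so Spark can rebuild a DataFrame from the output.
    for row in partition:
        yield {"account": row, "payload": f"object-for-{row}"}

# With Spark this would be:
#   result_df = df.rdd.mapPartitions(fetch_objects).toDF()
# Below, a plain-Python simulation of one partition:
partition = ["acct-1", "acct-2"]
results = list(fetch_objects(partition))
```

Because the generator yields per row, each partition still makes its calls independently and in parallel across executors, but the responses come back as a DataFrame instead of vanishing inside a side effect.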

Latest Reply
jose_gonzalez
Moderator
  • 2 kudos

Hi @Sandesh Puligundla, thank you for sharing the solution. We will mark it as the "best" response so that, in the future, if another user has the same question, they will be able to find the solution right away.

1 More Replies
philm
by New Contributor
  • 982 Views
  • 2 replies
  • 1 kudos

Resolved! set_experiment_description / set_runid_description

Is there any way at all to update both experiment and run descriptions via the API? The only option I see is note.content for experiments, but that is a limited view and doesn't allow for hyperlinks. I'd like to automate exp/run descriptions externally t...

Latest Reply
Kaniz
Community Manager
  • 1 kudos

Hi @Phil Meakins, the MLflow REST API allows you to create, list, and get experiments and runs, and log parameters, metrics, and artifacts. The API is hosted under the /api route on the MLflow tracking server. For example, to list experiments on a ...
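A small sketch of that listing call with the standard library. The endpoint path appears in the MLflow REST API docs for older servers (newer releases expose `/api/2.0/mlflow/experiments/search` instead); host and token are placeholders:

```python
import json
import urllib.request

def list_experiments_url(host):
    # MLflow REST endpoint for listing experiments on older servers; newer
    # releases use /api/2.0/mlflow/experiments/search instead.
    return f"{host}/api/2.0/mlflow/experiments/list"

def list_experiments(host, token):
    # GET the experiments list with a bearer token (Databricks-style auth).
    req = urllib.request.Request(
        list_experiments_url(host),
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp).get("experiments", [])
```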

1 More Replies
Nilave
by New Contributor III
  • 2827 Views
  • 4 replies
  • 2 kudos

Resolved! Solution for API hosted on Databricks

I'm using Azure Databricks Python notebooks. We are preparing a front end that displays the Databricks tables by calling an API to query them. Is there a solution from Databricks to host callable APIs for querying its tables and sending the results as responses to fro...

Latest Reply
Nilave
New Contributor III
  • 2 kudos

@Prabakar Ammeappin, thanks for the link. Also, I was wondering whether, for a web page front end, it would be more effective to query from an SQL database or from Azure Databricks tables. If from an Azure SQL database, is there any efficient way to sync the tables from Az...

3 More Replies
Junee
by New Contributor III
  • 3072 Views
  • 7 replies
  • 3 kudos

Resolved! What happens to the clusters whose jobs are canceled or terminated due to failures? (Jobs triggered through Jobs API 2.1 using runs/submit)

I am using the Databricks Jobs API 2.1 to trigger and run my jobs. The "jobs/runs/submit" API starts the cluster, creates the job, and runs it. This API works great for normal jobs, as it also cleans up the cluster once the job finishes suc...
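For reference, a hedged sketch of the payload shape for that endpoint (field names follow the Jobs API 2.1 docs; the notebook path, node type, and Spark version are placeholders). Clusters created this way are one-time job clusters, torn down when the run ends:

```python
def one_time_run_payload(notebook_path, node_type, num_workers, spark_version):
    # Body for POST /api/2.0/jobs/runs/submit: a one-time run on a new
    # cluster that Databricks tears down when the run finishes (success,
    # cancellation, or failure alike).
    return {
        "run_name": "one-time-run",
        "new_cluster": {
            "spark_version": spark_version,
            "node_type_id": node_type,
            "num_workers": num_workers,
        },
        "notebook_task": {"notebook_path": notebook_path},
    }

payload = one_time_run_payload(
    "/Users/me@example.com/etl", "i3.xlarge", 2, "9.1.x-scala2.12"
)
```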

Latest Reply
User16871418122
Contributor III
  • 3 kudos

@Junee, anytime! It is clearly mentioned in the docs too: https://docs.databricks.com/clusters/index.html

6 More Replies
Anonymous
by Not applicable
  • 1114 Views
  • 2 replies
  • 0 kudos

Resolved! Is the "patch"/update method of the repos API synchronous?

The Repos API has a PATCH method to update a repo in the workspace (to do a git pull). We would like to verify: is this method fully synchronous? Is it guaranteed to only return a 200 after the update is complete? Or would immediately referenc...
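A small sketch of issuing that call and gating on the status code before touching the updated files. The endpoint shape `/api/2.0/repos/{repo_id}` is from the Repos API docs; repo ID, branch, host, and token are placeholders:

```python
import json
import urllib.request

def build_repo_update_request(host, repo_id, branch, token):
    # PATCH /api/2.0/repos/{repo_id} updates the checkout (i.e. a git pull).
    return urllib.request.Request(
        f"{host}/api/2.0/repos/{repo_id}",
        data=json.dumps({"branch": branch}).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="PATCH",
    )

req = build_repo_update_request("https://example.com", 123, "main", "TOKEN")
# In real use: only proceed to reference repo files after the call returns:
#   with urllib.request.urlopen(req) as resp:
#       assert resp.status == 200
```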

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @nate_at_lovelytics, in case of success, the PATCH HTTP method returns the 200 OK response code.

1 More Replies
RantoB
by Valued Contributor
  • 4968 Views
  • 5 replies
  • 0 kudos

Resolved! SSLCertVerificationError: how to disable SSL certificate verification

Hi, is it possible to disable SSL certificate verification? With the Databricks API I got this error:

SSLCertVerificationError: ("hostname 'https' doesn't match either of '*.numericable.fr', 'numericable.fr'",)
MaxRetryError: HTTPS...
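For completeness, verification can be skipped with the standard library alone, but treat this strictly as a local debugging aid: it removes the protection TLS gives you against man-in-the-middle attacks, and the error above (hostname 'https') usually just means the URL was malformed. A sketch:

```python
import ssl
import urllib.request

# An SSL context that skips certificate verification. Debugging only:
# this disables hostname checks and certificate validation entirely.
insecure_ctx = ssl._create_unverified_context()

def fetch_insecure(url):
    # Fetch a URL without verifying the server's certificate.
    return urllib.request.urlopen(url, context=insecure_ctx)
```

The better fix is to correct the URL (the error suggests "https" ended up where the hostname should be) or install the proper CA bundle.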

Latest Reply
Anonymous
Not applicable
  • 0 kudos

@Bertrand BURCKER, thanks for letting us know your issue is resolved. If @Prabakar Ammeappin's answer solved the problem, would you be happy to mark his answer as best so others can more easily find an answer for this?

4 More Replies
RantoB
by Valued Contributor
  • 1442 Views
  • 2 replies
  • 4 kudos

Resolved! Import a notebook in a Release Pipeline with a Python script

Hi, I would like to import a Python file to Databricks with an Azure DevOps Release Pipeline. Within the pipeline I execute a Python script which contains this code:

import sys
import os
import base64
import requests

dbw_url = sys.argv[1]  # https://a...
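The core of such a script is base64-encoding the file for `POST /api/2.0/workspace/import` (field names per the Workspace API docs). A hedged sketch of just the payload construction, with the target path and language as placeholders:

```python
import base64

def import_notebook_payload(source_text, target_path, language="PYTHON"):
    # Body for POST /api/2.0/workspace/import: the notebook content must be
    # base64-encoded; SOURCE format imports the file as plain source code.
    return {
        "path": target_path,
        "language": language,
        "format": "SOURCE",
        "overwrite": True,
        "content": base64.b64encode(source_text.encode()).decode(),
    }

payload = import_notebook_payload("print('hello')", "/Shared/my_notebook")
```

A common release-pipeline failure mode is forgetting the encode step, which makes the API reject the content, so round-tripping the encoding in a test is worthwhile.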

Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 4 kudos

Recently I wrote about an alternative way to export/import notebooks in Python: https://community.databricks.com/s/question/0D53f00001TgT52CAF/import-notebook-with-python-script-using-api This way you will get a more readable error/message (often it is rela...

1 More Replies
Nick_Hughes
by New Contributor III
  • 1176 Views
  • 3 replies
  • 3 kudos

Is there an alerting API please?

Is there an alerting API, so that alerts can be source-controlled and automated? https://docs.databricks.com/sql/user/alerts/index.html

Latest Reply
Dan_Z
Honored Contributor
  • 3 kudos

Hello @Nick Hughes, as of today we do not expose or document the API for these features. I think it would be a useful feature, so I created an internal feature request for it (DB-I-4289). If you (or any future readers) want more information on this f...

2 More Replies
User16826992666
by Valued Contributor
  • 707 Views
  • 1 reply
  • 0 kudos

If I write functionally equivalent code in Pyspark and Koalas, will they end up evaluating to the same execution plan?

I am wondering how similar the backend execution of the two APIs is. If I have code that does the same operations written in both styles, is there any functional difference between them when it comes to execution?

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @trevor.bishop! My name is Kaniz, and I'm a technical moderator here. Great to meet you, and thanks for your question! Let's see if your peers on the forum have an answer to your question first. Otherwise I will follow up shortly with a response.

User16790091296
by Contributor II
  • 1158 Views
  • 1 reply
  • 0 kudos
Latest Reply
amr
New Contributor III
  • 0 kudos

You need to get the REST service's API access tokens and make sure the Databricks VPC (or VNet on Azure) has connectivity to the VPC where this REST API service resides.
