Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

Constantine
by Contributor III
  • 7731 Views
  • 5 replies
  • 1 kudos

Resolved! How to use Databricks Query History API (REST API)

I have set up authentication using this page https://docs.databricks.com/sql/api/authentication.html and run curl -n -X GET https://<databricks-instance>.cloud.databricks.com/api/2.0/sql/history/queries to get the history of all SQL endpoint queries, but I...

Latest Reply
yegorski
New Contributor III
  • 1 kudos

Here's how to query with databricks-sdk-py (working code). I had a frustrating time doing it with vanilla Python + requests/urllib and couldn't figure it out.

import datetime
import os
from databricks.sdk import WorkspaceClient
from databricks.sdk.se...

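Building on yegorski's SDK approach, here is a minimal hedged sketch of listing query history with databricks-sdk-py. The time-window helper is pure Python; the `fetch_history` call assumes a workspace is configured via the usual `DATABRICKS_HOST`/`DATABRICKS_TOKEN` environment variables, and the filter class names are taken from the SDK's `sql` service as I understand it.

```python
import datetime
from typing import Optional, Tuple

def window_ms(hours_back: int, now: Optional[datetime.datetime] = None) -> Tuple[int, int]:
    """Return (start, end) of a lookback window in epoch milliseconds,
    the unit the Query History API's time-range filter expects."""
    now = now or datetime.datetime.now(datetime.timezone.utc)
    start = now - datetime.timedelta(hours=hours_back)
    return int(start.timestamp() * 1000), int(now.timestamp() * 1000)

def fetch_history(hours_back: int = 24):
    """List queries from the last `hours_back` hours (needs workspace credentials)."""
    from databricks.sdk import WorkspaceClient
    from databricks.sdk.service import sql as sql_svc

    start, end = window_ms(hours_back)
    w = WorkspaceClient()  # reads DATABRICKS_HOST / DATABRICKS_TOKEN
    flt = sql_svc.QueryFilter(
        query_start_time_range=sql_svc.TimeRange(start_time_ms=start, end_time_ms=end)
    )
    # The SDK paginates /api/2.0/sql/history/queries under the hood.
    return list(w.query_history.list(filter_by=flt))
```

This is a sketch, not a drop-in replacement for yegorski's full snippet, which is truncated above.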
satya1206
by New Contributor II
  • 1193 Views
  • 1 reply
  • 0 kudos

Compare 13.3LTS with 14.3LTS

Hello, we have plans to migrate our DBR from 13.3 LTS to 14.3 LTS. If anyone has recently completed this migration, we would like to know the major benefits we can expect from it and if there are any disadvantages or behavior changes we should be aware ...

Latest Reply
filipniziol
Esteemed Contributor
  • 0 kudos

Hi @satya1206, check out the docs: https://docs.databricks.com/en/release-notes/runtime/14.3lts.html

AlokThampi
by New Contributor III
  • 4421 Views
  • 7 replies
  • 5 kudos

Joining huge delta tables in Databricks

Hello, I am trying to join a few Delta tables as per the code below:

SELECT <applicable columns> FROM ReportTable G LEFT JOIN EKBETable EKBE ON EKBE.BELNR = G.ORDER_ID LEFT JOIN PurchaseOrder POL ON EKBE.EBELN = POL.PO_NO

The PurchaseOrder table c...

Latest Reply
AlokThampi
New Contributor III
  • 5 kudos

Hello @-werners-, @Mo, I tried the liquid clustering option as suggested but it still doesn't seem to work. I am assuming it to be an issue with the small cluster size that I am using. Or do you suggest any other options? @noorbasha534, the columns th...

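Following up on the liquid clustering suggestion in this thread: one detail that often matters is that the clustering columns match the join keys, and that OPTIMIZE is run after enabling clustering, since clustering only takes effect on data that is written or optimized afterwards. A hedged sketch that just builds the statements you would pass to spark.sql (table and column names are taken from the post; which columns to cluster on is an assumption):

```python
from typing import Sequence

def cluster_by_stmt(table: str, cols: Sequence[str]) -> str:
    """ALTER TABLE ... CLUSTER BY (...) enables liquid clustering on the given columns."""
    return f"ALTER TABLE {table} CLUSTER BY ({', '.join(cols)})"

def optimize_stmt(table: str) -> str:
    """OPTIMIZE rewrites existing files so they honor the new clustering."""
    return f"OPTIMIZE {table}"

# Cluster each side of the join on its join key (names from the post):
stmts = [
    cluster_by_stmt("EKBETable", ["BELNR", "EBELN"]),
    cluster_by_stmt("PurchaseOrder", ["PO_NO"]),
    optimize_stmt("PurchaseOrder"),
]
# On a cluster you would run: for s in stmts: spark.sql(s)
```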
AlexDavies
by Contributor
  • 10439 Views
  • 9 replies
  • 2 kudos

Report on SQL queries that are being executed

We have a SQL workspace with a cluster running that services a number of self-service reports against a range of datasets. We want to be able to analyse and report on the queries our self-service users are executing so we can get better visibility of...

Latest Reply
Anonymous
Not applicable
  • 2 kudos

Hey there @Alex Davies, hope you are doing great. Just checking in: were you able to resolve your issue, or do you need more help? We'd love to hear from you. Thanks!

StevenW
by New Contributor III
  • 1130 Views
  • 2 replies
  • 0 kudos

Workflow parameter in sql not working

I'm using the following input parameters when running from a workflow:

wid_UnityCatalogName = dbutils.jobs.taskValues.get(taskKey="NB_XXX_Workflow_Parameters", key="p_UnityCatalogName", default="xx_lakehouse_dev")
dbutils.widgets.text("UnityCat...

Latest Reply
filipniziol
Esteemed Contributor
  • 0 kudos

Hi @StevenW, I see that you are using a Python notebook and then the view is created in SQL.
1. If you are using the %sql magic command, then to use parameters you need to reference them like $parameter or :parameter (depending on the runtime).
2. If you are...

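To make the parameter-marker point concrete: in a Python notebook you can pull the task value into a widget, pass plain values to SQL via named parameter markers in spark.sql (supported on recent runtimes), and splice identifiers such as the catalog name into the statement text, since parameter markers cannot substitute identifiers. A hedged sketch; the task and widget names come from the post, while the schema/view names and the notebook context (spark, dbutils) are assumptions:

```python
def qualified_name(catalog: str, schema: str, obj: str) -> str:
    """Backtick-quote a catalog.schema.object identifier; parameter markers
    cannot substitute identifiers, so these must be built into the SQL text."""
    return f"`{catalog}`.`{schema}`.`{obj}`"

def run_in_notebook(spark, dbutils):
    # Read the upstream task value (names from the post).
    catalog = dbutils.jobs.taskValues.get(
        taskKey="NB_XXX_Workflow_Parameters",
        key="p_UnityCatalogName",
        default="xx_lakehouse_dev",
    )
    dbutils.widgets.text("UnityCatalogName", catalog)

    # Plain values can go through named parameter markers:
    spark.sql("SELECT :cat AS catalog_in_use", args={"cat": catalog})

    # Identifiers must be built into the statement itself:
    view = qualified_name(catalog, "silver", "my_view")  # schema/view are hypothetical
    spark.sql(f"CREATE OR REPLACE VIEW {view} AS SELECT 1 AS x")
```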
noorbasha534
by Valued Contributor II
  • 3328 Views
  • 7 replies
  • 6 kudos

Resolved! Retrieve table/view popularity

Dears, is there a way to retrieve the popularity score of a Unity Catalog object? I looked at the API documentation but couldn't find one that serves the need. Appreciate any thoughts. Br, Noor.

Latest Reply
noorbasha534
Valued Contributor II
  • 6 kudos

@filipniziol Hi Filip, Thank you. I did a quick test. In my environment, the table query (indirect) event is getting registered with "getTemporaryTableCredential". However, the view query (direct) event is with "getTable".Thanks for your time again. ...

james_farrugia
by New Contributor II
  • 2824 Views
  • 4 replies
  • 0 kudos

Cross workspace REST API access denied due to network policies

Hi, our data workspace architecture consists of a collection of discrete workspaces segregated according to business function and environment. Moreover, they are not all deployed to the same region: dev and staging are deployed to Southeast Asia, wher...

Latest Reply
james_farrugia
New Contributor II
  • 0 kudos

Hi @filipniziol, thanks for the tip. I might repost further queries here, especially with regards to NSGs, as I've never manipulated these manually before. One question: why is it that VNet peering/NSG alterations are not required when invoking a servic...

Maulik
by New Contributor
  • 993 Views
  • 1 reply
  • 0 kudos

how to set call back for Databricks Statement Execution SQL API Query?

I'm using https://docs.databricks.com/api/workspace/statementexecution for long-running queries. My wait time is zero. Queries might take 1 hour, and I don't want to do polling via https://docs.databricks.com/api/workspace/statementexecution/getstatemen...

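As far as I can tell, the Statement Execution API does not offer a server-side callback or webhook, so the usual pattern is to submit with wait_timeout=0 and then poll GET /api/2.0/sql/statements/{id} with backoff rather than in a tight loop. A hedged sketch; the endpoint path is from the API docs linked above, while the environment variable names are assumptions:

```python
import os
import time
from typing import List

def backoff_schedule(initial: float = 1.0, factor: float = 2.0,
                     cap: float = 60.0, retries: int = 6) -> List[float]:
    """Exponential backoff delays, capped, so a 1-hour query isn't hammered."""
    delays, delay = [], initial
    for _ in range(retries):
        delays.append(min(delay, cap))
        delay *= factor
    return delays

def poll_statement(statement_id: str) -> dict:
    """Poll until the statement leaves PENDING/RUNNING (needs workspace creds)."""
    import requests  # third-party; pip install requests
    host = os.environ["DATABRICKS_HOST"]
    headers = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}
    for delay in backoff_schedule(retries=100, cap=60.0):
        r = requests.get(f"{host}/api/2.0/sql/statements/{statement_id}", headers=headers)
        r.raise_for_status()
        body = r.json()
        if body["status"]["state"] not in ("PENDING", "RUNNING"):
            return body
        time.sleep(delay)
    raise TimeoutError(f"statement {statement_id} still running")
```

If you need a push-style notification, one option is to wrap this polling in a job that calls your own webhook once the statement finishes.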
abc1234
by New Contributor
  • 467 Views
  • 0 replies
  • 0 kudos

Point or Load data from GCP DBFS to AWS DBFS

Hi, is there a way to point GCP DBFS to AWS DBFS so as to access the data in AWS DBFS from GCP DBFS at minimal cost? We are migrating the jobs from AWS Databricks to GCP Databricks, and the data from the jobs uses AWS DBFS as an interim location. There s...

Giorgi
by Contributor
  • 10535 Views
  • 5 replies
  • 5 kudos

GitLab integration

I've followed the instructions and did the GitLab integration:
1. Generated a Personal Access Token from GitLab
2. Added the token (from step 1) to User settings (GitLab, email, token)
3. In Admin console -> Repos, Git URL Allow List permissions: Disabled (no restrictions)
4. In Adm...

Latest Reply
joshuat
Contributor
  • 5 kudos

Thanks for your reply - I did, see my reply above.

Sathish_itachi
by New Contributor III
  • 13405 Views
  • 19 replies
  • 13 kudos

Resolved! Encountering an error while accessing dbfs root folders

The error displayed on the UI is: "dbfs file browser storagecontext com.databricks.backend.storage.storagecontexttype$dbfsroot$@4155a7bf for workspace 2818007466707254 is not set in the customerstorageinfo"

Latest Reply
MJ_BE8
New Contributor III
  • 13 kudos

Got this error today when trying to import .csv file. It worked fine before (like, last two weeks?). What happened?

borori
by New Contributor II
  • 3601 Views
  • 2 replies
  • 0 kudos

Resolved! write operation to the Delta table is not completing.

Using a cluster in serverless mode, three tables are joined and the data frame is written as follows:

df.write.mode('append').saveAsTable('table name')

The schema is below:
date string (ymd format)
id bigint
value string
partitioned by date

After about one minu...

Latest Reply
borori
New Contributor II
  • 0 kudos

Thank you for your advice. I couldn't come to a conclusion based on what you told me, but it gave me an opportunity to review all the logs again. The cause was that the amount of data became too large due to a join on null values. The advice was ...

pmarko1711
by New Contributor II
  • 1470 Views
  • 2 replies
  • 0 kudos

External volume over S3 Access point

Can anybody confirm whether external volumes pointing to S3 access points work in Databricks on AWS? I have an S3 bucket, but can only access it via an S3 access point. The bucket is KMS-encrypted. I created an IAM role that can list and read the S3 access point...

Latest Reply
pmarko1711
New Contributor II
  • 0 kudos

This looks fine to me. I am the owner of the (external) volume and have the READ VOLUME privilege on it. (As for the external location, I am also its owner and have READ FILES, BROWSE, CREATE EXTERNAL TABLE, and CREATE EXTERNAL VOLUME.) One additional info I g...

turtleXturtle
by New Contributor II
  • 1082 Views
  • 1 reply
  • 0 kudos

Refreshing DELTA external table

I'm having trouble with the REFRESH TABLE command: does it work with DELTA external tables? I'm doing the following steps:

Create table: CREATE TABLE IF NOT EXISTS `catalog`.`default`.`table_name` ( KEY DOUBLE, CUSTKEY DOUBLE, STATUS STRING, PRICE D...

Latest Reply
gchandra
Databricks Employee
  • 0 kudos

Step 3: Insert the data; don't add it directly to the S3 folder. Once it's converted to Delta, it maintains the transaction log. Inserting a Parquet file (followed by another convert/refresh) won't work, as the rest of the dataset is already Delta. ...

chethankumar
by New Contributor III
  • 2112 Views
  • 2 replies
  • 0 kudos

how to assign account level groups to workspace using, Terraform

In the workspace console, when I create groups it creates the source as account; basically, it is an account-level group. But:

provider "databricks" {
  host = var.databricks_host
  # client_id = ""
  # client_secret = "
  account_id = ...

Latest Reply
jennie258fitz
New Contributor III
  • 0 kudos

@chethankumar wrote:in the workspace console when I create groups it creates a source as an account, Basically, it is a account level group, But     provider "databricks" { host = var.databricks_host # client_id = "" # client_secre...

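For what it's worth, account-level groups are attached to a workspace with the Terraform provider's permission-assignment resource, which must run against an account-level provider (authenticated with account_id against the accounts console), not the workspace host. A hedged sketch, untested; the group name, variable names, and workspace id are placeholders:

```hcl
# Account-level provider (host is the accounts console, not a workspace URL)
provider "databricks" {
  alias      = "account"
  host       = "https://accounts.cloud.databricks.com"
  account_id = var.databricks_account_id
}

# Look up the account-level group by display name (placeholder name)
data "databricks_group" "engineers" {
  provider     = databricks.account
  display_name = "data-engineers"
}

# Assign the group to a workspace
resource "databricks_mws_permission_assignment" "engineers" {
  provider     = databricks.account
  workspace_id = var.workspace_id # placeholder
  principal_id = data.databricks_group.engineers.id
  permissions  = ["USER"]
}
```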
