Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.
Data + AI Summit 2024 - Data Engineering & Streaming

Forum Posts

slakshmanan
by New Contributor III
  • 235 Views
  • 1 replies
  • 0 kudos

POST /api/2.0/sql/statements/{statement_id}/cancel forbidden error in Databricks

When I tried executing this request: POST /api/2.0/sql/statements/${SQL_STATEMENT_ID}/cancel, I got a Forbidden error. How do we get access to execute this?

Latest Reply
filipniziol
Contributor

Hi @slakshmanan, the 403 Forbidden error usually means the API request is not authorized: you likely do not have the proper permissions, or your authentication credentials are not being recognized. Here are 2 main reasons: 1. Missing or In...

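The cancel call in question can be sketched with only the standard library; the host, token, and statement ID below are placeholders, and the caller's principal still needs the permissions the reply describes:

```python
import urllib.request

def cancel_url(host: str, statement_id: str) -> str:
    """Build the Statement Execution API cancel endpoint URL."""
    return f"{host.rstrip('/')}/api/2.0/sql/statements/{statement_id}/cancel"

def cancel_statement(host: str, token: str, statement_id: str) -> int:
    """POST the cancel request with a bearer token; return the HTTP status.

    urlopen raises HTTPError on 4xx, so a 403 surfaces as an exception here:
    per the reply above, that points to missing permissions, not a bad URL.
    """
    req = urllib.request.Request(
        cancel_url(host, statement_id),
        method="POST",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```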
Nhan_Nguyen
by Valued Contributor
  • 8768 Views
  • 16 replies
  • 27 kudos

Resolved! Did not receive "Databricks Certification: Fully Sponsored" after order on Reward Store

Hi team. Would you please help check on my case? On 30 Nov I placed an order for "Databricks Certification: Fully Sponsored" on https://communitydatabricks.mybrightsites.com/ and, after waiting 10 business days, I still have not received the voucher. Is t...

Latest Reply
domenichancock
New Contributor II

If you have ordered a fully sponsored Databricks Certification through a rewards store (e.g., an online rewards platform for employees or learners) and have not received it, there are several possible reasons and steps to resolve the issue. Possible R...

15 More Replies
SaraCorralLou
by New Contributor III
  • 10895 Views
  • 8 replies
  • 2 kudos

Resolved! dbutils.fs.mv - 1 folder and 1 file with the same name and only move the folder

Hello! I am contacting you because of the following problem I am having: in an ADLS folder I have two items, a folder and an automatically generated block blob file with the same name as the folder. I want to use the dbutils.fs.mv command to move the fo...

Latest Reply
deep_coder16
New Contributor II

What are the possible reasons for the generation of those extra files with the same name and zero bytes of data?

7 More Replies
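The trailing-slash convention in dbutils.fs.ls output is what lets you separate the folder from the identically named blob file. A minimal sketch; the mv calls are commented out because they only run inside Databricks, and all paths are hypothetical:

```python
def split_dir_and_file(names):
    """Split a dbutils.fs.ls-style listing into directories and plain files.

    In Databricks, FileInfo.name carries a trailing slash for directories,
    which is what distinguishes the folder from a blob of the same name.
    """
    dirs = [n for n in names if n.endswith("/")]
    files = [n for n in names if not n.endswith("/")]
    return dirs, files

# Notebook usage sketch (hypothetical paths; runs only in Databricks):
# src = "abfss://container@account.dfs.core.windows.net/parent/"
# dst = "abfss://container@account.dfs.core.windows.net/target/"
# names = [f.name for f in dbutils.fs.ls(src)]
# dirs, _ = split_dir_and_file(names)
# for d in dirs:
#     dbutils.fs.mv(src + d, dst + d, recurse=True)
```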
slakshmanan
by New Contributor III
  • 373 Views
  • 4 replies
  • 0 kudos

How to cancel or kill a long-running SQL query from a Databricks Python notebook

How do I cancel or kill a long-running SQL query from a Databricks Python notebook? I have a long-running SQL query in a SQL warehouse.

Latest Reply
szymon_dybczak
Contributor III

Hi @slakshmanan, to stop or interrupt a running notebook, select the interrupt button in the notebook toolbar. You can also select Run > Interrupt execution, or use the keyboard shortcut I I. https://learn.microsoft.com/en-us/azure/databricks/notebook...

3 More Replies
BjarkeM
by New Contributor II
  • 4119 Views
  • 9 replies
  • 0 kudos

Schema migration of production delta tables

Goal: we would like to be in control of schema migrations of delta tables in all dev and production environments, and migrations must be automatically deployed. I anticipated this to be a common problem with a well-known standard solution. But unfortunately, I ...

Latest Reply
worlordv
New Contributor II

GitHub - liquibase/liquibase-databricks

8 More Replies
KennethKnewman
by New Contributor III
  • 274 Views
  • 3 replies
  • 7 kudos

Resolved! Gold table for Analyst

Hi team, we are running a data pipeline from bronze to gold, and another team needs to refer to the gold table. However, that team doesn't have the technical skills to query it, and they would like to use the data in spreadsheets. Do we have any good workaround in ...

Latest Reply
KennethKnewman
New Contributor III

It was easy to install. I'm not sure if this information is useful, but I'd like to share it for those who might be in the same situation. https://bricksheet.amukin.com/export-data-from-databricks-to-google-sheet

2 More Replies
MrJava
by New Contributor III
  • 8146 Views
  • 14 replies
  • 12 kudos

How to know, who started a job run?

Hi there! We have different jobs/workflows configured in our Databricks workspace running on AWS and would like to know who actually started a job run. Are they started by a user, or by a service principal using curl? Currently one can only see who is t...

Latest Reply
hodb
New Contributor II

For some reason the user_identity.email includes only "unknown" or "Sustem-User". Any ideas how to repair it so it includes the name of the person that triggered the job?

13 More Replies
mr_poola49
by New Contributor III
  • 274 Views
  • 3 replies
  • 0 kudos

ADLS gen2 config issue

I am new to Azure Databricks. I am trying to access ADLS Gen2 from Azure Databricks. I've set all the required configurations in the notebook, but when I try to query the table using spark.sql(), it throws the exception "Failure to initialize configur...

Latest Reply
mr_poola49
New Contributor III

The issue is resolved! I dropped the table from hive_metastore, which was pointing to the ppeadlsg2 storage container, and re-created it using prodadlsg2 storage.

2 More Replies
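For anyone hitting the same initialization failure, the usual service-principal (OAuth) configuration for ABFS can be sketched as below; the storage account name, credentials, and tenant ID are placeholders:

```python
def adls_oauth_conf(account, client_id, client_secret, tenant_id):
    """Spark conf entries for ADLS Gen2 access via an Azure service principal.

    Keys follow the documented fs.azure.account.* OAuth settings; values
    here are placeholders to be supplied by the caller (ideally from a
    secret scope, not hard-coded).
    """
    suffix = f"{account}.dfs.core.windows.net"
    return {
        f"fs.azure.account.auth.type.{suffix}": "OAuth",
        f"fs.azure.account.oauth.provider.type.{suffix}":
            "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
        f"fs.azure.account.oauth2.client.id.{suffix}": client_id,
        f"fs.azure.account.oauth2.client.secret.{suffix}": client_secret,
        f"fs.azure.account.oauth2.client.endpoint.{suffix}":
            f"https://login.microsoftonline.com/{tenant_id}/oauth2/token",
    }

# In a notebook one would then apply these (account/credentials hypothetical):
# for k, v in adls_oauth_conf("prodadlsg2", cid, secret, tid).items():
#     spark.conf.set(k, v)
```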
-werners-
by Esteemed Contributor III
  • 377 Views
  • 3 replies
  • 2 kudos

Resolved! asset bundles and compute policies

Did anyone succeed in using already-existing compute policies (created using the UI) in asset bundles for creating a job? I defined the policy_id in the resources/job yml for the job_cluster, but when deploying I get errors saying the spark version is not...

Latest Reply
-werners-
Esteemed Contributor III

So I figured it out. You can actually reference existing cluster policies, but I made the mistake of thinking that all cluster config was added automatically by doing so. In fact you still have to add some cluster config in the resources yaml: - spark_version - sp...

2 More Replies
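A minimal sketch of the resources YAML this describes; the policy ID, Spark version, and node type below are placeholders, and per -werners-' reply the spark_version (and node settings) must still be set explicitly even when policy_id points at an existing UI-created policy:

```yaml
resources:
  jobs:
    my_job:
      name: my-job            # placeholder job name
      job_clusters:
        - job_cluster_key: main
          new_cluster:
            policy_id: "ABC123DEF456"        # existing UI-created policy (placeholder)
            spark_version: "14.3.x-scala2.12" # still required alongside the policy
            node_type_id: "Standard_DS3_v2"   # placeholder node type
            num_workers: 2
```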
slakshmanan
by New Contributor III
  • 752 Views
  • 7 replies
  • 1 kudos

How to use the REST API to find long-running queries in Databricks

How do I use the REST API to find long-running queries in Databricks, from sql/queries/all?

Latest Reply
Rishabh-Pandey
Esteemed Contributor

You can run this to get the long-running queries and then kill the one you want:

# Step 1: Get active queries
active_queries = spark.sql("SHOW PROCESSLIST")
active_queries.show(truncate=False)

# Step 2: Identify the query ID you wan...

6 More Replies
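The REST route the question asks about can also be sketched with the standard library; the host and token are placeholders, and the duration field (milliseconds) is per the Query History API response, so verify it against the docs for your workspace version:

```python
import json
import urllib.request

def long_running(queries, threshold_ms):
    """Keep only query-history entries whose duration exceeds the threshold."""
    return [q for q in queries if q.get("duration", 0) > threshold_ms]

def fetch_query_history(host, token):
    """GET /api/2.0/sql/history/queries and return the 'res' result list."""
    req = urllib.request.Request(
        f"{host.rstrip('/')}/api/2.0/sql/history/queries",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp).get("res", [])

# Usage sketch (placeholders): queries running over 5 minutes
# slow = long_running(fetch_query_history(host, token), 5 * 60 * 1000)
```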
Erfan
by New Contributor III
  • 427 Views
  • 3 replies
  • 3 kudos

Resolved! Liquid Clustering With more than 4 columns

Hi there, I'm trying to join a small table (a few million records) with a much larger table (around 1 TB in size, containing a few billion records). The small table isn't quite small enough to use broadcast. Additionally, our join clause involves more ...

Latest Reply
filipniziol
Contributor

Hi @Erfan, what you can do is create an additional column that concatenates the values of the multiple columns and then apply Liquid Clustering on that new column.

2 More Replies
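The workaround in the accepted reply can be sketched as generated SQL; the table and column names are hypothetical, and note that an UPDATE-populated key must be refreshed on new writes (a generated column may be preferable where supported):

```python
def cluster_by_concat_ddl(table, columns, new_col="cluster_key"):
    """Generate SQL that adds a concatenated key column and clusters on it.

    Folds several join columns into one string column so that Liquid
    Clustering (which caps the number of clustering columns) can key on
    a single column. concat_ws uses '||' as a separator to avoid
    accidental collisions between adjacent values.
    """
    concat = "concat_ws('||', " + ", ".join(columns) + ")"
    return [
        f"ALTER TABLE {table} ADD COLUMN {new_col} STRING",
        f"UPDATE {table} SET {new_col} = {concat}",
        f"ALTER TABLE {table} CLUSTER BY ({new_col})",
    ]

# In a notebook (hypothetical names):
# for stmt in cluster_by_concat_ddl("big_table", ["col_a", "col_b", "col_c", "col_d", "col_e"]):
#     spark.sql(stmt)
```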
Constantine
by Contributor III
  • 5319 Views
  • 5 replies
  • 1 kudos

Resolved! How to use Databricks Query History API (REST API)

I have set up authentication using this page https://docs.databricks.com/sql/api/authentication.html and run curl -n -X GET https://<databricks-instance>.cloud.databricks.com/api/2.0/sql/history/queries to get the history of all SQL endpoint queries, but I...

Latest Reply
yegorski
New Contributor III

Here's how to query with databricks-sdk-py (working code). I had a frustrating time doing it with vanilla Python + requests/urllib and couldn't figure it out.

import datetime
import os
from databricks.sdk import WorkspaceClient
from databricks.sdk.se...

4 More Replies
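A sketch of the time-window arithmetic the SDK call needs; the SDK classes in the comments (QueryFilter, TimeRange and their fields) are assumptions to be verified against the databricks-sdk-py documentation for your version:

```python
import datetime

def window_ms(hours_back, now=None):
    """Start/end of a lookback window in epoch milliseconds, which is
    the unit the query-history time filter expects."""
    now = now or datetime.datetime.now(datetime.timezone.utc)
    start = now - datetime.timedelta(hours=hours_back)
    return int(start.timestamp() * 1000), int(now.timestamp() * 1000)

# SDK usage sketch -- verify class names against your databricks-sdk-py docs:
# from databricks.sdk import WorkspaceClient
# from databricks.sdk.service.sql import QueryFilter, TimeRange
# w = WorkspaceClient()
# start_ms, end_ms = window_ms(24)
# flt = QueryFilter(query_start_time_range=TimeRange(
#     start_time_ms=start_ms, end_time_ms=end_ms))
# for q in w.query_history.list(filter_by=flt):
#     print(q.query_id, q.duration)
```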
satya1206
by New Contributor II
  • 189 Views
  • 1 replies
  • 0 kudos

Compare 13.3LTS with 14.3LTS

Hello, we have plans to migrate our DBR from 13.3 LTS to 14.3 LTS. If anyone has recently completed this migration, we would like to know the major benefits we can expect from it and whether there are any disadvantages or behavior changes we should be aware ...

Latest Reply
filipniziol
Contributor

Hi @satya1206, check out the docs: https://docs.databricks.com/en/release-notes/runtime/14.3lts.html

AlokThampi
by New Contributor III
  • 499 Views
  • 7 replies
  • 5 kudos

Joining huge delta tables in Databricks

Hello, I am trying to join a few delta tables as per the code below.

SELECT <applicable columns>
FROM ReportTable G
LEFT JOIN EKBETable EKBE ON EKBE.BELNR = G.ORDER_ID
LEFT JOIN PurchaseOrder POL ON EKBE.EBELN = POL.PO_NO

The PurchaseOrder table c...

Latest Reply
AlokThampi
New Contributor III

Hello @-werners-, @Mo, I tried the liquid clustering option as suggested, but it still doesn't seem to work. I am assuming it is an issue with the small cluster size that I am using. Or do you suggest any other options? @noorbasha534, the columns th...

6 More Replies
AlexDavies
by Contributor
  • 7126 Views
  • 9 replies
  • 2 kudos

Report on SQL queries that are being executed

We have a SQL workspace with a running cluster that services a number of self-service reports against a range of datasets. We want to be able to analyse and report on the queries our self-service users are executing, so we can get better visibility of...

Latest Reply
Anonymous
Not applicable

Hey there @Alex Davies, hope you are doing great. Just checking in: were you able to resolve your issue, or do you need more help? We'd love to hear from you. Thanks!

8 More Replies

Connect with Databricks Users in Your Area

Join a Regional User Group to connect with local Databricks users. Events will be happening in your city, and you won’t want to miss the chance to attend and share knowledge.

If there isn’t a group near you, start one and help create a community that brings people together.

Request a New Group