Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

ErikJ
by New Contributor III
  • 2732 Views
  • 7 replies
  • 2 kudos

Errors calling the Databricks REST API /api/2.1/jobs/run-now with job_parameters

Hello! I have been using the Databricks REST API for running workflows using this endpoint: /api/2.1/jobs/run-now. But now I wanted to also include job_parameters in my API call. I have put job parameters inside my workflow: param1, param2, and in my...
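For context, a minimal sketch of such a call with the requests library (workspace URL, token and job_id are placeholders; the job_parameters keys must match the job-level parameters defined on the workflow):

import requests

host = "https://<workspace-url>"      # placeholder
token = "<personal-access-token>"     # placeholder

resp = requests.post(
    f"{host}/api/2.1/jobs/run-now",
    headers={"Authorization": f"Bearer {token}"},
    json={
        "job_id": 123,                                        # placeholder job id
        "job_parameters": {"param1": "foo", "param2": "bar"}, # keys must exist on the job
    },
)
resp.raise_for_status()
print(resp.json()["run_id"])          # id of the triggered run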

Latest Reply
slkdfuba
New Contributor II
  • 2 kudos

I encountered a null job_id in my post, when a notebook parameter was set in the job GUI. But it runs just fine (I get a valid job_id with active run) if I delete the notebook parameter in the job GUI. Is this a documented behavior, or a bug? If it's ...

6 More Replies
diegohMoodys
by New Contributor
  • 375 Views
  • 1 reply
  • 0 kudos

JDBC RDBMS Table Overwrite Transaction Incomplete

Spark version: spark-3.4.1-bin-hadoop3
JDBC driver: mysql-connector-j-8.4.0.jar
Assumptions:
  • have all the proper read/write permissions
  • dataset isn't large: ~2 million records
  • reading flat files, writing to a database
  • does not read from the database at al...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hi @diegohMoodys, Can you try in debug mode? spark.sparkContext.setLogLevel("DEBUG")
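For reference, a rough sketch of the write path with debug logging turned on (connection details are placeholders; the truncate option is one standard way to keep an overwrite from dropping and recreating the table):

spark.sparkContext.setLogLevel("DEBUG")   # surface JDBC/commit details in the driver log

(df.write
   .format("jdbc")
   .option("url", "jdbc:mysql://<host>:3306/<database>")   # placeholder
   .option("driver", "com.mysql.cj.jdbc.Driver")
   .option("dbtable", "target_table")                      # placeholder
   .option("user", "<user>")
   .option("password", "<password>")
   .option("truncate", "true")    # overwrite via TRUNCATE instead of DROP/CREATE
   .mode("overwrite")
   .save())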

stevomcnevo007
by New Contributor III
  • 2781 Views
  • 16 replies
  • 2 kudos

agents.deploy NOT_FOUND: The directory being accessed is not found. error

I keep getting the following error, although the model definitely does exist and the version names and model name are correct: RestException: NOT_FOUND: The directory being accessed is not found. It occurs when calling # Deploy the model to the review app and a model...

Latest Reply
ezermoysis
New Contributor III
  • 2 kudos

Does the model need to be served before deployment?

15 More Replies
tonypiazza
by New Contributor II
  • 1565 Views
  • 1 reply
  • 0 kudos

Resolved! Databricks Asset Bundle - Job Cluster - JDBC HTTP Path

I am currently working on deploying dbt jobs using a Databricks Asset Bundle. In my existing job configuration, I am using an all-purpose cluster and the JDBC HTTP Path was manually copied from the web UI. Now that I am trying to switch to using a jo...

Latest Reply
Satyadeepak
Databricks Employee
  • 0 kudos

To reference the HTTP Path using substitutions in Databricks Asset Bundles and job clusters, you can use the variables section in your databricks.yml configuration file. In your databricks.yml file, you can define a variable for the HTTP Path. For e...
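A minimal sketch of that idea in databricks.yml (the variable name and placeholder value are mine, not from the thread):

variables:
  jdbc_http_path:
    description: HTTP Path used by the dbt task
    default: "<http-path-from-the-cluster-JDBC/ODBC-tab>"   # placeholder

# ...then reference it elsewhere in the bundle as ${var.jdbc_http_path}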

Filippo
by New Contributor
  • 1897 Views
  • 1 reply
  • 0 kudos

Resolved! Issue with View Ownership Reassignment in Unity Catalog

Hello, It appears that the ownership rules for views and functions in Unity Catalog do not align with the guidelines provided in the “Manage Unity Catalog object ownership” documentation on Microsoft Learn. When attempting to reassign the ownership of ...

Latest Reply
Satyadeepak
Databricks Employee
  • 0 kudos

Hi @Filippo, To prevent privilege escalations, only a metastore admin can transfer ownership of a view, function, or model to any user, service principal, or group in the account. Current owners and users with the MANAGE privilege are restricted to tr...
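For reference, the transfer itself is a single statement (the view and group names below are hypothetical), run by a principal that is allowed to make the change:

spark.sql("ALTER VIEW main.reporting.sales_v OWNER TO `data-platform-admins`")  # hypothetical names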

lbdatauser
by New Contributor II
  • 1493 Views
  • 1 reply
  • 0 kudos

Resolved! dbx with serverless clusters

With dbx, is it impossible to create tasks that run on serverless clusters? Is it necessary to use Databricks bundles for it?
https://dbx.readthedocs.io/en/latest/reference/deployment/
https://learn.microsoft.com/en-us/azure/databricks/jobs/run-serverl...

Latest Reply
Satyadeepak
Databricks Employee
  • 0 kudos

It is possible to create tasks that run on serverless clusters using dbx. Also, please note that Databricks recommends that you use Databricks Asset Bundles instead of dbx by Databricks Labs. See What are Databricks Asset Bundles and Migrate from dbx...

Roy
by New Contributor II
  • 63797 Views
  • 6 replies
  • 0 kudos

Resolved! dbutils.notebook.exit() executing from except in try/except block even if there is no error.

I am using Python notebooks as part of a concurrently running workflow with Databricks Runtime 6.1. Within the notebooks I am using try/except blocks to return an error message to the main concurrent notebook if a section of code fails. However I h...

Latest Reply
tonyliken
New Contributor II
  • 0 kudos

Because dbutils.notebook.exit() is an 'Exception', it will always trigger the except Exception as e: part of the code. We can use this to our advantage to solve the problem by adding an 'if else' to the except block. query = "SELECT 'a' as Colum...
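A rough sketch of that pattern (how the exit surfaces inside except is an assumption here, so adapt the check to the message your runtime actually raises):

query = "SELECT 'a' AS Column1"     # hypothetical query

try:
    df = spark.sql(query)
    dbutils.notebook.exit("success")        # raises internally, so control jumps to except
except Exception as e:
    if "exit" in str(e).lower():            # assumption: the exit "exception" can be recognized
        raise                               # let the normal exit propagate
    dbutils.notebook.exit(f"error: {e}")    # only genuine failures are reported as errors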

5 More Replies
ghofigjong
by New Contributor
  • 8727 Views
  • 4 replies
  • 2 kudos

Resolved! How does partition pruning work on a merge into statement?

I have a delta table that is partitioned by Year, Date and month. I'm trying to merge data to this on all three partition columns + an extra column (an ID). My merge statement is below:
MERGE INTO delta.<path of delta table> oldData using df newData ...
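For context, pruning on the target generally needs literal predicates on the partition columns in the ON clause (or a targeted condition), roughly like the hypothetical statement below (path, column values and ID column are placeholders):

df.createOrReplaceTempView("newData")                      # source dataframe
spark.sql("""
  MERGE INTO delta.`/mnt/path/to/table` AS oldData         -- placeholder path
  USING newData
    ON  oldData.Year  = 2024                               -- literal values on the partition
    AND oldData.Month = 1                                  -- columns allow Delta to prune files
    AND oldData.Date  = '2024-01-15'
    AND oldData.Id    = newData.Id
  WHEN MATCHED THEN UPDATE SET *
  WHEN NOT MATCHED THEN INSERT *
""")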

Latest Reply
Umesh_S
New Contributor II
  • 2 kudos

Isn't the suggested idea only filtering the input dataframe (resulting in a smaller amount of data to match across the whole delta table) rather than pruning the delta table for relevant partitions to scan?

3 More Replies
halox6000
by New Contributor III
  • 1220 Views
  • 1 reply
  • 0 kudos

How do I stop PySpark from outputting text

I am using a tqdm progress bar to monitor the number of records I have collected via API. I am temporarily writing them to a file in DBFS, then uploading to a Spark DataFrame. Each time I write to a file, I get a message like 'Wrote 8873925 ...

Latest Reply
MathieuDB
Databricks Employee
  • 0 kudos

Hello @halox6000, You could temporarily redirect console output to a null device for these write operations. Try this out:

@contextlib.contextmanager
def silence_dbutils():
    with contextlib.redirect_stdout(io.StringIO()):
        yield

# Usage in...
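Filled out with its imports and a usage example, the idea looks roughly like this (the file path and payload are placeholders):

import contextlib
import io

@contextlib.contextmanager
def silence_dbutils():
    # swallow anything printed to stdout inside the block
    with contextlib.redirect_stdout(io.StringIO()):
        yield

# usage: the 'Wrote N bytes.' message from dbutils.fs.put is suppressed
with silence_dbutils():
    dbutils.fs.put("/tmp/api_batch.json", payload, True)   # payload is a placeholder string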

Nathant93
by New Contributor III
  • 1309 Views
  • 2 replies
  • 0 kudos

remove empty folders with pyspark

Hi, I am trying to search a mnt point for any empty folders and remove them. Does anyone know of a way to do this? I have tried dbutils.fs.walk but this does not seem to work. Thanks

Latest Reply
MathieuDB
Databricks Employee
  • 0 kudos

Hello @Nathant93, You could use dbutils.fs.ls and iterate on all the directories found to accomplish this task. Something like this:

def find_empty_dirs(path):
    directories = dbutils.fs.ls(path)
    for directory in directories:
        if directo...
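A fuller sketch along those lines (the recursion and the delete call are additions of mine, so test on a scratch path before pointing it at a real mount):

def find_empty_dirs(path):
    empty = []
    for entry in dbutils.fs.ls(path):
        if entry.isDir():
            if not dbutils.fs.ls(entry.path):          # no children at all -> empty folder
                empty.append(entry.path)
            else:
                empty.extend(find_empty_dirs(entry.path))
    return empty

for d in find_empty_dirs("/mnt/<mount-point>"):         # placeholder mount point
    dbutils.fs.rm(d, True)                              # remove the empty folder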

1 More Replies
adriennn
by Valued Contributor
  • 1555 Views
  • 5 replies
  • 1 kudos

Executing an HTTP call from worker to localhost:80 (Unity Catalog)

Hi, trying to implement this industry accelerator in UC (DBR 15.4, Shared Access Mode), I'm encountering an error when running an HTTP query (Python requests) from within a UDF (see this GitHub issue). Are there any restrictions when it comes to run a...

Latest Reply
VZLA
Databricks Employee
  • 1 kudos

Hi @adriennn, I'll try to look into it.

4 More Replies
ChsAIkrishna
by Contributor
  • 1078 Views
  • 11 replies
  • 1 kudos

Databricks SQL Warehouse queries went to orphan state

We're experiencing an issue with our Databricks dbt workflow. The workflow job uses an L-size SQL warehouse that's been working smoothly for the past couple of weeks. However, today we've noticed that at a specific time, all queries are g...

Latest Reply
Walter_C
Databricks Employee
  • 1 kudos

Great to hear your issue got resolved

10 More Replies
chari
by Contributor
  • 7252 Views
  • 2 replies
  • 1 kudos

Can't connect Power BI Desktop to Azure Databricks

Hello, I am trying to connect Power BI Desktop to Azure Databricks (source: delta table) by downloading a connection file from Databricks. I see an error message like the one below when I open the connection file with Power BI. Repeated attempts have given th...

Latest Reply
AkhilSebastian
New Contributor II
  • 1 kudos

Was this issue resolved? 

1 More Replies
william_dev
by New Contributor
  • 693 Views
  • 1 reply
  • 0 kudos

VSCode Databricks-Connect can't find config file, says it doesn't exist, but it does

Hi all, I am getting an error that I previously didn't have within VSCode when authenticating to Databricks-Connect in PowerShell. When I run "databricks auth login" and choose a profile, I get the following:
> Error: cannot load Databricks config file:...

Latest Reply
saurabh18cs
Honored Contributor
  • 0 kudos

Prerequisites:
  • Databricks CLI installed
  • Match your databricks-connect version with your cluster runtime (e.g. runtime 14.3 LTS needs databricks-connect 14.3.x)
  • Match your local Python installation with the Databricks Python version.
  • Databricks extension ...

elamathi
by New Contributor
  • 1348 Views
  • 1 reply
  • 0 kudos

ExecutorLostFailure

ExecutorLostFailure (executor 22 exited caused by one of the running tasks) Reason: Remote RPC client disassociated. Likely due to containers exceeding thresholds, or network issues. Check driver logs for WARN messages.

Latest Reply
saurabh18cs
Honored Contributor
  • 0 kudos

The ExecutorLostFailure error in Spark indicates that an executor was lost during the execution of a task. Review the driver logs for any WARN or ERROR messages that might provide more context about why the executor was lost. Also, ensure that the exe...
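Two quick checks that often help while investigating (the setting name is standard Spark; the repartition value is only illustrative):

# how much memory each executor currently has
print(spark.sparkContext.getConf().get("spark.executor.memory", "not set"))

# reduce per-task memory pressure by splitting the work into more, smaller partitions
df = df.repartition(400)   # illustrative value; tune to your data volume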

