Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

diegohMoodys
by New Contributor
  • 46 Views
  • 1 replies
  • 0 kudos

JDBC RDBMS Table Overwrite Transaction Incomplete

Spark version: spark-3.4.1-bin-hadoop3. JDBC driver: mysql-connector-j-8.4.0.jar. Assumptions: all the proper read/write permissions are in place; the dataset isn't large (~2 million records); reading flat files and writing to a database; does not read from the database at all...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hi @diegohMoodys, Can you try in debug mode? spark.sparkContext.setLogLevel("DEBUG")
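
For context, a minimal sketch of applying that suggestion around the JDBC overwrite described in the question; the file path, URL, table name, and credentials are placeholders rather than details from the original post:

# Hedged sketch: placeholder connection details; only setLogLevel("DEBUG") comes from the reply above.
spark.sparkContext.setLogLevel("DEBUG")

df = spark.read.csv("/path/to/flat/files", header=True)      # placeholder flat-file read

(df.write
    .format("jdbc")
    .option("url", "jdbc:mysql://<host>:3306/<database>")     # placeholder URL
    .option("dbtable", "<target_table>")                      # placeholder table
    .option("user", "<user>")
    .option("password", "<password>")
    .option("driver", "com.mysql.cj.jdbc.Driver")             # matches mysql-connector-j
    .mode("overwrite")
    .save())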

stevomcnevo007
by New Contributor III
  • 1823 Views
  • 16 replies
  • 2 kudos

agents.deploy NOT_FOUND: The directory being accessed is not found. error

I keep getting the following error, although the model definitely exists and the version and model names are correct: RestException: NOT_FOUND: The directory being accessed is not found. It occurs when calling # Deploy the model to the review app and a model...

Latest Reply
ezermoysis
New Contributor II
  • 2 kudos

Does the model need to be served before deployment?

15 More Replies
Aatma
by New Contributor
  • 480 Views
  • 1 replies
  • 0 kudos

Resolved! DABs require library dependencies from a private GitHub repository.

Developing a Python wheel file using DABs which requires library dependencies from a private GitHub repository. Please help me understand how to set up the git user and token in the resource.yml file and how to authenticate the GitHub package. pip install...

Latest Reply
Satyadeepak
Databricks Employee
  • 0 kudos

To install dependencies from a private GitHub repository in a Databricks Asset Bundle, you need to set up the GitHub user and token in the resource.yml file and authenticate the GitHub package. Here are the steps: Generate a GitHub Personal Access T...
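
The remaining steps are truncated above. As an illustration only (not the bundle-level resource.yml configuration itself), here is roughly how a token-authenticated install of a private GitHub package can look from a notebook; the secret scope, key, and repository names are all hypothetical:

import subprocess, sys

# Hypothetical secret scope/key holding the GitHub Personal Access Token (PAT)
token = dbutils.secrets.get(scope="github", key="pat")

# Token-authenticated pip URL for a private repository (placeholder org/repo)
subprocess.check_call([
    sys.executable, "-m", "pip", "install",
    f"git+https://{token}@github.com/<org>/<private-repo>.git",
])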

tonypiazza
by New Contributor II
  • 486 Views
  • 1 replies
  • 0 kudos

Resolved! Databricks Asset Bundle - Job Cluster - JDBC HTTP Path

I am currently working on deploying dbt jobs using a Databricks Asset Bundle. In my existing job configuration, I am using an all-purpose cluster and the JDBC HTTP Path was manually copied from the web UI. Now that I am trying to switch to using a jo...

Latest Reply
Satyadeepak
Databricks Employee
  • 0 kudos

To reference the HTTP Path using substitutions in Databricks Asset Bundles and job clusters, you can use the variables section in your databricks.yml configuration file. In your databricks.yml file, you can define a variable for the HTTP Path. For e...

Filippo
by New Contributor
  • 608 Views
  • 1 replies
  • 0 kudos

Resolved! Issue with View Ownership Reassignment in Unity Catalog

Hello, It appears that the ownership rules for views and functions in Unity Catalog do not align with the guidelines provided in the “Manage Unity Catalog object ownership” documentation on Microsoft Learn. When attempting to reassign the ownership of ...

Latest Reply
Satyadeepak
Databricks Employee
  • 0 kudos

Hi @Filippo To prevent privilege escalations, only a metastore admin can transfer ownership of a view, function, or model to any user, service principal, or group in the account. Current owners and users with the MANAGE privilege are restricted to tr...
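
For reference, the transfer itself is a single SQL statement; a minimal sketch with placeholder catalog, schema, view, and principal names (it must be run by a principal allowed to transfer ownership, as described above):

# Placeholder names throughout; run as a metastore admin or a principal
# with the required privilege on the view.
spark.sql("ALTER VIEW main.my_schema.my_view OWNER TO `new.owner@example.com`")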

lbdatauser
by New Contributor II
  • 410 Views
  • 1 replies
  • 0 kudos

Resolved! dbx with serverless clusters

With dbx, is it impossible to create tasks that run on serverless clusters? Is it necessary to use Databricks bundles for it? https://dbx.readthedocs.io/en/latest/reference/deployment/ https://learn.microsoft.com/en-us/azure/databricks/jobs/run-serverl...

Latest Reply
Satyadeepak
Databricks Employee
  • 0 kudos

It is possible to create tasks that run on serverless clusters using dbx. Also, please note that Databricks recommends that you use Databricks Asset Bundles instead of dbx by Databricks Labs. See What are Databricks Asset Bundles and Migrate from dbx...

Roy
by New Contributor II
  • 57095 Views
  • 6 replies
  • 0 kudos

Resolved! dbutils.notebook.exit() executing from except in try/except block even if there is no error.

I am using Python notebooks as part of a concurrently running workflow with Databricks Runtime 6.1. Within the notebooks I am using try/except blocks to return an error message to the main concurrent notebook if a section of code fails. However I h...

Latest Reply
tonyliken
New Contributor II
  • 0 kudos

Because dbutils.notebook.exit() is an 'Exception', it will always trigger the except Exception as e: part of the code. We can use this to our advantage and solve the problem by adding an 'if else' to the except block. query = "SELECT 'a' as Colum...
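
A minimal sketch of one way to apply that idea; here the branch is moved outside the try/except entirely so the intentional exit can never be mistaken for a failure, and run_main_logic() is a hypothetical stand-in for the notebook's real work:

# Hedged sketch: run_main_logic() is a hypothetical placeholder; only the
# "branch on whether the except block fired" idea comes from the reply above.
def run_main_logic():
    # Placeholder for the notebook's real work
    return spark.sql("SELECT 'a' AS Column1").collect()

result = None
error_message = None
try:
    result = run_main_logic()
except Exception as e:
    error_message = f"FAILED: {e}"

# Exit exactly once, outside the try/except, so the call is never caught by mistake
dbutils.notebook.exit(error_message if error_message else result)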

5 More Replies
ghofigjong
by New Contributor
  • 6968 Views
  • 4 replies
  • 1 kudos

Resolved! How does partition pruning work on a merge into statement?

I have a delta table that is partitioned by Year, Date and month. I'm trying to merge data to this on all three partition columns + an extra column (an ID). My merge statement is below: MERGE INTO delta.<path of delta table> oldData using df newData ...

Latest Reply
Umesh_S
New Contributor II
  • 1 kudos

Isn't the suggested idea only filtering the input dataframe (resulting in a smaller amount of data to match across the whole delta table) rather than pruning the delta table for the relevant partitions to scan?
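
To make the pruning question concrete: for the target table to be pruned, the partition columns generally need to appear as literal predicates in the MERGE condition, not just as a filter on the source DataFrame. A sketch with placeholder path, partition values, and column names, issued through spark.sql:

# Hedged sketch: placeholder table path, partition values, and join key.
# Pinning Year/Month/Date to literals in the ON clause lets the merge skip
# untouched partitions; filtering only df does not by itself prune the target.
df = spark.read.parquet("/path/to/new/data")   # placeholder source of new data
df.createOrReplaceTempView("newData")

spark.sql("""
    MERGE INTO delta.`/path/to/delta/table` AS oldData
    USING newData
      ON  oldData.Year  = 2024
      AND oldData.Month = 1
      AND oldData.Date  = 15
      AND oldData.Id    = newData.Id
    WHEN MATCHED THEN UPDATE SET *
    WHEN NOT MATCHED THEN INSERT *
""")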

3 More Replies
halox6000
by New Contributor III
  • 1019 Views
  • 1 replies
  • 0 kudos

How do I stop PySpark from outputting text

I am using a tqdm progress bar to monitor the amount of data records I have collected via API. I am temporarily writing them to a file in the DBFS, then uploading to a Spark DataFrame. Each time I write to a file, I get a message like 'Wrote 8873925 ...

Latest Reply
MathieuDB
Databricks Employee
  • 0 kudos

Hello @halox6000, You could temporarily redirect console output to a null device for these write operations. Try this out: @contextlib.contextmanager def silence_dbutils(): with contextlib.redirect_stdout(io.StringIO()): yield # Usage in...
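
The snippet above is cut off by the preview; a self-contained version of the same redirect-stdout idea could look like this. The silence_dbutils name follows the reply, the file path and payload are placeholders, and this only suppresses messages that actually go through Python's stdout:

import contextlib
import io

@contextlib.contextmanager
def silence_dbutils():
    # Send anything printed to stdout into an in-memory buffer instead
    with contextlib.redirect_stdout(io.StringIO()):
        yield

# Usage: wrap the chatty call so its "Wrote N bytes." message is suppressed
payload = '{"records": []}'   # placeholder for the records collected via the API
with silence_dbutils():
    dbutils.fs.put("/tmp/records.json", payload, True)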

Nathant93
by New Contributor III
  • 905 Views
  • 2 replies
  • 0 kudos

Remove empty folders with PySpark

Hi, I am trying to search a mount point for any empty folders and remove them. Does anyone know of a way to do this? I have tried dbutils.fs.walk but this does not seem to work. Thanks

Latest Reply
MathieuDB
Databricks Employee
  • 0 kudos

Hello @Nathant93, You could use dbutils.fs.ls and iterate on all the directories found to accomplish this task. Something like this: def find_empty_dirs(path): directories = dbutils.fs.ls(path) for directory in directories: if directo...
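
The reply's code is truncated above; a complete recursive sketch along the same lines, with a placeholder mount path. Directories returned by dbutils.fs.ls carry a trailing slash in their name, which is used here to tell them apart from files:

def find_empty_dirs(path):
    # Recursively collect directories under `path` that contain no entries at all
    empty = []
    for entry in dbutils.fs.ls(path):
        if entry.name.endswith("/"):          # directories are listed with a trailing slash
            children = dbutils.fs.ls(entry.path)
            if not children:
                empty.append(entry.path)
            else:
                empty.extend(find_empty_dirs(entry.path))
    return empty

# Placeholder mount point; review the list before deleting anything
for empty_dir in find_empty_dirs("/mnt/<my-mount>"):
    dbutils.fs.rm(empty_dir)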

1 More Replies
adriennn
by Contributor III
  • 456 Views
  • 5 replies
  • 1 kudos

Executing an HTTP call from worker to localhost:80 (Unity Catalog)

Hi, trying to implement this industry accelerator in UC (DBR 15.4, Shared Access Mode), I'm encountering an error when running an HTTP query (Python requests) from within a UDF (see this GitHub issue). Are there any restrictions when it comes to run a...

Latest Reply
VZLA
Databricks Employee
  • 1 kudos

Hi @adriennn, I'll try to look into it.

4 More Replies
Abishrp
by New Contributor III
  • 86 Views
  • 2 replies
  • 1 kudos

Issue in getting system.compute.warehouses table in some workspaces

In some workspaces, I can get the system.compute.warehouses table, but in some other workspaces it is not available. How can I enable it? Both are in the same account but assigned to different metastores.

Latest Reply
Alberto_Umana
Databricks Employee
  • 1 kudos

Here are the APIs required for that as well as requirements: https://docs.databricks.com/en/admin/system-tables/index.html#enable Let me know if you have questions.
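
As an assumption to verify against that page (the endpoint path, placeholder workspace URL, metastore ID, and token below are not from the thread), enabling a system schema for a metastore is a single REST call, roughly:

import requests

# All values are placeholders; confirm the exact endpoint in the linked documentation.
host = "https://<workspace-url>"
metastore_id = "<metastore-id>"
token = "<personal-access-token>"

resp = requests.put(
    f"{host}/api/2.0/unity-catalog/metastores/{metastore_id}/systemschemas/compute",
    headers={"Authorization": f"Bearer {token}"},
)
resp.raise_for_status()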

1 More Replies
ChsAIkrishna
by New Contributor III
  • 306 Views
  • 11 replies
  • 1 kudos

Databricks SQL Warehouse queries went to an orphan state

We're experiencing an issue with our Databricks dbt workflow. The workflow job uses an L-size SQL warehouse that had been working smoothly for the past couple of weeks. However, today we've noticed that at a specific time, all queries are g...

Latest Reply
Walter_C
Databricks Employee
  • 1 kudos

Great to hear your issue got resolved

10 More Replies
Greg_c
by New Contributor II
  • 61 Views
  • 1 replies
  • 0 kudos

Best practices for ensuring data quality in batch pipelines

Hello everyone, I couldn't find a topic on this - what are your best practices for ensuring data quality in batch pipelines? I've got a big pipeline processing data once per day. We thought about either going with dbt or DLT, but DLT seems more directed f...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hello @Greg_c, Please review these articles: https://www.databricks.com/discover/pages/data-quality-management https://docs.databricks.com/en/lakehouse-architecture/reliability/best-practices.html
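
Since the question weighs dbt against DLT for data quality, a small illustration of how DLT expresses quality rules as expectations may also help; the table name, column names, and source path below are made up:

import dlt

@dlt.table(comment="Daily batch load with basic quality gates")
@dlt.expect_or_drop("valid_id", "id IS NOT NULL")           # drop rows that fail the rule
@dlt.expect("recent_date", "event_date >= '2024-01-01'")    # record violations, keep rows
def daily_orders():
    # Placeholder source; in the real pipeline this would be the daily batch input
    return spark.read.format("json").load("/mnt/raw/orders/")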

chari
by Contributor
  • 6378 Views
  • 2 replies
  • 1 kudos

Can't connect Power BI Desktop to Azure Databricks

Hello, I am trying to connect Power BI Desktop to Azure Databricks (source: delta table) by downloading a connection file from Databricks. I see an error message like the one below when I open the connection file with Power BI. Repeated attempts have given th...

Latest Reply
AkhilSebastian
New Contributor II
  • 1 kudos

Was this issue resolved? 

1 More Replies

Connect with Databricks Users in Your Area

Join a Regional User Group to connect with local Databricks users. Events will be happening in your city, and you won’t want to miss the chance to attend and share knowledge.

If there isn’t a group near you, start one and help create a community that brings people together.
