Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

elikvar
by New Contributor III
  • 21393 Views
  • 9 replies
  • 9 kudos

Cluster occasionally fails to launch

I have a daily running notebook that occasionally fails with the error: "Run result unavailable: job failed with error message Unexpected failure while waiting for the cluster Some(xxxxxxxxxxxxxxx) to be ready: Cluster xxxxxxxxxxxxxxxx is in un...

Latest Reply
Pavan578
New Contributor II
  • 9 kudos

Cluster 'xxxxxxx' was terminated. Reason: WORKER_SETUP_FAILURE (SERVICE_FAULT). Parameters: databricks_error_message: DBFS Daemon is not reachable., gcp_error_message: Unable to reach the colocated DBFS Daemon. Can anyone help me with how we can resolve thi...

8 More Replies
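WORKER_SETUP_FAILURE with SERVICE_FAULT is generally a transient infrastructure error, so the usual mitigation is a bounded retry of the run (the Jobs UI "Retries" setting on a task does the same thing declaratively). Below is a minimal sketch of that idea in plain Python; `submit_run` and the set of retryable reasons are assumptions for illustration, not an official Databricks list or API.

```python
import time

# Termination reasons treated as transient and worth retrying.
# This set is an assumption based on the reasons seen in this thread.
TRANSIENT_REASONS = {"WORKER_SETUP_FAILURE", "SERVICE_FAULT"}

class ClusterLaunchError(Exception):
    """Raised by the (hypothetical) submit function with the termination reason."""
    def __init__(self, reason):
        super().__init__(reason)
        self.reason = reason

def run_with_retry(submit_run, max_attempts=3, backoff_seconds=0):
    """Call submit_run, retrying when the cluster fails for a transient reason.

    submit_run is any callable that triggers the job run and raises
    ClusterLaunchError on a launch failure. Non-transient reasons and the
    final attempt re-raise immediately.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return submit_run()
        except ClusterLaunchError as exc:
            if exc.reason not in TRANSIENT_REASONS or attempt == max_attempts:
                raise
            time.sleep(backoff_seconds * attempt)  # simple linear backoff
```

If the failures cluster around a particular time of day, cloud capacity in the region is worth checking before adding retries.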
tanjil
by New Contributor III
  • 16633 Views
  • 9 replies
  • 6 kudos

Resolved! Downloading sharepoint lists using python

Hello, I am trying to download lists from SharePoint into a pandas dataframe. However, I cannot get any information successfully. I have attempted many solutions mentioned on Stack Overflow. Below is one of those attempts: # https://pypi.org/project/sha...

Latest Reply
huntaccess
New Contributor II
  • 6 kudos

The error "<urlopen error [Errno -2] Name or service not known>" suggests that there's an issue with the server URL or network connectivity. Double-check the server URL to ensure it's correct and accessible. Also, verify that your network connection ...

8 More Replies
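The `[Errno -2] Name or service not known` error is raised during DNS resolution, before any SharePoint authentication happens, so a quick resolution check separates network/URL problems from credential problems. A small sketch using only the standard library:

```python
import socket
from urllib.parse import urlparse

def can_resolve(url):
    """Return True if the URL's hostname resolves via DNS.

    urlopen raises "[Errno -2] Name or service not known" exactly when this
    resolution step fails, so a False here means the server URL (or the
    cluster's network/DNS configuration) is the problem, not SharePoint auth.
    """
    host = urlparse(url).hostname
    if not host:
        return False  # malformed URL, nothing to resolve
    try:
        socket.getaddrinfo(host, 443)
        return True
    except socket.gaierror:
        return False
```

On Databricks this is also worth running from a notebook on the cluster itself, since the cluster's network path (VNet/firewall/proxy) can differ from your laptop's.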
pesky_chris
by New Contributor III
  • 3361 Views
  • 5 replies
  • 0 kudos

Resolved! Problem with SQL Warehouse (Serverless)

I get the following error message on the attempt to use SQL Warehouse (Serverless) compute with Materialized Views (a simple interaction, e.g. DML, UI sample lookup). The MVs are created off the back of Federated Tables (Postgresql), MVs are created ...

Latest Reply
pesky_chris
New Contributor III
  • 0 kudos

Hey, to clarify, as I think I'm potentially hitting unintended Databricks "functionality": the Materialized Views are managed by a DLT pipeline, which was deployed with DABs off a CI/CD pipeline. The DLT pipeline runs a notebook with Python code creating MVs dynami...

4 More Replies
Edthehead
by Contributor III
  • 2211 Views
  • 2 replies
  • 0 kudos

Parameterized Delta live table pipeline

I'm trying to create an ETL framework on delta live tables and basically use the same pipeline for all the transformation from bronze to silver to gold. This works absolutely fine when I hard code the tables and the SQL transformations as an array wi...

Data Engineering
Databricks
Delta Live Table
dlt
Latest Reply
canadiandataguy
New Contributor III
  • 0 kudos

Here is how you can do it

1 More Replies
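The reply above links out without detail, but the usual metadata-driven pattern for this is a factory function per config row, so each generated table function captures its own row (a bare loop over lambdas would capture only the last row). A plain-Python sketch of that closure pattern; the `@dlt.table` decorator usage is indicated only in comments since it needs a pipeline runtime, and the table names and SQL are made up for illustration:

```python
# Config that would normally come from a metadata table or pipeline parameters.
TABLE_CONFIG = [
    {"name": "silver_orders", "source": "bronze_orders"},
    {"name": "silver_customers", "source": "bronze_customers"},
]

def make_table_fn(cfg):
    """Build one table function per config row.

    In a real DLT pipeline the inner function would be decorated with
    @dlt.table(name=cfg["name"]) and return spark.sql(...) or a DataFrame;
    here it just returns the SQL text so the pattern is testable anywhere.
    """
    def table_fn():
        return f"SELECT * FROM {cfg['source']}"
    table_fn.__name__ = cfg["name"]  # give each generated function its table name
    return table_fn

# Calling the factory inside the loop is the crucial part: each closure
# binds its own cfg instead of sharing the loop variable.
generated = [make_table_fn(cfg) for cfg in TABLE_CONFIG]
```

With this shape, adding a bronze-to-silver table is a config change rather than a code change, which is what an ETL framework on DLT is usually after.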
calvinchan_iot
by New Contributor II
  • 725 Views
  • 1 replies
  • 0 kudos

SparkRuntimeException: [UDF_ERROR.ENV_LOST] the execution environment was lost during execution

Hey everyone, I have been facing a weird error since I upgraded to Unity Catalog: org.apache.spark.SparkRuntimeException: [UDF_ERROR.ENV_LOST] Execution of function line_string_linear_interp(geometry#1432) failed - the execution environment was lost ...

Latest Reply
Brahmareddy
Honored Contributor II
  • 0 kudos

Hi @calvinchan_iot, how are you doing today? As per my understanding, it sounds like the error may be due to environment instability when running the UDF after enabling Unity Catalog. The [UDF_ERROR.ENV_LOST] error often points to the UDF execution en...

KartRasi_10779
by New Contributor
  • 567 Views
  • 2 replies
  • 0 kudos

Glue Catalog Metadata Management with Enforced Tagging in Databricks

As part of the data governance team, we're trying to enforce table-level tagging when users create tables in a Databricks environment where metadata is managed by AWS Glue Catalog (non-Unity Catalog). Is there a way to require tagging at table creati...

Latest Reply
145676
New Contributor II
  • 0 kudos

You can use lakeFS pre-merge hooks to enforce this. It works great with this stack: https://lakefs.io/blog/lakefs-hooks/

1 More Replies
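Whichever hook mechanism ends up enforcing the policy (a lakeFS pre-merge hook as suggested above, or a CI check before DDL is applied), the check itself reduces to comparing a table's tags against a required set. A minimal sketch; the required keys here are an example policy, not a Glue or Databricks standard:

```python
# Example governance policy: tag keys every new table must carry.
# These names are illustrative, not an AWS Glue convention.
REQUIRED_TAG_KEYS = {"owner", "data_classification"}

def missing_tags(table_tags):
    """Return the required tag keys absent from a table's tag dict.

    A pre-creation gate (lakeFS hook, CI step, or a wrapper around the
    Glue CreateTable call) can reject the table when this set is non-empty.
    """
    return REQUIRED_TAG_KEYS - set(table_tags)
```

In an AWS setup the `table_tags` dict would be populated from the Glue resource's tags; since Glue itself has no native "require tags at creation" switch outside of IAM tag-condition policies, gating the creation path is where the enforcement has to live.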
steveanderson
by New Contributor
  • 431 Views
  • 1 replies
  • 0 kudos

Comparing Ultrawide Curved Monitors and Dual-Monitor Setups for Databricks Projects

Hello everyone, I’m currently exploring the best setup for my data engineering tasks in Databricks and have been considering the benefits of using an ultrawide curved monitor compared to a standard dual-monitor setup. I’d love to hear from the communit...

Latest Reply
BigRoux
Databricks Employee
  • 0 kudos

I don’t use a curved monitor; instead, I have two 35" monitors, which work perfectly for my Databricks work. I chose two large monitors over one extra-large one because I frequently share screens, and it’s easier to share an ...

Jana
by New Contributor III
  • 8743 Views
  • 9 replies
  • 4 kudos

Resolved! Parsing 5 GB json file is running long on cluster

I was creating a delta table from an ADLS JSON input file, but the job was running long while creating the delta table from the JSON. Below is my cluster configuration. Is the issue related to the cluster config? Do I need to upgrade the cluster config? The cluster ...

Latest Reply
-werners-
Esteemed Contributor III
  • 4 kudos

With multiline = true, the JSON is read as a whole and processed as such. I'd try a beefier cluster.

8 More Replies
1npo
by New Contributor II
  • 585 Views
  • 2 replies
  • 2 kudos

Dark mode broken by "New version of this app is available" popup

Hello, I have the interface theme set to "Prefer dark" in Databricks. I just got a popup on the Workflows page while reviewing a job run that said something like "A new version of this app is available, click to refresh". I clicked refresh, and now my...

Latest Reply
1npo
New Contributor II
  • 2 kudos

I just got another "New version of this app is available" popup, and clicking "Refresh" fixed the dark mode issue. Thanks for the quick response to whichever engineer at Databricks just pushed a hotfix!

1 More Replies
merca
by Valued Contributor II
  • 10092 Views
  • 12 replies
  • 7 kudos

Value array {{QUERY_RESULT_ROWS}} in Databricks SQL alerts custom template

Please include in documentation an example how to incorporate the `QUERY_RESULT_ROWS` variable in the custom template.

Latest Reply
CJK053000
New Contributor III
  • 7 kudos

Databricks confirmed this was an issue on their end and it should be resolved now. It is working for me.

11 More Replies
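For anyone landing here looking for the shape of such a template: community reports suggest `QUERY_RESULT_ROWS` can be iterated with mustache-style section tags in a custom alert body. The column names below are hypothetical and the syntax is an assumption based on those reports, not confirmed documentation, so verify against the current Databricks SQL alerts docs:

```html
<!-- Hypothetical result columns `city` and `revenue`; the alert's query
     is assumed to return one row per city over the threshold. -->
<p>Alert fired. Offending rows:</p>
<table>
{{#QUERY_RESULT_ROWS}}
  <tr><td>{{city}}</td><td>{{revenue}}</td></tr>
{{/QUERY_RESULT_ROWS}}
</table>
```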
saniafatimi
by New Contributor II
  • 4544 Views
  • 3 replies
  • 1 kudos

How to migrate power bi reports to databricks

I have a sample set of Power BI (.pbix) reports with all dropdowns, tables, filters, etc. Now I would like to migrate these reports to Databricks. Whatever visuals are created in Power BI, I would like to create the same in Databricks from scratch. I wou...

Latest Reply
Neeljy
New Contributor II
  • 1 kudos

First, ensure the Databricks cluster is operational. These are the steps needed to integrate Azure Databricks with Power BI Desktop. 1. Constructing the URL for the connection: connect to the cluster, and click the Advanced...

2 More Replies
AnatolBeck
by New Contributor II
  • 1090 Views
  • 2 replies
  • 0 kudos

Dashboard use 'Multiple values' in underlying query

Currently, it appears that the Dashboards functionality does not support linking a 'Multiple Values' widget to a query parameter, nor does it allow the creation of a line plot with multiple lines. We are developing a dashboard where users need to visu...

Latest Reply
AnatolBeck
New Contributor II
  • 0 kudos

Hello, thank you for your reply. Notebooks are not really a workaround for me here, but thank you for the walkthrough. I think this feature is very important, so I hope this reaches your backlog somehow, as this is something Grafana, for example, is able...

1 More Replies
camilo_s
by Contributor
  • 473 Views
  • 1 replies
  • 0 kudos

Difference between "deep clone ..." and as "select * from ..."

Hi all, I was trying to deep clone one of the sample tables provided with a parametrized query: create table if not exists IDENTIFIER(:target_catalog || :target_schema || :table_name) DEEP CLONE IDENTIFIER('samples.tpch.' || :table_name) But Databricks ...

Latest Reply
skarpeck
New Contributor III
  • 0 kudos

Deep clone will also clone all the metadata (e.g. indexes, properties, history, etc.), while SELECT * will create a new, fresh Delta table with its own history and properties.

Bhuvnesh
by New Contributor
  • 689 Views
  • 1 replies
  • 0 kudos

Unity Catalog

Hi, I have a requirement to set up Athena tables. We have a Unity Catalog setup in our Databricks workspace, and I would like to know whether Athena can be pointed at Unity Catalog so that all the tables are available in Athena. Whenever we...

Latest Reply
ArunKhandelwal
New Contributor II
  • 0 kudos

Unfortunately, as of now, there isn't a direct, seamless integration between Unity Catalog and Athena to automatically synchronize table updates. However, here are a few potential approaches to achieve your desired outcome: 1. AWS Glue Data Catalog: Man...

stevenayers-bge
by Contributor
  • 802 Views
  • 1 replies
  • 1 kudos

Querying Unity Managed Tables from Redshift

I built a script about 6 months ago to make our Delta tables accessible in Redshift for another team, but it's a bit nasty... Generate a Delta Lake manifest each time the Databricks Delta table is updated; recreate the Redshift external table (in case th...

Latest Reply
aashish122
New Contributor III
  • 1 kudos

Still searching for a solution to the same pain point... maybe something in the Marketplace to integrate Unity Catalog and Redshift.

