Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Some DLT pipelines suddenly seem to take a different runtime (16.1 instead of 15.4) since last night (CET)

cookiebaker
New Contributor III

Hello, 

Suddenly, since last night, some of our DLT pipelines are failing with errors saying that our hive_metastore control table cannot be found. All of our DLT pipelines are set up the same way (serverless), plus one on Shared Compute on runtime version 15.4. 

For the failing pipelines, the Update Details -> Logs -> Configuration tab shows that they picked up runtime "dlt:16.1.5-delta-pipelines-photon-dlt-release-dp-2025.15-rc0-commit-be1adbe-image-0e56918". 

The pipelines that succeeded all used "dlt:15.4.12-delta-pipelines-photon-dlt-release-dp-2025.15-rc0-commit-be1adbe-image-5055805" (i.e., starting with 15.4). 
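For context: serverless DLT pipelines do not pin an exact runtime version; Databricks selects it based on the pipeline's release channel. A minimal sketch of the relevant pipeline settings JSON (the pipeline name, target schema, and notebook path are placeholders, not from this thread):

```json
{
  "name": "my-dlt-pipeline",
  "serverless": true,
  "channel": "CURRENT",
  "target": "my_schema",
  "libraries": [
    { "notebook": { "path": "/Repos/team/pipelines/my_pipeline" } }
  ]
}
```

With `"channel": "CURRENT"` (or `"PREVIEW"`), the actual runtime build is chosen and rolled forward by Databricks, which is why the version string in the Configuration logs can change without any edit on the customer side.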


Did something change on the Databricks end? Nothing changed in our settings, so this looks like a sudden disruption of DLT pipelines that were previously running successfully.

 

Thank you in advance.

Best, Nigel

1 ACCEPTED SOLUTION

Accepted Solutions

voo-rodrigo
New Contributor III

Hello. FYI, last week I opened a ticket with Databricks Support. They identified a flag deployed last week as the culprit and rolled back the change. I tested it this morning and am now able to read the hive_metastore from serverless DLT pipelines. Thank you.


7 REPLIES

BigRoux
Databricks Employee

This kind of issue can arise from the automatic upgrade of the runtime version within the serverless environment, as some updates introduce changes that affect dependencies or configurations. Databricks runtime upgrades typically include bug fixes, enhancements, and new features, which can lead to compatibility issues with existing pipelines if the code or configuration depends on the previous runtime behavior.

Serverless is a fully managed service, which includes runtime management.

cookiebaker
New Contributor III

Thank you for your message. 

Where can I find the release notes of this latest change that caused the disruption?

We were forced to move our DLT pipelines off serverless and back to classic compute.

voo-rodrigo
New Contributor III

We are also experiencing the same issue. Since around 6 pm ET last night (04-22), our serverless DLT pipelines cannot read the hive_metastore. It doesn't seem to be tied to Databricks Runtime 16.1 itself, however, because previous runs on the same version were successful. We are able to run these DLT pipelines on managed clusters instead of serverless, even when those clusters are also on version 16.1, and regular notebooks and job notebooks on serverless can query the hive_metastore as well. The issue seems to affect only serverless DLT.
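The narrowing-down described above can be reproduced by running the same read on each compute type and comparing results. A hedged sketch, where `default` and `control_table` are placeholders for your own schema and table:

```sql
-- Run this from (a) a serverless notebook, (b) a classic cluster,
-- and (c) inside the DLT pipeline itself, to see which compute
-- can reach the Hive metastore. Names below are placeholders.
SHOW TABLES IN hive_metastore.default;

SELECT COUNT(*) AS row_count
FROM hive_metastore.default.control_table;
```

If (a) and (b) succeed but (c) fails, the problem is isolated to serverless DLT rather than the runtime version or the metastore itself, which matches the pattern reported in this thread.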

We wouldn't want to move all our pipelines off serverless, but that is the only setup that currently works for us. Could anything have changed last night specifically in the serverless DLT configuration?

JeanSeb
New Contributor II

Any update on this one? We've also been experiencing the same kind of issue since yesterday evening, with serverless DLT pipelines referring to Hive metastore tables.

BigRoux
Databricks Employee

Serverless release notes can be found here:  https://docs.databricks.com/aws/en/release-notes/serverless/


cookiebaker
New Contributor III

@voo-rodrigo Hello, thanks for the update on your end! I've tested as well and can confirm that DLT can read the hive_metastore via serverless again. 
