02-26-2025 02:42 AM
Hi all,
I have a very simple pipeline:
-- Databricks notebook source
CREATE OR REFRESH STREAMING TABLE `catalog-prod`.default.dlt_table AS
SELECT * FROM STREAM read_files('/Volumes/catalog-prod/storage/*', format=> 'json')
-- COMMAND ----------
CREATE OR REFRESH STREAMING TABLE `catalog-prod`.default.dlt_table2 AS SELECT * FROM STREAM `catalog-prod`.default.dlt_table
When I start the pipeline manually, everything works as intended. However, when I start an update with a refresh selection or full-refresh selection, the pipeline fails with an error saying the pipeline cluster is unreachable, as shown in the screenshot.
I’ve reproduced the issue on this simple pipeline and have also hit it in production pipelines. I first encountered it at the beginning of this month.
Could you please assist me? Has anyone else encountered the same issue?
I appreciate your help in advance.
Full error log:
{
"id": "5aa6fdd0-f427-11ef-bb95-aea0f26f3df1",
"sequence": {
"control_plane_seq_no": 1740563533361001
},
"origin": {
"cloud": "Azure",
"region": "westeurope",
"org_id": 1148767584619153,
"pipeline_id": "66a817ca-c44d-4f32-928e-68807ba63231",
"pipeline_type": "WORKSPACE",
"pipeline_name": "test",
"update_id": "4ab6e5b7-b01a-4bda-8fa8-544c20cb1f1c",
"request_id": "4ab6e5b7-b01a-4bda-8fa8-544c20cb1f1c"
},
"timestamp": "2025-02-26T09:52:13.357Z",
"message": "Update 4ab6e5 is FAILED.",
"level": "ERROR",
"error": {
"exceptions": [
{
"message": "BAD_REQUEST: Pipeline cluster is not reachable."
}
],
"internal_exceptions": [
{
"class_name": "com.databricks.api.base.DatabricksServiceException",
"message": "BAD_REQUEST: Pipeline cluster is not reachable.",
"stack": [
{
"declaring_class": "com.databricks.api.base.DatabricksServiceException$",
"method_name": "apply",
"file_name": "DatabricksServiceException.scala",
"line_number": 443
},
{
"declaring_class": "com.databricks.pipelines.deployment.server.loop.EventLoopErrorHandlerImpl$$anonfun$$nestedInanonfun$handleAPICallError$1$1",
"method_name": "applyOrElse",
"file_name": "EventLoopErrorHandler.scala",
"line_number": 512
},
{
"declaring_class": "com.databricks.pipelines.deployment.server.loop.EventLoopErrorHandlerImpl$$anonfun$$nestedInanonfun$handleAPICallError$1$1",
"method_name": "applyOrElse",
"file_name": "EventLoopErrorHandler.scala",
"line_number": 407
},
{
"declaring_class": "scala.PartialFunction$Lifted",
"method_name": "apply",
"file_name": "PartialFunction.scala",
"line_number": 228
},
{
"declaring_class": "scala.PartialFunction$Lifted",
"method_name": "apply",
"file_name": "PartialFunction.scala",
"line_number": 224
},
{
"declaring_class": "com.databricks.pipelines.deployment.server.loop.EventLoopErrorHandlerImpl",
"method_name": "calculateRecoveryActionUnbounded",
"file_name": "EventLoopErrorHandler.scala",
"line_number": 298
},
{
"declaring_class": "com.databricks.pipelines.deployment.server.loop.EventLoopErrorHandlerImpl",
"method_name": "calculateRecoveryAction",
"file_name": "EventLoopErrorHandler.scala",
"line_number": 239
},
{
"declaring_class": "com.databricks.pipelines.deployment.server.loop.EventLoopErrorHandlerImpl",
"method_name": "handleEventFailure",
"file_name": "EventLoopErrorHandler.scala",
"line_number": 198
},
{
"declaring_class": "com.databricks.pipelines.deployment.server.loop.EventLoopErrorHandlerImpl",
"method_name": "handleAPICallError",
"file_name": "EventLoopErrorHandler.scala",
"line_number": 402
},
{
"declaring_class": "com.databricks.pipelines.deployment.server.loop.DelegatingEventLoopErrorHandlerImpl",
"method_name": "handleAPICallError",
"file_name": "EventLoopErrorHandler.scala",
"line_number": 109
},
{
"declaring_class": "com.databricks.pipelines.deployment.server.loop.AbstractSerialEventLoop",
"method_name": "$anonfun$executeEvent$9",
"file_name": "AbstractSerialEventLoop.scala",
"line_number": 241
},
{
"declaring_class": "com.databricks.pipelines.deployment.server.loop.AbstractSerialEventLoop",
"method_name": "$anonfun$executeEvent$9$adapted",
"file_name": "AbstractSerialEventLoop.scala",
"line_number": 240
},
{
"declaring_class": "scala.PartialFunction",
"method_name": "applyOrElse",
"file_name": "PartialFunction.scala",
"line_number": 127
},
{
"declaring_class": "scala.PartialFunction",
"method_name": "applyOrElse$",
"file_name": "PartialFunction.scala",
"line_number": 126
},
{
"declaring_class": "scala.PartialFunction$$anon$1",
"method_name": "applyOrElse",
"file_name": "PartialFunction.scala",
"line_number": 257
},
{
"declaring_class": "com.databricks.pipelines.deployment.server.loop.AbstractSerialEventLoop",
"method_name": "$anonfun$executeEvent$3",
"file_name": "AbstractSerialEventLoop.scala",
"line_number": 240
},
{
"declaring_class": "com.databricks.pipelines.deployment.server.loop.AbstractSerialEventLoop",
"method_name": "$anonfun$executeEvent$3$adapted",
"file_name": "AbstractSerialEventLoop.scala",
"line_number": 197
},
{
"declaring_class": "com.databricks.logging.activity.ActivityContextFactory$",
"method_name": "$anonfun$withActivityInternal$12",
"file_name": "ActivityContextFactory.scala",
"line_number": 829
},
{
"declaring_class": "com.databricks.logging.AttributionContextTracing",
"method_name": "$anonfun$withAttributionContext$1",
"file_name": "AttributionContextTracing.scala",
"line_number": 49
},
{
"declaring_class": "com.databricks.logging.AttributionContext$",
"method_name": "$anonfun$withValue$1",
"file_name": "AttributionContext.scala",
"line_number": 293
},
{
"declaring_class": "scala.util.DynamicVariable",
"method_name": "withValue",
"file_name": "DynamicVariable.scala",
"line_number": 62
},
{
"declaring_class": "com.databricks.logging.AttributionContext$",
"method_name": "withValue",
"file_name": "AttributionContext.scala",
"line_number": 289
},
{
"declaring_class": "com.databricks.logging.AttributionContextTracing",
"method_name": "withAttributionContext",
"file_name": "AttributionContextTracing.scala",
"line_number": 47
},
{
"declaring_class": "com.databricks.logging.AttributionContextTracing",
"method_name": "withAttributionContext$",
"file_name": "AttributionContextTracing.scala",
"line_number": 44
},
{
"declaring_class": "com.databricks.logging.activity.ActivityContextFactory$",
"method_name": "withAttributionContext",
"file_name": "ActivityContextFactory.scala",
"line_number": 52
},
{
"declaring_class": "com.databricks.logging.activity.ActivityContextFactory$",
"method_name": "$anonfun$withActivityInternal$2",
"file_name": "ActivityContextFactory.scala",
"line_number": 829
},
{
"declaring_class": "com.databricks.context.integrity.IntegrityCheckContext$ThreadLocalStorage$",
"method_name": "withValue",
"file_name": "IntegrityCheckContext.scala",
"line_number": 73
},
{
"declaring_class": "com.databricks.logging.activity.ActivityContextFactory$",
"method_name": "withActivityInternal",
"file_name": "ActivityContextFactory.scala",
"line_number": 792
},
{
"declaring_class": "com.databricks.logging.activity.ActivityContextFactory$",
"method_name": "withChildActivity",
"file_name": "ActivityContextFactory.scala",
"line_number": 556
},
{
"declaring_class": "com.databricks.logging.activity.ActivityContextFactory$",
"method_name": "withBackgroundOrChildActivity",
"file_name": "ActivityContextFactory.scala",
"line_number": 345
},
{
"declaring_class": "com.databricks.pipelines.deployment.server.loop.AbstractSerialEventLoop",
"method_name": "$anonfun$executeEvent$1",
"file_name": "AbstractSerialEventLoop.scala",
"line_number": 197
},
{
"declaring_class": "scala.runtime.java8.JFunction0$mcV$sp",
"method_name": "apply",
"file_name": "JFunction0$mcV$sp.java",
"line_number": 23
},
{
"declaring_class": "com.databricks.pipelines.deployment.server.logging.PipelinesLogging",
"method_name": "$anonfun$withAttributionTags$1",
"file_name": "PipelinesLogging.scala",
"line_number": 59
},
{
"declaring_class": "com.databricks.logging.AttributionContextTracing",
"method_name": "$anonfun$withAttributionContext$1",
"file_name": "AttributionContextTracing.scala",
"line_number": 49
},
{
"declaring_class": "com.databricks.logging.AttributionContext$",
"method_name": "$anonfun$withValue$1",
"file_name": "AttributionContext.scala",
"line_number": 293
},
{
"declaring_class": "scala.util.DynamicVariable",
"method_name": "withValue",
"file_name": "DynamicVariable.scala",
"line_number": 62
},
{
"declaring_class": "com.databricks.logging.AttributionContext$",
"method_name": "withValue",
"file_name": "AttributionContext.scala",
"line_number": 289
},
{
"declaring_class": "com.databricks.logging.AttributionContextTracing",
"method_name": "withAttributionContext",
"file_name": "AttributionContextTracing.scala",
"line_number": 47
},
{
"declaring_class": "com.databricks.logging.AttributionContextTracing",
"method_name": "withAttributionContext$",
"file_name": "AttributionContextTracing.scala",
"line_number": 44
},
{
"declaring_class": "com.databricks.pipelines.deployment.server.loop.AbstractSerialEventLoop",
"method_name": "withAttributionContext",
"file_name": "AbstractSerialEventLoop.scala",
"line_number": 55
},
{
"declaring_class": "com.databricks.logging.AttributionContextTracing",
"method_name": "withAttributionTags",
"file_name": "AttributionContextTracing.scala",
"line_number": 96
},
{
"declaring_class": "com.databricks.logging.AttributionContextTracing",
"method_name": "withAttributionTags$",
"file_name": "AttributionContextTracing.scala",
"line_number": 77
},
{
"declaring_class": "com.databricks.pipelines.deployment.server.loop.AbstractSerialEventLoop",
"method_name": "com$databricks$pipelines$deployment$server$logging$PipelinesLogging$$super$withAttributionTags",
"file_name": "AbstractSerialEventLoop.scala",
"line_number": 55
},
{
"declaring_class": "com.databricks.pipelines.deployment.server.logging.PipelinesLogging",
"method_name": "withAttributionTags",
"file_name": "PipelinesLogging.scala",
"line_number": 50
},
{
"declaring_class": "com.databricks.pipelines.deployment.server.logging.PipelinesLogging",
"method_name": "withAttributionTags$",
"file_name": "PipelinesLogging.scala",
"line_number": 43
},
{
"declaring_class": "com.databricks.pipelines.deployment.server.loop.AbstractSerialEventLoop",
"method_name": "withAttributionTags",
"file_name": "AbstractSerialEventLoop.scala",
"line_number": 55
},
{
"declaring_class": "com.databricks.pipelines.deployment.server.loop.AbstractSerialEventLoop",
"method_name": "executeEvent",
"file_name": "AbstractSerialEventLoop.scala",
"line_number": 172
},
{
"declaring_class": "com.databricks.pipelines.deployment.server.loop.SerialEventLoop",
"method_name": "$anonfun$executeNextEvent$3",
"file_name": "SerialEventLoop.scala",
"line_number": 187
},
{
"declaring_class": "scala.runtime.java8.JFunction0$mcV$sp",
"method_name": "apply",
"file_name": "JFunction0$mcV$sp.java",
"line_number": 23
},
{
"declaring_class": "com.databricks.pipelines.deployment.server.loop.AbstractSerialEventLoop",
"method_name": "synchronizedFor",
"file_name": "AbstractSerialEventLoop.scala",
"line_number": 100
},
{
"declaring_class": "com.databricks.pipelines.deployment.server.loop.SerialEventLoop",
"method_name": "executeNextEvent",
"file_name": "SerialEventLoop.scala",
"line_number": 173
},
{
"declaring_class": "com.databricks.pipelines.deployment.server.loop.AbstractSerialEventLoop$$anon$1",
"method_name": "$anonfun$doRun$1",
"file_name": "AbstractSerialEventLoop.scala",
"line_number": 354
},
{
"declaring_class": "com.databricks.pipelines.deployment.server.loop.AbstractSerialEventLoop$$anon$1",
"method_name": "$anonfun$doRun$1$adapted",
"file_name": "AbstractSerialEventLoop.scala",
"line_number": 343
},
{
"declaring_class": "com.databricks.logging.activity.ActivityContextFactory$",
"method_name": "$anonfun$withBackgroundActivity$8",
"file_name": "ActivityContextFactory.scala",
"line_number": 705
},
{
"declaring_class": "com.databricks.logging.activity.ActivityContextFactory$",
"method_name": "$anonfun$withActivityInternal$12",
"file_name": "ActivityContextFactory.scala",
"line_number": 829
},
{
"declaring_class": "com.databricks.logging.AttributionContextTracing",
"method_name": "$anonfun$withAttributionContext$1",
"file_name": "AttributionContextTracing.scala",
"line_number": 49
},
{
"declaring_class": "com.databricks.logging.AttributionContext$",
"method_name": "$anonfun$withValue$1",
"file_name": "AttributionContext.scala",
"line_number": 293
},
{
"declaring_class": "scala.util.DynamicVariable",
"method_name": "withValue",
"file_name": "DynamicVariable.scala",
"line_number": 62
},
{
"declaring_class": "com.databricks.logging.AttributionContext$",
"method_name": "withValue",
"file_name": "AttributionContext.scala",
"line_number": 289
},
{
"declaring_class": "com.databricks.logging.AttributionContextTracing",
"method_name": "withAttributionContext",
"file_name": "AttributionContextTracing.scala",
"line_number": 47
},
{
"declaring_class": "com.databricks.logging.AttributionContextTracing",
"method_name": "withAttributionContext$",
"file_name": "AttributionContextTracing.scala",
"line_number": 44
},
{
"declaring_class": "com.databricks.logging.activity.ActivityContextFactory$",
"method_name": "withAttributionContext",
"file_name": "ActivityContextFactory.scala",
"line_number": 52
},
{
"declaring_class": "com.databricks.logging.activity.ActivityContextFactory$",
"method_name": "$anonfun$withActivityInternal$2",
"file_name": "ActivityContextFactory.scala",
"line_number": 829
},
{
"declaring_class": "com.databricks.context.integrity.IntegrityCheckContext$ThreadLocalStorage$",
"method_name": "withValue",
"file_name": "IntegrityCheckContext.scala",
"line_number": 73
},
{
"declaring_class": "com.databricks.logging.activity.ActivityContextFactory$",
"method_name": "withActivityInternal",
"file_name": "ActivityContextFactory.scala",
"line_number": 792
},
{
"declaring_class": "com.databricks.logging.activity.ActivityContextFactory$",
"method_name": "withActivityInternal",
"file_name": "ActivityContextFactory.scala",
"line_number": 774
},
{
"declaring_class": "com.databricks.logging.activity.ActivityContextFactory$",
"method_name": "$anonfun$withBackgroundActivity$6",
"file_name": "ActivityContextFactory.scala",
"line_number": 703
},
{
"declaring_class": "com.databricks.logging.AttributionContextTracing",
"method_name": "$anonfun$withAttributionContext$1",
"file_name": "AttributionContextTracing.scala",
"line_number": 49
},
{
"declaring_class": "com.databricks.logging.AttributionContext$",
"method_name": "$anonfun$withValue$1",
"file_name": "AttributionContext.scala",
"line_number": 293
},
{
"declaring_class": "scala.util.DynamicVariable",
"method_name": "withValue",
"file_name": "DynamicVariable.scala",
"line_number": 62
},
{
"declaring_class": "com.databricks.logging.AttributionContext$",
"method_name": "withValue",
"file_name": "AttributionContext.scala",
"line_number": 289
},
{
"declaring_class": "com.databricks.logging.AttributionContextTracing",
"method_name": "withAttributionContext",
"file_name": "AttributionContextTracing.scala",
"line_number": 47
},
{
"declaring_class": "com.databricks.logging.AttributionContextTracing",
"method_name": "withAttributionContext$",
"file_name": "AttributionContextTracing.scala",
"line_number": 44
},
{
"declaring_class": "com.databricks.logging.activity.ActivityContextFactory$",
"method_name": "withAttributionContext",
"file_name": "ActivityContextFactory.scala",
"line_number": 52
},
{
"declaring_class": "com.databricks.logging.activity.ActivityContextFactory$",
"method_name": "withBackgroundActivity",
"file_name": "ActivityContextFactory.scala",
"line_number": 703
},
{
"declaring_class": "com.databricks.pipelines.deployment.server.loop.AbstractSerialEventLoop$$anon$1",
"method_name": "doRun",
"file_name": "AbstractSerialEventLoop.scala",
"line_number": 343
},
{
"declaring_class": "com.databricks.threading.NamedThread",
"method_name": "$anonfun$run$2",
"file_name": "NamedThread.scala",
"line_number": 69
},
{
"declaring_class": "scala.runtime.java8.JFunction0$mcV$sp",
"method_name": "apply",
"file_name": "JFunction0$mcV$sp.java",
"line_number": 23
},
{
"declaring_class": "com.databricks.logging.AttributionContextTracing",
"method_name": "$anonfun$withAttributionContext$1",
"file_name": "AttributionContextTracing.scala",
"line_number": 49
},
{
"declaring_class": "com.databricks.logging.AttributionContext$",
"method_name": "$anonfun$withValue$1",
"file_name": "AttributionContext.scala",
"line_number": 293
},
{
"declaring_class": "scala.util.DynamicVariable",
"method_name": "withValue",
"file_name": "DynamicVariable.scala",
"line_number": 62
},
{
"declaring_class": "com.databricks.logging.AttributionContext$",
"method_name": "withValue",
"file_name": "AttributionContext.scala",
"line_number": 289
},
{
"declaring_class": "com.databricks.logging.AttributionContextTracing",
"method_name": "withAttributionContext",
"file_name": "AttributionContextTracing.scala",
"line_number": 47
},
{
"declaring_class": "com.databricks.logging.AttributionContextTracing",
"method_name": "withAttributionContext$",
"file_name": "AttributionContextTracing.scala",
"line_number": 44
},
{
"declaring_class": "com.databricks.threading.NamedThread",
"method_name": "withAttributionContext",
"file_name": "NamedThread.scala",
"line_number": 25
},
{
"declaring_class": "com.databricks.threading.NamedThread",
"method_name": "$anonfun$run$1",
"file_name": "NamedThread.scala",
"line_number": 68
},
{
"declaring_class": "scala.runtime.java8.JFunction0$mcV$sp",
"method_name": "apply",
"file_name": "JFunction0$mcV$sp.java",
"line_number": 23
},
{
"declaring_class": "com.databricks.context.integrity.IntegrityCheckContext$ThreadLocalStorage$",
"method_name": "withValue",
"file_name": "IntegrityCheckContext.scala",
"line_number": 73
},
{
"declaring_class": "com.databricks.threading.NamedThread",
"method_name": "run",
"file_name": "NamedThread.scala",
"line_number": 67
}
]
}
],
"fatal": false
},
"details": {
"update_progress": {
"state": "FAILED"
}
},
"event_type": "update_progress",
"maturity_level": "STABLE"
}
Accepted Solutions
02-26-2025 08:49 AM
Hi,
thank you for your response. In the meantime I found the bug in the Databricks UI that causes this behaviour, and I will raise a ticket with Databricks. Please see the draft of the ticket below, which includes a workaround:
We’re facing an issue with Delta Live Tables pipelines. Specifically, it prevents us from using refresh selection or full-refresh selection on tables in a catalog whose name contains a dash ('-').
To illustrate the problem, let’s consider three Databricks CLI requests.
> databricks pipelines start-update 66a817ca-c44d-4f32-928e-68807ba63231 --json '{"full_refresh_selection": ["gbs_core-adc_prod.default.dlt_table2"]}'
> databricks pipelines start-update 66a817ca-c44d-4f32-928e-68807ba63231 --json '{"full_refresh_selection": ["`gbs_core-adc_prod`.default.dlt_table2"]}'
> databricks pipelines start-update 66a817ca-c44d-4f32-928e-68807ba63231
The first DLT update fails because the catalog name contains a dash. The second update succeeds because the catalog name is escaped with backticks. The third update succeeds because the request doesn’t select any tables.
However, the Databricks UI doesn’t escape the catalog name, so every refresh selection or full-refresh selection started from the UI fails. This prevents us from using the UI for these operations and affects all our DLT pipelines, including production ones.
We expect this issue to affect other customers whose catalog names contain a dash ('-'), so we consider it high priority.
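Until the UI is fixed, the workaround is to start updates via the CLI or API with backtick-escaped names, as in the second command above. A small helper like the sketch below can build the escaped fully qualified name and the JSON payload. This is an assumption-labeled sketch: the helper names are hypothetical, and it quotes all three identifier parts (doubling any embedded backticks), which should also cover names that only need the catalog escaped.

```python
import json

def quote_ident(part: str) -> str:
    """Backtick-quote one identifier part, doubling any embedded backticks."""
    return "`" + part.replace("`", "``") + "`"

def full_name(catalog: str, schema: str, table: str) -> str:
    """Build a fully qualified table name that is safe for dashes in names."""
    return ".".join(quote_ident(p) for p in (catalog, schema, table))

# Payload for: databricks pipelines start-update <pipeline-id> --json '<payload>'
payload = json.dumps(
    {"full_refresh_selection": [full_name("gbs_core-adc_prod", "default", "dlt_table2")]}
)
print(payload)
# → {"full_refresh_selection": ["`gbs_core-adc_prod`.`default`.`dlt_table2`"]}
```

Passing the printed payload to `--json` reproduces the working (second) CLI call without hand-escaping each name.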
02-26-2025 08:40 AM
Can you copy paste the YAML or JSON version of your DLT pipeline so we can see all the settings you are using?