I have tried that, but I received the following error:
DataPlaneException: Failed to start the DLT service on cluster <cluster_id>. Please check the stack trace below or driver logs for more details.
com.databricks.pipelines.execution.service.EventLogInitializationException: Failed to initialize event log
java.io.IOException: Error accessing gs://<path>
shaded.databricks.com.google.api.client.googleapis.json.GoogleJsonResponseException: 403 Forbidden
GET https://storage.googleapis.com/storage/v1/b/<path>?fields=bucket,name,timeCreated,updated,generation...
{
"code" : 403,
"errors" : [ {
"domain" : "global",
"message" : "Caller does not have storage.objects.get access to the Google Cloud Storage object. Permission 'storage.objects.get' denied on resource (or it may not exist).",
"reason" : "forbidden"
} ],
"message" : "Caller does not have storage.objects.get access to the Google Cloud Storage object. Permission 'storage.objects.get' denied on resource (or it may not exist)."
}
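The 403 above indicates that the service account the cluster runs as lacks `storage.objects.get` on the storage path. A minimal sketch of granting bucket access with `gsutil`, using hypothetical bucket and service-account names (substitute your own; this is an illustration, not a confirmed fix for this pipeline):

```shell
# Hypothetical names — replace with your actual bucket and service account.
BUCKET="gs://my-dlt-storage-bucket"
SA="sa@my-project.iam.gserviceaccount.com"

# roles/storage.objectViewer covers storage.objects.get (read access).
gsutil iam ch "serviceAccount:${SA}:roles/storage.objectViewer" "${BUCKET}"

# DLT also writes the event log and checkpoints, so object create/delete
# is typically needed as well:
gsutil iam ch "serviceAccount:${SA}:roles/storage.objectAdmin" "${BUCKET}"
```

After granting the role, it can take a minute or two for IAM changes to propagate before retrying the pipeline.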
Alternatively, I also tried to edit the Delta Live Tables cluster from the UI by adding the service account `sa` under the Google Service Account block. Saving the cluster failed with:
Error : Dlt prefixed spark images cannot be used outside of Delta live tables service
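That error is expected, since DLT-managed clusters cannot be edited through the regular cluster UI. The service account is normally set in the pipeline's own settings instead. A hedged sketch of the relevant fragment of the pipeline JSON (the `google_service_account` email here is a hypothetical placeholder):

```json
{
  "clusters": [
    {
      "label": "default",
      "gcp_attributes": {
        "google_service_account": "sa@my-project.iam.gserviceaccount.com"
      }
    }
  ]
}
```

This fragment would go in the pipeline's settings (e.g. via the pipeline UI's JSON editor), so the cluster is recreated with the service account attached rather than edited in place.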