Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.
Forum Posts

Pat
by Esteemed Contributor
  • 1740 Views
  • 1 reply
  • 2 kudos

Pipeline Tags in development mode

Hey, I couldn't find this documented anywhere, but I have been deploying Databricks Workflows (now called Jobs, I believe) with Pipelines using DABs. I have a single set of configuration, so there is no place for human error here. When I deploy the bundle in DE...

Latest Reply
sarahbhord
Databricks Employee

Hey Pat - thanks for reaching out. There’s no official intent for tag visibility to differ between environments when the config and deployments are consistent. Is the CLI version the same for prod and dev workspaces? Can you make sure that all of the...

Sagar_0607
by New Contributor
  • 155 Views
  • 1 reply
  • 1 kudos

Need the output of a task in Databricks job in JSON format

Where can I see the logs in JSON format of the output produced by a task in Databricks jobs?

Latest Reply
szymon_dybczak
Esteemed Contributor III

Hi @Sagar_0607, you can use the following REST API endpoint, which lets you retrieve the output and metadata of a single task run: Get the output for a single run | Jobs API | REST API reference | Databricks on AWS. With this endpoint, when a notebook task...
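A minimal sketch of calling that endpoint with Python's requests library; the workspace URL, token, and run_id below are placeholders, not values from this thread:

```python
import json
import requests

# Placeholders - substitute your workspace URL, a personal access token,
# and the run_id of the specific task run.
HOST = "https://<your-workspace>.cloud.databricks.com"
TOKEN = "<personal-access-token>"
TASK_RUN_ID = 123456789

resp = requests.get(
    f"{HOST}/api/2.1/jobs/runs/get-output",
    headers={"Authorization": f"Bearer {TOKEN}"},
    params={"run_id": TASK_RUN_ID},
    timeout=60,
)
resp.raise_for_status()

# The response body is JSON; for a notebook task, the value passed to
# dbutils.notebook.exit() appears under notebook_output.result.
print(json.dumps(resp.json(), indent=2))
```

Note that run_id here is the ID of the individual task run, not the parent job run.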

santosh-santosh
by New Contributor II
  • 598 Views
  • 8 replies
  • 0 kudos

Create external tables with properties set in delta log and no collation

There is an external Delta Lake table that needs to be mounted onto Unity Catalog. It already has some properties configured in the _delta_log folder. When trying to create the table using CREATE TABLE catalog_name.schema_name.table_name USING DELTA LOCATIO...

Latest Reply
ManojkMohan
Honored Contributor

@santosh-santosh Did you execute, part by part, the steps I shared in the DM? Step 0: Define your external tables. Step 1: Inspect the external Delta table schema & properties. Success check: ensure all expected properties are captured. Step 2: Inspect Uni...
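For readers following along, here is a rough sketch of that inspect-then-register flow using spark.sql in a notebook; the catalog, schema, and table names and the storage path are placeholders, and it assumes the location is already reachable as a Unity Catalog external location:

```python
# Placeholder path to the existing external Delta table (assumption, not from the thread)
path = "abfss://container@account.dfs.core.windows.net/tables/my_table"

# Inspect the schema and properties already stored in _delta_log
spark.sql(f"DESCRIBE DETAIL delta.`{path}`").show(truncate=False)
spark.sql(f"SHOW TBLPROPERTIES delta.`{path}`").show(truncate=False)

# Register the existing location as an external table in Unity Catalog;
# properties present in _delta_log come from the log rather than being redefined here
spark.sql(f"""
    CREATE TABLE IF NOT EXISTS catalog_name.schema_name.table_name
    USING DELTA
    LOCATION '{path}'
""")

# Success check: compare what UC reports against the original properties
spark.sql("SHOW TBLPROPERTIES catalog_name.schema_name.table_name").show(truncate=False)
```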

7 More Replies
eballinger
by Contributor
  • 1501 Views
  • 6 replies
  • 0 kudos

Email notification to end users

Is there a way we can notify all of our Databricks end users by email when there is an issue? We currently have our jobs set up to notify the technical team when a job workflow fails. That part works fine. But we would like the ability to maybe us...

Latest Reply
AnanthuR
New Contributor II

Hello, I have a similar question! I'm running a data pipeline on Databricks, and at the end of the pipeline, I generate some results. I'd like to notify the relevant people automatically via email (or another messaging method) with a summary of the result...
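The thread doesn't settle on a mechanism, but one common pattern is a final job task that emails a summary over SMTP; the server, credentials, and recipients below are placeholder assumptions:

```python
import smtplib
from email.message import EmailMessage

# Placeholders - SMTP host, credentials, and recipients are assumptions,
# not values from the thread.
SMTP_HOST = "smtp.example.com"
SMTP_USER = "pipeline-bot@example.com"
SMTP_PASSWORD = "<app-password>"
RECIPIENTS = ["data-consumers@example.com"]

def send_summary(subject: str, body: str) -> None:
    msg = EmailMessage()
    msg["From"] = SMTP_USER
    msg["To"] = ", ".join(RECIPIENTS)
    msg["Subject"] = subject
    msg.set_content(body)
    with smtplib.SMTP(SMTP_HOST, 587) as server:
        server.starttls()
        server.login(SMTP_USER, SMTP_PASSWORD)
        server.send_message(msg)

# Example: call this as the last step of the pipeline with the run's summary
send_summary("Pipeline run finished", "<summary of results>")
```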

5 More Replies
Hanfo2back
by New Contributor III
  • 567 Views
  • 5 replies
  • 4 kudos

Resolved! DLT Pipeline Failed to create new KafkaAdminClient SQLSTATE: XXKST:

I encountered the error: No LoginModule found for org.apache.kafka.common.security.scram.ScramLoginModule while consuming data from Kafka using a Databricks pipeline. The pipeline had been running smoothly before, but the error appeared on September ...

Latest Reply
Advika
Databricks Employee

Hello @Hanfo2back! Can you please try changing the SASL login string to use kafkashaded.org.apache.kafka.common.security.scram.ScramLoginModule instead of org.apache.kafka.common.security.scram.ScramLoginModule?
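A sketch of what that change can look like in a Python streaming read, assuming SCRAM over SASL_SSL; the brokers, credentials, and topic are placeholders:

```python
# JAAS string using the shaded class name suggested above (credentials are placeholders)
jaas = (
    "kafkashaded.org.apache.kafka.common.security.scram.ScramLoginModule required "
    'username="<kafka-user>" password="<kafka-password>";'
)

df = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "<broker1>:9092")
    .option("kafka.security.protocol", "SASL_SSL")
    .option("kafka.sasl.mechanism", "SCRAM-SHA-512")
    .option("kafka.sasl.jaas.config", jaas)  # note the kafkashaded. prefix
    .option("subscribe", "<topic>")
    .load()
)
```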

4 More Replies
jin2631816
by New Contributor II
  • 987 Views
  • 5 replies
  • 1 kudos

[Free Edition] Outbound internet suddenly blocked - Error: HTTPSConnectionPool(host='www.google.com'

Hi guys, I'm using the new Databricks Free Edition, and I'm seeing what looks like a sudden change in outbound internet access policy. Yesterday morning, I was able to access external APIs and test simple internet calls using Python and %sh commands in...

Latest Reply
WiliamRosa
Contributor

Hi @test_user_12, @jin2631816, if it helps, here's the official documentation with the limitations of the Free Edition: https://docs.databricks.com/aws/en/getting-started/free-edition-limitations

4 More Replies
Bedoonraj
by New Contributor II
  • 420 Views
  • 3 replies
  • 0 kudos

TEMPORARILY_UNAVAILABLE: The service at /api/2.1/unity-catalog/tables is taking too long to process

I'm using DBT to run a model in Databricks. I have a view model, which holds 2 months of data (~2 million). There is no wide dependency transformation; all are CASE WHEN statements. The total column count is 234. Until yesterday the view was running fine, but toda...

Latest Reply
WiliamRosa
Contributor

Hi @Bedoonraj, I tested the API call and it worked fine. I also confirmed, just like Khaja_Zaffer mentioned, that there's no instability with this service. I'd suggest checking your cluster settings and, if possible, trying the test on a different com...
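For anyone who wants to reproduce that check, a minimal sketch of hitting the same endpoint with Python's requests library; the workspace URL, token, catalog, and schema are placeholders:

```python
import requests

# Placeholders - substitute your workspace URL, token, catalog, and schema.
HOST = "https://<your-workspace>.cloud.databricks.com"
TOKEN = "<personal-access-token>"

resp = requests.get(
    f"{HOST}/api/2.1/unity-catalog/tables",
    headers={"Authorization": f"Bearer {TOKEN}"},
    params={"catalog_name": "<catalog>", "schema_name": "<schema>", "max_results": 5},
    timeout=60,
)

# A 200 here suggests the service itself is responsive; a timeout or 5xx
# points back at the TEMPORARILY_UNAVAILABLE symptom.
print(resp.status_code)
print(resp.json())
```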

2 More Replies
BMex
by New Contributor III
  • 695 Views
  • 3 replies
  • 2 kudos

Resolved! Issue with Databricks Jobs: SQLSTATE: XXKST

Hi, we have our Databricks Jobs deployed via DABs, and they have been running fine for a while now (approximately 1 month since we migrated from ADF). However, since yesterday, we are getting a weird issue while writing. See error below: [STREAM_FAILED...

Labels: Data Engineering, Databricks, databricks-sql, jobs, spark, sqlstate
Latest Reply
WiliamRosa
Contributor

Hi @BMex, the link I shared with a similar issue contains some solutions. Did any of them work for you?

2 More Replies
ManojkMohan
by Honored Contributor
  • 478 Views
  • 1 reply
  • 0 kudos

Resolved! Databricks to Salesforce | Unity Catalog Query

Ask: Can we get a UC catalog (like prod or genie) in the Free Edition of Databricks? Problem I am solving: structuring data in Databricks before sending customer and account data to Salesforce. Issue: cannot see workspace-local tables (workspace.default.structur...

Latest Reply
RogerThatttt
New Contributor III

The root cause of not seeing your workspace-local tables (workspace.default.structured_pdf_table) is the unavailability of a Unity Catalog or Delta Sharing connector configuration in your Free Edition workspace. To resolve this, you typically need admin ...

NUKSY
by New Contributor II
  • 1257 Views
  • 4 replies
  • 0 kudos

`io.unitycatalog.client.model.TableType`, Unexpected value 'MATERIALIZED_VIEW

I have been able to set up the JDBC driver with Databricks to connect to my Unity Catalog using local Spark sessions. When I try to retrieve tables in my schema, I get this error: An error occurred while calling o43.sql.: io.unitycatalog.client.ApiExcepti...

Latest Reply
NandiniN
Databricks Employee

Hi @NUKSY, @Jofes, this should be reported as a bug; see the similar issues already reported: https://github.com/unitycatalog/unitycatalog/issues/657 and https://github.com/unitycatalog/unitycatalog/issues/1077. Thanks!

3 More Replies
tenzinpro
by New Contributor II
  • 677 Views
  • 2 replies
  • 2 kudos

Resolved! delta live tables

Hi, I have a source table that is a Delta Live streaming table created using dlt.auto_cdc logic, and now I want to create another streaming table that filters the records from that table per client, but it also should have auto CDC logic for the...

Latest Reply
NandiniN
Databricks Employee

Hi @tenzinpro, this is an expected error: "[DELTA_SOURCE_TABLE_IGNORE_CHANGES] Detected a data update". As explained in the error, this is currently not supported. If this is going to happen regularly and you are okay to skip changes, set the option ...
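The reply is cut off before it names the option; in similar cases the error text points at skipChangeCommits, so here is a sketch under that assumption, with table and filter names as placeholders:

```python
import dlt

# Downstream streaming table that filters the CDC-managed source per client.
# skipChangeCommits tells the stream to ignore update/delete commits in the source,
# which is only appropriate if you are okay with skipping those changes.
@dlt.table(name="client_a_events")
def client_a_events():
    return (
        spark.readStream
        .option("skipChangeCommits", "true")
        .table("LIVE.source_cdc_table")   # placeholder source table name
        .where("client_id = 'A'")         # placeholder filter
    )
```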

1 More Reply
Wasubabu
by New Contributor II
  • 570 Views
  • 3 replies
  • 0 kudos

Embed AI/BI Dashboards into Databricks App

Hello, I'm interested in understanding whether it's possible to embed multiple AI/BI dashboards created in Databricks within a Databricks app. Could you please share the steps or provide any documentation related to this? My goal is to use the app as ...

Latest Reply
Wasubabu
New Contributor II

Just checking if anyone has already implemented this. Please share your thoughts.

2 More Replies
Mahesh_rathi__
by New Contributor II
  • 687 Views
  • 4 replies
  • 1 kudos

How to fetch spark.addFiles files when using a multi-node cluster

I wanted to share nearly 12 XML files from a DBFS location to the executor local path by using sc.addFile. I went to your blog and tweaked my code to form the path with file:///. The result was that it worked when we have only one node but threw an erro...

Latest Reply
K_Anudeep
Databricks Employee

Hello @Mahesh_rathi__ , SparkContext.addFile is for shipping small side files to executors, not for creating an input path that you can pass to sc.textFile("file://..."). On a single-node cluster the driver and executor share the same machine, so the...
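A small sketch of the pattern described above, assuming the files are fetched inside tasks with SparkFiles.get rather than used as an input path; the DBFS path is a placeholder:

```python
from pyspark import SparkFiles

sc = spark.sparkContext

# Ship the file to every executor (placeholder path)
sc.addFile("dbfs:/FileStore/xml/sample1.xml")

def read_on_executor(rows):
    # Runs on the executor: SparkFiles.get resolves that executor's local copy
    local_path = SparkFiles.get("sample1.xml")
    with open(local_path) as f:
        first_line = f.readline().strip()
    for row in rows:
        yield (row, first_line)

# Works the same on single-node and multi-node clusters, because each executor
# resolves its own local copy instead of relying on a driver-side file:/// path.
print(sc.parallelize(range(4), 2).mapPartitions(read_on_executor).collect())
```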

3 More Replies
kenmyers-8451
by Contributor
  • 1069 Views
  • 9 replies
  • 13 kudos

Workflows now harder to find old failed runs

Some time in the past few weeks I think there was an update to Databricks Workflows. Previously you could: run a workflow; it fails; repair the workflow; click into the workflow; view past runs before the failure via a dropdown bar (like in the screenshot bel...

[screenshots attached]
Latest Reply
hansonma-8451
New Contributor II

I am a Databricks Admin in the workspace that @kenmyers-8451 is having problems in, and I am getting the same issue where the retries show up for a brief second but then redirect/refresh and the retries disappear. This seems to happen when the wor...

8 More Replies
kranthit
by New Contributor II
  • 345 Views
  • 2 replies
  • 0 kudos

Serverless base env setup in Databricks Asset Bundle (DAB)

I am trying to set a base environment for my task (notebook) which is running on serverless. Following is the DAB YAML I am using. When I did bundle deploy -t users, it's not throwing any error, but it's not installing the libraries from the base env. Ca...

Latest Reply
Yogesh_Verma_
Contributor

Your YAML is valid, but the reason the libraries are not being installed is that base_environment_path is not supported for serverless compute. Serverless jobs use a fully managed environment, and you can't override it with a custom base environmen...

1 More Reply
