Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

Nishant_Kumar25
by New Contributor
  • 920 Views
  • 2 replies
  • 0 kudos

Resolved! Cluster Issues while assigning it to Notebook

Hi Team Databricks, I have tried to assign a cluster to a notebook in 2 different Community Edition workspaces and it throws an error like: Notebook Detached: Exception when creating execution context: java.net.SocketTimeoutException: Connect Timeout. The above error ...

Latest Reply
Advika
Community Manager
  • 0 kudos

Hello @Nishant_Kumar25 & @kaleabgirma! There was a recent issue affecting Community Edition clusters, but it has now been mitigated. New cluster creation has been tested and is working as expected. If you're still encountering the error, please try r...

1 More Replies
_singh_vish
by New Contributor III
  • 2049 Views
  • 3 replies
  • 0 kudos

DLT Apply Changes problem

Hi All, I am working on a DLT pipeline to create SCD2 for my bronze layer. My architecture has 4 layers: Raw, Bronze, Silver, Gold. I am ingesting data directly into Raw, and then I am creating history (SCD2) in Bronze. My code: @Dlt.view(n...

Latest Reply
Stefan-Koch
Databricks Partner
  • 0 kudos

Hi @_singh_vish Can you provide some error-logs/messages?

2 More Replies
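The reply above asks for error logs, but for readers landing here: the SCD2 bookkeeping that DLT's `apply_changes(..., stored_as_scd_type=2)` performs can be sketched in plain Python. This is a toy illustration of the concept only; the function and record shapes below are hypothetical, not the DLT API.

```python
# Toy SCD2 bookkeeping: each key keeps a history of value versions with
# validity intervals; end=None marks the current row. This only illustrates
# the idea behind dlt.apply_changes(..., stored_as_scd_type=2).

def apply_scd2(history, changes):
    """history: list of {key, value, start, end} dicts;
    changes: (key, value, ts) tuples, assumed ordered by ts per key."""
    for key, value, ts in changes:
        current = next((r for r in history
                        if r["key"] == key and r["end"] is None), None)
        if current and current["value"] == value:
            continue                      # no-op change: keep current row
        if current:
            current["end"] = ts           # close out the superseded version
        history.append({"key": key, "value": value, "start": ts, "end": None})
    return history

h = apply_scd2([], [("u1", "NY", 1), ("u1", "SF", 3), ("u2", "LA", 2)])
```

The real `apply_changes` also handles deletes, out-of-order events via `sequence_by`, and column tracking; none of that is modeled here.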
sergecom
by New Contributor III
  • 1142 Views
  • 1 reply
  • 1 kudos

Resolved! Automating Purging of All Notebook Revisions

Hi everyone, We work with sensitive data in Databricks, so it's crucial from both security and regulatory perspectives to purge all data saved in notebook revisions. Currently, there are two manual methods: Delete all history from each notebook individu...

Latest Reply
Louis_Frolio
Databricks Employee
  • 1 kudos

Here are some things to consider. Automating the purging of notebook revision history in Databricks is not currently a directly supported feature, and there are some challenges in achieving it. Available Methods: Currently, Databricks provides ma...

Tom_Greenwood
by New Contributor III
  • 24301 Views
  • 14 replies
  • 5 kudos

UDF importing from other modules

Hi community, I am using a PySpark UDF. The function is being imported from a repo (in the Repos section) and registered as a UDF in the notebook. I am getting a PythonException error when the transformation is run. This is coming from the databric...

Latest Reply
rich_avery
New Contributor III
  • 5 kudos

I just ran into and solved this issue. My problem was that, in the Python script I loaded in as a module, I defined the function I planned to use as a UDF separately from the function I actually called in my script. I believe that bec...

13 More Replies
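The fix above points at a general gotcha: a Python UDF must be serializable and importable on the executors. A stdlib-only illustration of the serialization half (Spark actually ships functions with cloudpickle, which handles more cases than plain pickle, but module-level functions remain the safest shape for UDFs; the function names below are made up):

```python
import pickle

def module_level(x):
    return x + 10            # top-level functions pickle by reference

def make_closure():
    offset = 10
    def inner(x):            # locally defined function: plain pickle refuses it
        return x + offset
    return inner

# Module-level function: serializable.
ok = pickle.dumps(module_level) is not None

# Nested closure: plain pickle raises ("Can't pickle local object ...").
try:
    pickle.dumps(make_closure())
    closure_picklable = True
except (pickle.PicklingError, AttributeError):
    closure_picklable = False
```

Reference-style pickling also means the defining module must be importable on the worker, which is why a UDF imported from a repo can fail on executors that don't have that repo path on `sys.path`.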
dtabass
by New Contributor III
  • 82480 Views
  • 6 replies
  • 9 kudos

How/where can I see a list of my dbfs files?

When using the Community Edition, I'm trying to find a place in the UI where I can browse the files I've uploaded to DBFS. How/where can I do that? When I try to view them from the Data sidebar I see nothing, yet I know they're there, as if I us...

Latest Reply
suman23479
New Contributor II
  • 9 kudos

This is helpful. After enabling it, I am able to see it.

5 More Replies
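As the replies note, the DBFS file browser is an admin setting that must be switched on before anything appears in the UI. Once DBFS access is available, the same listing can be produced from any notebook; a minimal sketch (the path is an example, and `dbutils` is only available inside a Databricks notebook):

```python
# List uploaded files programmatically; /FileStore/tables is where
# UI uploads typically land.
for info in dbutils.fs.ls("/FileStore/tables"):
    print(info.path, info.size)
```

The `%fs ls /FileStore/tables` magic command is an equivalent one-liner.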
taschi
by New Contributor III
  • 15568 Views
  • 7 replies
  • 8 kudos

Resolved! How can I trigger the execution of a specific step within a Databricks Workflow job?

I'm investigating methods to test a Job starting from a particular step. For instance, if I've made modifications midway through a 50+ step Job, is there a way to test the Job without running the steps that precede the one with the modification?

Latest Reply
SamAdams
Contributor
  • 8 kudos

It's now generally available

6 More Replies
minhhung0507
by Valued Contributor
  • 4061 Views
  • 3 replies
  • 0 kudos

DeltaFileNotFoundException: [DELTA_TRUNCATED_TRANSACTION_LOG] Error in Streaming Table

I am encountering a recurring issue while working with Delta streaming tables in my system. The error message is as follows: com.databricks.sql.transaction.tahoe.DeltaFileNotFoundException: [DELTA_TRUNCATED_TRANSACTION_LOG] gs://cimb-prod-lakehouse/b...

Latest Reply
NandiniN
Databricks Employee
  • 0 kudos

The issue you're encountering with the error DeltaFileNotFoundException: [DELTA_TRUNCATED_TRANSACTION_LOG] is related to Delta Lake's retention policy for logs and checkpoints, which manages the lifecycle of transaction log files and checkpoint files...

2 More Replies
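The reply above points at Delta's log and checkpoint retention as the cause. When a stream legitimately needs to read further back than the retained log, the retention windows can be widened on the source table. A sketch (the property names are real Delta table properties; the table name and intervals are examples, and this must run in a Spark session with access to the table):

```python
# Widen how long Delta keeps transaction log entries and checkpoints.
# Defaults: logRetentionDuration = 30 days, checkpointRetentionDuration = 2 days.
spark.sql("""
  ALTER TABLE my_source_table SET TBLPROPERTIES (
    'delta.logRetentionDuration' = 'interval 30 days',
    'delta.checkpointRetentionDuration' = 'interval 7 days'
  )
""")
```

Longer retention means more small files in `_delta_log`, so widen only as far as the stream's maximum expected downtime.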
seanstachff
by New Contributor II
  • 3961 Views
  • 2 replies
  • 0 kudos

Databricks SQL Error outputting sensitive data to logs

Hi - I am using `from_json` with FAILFAST to correctly format some data using Databricks SQL. However, this function can return the error "[MALFORMED_RECORD_IN_PARSING.WITHOUT_SUGGESTION] Malformed records are detected in record parsing" with the res...

Latest Reply
NandiniN
Databricks Employee
  • 0 kudos

You could use the mode option (default PERMISSIVE), which controls how corrupt records are handled during parsing. PERMISSIVE: when it meets a corrupted record, it puts the malformed string into a field configured by columnNameOfCorruptRecord and sets malformed fie...

1 More Replies
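Concretely, the mode described above can be passed straight into `from_json`, so parse failures yield nulls instead of an error message that echoes the raw record. A sketch (`df` and the column/field names are hypothetical; this needs a Spark session):

```python
from pyspark.sql.functions import from_json, col

# PERMISSIVE (the default) nulls out fields it cannot parse instead of
# raising, so the raw payload never lands in driver error logs.
parsed = df.select(
    from_json(col("raw_payload"), "a INT, b STRING",
              {"mode": "PERMISSIVE"}).alias("parsed")
)
```

Rows where `parsed` comes back null can then be routed to a quarantine table for inspection without logging their contents.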
dbuenosilva
by New Contributor
  • 4593 Views
  • 2 replies
  • 0 kudos

Auto loader from tables in Delta Share

Hello, I am trying to read a Delta table in Delta Shares shared from other environments. The pipeline runs okay; however, as the Delta table is updated in the source (Delta Share in GCP), the code below gets an error unless I reset the checkpoint. I wond...

Latest Reply
NandiniN
Databricks Employee
  • 0 kudos

The error you are encountering—DeltaUnsupportedOperationException: [DELTA_SOURCE_TABLE_IGNORE_CHANGES]—occurs because your streaming job detects updates in the source Delta table, which is not supported for the type of source you have. Streaming tab...

1 More Replies
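For streams that must tolerate updates in a Delta (or Delta Sharing) source, the usual escape hatch is the `skipChangeCommits` reader option, which ignores commits that only rewrite existing rows rather than failing the stream. A sketch (the table name is a placeholder, and note this silently drops those row changes instead of propagating them downstream):

```python
# Stream from a shared table, skipping update/delete commits in the source.
df = (spark.readStream
      .option("skipChangeCommits", "true")
      .table("share_catalog.schema.shared_table"))
```

If the downstream table must reflect updates and deletes, Change Data Feed on the source is the alternative to consider.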
pvaz
by New Contributor II
  • 4678 Views
  • 2 replies
  • 1 kudos

Performance issue when using structured streaming

Hi Databricks community! Let me first apologize for the long post. I'm implementing a system in Databricks to read from a Kafka stream into the bronze layer of a Delta table. The idea is to do some operations on the data that is coming from Kafka, mainl...

Latest Reply
NandiniN
Databricks Employee
  • 1 kudos

Have you tried using minPartitions, the minimum number of partitions to read from Kafka? You can configure Spark to use an arbitrary minimum of partitions to read from Kafka using the minPartitions option. Normally Spark has a 1:1 mapping of Kafka topicPa...

1 More Replies
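The `minPartitions` option mentioned above breaks the default 1:1 mapping between Kafka topicPartitions and Spark partitions, letting the read fan out across more tasks. A sketch (broker, topic, and the value 48 are placeholders to size against your cluster; this needs a Spark session with the Kafka connector):

```python
# Ask Spark to split Kafka offset ranges into at least 48 partitions,
# even if the topic has fewer topicPartitions.
df = (spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker-1:9092")
      .option("subscribe", "events")
      .option("minPartitions", "48")
      .load())
```

Higher values increase read parallelism at the cost of more, smaller Kafka fetches per micro-batch.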
Leszek
by Contributor
  • 10337 Views
  • 6 replies
  • 5 kudos

Resolved! Unity Catalog - Azure account console - how to access?

I'm trying to access the account console in Azure, but I can only see the list of workspaces and access them. I didn't find documentation about the account console for Azure. Do you know how to access the account console?

Latest Reply
vimalii
New Contributor II
  • 5 kudos

Hello @Leszek​. Please tell me, did it work for you? Did you find the root cause? I still don't understand why I should grant myself extra permissions if I am already a global administrator, owner of the subscription, and owner of the Databricks workspace, but...

5 More Replies
drii_cavalcanti
by New Contributor III
  • 1427 Views
  • 3 replies
  • 0 kudos

Databricks App with DAB

Hi All, I am trying to deploy a DBX APP via DAB; however, source_code_path seems not to be parsed correctly into the app configuration.
- dbx_dash/
-- resources/
---- app.yml
-- src/
---- app.yaml
---- app.py
-- databricks.yml
resources/app.yml:
resources:
  apps: m...

Latest Reply
NandiniN
Databricks Employee
  • 0 kudos

Hi Adriana, have you adjusted the root_path in your databricks.yml? Kindly add /Workspace and the entire path to the root_path. Thanks

2 More Replies
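The suggestion above as a config sketch: `databricks.yml` accepts an absolute workspace root under `workspace.root_path`, which the bundle's relative paths (including an app's `source_code_path`) resolve against. The user path below is a placeholder:

```yaml
# databricks.yml fragment: anchor the bundle at an absolute /Workspace path.
workspace:
  root_path: /Workspace/Users/someone@example.com/.bundle/dbx_dash/${bundle.target}
```

If unset, bundles default to a `.bundle` folder under the deploying user's home, which is often fine; an explicit path mainly helps when another resource must reference the deployed files by absolute location.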
p_romm
by New Contributor III
  • 3910 Views
  • 1 reply
  • 0 kudos

INVALID_HANDLE.SESSION_NOT_FOUND

We run several workflows and tasks in parallel using serverless compute. In many different places in the code we started to get errors as below. It looks like when one task fails, every other task running at the same moment fails as well. After retry on on...

Latest Reply
NandiniN
Databricks Employee
  • 0 kudos

Hi, The error INVALID_HANDLE.SESSION_NOT_FOUND (https://docs.databricks.com/aws/en/error-messages/invalid-handle-error-class#session_not_found) is a handled error, but the grpc errors are something where more improvements are being pushed in eve...

Sega2
by New Contributor III
  • 3542 Views
  • 1 reply
  • 0 kudos

spark.sql makes debugger freeze

I have just created a simple bundle with Databricks, and am using Databricks Connect to debug locally. This is my script:
from pyspark.sql import SparkSession, DataFrame

def get_taxis(spark: SparkSession) -> DataFrame:
    return spark.read.table("samp...

Latest Reply
NandiniN
Databricks Employee
  • 0 kudos

Ensure that your Databricks Connect is properly set up and is using the correct version compatible with your cluster’s runtime. For VS Code, any mismatches between the installed databricks-connect Python package version and the cluster runtime could ...

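A quick way to check the version pairing described above: the installed `databricks-connect` major.minor must match the cluster's Databricks Runtime. The commands below are a sketch (the 15.4 version is an illustrative example, not a recommendation):

```shell
# Show the locally installed client version.
pip show databricks-connect

# Pin the client to the cluster's DBR version, e.g. a 15.4 runtime.
pip install --upgrade "databricks-connect==15.4.*"
```

A mismatch typically surfaces as hangs or opaque protocol errors rather than a clear version message, which matches the frozen-debugger symptom in this thread.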