Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.
Forum Posts

naga_databricks
by Contributor
  • 1380 Views
  • 1 replies
  • 0 kudos

Overwriting same table

I have a table A that is used in spark.sql and joined with multiple other tables to get data. This data is then overwritten back to the same table A. When I tried this, I consistently got the error below: ERROR: An error occurred while calling o382.save...

Latest Reply
naga_databricks
Contributor
  • 0 kudos

This turned out to be a transient error. Once I restarted the cluster, the overwrite was successful.

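For anyone hitting a non-transient version of this, one common way to sidestep read-and-overwrite conflicts on the same table is to stage the join result first, then replace the table from the staged copy. A rough Databricks SQL sketch, with hypothetical table and column names:

```sql
-- Hypothetical tables/columns: materialize the join output first,
-- so A is never read and overwritten in the same step
CREATE OR REPLACE TABLE A_staged AS
SELECT a.id, a.name, b.extra
FROM A AS a
JOIN B AS b ON a.id = b.id;

-- Replace A from the staged result
CREATE OR REPLACE TABLE A AS
SELECT * FROM A_staged;

DROP TABLE A_staged;
```

The extra write costs storage and time, but it breaks the lineage between the read and the overwrite.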
Babu_Krishnan
by Contributor
  • 1300 Views
  • 1 replies
  • 0 kudos

Why is my DLT not working with UC?

My IAM profile is not working when accessing SQS for file-notification-based ingestion.

Latest Reply
jacovangelder
Honored Contributor
  • 0 kudos

I'm not sure if I fully understand the question, but what location are you monitoring? Is it a DBFS path or mount? If so, consider using a UC Volume. 

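Following up on the UC Volume suggestion: with Unity Catalog you can point ingestion at a volume path instead of a DBFS path or mount. A minimal Databricks SQL sketch, assuming hypothetical catalog/schema/volume names:

```sql
-- Hypothetical volume path: stream new JSON files landing in a UC Volume
CREATE OR REFRESH STREAMING TABLE raw_events AS
SELECT *
FROM STREAM read_files(
  '/Volumes/main/landing/events/',
  format => 'json'
);
```

Volume paths are governed by Unity Catalog, which avoids the IAM/mount permission mismatches that DBFS mounts can run into.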
stevenayers-bge
by Contributor
  • 1380 Views
  • 2 replies
  • 1 kudos

Bug with enabling UniForm Data Format?

In the documentation for enabling Iceberg compatibility on Delta tables, it states that the minReaderVersion for IcebergCompatV1 and IcebergCompatV2 is 2 (https://docs.databricks.com/en/delta/uniform.html#requirements). However, when you run the REORG...

Latest Reply
daniel_sahal
Esteemed Contributor
  • 1 kudos

@stevenayers-bge I've just checked the source code of Delta and you're right: the documentation states that minReaderVersion should be >= 2, but the source code upgrades it to 3: https://github.com/delta-io/delta/blob/78970abd96dfc0278e21c04cda442bb05ccde4...

1 More Replies
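For anyone landing here, the command in question looks roughly like the following, and DESCRIBE DETAIL shows which minReaderVersion the table actually ends up on (table name is hypothetical):

```sql
-- Enable Iceberg compatibility (UniForm) on an existing Delta table
REORG TABLE my_table APPLY (UPGRADE UNIFORM(ICEBERG_COMPAT_VERSION = 2));

-- Inspect the resulting protocol versions
-- (minReaderVersion / minWriterVersion columns)
DESCRIBE DETAIL my_table;
```

Comparing the DESCRIBE DETAIL output against the documented requirements is the quickest way to confirm the version mismatch described above.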
MaximeGendre
by New Contributor III
  • 1266 Views
  • 1 replies
  • 0 kudos

How to disable DBFS storage

Hello, I administer a self-service-oriented Databricks workspace, and I notice that more and more users are storing their data in DBFS due to lack of knowledge. They are not specifying a location when creating their schema, or they are not specifying a s...

(attached screenshot: MaximeGendre_1-1718227315313.png)
Latest Reply
MaximeGendre
New Contributor III
  • 0 kudos

Replacing "/mnt/adl2" with "dbfs:/mnt/adl2" fixed the issue.

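For the broader goal of keeping data off DBFS, one option is to give each schema a managed location on external storage, so tables created without an explicit location never land in the DBFS root. A sketch with a hypothetical schema name and ADLS path (assumes a Unity Catalog external location already covers that path):

```sql
-- Hypothetical names: managed tables in this schema will be stored
-- under the ADLS path below instead of DBFS
CREATE SCHEMA IF NOT EXISTS analytics
MANAGED LOCATION 'abfss://data@mystorageaccount.dfs.core.windows.net/analytics';
```

Setting a managed location at the catalog or schema level changes the default for everyone, which fits a self-service workspace better than educating each user individually.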
sagarsiddhabha
by New Contributor
  • 528 Views
  • 0 replies
  • 0 kudos

Attended Data + AI Summit at SFO

It was a great experience attending this conference. I got great insights about new features and learned about new advances in the data industry. Attending the conference was an enriching and transformative experience. I gained invaluable insights into the lat...

mscsu
by New Contributor
  • 685 Views
  • 0 replies
  • 0 kudos

Unity catalog

Great learning on serverless compute, Unity Catalog, etc.

Manjula_Ganesap
by Contributor
  • 611 Views
  • 0 replies
  • 0 kudos

Autoloader on ADLS blobs with archival enabled

Hi all, I'm trying to change our ingestion process to use Auto Loader to identify new files landing in a directory on ADLS. The ADLS directory has an access-tier policy enabled that archives files older than a certain time period. When I'm trying to set up Autoloa...

Trilleo
by New Contributor III
  • 5624 Views
  • 4 replies
  • 2 kudos

Resolved! Handle updates from bronze to silver table stream

Hi Databricks Community, I am trying to stream from a bronze to a silver table; however, I have the problem that there may be updates in the bronze table. Delta table streaming reads and writes do not support skipChangeCommits=false, i.e. handle mo...

Latest Reply
Himali_K
New Contributor II
  • 2 kudos

Hi, you can use DLT APPLY CHANGES to deal with a changing source. See: Delta Live Tables Python language reference | Databricks on AWS. Thank you.

3 More Replies
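The APPLY CHANGES approach from the reply looks roughly like this in DLT SQL; table, key, and sequence column names below are hypothetical:

```sql
-- Target streaming table for the silver layer
CREATE OR REFRESH STREAMING TABLE silver;

-- Apply inserts and updates from bronze, keyed on id and
-- ordered by updated_at; SCD TYPE 1 keeps only the latest row
APPLY CHANGES INTO silver
FROM STREAM(bronze)
KEYS (id)
SEQUENCE BY updated_at
STORED AS SCD TYPE 1;
```

This lets the framework reconcile updates from bronze instead of the stream failing on change commits.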
aliehs0510
by New Contributor II
  • 848 Views
  • 1 replies
  • 1 kudos

DLT Pipeline does not create the view but it shows up on the DLT graph

I wanted a more filtered data set from a materialized view, so I figured a view might be the solution, but it doesn't get created under the target schema; however, it shows up in the graph as part of the pipeline. Can't we use MVs as a data source for...

Latest Reply
Rishabh-Pandey
Esteemed Contributor
  • 1 kudos

Issue at hand: you mentioned that a view is not created under the target schema but appears in the DLT graph. This situation arises from how DLT manages views and materialized views. Possible causes and solutions: DLT execution and target schema: in DL...

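To make the distinction concrete: a plain (temporary) view in a DLT pipeline is scoped to the pipeline and only appears in the graph, whereas a materialized view is published to the target schema. A DLT SQL sketch with hypothetical names:

```sql
-- Published to the target schema: queryable in the catalog afterwards
CREATE OR REFRESH MATERIALIZED VIEW filtered_orders AS
SELECT * FROM orders WHERE status = 'active';

-- Pipeline-internal only: shows up in the DLT graph but is not
-- created under the target schema (older LIVE keyword syntax;
-- newer runtimes also accept CREATE TEMPORARY VIEW)
CREATE TEMPORARY LIVE VIEW orders_tmp AS
SELECT * FROM orders;
```

So if the goal is a filtered object that other consumers can query, a materialized view is usually the right choice.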
sanket-kelkar
by New Contributor II
  • 8304 Views
  • 4 replies
  • 1 kudos

Databricks costing - Need details of the Azure VM costs

Hi all, we are using the Azure Databricks platform for one of our data engineering needs. Here's my setup: 1. Job compute that uses a cluster of 1 driver and 2 workers, all of 'Standard_DS3_v2' type (Photon is disabled). 2. The job compute takes th...

Latest Reply
GuillermoM
New Contributor II
  • 1 kudos

To calculate the real cost of an Azure cluster or job, there are two ways: DIY, which means querying the Microsoft Cost API and the Databricks API and then combining the information to get the exact cost, or you can use a tool such as KopiCloud Databrick...

3 More Replies
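As a starting point for the DIY route, the Unity Catalog billing system tables can approximate the DBU side of the bill; the VM side still has to come from Azure Cost Management. A rough sketch (it ignores price validity windows and currency, so treat it as an approximation only):

```sql
-- Approximate daily DBU spend by joining usage against list prices;
-- real billing may differ due to discounts and price changes over time
SELECT
  u.usage_date,
  SUM(u.usage_quantity * p.pricing.default) AS approx_dbu_cost
FROM system.billing.usage AS u
JOIN system.billing.list_prices AS p
  ON u.sku_name = p.sku_name
GROUP BY u.usage_date
ORDER BY u.usage_date;
```

Combining this with the per-VM cost exported from Azure gives the "exact cost" figure the reply describes.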
