Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

Shan3009
by New Contributor III
  • 5109 Views
  • 3 replies
  • 5 kudos

The transaction log has failed integrity checks. We recommend you contact Databricks support for assistance. Failed verification at version 48 of:

I am trying to write DataFrame data into a Delta table. Previously it was working fine, but now it is throwing "Log has failed integrity checks".

Latest Reply
jcasanella
New Contributor III

@Shanmuganathan Jothikumar​ I have the same exception after upgrading to Unity Catalog. I need to investigate a little more, but adding the following setting makes it work: spark.conf.set("spark.databricks.delta.state.corruptionIsFatal", False)
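For reference, the setting mentioned in the replies is a plain Spark conf change. A minimal sketch of applying it in a Databricks notebook before the write (`spark` is the notebook's SparkSession); note this only suppresses the integrity check, it does not repair the transaction log, so contacting support is still advisable:

```python
# Workaround, not a fix: treat Delta transaction-log state corruption
# as non-fatal so the write can proceed. The underlying log issue remains.
spark.conf.set("spark.databricks.delta.state.corruptionIsFatal", False)
```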

2 More Replies
patojo94
by New Contributor II
  • 3802 Views
  • 3 replies
  • 4 kudos

Resolved! pyspark streaming failed for no reason

Hi everyone, I have a pyspark streaming job reading from an AWS Kinesis stream that suddenly failed for no reason (I mean, we did not make any changes recently). It is giving the following error: ERROR MicroBatchExecution: Query kinesis_events_prod_bronz...

Latest Reply
jcasanella
New Contributor III

@patricio tojo​ I have the same problem, though in my case it appeared after migrating to Unity Catalog. I need to investigate a little more, but adding this to my Spark job makes it work: spark.conf.set("spark.databricks.delta.state.corruptionIsFatal", False)

2 More Replies
absolutelyRice
by New Contributor III
  • 9526 Views
  • 5 replies
  • 2 kudos

Resolved! Databricks Terraform Provider Issues Passing Providers to Child Modules

I have been following the Terraform Databricks provider documentation in order to provision account-level resources on AWS. I can create the workspace fine, add users, etc. However, when I go to use the provider in non-MWS mode, I am re...

Latest Reply
absolutelyRice
New Contributor III

So the answer to this was that you need to explicitly pass the provider argument to each of the data resource blocks. The docs should be updated to reflect that, i.e. data "databricks_spark_version" "latest" { provider = databricks.workspace ...

4 More Replies
deficiant_codge
by Contributor II
  • 2298 Views
  • 1 replies
  • 7 kudos

Delta Live tables support for UNITY CATALOG

Is there any upcoming update in which UC will support DLT? If yes, is there an expected ETA?

Latest Reply
Pat
Honored Contributor III

Hi @Rahul Mishra​, I think you need to contact your company's Databricks representative for this. The last time I heard about the ETA, it was the end of November, I believe. You might try to join the Databricks Office Hours tomorrow and ask the question or ...

Ranjeeth
by New Contributor
  • 1820 Views
  • 1 replies
  • 2 kudos
Latest Reply
Pat
Honored Contributor III

Hi @Ranjeeth Rikkala​, you can check this: https://docs.databricks.com/optimizations/index.html#optimization-recommendations-on-databricks TBH, it's not enough info. Make sure you have partitioned data and that files are not too small, and try to use a bigger cluster, ...

HariSelvarajan
by Databricks Employee
  • 831 Views
  • 0 replies
  • 5 kudos

DAIWT22_RadicalSpeedInLakehouse_Photon

Topic: Radical Speed on the Lakehouse: Photon under the hood. I am Hari and I work as a Specialist Solutions Architect at Databricks. I specialise in data engineering and cloud platform problems, helping clients in EMEA. Purpose: I recently presented a t...

Sourav_Gulati
by Databricks Employee
  • 921 Views
  • 0 replies
  • 7 kudos

This post is regarding the 'Data Streaming on the Lakehouse' session in Data + AI World Tour 2022 in London. I am a Resident Solutions Architect at Datab...

This post is regarding the 'Data Streaming on the Lakehouse' session in Data + AI World Tour 2022 in London. I am a Resident Solutions Architect at Databricks. I specialise in data engineering. In this session, I talked about how to leverage real-time data t...

Digan_Parikh
by Valued Contributor
  • 13768 Views
  • 2 replies
  • 3 kudos

Resolved! Default Query Limit 1000

By default, we return up to 1000 query results when a user runs a cell in Databricks. E.g., if you run display(storeData) and you have ten million customers, the UI will show the first 1000 results. If you graph that by age of customer, similarl...

Latest Reply
User16805453151
New Contributor III

This is simple in Databricks SQL: just uncheck LIMIT 1000 in the drop-down.

1 More Replies
Digan_Parikh
by Valued Contributor
  • 6940 Views
  • 2 replies
  • 2 kudos

Resolved! Default Query Limit 1000

Is there any way to change the 1000 for the display row limit at workspace, cluster and notebook level?

Latest Reply
User16805453151
New Contributor III

This is simple in Databricks SQL: just uncheck LIMIT 1000 in the drop-down.

1 More Replies
labromb
by Contributor
  • 8243 Views
  • 4 replies
  • 8 kudos

Resolved! Create Databricks tables dynamically

Hi, I would like to be able to do something like this: create table if not exists table1 using parquet location = '/mnt/somelocation'. The location needs to be a concatenation of a static and a code-generated string. Documentation suggests that location onl...

Latest Reply
PrasanthM
New Contributor III

Python f-strings can be used, for example: spark.sql(f"CREATE TABLE {table_name} (id INT, name STRING, value DOUBLE, state STRING)")
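A minimal, self-contained sketch of the f-string approach, extended to the dynamic location from the original question (the table name, base path, and generated suffix are hypothetical; in a Databricks notebook you would pass the resulting string to spark.sql):

```python
# Build the DDL with an f-string; all names here are placeholders.
table_name = "table1"
base_path = "/mnt/somelocation"
suffix = "2022_11"  # e.g. a code-generated part of the path

ddl = (
    f"CREATE TABLE IF NOT EXISTS {table_name} "
    f"USING PARQUET LOCATION '{base_path}/{suffix}'"
)
# In a Databricks notebook: spark.sql(ddl)
print(ddl)
```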

3 More Replies
atul1146
by New Contributor III
  • 2700 Views
  • 2 replies
  • 5 kudos

Resolved! Databricks set up in Prod environment

Hi! Can anyone please help me with documentation that can help me set up the integration between Databricks and AWS without the QuickStart default CloudFormation template? I would want to use my own CFT rather than the default due to security ...

Latest Reply
Pat
Honored Contributor III

Hi @Atul S​, I think that Terraform is the recommended way to go for Databricks deployment; it's also supported now by Databricks support. I haven't looked much at the CloudFormation setup, because we decided to go with Terraform in the comp...

1 More Replies
LJ
by New Contributor II
  • 10953 Views
  • 1 replies
  • 4 kudos

Resolved! Accept widget value during runtime from user

list1 = ('alpha', 'beta', 'gamma', 'eta', 'Theta')
list2 = ('alpha', 'beta')
df1 = spark.createDataFrame(list1, 'String').withColumnRenamed('value', 'dataset')
df2 = spark.createDataFrame(list2, 'String').withColumnRenamed('value', 'dataset')
df = df1.ex...

Latest Reply
Hubert-Dudek
Esteemed Contributor III

Every time the user changes the selection in the widget, you get that selection. The behavior after the dropdown changes can be defined in the widget settings.
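As a sketch of the widget flow described above (notebook-only: dbutils is predefined in Databricks; the widget name and choices are hypothetical):

```python
# Create a dropdown widget; whether the notebook re-runs when the user
# picks a new value is controlled by the widget settings ("On Widget Change").
choices = ["alpha", "beta", "gamma"]
dbutils.widgets.dropdown("dataset", choices[0], choices, "Dataset")

selected = dbutils.widgets.get("dataset")  # the current user selection
```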

AriB101
by New Contributor
  • 1114 Views
  • 1 replies
  • 1 kudos

Attended 18 Oct 22 webinar but didn't receive voucher

Hi @Kaniz Fatma​, I attended the webinar on 18th Oct and uploaded the Data Lakehouse certification, but I haven't received the voucher as of now; I also didn't receive the Data Engineering Associate certificate. Please help!

Latest Reply
Anonymous
Not applicable

Hi @Arindam Bose​, just a friendly follow-up: have you got your certification and badge? If yes, please mark the answer as best. Thanks and regards

