Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.
Data + AI Summit 2024 - Data Engineering & Streaming

Forum Posts

Viren123
by Contributor
  • 9246 Views
  • 2 replies
  • 3 kudos

Resolved! Error : MALFORMED_REQUEST

Hello, I get an error for the JSON below. Can you please advise what I am missing here? {    "error_code": "MALFORMED_REQUEST",    "message": "Invalid JSON given in the body of the request - failed to parse given JSON" ...

Latest Reply
Viren123
Contributor
  • 3 kudos

Thank you Pat. I just realised I had copy-pasted the code and it didn't quite copy 1:1. In the copy-paste the double quotes appeared to be copied, but behind the scenes it was some other character. Nothing wrong with the code; it works.
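A minimal Python sketch of the failure mode described in this reply, for anyone hitting the same MALFORMED_REQUEST error: the payload below is hypothetical, and the only point is that curly "smart" quotes picked up during copy-paste are not valid JSON delimiters.

import json

# Straight double quotes: valid JSON, parses fine.
good_payload = '{"error_code": "MALFORMED_REQUEST"}'
print(json.loads(good_payload))

# Curly quotes often sneak in when copying from rich-text sources and are
# not valid JSON string delimiters, so parsing fails with a decode error.
bad_payload = '{“error_code”: “MALFORMED_REQUEST”}'
try:
    json.loads(bad_payload)
except json.JSONDecodeError as err:
    print(f"Invalid JSON given in the body of the request: {err}")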

1 More Replies
User16752245636
by New Contributor II
  • 636 Views
  • 0 replies
  • 1 kudos

Sergio Copy Data World tour Orchestration_v02.pptx (4)

Databricks Workflows: Reliable orchestration for data, analytics, and AI. Hello, I am a Solutions Architect at Databricks, and I recently presented Databricks Workflows at the Data and AI World Tour in London. You can see some of the presented slides a...

al_joe
by Contributor
  • 3837 Views
  • 5 replies
  • 1 kudos

Resolved! Databricks Academy LMS - shortcomings

I am using Databricks Academy extensively and see a few shortcomings in the (new) LMS. Feedback: - Videos do not have captions/transcripts to make them accessible to all audiences, including non-native English speakers. - The "FullScreen" icon does not work...

Latest Reply
User16847923431
Databricks Employee
  • 1 kudos

Hi @al_joe! My name is Astrid, and I’m on the curriculum team here at Databricks. I was able to review your original post in this thread and wanted to update you on some of the input and questions you provided: Videos do not have captions/transcript ...

4 More Replies
Shan3009
by New Contributor III
  • 4342 Views
  • 3 replies
  • 5 kudos

The transaction log has failed integrity checks. We recommend you contact Databricks support for assistance. Failed verification at version 48 of:

I am trying to write DataFrame data into a Delta table. Previously it was working fine, but now it is throwing "Log has failed integrity checks".

Latest Reply
jcasanella
New Contributor III
  • 5 kudos

@Shanmuganathan Jothikumar I have the same exception after upgrading to Unity Catalog. I need to investigate a little more, but after adding the following setting it works: spark.conf.set("spark.databricks.delta.state.corruptionIsFatal", False)
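A minimal PySpark sketch of the workaround quoted in this reply; the DataFrame and table name are hypothetical, and the flag only downgrades the Delta state integrity check from fatal to non-fatal, so the underlying log issue may still deserve investigation.

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Workaround from the reply: do not fail the job on the Delta state corruption check.
spark.conf.set("spark.databricks.delta.state.corruptionIsFatal", False)

# Hypothetical write that previously hit "The transaction log has failed integrity checks".
df = spark.range(10)
df.write.format("delta").mode("append").saveAsTable("my_table")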

2 More Replies
patojo94
by New Contributor II
  • 3360 Views
  • 3 replies
  • 4 kudos

Resolved! pyspark streaming failed for no reason

Hi everyone, I have a pyspark streaming job reading from an AWS Kinesis stream that suddenly failed for no reason (I mean, we have not made any changes recently). It is giving the following error: ERROR MicroBatchExecution: Query kinesis_events_prod_bronz...

Latest Reply
jcasanella
New Contributor III
  • 4 kudos

@patricio tojo I have the same problem, although in my case it is after migrating to Unity Catalog. I need to investigate a little more, but after adding this to my Spark job, it works: spark.conf.set("spark.databricks.delta.state.corruptionIsFatal", False)

2 More Replies
absolutelyRice
by New Contributor III
  • 7601 Views
  • 5 replies
  • 2 kudos

Resolved! Databricks Terraform Provider Issues Passing Providers to Child Modules

I have been following the Terraform Databricks provider documentation in order to provision account-level resources on AWS. I can create the workspace fine, add users, etc... However, when I go to use the provider in non-MWS mode, I am re...

Latest Reply
absolutelyRice
New Contributor III
  • 2 kudos

So the answer to this was that you need to explicitly pass the provider argument to each of the data resource blocks. The docs should be updated to reflect that, i.e. data "databricks_spark_version" "latest" { provider = databricks.workspace ...

4 More Replies
deficiant_codge
by Contributor II
  • 2119 Views
  • 1 reply
  • 7 kudos

Delta Live tables support for UNITY CATALOG

Is there any upcoming update in which UC will support DLT? If yes, is there any expected ETA?

Latest Reply
Pat
Honored Contributor III
  • 7 kudos

Hi @Rahul Mishra, I think you need to contact your company's Databricks representative for this. The last time I heard about the ETA it was the end of November, I believe. You might try to join the Databricks Office Hours tomorrow and ask the question, or ...

Ranjeeth
by New Contributor
  • 1556 Views
  • 1 reply
  • 2 kudos
Latest Reply
Pat
Honored Contributor III
  • 2 kudos

Hi @Ranjeeth Rikkala, you can check this: https://docs.databricks.com/optimizations/index.html#optimization-recommendations-on-databricks TBH, it's not enough info. Make sure you have partitioned data, that files are not too small, and try to use a bigger cluster, ...
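A minimal PySpark sketch of the recommendations in this reply (partitioned data, avoiding small files); the paths, table, and column names are hypothetical, and OPTIMIZE assumes Delta Lake on Databricks.

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Hypothetical source data; partition by a low-cardinality column such as a date.
events = spark.read.json("/mnt/raw/events")

(events.write
    .format("delta")
    .partitionBy("event_date")   # partitioned data, as the reply suggests
    .mode("overwrite")
    .saveAsTable("analytics.events"))

# Compact small files so queries do not have to open thousands of tiny objects.
spark.sql("OPTIMIZE analytics.events")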

HariSelvarajan
by Databricks Employee
  • 721 Views
  • 0 replies
  • 5 kudos

DAIWT22_RadicalSpeenInLakehouse_Photon

Topic: Radical Speed on the Lakehouse: Photon under the hood. I am Hari and I work as a Specialist Solutions Architect at Databricks. I specialise in data engineering and cloud platform problems, helping clients in EMEA. Purpose: I recently presented a t...

Sourav_Gulati
by Databricks Employee
  • 814 Views
  • 0 replies
  • 7 kudos

This post is regarding the 'Data Streaming on the Lakehouse' session at the Data + AI World Tour 2022 in London. I am a Resident Solutions Architect at Datab...

This post is regarding the 'Data Streaming on the Lakehouse' session at the Data + AI World Tour 2022 in London. I am a Resident Solutions Architect at Databricks. I specialise in data engineering. In this session, I talked about how to leverage real-time data t...

Digan_Parikh
by Valued Contributor
  • 10870 Views
  • 2 replies
  • 3 kudos

Resolved! Default Query Limit 1000

By default, we return up to 1000 query results when a user runs a cell in Databricks. E.g., if you run display(storeData) and you have ten million customers, the UI will show the first 1000 results. If you graph that by age of customer, similarl...
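A minimal sketch of the behaviour described above, assuming a Databricks notebook where spark and display() are predefined; the DataFrame is a hypothetical stand-in for storeData.

# Hypothetical stand-in for storeData with ten million rows.
store_data = spark.range(10_000_000).withColumnRenamed("id", "customer_id")

# display() renders at most the first 1000 rows, so a chart built from the
# rendered sample reflects only those 1000 customers.
display(store_data)

# The full dataset is still available to computations; aggregations run over all rows.
print(store_data.count())   # 10000000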

Latest Reply
User16805453151
New Contributor III
  • 3 kudos

This is simple in Databricks SQL: just uncheck LIMIT 1000 in the drop-down.

1 More Replies
Digan_Parikh
by Valued Contributor
  • 3183 Views
  • 2 replies
  • 2 kudos

Resolved! Default Query Limit 1000

Is there any way to change the 1000 for the display row limit at workspace, cluster and notebook level?

Latest Reply
User16805453151
New Contributor III
  • 2 kudos

This is simple in Databricks SQL: just uncheck LIMIT 1000 in the drop-down.

1 More Replies
labromb
by Contributor
  • 6825 Views
  • 4 replies
  • 8 kudos

Resolved! Create Databricks tables dynamically

Hi, I would like to be able to do something like this... create table if not exists table1 using parquet location = '/mnt/somelocation'. The location needs to be a concatenation of a static and a code-generated string. Documentation suggests that location onl...

Latest Reply
PrasanthM
New Contributor III
  • 8 kudos

Python f-strings can be used. Example: spark.sql(f"CREATE TABLE {table_name} (id INT, name STRING, value DOUBLE, state STRING)")
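A minimal sketch combining the original question (a LOCATION built from a static prefix plus a code-generated suffix) with the f-string approach from this reply; the paths and table name are hypothetical, and spark is assumed to be the notebook's predefined session.

# Static prefix plus a generated suffix, as the original question describes.
base_path = "/mnt/somelocation"
run_suffix = "2022/11"        # hypothetical code-generated value
table_name = "table1"
location = f"{base_path}/{run_suffix}"

spark.sql(f"""
    CREATE TABLE IF NOT EXISTS {table_name}
    USING PARQUET
    LOCATION '{location}'
""")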

3 More Replies
atul1146
by New Contributor III
  • 2190 Views
  • 2 replies
  • 5 kudos

Resolved! Databricks set up in Prod environment

Hi! Can anyone please help me with documentation for setting up the integration between Databricks and AWS without the default QuickStart CloudFormation template? I would want to use my own CFT rather than the default due to security ...

Latest Reply
Pat
Honored Contributor III
  • 5 kudos

Hi @Atul S, I think that Terraform is the recommended way to go for Databricks deployment; it's also now covered by Databricks support. I haven't looked much at the CloudFormation setup, because we decided to go with Terraform in the comp...

1 More Replies

Connect with Databricks Users in Your Area

Join a Regional User Group to connect with local Databricks users. Events will be happening in your city, and you won’t want to miss the chance to attend and share knowledge.

If there isn’t a group near you, start one and help create a community that brings people together.
