Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

Kavin
by Databricks Partner
  • 2618 Views
  • 1 reply
  • 2 kudos

Issue converting the datasets into JSON

I'm a newbie to Databricks, and I need to convert the datasets into JSON. I tried both FOR JSON AUTO and FOR JSON PATH, but I'm getting an error: [PARSE_SYNTAX_ERROR] Syntax error at or near 'json'. My query works fine without FOR JSON AUTO and FOR...

Latest Reply
Debayan
Databricks Employee
  • 2 kudos

Hi @Kavin Natarajan, could you please go through https://www.tutorialkart.com/apache-spark/spark-write-dataset-to-json-file-example/? The steps there look okay.

  • 2 kudos
ae20cg
by New Contributor III
  • 7271 Views
  • 3 replies
  • 3 kudos

Databricks web terminal not able to parse notebooks.

In the web terminal, I am not able to search for text, for example using grep -l "search_term" db_notebook. This leads to an "Operation not permitted on <notebook>" error. Any ideas why? This happens for all DB notebooks in my cluster. Thanks!

Latest Reply
Debayan
Databricks Employee
  • 3 kudos

Hi @Andrej Erkelens, could you please start the shell with %sh? Also, could you please provide the full screenshot of the error here, along with the complete command you tried?

  • 3 kudos
2 More Replies
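One likely cause of the "Operation not permitted" error: notebooks are not plain files on the driver's filesystem, so grep cannot open them; searching only works on real files, such as a Repos checkout or exported notebook sources. As an illustrative stand-in for grep -l over exported sources (paths and file contents below are made up):

```python
from pathlib import Path

def grep_l(term, root, pattern="*.py"):
    """Return paths of files under `root` whose text contains `term`
    (a minimal equivalent of `grep -l term root/*.py`)."""
    return [p for p in Path(root).rglob(pattern)
            if term in p.read_text(errors="ignore")]

# Demo with throwaway files standing in for exported notebooks:
root = Path("/tmp/nb_export")
root.mkdir(parents=True, exist_ok=True)
(root / "nb1.py").write_text('df = spark.read.table("events")\n')
(root / "nb2.py").write_text('print("hello")\n')

print(grep_l("spark.read", root))
```

The same search against notebook paths directly would fail, because those entries are workspace objects, not readable files.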
StanleyTang
by New Contributor III
  • 3686 Views
  • 3 replies
  • 4 kudos

How to run SQL queries from services when data migrated from SQL server to data lake?

Currently our service provides an API to serve purchase records. The purchase records are stored in a SQL database. To simplify: when users want to get their recent purchase records, they make an API call. The API call will run a SQL query on the D...

Latest Reply
Debayan
Databricks Employee
  • 4 kudos

Hi @Stanley Tang, there are several REST API resources managed by Databricks; you can refer to https://docs.databricks.com/dev-tools/api/latest/index.html. In this scenario, the SQL Warehouses API can be used: https://docs.databricks.com/sql/api/sql-endpo...

  • 4 kudos
2 More Replies
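To make the suggestion concrete: a service can run SQL against a warehouse over REST. The sketch below only builds the request; the endpoint path is based on the Databricks SQL Statement Execution API, and the host, token, warehouse ID, and table names are all placeholders, so check the current API docs before relying on the exact shape.

```python
import json

def build_statement_request(host, token, warehouse_id, sql):
    """Assemble the pieces of a POST to the SQL Statement Execution API
    (endpoint path and body fields are assumptions; verify against docs)."""
    return {
        "url": f"https://{host}/api/2.0/sql/statements",
        "headers": {
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        "body": json.dumps({
            "warehouse_id": warehouse_id,
            "statement": sql,
            "wait_timeout": "30s",
        }),
    }

req = build_statement_request(
    "example.cloud.databricks.com", "dapiXXXX", "warehouse-123",
    "SELECT * FROM purchases WHERE user_id = 42 LIMIT 100",
)
# The API service would then POST it, e.g.:
# requests.post(req["url"], headers=req["headers"], data=req["body"])
```

Keeping request assembly separate from the HTTP call makes the service easy to unit-test without a live workspace.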
gideont
by New Contributor III
  • 5854 Views
  • 2 replies
  • 2 kudos

Resolved! spark sql update really slow

I tried to use Spark as much as possible but am experiencing some regression. Hoping to get some direction on how to use it correctly. I've created a Databricks table using spark.sql: spark.sql('select * from example_view ').write.mode('overwr...

Latest Reply
Pat
Esteemed Contributor
  • 2 kudos

Hi @Vincent Doe, updates are available in Delta tables, but under the hood you are updating parquet files. That means each update needs to find the file where the records are stored, rewrite that file as a new version, and make the new file the current v...

  • 2 kudos
1 More Replies
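Pat's explanation can be sketched with a toy copy-on-write model (purely illustrative; real Delta stores parquet data files plus a JSON transaction log): even a one-row UPDATE rewrites the entire data file that holds the row and repoints the log at the new version, which is why many small row-level updates are slow compared to batched writes.

```python
# Toy model: a "table" is a set of data files plus a log listing the
# files that make up the current version.
files = {"part-0": [(1, "a"), (2, "b")], "part-1": [(3, "c")]}
log = ["part-0", "part-1"]

def update(files, log, key, new_val):
    """Copy-on-write update: any file containing the key is rewritten
    in full; untouched files carry over unchanged."""
    new_log = []
    for name in log:
        rows = files[name]
        if any(k == key for k, _ in rows):
            new_name = name + ".v2"  # whole file rewritten for one row
            files[new_name] = [(k, new_val if k == key else v)
                               for k, v in rows]
            new_log.append(new_name)
        else:
            new_log.append(name)
    return new_log

log = update(files, log, 2, "B")
print(log)  # ['part-0.v2', 'part-1']
```

This is why batching updates, or using MERGE with selective predicates, tends to perform far better than issuing updates row by row.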
ferbystudy
by New Contributor III
  • 5689 Views
  • 3 replies
  • 3 kudos

Resolved! Can't read a simple .CSV from a blob

Guys, I am using "Databricks Community" to study. I put some files in a blob and granted all access, but I have no idea why DB is not reading them. Please see the code below, and thanks for helping!

Latest Reply
ferbystudy
New Contributor III
  • 3 kudos

Guys, I found the problem! First I went to the data lake and set all access to public / granted the user owner access. I had already mounted before, so after these changes you need to unmount and then mount again! Yeah, after that it ...

  • 3 kudos
2 More Replies
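The fix described above (unmount, then mount again after changing permissions, so the mount picks up the new ACLs) can be wrapped in a small helper. This is only a sketch: dbutils exists only inside Databricks, and the source URL and config key below are placeholders for your own storage account.

```python
def remount(dbutils, source, mount_point, extra_configs):
    """Unmount if already mounted, then mount fresh so permission
    changes on the storage account take effect."""
    if any(m.mountPoint == mount_point for m in dbutils.fs.mounts()):
        dbutils.fs.unmount(mount_point)
    dbutils.fs.mount(source=source, mount_point=mount_point,
                     extra_configs=extra_configs)

# On Databricks you would call something like (values are placeholders):
# remount(dbutils,
#         "wasbs://container@account.blob.core.windows.net",
#         "/mnt/studyblob",
#         {"fs.azure.account.key.account.blob.core.windows.net": "<key>"})
# df = spark.read.option("header", True).csv("/mnt/studyblob/file.csv")
```

Checking dbutils.fs.mounts() first avoids the error you get from unmounting a path that was never mounted.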
rams
by Contributor
  • 4374 Views
  • 3 replies
  • 4 kudos

Rollback error - Configuring Databricks lakehouse platform with AWS account

I have logged in to my Databricks account, and while creating the workspace I chose the quickstart approach to configure Databricks with AWS. During the quickstart process the Databricks page redirects to the AWS CloudFormation stack page, where the ac...

Latest Reply
Debayan
Databricks Employee
  • 4 kudos

Hi @rams shonu, the error looks like it is due to the length of the roleName: AWS IAM role names are limited to 64 characters. Could you please try editing the default roleName so it fits within the limit?

  • 4 kudos
2 More Replies
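The 64-character IAM role-name limit mentioned above can be checked before the stack is created. A trivial guard (the limit comes from AWS IAM quotas; the sample name is made up):

```python
MAX_IAM_ROLE_NAME_LEN = 64  # AWS IAM role-name length limit

def role_name_ok(name: str) -> bool:
    """True if the role name fits within the IAM limit."""
    return 0 < len(name) <= MAX_IAM_ROLE_NAME_LEN

# A long auto-generated name like this one would be rejected:
generated = "databricks-workspace-stack-" + "credentials-role-" * 4
print(role_name_ok(generated), len(generated))  # False 95
```

Validating the generated name up front avoids a slow CloudFormation rollback just to surface a naming error.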
Hubert-Dudek
by Databricks MVP
  • 2910 Views
  • 1 reply
  • 29 kudos

The New DBR LTS runtime is here. Databricks Runtime 11.3 is set to be LTS!   

The New DBR LTS runtime is here. Databricks Runtime 11.3 is set to be LTS! 

Latest Reply
jose_gonzalez
Databricks Employee
  • 29 kudos

Great post, thank you for sharing this @Hubert Dudek

  • 29 kudos
Jyo
by New Contributor II
  • 3045 Views
  • 2 replies
  • 0 kudos

Rollback error - during deploying the workspace through quickstart

These are the steps I followed:
1) Under quickstart: added the workspace name, selected N. Virginia (us-east-1), and Quick Start.
2) Next step: except for the password, I haven't edited any of the below: stackname (default) databricks-workspace-stack, parameters...

Latest Reply
rams
Contributor
  • 0 kudos

Hi Jyo, I followed the same "quickstart" approach to configure Databricks with my AWS account and got the same errors. If you have already solved the issue, could you please tell us how? Thank you.

  • 0 kudos
1 More Replies
Viren123
by Contributor
  • 13991 Views
  • 2 replies
  • 3 kudos

Resolved! Error : MALFORMED_REQUEST

Hello, I get an error for the JSON below. Can you please advise what I am missing here? { "error_code": "MALFORMED_REQUEST", "message": "Invalid JSON given in the body of the request - failed to parse given JSON" ...

Latest Reply
Viren123
Contributor
  • 3 kudos

Thank you, Pat. I just realised I had copy-pasted the code and it didn't quite copy 1:1. In the copy-paste the double quotes appeared to be copied, but behind the scenes they were some other character. Nothing is wrong with the code; it works.

  • 3 kudos
1 More Replies
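Viren's diagnosis, quotes that look right but are actually different characters, is a classic copy-paste pitfall: word processors and web pages substitute curly quotes (U+201C/U+201D), which are not valid JSON string delimiters. A quick way to detect and normalize them, using a made-up payload:

```python
import json

# A pasted payload where straight quotes became curly quotes:
payload = '{\u201cname\u201d: \u201cjob1\u201d}'

try:
    json.loads(payload)
except json.JSONDecodeError:
    # Normalize curly quotes to straight quotes, then re-parse.
    fixed = payload.replace("\u201c", '"').replace("\u201d", '"')
    print(json.loads(fixed))  # {'name': 'job1'}
```

When MALFORMED_REQUEST appears for JSON that "looks" correct, re-typing the quotes in a plain-text editor is often all that is needed.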
User16752245636
by Databricks Employee
  • 1238 Views
  • 0 replies
  • 1 kudos

Sergio Copy Data World tour Orchestration_v02.pptx (4)

Databricks Workflows: Reliable orchestration for data, analytics, and AI. Hello, I am a Solutions Architect at Databricks, and I recently presented Databricks Workflows at the Data and AI World Tour in London. You can see some of the presented slides a...

al_joe
by Contributor
  • 7209 Views
  • 5 replies
  • 2 kudos

Resolved! Databricks Academy LMS - shortcomings

I am using Databricks Academy extensively and see a few shortcomings in the (new) LMS. Feedback:
- Videos do not have captions/transcripts to make them accessible to all audiences, including non-native English speakers.
- The "FullScreen" icon does not work...

Latest Reply
User16847923431
Databricks Employee
  • 2 kudos

Hi @al_joe! My name is Astrid, and I’m on the curriculum team here at Databricks. I was able to review your original post in this thread and wanted to update you on some of the input and questions you provided: Videos do not have captions/transcript ...

  • 2 kudos
4 More Replies
Shan3009
by New Contributor III
  • 6556 Views
  • 3 replies
  • 5 kudos

The transaction log has failed integrity checks. We recommend you contact Databricks support for assistance. Failed verification at version 48 of:

I am trying to write DataFrame data into a Delta table. Previously it was working fine, but now it is throwing "Log has failed integrity checks".

Latest Reply
jcasanella
New Contributor III
  • 5 kudos

@Shanmuganathan Jothikumar, I have the same exception after upgrading to Unity Catalog. I need to investigate a little more, but after adding the following setting it works: spark.conf.set("spark.databricks.delta.state.corruptionIsFatal", False)

  • 5 kudos
2 More Replies
patojo94
by New Contributor II
  • 4964 Views
  • 3 replies
  • 4 kudos

Resolved! pyspark streaming failed for no reason

Hi everyone, I have a pyspark streaming job reading from AWS Kinesis that suddenly failed for no reason (I mean, we did not make any changes recently). It is giving the following error: ERROR MicroBatchExecution: Query kinesis_events_prod_bronz...

Latest Reply
jcasanella
New Contributor III
  • 4 kudos

@patricio tojo, I have the same problem, though in my case it is after migrating to Unity Catalog. I need to investigate a little more, but adding this to my Spark job makes it work: spark.conf.set("spark.databricks.delta.state.corruptionIsFatal", False)

  • 4 kudos
2 More Replies