Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

SailajaB
by Valued Contributor III
  • 1963 Views
  • 4 replies
  • 4 kudos

Facing a format issue while converting one type of nested JSON to a brand-new JSON schema

Hi, we are writing our flattened JSON DataFrame out to a user-defined nested JSON schema using PySpark in Databricks, but we are not getting the expected format. Expecting: {"ID":"aaa","c_id":[{"con":null,"createdate":"2015-10-09T00:00:00Z","data":null,"id":"1"},...

Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 4 kudos

As @wereners said, you need to share the code. If it is DataFrame to JSON, you probably need to use StructType with an ArrayType to get that list, but without the code it is hard to help.
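The reshaping the original poster describes (flat rows folded into a nested list per ID) can be sketched in plain Python; the field names come from the "Expecting" snippet above, and the values are illustrative. In PySpark the equivalent would typically be `groupBy("ID")` with `collect_list(struct(...))` before writing to JSON.

```python
import json
from collections import defaultdict

# Flat rows, mimicking a flattened DataFrame (field names taken from the
# "Expecting" snippet in the post; values are made up for illustration).
flat_rows = [
    {"ID": "aaa", "con": None, "createdate": "2015-10-09T00:00:00Z", "data": None, "id": "1"},
    {"ID": "aaa", "con": None, "createdate": "2016-01-01T00:00:00Z", "data": None, "id": "2"},
]

def nest(rows):
    """Group flat rows by ID and fold the remaining fields into a c_id list."""
    grouped = defaultdict(list)
    for row in rows:
        child = {k: v for k, v in row.items() if k != "ID"}
        grouped[row["ID"]].append(child)
    return [{"ID": key, "c_id": children} for key, children in grouped.items()]

print(json.dumps(nest(flat_rows), sort_keys=True))
```

This is just the shape transformation; in Spark the same grouping is done distributed, and the writer emits one JSON object per top-level record.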

3 More Replies
JD2
by Contributor
  • 5036 Views
  • 4 replies
  • 4 kudos

Resolved! Databricks Delta Table

Hello: I am new to Databricks and need a little help with Delta table creation. I am having great difficulty understanding how to create a Delta table. My questions are: do I need to create an S3 bucket for the Delta table? If YES, then do I have to mount it on the mountpoint...

Latest Reply
mathan_pillai
Databricks Employee
  • 4 kudos

Hi Jay, I would suggest starting with a managed Delta table. Please run a simple command: CREATE TABLE events(id long) USING DELTA. This will create a managed Delta table called "events". Then run %sql describe extended events. The above command ...
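The commands from the reply, laid out as two runnable SQL cells (the table name `events` is from the reply; no S3 bucket or mount is needed for a managed table, since the data lands in the metastore's default location):

```sql
-- Create a managed Delta table; Databricks picks the storage location.
CREATE TABLE events (id LONG) USING DELTA;

-- Inspect the table's actual storage location, provider, and schema.
DESCRIBE EXTENDED events;
```

The `Location` row in the DESCRIBE output shows where the managed table's files live, which answers the mounting question: for a managed table you do not mount anything yourself.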

3 More Replies
Siddhesh2525
by New Contributor III
  • 8569 Views
  • 4 replies
  • 4 kudos

How to set retry attempts and an email alert with the error message for a Databricks notebook

How to set retry attempts in a Databricks notebook, in the sense that if any cmd/cell fails (e.g. due to a connection issue), that particular cmd/cell should be rerun.

Latest Reply
Siddhesh2525
New Contributor III
  • 4 kudos

"You can just implement try/except in the cell, handle it using dbutils.notebook.exit(jobId), and other dbutils can help." @HubertDudek, as I am a fresher in Databricks, could you please suggest/explain this to me in detail?
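The try/except retry pattern the earlier reply hints at can be sketched in plain Python. Everything here is illustrative: in a real Databricks notebook the `task` callable would typically wrap `dbutils.notebook.run(...)`, and the names `run_with_retry` and `flaky_task` are not a Databricks API.

```python
import time

def run_with_retry(task, max_retries=3, delay_seconds=1.0):
    """Re-run `task` up to `max_retries` times, re-raising the last error.

    In a Databricks notebook, `task` would typically wrap
    dbutils.notebook.run(...); here it is any zero-argument callable.
    """
    last_error = None
    for attempt in range(1, max_retries + 1):
        try:
            return task()
        except Exception as err:  # e.g. a transient connection issue
            last_error = err
            time.sleep(delay_seconds * attempt)  # simple linear backoff
    raise last_error

# Demo: a task that fails twice, then succeeds on the third attempt.
calls = {"n": 0}
def flaky_task():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient failure")
    return "ok"

result = run_with_retry(flaky_task, max_retries=3, delay_seconds=0)
```

For whole-job retries (rather than per-cell ones), the Jobs UI also has built-in retry and email-alert settings, which is usually the simpler route for the alerting half of the question.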

3 More Replies
Brose
by New Contributor III
  • 13214 Views
  • 9 replies
  • 2 kudos

Creating a delta table Mismatch Input Error

I am trying to create a delta table for streaming data, but I am getting the following error: Error in SQL statement: ParseException: mismatched input 'CREATE' expecting {<EOF>, ';'}(line 2, pos 0). My statement is as follows: %sql DROP TABLE IF EXISTS ...
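The resolved answer is not quoted in this listing, but this particular ParseException commonly appears when a single %sql cell contains a second statement the parser does not expect. A sketch of the usual fix, with an illustrative table name, is to terminate each statement with a semicolon (or split the statements into separate cells):

```sql
-- Ending the first statement with ';' avoids "mismatched input 'CREATE'
-- expecting {<EOF>, ';'}" when two statements share one cell.
DROP TABLE IF EXISTS my_stream_table;

CREATE TABLE my_stream_table (id LONG, ts TIMESTAMP) USING DELTA;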

Latest Reply
Anonymous
Not applicable
  • 2 kudos

@Ambrose Walker - If Jose's answer resolved your issue, would you be happy to mark that post as best? That will help others find the solution more quickly.

8 More Replies
missyT
by New Contributor III
  • 1266 Views
  • 1 replies
  • 0 kudos

ISP Network Question

Some ISPs, like Charter, have their systems configured in such a way that, from a customer's router, the ARP table for all of the IPs in the subnet shows the same MAC address. The IP it hands off through their modem to the CPE router is a /22. When you t...

Latest Reply
Prabakar
Databricks Employee
  • 0 kudos

Hi @Missy Trussell, I don't see this as a Databricks-related question. I would suggest you raise this query on Stack Overflow or a networking-related forum.

JD2
by Contributor
  • 3106 Views
  • 4 replies
  • 7 kudos

Resolved! Lakehouse with Delta Lake Deep Dive Training

Hello: As per the link shown below, I need help finding where I can get the DBC file for the hands-on training. https://www.youtube.com/watch?v=znv4rM9wevc&ab_channel=Databricks Any help is greatly appreciated. Thanks

Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 7 kudos

Thank you for the URL, watching it now.

3 More Replies
Hubert-Dudek
by Esteemed Contributor III
  • 960 Views
  • 0 replies
  • 19 kudos

docs.databricks.com

Databricks Runtime 10.2 Beta is available as of yesterday. More details here: https://docs.databricks.com/release-notes/runtime/10.2.html New features and improvements: Use Files in Repos with Spark Streaming; Databricks Utilities adds an update mount comma...

Hubert-Dudek
by Esteemed Contributor III
  • 1670 Views
  • 2 replies
  • 18 kudos

I thought that Azure Data Factory was built on Spark, but now that I crashed it I see that it is built directly on Databricks :-)

I thought that Azure Data Factory was built on Spark, but now that I crashed it I see that it is built directly on Databricks

Latest Reply
-werners-
Esteemed Contributor III
  • 18 kudos

Correct, because Data Flows were available before their own (MS) Spark pools were. But let's be honest: that is only a good thing.

1 More Replies
guruv
by New Contributor III
  • 4994 Views
  • 4 replies
  • 2 kudos

Resolved! delta table autooptimize vs optimize command

Hi, I have several Delta tables on an Azure ADLS Gen2 storage account, running Databricks runtime 7.3. There are only write/read operations on the Delta tables, no update/delete. As part of the release pipeline, the below commands are executed in a new notebook in...

Latest Reply
-werners-
Esteemed Contributor III
  • 2 kudos

Auto optimize is sufficient unless you run into performance issues; then I would trigger an OPTIMIZE. This will generate files of 1 GB (larger than the standard size from auto optimize), and of course the Z-ORDER if necessary. The suggestion to ...
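The manual compaction step the reply describes can be sketched as a single SQL command; the table and column names here are illustrative, and the ZORDER BY clause should only be added if queries actually filter on that column:

```sql
-- Compact small files into larger ones (~1 GB by default) and
-- co-locate rows by a commonly filtered column.
OPTIMIZE my_delta_table ZORDER BY (event_date);
```

For a write/read-only workload like the one described, running this periodically (e.g. at the end of a release pipeline) rather than on every write is usually enough.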

3 More Replies
MadelynM
by Databricks Employee
  • 1070 Views
  • 0 replies
  • 1 kudos

vimeo.com

Repos let you use Git functionality such as cloning a remote repo, managing branches, pushing and pulling changes, and visually comparing differences upon commit. Here's a quick video (3:56) on setting up a repo for Databricks on AWS. Pre-reqs: Git in...

MadelynM
by Databricks Employee
  • 691 Views
  • 0 replies
  • 0 kudos

vimeo.com

A job is a way of running a notebook either immediately or on a scheduled basis. Here's a quick video (4:04) on how to schedule a job and automate a workflow for Databricks on AWS. To follow along with the video, import this notebook into your worksp...

MadelynM
by Databricks Employee
  • 806 Views
  • 0 replies
  • 1 kudos

vimeo.com

Auto Loader provides Python and Scala methods to ingest new data from a folder location into a Delta Lake table by using directory listing or file notifications. Here's a quick video (7:00) on how to use Auto Loader for Databricks on AWS with Databri...

marchello
by New Contributor III
  • 2596 Views
  • 5 replies
  • 6 kudos

Resolved! register model - need python 3, but get only python 2

Hi all, I'm trying to register a model with Python 3 support but keep getting only Python 2. I can see that runtime 6.0 and above get Python 3 by default, but I don't see a way to set either the runtime version or the Python version during model regi...

Latest Reply
marchello
New Contributor III
  • 6 kudos

Hi team, thanks for getting back to me. Let's put this on hold for now; I will update once it's needed again. It was solely for education purposes, and right now I have quite urgent stuff to do. Have a great day.

4 More Replies
