Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

AJDJ
by New Contributor III
  • 2660 Views
  • 3 replies
  • 1 kudos

Pipeline in Community edition

Hi there, I'm learning Databricks using the Community Edition. I noticed I don't have a way to practice the pipeline feature in Community Edition (the icon below Compute); it says I need to upgrade. Is there any way to practice pipelines and follow the learning lesson...

Latest Reply
Anonymous
Not applicable
  • 1 kudos

Hi @AJ DJ​ Hope all is well! Just wanted to check in to see if you were able to resolve your issue. Would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Thanks!

2 More Replies
gbradley145
by New Contributor III
  • 10929 Views
  • 4 replies
  • 5 kudos

SQL CTE in Databricks or something similar?

%sql
WITH genCTE AS (SELECT MAX(PredID) + 1 AS PredID, 145 AS SystemID FROM TableA UNION ALL SELECT PredID + 1 FROM genCTE)
SELECT * FROM genCTE
When I attempt this, I get an error that genCTE does not exist. There may be a better way to do what I am trying to...
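A minimal sketch of one workaround, assuming the goal is to generate new sequential PredID values starting from MAX(PredID) + 1: Spark SQL does not evaluate recursive CTEs (the CTE name is not visible inside its own definition), so one option is to fetch the current maximum first and add a row_number() offset in PySpark. The new_rows DataFrame is hypothetical; TableA, PredID and SystemID are taken from the question.

from pyspark.sql import functions as F, Window

# Current highest PredID (falls back to 0 on an empty table).
max_id = spark.table("TableA").agg(F.max("PredID")).first()[0] or 0

# Assign consecutive IDs above the current maximum to a batch of new rows.
w = Window.orderBy(F.monotonically_increasing_id())
new_rows_with_ids = (
    new_rows  # hypothetical DataFrame of rows that still need a PredID
    .withColumn("PredID", F.lit(max_id) + F.row_number().over(w))
    .withColumn("SystemID", F.lit(145))
)
new_rows_with_ids.show()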

Latest Reply
Anonymous
Not applicable
  • 5 kudos

Hi @Greg Bradley​ Hope all is well! Just wanted to check in to see if you were able to resolve your issue. Would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Thank...

3 More Replies
William_Scardua
by Valued Contributor
  • 7786 Views
  • 7 replies
  • 3 kudos

uuid in Merge

Hi guys, I'm trying to use uuid in the merge but I always get an error...
import uuid
( df_events.alias("events").merge( source = df_updates.alias("updates"), condition = "events.cod = updates.cod and events.num = updates.num" ).whenMatch...
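A minimal sketch of one way this kind of error is often resolved, assuming the goal is to fill a column with a fresh UUID for newly inserted rows: the Spark SQL uuid() expression is evaluated once per row inside the merge, whereas a Python uuid.uuid4() would yield a single constant value. The id column is hypothetical, and df_events is assumed to be a DeltaTable (merge is a DeltaTable method, not a DataFrame method).

from delta.tables import DeltaTable

(
    df_events.alias("events")          # assumed to be a DeltaTable, not a DataFrame
    .merge(
        source=df_updates.alias("updates"),
        condition="events.cod = updates.cod AND events.num = updates.num",
    )
    .whenMatchedUpdateAll()
    .whenNotMatchedInsert(
        values={
            "id": "uuid()",            # SQL expression, generates one UUID per inserted row
            "cod": "updates.cod",
            "num": "updates.num",
        }
    )
    .execute()
)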

Latest Reply
Anonymous
Not applicable
  • 3 kudos

Hi @William Scardua​ Hope all is well! Just wanted to check in to see if you were able to resolve your issue. Would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Th...

6 More Replies
sm1
by New Contributor III
  • 4263 Views
  • 5 replies
  • 3 kudos

New Visualization Tools

How do I add the new visualization tool option to my Databricks workspace? I don't see a plus sign that lets me choose "Visualization" in my display command results :(. Please help.

Latest Reply
Anonymous
Not applicable
  • 3 kudos

Hi @Suky Muliadikara​ Hope all is well! Just wanted to check in to see if you were able to resolve your issue. Would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. T...

4 More Replies
ramankr48
by Contributor II
  • 3339 Views
  • 2 replies
  • 3 kudos

Issue with identity key column in databricks?

For the identity key I've used both GENERATED ALWAYS AS IDENTITY (START WITH 1 INCREMENT BY 1) and GENERATED BY DEFAULT AS IDENTITY (START WITH 1 INCREMENT BY 1), but in both cases, if I'm running my script once then it is fine (the identity key is working as...

Latest Reply
lizou
Contributor II
  • 3 kudos

Yes, the BY DEFAULT option allows duplicated values by design. I would avoid this option and use only GENERATED ALWAYS AS IDENTITY. Using the BY DEFAULT option is worse than not using it at all: with BY DEFAULT, if I forget to set the starting value, the ID...
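A minimal sketch of the recommendation above, with an illustrative table name: GENERATED ALWAYS AS IDENTITY lets Delta assign the key and rejects explicit writes to it, so reruns cannot inject duplicate values.

spark.sql("""
    CREATE TABLE IF NOT EXISTS demo_identity (
        id   BIGINT GENERATED ALWAYS AS IDENTITY (START WITH 1 INCREMENT BY 1),
        name STRING
    ) USING DELTA
""")

# Insert only the non-identity columns; Delta fills in id automatically.
# Note: assigned values are unique and increasing, but not guaranteed consecutive.
spark.sql("INSERT INTO demo_identity (name) VALUES ('a'), ('b')")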

1 More Replies
bozhu
by Contributor
  • 1704 Views
  • 1 reply
  • 4 kudos

DLT DataPlaneException

I created an Azure Databricks workspace with my Visual Studio subscription. So far everything has been working as expected, although I have requested to increase the CPU core limit once. I am now getting this "DataPlaneException" error in the DLT pipeline during "Wa...

Latest Reply
karthik_p
Esteemed Contributor
  • 4 kudos

@Bo Zhu​ Can we get more of the error log? It looks like the quota limit was exceeded. Did you get a chance to check the quota in the Azure portal and see whether enough cores exist for the config that you selected? Try selecting another cluster config and validate.

AmineHY
by Contributor
  • 2897 Views
  • 1 reply
  • 4 kudos

My DLT pipeline returns ACL Verification Failed

Python command:
df = spark.read.format('csv').option('sep', ';').option("recursiveFileLookup", "true").load('dbfs:/***/data_files/PREVISIONS/')
Here is the content of the folder. Each folder contains the following files. Full log: org.apache.spark.sql.stre...

Latest Reply
AmineHY
Contributor
  • 4 kudos

Yes, some of the files I don't have the right to access (mistakenly). In this case, how do you think I can tell DLT to handle this exception and ignore the file, since I can read some files but not all?

Retko
by Contributor
  • 1242 Views
  • 1 reply
  • 2 kudos

How to jump back to latest positions in the Notebook

Hi, when developing I often need to jump around the notebook to fix and run things. It would be really helpful if I could jump back to the several latest positions (cells), similar to the SHIFT+F5 key in Office Word. Is there a way to do this now in Databricks? Than...

Latest Reply
karthik_p
Esteemed Contributor
  • 2 kudos

@Retko Okter​ Go to any notebook and click Help --> Keyboard shortcuts; it will show all the possibilities that you need.

db-avengers2rul
by Contributor II
  • 2126 Views
  • 2 replies
  • 3 kudos

course code - 'ACAD-INTRO-DELTALAKE' Notebook has errors

Dear DB Team, while following a course from DB Academy (course code 'ACAD-INTRO-DELTALAKE') I noticed the notebooks have errors. Can you please check? I have also attached the notebook. Regards, Rakesh

Latest Reply
Anonymous
Not applicable
  • 3 kudos

Hi @Rakesh Reddy Gopidi​ Hope all is well! Just wanted to check in to see if you were able to resolve your issue. Would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from yo...

1 More Replies
elgeo
by Valued Contributor II
  • 5217 Views
  • 0 replies
  • 2 kudos

SQL While do loops

Hello. Could you please suggest a workaround for a WHILE ... DO loop in Databricks SQL?
WHILE LSTART > 0 DO SET LSTRING = CONCAT(LSTRING, VSTRING2)
Thank you in advance
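A minimal sketch of one common workaround, assuming the loop from the question: Databricks SQL itself has no WHILE ... DO construct, so the loop can be driven from a Python cell, with each iteration either working on plain Python variables or issuing a spark.sql() statement. LSTART, LSTRING and VSTRING2 are the names from the question; the starting values and the decrement are illustrative assumptions.

lstart = 5            # illustrative starting value for LSTART
lstring = ""
vstring2 = "x"

while lstart > 0:
    # Equivalent of SET LSTRING = CONCAT(LSTRING, VSTRING2);
    # a spark.sql(...) call could be issued here instead for SQL-side work.
    lstring = lstring + vstring2
    lstart -= 1       # assumed decrement so that the loop terminates

print(lstring)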

BradSheridan
by Valued Contributor
  • 3460 Views
  • 3 replies
  • 4 kudos

Resolved! dropDuplicates

Afternoon, Community!! I've done some research today and found multiple great approaches to accomplish what I'm trying to do, but I'm having trouble understanding exactly which is best suited for my use case. Suppose you're running Auto Loader on S3 and u...

Latest Reply
AmanSehgal
Honored Contributor III
  • 4 kudos

If your records are partitioned to narrow down your search, can you try writing upsert logic after the Auto Loader code? The upsert logic will insert, update, or delete rows as per your conditions.
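A minimal sketch of that upsert-after-Auto-Loader idea, assuming a Delta target table named target_table and a join key id; the paths, file format and column names are illustrative rather than the original poster's setup.

from delta.tables import DeltaTable

def upsert_batch(batch_df, batch_id):
    # De-duplicate within the micro-batch, then merge into the Delta target.
    deduped = batch_df.dropDuplicates(["id"])
    target = DeltaTable.forName(spark, "target_table")
    (
        target.alias("t")
        .merge(deduped.alias("s"), "t.id = s.id")
        .whenMatchedUpdateAll()
        .whenNotMatchedInsertAll()
        .execute()
    )

(
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation", "s3://my-bucket/_schemas/upsert/")   # hypothetical path
    .load("s3://my-bucket/landing/")                                          # hypothetical path
    .writeStream
    .foreachBatch(upsert_batch)
    .option("checkpointLocation", "s3://my-bucket/_checkpoints/upsert/")      # hypothetical path
    .start()
)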

2 More Replies
kkumar
by New Contributor III
  • 21748 Views
  • 3 replies
  • 7 kudos

Resolved! can we update a Parquet file??

I have copied a table into a Parquet file. Now, can I update a row or a column in the Parquet file without rewriting all the data (the data is huge), using Databricks or ADF? Thank you.

Latest Reply
youssefmrini
Databricks Employee
  • 7 kudos

You can only append data with Parquet; that's why you need to convert your Parquet table to Delta. It will be much easier.
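A minimal sketch of that conversion, with an illustrative path, column, and predicate: CONVERT TO DELTA builds the Delta transaction log over the existing Parquet files without copying the data, and the resulting table then supports row-level updates.

# Convert the existing Parquet directory to a Delta table in place.
spark.sql("CONVERT TO DELTA parquet.`/mnt/data/my_table`")

# After conversion, individual rows can be updated without rewriting everything manually.
spark.sql("""
    UPDATE delta.`/mnt/data/my_table`
    SET    status = 'closed'
    WHERE  id = 42
""")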

2 More Replies
Anonymous
by New Contributor III
  • 9946 Views
  • 5 replies
  • 5 kudos

Resolved! Override and Merge mode write using AutoLoader in Databricks

We are reading files using Auto Loader in Databricks. The source system is giving a full snapshot of the complete data in the files, so we want to read the data and write it to the Delta table in overwrite mode so that all old data is replaced by the new data. Similarly, for oth...

Latest Reply
-werners-
Esteemed Contributor III
  • 5 kudos

@Ranjeet Jaiswal​ AFAIK merge is supported: https://docs.databricks.com/_static/notebooks/merge-in-streaming.html. This notebook does some aggregation, but that can be omitted of course. The interesting part here is outputMode("update") and the foreachBat...
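A minimal sketch of the foreachBatch approach for the overwrite case, assuming each arriving batch is a complete snapshot that should replace the target table; the table name, paths and file format are illustrative.

def replace_snapshot(batch_df, batch_id):
    # Replace the whole table with the latest full snapshot instead of appending.
    (
        batch_df.write
        .format("delta")
        .mode("overwrite")
        .saveAsTable("full_snapshot_table")
    )

(
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "csv")
    .option("cloudFiles.schemaLocation", "/tmp/_schemas/full_snapshot")   # hypothetical path
    .load("/mnt/source/full_snapshot/")                                   # hypothetical path
    .writeStream
    .foreachBatch(replace_snapshot)
    .option("checkpointLocation", "/tmp/_checkpoints/full_snapshot")
    .trigger(once=True)
    .start()
)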

4 More Replies
Oliver_Floyd
by Contributor
  • 2939 Views
  • 4 replies
  • 6 kudos

Where to find documentation about : spark.databricks.driver.strace.enabled

Hello, for a support request, Microsoft support asked me to add spark.databricks.driver.strace.enabled true to my cluster configuration. MS was not able to send me a link to the documentation, and I did not find it on the Databricks website. Can someone he...

Latest Reply
Oliver_Floyd
Contributor
  • 6 kudos

Yes, no problem. I have a Python program, called "post ingestion", that runs on a Databricks job cluster during the night and consists of: inserting data into a Delta Lake table, executing an OPTIMIZE command on that table, and executing a VACUUM command on that t...
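A minimal sketch of those nightly post-ingestion steps, with an illustrative table name, a hypothetical new_data DataFrame, and an illustrative retention period.

# Append the night's data to the Delta table (new_data is a hypothetical DataFrame).
new_data.write.format("delta").mode("append").saveAsTable("my_schema.my_table")

# Compact small files, then clean up data files older than the retention period.
spark.sql("OPTIMIZE my_schema.my_table")
spark.sql("VACUUM my_schema.my_table RETAIN 168 HOURS")   # 168 hours = default 7-day retention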

3 More Replies
Dusko
by New Contributor III
  • 2215 Views
  • 2 replies
  • 3 kudos

Resolved! Not receiving password reset email

Hi, our admin created a new user in https://accounts.cloud.databricks.com/ with my email dusan.vystrcil@datasentics.com, but I didn't receive any confirmation email. When I try to sign in and click on "reset password", I still don't receive any emai...

Latest Reply
Anonymous
Not applicable
  • 3 kudos

Hi @karthik p​ Thank you for reaching out, and we’re sorry to hear about this log-in issue! We have this Community Edition login troubleshooting post on Community. Please take a look, and follow the troubleshooting steps. If the steps do not resolve ...

1 More Replies
