Data Engineering

Forum Posts

by carlosancassani, New Contributor III
  • 490 Views
  • 2 replies
  • 0 kudos

Update DeltaTable on column type ArrayType(): add element to array

Hi all, I need to perform an Update on a Delta table, adding elements to a column of ArrayType(StringType()) which is initialized empty. Before update: Col_1 StringType() | Col_2 StringType() | Col_3 ArrayType() → Val | Val | [ ]. After update: Col_1 StringType() | Col_2 Strin...

Data Engineering
deltatable
Update
Latest Reply
Kaniz, Community Manager

Hi @carlosancassani, it seems like you're trying to append a string to an array column in a Delta table. The error you're encountering is because you're trying to assign a string value to an array column, which is not allowed due to a type mismatch. To...
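A minimal sketch of the append approach the reply describes, assuming PySpark and the column names from the post (the table name my_table is hypothetical); concat works on array columns in Spark 2.4+:

    # Append one element to an ArrayType column during a Delta UPDATE.
    # `my_table` is a placeholder; Col_1/Col_3 follow the post's example.
    from delta.tables import DeltaTable
    from pyspark.sql import functions as F

    dt = DeltaTable.forName(spark, "my_table")
    dt.update(
        condition=F.col("Col_1") == "Val",
        # Wrap the new string in array() so both sides of concat are arrays.
        set={"Col_3": F.concat(F.col("Col_3"), F.array(F.lit("new_element")))},
    )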

1 More Replies
by Hubert-Dudek, Esteemed Contributor III
  • 6 Views
  • 0 replies
  • 0 kudos

Nulls in Merge

If you are going to handle any null values in your MERGE condition, better watch out for your syntax #databricks

[Attachment: merge_danger.png]
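The usual trap, and one hedged fix, in a short sketch (table names target and updates are placeholders): a plain t.key = u.key condition never matches NULL keys, because NULL = NULL evaluates to NULL rather than TRUE, while Spark SQL's null-safe operator <=> treats two NULLs as equal.

    # Null-safe MERGE condition sketch; `target` and `updates` are placeholders.
    spark.sql("""
        MERGE INTO target t
        USING updates u
        ON t.key <=> u.key          -- null-safe: NULL <=> NULL is TRUE
        WHEN MATCHED THEN UPDATE SET *
        WHEN NOT MATCHED THEN INSERT *
    """)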
by yogu, Honored Contributor III
  • 1707 Views
  • 8 replies
  • 73 kudos

Trying to claim reward points, but it's not reflecting my points

Hi Team, can anyone help me with why my reward points still show a 0 balance? My Databricks Community points are not reflected on the reward claim portal. I logged in for the first time, and I also waited 3 business days, but they are still not reflected.

Latest Reply
Kaizen, Contributor III

Can you also share the link for the reward points redemption?

7 More Replies
by RakeshRakesh_De, New Contributor III
  • 176 Views
  • 6 replies
  • 0 kudos

Spark CSV file read option to read blank/empty values from a file as empty values instead of null

Hi, I am trying to read a file that has some blank values in a column, and we know Spark converts blank values to null during reading. How can I read blank/empty values as empty values? Tried DBR 13.2, 14.3. I have tried every possible way, but it's not w...

[Attachment: RakeshRakesh_De_0-1713431921922.png]
Data Engineering
csv
EmptyValue
FileRead
Latest Reply
RakeshRakesh_De, New Contributor III

Don't quote something from Stack Overflow, because those are old Spark versions. Have you tried this yourself to verify whether it really works in Spark 3?
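One workaround that sidesteps the reader options entirely, sketched under the assumption that only string columns should get empty strings back (the path is a placeholder): read the CSV as usual, then turn the resulting nulls back into empty strings.

    # Spark's CSV reader maps empty fields to null; convert them back after reading.
    df = (spark.read
          .option("header", "true")
          .csv("/path/to/file.csv"))      # placeholder path

    # Replace null with "" in string columns only; other columns keep their nulls.
    string_cols = [f.name for f in df.schema.fields
                   if f.dataType.simpleString() == "string"]
    df = df.na.fill("", subset=string_cols)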

5 More Replies
by cszczotka, New Contributor II
  • 69 Views
  • 0 replies
  • 0 kudos

Ephemeral storage: how to create/mount

Hi, I'm looking for information on how to create/mount ephemeral storage on the Databricks driver node in Azure Cloud. Does anyone have experience working with ephemeral storage? Thanks,
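Not an authoritative answer, but a sketch under one common assumption: Databricks compute nodes typically expose node-local ephemeral disk to the driver at /local_disk0, wiped when the cluster terminates; verify the path for your VM type and runtime before relying on it.

    # Use node-local ephemeral disk for scratch files (assumed path /local_disk0).
    import os
    import tempfile

    scratch_dir = "/local_disk0/tmp/my_scratch"   # hypothetical scratch path
    os.makedirs(scratch_dir, exist_ok=True)

    with tempfile.NamedTemporaryFile(dir=scratch_dir, delete=False) as f:
        f.write(b"intermediate data")
        print("wrote", f.name)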

by Dom1, New Contributor
  • 162 Views
  • 2 replies
  • 0 kudos

Show log4j messages in run output

Hi, I have an issue when running JAR jobs: I expect to see logs in the output window of a run. Unfortunately, I can only see messages that are generated with "System.out.println" or "System.err.println". Everything that is logged via slf4j is only ...

[Attachment: Dom1_0-1713189014582.png]
Latest Reply
Kaniz, Community Manager

Hi @Dom1, ensure that both the slf4j-api and exactly one implementation binding (such as slf4j-simple, logback, or another compatible library) are present in your classpath. If you're developing a library, it's recommended to depend only on slf4j-ap...

1 More Replies
by drag7ter, New Contributor
  • 24 Views
  • 0 replies
  • 0 kudos

Configure Service Principal access to GitLab

I'm facing an issue while trying to run my job in Databricks with my notebooks located in GitLab. When I run the job under my personal user ID it works fine, because I added a GitLab token to my user ID profile and the job is able to pull the branch from the repository. But whe...
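A hedged sketch of one way to attack this: register the GitLab token as a Git credential for the service principal via the Git Credentials REST API, authenticating as the service principal rather than as yourself. Host, tokens, and username below are placeholders.

    # Register a GitLab token for a service principal via the Git Credentials API.
    import requests

    host = "https://<your-workspace>.azuredatabricks.net"   # placeholder
    sp_token = "<service-principal-access-token>"           # placeholder
    gitlab_token = "<gitlab-personal-access-token>"         # placeholder

    resp = requests.post(
        f"{host}/api/2.0/git-credentials",
        # Authenticate AS the service principal so the credential attaches to it.
        headers={"Authorization": f"Bearer {sp_token}"},
        json={
            "git_provider": "gitLab",
            "git_username": "my-gitlab-user",               # placeholder
            "personal_access_token": gitlab_token,
        },
    )
    resp.raise_for_status()
    print(resp.json())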

by 174817, New Contributor II
  • 67 Views
  • 2 replies
  • 0 kudos

Databricks Rust client and/or OpenAPI spec

Hi, I'm looking for a Databricks client for Rust. I could only find these SDK implementations. Alternatively, I would be very happy with the OpenAPI spec. Clearly one exists: the Go SDK implementation contains code to generate itself from such a spec...

Data Engineering
openapi
rust
sdk
unity
Latest Reply
feiyun0112, Contributor

Databricks REST API reference. This reference contains information about the Databricks application programming interfaces (APIs). Each API reference page is presented primarily from a representational state transfer (REST) perspective. Databricks REST...
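Pending an official Rust SDK, the documented REST endpoints can be called with any HTTP client; a minimal sketch (shown in Python for brevity, with placeholder host and token) that translates directly to, say, reqwest in Rust:

    # List clusters via the documented REST endpoint with a plain HTTP client.
    import requests

    host = "https://<your-workspace>.cloud.databricks.com"  # placeholder
    token = "<personal-access-token>"                       # placeholder

    resp = requests.get(
        f"{host}/api/2.0/clusters/list",
        headers={"Authorization": f"Bearer {token}"},
    )
    resp.raise_for_status()
    for cluster in resp.json().get("clusters", []):
        print(cluster["cluster_id"], cluster["cluster_name"])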

1 More Replies
by Brammer88, New Contributor II
  • 297 Views
  • 3 replies
  • 0 kudos

Trying to run Databricks Academy labs, but execution fails because the clearCache method is not whitelisted

Hi there, I'm trying to run DE 2.1 - Querying Files Directly on my workspace with the default cluster configuration found below, but I cannot seem to run this file (or any other labs), as it gives me this error message: Resetting the learning environme...

[Attachment: Brammer88_0-1713340930496.png]
Latest Reply
Brammer88, New Contributor II

works, thanks for the quick response! 

2 More Replies
by my_super_name, New Contributor
  • 183 Views
  • 2 replies
  • 3 kudos

Auto Loader Schema Hint Behavior: Addressing Nested Field Errors

Hello, I'm using Auto Loader to stream a table of data and have added schema hints to specify field values. I've observed that when my initial data file is missing fields specified in the schema hint, Auto Loader correctly identifies this and ad...

Latest Reply
my_super_name, New Contributor

Hi @Kaniz, thanks for your help! Your solution works for the initial issue, and I've implemented it first in my code, but it creates another problem. When we explicitly define the struct hint as 'bbb STRUCT<ccc: INT>', it works until someone adds more fiel...
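For context, a minimal Auto Loader sketch using the struct hint discussed in the thread (paths are placeholders); hinting only the fields you must pin leaves the remaining fields to schema inference:

    # Auto Loader with a schema hint for a nested struct field.
    df = (spark.readStream
          .format("cloudFiles")
          .option("cloudFiles.format", "json")
          .option("cloudFiles.schemaLocation", "/tmp/schema")      # placeholder
          # Pin only bbb.ccc; unlisted fields are still inferred.
          .option("cloudFiles.schemaHints", "bbb STRUCT<ccc: INT>")
          .load("/tmp/landing"))                                   # placeholder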

1 More Replies
by data-grassroots, New Contributor
  • 194 Views
  • 5 replies
  • 0 kudos

Ingesting Files - Same file name, modified content

We have a data feed with files whose filenames stay the same but whose contents change over time (brand_a.csv, brand_b.csv, brand_c.csv, ...). COPY INTO seems to ignore the files when they change. If we set the force flag to true and run it, we end up w...

Latest Reply
-werners-, Esteemed Contributor III

If you do not have control over the content of the files, I suggest the following: each day you get new files/data (I suppose these are not incremental). These files contain new, updated, and deleted data, and are overwritten. Because of this, autoloade...
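A sketch of the force option mentioned in the thread (catalog, schema, and path names are placeholders): COPY INTO tracks files it has already loaded, so re-ingesting same-named, modified files requires 'force' = 'true', at the cost of re-loading every matching file.

    # Force COPY INTO to re-load files even if it has seen their names before.
    spark.sql("""
        COPY INTO my_catalog.my_schema.brands
        FROM '/mnt/feed/brands/'
        FILEFORMAT = CSV
        FORMAT_OPTIONS ('header' = 'true')
        COPY_OPTIONS ('force' = 'true')
    """)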

4 More Replies
by RiyazAli, Contributor III
  • 387 Views
  • 1 replies
  • 0 kudos

Unable to create a record_id column via DLT - Autoloader

Hi Community, I'm trying to load data from the landing zone to the bronze layer via DLT Auto Loader, and I want to add a record_id column to the bronze table while I fetch my data. I'm also using a file arrival trigger in the workflow to update my table inc...
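For reference, a hedged sketch of one way to do this (table and path names are placeholders; an identity column on the target table is another documented route): generate a UUID per row as the stream is read.

    # DLT + Auto Loader bronze table with a per-row record_id.
    import dlt
    from pyspark.sql import functions as F

    @dlt.table(name="bronze_events")          # placeholder table name
    def bronze_events():
        return (
            spark.readStream
            .format("cloudFiles")
            .option("cloudFiles.format", "json")
            .load("/mnt/landing/events")      # placeholder path
            # uuid() yields a unique id per ingested row.
            .withColumn("record_id", F.expr("uuid()"))
        )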

Latest Reply
RiyazAli, Contributor III

Hey @Kaniz, could you or anybody from the community team help me here, please? I've been stuck for quite some time now.

by cosminsanda, New Contributor II
  • 148 Views
  • 4 replies
  • 1 kudos

Resolved! Unit Testing with the new Databricks Connect in Python

I would like to create a regular PySpark session in an isolated environment against which I can run my Spark-based tests. I don't see how that's possible with the new Databricks Connect; I'm going in circles here. Is it even possible? I don't want to ...

Latest Reply
cosminsanda, New Contributor II

OK, so the best solution as it stands today (for me personally, at least) is this: have pyspark ^3.4 installed with the connect extra feature. My unit tests then don't have to change at all, as they use the regular Spark session created on the fly. For ru...
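A minimal sketch of that setup, assuming pytest (fixture and test names are hypothetical): a plain local SparkSession from the pyspark install, so unit tests never touch a remote workspace.

    # Local SparkSession fixture for isolated unit tests.
    import pytest
    from pyspark.sql import SparkSession

    @pytest.fixture(scope="session")
    def spark():
        session = (SparkSession.builder
                   .master("local[2]")
                   .appName("unit-tests")
                   .getOrCreate())
        yield session
        session.stop()

    def test_uppercase(spark):
        df = spark.createDataFrame([("a",)], ["letter"])
        assert df.selectExpr("upper(letter) AS u").first().u == "A"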

3 More Replies
by Karlo_Kotarac, New Contributor II
  • 28 Views
  • 0 replies
  • 0 kudos

Run failed with error message ContextNotFound

Hi all! Recently we've been getting lots of these errors when running Databricks notebooks. At that time we observed a DRIVER_NOT_RESPONDING ("Driver is up but is not responsive, likely due to GC.") log on the single-user cluster we use. Previously, when thi...

[Attachment: Karlo_Kotarac_0-1713422302017.png]
by Phani1, Valued Contributor
  • 53 Views
  • 1 replies
  • 0 kudos

Code Review tools

Could you kindly recommend any Code Review tools that would be suitable for our Databricks tech stack?

Data Engineering
code review
Latest Reply
Kaniz, Community Manager

Hi @Phani1, when it comes to code review tools for your Databricks tech stack, here are some options you might find useful. Built-in interactive debugger in the Databricks notebook: the interactive debugger is available exclusively for Python code withi...
