Data Engineering

Forum Posts

HaripriyaP
by New Contributor
  • 52 Views
  • 1 reply
  • 0 kudos

Multiple Tables Migration from one workspace to another.

Hi all! I need to copy multiple tables from one workspace to another, along with their metadata. Is there any way to do it? Please reply as soon as possible.

Latest Reply
shan_chandra
Honored Contributor III
  • 0 kudos

@HaripriyaP - It depends on your use case; either of the approaches below can be chosen. 1) DELTA CLONE (DEEP CLONE) to clone the tables to the new workspace. 2) Use the same cluster policy/instance profile as the old workspace to access them in the new worksp...
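A minimal sketch of option 1 (DEEP CLONE), assuming a Databricks notebook where `spark` is predefined and the new workspace can reach the source tables through a shared metastore or storage; the table and catalog names here are hypothetical examples, not from the thread:

```python
# Deep-clone each table into the target catalog/schema.
# DEEP CLONE copies both data and metadata of the Delta table.
tables = ["sales.orders", "sales.customers"]  # hypothetical names

for t in tables:
    spark.sql(f"""
        CREATE TABLE IF NOT EXISTS target_catalog.{t}
        DEEP CLONE {t}
    """)
```

For a cross-workspace copy this assumes both workspaces share a metastore (e.g. Unity Catalog) or that the clone target is an external location both can access.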

  • 0 kudos
bozhu
by Contributor
  • 919 Views
  • 4 replies
  • 0 kudos

Delta Live Tables Materialised View Column Comment Error

While the materialised view doc says MVs support column comments, this does not seem to be the case for MVs created by DLT. For example, when trying to add a comment to an MV created by DLT, it errors. Any ideas on when this will be fixed/supported?

Latest Reply
bozhu
Contributor
  • 0 kudos

Just to close the loop here: it seems DLT-generated MVs now support column comments.
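For reference, a sketch of how column comments can be declared on a DLT-managed materialized view, by putting them in the table's schema string; the table and column names are made-up examples, and this assumes the current DLT Python API where `@dlt.table` accepts a `schema` argument:

```python
import dlt

# Column comments declared in the schema string should be carried
# onto the materialized view that DLT creates for this table.
@dlt.table(
    name="orders_mv",
    comment="Aggregated orders",  # table-level comment
    schema="""
        order_id BIGINT COMMENT 'Unique order identifier',
        total DOUBLE COMMENT 'Order total in USD'
    """,
)
def orders_mv():
    return spark.read.table("orders").select("order_id", "total")
```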

  • 0 kudos
3 More Replies
Chinu
by New Contributor III
  • 976 Views
  • 1 reply
  • 0 kudos

How do I access the DLT advanced configuration from a Python notebook?

Hi Team, I'm trying to get a DLT Advanced Configuration value from the Python DLT notebook. For example, I set "something": "some path" in the Advanced configuration in DLT and I want to get the value from my DLT notebook. I tried "dbutils.widgets.get("some...

Latest Reply
jose_gonzalez
Moderator
  • 0 kudos

The following docs will help. Please check the examples https://docs.databricks.com/en/delta-live-tables/settings.html#parameterize-pipelines
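As the linked docs describe, values set in a pipeline's configuration surface as Spark conf entries rather than widgets, so (assuming a key "something" was set in the Advanced configuration, per the question) a sketch would be:

```python
# Inside a DLT pipeline notebook: read a pipeline configuration
# value via the Spark conf, not dbutils.widgets.
some_path = spark.conf.get("something")

# Passing a default avoids an error if the key is not configured:
some_path = spark.conf.get("something", "/tmp/fallback-path")
```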

  • 0 kudos
jenshumrich
by New Contributor III
  • 58 Views
  • 1 reply
  • 0 kudos

Filter not using partition

I have the following code:
spark.sparkContext.setCheckpointDir("dbfs:/mnt/lifestrategy-blob/checkpoints")
result_df.repartitionByRange(200, "IdStation")
result_df_checked = result_df.checkpoint(eager=True)
unique_stations = result_df.select("IdStation...

Latest Reply
jose_gonzalez
Moderator
  • 0 kudos

Please check the physical query plan. Add the .explain() API to your existing call and check the physical plan for any filter push-down happening in your query.
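A quick sketch of that check, reusing the DataFrame and column from the question (the literal value is a placeholder); in the printed plan, partition pruning shows up as `PartitionFilters` on the `FileScan` node:

```python
# Print the full (parsed/analyzed/optimized/physical) plans and
# look for PartitionFilters / PushedFilters in the FileScan node.
result_df.filter("IdStation = 1234").explain(True)
```

If `PartitionFilters` is empty, the filter is not being applied at partition level, which would explain a full scan.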

  • 0 kudos
toolhater
by New Contributor II
  • 67 Views
  • 1 reply
  • 0 kudos

Installing dlt causing error

I'm trying to use the example in big book of engineering 2nd edition-final.pdf and I had an issue with the statement import dlt. So I created another cell and installed it, and I noticed I was getting this error: "dataclass_transform() got an unexpected k...

Latest Reply
jose_gonzalez
Moderator
  • 0 kudos

Could you share the full error stack trace, please?

  • 0 kudos
anish2102
by New Contributor
  • 77 Views
  • 1 reply
  • 0 kudos

PySpark operations slower on cluster 14.3 LTS compared to 13.3 LTS

In my notebook, I am performing a few join operations which take more than 30s on a 14.3 LTS cluster, where the same operations take less than 4s on a 13.3 LTS cluster. Can someone help me with how I can optimize PySpark operations like joins and withColum...

Data Engineering
clustr-14.3
spark-3.5
Latest Reply
jose_gonzalez
Moderator
  • 0 kudos

Check the physical query plan for both DBR 14.3 and 13.3 to see whether they differ. If they do, then check the Spark UI to identify where the change happens.

  • 0 kudos
Karlo_Kotarac
by New Contributor II
  • 34 Views
  • 1 reply
  • 0 kudos

Run failed with error message ContextNotFound

Hi all! Recently we've been getting lots of these errors when running Databricks notebooks. At that time we observed a DRIVER_NOT_RESPONDING (Driver is up but is not responsive, likely due to GC.) log on the single-user cluster we use. Previously when thi...

Latest Reply
jose_gonzalez
Moderator
  • 0 kudos

Are you able to get the full error stack trace from the driver's logs?

  • 0 kudos
Hubert-Dudek
by Esteemed Contributor III
  • 40 Views
  • 1 reply
  • 0 kudos

Nulls in Merge

If you are going to handle any null values in your MERGE condition, you'd better watch out for your syntax. #databricks
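To illustrate the pitfall: a plain `=` in the MERGE condition never matches rows where the key is NULL on either side, which can silently turn intended updates into inserts. A null-safe comparison avoids that; the table and column names below are hypothetical:

```python
# "<=>" is Spark SQL's null-safe equality: NULL <=> NULL is true,
# whereas NULL = NULL evaluates to NULL and never matches.
spark.sql("""
    MERGE INTO target t
    USING source s
    ON t.key <=> s.key
    WHEN MATCHED THEN UPDATE SET *
    WHEN NOT MATCHED THEN INSERT *
""")
```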

Latest Reply
jose_gonzalez
Moderator
  • 0 kudos

Thank you for sharing @Hubert-Dudek 

  • 0 kudos
carlosancassani
by New Contributor III
  • 497 Views
  • 2 replies
  • 0 kudos

Update DeltaTable on column type ArrayType(): add element to array

Hi all,
I need to perform an Update on a Delta table, adding elements to a column of ArrayType(StringType()) which is initialized empty.
Before Update: Col_1 StringType() | Col_2 StringType() | Col_3 ArrayType() — values: Val | Val | [ ]
After Update: Col_1 StringType() | Col_2 Strin...

Data Engineering
deltatable
Update
Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @carlosancassani, It seems like you’re trying to append a string to an array column in a Delta table. The error you’re encountering is because you’re trying to assign a string value to an array column, which is not allowed due to type mismatch. To...
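A sketch of that fix: wrap the new value in an array and concatenate it with the existing array column, rather than assigning a plain string. The table name is a placeholder; the column names follow the post:

```python
from delta.tables import DeltaTable
from pyspark.sql import functions as F

# Append a string element to the ArrayType column Col_3.
# Assigning a bare string would fail with a type mismatch;
# concat(array_col, array(new_element)) keeps the types aligned.
dt = DeltaTable.forName(spark, "my_schema.my_table")  # placeholder name
dt.update(
    condition=F.col("Col_1") == "Val",
    set={"Col_3": F.concat(F.col("Col_3"), F.array(F.lit("new_element")))},
)
```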

  • 0 kudos
1 More Replies
yogu
by Honored Contributor III
  • 1709 Views
  • 8 replies
  • 73 kudos

Trying to claim reward points but its not reflecting my points

Hi Team, can anyone help me understand why my reward points still show a 0 balance? My Databricks Community points are not reflected on the reward claim portal. This was my first login. I also waited 3 business days, but they are still not reflected.

Latest Reply
Kaizen
Contributor III
  • 73 kudos

Can you also share the link for the reward points redemption?

  • 73 kudos
7 More Replies
RakeshRakesh_De
by New Contributor III
  • 200 Views
  • 6 replies
  • 0 kudos

Spark CSV file read option to read a blank/empty value from a file as an empty value instead of null

Hi, I am trying to read a file which has some blank values in a column, and we know Spark converts blank values to null during reading. How can I read a blank/empty value as an empty value? Tried DBR 13.2, 14.3. I have tried every possible way but it's not w...

Data Engineering
csv
EmptyValue
FileRead
Latest Reply
RakeshRakesh_De
New Contributor III
  • 0 kudos

Don't quote something from Stack Overflow, because those answers were tried on old Spark versions. Have you tried this yourself to verify whether it really works in Spark 3?
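One workaround sketch, given that Spark's native CSV reader turns unquoted blank fields into null on read: replace the nulls with empty strings afterwards, restricted to string columns. The file path is a placeholder:

```python
from pyspark.sql import functions as F

df = (spark.read
      .option("header", "true")
      .csv("/mnt/data/example.csv"))  # placeholder path

# Only string columns should be back-filled with "" so numeric
# columns keep their nulls.
string_cols = [f.name for f in df.schema.fields
               if f.dataType.simpleString() == "string"]
df_empty = df.na.fill("", subset=string_cols)
```

This does not distinguish "truly missing" from "present but empty" in the source file; if that distinction matters, quoting empty values as "" in the CSV itself is the more reliable route.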

  • 0 kudos
5 More Replies
cszczotka
by New Contributor II
  • 135 Views
  • 0 replies
  • 0 kudos

Ephemeral storage: how to create/mount

Hi, I'm looking for information on how to create/mount ephemeral storage on a Databricks driver node in Azure Cloud. Does anyone have experience working with ephemeral storage? Thanks,

Dom1
by New Contributor
  • 163 Views
  • 2 replies
  • 0 kudos

Show log4j messages in run output

Hi, I have an issue when running JAR jobs. I expect to see logs in the output window of a run. Unfortunately, I can only see messages that are generated with "System.out.println" or "System.err.println". Everything that is logged via slf4j is only ...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @Dom1,  Ensure that both the slf4j-api and exactly one implementation binding (such as slf4j-simple, logback, or another compatible library) are present in your classpath. If you’re developing a library, it’s recommended to depend only on slf4j-ap...

  • 0 kudos
1 More Replies
drag7ter
by New Contributor II
  • 28 Views
  • 0 replies
  • 0 kudos

Configure Service Principal access to GitLab

I'm facing an issue while trying to run my job in Databricks with my notebooks located in GitLab. When I run the job under my personal user_Id it works fine, because I added a GitLab token to my user_Id profile and the job is able to pull the branch from the repository. But whe...
