Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

KVNARK
by Honored Contributor II
  • 1823 Views
  • 2 replies
  • 4 kudos

How much time does it take for the Databricks partner account to get created

How much time does it take for the Databricks partner account to get created after we submit the application to Databricks?

Latest Reply
Harshjot
Contributor III
  • 4 kudos

Hi @KVNARK. On the training academy? It was instant for me.

1 More Replies
Smitha1
by Databricks Partner
  • 11165 Views
  • 10 replies
  • 6 kudos

Resolved! Onsite exam center registration for Databricks Certified Associate Developer for Apache Spark 3

Dear all @Nadia Elsayed @Vidula Khanna @Harshjot Singh @Jose Gonzalez @Joseph Kambourakis, hope you are well and had a good weekend. I am still waiting to receive the voucher after redeeming points, which is due this week. My issue is that slots are full to ...

Latest Reply
nphau
Valued Contributor
  • 6 kudos

I have the same problem as you. I submitted a ticket to Databricks, "Help to re-schedule assessment day in webassessor", but they responded as below: "Please accept my apologies for the inconvenience caused and the delay in responding. I'm sorry to i...

9 More Replies
Paully
by New Contributor
  • 1910 Views
  • 0 replies
  • 0 kudos

Overwrite still saves numerous parquet files in storage container

I inherited this environment, and my question is this: we have a job that mines the data lake and creates a table grouped by unit number and its data points. The job runs every 10 minutes. We then connect to that table via Power BI DirectQuery ...

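
No replies yet, but the usual explanation fits this pattern: with a Delta table, mode("overwrite") replaces the logical contents while the superseded parquet files stay in the container until VACUUM removes them. A minimal sketch, assuming a Delta table and using placeholder table names:

from pyspark.sql import functions as F

# Hypothetical source/target names standing in for the real tables.
summary = (
    spark.table("lake.raw_points")
         .groupBy("unit_number")
         .agg(F.count("*").alias("data_points"))
)

# mode("overwrite") replaces the table's *logical* contents; the superseded
# parquet files remain in the storage container until VACUUM deletes them.
summary.write.format("delta").mode("overwrite").saveAsTable("lake.unit_summary")

# Physically delete files that are no longer referenced and are older than
# the retention window (7 days by default; lowering it needs an override).
spark.sql("VACUUM lake.unit_summary")

With a 10-minute schedule, each run leaves the previous run's files behind, so a periodic VACUUM is what actually caps the file count in storage.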
DB_developer
by New Contributor III
  • 7485 Views
  • 3 replies
  • 7 kudos

Resolved! How nulls are stored in delta lake and databricks?

In my findings, a lot of Delta tables in the lakehouse are sparse, so I am wondering how much space the data lake takes to store null data; any suggestions for handling sparse data tables in the lakehouse would also be appreciated. I also want to o...

Latest Reply
Hubert-Dudek
Databricks MVP
  • 7 kudos

As Delta uses parquet files to store data, the parquet encoding applies: "Nullity is encoded in the definition levels (which is run-length encoded). NULL values are not encoded in the data. For example, in a non-nested schema, a column with 1000 NULLs would be encoded...

2 More Replies
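
A small way to see this in practice, sketched with pyarrow against a single data file (the path below is a placeholder): the parquet footer tracks a null_count per column chunk, while the nulls themselves occupy no space in the data pages.

import pyarrow.parquet as pq

# Placeholder path to one data file of a Delta table.
pf = pq.ParquetFile("/dbfs/mnt/lake/my_table/part-00000.snappy.parquet")
meta = pf.metadata

# Each column chunk's footer statistics include a null_count; the nulls
# themselves are run-length encoded in the definition levels rather than
# stored as data, so sparse columns compress very well.
for rg in range(meta.num_row_groups):
    for col in range(meta.num_columns):
        chunk = meta.row_group(rg).column(col)
        if chunk.statistics is not None:
            print(chunk.path_in_schema, "nulls:", chunk.statistics.null_count)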
Philblakeman
by New Contributor III
  • 7615 Views
  • 4 replies
  • 5 kudos

How to %run a list of notebooks in Databricks

I'd like to %run a list of notebooks from another Databricks notebook.

my_notebooks = ["./setup", "./do_the_main_thing", "./check_results"]
for notebook in my_notebooks:
    %run notebook

This doesn't work, of course. I don't want to use dbutils.notebook....

Latest Reply
Ajay-Pandey
Databricks MVP
  • 5 kudos

Please refer to the code below:

import scala.concurrent.{Future, Await}
import scala.concurrent.duration._
import scala.util.control.NonFatal

case class NotebookData(path: String, timeout: Int, parameters: Map[String, String] = Map.empty[String, String])
...

3 More Replies
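
For context, %run is a notebook magic resolved when the cell is parsed, so it cannot take a variable. The usual Python alternative is dbutils.notebook.run, which launches each notebook as its own run (unlike %run, it does not share the caller's variables). A minimal sketch:

my_notebooks = ["./setup", "./do_the_main_thing", "./check_results"]

for nb in my_notebooks:
    # Runs the notebook as a separate run and returns whatever the callee
    # passes to dbutils.notebook.exit(); 600 is a timeout in seconds.
    result = dbutils.notebook.run(nb, 600)
    print(nb, "->", result)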
brickster_2018
by Databricks Employee
  • 9239 Views
  • 2 replies
  • 2 kudos

Resolved! How to get the count of files/partition for a Delta table?

I have a Delta table and I run the OPTIMIZE command regularly. However, I still see a large number of files in the table. I want to get a breakdown of the files in each partition and identify which partition has more files. What is the easiest way to ge...

Latest Reply
brickster_2018
Databricks Employee
  • 2 kudos

The code snippet below will give details about the file count per partition:

import com.databricks.sql.transaction.tahoe.DeltaLog
import org.apache.hadoop.fs.Path

val deltaPath = "<table_path>"
val deltaLog = DeltaLog(spark, new Path(deltaPath + "/_d...

1 More Replies
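
Since DeltaLog is an internal API, here is a hedged alternative using only public functions (the table and partition column names are placeholders): input_file_name() exposes the file behind each row, so a distinct count per partition gives the file breakdown.

from pyspark.sql import functions as F

# Count distinct underlying files per partition of a Delta table.
(spark.table("my_delta_table")
      .groupBy("event_date")  # placeholder partition column
      .agg(F.countDistinct(F.input_file_name()).alias("file_count"))
      .orderBy(F.desc("file_count"))
      .show())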
Senthil1
by Databricks Partner
  • 2520 Views
  • 1 replies
  • 0 kudos
Latest Reply
Ajay-Pandey
Databricks MVP
  • 0 kudos

Hi @SENTHIL KUMARR MALLI SUDARSAN, the link below might help you: Link

164079
by Contributor II
  • 10227 Views
  • 14 replies
  • 2 kudos

Resolved! Terraform keeps showing changes for databricks_sql_permissions on plan and apply

Hi team, we are seeing very weird behaviour when using databricks_sql_permissions with Terraform: the same changes keep showing up on plan and apply. They repeat even after I apply the changes... Please advise.

Latest Reply
Pat
Esteemed Contributor
  • 2 kudos

Hi @Avi Edri, I can see from the screenshot that you are using id = "any file/"; it seems to be related to the import: https://registry.terraform.io/providers/databricks/databricks/0.5.3/docs/resources/sql_permissions#import. Can you try the below: resourc...

13 More Replies
THIAM_HUATTAN
by Valued Contributor
  • 10060 Views
  • 6 replies
  • 5 kudos

Error in Databricks code?

https://www.databricks.com/notebooks/recitibikenycdraft/data-preparation.html

Could someone help with Step 3: Prepare Calendar Info in that notebook?

# derive complete list of dates between first and last dates
dates = (
  spark
    .range(0, days_between)
    .withCol...

Latest Reply
UmaMahesh1
Honored Contributor III
  • 5 kudos

Hi @THIAM HUAT TAN, in your notebook you are creating an integer variable days_between with the code days_between = (last_date - first_date).days + 10. Logically speaking, what the notebook is trying to do is fetch all the dates between two dates to do a foreca...

5 More Replies
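
A hedged sketch of an equivalent calendar build (the dates are placeholders): Spark's sequence() generates the inclusive date range directly and sidesteps the off-by-one arithmetic around days_between.

# sequence() yields every date from the first to the last, inclusive.
dates = spark.sql(
    "SELECT explode(sequence(to_date('2020-01-01'), to_date('2020-12-31'))) AS date"
)
dates.show(5)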
vr
by Valued Contributor
  • 7047 Views
  • 7 replies
  • 7 kudos

Where can I report a problem on community.databricks.com?

I tried the contact details at the bottom, but they seem to be generic Databricks contact and support links. The issue I faced was this: I think this word made its way to the stop list by mistake.

(screenshot: wrong stop word)
Latest Reply
Vartika
Databricks Employee
  • 7 kudos

Hey @Vladimir Ryabtsev and @Hubert Dudek, thank you for highlighting this. It seems they were added to the block list in combination with other words. We will have this fixed as soon as possible. It's always great to have help from our community members....

6 More Replies
Rishabh-Pandey
by Databricks MVP
  • 4459 Views
  • 6 replies
  • 6 kudos

Delta Live Tables

If I have two stages, bronze and silver: when I create Delta Live Tables we need to give the target schema to store the results, but I need to store tables in two databases, bronze AND silver. For this I need to create two different Delta Live tab...

Latest Reply
Geeta1
Databricks Partner
  • 6 kudos

Hi @Rishabh Pandey, yes, you have to create 2 DLT tables.

5 More Replies
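
A minimal sketch of why this splits into two pipelines (names and paths below are assumptions): the database a DLT table lands in comes from the pipeline's target setting rather than the table definition, so a bronze target and a silver target mean one pipeline each.

import dlt

# Pipeline A is configured with target = "bronze"; a second pipeline with
# target = "silver" would hold the silver table definitions.
@dlt.table(name="events_bronze")
def events_bronze():
    return spark.read.format("json").load("/mnt/raw/events")  # assumed path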
LavaLiah_85929
by New Contributor II
  • 6210 Views
  • 2 replies
  • 1 kudos

Resolved! "Log has failed integrity check" error when altering a table property

Below is the integrity check error we are getting when trying to set the delta.deletedFileRetentionDuration table property to 10 days. Observation: the table data is sitting in S3, the total size of all the files is in the TB range, and there are millions of files for t...

Latest Reply
Hubert-Dudek
Databricks MVP
  • 1 kudos

Please back up your table, then run the file repair:

FSCK REPAIR TABLE table_name

You can also do a dry run first:

FSCK REPAIR TABLE table_name DRY RUN

If the data is partitioned, it can also help to refresh the metastore:

MSCK REPAIR TABLE mytable

1 More Replies
Sreekanth1
by New Contributor II
  • 2498 Views
  • 2 replies
  • 0 kudos

How to pass job task parameters to another task in Scala

Hi team, I have a requirement in a workflow job. The job has two tasks: one is a Python task and the other is a Scala task (both running on their own clusters). I have set a value with dbutils.jobs.taskValues in Python, but it cannot be read in Scala because o...

Latest Reply
Ajay-Pandey
Databricks MVP
  • 0 kudos

Hi @Sreekanth Nallapa, please refer to this link; it might help you with this.

1 More Replies
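
A hedged sketch of the handoff (the task key and value key are assumed names). dbutils.jobs.taskValues is a Python-side utility; a Scala downstream task typically receives the value instead through a task parameter that uses a dynamic value reference such as {{tasks.python_task.values.row_count}}.

# Upstream Python task: publish a value for downstream tasks.
dbutils.jobs.taskValues.set(key="row_count", value=42)

# A downstream Python task would read it back by the upstream task's key:
row_count = dbutils.jobs.taskValues.get(
    taskKey="python_task", key="row_count", default=0, debugValue=0
)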