Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

Hubert-Dudek
by Esteemed Contributor III
  • 1351 Views
  • 1 reply
  • 4 kudos

Spark 3.4 and Databricks 13 introduce two new types of timestamps for handling time zone information: TIMESTAMP WITH LOCAL TIME ZONE: this type assumes that the input data is in the session's local time zone and converts it to UTC before processing....
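
As a rough illustration of the difference described in the post, here is a hedged PySpark sketch; it assumes Spark 3.4+ where the TIMESTAMP_NTZ type is available, and the session time zone shown is only an example.

# Minimal sketch (assumptions: Spark 3.4+ with TIMESTAMP_NTZ support; example time zone).
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
spark.conf.set("spark.sql.session.timeZone", "Europe/Warsaw")  # example session zone

df = spark.sql("""
    SELECT
      CAST('2023-06-01 10:00:00' AS TIMESTAMP)     AS ts_ltz,  -- interpreted in the session zone, normalized to UTC internally
      CAST('2023-06-01 10:00:00' AS TIMESTAMP_NTZ) AS ts_ntz   -- kept as written, no time zone conversion
""")
df.printSchema()
df.show(truncate=False)

Changing spark.sql.session.timeZone shifts how the first column is displayed while the second stays fixed, which matches the behaviour the post describes.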

timezone
Latest Reply
Anonymous
Not applicable
  • 4 kudos

This is helpful! As we all know, timestamps are often the reason business logic gets messed up.

J_
by New Contributor II
  • 9915 Views
  • 6 replies
  • 5 kudos

Resolved! Clusters stuck on pending indefinitely (community edition)

Since yesterday, clusters suddenly do not start and remain in the pending state indefinitely (more than 30 minutes). Following a previous post, I tried adding port 443 to the firewall, but it doesn't help. On the clusters page, the message says: Finding instan...

Latest Reply
Reet
New Contributor II
  • 5 kudos

I am also having the same issue, and there seems to be no outage...

5 More Replies
bakiya
by New Contributor II
  • 3533 Views
  • 8 replies
  • 0 kudos

I have successfully passed the test after completing the course, but I haven't received the badge from your side as promised. I have been provided with a certificate which looks *****. Please provide me with the badge. My certificate ID is E-03DK31.

I tried logging in to https://credentials.databricks.com/ with my registered email address and was not able to find any badge there. I need the badge for the test I took and passed; please do the needful.

Latest Reply
ftmoztl
New Contributor II
  • 0 kudos

Hello! The same thing is happening to me; I couldn't access my badge from the following link. Could you help me too? @Vidula Khanna https://customer-academy.databricks.com/legacy/lms/index.php%3Fr%3DmyActivities/index%26tab%3Dbadges

7 More Replies
Priyag1
by Honored Contributor II
  • 778 Views
  • 0 replies
  • 9 kudos

New unified Databricks navigation - new release to Public Preview, check it out: Databricks released a new navigation experience to Public Preview. The goal is to reduce the clicks and context switches required to complete tasks. The new experience include...

Priyag1
by Honored Contributor II
  • 752 Views
  • 0 replies
  • 9 kudos

Databricks Marketplace: an open forum for exchanging data products. Databricks Marketplace takes advantage of Delta Sharing to give data providers the tools to share data products securely and data consumers the power to explore...
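
For context on the Delta Sharing mechanism mentioned here, below is a hedged Python sketch using the open-source delta-sharing client; the profile file and the share/schema/table names are placeholders, not real Marketplace listings.

# Sketch only: the profile file is the credential file a data provider shares with a recipient.
import delta_sharing

profile = "config.share"
client = delta_sharing.SharingClient(profile)
print(client.list_all_tables())   # discover the shares, schemas, and tables visible to this recipient

# Load one shared table into pandas (share/schema/table names below are hypothetical).
table_url = f"{profile}#example_share.example_schema.example_table"
df = delta_sharing.load_as_pandas(table_url)
print(df.head())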

PareDesa_10157
by New Contributor II
  • 949 Views
  • 1 reply
  • 1 kudos

My cluster shows this message from time to time; what is the permanent solution?

Message: Cluster terminated. Reason: Spark Image Download Failure. Failed to set up Spark container due to an image download failure: Exception when downloading spark image: Stdout: /usr/local/bin/fastar /usr/local/bin/fastar. I have tried enabling the ...

Latest Reply
Debayan
Databricks Employee
  • 1 kudos

Hi, this error can surface for various reasons, such as networking issues or slowness. Are there any other significant errors showing up in the driver logs? Is this for a single cluster? Is this a new workspace? You can start with checking on the ...

duliu
by New Contributor II
  • 8801 Views
  • 6 replies
  • 0 kudos

databricks-connect fails with java.lang.IllegalStateException: No api token found in local properties

I configured databricks-connect locally and want to run Spark code against a remote cluster. I verified that `databricks-connect test` passes and it can connect to the remote Databricks cluster. However, when I query a table or read parquet from S3, it fails wi...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

@Du Liu: The error message suggests that there is no API token found in the local properties. This could be the cause of the failure when trying to access the tables or read parquet files from S3. To fix this issue, you need to ensure that the API to...
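
As a follow-up to the reply above, here is a hedged sketch of where the token can be supplied for legacy databricks-connect; the spark.databricks.service.* property names come from the legacy Databricks Connect setup, and all values below are placeholders.

# Sketch only: normally these values are written by `databricks-connect configure`;
# setting them on the session is an alternative way to make sure the API token is present.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
spark.conf.set("spark.databricks.service.address", "https://<workspace-url>")
spark.conf.set("spark.databricks.service.token", "<personal-access-token>")
spark.conf.set("spark.databricks.service.clusterId", "<cluster-id>")

# With the token in place, remote reads should be authenticated (table name is a placeholder).
spark.read.table("<database>.<table>").limit(5).show()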

5 More Replies
Sid0610
by New Contributor II
  • 7497 Views
  • 3 replies
  • 3 kudos

Resolved! Databricks SQL CREATE TABLE ParseException

I am trying to use the following code to create a Delta table:
%sql
CREATE TABLE rectangles(a INT, b INT, area INT GENERATED ALWAYS AS IDENTITY (START WITH 1, STEP BY 1))
I don't know why, but I am always getting the ParseException error. I tried all other ...
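
For comparison, a hedged sketch of the identity-column syntax documented for Delta Lake, which uses INCREMENT BY rather than STEP BY and a BIGINT column; adjust the table and column names as needed.

# Sketch only: identity-column DDL that should parse on Databricks SQL / Delta Lake.
# `spark` is the session a Databricks notebook provides.
spark.sql("""
    CREATE TABLE IF NOT EXISTS rectangles (
      a INT,
      b INT,
      id BIGINT GENERATED ALWAYS AS IDENTITY (START WITH 1 INCREMENT BY 1)
    ) USING DELTA
""")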

Latest Reply
emiratesevisaon
New Contributor II
  • 3 kudos

How can we use SQL for my website emiratesevisaonline.com backend date?

2 More Replies
Johny
by New Contributor III
  • 6713 Views
  • 2 replies
  • 4 kudos

Inserting data into a CDF-enabled Delta table throws java.lang.StackOverflowError

I am building a bronze table with CDF enabled in these steps: initially, reading the JSON files from the landing zone and writing to the table location: df = spark.readStream.format("cloudFiles") \ .option("cloudFiles.schemaLocation", <schema_loc>) \ .option("clou...
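
To make the truncated steps easier to follow, here is a hedged reconstruction of this kind of Auto Loader ingestion; all paths, the table name, and the trigger choice are assumptions, and the change data feed is enabled through the table property the Delta documentation describes.

# Sketch only: stream JSON from a landing zone into a bronze Delta table (placeholder paths/names).
df = (spark.readStream.format("cloudFiles")
      .option("cloudFiles.format", "json")
      .option("cloudFiles.schemaLocation", "/mnt/bronze/_schema")
      .load("/mnt/landing/events"))

(df.writeStream
   .format("delta")
   .option("checkpointLocation", "/mnt/bronze/_checkpoint")
   .trigger(availableNow=True)
   .toTable("bronze_events"))

# Enable the change data feed on the bronze table itself.
spark.sql("ALTER TABLE bronze_events SET TBLPROPERTIES (delta.enableChangeDataFeed = true)")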

Latest Reply
Johny
New Contributor III
  • 4 kudos

I tried with a simple csv file that only has one column. I got the same error.

1 More Replies
varunsaagar
by New Contributor III
  • 9858 Views
  • 17 replies
  • 28 kudos

Request for reattempt voucher. Databricks Certified Machine Learning Professional exam

Hi, on December 28th I attempted the Databricks Certified Machine Learning Professional exam for the first time; unfortunately, I ended up with a failing grade. The passing grade was 70%, and I received 68.33%. I am planning to reattempt the exam. Could you kindl...

Latest Reply
girl_chan
New Contributor II
  • 28 kudos

What is the next event where they will give a voucher?

16 More Replies
JKR
by Contributor
  • 2643 Views
  • 2 replies
  • 0 kudos

The spark driver has stopped unexpectedly and is restarting. Your notebook will be automatically reattached.

Getting the below error. Context: using a Databricks shared interactive cluster to run multiple parallel scheduled jobs at the same time, every 5 minutes. When I check Ganglia, the driver node's memory reaches almost its max and then a restart of the driver happens an...

Latest Reply
jose_gonzalez
Databricks Employee
  • 0 kudos

Please check the driver's logs, for example the log4j and GC logs.

1 More Replies
jonasmin
by New Contributor III
  • 8837 Views
  • 7 replies
  • 2 kudos

Error while establishing JDBC connection to Azure Databricks via HTTP proxy

I am using the Databricks JDBC driver (https://databricks.com/spark/jdbc-drivers-download) to connect to Azure Databricks. The connection needs to be routed through an HTTP proxy. I found parameters that can be configured for using the HTTP proxy: By pa...

databricks jdbc
Latest Reply
MS_Varma
New Contributor II
  • 2 kudos

Hi @Jonas Minning, I am also having the same issue. When I looked into the driver documentation, I found that the driver currently only supports SOCKS proxies, and I believe this is the reason why we are getting this error. So, I ...

6 More Replies
knowAsha
by New Contributor II
  • 3492 Views
  • 3 replies
  • 3 kudos

Error while running the data engineering course notebook: "DE 2.2 - Providing Options for External Sources"

Can somebody help me fix this problem? I am running this notebook on Databricks Community Edition.
Latest Reply
lemfo
New Contributor II
  • 3 kudos

# The snippet from the reply, reformatted; 'conn' is assumed to be a sqlite3 or SQLAlchemy
# connection created elsewhere in the notebook, e.g. conn = sqlite3.connect("users.db").
df = spark.read.format("parquet").load(path=datasource_path)   # read the course parquet data
pdf = df.select("*").toPandas()                                # convert the Spark DataFrame to pandas
pdf.to_sql("users", conn, if_exists="replace", index=False)    # write the pandas data through the SQL connection

2 More Replies
Hubert-Dudek
by Esteemed Contributor III
  • 2925 Views
  • 2 replies
  • 8 kudos

Implementing a data vault model in Databricks can be challenging, but it can significantly improve the manageability of your data, particularly in heavily regulated industries such as banking. While it may involve significant data duplication, duplic...
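
As a small illustration of the modeling pattern discussed above, here is a hedged sketch of a hub and a satellite as Delta tables; the table and column names follow generic data vault conventions and are not taken from the post.

# Sketch only: a minimal hub + satellite pair for a data vault layout (illustrative names).
spark.sql("""
    CREATE TABLE IF NOT EXISTS hub_customer (
      customer_hash_key     STRING,
      customer_business_key STRING,
      load_ts               TIMESTAMP,
      record_source         STRING
    ) USING DELTA
""")

spark.sql("""
    CREATE TABLE IF NOT EXISTS sat_customer_details (
      customer_hash_key STRING,
      load_ts           TIMESTAMP,
      record_source     STRING,
      name              STRING,
      email             STRING
    ) USING DELTA
""")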

Latest Reply
Priyag1
Honored Contributor II
  • 8 kudos

helpful

1 More Replies

Connect with Databricks Users in Your Area

Join a Regional User Group to connect with local Databricks users. Events will be happening in your city, and you won’t want to miss the chance to attend and share knowledge.

If there isn’t a group near you, start one and help create a community that brings people together.
