Data Engineering

Forum Posts

Hubert-Dudek
by Esteemed Contributor III
  • 658 Views
  • 1 replies
  • 4 kudos


Spark 3.4 and Databricks 13 introduce two new types of timestamps for handling time zone information: TIMESTAMP WITH LOCAL TIME ZONE: this type assumes that the input data is in the session's local time zone and converts it to UTC before processing....
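The local-time-zone semantics described above can be sketched in plain Python (the zone name and timestamp here are made-up illustrations, not values from the post):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# TIMESTAMP WITH LOCAL TIME ZONE semantics: interpret the wall-clock
# value in the session's time zone, then normalize to UTC for storage.
session_tz = ZoneInfo("America/New_York")   # hypothetical session time zone
wall_clock = datetime(2023, 6, 1, 12, 0, 0)  # value as entered, no tzinfo

as_local = wall_clock.replace(tzinfo=session_tz)   # attach session zone
stored_utc = as_local.astimezone(timezone.utc)     # convert to UTC

print(stored_utc.isoformat())  # 2023-06-01T16:00:00+00:00 (EDT is UTC-4)
```

The same wall-clock value stored from a different session time zone would normalize to a different UTC instant, which is exactly why this type matters for cross-region pipelines.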

timezone
Latest Reply
Anonymous
Not applicable
  • 4 kudos

This is helpful! As we all know, timestamps are a common source of broken business logic.

185369
by New Contributor II
  • 810 Views
  • 3 replies
  • 1 kudos

Resolved! DLT with UC Access Denied sqs

I am going to use the newly released DLT with UC, but it keeps getting access denied. As I track down the reason, it seems that an account ID other than my account ID or the Databricks account ID is being requested. I cannot use '*' in the principal attri...

Latest Reply
Priyag1
Honored Contributor II
  • 1 kudos

On AWS, an SQS queue and every other service in your stack that uses that queue should be configured with minimal permissions, which can lead to access issues if a permission is missing. So, make sure you get your IAM policies set up correctly before deploying to producti...
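As a sketch of what a minimally-scoped queue policy looks like without the wildcard principal, the JSON below can be built and inspected in Python. Every account ID, region, ARN, and queue name here is a placeholder, not a value from this thread:

```python
import json

# Hypothetical SQS queue policy granting access to one specific AWS
# account principal instead of "*". All IDs/ARNs are placeholders.
queue_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"AWS": "arn:aws:iam::123456789012:root"},
        "Action": [
            "sqs:ReceiveMessage",
            "sqs:DeleteMessage",
            "sqs:GetQueueAttributes",
        ],
        "Resource": "arn:aws:sqs:us-east-1:123456789012:my-dlt-queue",
    }],
}

print(json.dumps(queue_policy, indent=2))
```

The key point is the explicit `Principal` and `Resource`: if the account ID being requested differs from the one in the policy, the request is denied even though the policy itself is valid.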

2 More Replies
Kaniz
by Community Manager
  • 2195 Views
  • 5 replies
  • 11 kudos
Latest Reply
Aviral-Bhardwaj
Esteemed Contributor III
  • 11 kudos

Thanks @Kaniz Fatma​ for selecting this as the best answer. Keep adding questions so that we can share our views, people get guidance, and the Databricks community can grow.

4 More Replies
J_
by New Contributor II
  • 6558 Views
  • 7 replies
  • 6 kudos

Resolved! Clusters stuck on pending indefinitely (community edition)

Since yesterday, clusters suddenly do not start and remain in the pending state indefinitely (more than 30 minutes). Following a previous post, I tried adding port 443 to the firewall, but it doesn't help. On the clusters page, the message says: Finding instan...

Latest Reply
Reet
New Contributor II
  • 6 kudos

I am also having the same issue, and there seems to be no outage.

6 More Replies
bakiya
by New Contributor II
  • 1516 Views
  • 8 replies
  • 0 kudos

I successfully passed the test after completing the course, but I haven't received any badge from your side as promised. I have been provided with a certificate which looks *****. Please provide me with the badge. My certificate ID is E-03DK31.

I tried logging in to https://credentials.databricks.com/ with my registered email address and was not able to find any badge there. I need the badge for the test I took and passed; please do the needful.

Latest Reply
ftmoztl
New Contributor II
  • 0 kudos

Hello! The same situation applies to me; I couldn't reach my badge from the following link. Could you help me too? @Vidula Khanna​ https://customer-academy.databricks.com/legacy/lms/index.php%3Fr%3DmyActivities/index%26tab%3Dbadges

7 More Replies
Priyag1
by Honored Contributor II
  • 361 Views
  • 0 replies
  • 9 kudos


New unified Databricks navigation - new release to public preview, check it out: Databricks released a new navigation experience to public preview. The goal is to reduce the clicks and context switches required to complete tasks. The new experience include...

Priyag1
by Honored Contributor II
  • 319 Views
  • 0 replies
  • 9 kudos


Databricks Marketplace: an open forum for exchanging data products. Databricks Marketplace takes advantage of Delta Sharing to give data providers the tools to share data products securely and data consumers the power to explore...

PareDesa_10157
by New Contributor II
  • 471 Views
  • 1 replies
  • 1 kudos

My cluster shows this message from time to time; what is the permanent solution?

Message: Cluster terminated. Reason: Spark Image Download Failure. Help: Failed to set up Spark container due to an image download failure: Exception when downloading spark image: Stdout: /usr/local/bin/fastar /usr/local/bin/fastar. I have tried enabling the ...

Latest Reply
Debayan
Esteemed Contributor III
  • 1 kudos

Hi, this error can surface due to various reasons, such as networking issues or slowness. Are there any other significant errors showing up in the driver logs? Is this happening on a single cluster? Is this a new workspace? You can start by checking on the ...

duliu
by New Contributor II
  • 2781 Views
  • 6 replies
  • 0 kudos

databricks-connect fails with java.lang.IllegalStateException: No api token found in local properties

I configured databricks-connect locally and want to run Spark code against a remote cluster. I verified that `databricks-connect test` passes and it can connect to the remote Databricks cluster. However, when I query a table or read parquet from S3, it fails wi...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

@Du Liu​: The error message suggests that no API token was found in the local properties. This could be the cause of the failure when trying to access tables or read parquet files from S3. To fix this issue, you need to ensure that the API to...

5 More Replies
Sid0610
by New Contributor II
  • 1434 Views
  • 3 replies
  • 3 kudos

Resolved! Databricks SQL CREATE TABLE ParseException

I am trying to use the following code to create a Delta table: %sql CREATE TABLE rectangles(a INT, b INT, area INT GENERATED ALWAYS AS IDENTITY (START WITH 1, STEP BY 1)) I don't know why, but I am always getting a ParseException. I tried all other ...
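One likely cause of the ParseException in the statement above is the `STEP BY` keyword: Databricks SQL identity columns use `INCREMENT BY`, and the identity column must be BIGINT. A sketch of the corrected DDL, keeping the table and column names from the post (the `id` column name is an assumption, since an identity column cannot also hold a computed area):

```sql
-- Identity columns in Databricks SQL use INCREMENT BY (not STEP BY)
-- and must be declared as BIGINT.
CREATE TABLE rectangles (
  a INT,
  b INT,
  id BIGINT GENERATED ALWAYS AS IDENTITY (START WITH 1 INCREMENT BY 1)
);

-- If the intent was a computed area rather than an identity column,
-- a generated column would be the fit instead:
--   area INT GENERATED ALWAYS AS (a * b)
```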

Latest Reply
emiratesevisaon
New Contributor II
  • 3 kudos

How can we use SQL for my website emiratesevisaonline.com backend data?

2 More Replies
Johny
by New Contributor III
  • 1127 Views
  • 2 replies
  • 4 kudos

Insert data to a CDF-enabled Delta table throwing java.lang.StackOverflowError

I am building a bronze table with CDF enabled in these steps: initially, reading JSON files from the landing zone and writing to the table location: df = spark.readStream.format("cloudFiles") \ .option("cloudFiles.schemaLocation", <schema_loc>) \ .option("clou...

Latest Reply
Johny
New Contributor III
  • 4 kudos

I tried with a simple CSV file that has only one column. I got the same error.

1 More Replies
varunsaagar
by New Contributor III
  • 4714 Views
  • 18 replies
  • 31 kudos

Request for reattempt voucher. Databricks Certified Machine Learning Professional exam

Hi, on December 28th I attempted the Databricks Certified Machine Learning Professional exam for the first time; unfortunately, I ended up with a failing grade. The passing grade was 70%, and I received 68.33%. I am planning to reattempt the exam. Could you kindl...

Latest Reply
girl_chan
New Contributor II
  • 31 kudos

What is the next event where they will give a voucher?

17 More Replies
JKR
by New Contributor III
  • 1411 Views
  • 2 replies
  • 0 kudos

The spark driver has stopped unexpectedly and is restarting. Your notebook will be automatically reattached.

Getting the below error. Context: using a Databricks shared interactive cluster to run multiple parallel scheduled jobs at the same time, every 5 minutes. When I check Ganglia, the driver node's memory reaches almost max, then a restart of the driver happens an...

Latest Reply
jose_gonzalez
Moderator
  • 0 kudos

Please check the driver's logs, for example the log4j and GC logs.

1 More Replies
jonasmin
by New Contributor III
  • 5179 Views
  • 7 replies
  • 2 kudos

Error while establishing JDBC connection to Azure databricks via HTTP proxy

I am using the Databricks JDBC driver (https://databricks.com/spark/jdbc-drivers-download) to connect to Azure Databricks. The connection needs to be routed through an HTTP proxy. I found parameters that can be configured for using the HTTP proxy: by pa...

databricks jdbc
Latest Reply
MS_Varma
New Contributor II
  • 2 kudos

Hi @Jonas Minning​, I am actually having the same issue, and when I looked into the driver documentation I found that the driver currently only supports SOCKS proxies; I believe this is why we are getting this error. So, I ...

6 More Replies