Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

inesandres567
by New Contributor
  • 1135 Views
  • 1 reply
  • 0 kudos

Failed to start cluster

I tried to start a cluster that I had started 7 times before, and it gave me this error: Cloud provider is undergoing a transient resource throttling. This is retryable. 1 out of 2 pods scheduled. Failed to launch cluster in kubernetes in 1800 seconds...

Latest Reply
Debayan
Databricks Employee
  • 0 kudos

Hi, this error "GCE out of resources" typically means that Google Compute Engine is out of resources, as in out of nodes (this can be a quota issue, or node issues in that particular region in GCP). Could you please raise a Google support case on thi...

  • 0 kudos
mderela
by New Contributor II
  • 6410 Views
  • 3 replies
  • 1 kudos

Ingestion time clustering

Hello, in reference to https://www.databricks.com/blog/2022/11/18/introducing-ingestion-time-clustering-dbr-112.html I have a silly question about how to use it. So let's assume that I have a few TB of non-partitioned data. So, if I would like to query on dat...

Simon_T
by New Contributor III
  • 5511 Views
  • 0 replies
  • 0 kudos

Databricks Terraform Cluster Issue.

Error: default auth: cannot configure default credentials. Config: token=***. Env: DATABRICKS_TOKEN
on cluster.tf line 27, in data "databricks_spark_version" "latest_lts":
  27: data "databricks_spark_version" "latest_lts" {

diego_poggioli
by Contributor
  • 2738 Views
  • 1 reply
  • 1 kudos

Resolved! Run tasks conditionally "Always" condition missing?

Does the new feature 'Run If', which allows you to run tasks conditionally, lack an 'ALWAYS' option to execute the task both when the dependencies succeed and when they fail?

Latest Reply
Lakshay
Databricks Employee
  • 1 kudos

You can choose the All Done option to run the task in both scenarios.

  • 1 kudos
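The "All Done" choice described in the reply corresponds to the `run_if` field in the Databricks Jobs API (value `ALL_DONE`). A minimal sketch of a task payload using it; the task keys and notebook path here are illustrative, not from the thread:

```python
# Hypothetical Jobs API task payload: "run_if": "ALL_DONE" makes the task
# run after all of its dependencies finish, whether they succeeded or failed.
cleanup_task = {
    "task_key": "cleanup",
    "depends_on": [{"task_key": "ingest"}, {"task_key": "transform"}],
    "run_if": "ALL_DONE",  # run regardless of upstream success or failure
    "notebook_task": {"notebook_path": "/Shared/cleanup"},
}
```

Other `run_if` values (e.g. `ALL_SUCCESS`, `AT_LEAST_ONE_FAILED`) cover the remaining conditional cases.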
kurtrm
by New Contributor III
  • 6893 Views
  • 2 replies
  • 3 kudos

How to send alert when cluster is running for too long

Hello, our team recently experienced an issue where a teammate started a new workflow job and then went on vacation. This job ended up running continuously, without failing, for 4.5 days. The usage of the cluster did not seem out of place during the workday...

Latest Reply
kurtrm
New Contributor III
  • 3 kudos

@Retired_mod, I ended up creating a job leveraging the Databricks Python SDK to check cluster and active job run times. The script will raise an error and notify the team if the cluster hasn't terminated or restarted in the past 24 hours, or if a job h...

  • 3 kudos
1 More Replies
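The check described in that reply can be sketched as a small pure function over the millisecond epoch timestamps the clusters API returns, with the SDK call shown as hedged usage (field names may vary by SDK version, and the threshold is an assumption):

```python
from datetime import datetime, timedelta, timezone
from typing import Optional

def cluster_running_too_long(start_time_ms: int,
                             now: Optional[datetime] = None,
                             max_hours: float = 24.0) -> bool:
    """Return True if a cluster started more than max_hours ago.

    start_time_ms is a millisecond epoch timestamp, as the Databricks
    clusters API reports cluster start times.
    """
    now = now or datetime.now(timezone.utc)
    started = datetime.fromtimestamp(start_time_ms / 1000, tz=timezone.utc)
    return now - started > timedelta(hours=max_hours)

# Hedged usage with the Databricks Python SDK (not from the thread;
# attribute names may differ slightly between SDK versions):
# from databricks.sdk import WorkspaceClient
# w = WorkspaceClient()
# for c in w.clusters.list():
#     if c.state and c.state.value == "RUNNING" and cluster_running_too_long(c.start_time):
#         raise RuntimeError(f"Cluster {c.cluster_name} has been up for over 24h")
```

Running this on a schedule as its own small job, as the reply describes, turns the check into an alert.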
karankumara
by New Contributor
  • 1046 Views
  • 0 replies
  • 0 kudos

DBX Sync Command --unmatched-behaviour=unspecified-delete-unmatched not working

We are using the dbx command to sync objects from local to the Databricks workspace, using the command below to sync the data: dbx sync workspace --unmatched-behaviour=unspecified-delete-unmatched -s /tmp -d /tmp. We have deleted some files loca...

apiury
by New Contributor III
  • 714 Views
  • 0 replies
  • 0 kudos

Analyze data methodology

Hello, I have an ETL process that ingests data into bronze tables, transforms the data, and then ingests it into silver tables before finally populating the gold tables. This workflow is executed every 5 minutes. When I want to analyze the data or app...

DavidValdez
by New Contributor II
  • 1124 Views
  • 1 reply
  • 0 kudos

I can't access my account

Hi, I can't access my account and need to book an exam. I completed my registration at https://www.webassessor.com/form/createAccount.do, and when I try to log in I get this error: "Login or Password is incorrect". Please help me with this issue. ...

Latest Reply
Cert-Team
Databricks Employee
  • 0 kudos

Hi @DavidValdez, it looks like you were able to schedule your exam. If you experience any other issues, you can request support here. We also have a new FAQ: https://www.databricks.com/learn/certification/faq

  • 0 kudos
kp12
by New Contributor II
  • 12575 Views
  • 1 reply
  • 0 kudos

Accessing TenantId via secret to connect to Azure Data Lake Storage Gen2 doesn't work

Hello, I'm following the instructions in this article to connect to ADLS Gen2 using an Azure service principal. I can access the service principal's app id and secret via a Databricks Key Vault-backed secret scope. However, this doesn't work for directory-id, and I...

Latest Reply
kp12
New Contributor II
  • 0 kudos

Hi @Retired_mod, thanks for the prompt reply. As per the document, the syntax for accessing keys from a secret scope in the Spark config is the text highlighted in red below. I used the same for the app id too, and that works. But if I use the same syntax for ...

  • 0 kudos
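For context, the secret-scope placeholder syntax under discussion looks roughly like the following; the storage account, scope, and key names are illustrative, and per the thread the placeholder reportedly resolves for the client id and secret but not when the tenant (directory) id is embedded in the token endpoint URL:

```python
# Hypothetical Spark config entries for ADLS Gen2 OAuth with a service
# principal. "{{secrets/<scope>/<key>}}" is the Databricks secret-scope
# placeholder syntax; all scope/key names below are made up for the sketch.
storage = "mystorageaccount"
conf = {
    f"fs.azure.account.auth.type.{storage}.dfs.core.windows.net": "OAuth",
    f"fs.azure.account.oauth.provider.type.{storage}.dfs.core.windows.net":
        "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    f"fs.azure.account.oauth2.client.id.{storage}.dfs.core.windows.net":
        "{{secrets/kv-scope/sp-app-id}}",
    f"fs.azure.account.oauth2.client.secret.{storage}.dfs.core.windows.net":
        "{{secrets/kv-scope/sp-secret}}",
    # The thread reports the placeholder failing inside this endpoint URL:
    f"fs.azure.account.oauth2.client.endpoint.{storage}.dfs.core.windows.net":
        "https://login.microsoftonline.com/{{secrets/kv-scope/tenant-id}}/oauth2/token",
}
```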
rohithmalla
by New Contributor
  • 1332 Views
  • 1 reply
  • 0 kudos

Snowflake Data Formatting Issue

I'm loading Snowflake data into Delta tables in Databricks. A few columns in the Snowflake data have datatype NUMBER(20,7); after loading to the Delta table it is taken as decimal(20,7). For example, if the value is 0.0000000 in Snowflake, then it is showing...

Latest Reply
-werners-
Esteemed Contributor III
  • 0 kudos

Explicit casting seems like the way to go. First try with one column, to see if that solves your issue. If so, you can write a function that casts all decimal columns to a certain precision, something like this: def convert_decimal_precision_scale(df, p...

  • 0 kudos
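The reply's function is truncated, but the approach it describes can be sketched as building CAST expressions for every decimal column and applying them in one pass. The schema is modeled here as (column name, type string) pairs so the logic can be shown without a running Spark session; the function name and default precision are assumptions:

```python
# Hypothetical helper: build selectExpr strings that cast every decimal
# column to a uniform DECIMAL(precision, scale), passing other columns
# through unchanged.
def decimal_cast_exprs(schema, precision=20, scale=7):
    exprs = []
    for name, dtype in schema:
        if dtype.startswith("decimal"):
            exprs.append(f"CAST(`{name}` AS DECIMAL({precision},{scale})) AS `{name}`")
        else:
            exprs.append(f"`{name}`")
    return exprs
```

On a real PySpark DataFrame this could be applied as `df.selectExpr(*decimal_cast_exprs(df.dtypes))`, since `df.dtypes` returns exactly such (name, type-string) pairs.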
Mumrel
by Contributor
  • 2807 Views
  • 4 replies
  • 1 kudos

Why is importing Python code supported in Repos but not in Workspaces?

Hi, we currently use a one-repo approach which does not require a local development environment (we utilize Azure DevOps and Nutter for automated tests). We also have shared code across pipelines and started with %run-style modularization and have ...

Latest Reply
-werners-
Esteemed Contributor III
  • 1 kudos

The why is most probably because of different development tracks/teams between workspace and repos. Will they consolidate in functionality? Can't tell; only Databricks knows that, but it seems reasonable to assume the files will also be added to w...

  • 1 kudos
3 More Replies
DeltaTrain
by New Contributor II
  • 1819 Views
  • 3 replies
  • 0 kudos

hive_metastore Access Control by different cluster type

Hello Databricks Community, I'm reaching out with a query regarding access control in the hive_metastore. I've encountered behavior that I'd like to understand better and potentially address. To illustrate the situation: I've set up three users for test...

Latest Reply
DeltaTrain
New Contributor II
  • 0 kudos

Hi @Debayan, thank you for your reply. With hive_metastore, I still cannot get that level of isolation, which means that if anyone activates a single-node cluster, she/he can see all the catalogs, schemas, and tables. However, with Unity Catalog appli...

  • 0 kudos
2 More Replies
BobBubble2000
by New Contributor II
  • 1711 Views
  • 0 replies
  • 0 kudos

Workflows pricing

Hi there, I checked the Databricks page on the pricing of Databricks Workflows (https://www.databricks.com/product/pricing/jobs) and have a question regarding the cost components: the pricing page only mentions compute costs (depending on whether it's ...

