Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.
Forum Posts

barnita99
by New Contributor II
  • 734 Views
  • 3 replies
  • 0 kudos

No confirmation mail received after scheduling the exam

Hi team, I have scheduled my Databricks Data Engineer Associate exam for 12th Feb 2025 using the below email ID, but I still have not received any confirmation email there. I have checked the spam folder too. Could you please resend it to barnitac@kpmg.com ...

Latest Reply
barnita99
New Contributor II
  • 0 kudos

Hi team, I have cleared my exam today. Unfortunately, I have not received a single email, either to confirm my exam or to confirm test completion and the result. @Cert-Team 

2 More Replies
Bala_K
by New Contributor II
  • 761 Views
  • 2 replies
  • 1 kudos

Partnership with Databricks

Hello, what are the prerequisites to become a Databricks partner?

Latest Reply
Advika_
Databricks Employee
  • 1 kudos

Hello @Bala_K! For information on becoming a Databricks partner, please email partnerops@Databricks.com. They can guide you through the prerequisites and next steps.

1 More Replies
akshaym0056
by New Contributor
  • 2575 Views
  • 0 replies
  • 0 kudos

How to Define Constants at Bundle Level in Databricks Asset Bundles for Use in Notebooks?

I'm working with Databricks Asset Bundles and need to define constants at the bundle level based on the target environment. These constants will be used inside Databricks notebooks. For example, I want a constant gold_catalog to take different values ...
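One common pattern (a sketch; the variable name gold_catalog is taken from the question, and the target and catalog names are illustrative) is to declare a bundle-level variable in databricks.yml, override it per target, and hand it to the notebook as a task parameter:

```yaml
# databricks.yml (sketch; variable, target, and catalog names are illustrative)
variables:
  gold_catalog:
    description: Catalog name for gold tables
    default: gold_dev

targets:
  dev:
    variables:
      gold_catalog: gold_dev
  prod:
    variables:
      gold_catalog: gold_prod

# In a job task definition, pass the value to the notebook:
#   notebook_task:
#     notebook_path: ./my_notebook.py
#     base_parameters:
#       gold_catalog: ${var.gold_catalog}
```

Inside the notebook, the value can then be read with `dbutils.widgets.get("gold_catalog")`.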

kiko_roy
by Contributor
  • 1632 Views
  • 3 replies
  • 0 kudos

Getting access token error when connecting from azure databricks to GCS bucket

I am creating a DataFrame by reading a table's data residing in an Azure-backed Unity Catalog. I need to write the DataFrame or file to a GCS bucket. I have configured the Spark cluster config using the GCP service account JSON values. Also tried uploadi...

Latest Reply
KristiLogos
Contributor
  • 0 kudos

@kiko_roy unfortunately that didn't work. The error states it's trying to get the access token from the metadata server; I wonder why from the metadata server?
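For context, the GCS connector falls back to the GCE instance metadata server when it cannot find explicit credentials in the Spark config, which is typically what this error means on Azure (there is no GCP metadata server to reach). A sketch of the cluster Spark config keys usually required (values are placeholders taken from the service-account JSON; the secret scope and key names are assumptions):

```
spark.hadoop.google.cloud.auth.service.account.enable true
spark.hadoop.fs.gs.auth.service.account.email <client_email from the SA JSON>
spark.hadoop.fs.gs.project.id <project_id from the SA JSON>
spark.hadoop.fs.gs.auth.service.account.private.key {{secrets/<scope>/<private-key>}}
spark.hadoop.fs.gs.auth.service.account.private.key.id {{secrets/<scope>/<private-key-id>}}
```

If any of these keys are missing or misspelled, the connector silently falls back to metadata-server authentication, producing exactly this token error.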

2 More Replies
lukaseder
by New Contributor II
  • 672 Views
  • 2 replies
  • 0 kudos

How to fetch nested data structures in Databricks using JDBC

I've also asked this question on Stack Overflow. When using nested data structures in Databricks (e.g. `ARRAY` or `ROW`) over JDBC, it appears that the results can be fetched as JSON `String` values, e.g.: try (Statement s = connection.createStat...
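Until the driver exposes native structures, one workaround (a sketch, not specific to JDBC; the sample value is hypothetical) is to parse the JSON text the driver hands back, since the nesting is preserved in the JSON encoding:

```python
import json

# Hypothetical string value as returned by the driver
# for an ARRAY<STRUCT<a: INT>> column:
raw = '[{"a": 1}, {"a": 2}]'

# Standard JSON parsing recovers the nested structure client-side.
rows = json.loads(raw)
print(rows[1]["a"])  # -> 2
```

The Java-side equivalent would be to feed the same string into any JSON library (e.g. Jackson) after `ResultSet.getString(...)`.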

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hi @lukaseder, this looks to be a bug, but I will get more details about it internally. The driver is relatively new. Have you tried another version, for instance JDBC 2.6.40?

1 More Replies
AChang
by New Contributor III
  • 6201 Views
  • 2 replies
  • 2 kudos

Model Serving Endpoint keeps failing with SIGKILL error

I am trying to deploy a model in the serving endpoints section, but it keeps failing after attempting to create for an hour. Here are the service logs: Container failed with: 9 +0000] [115] [INFO] Booting worker with pid: 115[2023-09-15 19:15:35 +0000...

Labels: Get Started Discussions, model registry, Model serving, serving endpoint, serving endpoints
Latest Reply
Alberto_Umana
Databricks Employee
  • 2 kudos

Hello @AChang, this is a common issue when the memory requirements of your model exceed the available memory on your current compute resources. Moving to a larger compute instance with more memory can help accommodate the memory requirements of yo...
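As a sketch of what "moving to a larger instance" looks like in practice (the entity name and version are placeholders, and the exact field set depends on your workspace), the serving endpoint's served-entity config is where the workload size is raised:

```json
{
  "served_entities": [
    {
      "entity_name": "my_model",
      "entity_version": "1",
      "workload_size": "Medium",
      "scale_to_zero_enabled": true
    }
  ]
}
```

Stepping `workload_size` up from `Small` gives each serving container more memory, which is usually what resolves out-of-memory SIGKILLs during model load.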

1 More Replies
shiv_DB25
by New Contributor II
  • 2994 Views
  • 1 replies
  • 0 kudos

Getting error while installing applicationinsights

Library installation attempted on the driver node of cluster 0210-115502-3lo6gkwd and failed. Pip could not find a version that satisfies the requirement for the library. Please check your library version and dependencies. Error code: ERROR_NO_MATCHI...

Latest Reply
shiv_DB25
New Contributor II
  • 0 kudos

applicationinsights is the library I am trying to install on the cluster from PyPI.

eranna_kc
by New Contributor III
  • 1259 Views
  • 6 replies
  • 1 kudos

Facing issue with Databricks Notification with Azure Python webhook

Hi All, could you help resolve an issue with Azure Databricks notifications? The trigger to the webhook is not firing whenever a job fails or succeeds. I have created a webhook in an Azure Automation account, with a Python webhook, and ...

Latest Reply
eranna_kc
New Contributor III
  • 1 kudos

Can anybody please help with this issue?

5 More Replies
AnoopS1
by New Contributor II
  • 782 Views
  • 2 replies
  • 0 kudos

Azure databricks connector issue while connecting Power BI to Databricks hosted on AWS

We have set up the integration between Power BI and Databricks (hosted on AWS) using the native Databricks connector. However, we require the Azure Databricks connector to utilize RBAC from Unity Catalog. We followed all the prerequisites mentioned here, ...

Latest Reply
AnoopS1
New Contributor II
  • 0 kudos

OAuth is already enabled in Databricks and all the prerequisites are in place.

1 More Replies
rsh48
by New Contributor
  • 483 Views
  • 1 reply
  • 0 kudos

Unable to create DBX account: "Unable to signup. Please try again, or sign in here..."

Hi, I have been trying to create an account on Databricks Community Edition but have been unable to for the past few weeks. Can someone from the Databricks support team look into it? I have been getting this error for the past few weeks, and I am unable to attach the HAR file. Do ...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hi @rsh48, could you please send an email to community@databricks.com and share your email details? Thanks.

  • 0 kudos
Phani1
by Valued Contributor II
  • 4112 Views
  • 1 reply
  • 0 kudos

Databricks On-Premises or in Private Cloud

Hi All, is it possible to store/process the data on-premises or in a private cloud with Databricks? Will this choice affect costs and performance? Please advise, as the customer wants the data stored on-premises or in a private cloud for security reas...

Latest Reply
Takuya-Omi
Valued Contributor III
  • 0 kudos

@Phani1 Databricks does not provide a product that can be directly installed and self-managed on on-premises or private cloud environments. Instead, Databricks primarily operates as a managed service on public cloud platforms such as AWS, Azure, and ...

  • 0 kudos
Phani1
by Valued Contributor II
  • 599 Views
  • 1 replies
  • 1 kudos

Multiple metastores in Unity Catalog

Hi All, can we have more than one metastore in a region in Unity Catalog? Having a single metastore per region helps keep metadata management organized, but customers are asking for multiple metastores for different needs. Is it possible to have se...

Latest Reply
Takuya-Omi
Valued Contributor III
  • 1 kudos

@Phani1 When attempting to create multiple metastores in a specific region, you may encounter the following error: "This region already contains a metastore. Only a single metastore per region is allowed." Databricks recommends having only one metastore...

dbxlearner
by New Contributor II
  • 496 Views
  • 1 reply
  • 0 kudos

Clarification on VACUUM LITE operation

Hi all, I wanted some insight and clarification on the VACUUM LITE command (VACUUM | Databricks on AWS). I am aware that the VACUUM FULL command deletes data files outside of the retention duration and all files in the table directory not refere...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hello @dbxlearner, if you set your retention duration to 3 days for a VACUUM LITE operation, it means the command will use the Delta transaction log to identify and remove files that are no longer referenced by any table versions within the last...
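For illustration, the 3-day retention from the reply expressed as a statement (a sketch; the table name is a placeholder):

```sql
-- LITE mode consults the Delta transaction log rather than listing
-- the table directory; 3 days = 72 hours of retention.
VACUUM my_catalog.my_schema.my_table LITE RETAIN 72 HOURS;
```

Because LITE only considers files named in the log, files orphaned outside the transaction log are left for a later VACUUM FULL.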

pksilver
by New Contributor III
  • 1695 Views
  • 10 replies
  • 4 kudos

Custom Tag Usage Reporting

Hi Team, does anyone have a good SQL query that I can use for showing usage costs against custom tags, for example on clusters? The Account Console usage report is good, but I can only seem to query one custom tag at a time, and ideally I want a dashboar...
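One approach (a sketch against the system billing tables; the tag keys 'team' and 'project' are examples, and the price join is simplified) is to pull several custom tags out of the `custom_tags` map in one query:

```sql
-- Sketch: estimated cost per day broken out by two custom tags at once.
SELECT
  u.usage_date,
  u.custom_tags['team']    AS team,
  u.custom_tags['project'] AS project,
  SUM(u.usage_quantity * p.pricing.default) AS est_list_cost
FROM system.billing.usage u
LEFT JOIN system.billing.list_prices p
  ON  u.sku_name = p.sku_name
  AND u.usage_start_time >= p.price_start_time
  AND (p.price_end_time IS NULL OR u.usage_start_time < p.price_end_time)
GROUP BY ALL
ORDER BY u.usage_date;
```

Each additional tag is just another `custom_tags['<key>']` column, so a dashboard can filter or group on as many tags as needed.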

Latest Reply
Alberto_Umana
Databricks Employee
  • 4 kudos

For reference:  

9 More Replies
parzival
by New Contributor III
  • 2631 Views
  • 10 replies
  • 2 kudos

Unable to login to Community Edition

Facing the below issue: We were not able to find a Community Edition workspace with this email. Please login to accounts.cloud.databricks.com to find the non-community-edition workspaces you may have access to. For help, please see Community Edition Lo...

Latest Reply
pakkufab1998
New Contributor III
  • 2 kudos

Hi All, now I can neither sign up from my account nor log in. No response from them, so good luck to everyone out there who's trying to learn this tool.

9 More Replies
