Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

EngHol
by New Contributor
  • 3410 Views
  • 1 reply
  • 0 kudos

Error uploading files to a Unity Catalog volume in Databricks

Hi everyone, I'm developing an API in Flask that interacts with Databricks to upload files to a Unity Catalog volume, but I'm encountering the following error: {"error_code": "ENDPOINT_NOT_FOUND", "message": "No API found for 'POST /unity-catalo...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hello @EngHol, the endpoint /api/2.0/unity-catalog/volumes/upload is not a valid one, hence the error. Looking at the Volumes API, unfortunately it does not provide a way to upload files to a volume: https://docs.databricks.com/api/workspace/volumes
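For anyone hitting the same error, here is a minimal, hedged sketch of the route that is usually suggested instead: writing the file through the Files API rather than the Volumes API. The workspace host, token, and volume path below are placeholders, and the endpoint and its overwrite parameter should be verified against the current Files API documentation.

    import requests

    host = "https://<workspace-host>"                 # placeholder workspace URL
    token = "<personal-access-token>"                 # placeholder credential
    volume_path = "/Volumes/main/default/my_volume/example.csv"  # hypothetical target path

    # PUT the raw bytes to the Files API; 'overwrite' replaces an existing file.
    with open("example.csv", "rb") as f:
        resp = requests.put(
            f"{host}/api/2.0/fs/files{volume_path}",
            headers={
                "Authorization": f"Bearer {token}",
                "Content-Type": "application/octet-stream",
            },
            params={"overwrite": "true"},
            data=f,
        )
    resp.raise_for_status()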

NehaR
by New Contributor III
  • 862 Views
  • 2 replies
  • 0 kudos

Hide function definition in Unity catalog

Hi, I have created a function to anonymize user IDs using a secret. I want to give other users access to this function so they can execute it without giving them access to the secret. Is this possible in Databricks? I have tested it and see the user is not able ...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hi @NehaR, I am afraid it might not be possible without giving secret access to the users. Another approach would be to use a Service Principal.
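For illustration, a hedged sketch of the pattern under discussion, with all names hypothetical: a Unity Catalog SQL function that calls secret(), with EXECUTE granted to a group. As the reply notes, the invoker typically still needs read access to the underlying secret scope for the call to succeed.

    # Hypothetical catalog/schema, function, group, and secret scope/key names.
    spark.sql("""
        CREATE OR REPLACE FUNCTION main.default.anonymize_user(user_id STRING)
        RETURNS STRING
        RETURN sha2(concat(user_id, secret('my_scope', 'salt_key')), 256)
    """)

    # Grant execute on the function without granting the secret itself.
    spark.sql("GRANT EXECUTE ON FUNCTION main.default.anonymize_user TO `data_analysts`")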

1 More Replies
mrstevegross
by Contributor III
  • 979 Views
  • 1 reply
  • 1 kudos

Resolved! Container lifetime?

When launching a job via "Create and trigger a one-time run" (docs), when using a custom image (docs), what's the lifetime of the container? Does it create the cluster, start the container, run the job, then terminate the container? Or does the runni...

Latest Reply
Alberto_Umana
Databricks Employee
  • 1 kudos

Hi @mrstevegross, Cluster Creation: When you submit a job using the "Create and trigger a one-time run" API, a new cluster is created if one is not specified. Container Start: The custom Docker image specified in the cluster configuration is us...
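To make the flow concrete, here is a hedged sketch of a one-time run submitted with a custom image via the Jobs 2.1 runs/submit endpoint; the cluster (and its containers) exists only for the duration of the run. The workspace URL, token, notebook path, runtime version, node type, and image are placeholders.

    import requests

    payload = {
        "run_name": "one-time-run-with-custom-image",
        "tasks": [{
            "task_key": "main",
            "notebook_task": {"notebook_path": "/Workspace/Users/me/my_notebook"},
            "new_cluster": {
                "spark_version": "15.4.x-scala2.12",   # placeholder runtime
                "node_type_id": "i3.xlarge",           # placeholder node type
                "num_workers": 1,
                # Custom container image used when the run's cluster starts.
                "docker_image": {"url": "my-registry/my-image:latest"},
            },
        }],
    }

    resp = requests.post(
        "https://<workspace-host>/api/2.1/jobs/runs/submit",
        headers={"Authorization": "Bearer <token>"},
        json=payload,
    )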

Ibrahim1
by New Contributor
  • 3383 Views
  • 0 replies
  • 0 kudos

DLT detecting changes but not applying them

We have three source tables used for a streaming dimension table in silver. Around 50K records are changed in one of the source tables, and the DLT pipeline shows that it has updated those 50K records, but they remain unchanged. The only way to pick ...
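For context only, a hedged sketch of the kind of Delta Live Tables CDC flow described above: a streaming silver dimension fed by changing source records. Table names, keys, and the sequencing column are assumptions, not the poster's code.

    import dlt
    from pyspark.sql.functions import col

    # Hypothetical streaming dimension target in silver.
    dlt.create_streaming_table("silver_dim_customer")

    # Merge changes from a changing source by key, ordered by the sequencing column.
    dlt.apply_changes(
        target="silver_dim_customer",
        source="bronze_customer_changes",   # hypothetical source table/view
        keys=["customer_id"],
        sequence_by=col("updated_at"),
        stored_as_scd_type=1,
    )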

barnita99
by New Contributor II
  • 900 Views
  • 3 replies
  • 0 kudos

No confirmation mail received after scheduling the exam

Hi team, I have scheduled my Databricks Data Engineer Associate exam for 12th Feb 2025 using the below mail ID, but I still have not received any confirmation mail there. I have checked the spam folder too. Could you please resend it to barnitac@kpmg.com ...

Latest Reply
barnita99
New Contributor II
  • 0 kudos

Hi team, I have cleared my exam today. Unfortunately I have not received a single mail either to confirm my exam or to confirm test completion and result. @Cert-Team 

2 More Replies
Bala_K
by New Contributor II
  • 989 Views
  • 2 replies
  • 1 kudos

Partnership with Databricks

Hello, what are the prerequisites to become a Databricks partner?

Latest Reply
Advika_
Databricks Employee
  • 1 kudos

Hello @Bala_K! For information on becoming a Databricks partner, please email partnerops@Databricks.com. They can guide you through the prerequisites and next steps.

1 More Replies
kiko_roy
by Contributor
  • 1817 Views
  • 3 replies
  • 0 kudos

Getting access token error when connecting from azure databricks to GCS bucket

I am creating a DataFrame by reading a table's data residing in an Azure-backed Unity Catalog. I need to write the DataFrame or file to a GCS bucket. I have configured the Spark cluster config using the GCP service account JSON values. Also tried uploadi...

Latest Reply
KristiLogos
Contributor
  • 0 kudos

@kiko_roy Unfortunately that didn't work. The error states it is trying to get the access token from the metadata server; I wonder why from the metadata server?
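For reference, a hedged sketch of the cluster-level Spark configuration typically used to authenticate the GCS connector with a service account key (the project, email, and secret scope references are placeholders). When these properties are not picked up, the connector tends to fall back to the GCE metadata server for credentials, which matches the error described above.

    spark.hadoop.google.cloud.auth.service.account.enable true
    spark.hadoop.fs.gs.project.id <gcp-project-id>
    spark.hadoop.fs.gs.auth.service.account.email <service-account-email>
    spark.hadoop.fs.gs.auth.service.account.private.key {{secrets/<scope>/gcs-private-key}}
    spark.hadoop.fs.gs.auth.service.account.private.key.id {{secrets/<scope>/gcs-private-key-id}}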

2 More Replies
lukaseder
by New Contributor II
  • 823 Views
  • 2 replies
  • 0 kudos

How to fetch nested data structures in Databricks using JDBC

I've also asked this question on Stack Overflow. When using nested data structures in Databricks (e.g. `ARRAY` or `ROW`) over JDBC, it appears that the results can be fetched as JSON `String` values, e.g.: try (Statement s = connection.createStat...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hi @lukaseder, this looks to be a bug, but I will get more details about it internally. The driver is relatively new. Have you tried another version, for instance JDBC 2.6.40?

1 More Replies
AChang
by New Contributor III
  • 6438 Views
  • 2 replies
  • 2 kudos

Model Serving Endpoint keeps failing with SIGKILL error

I am trying to deploy a model in the serving endpoints section, but it keeps failing after attempting to create for an hour. Here are the service logs: Container failed with: 9 +0000] [115] [INFO] Booting worker with pid: 115[2023-09-15 19:15:35 +0000...

Labels: Get Started Discussions, model registry, Model serving, serving endpoint, serving endpoints
Latest Reply
Alberto_Umana
Databricks Employee
  • 2 kudos

Hello @AChang, This is a common issue when the memory requirements of your model exceed the available memory on your current compute resources. Moving to a larger compute instance with more memory can help accommodate the memory requirements of yo...
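As a hedged sketch (not confirmed in the thread), the serving compute is usually chosen when the endpoint is created. The endpoint name, model name, workload type, and size below are placeholders to adapt; GPU workload types are generally the route to more memory per replica.

    import requests

    payload = {
        "name": "my-endpoint",                            # placeholder endpoint name
        "config": {
            "served_entities": [{
                "entity_name": "main.default.my_model",   # hypothetical registered model
                "entity_version": "1",
                "workload_type": "CPU",                   # GPU types generally offer more memory
                "workload_size": "Medium",                # larger than the default Small
                "scale_to_zero_enabled": True,
            }]
        },
    }

    resp = requests.post(
        "https://<workspace-host>/api/2.0/serving-endpoints",
        headers={"Authorization": "Bearer <token>"},
        json=payload,
    )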

1 More Replies
eranna_kc
by New Contributor III
  • 1550 Views
  • 6 replies
  • 1 kudos

Facing issue with Databricks Notification with Azure Python webhook

Hi All, could you help resolve an issue with Azure Databricks notifications? The trigger to the webhook is not firing whenever the job fails or succeeds. I have created a webhook in an Azure Automation account and created a Python webhook and ...

Latest Reply
eranna_kc
New Contributor III
  • 1 kudos

Can anybody please help with this issue?
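In case it helps, a hedged sketch of how job success and failure events are usually routed to a webhook: an admin first creates a notification destination for the webhook URL in workspace settings, and the job then references that destination ID in its webhook_notifications. The job ID and destination ID below are placeholders.

    import requests

    settings_patch = {
        "job_id": 123456789,   # placeholder job ID
        "new_settings": {
            "webhook_notifications": {
                "on_success": [{"id": "<notification-destination-id>"}],
                "on_failure": [{"id": "<notification-destination-id>"}],
            }
        },
    }

    requests.post(
        "https://<workspace-host>/api/2.1/jobs/update",
        headers={"Authorization": "Bearer <token>"},
        json=settings_patch,
    )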

5 More Replies
AnoopS1
by New Contributor II
  • 1030 Views
  • 2 replies
  • 0 kudos

Azure databricks connector issue while connecting Power BI to Databricks hosted on AWS

We have set up the integration between Power BI and Databricks (hosted on AWS) using the native Databricks connector. However, we require the Azure Databricks connector to utilize RBAC from Unity Catalog. We followed all the prerequisites mentioned here,...

Latest Reply
AnoopS1
New Contributor II
  • 0 kudos

OAuth is already enabled in Databricks and all the prerequisites are in place.

1 More Replies
rsh48
by New Contributor
  • 595 Views
  • 1 reply
  • 0 kudos

Unable to create DBX account: "Unable to signup. Please try again, or sign in here..."

Hi, I have been trying to create an account for Databricks Community Edition but have been unable to do so for the past few weeks. Can someone from the Databricks support team look into it? I have been getting this for the past few weeks. Unable to attach the HAR file. Do ...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hi @rsh48, could you please send a mail to community@databricks.com and share your email details? Thanks.

Phani1
by Databricks MVP
  • 6774 Views
  • 1 reply
  • 0 kudos

Databricks On-Premises or in Private Cloud

Hi All, is it possible to store/process the data on-premises or in a private cloud with Databricks? Will this choice affect costs and performance? Please advise, as the customer wants the data stored on-premises or in a private cloud for security reas...

Latest Reply
Takuya-Omi
Valued Contributor III
  • 0 kudos

@Phani1 Databricks does not provide a product that can be directly installed and self-managed on on-premises or private cloud environments. Instead, Databricks primarily operates as a managed service on public cloud platforms such as AWS, Azure, and ...

Phani1
by Databricks MVP
  • 743 Views
  • 1 reply
  • 1 kudos

Multiple metastores in Unity Catalog

Hi All, can we have more than one metastore in a region in Unity Catalog? Having a single metastore per region helps keep metadata management organized, but customers are asking for multiple metastores for different needs. Is it possible to have se...

Latest Reply
Takuya-Omi
Valued Contributor III
  • 1 kudos

@Phani1 When attempting to create multiple metastores in a specific region, you may encounter the following error: "This region already contains a metastore. Only a single metastore per region is allowed." Databricks recommends having only one metastore...

dbxlearner
by New Contributor II
  • 621 Views
  • 1 reply
  • 0 kudos

Clarification on VACUUM LITE operation

Hi all, I wanted some insight and clarification on the VACUUM LITE command (VACUUM | Databricks on AWS). I am aware that the VACUUM FULL command deletes data files outside of the retention duration and all files in the table directory not refere...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hello @dbxlearner, If you set your retention duration to 3 days for a VACUUM LITE operation, it means that the command will use the Delta transaction log to identify and remove files that are no longer referenced by any table versions within the last...
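A hedged worked example of what the reply describes (the table name is a placeholder, and LITE requires a sufficiently recent Databricks Runtime): a LITE vacuum with a 72-hour retention window, optionally previewed with DRY RUN.

    # Preview which files a LITE vacuum would remove for a 72-hour retention window.
    spark.sql("VACUUM main.default.my_table LITE RETAIN 72 HOURS DRY RUN")

    # Run the LITE vacuum, which uses the Delta transaction log (rather than a
    # full directory listing) to find unreferenced files older than the window.
    spark.sql("VACUUM main.default.my_table LITE RETAIN 72 HOURS")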

