Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

citizenX7042
by New Contributor
  • 589 Views
  • 4 replies
  • 0 kudos

Error with Read XML data using the spark-xml library

Hi, I would appreciate any help with an error when loading an XML file with the spark-xml library. My environment: 14.3 LTS (includes Apache Spark 3.5.0, Scala 2.12). Library: com.databricks:spark-xml_2.12:0.15.0, on a Databricks notebook. When running this scrip...

Latest Reply
barsha_sharma
New Contributor II
  • 0 kudos

UPDATE: It is now possible to read XML files directly: https://docs.databricks.com/en/query/formats/xml.html Make sure to update your Databricks Runtime to 14.3 or above, and remove the spark-xml Maven library from your cluster.

3 More Replies
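The reply above points to the native XML reader in Databricks Runtime 14.3+. A minimal sketch of what that looks like in a notebook, assuming the built-in `xml` format; the path and `rowTag` value are placeholders, and `spark` is the session Databricks provides:

```python
# Sketch: reading XML natively on Databricks Runtime 14.3+,
# without the spark-xml Maven library.

def read_xml(spark, path, row_tag="record"):
    """Return a DataFrame from an XML file using the built-in reader."""
    return (
        spark.read.format("xml")
        .option("rowTag", row_tag)  # XML element that maps to one row
        .load(path)
    )

# In a notebook: df = read_xml(spark, "/Volumes/main/default/raw/books.xml", "book")
```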
smanda88
by New Contributor
  • 211 Views
  • 1 reply
  • 0 kudos

Handling Over-Usage of Capacity in Databricks Jobs/Processes

Hi all, is there a tool or method in Databricks to ensure data integrity and stability when a job or process exceeds the allocated capacity? Specifically, I'm looking for ways to: prevent failures or data loss due to resource overuse; automatically scal...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hello @smanda88 - For point 1, please see: https://docs.databricks.com/en/lakehouse-architecture/reliability/best-practices.html For point 2, you can use auto-scaling; please refer to: https://docs.databricks.com/en/lakehouse-architecture/cost-optimization...

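On the auto-scaling point, a cluster spec can bound resource use with explicit worker limits. A hedged sketch of a job-cluster definition following the Clusters API `autoscale` schema; the node type and bounds below are placeholders:

```python
# Sketch: a job-cluster spec with autoscaling bounds, per the
# Databricks Clusters API schema. Values are illustrative only.
new_cluster = {
    "spark_version": "14.3.x-scala2.12",
    "node_type_id": "Standard_DS3_v2",  # placeholder node type
    "autoscale": {
        "min_workers": 2,  # floor: keeps the job responsive
        "max_workers": 8,  # cap: bounds cost and capacity use
    },
}
```

The same dictionary shape can be passed as `new_cluster` in a Jobs API task definition, so the cluster scales within those limits instead of failing on overuse.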
nathanmle
by New Contributor
  • 269 Views
  • 1 reply
  • 0 kudos

Where to find Jupyter Notebook course materials for Get Started with Databricks for Generative AI

Hello, I can't seem to find any way to gain access to the Jupyter Notebook demo source of "Get Started with Databricks for Generative AI" course.  Please help.  Thank you kindly in advance. 

Latest Reply
Advika_
Databricks Employee
  • 0 kudos

Hello @nathanmle! We are sorry to inform you that we are no longer offering notebooks or DBC files for the self-paced courses due to recent changes. If you're interested in working on labs in a provided Databricks environment, you can purchase the...

larryjiyu
by New Contributor
  • 615 Views
  • 0 replies
  • 0 kudos

Databricks CE - Where is the quickstart tutorial?

Hello! I was looking through Databricks tutorials online, but my interface looks different from many of the videos I'm seeing. What happened to the Quickstart tutorials on the home page? Are they no longer available on the dashboard? 

pdemeulenaer
by New Contributor II
  • 1417 Views
  • 1 reply
  • 1 kudos

Databricks asset bundles dependencies

Is anyone aware of a way to include a requirements.txt within the job definition of a databricks asset bundle? Documentation mentions how to have dependencies in workspace files, or Unity Catalog volumes, but I wanted to ask if it is possible to decl...

Get Started Discussions
databricksassetbundles
Dependency
Latest Reply
cleversuresh
New Contributor III
  • 1 kudos

I have the same question.

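On the original question: newer Databricks CLI versions appear to support a `requirements` library type on a job task, which points at a requirements.txt file in the workspace or a Unity Catalog volume. A hedged sketch of a bundle task definition under that assumption; the job name and paths are placeholders:

```yaml
# Sketch (assumes a recent Databricks CLI; names/paths are placeholders)
resources:
  jobs:
    my_job:
      tasks:
        - task_key: main
          notebook_task:
            notebook_path: ./src/main_notebook.py
          libraries:
            - requirements: /Volumes/main/default/libs/requirements.txt
```

If your CLI version predates this library type, listing each dependency as a `pypi` library entry remains the fallback.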
EngHol
by New Contributor
  • 321 Views
  • 1 reply
  • 0 kudos

Error uploading files to a Unity Catalog volume in Databricks

Hi everyone, I'm developing a Flask API that interacts with Databricks to upload files to a Unity Catalog volume, but I'm encountering the following error: {"error_code": "ENDPOINT_NOT_FOUND", "message": "No API found for 'POST /unity-catalo...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hello @EngHol, This endpoint: /api/2.0/unity-catalog/volumes/upload is not a valid one, hence the issue. Looking at the API for volumes, unfortunately there is no way to upload to a volume: https://docs.databricks.com/api/workspace/volumes

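As the reply notes, the Volumes API itself has no upload endpoint. Workspaces that expose the Files API, however, can write into a volume path via `PUT /api/2.0/fs/files/{path}`. A sketch under that assumption; the host, token, and paths below are placeholders:

```python
# Sketch: uploading to a Unity Catalog volume via the Files API.
# Host, token, and volume paths are placeholders.

def volume_upload_url(host, volume_path):
    """Build the Files API URL for writing a file into a UC volume."""
    return f"{host}/api/2.0/fs/files{volume_path}"

def upload_file(host, token, volume_path, data: bytes):
    """PUT raw bytes to the volume path; overwrite=true replaces an existing file."""
    import requests  # third-party; needed only when actually uploading
    return requests.put(
        volume_upload_url(host, volume_path),
        headers={"Authorization": f"Bearer {token}"},
        params={"overwrite": "true"},
        data=data,
    )
```

From Flask, the file bytes received in the request can be passed straight through as `data`.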
NehaR
by New Contributor III
  • 335 Views
  • 2 replies
  • 0 kudos

Hide function definition in Unity catalog

Hi, I have created a function that anonymizes user IDs using a secret. I want to give other users access to this function so they can execute it without having access to the secret. Is this possible in Databricks? I have tested it and see the user is not able ...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hi @NehaR, I am afraid it might not be possible without giving secret access to the users. Another approach would be to use a Service Principal.

1 More Replies
reggie
by New Contributor III
  • 2500 Views
  • 4 replies
  • 2 kudos

Resolved! Issue enabling mosaic

Hi, I am trying to install Mosaic on my cluster, but I get an error once I use 'enable_mosaic': ImportError: cannot import name '_to_java_column' from 'pyspark.sql.functions' (/databricks/spark/python/pyspark/sql/functions/__init__.py) File <command-14...

Latest Reply
JuanRomero
New Contributor II
  • 2 kudos

Thanks, I had the same issue and I was able to fix it

3 More Replies
mrstevegross
by Contributor III
  • 493 Views
  • 1 reply
  • 1 kudos

Resolved! Container lifetime?

When launching a job via "Create and trigger a one-time run" (docs), when using a custom image (docs), what's the lifetime of the container? Does it create the cluster, start the container, run the job, then terminate the container? Or does the runni...

Latest Reply
Alberto_Umana
Databricks Employee
  • 1 kudos

Hi @mrstevegross Cluster Creation: When you submit a job using the "Create and trigger a one-time run" API, a new cluster is created if one is not specified. Container Start: The custom Docker image specified in the cluster configuration is us...

Ibrahim1
by New Contributor
  • 677 Views
  • 0 replies
  • 0 kudos

DLT detecting changes but not applying them

We have three source tables used for a streaming dimension table in silver. Around 50K records are changed in one of the source tables, and the DLT pipeline shows that it has updated those 50K records, but they remain unchanged. The only way to pick ...

barnita99
by New Contributor II
  • 384 Views
  • 3 replies
  • 0 kudos

No confirmation mail received after scheduling the exam

Hi team, I have scheduled my Databricks Data Engineer Associate exam for 12th Feb 2025 using the below email address, but I still have not received any confirmation mail there. I have checked the spam folder too. Could you please resend it to barnitac@kpmg.com ...

Latest Reply
barnita99
New Contributor II
  • 0 kudos

Hi team, I have cleared my exam today. Unfortunately I have not received a single mail either to confirm my exam or to confirm test completion and result. @Cert-Team 

2 More Replies
Bala_K
by New Contributor II
  • 271 Views
  • 2 replies
  • 1 kudos

Partnership with Databricks

Hello, what are the prerequisites to become a Databricks partner?

Latest Reply
Advika_
Databricks Employee
  • 1 kudos

Hello @Bala_K! For information on becoming a Databricks partner, please email partnerops@Databricks.com. They can guide you through the prerequisites and next steps.

1 More Replies
akshaym0056
by New Contributor
  • 639 Views
  • 0 replies
  • 0 kudos

How to Define Constants at Bundle Level in Databricks Asset Bundles for Use in Notebooks?

I'm working with Databricks Asset Bundles and need to define constants at the bundle level based on the target environment. These constants will be used inside Databricks notebooks. For example, I want a constant gold_catalog to take different values ...

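One way to approach this, sketched under the assumption of current Asset Bundle syntax: declare a bundle-level variable with per-target overrides, pass it to the notebook as a job parameter, and read it with `dbutils.widgets.get`. All names below are placeholders:

```yaml
# Sketch: a bundle-level variable overridden per target.
variables:
  gold_catalog:
    description: Catalog used for gold tables
    default: gold_dev

targets:
  dev:
    default: true
  prod:
    variables:
      gold_catalog: gold_prod

# In the notebook task, expose the value as a parameter:
#   base_parameters:
#     gold_catalog: ${var.gold_catalog}
# then read it in the notebook with dbutils.widgets.get("gold_catalog").
```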
kiko_roy
by Contributor
  • 1251 Views
  • 3 replies
  • 0 kudos

Getting access token error when connecting from azure databricks to GCS bucket

I am creating a data frame by reading a table's data residing in an Azure-backed Unity Catalog. I need to write the dataframe or file to a GCS bucket. I have configured the Spark cluster config using the GCP service account JSON values. I also tried uploadi...

Latest Reply
KristiLogos
Contributor
  • 0 kudos

@kiko_roy Unfortunately that didn't work. The error states it's trying to get the access token from the metadata server; I wonder why from the metadata server?

2 More Replies
