- 589 Views
- 4 replies
- 0 kudos
Hi, I would appreciate any help with an error when loading an XML file with the spark-xml library. My environment: 14.3 LTS (includes Apache Spark 3.5.0, Scala 2.12). Library: com.databricks:spark-xml_2.12:0.15.0, on a Databricks notebook. When running this scrip...
Latest Reply
UPDATE: It is now possible to read XML files directly: https://docs.databricks.com/en/query/formats/xml.html Make sure to update your Databricks Runtime to 14.3 or above, and remove the spark-xml Maven library from your cluster.
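The native reader accepts the same rowTag option that spark-xml used. A minimal sketch, assuming DBR 14.3+; the file path and tag name below are placeholders, not values from the thread:

```python
# Sketch of reading XML natively on DBR 14.3+ (no spark-xml library needed).
def xml_read_options(row_tag: str) -> dict:
    # "rowTag" selects which XML element becomes one DataFrame row,
    # mirroring the option the spark-xml library used.
    return {"rowTag": row_tag}

# On a Databricks cluster you would then run (not executed here; paths are placeholders):
# df = (spark.read.format("xml")
#       .options(**xml_read_options("book"))
#       .load("/Volumes/main/default/raw/books.xml"))
print(xml_read_options("book"))
```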
- 211 Views
- 1 replies
- 0 kudos
Hi all, is there a tool or method in Databricks to ensure data integrity and stability when a job or process exceeds the allocated capacity? Specifically, I'm looking for ways to: prevent failures or data loss due to resource overuse; automatically scal...
Latest Reply
Hello @smanda88 -
For point 1, please see: https://docs.databricks.com/en/lakehouse-architecture/reliability/best-practices.html
For point 2, you can use autoscaling; please refer to: https://docs.databricks.com/en/lakehouse-architecture/cost-optimization...
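As a rough illustration of autoscaling, a cluster spec for the Clusters/Jobs API can declare a worker range instead of a fixed size. All values below are example assumptions, not recommendations:

```python
# Illustrative autoscaling cluster spec for the Clusters/Jobs API.
# Node type, runtime version, and worker counts are placeholder assumptions.
autoscale_cluster = {
    "spark_version": "14.3.x-scala2.12",
    "node_type_id": "Standard_DS3_v2",  # assumed Azure node type
    # Databricks scales workers between min and max based on load.
    "autoscale": {"min_workers": 2, "max_workers": 8},
}
print(autoscale_cluster["autoscale"])
```

With "autoscale" set (rather than "num_workers"), the cluster grows toward max_workers under load and shrinks back when idle.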
- 269 Views
- 1 replies
- 0 kudos
Hello, I can't seem to find any way to gain access to the Jupyter Notebook demo source of "Get Started with Databricks for Generative AI" course. Please help. Thank you kindly in advance.
Latest Reply
Hello @nathanmle!
We are sorry to inform you that we are no longer offering notebooks or DBC files for the self-paced courses due to recent changes. If you're interested in working on labs in a provided Databricks environment, you can purchase the...
- 615 Views
- 0 replies
- 0 kudos
Hello! I was looking through Databricks tutorials online, but my interface looks different from many of the videos I'm seeing. What happened to the Quickstart tutorials on the home page? Are they no longer available on the dashboard?
- 1417 Views
- 1 replies
- 1 kudos
Is anyone aware of a way to include a requirements.txt within the job definition of a Databricks Asset Bundle? The documentation mentions how to have dependencies in workspace files or Unity Catalog volumes, but I wanted to ask if it is possible to decl...
Latest Reply
I have the same question.
- 321 Views
- 1 replies
- 0 kudos
Hi everyone, I'm developing an API in Flask that interacts with Databricks to upload files to a Unity Catalog volume, but I'm encountering the following error: {"error_code": "ENDPOINT_NOT_FOUND", "message": "No API found for 'POST /unity-catalo...
Latest Reply
Hello @EngHol,
The endpoint /api/2.0/unity-catalog/volumes/upload is not a valid one, hence the error.
Looking at the Volumes API, unfortunately there is no way to upload file contents through it: https://docs.databricks.com/api/workspace/volumes
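Note that file contents go through the separate Files API (PUT /api/2.0/fs/files/{path}) rather than the Volumes API. A hedged sketch that only builds the request URL; the workspace host, token, and volume path are placeholders, and the actual PUT is left commented out:

```python
from urllib.parse import quote

def files_api_url(host: str, volume_path: str) -> str:
    # Files API endpoint: PUT /api/2.0/fs/files/{absolute path under /Volumes}.
    # quote() percent-encodes special characters but leaves "/" intact.
    return f"{host}/api/2.0/fs/files{quote(volume_path)}"

url = files_api_url(
    "https://example.cloud.databricks.com",      # placeholder workspace URL
    "/Volumes/main/default/my_volume/data.csv",  # placeholder volume path
)
print(url)
# Then, with a workspace token (not executed here):
# requests.put(url, headers={"Authorization": f"Bearer {token}"}, data=file_bytes)
```

The same upload can be done without raw HTTP via the Databricks Python SDK's files upload method.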
by NehaR • New Contributor III
- 335 Views
- 2 replies
- 0 kudos
Hi, I have created a function to anonymize user IDs using a secret. I want to give other users access to this function so they can execute it without being given access to the secret. Is that possible in Databricks? I have tested it and see the user is not able ...
Latest Reply
Hi @NehaR,
I am afraid it might not be possible without giving secret access to the users. Another approach would be to use a Service Principal.
- 780 Views
- 0 replies
- 0 kudos
I am trying to create a workspace using AWS CloudFormation, but the stack fails with the following error:"The resource CreateWorkspace is in a CREATE_FAILED state. This Custom::CreateWorkspace resource is in a CREATE_FAILED state. Received response s...
by reggie • New Contributor III
- 2500 Views
- 4 replies
- 2 kudos
Hi, I am trying to install Mosaic on my cluster, but I get an error once I use 'enable_mosaic': ImportError: cannot import name '_to_java_column' from 'pyspark.sql.functions' (/databricks/spark/python/pyspark/sql/functions/__init__.py)
File <command-14...
- 493 Views
- 1 replies
- 1 kudos
When launching a job via "Create and trigger a one-time run" (docs), when using a custom image (docs), what's the lifetime of the container? Does it create the cluster, start the container, run the job, then terminate the container? Or does the runni...
Latest Reply
Hi @mrstevegross
Cluster creation: when you submit a job using the "Create and trigger a one-time run" API, a new cluster is created if one is not specified.
Container start: the custom Docker image specified in the cluster configuration is us...
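For reference, a one-time run with a custom image is submitted as a single payload to POST /api/2.1/jobs/runs/submit; the cluster, and with it the container, exists only for the lifetime of that run. A sketch of such a payload — the image name, notebook path, and node type are placeholder assumptions:

```python
# Sketch of a one-time run payload (POST /api/2.1/jobs/runs/submit).
# Image, paths, and node type are placeholder assumptions.
submit_payload = {
    "run_name": "one-time-docker-run",
    "tasks": [{
        "task_key": "main",
        "notebook_task": {"notebook_path": "/Shared/my_notebook"},
        # "new_cluster" means the cluster (and its container) is created for
        # this run and terminated when the run finishes.
        "new_cluster": {
            "spark_version": "14.3.x-scala2.12",
            "node_type_id": "i3.xlarge",
            "num_workers": 1,
            "docker_image": {"url": "myrepo/myimage:latest"},
        },
    }],
}
print(sorted(submit_payload["tasks"][0]["new_cluster"]))
```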
- 677 Views
- 0 replies
- 0 kudos
We have three source tables used for a streaming dimension table in silver. Around 50K records are changed in one of the source tables, and the DLT pipeline shows that it has updated those 50K records, but they remain unchanged. The only way to pick ...
- 384 Views
- 3 replies
- 0 kudos
Hi team, I have scheduled my Databricks Data Engineer Associate exam for 12 Feb 2025 using the below mail ID, but I still have not received any confirmation mail there. I have checked the spam folder too. Could you please resend it to barnitac@kpmg.com ...
Latest Reply
Hi team, I cleared my exam today. Unfortunately, I have not received a single mail, either confirming my exam or confirming test completion and the result. @Cert-Team
by Bala_K • New Contributor II
- 271 Views
- 2 replies
- 1 kudos
Hello, what are the prerequisites to become a Databricks partner?
Latest Reply
Hello @Bala_K!
For information on becoming a Databricks partner, please email partnerops@Databricks.com. They can guide you through the prerequisites and next steps.
- 639 Views
- 0 replies
- 0 kudos
I'm working with Databricks Asset Bundles and need to define constants at the bundle level based on the target environment. These constants will be used inside Databricks notebooks. For example, I want a constant gold_catalog to take different values ...
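One common approach, sketched here as an assumption about the setup rather than a confirmed answer: declare a bundle variable with a default and override it per target in databricks.yml, then reference it as ${var.gold_catalog} in job definitions (e.g. as a notebook task parameter). Catalog names below are placeholders:

```yaml
# databricks.yml (sketch; catalog names are placeholders)
variables:
  gold_catalog:
    description: Catalog used for gold tables
    default: dev_gold

targets:
  prod:
    variables:
      gold_catalog: prod_gold
```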
- 1251 Views
- 3 replies
- 0 kudos
I am creating a DataFrame by reading a table's data residing in an Azure-backed Unity Catalog. I need to write the DataFrame or a file to a GCS bucket. I have configured the Spark cluster config using the GCP service account JSON values. I also tried uploadi...
Latest Reply
@kiko_roy Unfortunately that didn't work. The error states it's trying to get the access token from the metadata server; I wonder why from the metadata server?