Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

mishrarit
by New Contributor
  • 679 Views
  • 1 reply
  • 0 kudos

Job "run_name" in the system.lakeflow.job_run_timeline table

For a few jobs in Unity Catalog the "run_name" comes out as null, whereas for others we get the complete name with a system-generated batch id. I am not sure how this field is populated and why for some jobs the "run_name" is present whereas for some i...

Latest Reply
Advika_
Databricks Employee
  • 0 kudos

Hello @mishrarit! Run name in Unity Catalog job runs is determined by how the job is triggered. For manual runs, Databricks automatically generates a name, and for scheduled or API-triggered runs, the run name remains null unless explicitly defined.
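For anyone checking their own runs, the affected rows can be inspected directly in the system table. A minimal sketch (column names as documented for the `system.lakeflow` schema; run it in a Databricks notebook where `spark` is available):

```python
# Sketch: list recent job runs whose run_name was never populated,
# using the system.lakeflow.job_run_timeline system table.
query = """
SELECT job_id, run_id, run_name, trigger_type
FROM system.lakeflow.job_run_timeline
WHERE run_name IS NULL
ORDER BY period_start_time DESC
LIMIT 20
"""
# In a notebook: display(spark.sql(query))
```

Comparing the `trigger_type` of null-named runs against named ones makes the pattern described above visible in your own data.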

arne_c
by New Contributor II
  • 1780 Views
  • 2 replies
  • 0 kudos

Set up compute policy to allow installing python libraries from a private package index

In our organization, we maintain a bunch of libraries we share code with. They're hosted on a private python package index, which requires a token to allow downloads. My idea was to store the token as a secret which would then be loaded into a cluste...

Latest Reply
arne_c
New Contributor II
  • 0 kudos

I figured it out. It seems secrets can only be loaded into environment variables if the content is the secret reference and nothing else:
"value": "{{secrets/global/arneCorpPyPI_token}}" # this will work
"value": "foo {{secrets/global/arneCorpPyPI_toke...
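For reference, the working pattern in a cluster spec or compute policy looks roughly like this (a sketch; the secret scope and key names come from the post above and are placeholders for your own setup):

```python
# Sketch of the spark_env_vars section of a cluster spec.
# A {{secrets/...}} reference only resolves when it is the ENTIRE value,
# with no surrounding text.
cluster_spec = {
    "spark_env_vars": {
        # Works: the value is exactly one secret reference.
        "PIP_INDEX_TOKEN": "{{secrets/global/arneCorpPyPI_token}}",
        # Does NOT resolve: extra text wrapped around the reference, e.g.
        # "PIP_INDEX_URL": "https://u:{{secrets/global/arneCorpPyPI_token}}@pypi.example.com/simple",
    }
}
```

An init script or pip configuration can then assemble the full index URL from the resolved environment variable at runtime.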

1 More Replies
GerardAlexander
by New Contributor III
  • 836 Views
  • 1 reply
  • 0 kudos

Creating Unity Catalog in Personal AZURE Portal Account

Seeking advice on the following:
1. Given that I have a Personal - and not an Organization-based - AZURE Portal Account,
2. and that I can see I am Global Admin and have the Admin Role in Databricks,
3. then why can I not get "Manage Account" for a...

Latest Reply
Takuya-Omi
Valued Contributor III
  • 0 kudos

@GerardAlexander Try signing in to the Account Console (https://accounts.azuredatabricks.net/login) using a user account with the appropriate permissions, rather than accessing it from the workspace. If you are unable to sign in, the following resourc...

laeforceable
by New Contributor II
  • 3467 Views
  • 3 replies
  • 1 kudos

Power BI - Azure Databricks Connector shows Error AAD is not setup for domain

Hi Team, What I would like to do is understand what is required for the Power BI gateway to use single sign-on (AAD) with Databricks. Is that something you could have encountered before and know the fix? I currently get a message from Power BI that AAD is not ...

Latest Reply
kkitsara
New Contributor II
  • 1 kudos

Hello, did you have any solution for this? I am facing the same issue.

2 More Replies
FanMichelle0729
by New Contributor II
  • 1241 Views
  • 5 replies
  • 0 kudos

Does serverless compute require a cloud account (AWS, Google, Azure)?

I am a Databricks beginner, and I would like to ask: if compute is created in the Databricks account, does it also exist in the cloud account (e.g., AWS)? If the AWS account is deactivated, will the existing compute become unusable? This is what I h...

Latest Reply
Takuya-Omi
Valued Contributor III
  • 0 kudos

@FanMichelleTW No, Databricks recommends using serverless compute, and you can use serverless compute as well. To do so, open a notebook and check the top-right corner to see if a serverless compute option is in a Ready state. If it is, simply select ...

4 More Replies
bmhardy
by New Contributor III
  • 4687 Views
  • 4 replies
  • 4 kudos

Creating a hierarchy without recursive statements

I am looking to build a hierarchy from a parent child relationship table, which I would typically use a recursive statement for in SQL Server / Azure SQL. This would mean setting an anchor, most commonly the top record of the tree, and then join back...

Latest Reply
bmhardy
New Contributor III
  • 4 kudos

Thank you, I will give this a try. I'll let you know how it goes.
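For readers landing here, the non-recursive approach usually means iterating level by level from an anchor instead of using a recursive CTE. A minimal pure-Python sketch of the idea (illustrative data, not the poster's schema):

```python
# Flatten a child -> parent table into (node, root, depth) rows by walking
# the tree level by level, the way an iterative self-join would in SQL.
edges = {"B": "A", "C": "A", "D": "B", "E": "D"}  # child -> parent; "A" is the root


def build_hierarchy(edges):
    # Anchor: nodes that appear as parents but never as children.
    roots = set(edges.values()) - set(edges.keys())
    rows = [(n, n, 0) for n in roots]
    frontier = {n: (n, 0) for n in roots}
    while frontier:
        nxt = {}
        for child, parent in edges.items():
            if parent in frontier:
                root, depth = frontier[parent]
                nxt[child] = (root, depth + 1)
                rows.append((child, root, depth + 1))
        frontier = nxt  # descend one level per pass
    return rows


print(sorted(build_hierarchy(edges)))
```

In Spark, each pass of the while loop becomes a self-join of the edge DataFrame against the previous level's results, unioned into the output, stopping when a level comes back empty.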

3 More Replies
citizenX7042
by New Contributor
  • 1689 Views
  • 4 replies
  • 0 kudos

Error with Read XML data using the spark-xml library

Hi, I would appreciate any help with an error when loading an XML file with the spark-xml library. My environment: 14.3 LTS (includes Apache Spark 3.5.0, Scala 2.12); library: com.databricks:spark-xml_2.12:0.15.0, on a Databricks notebook. When running this scrip...

Latest Reply
barsha_sharma
New Contributor II
  • 0 kudos

UPDATE: It is now possible to read XML files directly: https://docs.databricks.com/en/query/formats/xml.html Make sure to update your Databricks Runtime to 14.3 or above, and remove the spark-xml Maven library from your cluster.
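A minimal sketch of the built-in reader (Databricks Runtime 14.3+; the `rowTag` value and file path are placeholders for your data):

```python
# Options for the native XML source shipped with DBR 14.3+.
# "book" stands in for whatever element repeats per row in your file.
xml_options = {"rowTag": "book"}

# In a Databricks notebook:
# df = spark.read.format("xml").options(**xml_options).load(
#     "/Volumes/main/default/raw/books.xml"
# )
```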

3 More Replies
smanda88
by New Contributor
  • 634 Views
  • 1 reply
  • 0 kudos

Handling Over-Usage of Capacity in Databricks Jobs/Processes

Hi all, Is there a tool or method in Databricks to ensure data integrity and stability when a job or process exceeds the allocated capacity? Specifically, I'm looking for ways to:
  • Prevent failures or data loss due to resource overuse.
  • Automatically scal...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hello @smanda88 - For point 1, please see: https://docs.databricks.com/en/lakehouse-architecture/reliability/best-practices.html For 2, you can use auto-scaling; please refer to: https://docs.databricks.com/en/lakehouse-architecture/cost-optimization...

nathanmle
by New Contributor
  • 600 Views
  • 1 reply
  • 0 kudos

Where to find Jupyter Notebook course materials for Get Started with Databricks for Generative AI

Hello, I can't seem to find any way to gain access to the Jupyter Notebook demo source of "Get Started with Databricks for Generative AI" course.  Please help.  Thank you kindly in advance. 

Latest Reply
Advika_
Databricks Employee
  • 0 kudos

Hello @nathanmle! We are sorry to inform you that we are no longer offering notebooks or the DBC files for the self-paced courses due to recent changes. If you're interested in working on labs in a provided Databricks environment, you can purchase the...

larryjiyu
by New Contributor
  • 3088 Views
  • 0 replies
  • 0 kudos

Databricks CE - Where is the quickstart tutorial?

Hello! I was looking through Databricks tutorials online, but my interface looks different from many of the videos I'm seeing. What happened to the Quickstart tutorials on the home page? Are they no longer available on the dashboard? 

pdemeulenaer
by New Contributor II
  • 2022 Views
  • 1 reply
  • 2 kudos

Databricks asset bundles dependencies

Is anyone aware of a way to include a requirements.txt within the job definition of a databricks asset bundle? Documentation mentions how to have dependencies in workspace files, or Unity Catalog volumes, but I wanted to ask if it is possible to decl...

Labels: Get Started Discussions, databricksassetbundles, Dependency
Latest Reply
cleversuresh
New Contributor III
  • 2 kudos

I have the same question.

EngHol
by New Contributor
  • 3478 Views
  • 1 reply
  • 0 kudos

Error uploading files to a Unity Catalog volume in Databricks

Hi everyone, I'm developing an API in Flask that interacts with Databricks to upload files to a Unity Catalog volume, but I'm encountering the following error: {"error_code": "ENDPOINT_NOT_FOUND", "message": "No API found for 'POST /unity-catalo...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hello @EngHol, this endpoint (/api/2.0/unity-catalog/volumes/upload) is not a valid one, hence the issue. Looking at the API for volumes, unfortunately there is no way to upload to a volume through it: https://docs.databricks.com/api/workspace/volumes
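Depending on your workspace version, the separate Files API may serve this purpose for volume paths. A hedged sketch, assuming the documented `PUT /api/2.0/fs/files/{path}` endpoint; the host, token, and volume path below are placeholders:

```python
# Sketch: build the Files API URL for uploading into a Unity Catalog volume.
# The volume path must start with /Volumes/<catalog>/<schema>/<volume>/...
def files_api_upload_url(host: str, volume_path: str) -> str:
    return f"{host}/api/2.0/fs/files{volume_path}"


url = files_api_upload_url(
    "https://adb-1234567890123456.7.azuredatabricks.net",  # placeholder host
    "/Volumes/main/default/my_volume/report.csv",          # placeholder path
)
# With requests (token is a placeholder PAT):
# requests.put(url, headers={"Authorization": f"Bearer {token}"},
#              data=b"col1,col2\n1,2\n")
print(url)
```

Check the Files API reference for your cloud before relying on this; availability and limits vary by workspace.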

NehaR
by New Contributor III
  • 887 Views
  • 2 replies
  • 0 kudos

Hide function definition in Unity catalog

Hi, I have created a function to anonymize user ids using a secret. I want to give other users access to this function so they can execute it without giving them access to the secret. Is this possible in Databricks? I have tested it and see the user is not able ...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hi @NehaR, I am afraid it might not be possible without giving secret access to the users. Another approach would be to use a Service Principal.

1 More Replies
mrstevegross
by Contributor III
  • 1016 Views
  • 1 reply
  • 1 kudos

Resolved! Container lifetime?

When launching a job via "Create and trigger a one-time run" (docs), when using a custom image (docs), what's the lifetime of the container? Does it create the cluster, start the container, run the job, then terminate the container? Or does the runni...

Latest Reply
Alberto_Umana
Databricks Employee
  • 1 kudos

Hi @mrstevegross
  • Cluster Creation: When you submit a job using the "Create and trigger a one-time run" API, a new cluster is created if one is not specified.
  • Container Start: The custom Docker image specified in the cluster configuration is us...
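A minimal one-time run payload with a custom image looks roughly like this (a sketch against the Jobs API 2.1 "runs submit" shape; the image name, node type, and task file are placeholders):

```python
# Sketch of a one-time run payload using a custom Docker image.
# The cluster exists only for this run: create cluster -> start container
# from the image -> run the task -> terminate cluster and container.
run_payload = {
    "run_name": "one-time-docker-run",
    "tasks": [{
        "task_key": "main",
        "spark_python_task": {"python_file": "/Workspace/path/to/job.py"},
        "new_cluster": {
            "spark_version": "14.3.x-scala2.12",
            "num_workers": 1,
            "node_type_id": "Standard_DS3_v2",  # placeholder node type
            "docker_image": {"url": "myregistry.example.com/my-image:latest"},
        },
    }],
}
# POST this body to /api/2.1/jobs/runs/submit with a bearer token.
```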

