Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

LuukDSL
by New Contributor III
  • 3918 Views
  • 14 replies
  • 1 kudos

Running jobs as service principal, while pulling code from Azure DevOps

In our data platform, our jobs are defined in a dataplatform_jobs.yml within a Databricks Asset Bundle and then pushed to Databricks via an Azure DevOps pipeline (Azure DevOps is where our codebase resides). Currently, this results in workflows looki...

[attached screenshot: LuukDSL_0-1751983798686.png]
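For context, the usual way to make bundle-deployed jobs run as a service principal is the run_as mapping in the bundle configuration. A minimal, hypothetical sketch (not the poster's actual file): the application ID, repo URL, and notebook path are placeholders, and the service principal must already be added to the workspace with the required permissions.

```yaml
# Hypothetical excerpt of a bundle file such as dataplatform_jobs.yml – a sketch, not the actual config.
run_as:
  service_principal_name: "11111111-2222-3333-4444-555555555555"  # placeholder application (client) ID

resources:
  jobs:
    example_job:
      name: example_job
      git_source:
        git_url: https://dev.azure.com/my-org/my-project/_git/my-repo  # placeholder Azure DevOps repo
        git_provider: azureDevOpsServices
        git_branch: main
      tasks:
        - task_key: main
          notebook_task:
            notebook_path: notebooks/main  # relative to the repo root when git_source is set
```

Deploying with run_as pointing at a service principal generally requires the deploying identity to be allowed to use that principal (for example, a workspace admin or an identity with the Service Principal User role).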
Latest Reply
saurabh18cs
Honored Contributor II
  • 1 kudos

Hi @LuukDSL, have you tried the solution I provided above?

13 More Replies
JonnyData
by New Contributor III
  • 853 Views
  • 4 replies
  • 4 kudos

Resolved! %run command gives error on free edition

Hi, I'm testing out running one notebook from another using the %run magic command in the Databricks Free Edition. Just really simple test stuff, but I get the following error: Failed to parse %run command: string matching regex '\$[\w_]+' expected but 'p...

Latest Reply
Advika
Databricks Employee
  • 4 kudos

Hello @JonnyData! This parsing error typically appears when the notebook path isn't in the expected format. Could you share the exact %run command you're using? Also, please ensure the path is a workspace path, either absolute (starting with /) or rel...
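For reference, two hedged examples of %run invocations that parse correctly; each line goes in its own notebook cell, and the notebook names, user path, and the $env parameter are placeholders. The regex mentioned in the error refers to the optional $name="value" arguments that %run accepts after the path:

```
%run ./child_notebook
%run /Users/first.last@example.com/child_notebook $env="dev"
```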

3 More Replies
Chinu
by New Contributor III
  • 910 Views
  • 2 replies
  • 1 kudos

Databricks SDK version installed in serverless compute differs

Hello, I've encountered an issue with my Python notebook where app.list() is failing in some serverless compute clusters but works fine in others. After investigating further, I noticed the following version differences: Working Cluster: SDK version 0...

Latest Reply
Chinu
New Contributor III
  • 1 kudos

Oh, I found the doc and confirmed that the list() method was added in version 0.27.0: "Added list() method for w.apps workspace-level service." https://github.com/databricks/databricks-sdk-py/blob/v0.40.0/CHANGELOG.md Now, how can I update this to a newer versi...
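To address that follow-up question, a sketch assuming a serverless notebook: pin a newer databricks-sdk for the notebook session, restart Python so the upgrade takes effect, then call the Apps API. The version pin and the printed attribute are illustrative; on serverless you can also pin the package via the notebook's Environment panel instead of %pip.

```python
%pip install --upgrade "databricks-sdk>=0.27.0"

# Run in the next cell so the upgraded package is re-imported:
dbutils.library.restartPython()

# Then, in a fresh cell:
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()        # picks up the notebook's built-in authentication
for app in w.apps.list():    # list() exists from SDK 0.27.0 onwards
    print(app.name)
```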

1 More Replies
Witold
by Honored Contributor
  • 4903 Views
  • 4 replies
  • 2 kudos

Databricks runtime and Java Runtime

The Databricks Runtime is shipped with two Java runtimes: JRE 8 and JRE 17. While the first one is used by default, you can use the environment variable JNAME to specify the other JRE: JNAME: zulu17-ca-amd64. FWIW, AFAIK JNAME has been available since DBR 1...
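As an illustration, JNAME is just a cluster environment variable, so it can be set in the UI (Advanced options > Spark > Environment variables) or via the API. A sketch using the Python SDK, with the cluster name, DBR version, and node type as placeholders for your cloud and region:

```python
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()
cluster = w.clusters.create_and_wait(
    cluster_name="jre17-test",                    # placeholder
    spark_version="14.3.x-scala2.12",             # placeholder DBR version
    node_type_id="Standard_D4ds_v5",              # placeholder node type (Azure)
    num_workers=1,
    spark_env_vars={"JNAME": "zulu17-ca-amd64"},  # run the cluster JVM on Zulu JDK 17
)
print(cluster.cluster_id)
```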

Latest Reply
catalyst
New Contributor II
  • 2 kudos

@Witold Thanks for the original post here. Any luck with JDK 21 on DBR 17? I'm using some Java 17 features in the code alongside spark-4.0.0, which I wanted to run on DBR 17. Sadly, the generic jname=zulu21-ca-amd64 did not work for me. I also tried oth...

3 More Replies
Ajay3
by New Contributor
  • 3317 Views
  • 1 replies
  • 0 kudos

How can I install maven coordinates using init script?

Hi, I need to install the below Maven coordinates on the clusters using Databricks init scripts.
1. coordinate: com.microsoft.azure:synapseml_2.12:0.11.2 with repo https://mmlspark.azureedge.net/maven
2. coordinate: com.microsoft.azure:spark-mssql-conne...

Latest Reply
FRB1984
New Contributor II
  • 0 kudos

Did you get it working? I have the same issue!
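For what it's worth, init scripts have no built-in Maven resolution; a common alternative is to attach the coordinates as cluster libraries, for example through the Libraries API. A hedged sketch with the Python SDK, using the first coordinate from the question (the cluster ID is a placeholder, and the second, truncated coordinate would be added the same way):

```python
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.compute import Library, MavenLibrary

w = WorkspaceClient()
w.libraries.install(
    cluster_id="0123-456789-abcdefgh",  # placeholder cluster ID
    libraries=[
        Library(
            maven=MavenLibrary(
                coordinates="com.microsoft.azure:synapseml_2.12:0.11.2",
                repo="https://mmlspark.azureedge.net/maven",
            )
        ),
        # Add the second coordinate from the question here in the same way.
    ],
)
```

The same Maven libraries can also be declared directly on a cluster or job spec, or enforced through a cluster policy, which is usually easier to maintain than an init script.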

daboncanplay
by New Contributor
  • 1308 Views
  • 4 replies
  • 2 kudos

Asset Bundle on Free Edition

I am trying to use Databricks Asset Bundles in the workspace in my Free Edition account. I can deploy the bundle from my own computer, but I want to use the feature within Databricks. I am able to clone an empty repo to a Git folder and create the bu...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 2 kudos

Hi @daboncanplay, Databricks Asset Bundles rely on Terraform under the hood to manage the Databricks resources the bundle defines. It automatically downloads the binaries it needs; if it can't, you'll get an error like the one you see. This looks like...

3 More Replies
Anonymous
by Not applicable
  • 428 Views
  • 1 replies
  • 0 kudos

Delta Live Tables not enabled, nor is feature enablement in account settings

I am the account admin of a Premium workspace and I need Delta Live Tables enabled. I created a support request, but it was closed automatically due to lack of a support contract, despite being a Premium customer. There is no way in the UI to enable th...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 0 kudos

Hi @Anonymous, are you sure? If you have a Premium workspace, you shouldn't need to do anything; DLT should already be there. But the name changed recently, so it's now called Lakeflow Declarative Pipelines. Check whether you can see the ETL Pipelines button in the UI. Go to: Jobs...

Buavika
by New Contributor II
  • 2022 Views
  • 6 replies
  • 4 kudos

Requesting Assistance with /billing-profiles API Endpoint Access - 503 TEMPORARILY_UNAVAILABLE

Hi, I am trying to use the `/billing-profiles` endpoint from the Accounts API to retrieve billing profile and currency information. I am authenticated using OAuth2 client credentials, but I am consistently getting the following error: Status Code: 503 ...

Latest Reply
Buavika
New Contributor II
  • 4 kudos

Hi @Advika, yes, I am able to access the workspace endpoint using the OAuth2 setup. Our Databricks account is in the ap-southeast-2 region. Please let me know if any additional information is needed.

5 More Replies
Chris2794
by New Contributor II
  • 819 Views
  • 2 replies
  • 0 kudos

Azure Databricks databricks-cli authentication with M2M using environment variables

Which environment variables do I have to set to use the databricks-cli with M2M OAuth using Microsoft Entra ID managed service principals? I already added the service principal to the workspace. I found the following documentation, but I am still conf...

Latest Reply
Prakazsh
New Contributor II
  • 0 kudos

Hi @ashraf1395, I'm trying to use environment variables to configure the Databricks CLI, instead of relying on the .databrickscfg file. @Chris2794 Have you found a way to authenticate using Machine-to-Machine (M2M) OAuth with Microsoft Entra ID manage...
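For what it's worth, the unified client authentication used by the Databricks CLI reads a small set of environment variables for Microsoft Entra ID service principal (M2M) auth. A hedged sketch that sets them from Python and then shells out to the CLI; every value is a placeholder:

```python
import os
import subprocess

# Placeholders – substitute your workspace URL and Entra ID service principal details.
os.environ.update({
    "DATABRICKS_HOST": "https://adb-1234567890123456.7.azuredatabricks.net",
    "ARM_TENANT_ID": "<entra-tenant-id>",
    "ARM_CLIENT_ID": "<service-principal-application-id>",
    "ARM_CLIENT_SECRET": "<client-secret>",
})

# Smoke test: the CLI should report the service principal's own identity.
subprocess.run(["databricks", "current-user", "me"], check=True)
```

If you use Databricks-managed OAuth secrets instead of Entra ID client secrets, the variables are DATABRICKS_CLIENT_ID and DATABRICKS_CLIENT_SECRET.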

1 More Replies
jeremy98
by Honored Contributor
  • 6452 Views
  • 33 replies
  • 5 kudos

Databricks to SFTP: Connection Fails Even with Whitelisted NAT Gateway IP

Hi community, I'm experiencing a strange issue with my connection from Databricks to an SFTP server. I provided them with an IP address created for Databricks via a NAT gateway, and that IP is whitelisted on their side. However, even though I have the ...

Latest Reply
Kenji_3000
New Contributor III
  • 5 kudos

Hi @szymon_dybczak, thanks for the suggestion. I tried to attach the IP address to the NAT, but it requires the NAT to also have an internet routing preference, and Azure doesn't allow that. Quite a unique scenario indeed. We have created a workaround n...

32 More Replies
Paweł_Janczyk
by New Contributor
  • 2226 Views
  • 7 replies
  • 1 kudos

Missing GitHub email address in commits

Hello everyone, I would like to configure the GitHub client on my Databricks workspace. In particular, my email address is not visible and is replaced by my GitHub user name. Do you know how I can fix that?

[attached screenshot: Pawe_Janczyk_0-1747049752792.png]
Latest Reply
aakashnand-kt
New Contributor III
  • 1 kudos

@Louis_Frolio Hi Louis, I am using the Enterprise Databricks edition. Could you please share the exact name of the feature that needs to be enabled, so I can ask my account team to enable it for us? Thanks

6 More Replies
PedroRodriguezx
by New Contributor
  • 688 Views
  • 1 replies
  • 1 kudos

Databricks roadmap

Is there any roadmap published by the Databricks team? I'm doing some tests in my environment in order to decide whether I will use Databricks as my GenAI platform. However, there are a lot of features in beta stage that may or may not be useful, and to properly...

Latest Reply
Louis_Frolio
Databricks Employee
  • 1 kudos

Hey Pedro, Databricks does not have a public-facing roadmap.  Cheers, Lou.

FRB1984
by New Contributor II
  • 1129 Views
  • 1 replies
  • 0 kudos

Error installing python dependencies

Hi guys! First of all, I am sorry if my question is dumb, but I am a newbie to Databricks! I am just starting, and as I try to install ibm_db as a dependency for my Python notebook, I get this error: Collecting ibm_db (from -r /tmp/tmp-e8a4538f00b7498fab13c2...

Latest Reply
SP_6721
Honored Contributor
  • 0 kudos

Hi @FRB1984, it looks like the ibm_db package doesn't support the Python version currently used in your Databricks environment. Instead of relying on ibm_db, you can use Spark's JDBC connector with the IBM DB2 JDBC driver for a more stable and support...
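To illustrate that suggestion, a hedged sketch of reading a DB2 table through Spark's JDBC connector inside a notebook. The host, database, table, and secret scope are placeholders, and the IBM DB2 JDBC driver (for example the com.ibm.db2:jcc Maven artifact) must first be attached to the cluster as a library:

```python
# Sketch – assumes the DB2 JDBC driver is already installed on the cluster.
df = (
    spark.read.format("jdbc")
    .option("url", "jdbc:db2://db2-host.example.com:50000/MYDB")      # placeholder host/port/database
    .option("driver", "com.ibm.db2.jcc.DB2Driver")
    .option("dbtable", "MYSCHEMA.MYTABLE")                            # placeholder table
    .option("user", dbutils.secrets.get("my-scope", "db2-user"))      # hypothetical secret scope/keys
    .option("password", dbutils.secrets.get("my-scope", "db2-password"))
    .load()
)
display(df)
```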

Khaja_Zaffer
by Contributor III
  • 2695 Views
  • 5 replies
  • 4 kudos

Resolved! community portal needs some updates

Hello Databricks team, can you help me understand why the search bar on the community portal doesn't show results? Is it just me, or is it happening for everyone? Also, this portal could use some good updates.

Latest Reply
Khaja_Zaffer
Contributor III
  • 4 kudos

@Advika Hey, thank you so much. Just wow, it's working now. I don't know what you did, but it works as expected. Thank you so much!

4 More Replies
matthiasn
by New Contributor III
  • 832 Views
  • 2 replies
  • 0 kudos

Share notebooks with an AD user assigned via a group

Hi everybody, I'm trying to share a notebook with a user who was assigned to the workspace via an AD group (using the new automatic sync), but I only see users directly assigned to the workspace. My expectation would be that I see all users that are within...

Latest Reply
Louis_Frolio
Databricks Employee
  • 0 kudos

Some helpful tips, tricks, and general guidance: sharing notebooks with users who are members of Active Directory (AD) groups can sometimes be challenging due to the way Azure Databricks handles user permissions. Here are some important poi...
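One concrete workaround, sketched below with the Python SDK under the assumption that the synced AD group itself is visible in the workspace: grant the permission to the group rather than to individual users, so every member inherits it. The notebook path and group name are placeholders:

```python
from databricks.sdk import WorkspaceClient
from databricks.sdk.service import iam

w = WorkspaceClient()

# Resolve the notebook's numeric object ID from its workspace path (placeholder path).
notebook = w.workspace.get_status("/Users/owner@example.com/shared_notebook")

# Note: permissions.set replaces the direct ACL; use permissions.update to add without replacing.
w.permissions.set(
    request_object_type="notebooks",
    request_object_id=str(notebook.object_id),
    access_control_list=[
        iam.AccessControlRequest(
            group_name="my-ad-synced-group",              # placeholder group name
            permission_level=iam.PermissionLevel.CAN_READ,
        )
    ],
)
```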

1 More Replies