Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

Wycliff
by New Contributor II
  • 2290 Views
  • 1 reply
  • 0 kudos

JWT Encoding error while using Azure secret key

My secret value in Azure Key Vault is like below: private_key="""-----BEGIN RSA PRIVATE KEY-----********-----END RSA PRIVATE KEY-----""". Running this command in a Databricks notebook - jwt.encode(claim_set, private_key, algorithm='RS256'). While using the ab...

Latest Reply
Wycliff
New Contributor II
  • 0 kudos

Thanks much for your troubleshooting methods. I validated the secret scopes and secret access; these look fine. Key format - I feel the problem is with the key format only. As of now I'm awaiting Azure subscription access, but I printed the secret value...
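
In case it helps others hitting the same error, below is a minimal sketch (placeholder scope, key name and claims, not the poster's values) of reading an RSA key from a Key Vault-backed secret scope and signing a JWT with RS256. A common failure mode is the PEM losing its real newlines when stored in Key Vault.

```python
# Hedged sketch: sign a JWT with an RSA key pulled from a Databricks secret scope.
# "my-scope", "my-rsa-key" and the claim set are placeholders, not the poster's values.
import jwt  # PyJWT; RS256 also needs the `cryptography` package installed

# dbutils is available in Databricks notebooks without an import
private_key = dbutils.secrets.get(scope="my-scope", key="my-rsa-key")

# Key Vault often flattens the PEM onto one line or stores literal "\n" sequences;
# without real newlines between header, body and footer, RS256 cannot parse the key.
if "\\n" in private_key:
    private_key = private_key.replace("\\n", "\n")

claim_set = {"iss": "my-app", "sub": "user@example.com"}  # placeholder claims
token = jwt.encode(claim_set, private_key, algorithm="RS256")
print(token[:20] + "...")
```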

Lucidity
by New Contributor II
  • 20915 Views
  • 2 replies
  • 1 kudos

Resolved! Deny assignment modification to allow attach/detach of disks in azure databricks

Our application does storage autoscaling on Azure. We would like to deploy our solution with Azure Databricks. But even though the service principal associated with our application has the necessary roles and permissions to attach/detach a disk from ...

Latest Reply
Lucidity
New Contributor II
  • 1 kudos

Thank you for your reply. Is there any way Databricks provides to bypass the deny assignment for specific apps? I noticed that in the deny assignment, unity-catalog-access-connector has been given an exclusion under the excludePrincipals section. Is there a w...

1 More Replies
AadityaBhatt
by New Contributor II
  • 2311 Views
  • 3 replies
  • 1 kudos

[UNBOUND_SQL_PARAMETER] When running Placeholder query

I am using the databricks-sql-go library version 1.5.2. I am trying to run a query with placeholders of type '?'. The query looks like: params, args := databricksParams(values); sql := fmt.Sprintf(`SELECT COUNT(*) FROM %s.%s WHERE %s IN (%s)`, schema, tab...

Latest Reply
SergeRielau
Databricks Employee
  • 1 kudos

Can you print out an example after the Sprintf substitutions? It seems you generated a query with a named parameter: ":_58", but args (which should be a Map) does not have a key named "_58".
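
For illustration, the rule SergeRielau points at, shown with the Python databricks-sql-connector rather than databricks-sql-go (whose truncated snippet can't be reconstructed here): every marker in the generated SQL needs a matching entry in the parameters map, otherwise the server raises [UNBOUND_SQL_PARAMETER]. Connection details are placeholders.

```python
# Hedged sketch of named parameters with the Python databricks-sql-connector.
from databricks import sql

with sql.connect(server_hostname="...", http_path="...", access_token="...") as conn:
    with conn.cursor() as cursor:
        # One named marker per value; each dict key must match a marker name exactly.
        cursor.execute(
            "SELECT COUNT(*) FROM my_schema.my_table WHERE id IN (:id0, :id1)",
            {"id0": 1, "id1": 2},
        )
        print(cursor.fetchone())
```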

2 More Replies
Sweetness
by New Contributor II
  • 2483 Views
  • 3 replies
  • 0 kudos

Cannot create a repo because the parent path does not exist

I tried following this doc: Work With Large Monorepos With Sparse Checkout Support in Databricks Repos | Databricks Blog. When I hook it up to my repos using Azure DevOps Services and tick the Sparse checkout mode checkbox, I pass in a subdirectory in my Cone p...

Latest Reply
Sweetness
New Contributor II
  • 0 kudos

This is Azure Databricks
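
For anyone reproducing this, here is a hedged sketch of creating a repo with sparse checkout through the Repos REST API; the workspace URL, token, repo URL, path and patterns are placeholders. The "parent path does not exist" error typically means the folder portion of `path` (e.g. /Repos/<user>) has to exist before the repo can be created inside it.

```python
# Hedged sketch: create an Azure DevOps-backed repo with sparse checkout via the Repos API.
import requests

host = "https://<workspace-url>"      # placeholder
token = "<personal-access-token>"     # placeholder

resp = requests.post(
    f"{host}/api/2.0/repos",
    headers={"Authorization": f"Bearer {token}"},
    json={
        "url": "https://dev.azure.com/<org>/<project>/_git/<repo>",  # placeholder
        "provider": "azureDevOpsServices",
        "path": "/Repos/user@example.com/my-repo",  # parent folder must already exist
        "sparse_checkout": {"patterns": ["my/subdirectory"]},
    },
)
print(resp.status_code, resp.json())
```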

2 More Replies
Lambda
by New Contributor
  • 1051 Views
  • 1 reply
  • 0 kudos

What is the REST API payload for "not sending alert when alert is back to normal"?

Greetings, I am using the REST API to create & schedule SQL alerts. By default these alerts send notifications when they are back to normal, which I don't want. In the UI I have the option to uncheck the box (shown in the picture), but I can't find any do...

[Screenshot attached: Lambda_0-1702306187563.png]
Latest Reply
Yeshwanth
Databricks Employee
  • 0 kudos

@Lambda Did you refer to this documentation: https://docs.databricks.com/api/workspace/alerts/create
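
A hedged sketch of what such a call might look like against the endpoint linked above. The `notify_on_ok` field name and the payload shape are assumptions to verify against that documentation; the host, token, query ID and condition are placeholders.

```python
# Hedged sketch only: create a SQL alert and (assumed field) suppress the
# "back to normal" notification. Verify field names against the linked API docs.
import requests

host = "https://<workspace-url>"      # placeholder
token = "<personal-access-token>"     # placeholder

payload = {
    "display_name": "My alert",       # placeholder
    "query_id": "<query-id>",         # placeholder
    "condition": {"...": "..."},      # condition omitted; see the linked docs
    "notify_on_ok": False,            # assumed flag for "don't notify when back to normal"
}

resp = requests.post(
    f"{host}/api/2.0/sql/alerts",
    headers={"Authorization": f"Bearer {token}"},
    json=payload,
)
print(resp.status_code, resp.json())
```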

Phani1
by Valued Contributor II
  • 1142 Views
  • 1 reply
  • 0 kudos

Spark vCore for Databricks vCPU

Hi Team, What is the equivalent Spark vCore count for a Databricks vCPU? For example, for DS3 v2, vCPU = 4 and RAM = 14.00 GiB; I would like to know the equivalent Spark vCore count for DS3 v2 as shown in Azure Databricks Pricing | Microsoft Azure. Regards, Phanindra

Latest Reply
Yeshwanth
Databricks Employee
  • 0 kudos

@Phani1 In Databricks, the vCPU count is equivalent to the number of Spark vCores. This means that if the DS3 v2 instance has 4 vCPUs, it would also have 4 Spark vCores. Please note that the Spark vCore count is based on the vCPU count of the underlyi...

AtomicBoy99
by New Contributor
  • 837 Views
  • 1 reply
  • 0 kudos

Can't enter 'edit mode' using shortcut

Hi, With the AI Assistant recently added to Databricks notebooks, I'm having issues entering the 'edit mode' of a notebook cell. Previously, I could simply press the 'Enter' key to do this, but this no longer works. Is anyone else having the sa...

Latest Reply
Yeshwanth
Databricks Employee
  • 0 kudos

@AtomicBoy99, it seems like the fix is in place. Can you confirm if it is working well now?

edmundsecho
by New Contributor II
  • 3721 Views
  • 2 replies
  • 1 kudos

Resolved! Difference between username and account_id

I have a web app that can read files from a person's cloud-based drive (e.g., OneDrive, Google Drive, Dropbox).  The app gets access to the files using OAuth2. The app only ever has access to the files for that user.  Part of the configuration requir...

Latest Reply
edmundsecho
New Contributor II
  • 1 kudos

The provided links were helpful. The take-aways:
  • usernames are "globally" unique to an individual; the username is the person's email.
  • a username can be associated with up to 50 accounts; account_ids track the resources available to the user.
This cl...

1 More Replies
Kris2
by New Contributor II
  • 2300 Views
  • 1 reply
  • 0 kudos

Resolved! Unable to create a Managed table in Unity Catalog default location

We have set up the metastore with Managed Identity, and when trying to create a managed table in the default location I am hitting the below error. The storage is ADLS Gen2. AbfsRestOperationException: Operation failed: "This request is not authorized to pe...

Latest Reply
Ayushi_Suthar
Databricks Employee
  • 0 kudos

Hi @Kris2, I completely understand your hesitation and appreciate your approach to seeking guidance! This error generally means that the cluster has connectivity to the Unity Catalog-configured storage location but is not authorized to access storag...

AbdurRehman
by New Contributor II
  • 720 Views
  • 0 replies
  • 1 kudos

Error Signing Up for Databricks Community Edition

@Retired_mod I've been trying to sign up for Databricks Community Edition using different email addresses over the past 24 hours, but I keep getting the error message: "An error has occurred. Please try again later." Can anyone help? Tags: #Databricks...

ChristianRRL
by Valued Contributor
  • 1722 Views
  • 0 replies
  • 0 kudos

Unity Catalog: Databricks *Specific* Features

Good day, Deceptively simple question: are there any "Databricks only" specific features that Unity Catalog offers? I understand that, generally speaking, enabling UC offers some of the following: Data Discovery and Lineage, Auditing and Monitoring, Access C...

CMA
by New Contributor II
  • 3414 Views
  • 2 replies
  • 0 kudos

Problem logging in

Hello all, I'm new to this platform. I signed up, validated my email, and created my password; everything was fine, but when I try to log in a message comes up. I created a new password but the same happens again! But it worked a few times, I think like 3 times....

[Screenshot attached: CMA_0-1706923323303.png]
Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

Can you confirm the username was created all lower case? Login is case-sensitive, so you need to make sure the username is entered exactly as you added it in the console or workspace.

1 More Replies
ChristianRRL
by Valued Contributor
  • 6195 Views
  • 3 replies
  • 3 kudos

Resolved! DLT Job Clusters: Continuous vs Triggered Cluster Start Times

Hi there, I'm curious if anyone is able to definitively help me answer how DLT Job Clusters operate/run. For example, the following is my baseline understanding of DLT Job Clusters. If I run a Triggered DLT Pipeline (e.g. daily), the job cluster takes m...

Latest Reply
melbourne
Contributor
  • 3 kudos

Ideally one would expect the clusters used for a DLT pipeline to terminate after the pipeline execution has finished. However, while running in the `development` environment, you'll notice it doesn't terminate on its own, whereas in `production` it terminates ...
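
For reference, a rough sketch of where that behaviour is controlled in the pipeline settings (`development` and `continuous` are standard DLT settings fields); the endpoint usage, name and omitted fields below are illustrative placeholders, not a complete payload.

```python
# Hedged sketch: create a triggered DLT pipeline in production mode via the Pipelines API.
import requests

host = "https://<workspace-url>"      # placeholder
token = "<personal-access-token>"     # placeholder

settings = {
    "name": "my_dlt_pipeline",        # placeholder
    "development": False,  # production mode: cluster terminates shortly after the update
    "continuous": False,   # triggered mode: runs on demand / on schedule, not continuously
    # clusters, libraries, target, etc. omitted
}

resp = requests.post(
    f"{host}/api/2.0/pipelines",
    headers={"Authorization": f"Bearer {token}"},
    json=settings,
)
print(resp.status_code)
```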

2 More Replies
Databricks_Java
by New Contributor
  • 2541 Views
  • 1 reply
  • 0 kudos

Databricks Java - Create Jar in Java 11

I am trying to run a simple print Java program, which is not working; I am getting compilation version issues even though I changed the environment variable to point to Java 11. Can you please help me? Can we create Java code with a Spark session and execute it as a ja...

Tags: Get Started Discussions, Databricks, env, jar, java, spark
Latest Reply
arpit
Databricks Employee
  • 0 kudos

@Databricks_Java You can run a command like this: spark-submit --class com.test.Main example.jar. Also make sure to check the Java version and match it with the DBR compatibility.

mathijs-fish
by New Contributor III
  • 1421 Views
  • 1 reply
  • 0 kudos

Disable personal compute with the Databricks API or UI

For a production environment, I want to disable the personal compute policy, because I do not want all users to be able to create personal compute clusters in production. Unfortunately, I am not able to access the account console, so I want to revoke perm...

[Screenshots attached: mathijsfish_0-1702396352437.png, mathijsfish_1-1702396390529.png]
Tags: Get Started Discussions, compute, permissions, policies
Latest Reply
arpit
Databricks Employee
  • 0 kudos

@mathijs-fish You need to be an admin to disable a policy.
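
For reference, once an admin is involved, one option is to overwrite the policy's access control list via the cluster-policies Permissions API so that only a chosen group keeps CAN_USE. This is a hedged sketch with placeholder host, token, policy ID and group name; fully disabling the Personal Compute policy itself remains an account-console setting.

```python
# Hedged sketch: restrict a cluster policy to a single group via the Permissions API.
import requests

host = "https://<workspace-url>"            # placeholder
token = "<admin-personal-access-token>"     # placeholder
policy_id = "<personal-compute-policy-id>"  # placeholder

resp = requests.put(
    f"{host}/api/2.0/permissions/cluster-policies/{policy_id}",
    headers={"Authorization": f"Bearer {token}"},
    json={"access_control_list": [
        # PUT replaces the existing ACL; only this group keeps CAN_USE afterwards.
        {"group_name": "platform-admins", "permission_level": "CAN_USE"}  # placeholder group
    ]},
)
print(resp.status_code, resp.json())
```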


Connect with Databricks Users in Your Area

Join a Regional User Group to connect with local Databricks users. Events will be happening in your city, and you won’t want to miss the chance to attend and share knowledge.

If there isn’t a group near you, start one and help create a community that brings people together.
