Administration & Architecture

Forum Posts

gabo2023
by New Contributor III
  • 1237 Views
  • 1 replies
  • 3 kudos

Unable to read resources - Unsupported Protocol Scheme (Terraform AWS)

Hello everyone! Over the last few weeks my company has been trying to deploy a Databricks workspace on AWS, adapted to the customer's needs, using Terraform. To do this, we started from base code in Databricks' own GitHub (https://github.com/databrick...

Latest Reply
gabo2023
New Contributor III
555308
by New Contributor
  • 981 Views
  • 1 replies
  • 1 kudos

Cluster failed to start

I am getting this error in my Partner Databricks account and have tried several methods to start the cluster. Since I don't have access to console.aws.amazon.com/ec2, I was not able to check the details/logs on the EC2 instance. I am getting the follow...

Latest Reply
-werners-
Esteemed Contributor III

Here is a similar topic: https://community.databricks.com/t5/machine-learning/problem-with-spinning-up-a-cluster-on-a-new-workspace/m-p/29996. To actually fix/analyse the issue, you unfortunately need access to the EC2 console. I assume someone in the ...

diego_poggioli
by Contributor
  • 1144 Views
  • 1 replies
  • 1 kudos

Resolved! Service Principal for remote repository in workflow/job expiring token

I would like to create a Databricks job where the 'Run as' field is set to a Service Principal. The job points to notebooks stored in Azure DevOps. The steps I've already performed are: I created the Service Principal and I'm now able to see it in the ...

Latest Reply
Kaniz
Community Manager

Hi @diego_poggioli, Unfortunately, there is no direct way to bypass the use of expiring tokens when accessing Azure DevOps. The Azure DevOps PAT is used as a security measure to ensure that only authorized users can access the resources, and it is de...

746837
by New Contributor II
  • 1109 Views
  • 2 replies
  • 0 kudos

Databricks and SMTP

Using Databricks as an AWS partner, I am trying to run a Python script to validate email addresses. Whenever it gets to the SMTP portion it times out. I am able to telnet from Python to the POP servers and get a response, and I can ping domains and get replies, b...
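Since the question describes SMTP connections timing out while other protocols work, a quick way to separate a networking problem from a code problem is a connection probe with an explicit timeout. This is a generic sketch (the target host is a placeholder; note that outbound port 25 is frequently blocked by cloud providers, which commonly causes exactly this symptom):

```python
import smtplib

def smtp_reachable(host: str, port: int = 25, timeout: float = 10.0) -> bool:
    """Return True if an SMTP server on host:port answers a NOOP within the timeout."""
    try:
        with smtplib.SMTP(host, port, timeout=timeout) as conn:
            code, _ = conn.noop()  # 250 means the server is responding
            return code == 250
    except (OSError, smtplib.SMTPException):
        return False

# Example: 192.0.2.1 (TEST-NET-1) is unroutable, so this reports False.
print(smtp_reachable("192.0.2.1", timeout=1.0))
```

If the probe fails only from the Databricks cluster but succeeds elsewhere, egress rules (security groups / firewall) are the likely culprit rather than the script.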

Latest Reply
Atanu
Esteemed Contributor

When you say it is timing out, what is the error you are seeing on the Databricks side?

1 More Replies
SanderJvanDijk
by New Contributor
  • 462 Views
  • 1 replies
  • 0 kudos

Ubuntu 18.4 EOL

Hi, last July 18th we were informed by Databricks that Ubuntu 20.04 (operating system: Ubuntu 20.04.4 LTS) was going to be the only certified and supported Ubuntu version for the 10.4 runtime cluster we use. We have been experiencing some issu...

Latest Reply
Kaniz
Community Manager

Hi @SanderJvanDijk, it's unclear whether the issues you're experiencing with Databricks libraries are directly caused by the new Ubuntu version (20.04.4 LTS) that Databricks pushed. The provided information indicates that Databricks has been continu...

SmileyVille
by New Contributor II
  • 1609 Views
  • 2 replies
  • 0 kudos

Resolved! Leverage Azure PIM with DataBricks with Contributor role privilege

We are trying to leverage Azure PIM. This works great for most things; however, we've run into a snag. We want to limit the Contributor role to a group, and only at the resource group level, not the subscription. We wish to elevate via PIM. This will ...

Latest Reply
SmileyVille
New Contributor II

Thanks - we think we were originally overthinking this. We determined we were doing this correctly; the user just needed to switch to 'Groups' within PIM to request elevation of permissions. The larger issue is actually the 40-minute user provisioning cycl...

1 More Replies
jrosend
by New Contributor III
  • 3306 Views
  • 1 replies
  • 0 kudos

[Possible Bug] Repo Notebooks being modified without human interaction

Our production workspace has several Repos integrated with GitHub. These repos always point to master and should never be modified manually by a human directly in the workspace, as the pulls are triggered by a GitHub Actions workflow. This workflow cal...

Latest Reply
Kaniz
Community Manager

Hi @jrosend , The issue you are facing is that there are minor changes occurring in the notebooks in your Databricks Repos, which are causing conflicts during the automatic update process triggered by the GitHub Actions workflow. These changes are no...

abhaigh
by New Contributor III
  • 1815 Views
  • 1 replies
  • 0 kudos

Error: cannot create permissions: invalid character '<' looking for beginning of value

I'm trying to use Terraform to assign a cluster policy to an account-level group (synced from AAD via SCIM). My provider is configured like this:
provider "databricks" {
  alias      = "azure_account"
  host       = "accounts.azuredatabricks.net"
  account_id = "%DATABRICKS...

Latest Reply
Kaniz
Community Manager

Hi @abhaigh,
  • The issue is related to applying a cluster policy to an account-level group using Terraform.
  • The error message indicates that the returned value from the API endpoint is not as expected.
  • To resolve the issue, follow these steps: ...

_YSF
by New Contributor II
  • 1799 Views
  • 1 replies
  • 0 kudos

Struggling with UC Volume Paths

I am trying to set up my volumes and give them paths in the data lake, but I keep getting this message: Input path url 'abfss://my-container@my-storage-account.dfs.core.windows.net/' overlaps with managed storage within 'CreateVolume' call. There WAS some...

Latest Reply
Kaniz
Community Manager

Hi @_YSF, the error message "Input path url 'abfss://my-container@my-storage-account.dfs.core.windows.net/' overlaps with managed storage within 'CreateVolume' call" suggests that there is an issue with setting up volumes and giving them paths in th...

  • 0 kudos
diego_poggioli
by Contributor
  • 4302 Views
  • 5 replies
  • 2 kudos

Resolved! Unable to list service principal in Job details RUN AS

I added the service principal in Admin Settings > Service Principal and then enabled all the Configurations "allow cluster creation", "databricks SQL access" and "workspace access". In the Permission settings I have enabled "Service principal: Manage...

Latest Reply
BilalAslamDbrx
Honored Contributor II

For future readers - don't forget to add your email (e.g. me@foo.com) in the Service Principals permissions tab. This way, you will be able to see the newly-created service principal in the dropdown menu.

4 More Replies
DevOps
by New Contributor
  • 1616 Views
  • 1 replies
  • 0 kudos

Workspace creation via terraform provider fails on AWS

I'm trying to create a new workspace in an empty account. I have managed to create all the other resources without issues, but when I try to create the workspace it fails with the following error: Error: cannot create mws workspaces: MALFORMED_REQUEST: ...

Latest Reply
Kaniz
Community Manager

Hi @DevOps, your error is related to the credentials_id in your workspace configuration. The credentials_id is a reference to the Databricks credentials configuration ID, which is generated when you create a new set of credentials. This ID represents your ...

OU_Professor
by New Contributor II
  • 5158 Views
  • 1 replies
  • 0 kudos

Resolved! Connect Community Edition to Power BI Desktop

I have submitted this question several times to Databricks over the past few weeks, and I have gotten no response at all, not even an acknowledgement that my request was received. Please help. How can I connect a certain dataset in Databricks Community...

Latest Reply
Kaniz
Community Manager

Hi @OU_Professor, To connect a particular data set in Databricks Community Edition to the Power BI desktop, you can follow these steps: 1. Install Databricks JDBC driver: You can download the JDBC driver from the Databricks JDBC driver download page....

jaganadhg
by New Contributor
  • 742 Views
  • 1 replies
  • 0 kudos

Resolved! Clean up Databricks confidential computing resources

Hello all, I created a Databricks Premium workspace for a Confidential Computing PoC. After creating a VM from the Databricks UI, I noticed that there is a new RG with a managed identity, NAT Gateway, public IP, security group, and a VNET (/16). I w...

Administration & Architecture
Confidential Compute
Latest Reply
Kaniz
Community Manager

Hi @jaganadhg, to delete the resources once you complete the work involving confidential computing, you can use the DELETE API call provided in the given information. Here is an example of how to use it: curl -X DELETE 'https://accounts.cloud.d...
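The truncated curl command in the reply targets the Accounts API workspace-deletion endpoint. Below is a minimal Python sketch of the same call, assuming the standard /api/2.0/accounts/{account_id}/workspaces/{workspace_id} route; the account ID, workspace ID, and token are placeholders:

```python
import urllib.request

ACCOUNTS_HOST = "https://accounts.cloud.databricks.com"  # Azure accounts use accounts.azuredatabricks.net

def workspace_delete_request(account_id: str, workspace_id: str, token: str) -> urllib.request.Request:
    """Build (but do not send) the DELETE request that tears down a workspace."""
    url = f"{ACCOUNTS_HOST}/api/2.0/accounts/{account_id}/workspaces/{workspace_id}"
    return urllib.request.Request(
        url,
        method="DELETE",
        headers={"Authorization": f"Bearer {token}"},
    )

req = workspace_delete_request("my-account-id", "1234567890", "dapi-example-token")
print(req.get_method(), req.full_url)
# urllib.request.urlopen(req)  # uncomment to actually send the call
```

Deleting the workspace removes the Databricks-managed resources; anything created separately in the resource group may still need manual cleanup.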

PetePP
by New Contributor II
  • 794 Views
  • 2 replies
  • 0 kudos

Extreme RocksDB memory usage

During migration to production workload, I switched some queries to use RocksDB. I am concerned with its memory usage though. Here is sample output from my streaming query:   "stateOperators" : [ { "operatorName" : "dedupeWithinWatermark", "...
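To quantify how much memory the RocksDB state store is holding, the per-batch progress payload can be aggregated. The sketch below assumes a metric key named rocksdbTotalMemoryUsage under each operator's customMetrics; the exact key can vary by runtime, so check it against your own stateOperators output:

```python
def rocksdb_memory_bytes(progress: dict) -> int:
    """Sum reported RocksDB memory across all state operators in one progress payload."""
    return sum(
        op.get("customMetrics", {}).get("rocksdbTotalMemoryUsage", 0)
        for op in progress.get("stateOperators", [])
    )

# Payload shaped like the streaming-query progress output quoted in the post:
sample = {
    "stateOperators": [
        {"operatorName": "dedupeWithinWatermark",
         "customMetrics": {"rocksdbTotalMemoryUsage": 2 * 1024**3}},
    ]
}
print(rocksdb_memory_bytes(sample))  # → 2147483648
```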

Latest Reply
PetePP
New Contributor II

Thank you for the input. Is there any particular reason why deduplication watermark makes it store everything and not just the key needed for deduplication? The 1st record has to be written to the table anyway, and its content is irrelevant as it jus...

1 More Replies
Bagger
by New Contributor II
  • 1267 Views
  • 1 replies
  • 1 kudos

Resolved! Monitoring job metrics

Hi, we need to monitor Databricks jobs and we have made a setup where we are able to get the Prometheus metrics; however, we are lacking an overview of which metrics refer to what. Namely, we need to monitor the following: failed jobs: is a job failed; tabl...

Administration & Architecture
jobs
metrics
prometheus
Latest Reply
Kaniz
Community Manager

Hi @Bagger, You can monitor Databricks jobs and get the required metrics using a combination of Databricks features and Prometheus. Here's a general idea of how you could approach each metric you mentioned. 1. Failed jobs: Databricks provides a REST ...
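The first item in the reply (detecting failed jobs via the REST API) can be sketched as follows; the payload shape mirrors the Jobs API 2.1 runs/list response, and the run data here is illustrative only:

```python
def failed_runs(runs: list[dict]) -> list[dict]:
    """Keep only runs whose terminal result_state is FAILED."""
    return [
        r for r in runs
        if r.get("state", {}).get("result_state") == "FAILED"
    ]

# Example shaped like GET /api/2.1/jobs/runs/list -> {"runs": [...]}
runs = [
    {"run_id": 1, "state": {"life_cycle_state": "TERMINATED", "result_state": "SUCCESS"}},
    {"run_id": 2, "state": {"life_cycle_state": "TERMINATED", "result_state": "FAILED"}},
]
print([r["run_id"] for r in failed_runs(runs)])  # → [2]
```

The failed-run count can then be exported as a Prometheus gauge on whatever scrape interval the existing setup already uses.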
