Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

rew_data
by New Contributor II
  • 1630 Views
  • 4 replies
  • 0 kudos

Resolved! Databricks apps, data plane configuration not supported

Unable to create app; I get a 'This workspace has a data plane configuration that is not yet supported' message. Is there something specific I should look for configuration-wise to correct the issue? Azure hosted. Virtual network.

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hello @rew_data, You might want to check if your region is available for Databricks Apps; please refer to: https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-apps/ The error message "This workspace has a data plane configuratio...

3 More Replies
gyorgyjelinek
by New Contributor II
  • 632 Views
  • 1 reply
  • 1 kudos

REST API List dashboard schedules - 501 NOT IMPLEMENTED

When I try to retrieve the dashboard scheduling info based on the REST API List dashboard schedules, I receive the following `501 NOT IMPLEMENTED` response: { "error_code": "NOT_IMPLEMENTED", "message": "This API is not yet supported." } But e.g. the...

Latest Reply
Alberto_Umana
Databricks Employee
  • 1 kudos

Hello @gyorgyjelinek, I just tried testing on my end and got the same failure you did: python3 list_dashboardID.py Error 501: {"error_code":"NOT_IMPLEMENTED","message":"This API is not yet supported."} This endpoint might not be fully supported yet ...

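Until the endpoint is rolled out, callers can at least degrade gracefully rather than crash on the 501. A minimal Python sketch of handling the response shown above (the success-path `schedules` key is an assumption about the eventual response shape, not confirmed output):

```python
import json

def parse_schedules_response(status_code: int, body: str):
    """Interpret a List Schedules response: return the schedule list on
    success, or None when the workspace answers 501 NOT_IMPLEMENTED."""
    if status_code == 501:
        error = json.loads(body)
        print(f"Endpoint not available yet: {error.get('message')}")
        return None
    if status_code != 200:
        raise RuntimeError(f"Unexpected status {status_code}: {body}")
    return json.loads(body).get("schedules", [])
```

The caller can then treat `None` as "poll again later" instead of surfacing an error to users.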
iskidet01
by New Contributor II
  • 1577 Views
  • 1 reply
  • 0 kudos

Resolved! Challenge isolating Databricks workspaces with a single Unity Catalog metastore for multiple workspaces

Hello Community, I am currently managing multiple workspaces for various projects and facing challenges in achieving data asset isolation between these workspaces. My goal is to ensure that data sharing happens exclusively through Delta Sharing. The cu...

Latest Reply
Stefan-Koch
Valued Contributor II
  • 0 kudos

Hi iskidet01, you can use workspace-catalog bindings: https://learn.microsoft.com/en-us/azure/databricks/catalogs/#workspace-catalog-binding. When you create a catalog, you can assign it to specific workspaces, instead of "All workspaces have access...

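The binding can also be driven over the REST API; a sketch assuming the Unity Catalog workspace-bindings endpoint (`PATCH /api/2.1/unity-catalog/workspace-bindings/catalogs/{name}`) and a catalog already in ISOLATED isolation mode — verify both against the current API docs:

```python
import json

def bind_catalog_request(host: str, catalog: str, workspace_ids):
    """Build the PATCH request that assigns a catalog to specific
    workspaces; workspaces not assigned lose access once the catalog
    is in ISOLATED mode."""
    url = f"{host.rstrip('/')}/api/2.1/unity-catalog/workspace-bindings/catalogs/{catalog}"
    body = json.dumps({"assign_workspaces": list(workspace_ids)})
    return url, body

# e.g. requests.patch(url, data=body,
#                     headers={"Authorization": f"Bearer {token}"})
```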
jonxu
by New Contributor III
  • 1089 Views
  • 3 replies
  • 0 kudos

PAT needed but not allowed in "Advanced Data Engineering - 6.5L Deploy pipeline with the CLI Lab"

It is stated in the lab notebook that: "Run the setup — Run the setup script for this lesson by running the cell below. This will ensure that: the Databricks CLI is installed; authentication is configured; a pipeline is created." However, when I tried to run the ...

Latest Reply
jonxu
New Contributor III
  • 0 kudos

... and if I start with step 5, using workspace-level authorisation, I ended up with "localhost refused to connect." in the generated link.

2 More Replies
martkev
by New Contributor III
  • 927 Views
  • 1 reply
  • 0 kudos

Will Lakehouse Federation between Databricks and Snowflake support Azure Entra ID?

The Lakehouse Federation between Databricks and Snowflake looks promising, but the lack of support for Azure Entra ID as an identity provider (IdP) is a big limitation for enterprises standardized on it. Managing separate OAuth flows or using Snowflak...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hello @martkev, Currently, Azure Databricks does not support using Azure Entra ID (formerly Azure Active Directory) directly as an identity provider (IdP) for federated queries on Snowflake. The only supported OAuth integration for Snowflake is Snowf...

seckinaktas
by New Contributor II
  • 2956 Views
  • 2 replies
  • 2 kudos

Resolved! system schemas permission

Hi, I'm an account admin on Databricks, and when I try to set SELECT permission for system schemas I get "PERMISSION_DENIED: User is not an owner of Schema 'system.compute'." When I try to set permission for the system catalog, I get "Requires ownership o...

Latest Reply
seckinaktas
New Contributor II
  • 2 kudos

Thank you so much

1 More Replies
AlbertWang
by Valued Contributor
  • 2740 Views
  • 3 replies
  • 2 kudos

Networking configuration of Azure Databricks managed storage account

Hi all, I created an Azure Databricks workspace, and the workspace creates an Azure Databricks managed storage account. The networking configuration of the storage account is "Enabled from all networks". Shall I change it to "Enabled from selected virtu...

Latest Reply
Walter_C
Databricks Employee
  • 2 kudos

You don't need view permission on the subnets themselves. Regarding disabling key access, you could use any of the other authentication methods listed here: https://learn.microsoft.com/en-us/azure/databricks/connect/storage/azure-storage#connect-to-azure-data-lak...

2 More Replies
SunilPoluri
by New Contributor
  • 1998 Views
  • 2 replies
  • 0 kudos

Create a Databricks-managed service principal programmatically?

For the current Databricks service principal API or the Databricks SDK, an ID is required. However, when dealing with Databricks-managed service principals, you typically only have the name. For registering with cloud providers, like Microsoft Entra ...

Latest Reply
giladba
New Contributor III
  • 0 kudos

Have you found a solution on how to programmatically create a Databricks managed service principal?

1 More Replies
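One route worth testing for Databricks-managed principals is the SCIM endpoint: posting a body with only a displayName should let Databricks generate the identity itself. A sketch of the request body — the endpoint path and schema URN here are my assumptions to verify against the SCIM API docs:

```python
import json

SCIM_PATH = "/api/2.0/preview/scim/v2/ServicePrincipals"  # assumed path

def managed_sp_payload(display_name: str) -> str:
    """SCIM body for a Databricks-managed service principal: no
    applicationId is supplied, so Databricks creates its own identity
    rather than linking an existing cloud-provider registration."""
    return json.dumps({
        "schemas": ["urn:ietf:params:scim:schemas:core:2.0:ServicePrincipal"],
        "displayName": display_name,
        "active": True,
    })
```

The response should then include the generated ID needed by the other service principal APIs.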
gyorgyjelinek
by New Contributor II
  • 1747 Views
  • 2 replies
  • 2 kudos

Resolved! Default schema in SQL Editor is not 'default' when unity catalog is set as default catalog

In workspace settings: Workspace admin - Advanced - Other - "Default catalog for the workspace" is set to something other than hive_metastore; it is set to a `Unity Catalog` catalog. The expected behaviour is copied here from the related more-info panel: "Se...

Latest Reply
gyorgyjelinek
New Contributor II
  • 2 kudos

Hi @Alberto_Umana, thank you for the explanation. I mark your comment as the accepted solution, as it contains the current implementation logic and the workaround. Good to know that the more-info panel is a bit misleading as of now, because the SQL Ed...

1 More Replies
charl-p-botha
by New Contributor III
  • 3863 Views
  • 10 replies
  • 3 kudos

Error "Integrating Apache Spark with Databricks Unity Catalog Assets via Open APIs" on Azure

Great blog post: https://community.databricks.com/t5/technical-blog/integrating-apache-spark-with-databricks-unity-catalog-assets/ba-p/97533 I have attempted to reproduce this with Azure Databricks and ADLS Gen2 as the storage backend. Although I'm ab...

Latest Reply
charl-p-botha
New Contributor III
  • 3 kudos

Thanks @dkushari. I looked at the GitHub issue you posted, but it has to do specifically with DELTA_UNSUPPORTED_SCHEMA_DURING_READ when streaming *from* a Delta table. The specific error I'm seeing is a key error for the Azure storage account hosting t...

9 More Replies
kevindenham
by New Contributor
  • 928 Views
  • 1 reply
  • 0 kudos

Python User Input During Run-time

I'm new to Advana and was putting together a Python script that requires user interactions during run-time. However, the program pauses at an 'input()' command without a console cell to accept responses. Am I missing something about this instance of ...

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

This is a common issue because Jupyter Notebooks are not designed to handle interactive input in the same way as a standard Python script run in a terminal. In Jupyter Notebooks, the input() function does not work as expected because the notebook int...

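In notebooks, runtime parameters are usually passed through widgets rather than input(). A minimal sketch with a local fallback — the `dbutils` object only exists inside a Databricks notebook, so the helper degrades to a default anywhere else:

```python
def get_param(name: str, default: str) -> str:
    """Read a runtime parameter from a Databricks widget when running
    in a notebook; fall back to a default value everywhere else."""
    try:
        return dbutils.widgets.get(name)  # dbutils is defined only in notebooks
    except NameError:                     # running outside Databricks
        return default

# In a notebook you would first declare the widget:
# dbutils.widgets.text("environment", "dev")
env = get_param("environment", "dev")
```

Note that inside a notebook, `dbutils.widgets.get` raises its own error if the widget was never declared, so the `text(...)` call above has to run first.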
AnkurMittal008
by New Contributor III
  • 962 Views
  • 1 reply
  • 3 kudos

Disable Catalog for predictive optimization

Say we disabled predictive optimization for a specific catalog named "CatalogXYZ" and after that enabled predictive optimization at the account level. Can the schema owner for "CatalogXYZ.TestSchema" then enable the predictive optimi...

Latest Reply
Walter_C
Databricks Employee
  • 3 kudos

If predictive optimization is disabled for the catalog "CatalogXYZ" and then enabled at the account level, the schema owner for "CatalogXYZ.TestSchema" cannot enable predictive optimization for this schema. This is because the predictive optimization...

Phani1
by Databricks MVP
  • 708 Views
  • 1 reply
  • 0 kudos

Downstream usage control on Serverless

Hi All, we've noticed a significant increase in our Databricks serverless usage due to downstream system activity. We would like to reduce overall serverless consumption. Please suggest possible ways and best practices we can implement to...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hello @Phani1, you might want to review this document: https://docs.databricks.com/en/compute/serverless/best-practices.html Let me know if you have any further questions.

amberleong
by New Contributor
  • 828 Views
  • 1 reply
  • 0 kudos

How to only allow one git branch, one folder?

Users are able to switch branches in the Git UI. How can we restrict this, or allow only one branch? Also, for the sparse checkout, how can we view only one folder (without files from the root)?

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hello @amberleong, to restrict users from switching branches in the Git UI and allow only one branch, you can implement branch protection rules in your Git repository, directly in your Git provider.

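On the sparse-checkout half of the question: Databricks Repos uses git's cone mode, which always materialises root-level files, so hiding them isn't possible there. Outside Repos, plain git's non-cone mode can restrict the work tree to a single folder. A throwaway local sketch (no remote needed; assumes a reasonably recent git, roughly 2.37+):

```shell
set -e
src=$(mktemp -d); clone=$(mktemp -d)

# Throwaway repo: one root file plus two folders.
cd "$src"
git init -q -b main
echo root > README.md
mkdir -p src/app docs
echo code > src/app/main.py
echo doc > docs/index.md
git add -A
git -c user.email=demo@example.com -c user.name=demo commit -qm init

# Clone without checkout, then restrict the work tree to src/app only.
# Non-cone patterns can exclude root files, which cone mode cannot.
git clone -q --no-checkout "$src" "$clone"
cd "$clone"
git sparse-checkout set --no-cone '/src/app/*'
git checkout -q main
ls   # shows only src/
```

In cone mode the same `set` call would still keep README.md in the work tree, which is why Repos always shows root files.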
dtb_usr
by New Contributor II
  • 2163 Views
  • 2 replies
  • 0 kudos

Okta SSO Unified login in GCP

Hi, there are versions of this question posted already, but they seem to refer to legacy features. Our organisation uses the Google Workspace IdP provisioned via Okta as the first landing point, and all apps are secured behind this. We have purchased Databri...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hello @dtb_usr, it is possible to use the Okta IdP to log in to Databricks on GCP; please refer to: https://docs.gcp.databricks.com/en/admin/users-groups/scim/okta.html

1 More Replies