Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

noorbasha534
by Valued Contributor
  • 924 Views
  • 1 reply
  • 1 kudos

Disable usage of serverless jobs & serverless all-purpose clusters usage

Dear all, I see that some developers have started using serverless jobs and serverless all-purpose clusters. As a platform admin, I would like to disable them, as we are not yet prepared as a team to move to serverless; we get huge discounts on compute from Microsoft ...

Latest Reply
ashraf1395
Honored Contributor
  • 1 kudos

You can disable the serverless compute feature from your account console: https://docs.databricks.com/aws/en/admin/workspace-settings/serverless#enable-serverless-compute. I have heard that for some, if this option is not available, it means it is au...

Chris2794
by New Contributor II
  • 341 Views
  • 1 reply
  • 0 kudos

Azure Databricks databricks-cli authentication with M2M using environment variables

Which environment variables do I have to set to use the databricks-cli with M2M OAuth using Microsoft Entra ID managed service principals? I already added the service principal to the workspace. I found the following documentation, but I am still conf...

Latest Reply
ashraf1395
Honored Contributor
  • 0 kudos

I would suggest creating a .databrickscfg profile; it is best practice and the easiest approach. You will need to create it in your ~ directory (vim ~/.databrickscfg). Inside the file you can define multiple profiles like this: https://learn.microsoft.com/en-us/azure/databr...
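A minimal sketch of the environment-variable route, assuming the Python databricks-sdk (which shares the unified client authentication used by the databricks CLI); the host URL, tenant ID, client ID, and secret below are placeholders:

```python
import os
from databricks.sdk import WorkspaceClient

# Environment variables read by unified client authentication for
# Azure Entra ID service principals (values are placeholders):
os.environ["DATABRICKS_HOST"] = "https://adb-1234567890123456.7.azuredatabricks.net"
os.environ["ARM_TENANT_ID"] = "<entra-tenant-id>"
os.environ["ARM_CLIENT_ID"] = "<service-principal-application-id>"
os.environ["ARM_CLIENT_SECRET"] = "<service-principal-client-secret>"

# The databricks CLI picks up the same variables; the SDK is used here only
# to confirm that the service principal can authenticate to the workspace.
w = WorkspaceClient()
print(w.current_user.me().user_name)
```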

bhanu_dp
by New Contributor III
  • 958 Views
  • 2 replies
  • 0 kudos

How to restore if a catalog is deleted

I am looking to identify potential pitfalls in a decentralized workspace framework where the key business owners have full access to their respective workspaces and catalogs. In case of an accidental delete/drop of a schema or catalog from UC, what are th...

Administration & Architecture
catalog
DR
Recovery
Latest Reply
KaranamS
Contributor III
  • 0 kudos

Hi @bhanu_dp, to recover from accidental deletes, you can: 1. Restore the table to a previous version using the time travel feature (https://docs.databricks.com/gcp/en/delta/history#restore-a-delta-table-to-an-earlier-state); 2. Use the UNDROP command: https://docs.databrick...
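To make the two options concrete, a small sketch run from a Databricks notebook (the catalog, schema, table, and version number are placeholders; UNDROP applies to recently dropped Unity Catalog managed tables):

```python
# Option 1: roll a Delta table back to an earlier state with time travel.
spark.sql("DESCRIBE HISTORY my_catalog.my_schema.my_table").show()           # find a good version
spark.sql("RESTORE TABLE my_catalog.my_schema.my_table TO VERSION AS OF 5")  # placeholder version

# Option 2: recover a recently dropped managed table with UNDROP.
spark.sql("SHOW TABLES DROPPED IN my_catalog.my_schema").show()              # list recoverable tables
spark.sql("UNDROP TABLE my_catalog.my_schema.my_table")
```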

1 More Replies
pranav_
by New Contributor
  • 1476 Views
  • 1 reply
  • 0 kudos

How to Query All the users who have access to a databricks workspace?

Hi there, I'm new to Databricks and we currently have a lot of users among different groups with access to a Databricks workspace. I would like to know how I could query the users, groups, and entitlements of each group using SQL or the API. In case ...

Latest Reply
tejaskelkar
New Contributor II
  • 0 kudos

To query all users who have access to a Databricks workspace, you can follow these steps: 1. Check workspace users via the Admin Console. If you are a workspace admin, navigate to the Admin Console in the Databricks UI. Under the "Users" tab, you can view a...
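For the API side, a small sketch using the Python databricks-sdk (workspace-level SCIM endpoints; assumes the SDK is installed and authentication is already configured):

```python
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # reuses existing CLI/environment authentication

# List workspace users.
for user in w.users.list(attributes="userName,active"):
    print(user.user_name, user.active)

# List groups with their members and entitlements.
for group in w.groups.list(attributes="displayName,members,entitlements"):
    members = [m.display for m in (group.members or [])]
    entitlements = [e.value for e in (group.entitlements or [])]
    print(group.display_name, members, entitlements)
```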

borft
by New Contributor
  • 1490 Views
  • 0 replies
  • 0 kudos

Databricks on GCP admin console access

Hi, I'm trying to update the GCP permissions for Databricks as described here: https://docs.databricks.com/gcp/en/admin/cloud-configurations/gcp/gce-update. To be able to do that, I have to log in to the account console here: https://accounts.gcp.databr...

SANJAYKJ
by New Contributor II
  • 572 Views
  • 1 reply
  • 0 kudos

Spark Executor - Parallelism Question

While reading the book Spark: The Definitive Guide, I came across the below statement in Chapter 2 on partitions: "If you have many partitions but only one executor, Spark will still have a parallelism of only one because there is only one computation res...

Latest Reply
Isi
Contributor III
  • 0 kudos

Hey @SANJAYKJ, it is correct in the sense that a single executor is a limiting factor, but the actual parallelism within that executor depends on the number of cores assigned to it. If you want to leverage multiple partitions effectively, you either n...
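A small illustrative PySpark snippet of that point, run from a notebook: the partition count caps how many tasks can exist, while executors times cores per executor caps how many run at the same time (the numbers are placeholders):

```python
# 200 partitions -> 200 tasks, but concurrency is bounded by
# (number of executors) x (cores per executor).
df = spark.range(0, 1_000_000).repartition(200)

print(df.rdd.getNumPartitions())              # 200 partitions
print(spark.sparkContext.defaultParallelism)  # roughly the total cores available

# With 1 executor and 1 core, the 200 tasks run one at a time;
# with 1 executor and 8 cores, up to 8 run concurrently (watch the Spark UI).
df.count()
```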

mrstevegross
by Contributor III
  • 1808 Views
  • 4 replies
  • 0 kudos

Resolved! Possible to programmatically adjust Databricks instance pool more intelligently?

We'd like to adopt a Databricks instance pool in order to reduce instance-acquisition times (a significant contributor to our test latency). Based on my understanding of the docs, the main levers we can control are: min instance count, max instance cou...

Latest Reply
Isi
Contributor III
  • 0 kudos

Hi Steve, if the goal is to pre-warm 100 instances in the Databricks instance pool, you could create a temporary job that requests instances from the pool. This ensures that Databricks provisions the required instances before the actual test run. T...
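A hedged sketch of that approach with the Python databricks-sdk: start a short-lived cluster that draws its workers from the pool so the pool acquires instances before the tests run; the pool ID, DBR version, and worker count are placeholders:

```python
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

# Throwaway cluster backed by the instance pool; requesting N workers forces
# the pool to acquire (up to) that many instances ahead of the real workload.
cluster = w.clusters.create_and_wait(
    cluster_name="pool-prewarm",                 # placeholder name
    spark_version="15.4.x-scala2.12",            # placeholder DBR version
    instance_pool_id="<your-instance-pool-id>",  # placeholder pool ID
    num_workers=100,                             # placeholder worker count
    autotermination_minutes=10,
)

# Deleting the cluster returns the warm instances to the pool's idle count
# (subject to the pool's idle-instance and idle-timeout settings).
w.clusters.permanent_delete(cluster_id=cluster.cluster_id)
```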

3 More Replies
ErikApption
by New Contributor II
  • 1984 Views
  • 1 reply
  • 0 kudos

Notebook runs not found due to retention limits with dbutils.notebook.run

We saw this odd error in an AWS deployment: we have one notebook calling another through dbutils.notebook.run(...), and this suddenly stopped working and failed with "Notebook runs not found due to retention limits". The "learn more" link points to Dat...

Latest Reply
Isi
Contributor III
  • 0 kudos

Hey @ErikApption, maybe I am wrong, but I will give you my opinion. Each time you execute dbutils.notebook.run(), it launches a new and independent execution within the same cluster. So, if you run the cell today and then run it again tomorrow, there s...
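For reference, the call in question looks roughly like the sketch below (run inside a Databricks notebook where dbutils is available; the path and arguments are placeholders). Each invocation is a separate ephemeral run, and the output page of an old ephemeral run is only retained for a limited period, which may be where the retention-limit message comes from when following a stale link.

```python
# Run a child notebook on the same cluster. Each call creates a new,
# independent ephemeral run whose output page is retained only for a while.
result = dbutils.notebook.run(
    "/Workspace/Users/someone@example.com/child_notebook",  # placeholder path
    600,                                                     # timeout in seconds
    {"run_date": "2024-01-01"},                              # placeholder arguments
)
print(result)  # whatever the child returned via dbutils.notebook.exit(...)
```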

tom_1
by New Contributor III
  • 1531 Views
  • 0 replies
  • 0 kudos

Customer Managed VPC: Databricks IP Address Ranges

Hello, how often does Databricks change its public IP addresses (the ones that must be whitelisted in a customer-managed VPC) and where can I find them? I found this list, but it seems to be incomplete. We moved from a managed VPC to a customer-managed ...

zsucic1
by New Contributor III
  • 9563 Views
  • 3 replies
  • 1 kudos

Current Azure Managed Identity capabilities 2024?

Hello everyone, I have a few questions about MI capabilities: Is it possible to define a managed identity for an Azure Databricks Service resource and use it for, e.g.: writing to an Azure SQL Server database; authenticating to Azure DevOps in order to downlo...

Latest Reply
zsucic1
New Contributor III
  • 1 kudos

Kaniz, thank you very much, you are the best! I will get to work implementing your advice

2 More Replies
sam_tw
by New Contributor II
  • 2572 Views
  • 1 reply
  • 1 kudos

Resolved! Community Edition - Photon enabled possible?

Is it possible to use a Photon-enabled cluster in the Community Edition? I want to use DBR 13.3 LTS, but when choosing that, there is no option to enable Photon. I want to test the spatial functionality in the Databricks library Mosaic, and it appears Photon...

Latest Reply
Stefan-Koch
Valued Contributor II
  • 1 kudos

Hi @sam_tw, Photon is not available in the Community Edition.

SarahA
by New Contributor II
  • 1557 Views
  • 1 reply
  • 1 kudos

Resolved! Databricks app - permissions needed

Hi. I am trying to create a new Databricks app and I get the following error: "Failed to create app [appname]. User does not have permission to grant resource sql-warehouse." Can someone tell me what level of access I require in order to generate a data...

Latest Reply
lingareddy_Alva
Honored Contributor II
  • 1 kudos

Hi @SarahA, required permissions to create a Databricks app. You'll need the following permissions in Databricks: CAN MANAGE on the SQL Warehouse (required role/grant: SQL Warehouse Admin or Owner) to manage warehouse settings and assign it to ...

DavidSzedlak
by New Contributor
  • 1303 Views
  • 2 replies
  • 0 kudos

Mismatch of Columns in databricks vs Athena.

We are trying to expose one of our external tables to Databricks via Unity Catalog, but we are having an issue with column mismatch, i.e., a few of our columns are not visible in Databricks. Is this a known issue? If so, can anyone advise me on where...

Latest Reply
lingareddy_Alva
Honored Contributor II
  • 0 kudos

Hi @DavidSzedlak, 1. Unity Catalog caches metadata for performance. If new columns were added to the source table after the initial creation, they may not be reflected. Run the following command to refresh the metadata: ALTER TABLE <catalog>.<schema>.<ta...
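A quick way to see which columns Unity Catalog currently exposes, so they can be compared against Athena (a sketch run from a notebook; the catalog, schema, and table names are placeholders, and the exact refresh command is the truncated ALTER TABLE statement above):

```python
# Columns as Unity Catalog currently sees them; compare against the source.
spark.sql("DESCRIBE TABLE EXTENDED my_catalog.my_schema.my_table").show(truncate=False)

# The same column list via information_schema.
spark.sql("""
    SELECT column_name, data_type
    FROM my_catalog.information_schema.columns
    WHERE table_schema = 'my_schema' AND table_name = 'my_table'
    ORDER BY ordinal_position
""").show(truncate=False)
```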

1 More Replies
kch
by New Contributor II
  • 822 Views
  • 3 replies
  • 0 kudos

Timeout on docker pull in Databricks Container Services

Hello, there is a timeout that limits the size of images used in Docker Container Services. When using images containing large ML libraries, the size often exceeds the limit that can be pulled. Is there any plan to add parametrization of this timeout...

Latest Reply
kch
New Contributor II
  • 0 kudos

Are there any new or planned changes in the policy?

2 More Replies