- 2115 Views
- 4 replies
- 2 kudos
Resolved! Disable ability to choose PHOTON
Dear all, as an administrator I want to restrict developers from choosing the 'Photon' option on job clusters. I see this in the job definition when they choose it: "runtime_engine": "PHOTON". How can I pass this as input in the policy and restrict develop...
You also need to make sure the policy permissions are set up properly. You can/should fix preexisting compute affected by the policy with the wizard in the policy edit screen.
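For anyone setting this up, a minimal sketch of such a policy (the policy name and the use of the Python SDK here are illustrative assumptions; the key part is fixing runtime_engine to STANDARD):

```python
import json
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # assumes workspace auth is already configured

# Policy definition that pins the runtime engine, so job clusters created
# under this policy cannot enable Photon.
no_photon_definition = {
    "runtime_engine": {
        "type": "fixed",
        "value": "STANDARD",
        "hidden": True,  # also hides the Photon toggle in the cluster UI
    }
}

policy = w.cluster_policies.create(
    name="jobs-no-photon",  # hypothetical policy name
    definition=json.dumps(no_photon_definition),
)
print(policy.policy_id)
```

Developers then need CAN_USE on this policy (and no unrestricted cluster-creation entitlement), which is the permissions point made above.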
- 6142 Views
- 5 replies
- 1 kudos
Databricks App in Azure Databricks with private link cluster (no Public IP)
Hello, I've deployed Azure Databricks with a standard Private Link setup (no public IP). Everything works as expected: I can log in via the private/internal network, create clusters, and manage workloads without any issues. When I create a Databricks Ap...
@Behwar: you need to create a specific private DNS zone for azure.databricksapps.com. If you do an nslookup on your app's URL, you will see that it points to your workspace. In Azure, using Azure (recursive) DNS, you can see an important behavio...
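For reference, a rough sketch of creating that private DNS zone and linking it to the VNet with the Azure SDK for Python; the subscription, resource group, VNet ID, and link name are placeholders, and you would still need the records pointing at the workspace's private endpoint as described above:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.privatedns import PrivateDnsManagementClient
from azure.mgmt.privatedns.models import PrivateZone, SubResource, VirtualNetworkLink

subscription_id = "<subscription-id>"      # placeholder
resource_group = "rg-databricks-network"   # hypothetical resource group
vnet_id = "<resource ID of the VNet hosting the private endpoints>"  # placeholder

client = PrivateDnsManagementClient(DefaultAzureCredential(), subscription_id)

# Private DNS zone so app URLs under azure.databricksapps.com resolve privately.
client.private_zones.begin_create_or_update(
    resource_group,
    "azure.databricksapps.com",
    PrivateZone(location="global"),
).result()

# Link the zone to the VNet so its resolver picks up the zone.
client.virtual_network_links.begin_create_or_update(
    resource_group,
    "azure.databricksapps.com",
    "dbx-apps-link",  # hypothetical link name
    VirtualNetworkLink(
        location="global",
        virtual_network=SubResource(id=vnet_id),
        registration_enabled=False,
    ),
).result()
```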
- 508 Views
- 1 reply
- 0 kudos
Remove S3 buckets
Hi, my Databricks deployment is based on AWS S3. I deleted my buckets, and now Databricks is not working. How do I delete my Databricks? Regards
Hello @owly! To delete Databricks after AWS S3 bucket deletion:
- Terminate all clusters and instance pools.
- Clean up associated resources, like IAM roles, S3 storage configurations, and VPCs.
- Delete the workspace from the Databricks Account Consol...
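As a sketch of that last step, the account-level Workspaces API can remove the workspace once clusters and pools are gone; the account ID and workspace ID below are placeholders:

```python
from databricks.sdk import AccountClient

# Account-level client; assumes account admin credentials are configured.
a = AccountClient(
    host="https://accounts.cloud.databricks.com",
    account_id="<databricks-account-id>",  # placeholder
)

# Find the workspace to remove.
for ws in a.workspaces.list():
    print(ws.workspace_id, ws.workspace_name)

# Deletes the workspace itself; credential, storage, and network
# configurations registered in the account still need separate cleanup.
a.workspaces.delete(workspace_id=1234567890)  # placeholder workspace ID
```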
- 3755 Views
- 1 reply
- 0 kudos
Databricks delta sharing design
Dears, I wanted to have a mindshare around Delta Sharing: how do you decide how many shares to create and share with other departments if you are maintaining an enterprise-wide data warehouse/lakehouse using Azure Databricks? I see from the docum...
Hi @noorbasha534, let me share a bit about our use case and how we’re handling Delta Sharing. Delta Sharing is indeed a simple and lightweight solution, and one of its main advantages is that it’s free to use. However, it still has several limitations...
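For context on the "how many shares" question, this is roughly what grouping objects into a per-department share looks like when run from a notebook; the share, table, and recipient names are made up:

```python
# Minimal sketch: one share per consuming department, with curated tables
# added individually. All names below are illustrative only.
spark.sql("CREATE SHARE IF NOT EXISTS finance_share COMMENT 'Curated objects for Finance'")
spark.sql("ALTER SHARE finance_share ADD TABLE main.curated.daily_revenue")
spark.sql("ALTER SHARE finance_share ADD TABLE main.curated.cost_centers")

# Open-sharing recipient created by the provider; activation details omitted.
spark.sql("CREATE RECIPIENT IF NOT EXISTS finance_dept")
spark.sql("GRANT SELECT ON SHARE finance_share TO RECIPIENT finance_dept")
```

Whether you go one share per department or one per data product mostly comes down to how independently you need to grant and revoke access, since grants are made at the share level.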
- 1453 Views
- 4 replies
- 0 kudos
get permissions assignment done from the workspaces UI
Hi all, I am looking to capture events of permissions assigned on catalogs/schemas/tables/views from the workspace UI; for example, someone gave another user USE CATALOG permission from the UI. Is it possible to capture all such events? Appreciate the minds...
@Advika, could you kindly let me know the action name that I should filter upon...
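One way to pin down the action name is to look at what the UI grants actually write to system.access.audit; a hedged example below. The specific action_name values (updatePermissions is a common one, but that is an assumption to verify) vary, so list the distinct names in your own logs first:

```python
# Sketch: recent Unity Catalog audit events that look like permission changes.
# Requires the system.access.audit table to be enabled and readable.
recent_grants = spark.sql("""
    SELECT event_time,
           user_identity.email AS actor,
           action_name,
           request_params
    FROM system.access.audit
    WHERE service_name = 'unityCatalog'
      AND lower(action_name) LIKE '%permission%'
      AND event_date >= current_date() - INTERVAL 7 DAYS
    ORDER BY event_time DESC
""")
display(recent_grants)
```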
- 2290 Views
- 3 replies
- 1 kudos
Misbehavior of spot instances with fallback to on-demand on job clusters
In the last few days, I've encountered in Azure (and before that also in AWS, but a bit different) this message about failing to start a cluster: "run failed with error message Cluster '0410-173007-1pjmdgi1' was terminated. Reason: INVALID_ARGUMENT (CL...
I see "Fleet instances do not support GPU instances" so in this case it's a no-op
- 838 Views
- 2 replies
- 0 kudos
Access locked out with SSO
We were locked out of our account (expired secret for login via Azure Entra ID and password-based login disabled). How can I add a new secret in Databricks if I'm only able to log in with SSO and this is broken?
- 1672 Views
- 1 reply
- 0 kudos
Implementing Governance on DLT pipelines using compute policy
I am implementing governance over compute creation in the workspaces through custom compute policies for all-purpose, job, and DLT pipeline compute. I was able to create compute policies for all-purpose and job clusters where I could restrict th...
Hi @DeepankarB, To enforce compute policies for DLT pipelines, make sure your policy JSON includes policy_family_id: dlt and set apply_policy_default_values: true in the pipeline cluster settings. This helps apply the instance restrictions correctly ...
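As a hedged sketch of the two pieces mentioned above, a DLT-scoped policy definition and the matching pipeline cluster settings could look roughly like this; the instance types and policy_id are assumptions:

```python
import json

# Policy definition aimed at DLT: cluster_type pins the policy to pipeline
# clusters, and node_type_id restricts the allowed instance sizes.
dlt_policy_definition = {
    "cluster_type": {"type": "fixed", "value": "dlt"},
    "node_type_id": {
        "type": "allowlist",
        "values": ["Standard_D4ds_v5", "Standard_D8ds_v5"],  # placeholders
    },
}
print(json.dumps(dlt_policy_definition, indent=2))

# In the pipeline settings, reference the policy and apply its defaults.
pipeline_clusters = [
    {
        "label": "default",
        "policy_id": "ABC123DEF456",          # placeholder policy ID
        "apply_policy_default_values": True,
        "autoscale": {"min_workers": 1, "max_workers": 4},
    }
]
```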
- 2192 Views
- 4 replies
- 0 kudos
Databricks Predictive optimization
If we want to enable Databricks Predictive Optimization, is it also mandatory to enable serverless job/notebook compute in our account? We already have a serverless SQL warehouse available in our workspaces.
The documentation states this: "Predictive optimization identifies tables that would benefit from ANALYZE, OPTIMIZE, and VACUUM operations and queues them to run using serverless compute for jobs." If I don't have serverless workloads enabled, how does pr...
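For reference, a hedged sketch of how enabling it per catalog looks once the account-level setting is on (the catalog name is made up); the maintenance itself runs on Databricks-managed serverless compute, per the documentation quoted above:

```python
# Sketch: opt a catalog in to predictive optimization and inspect the result.
spark.sql("ALTER CATALOG main ENABLE PREDICTIVE OPTIMIZATION")

# The effective setting is inherited by schemas and tables under the catalog.
display(spark.sql("DESCRIBE CATALOG EXTENDED main"))
```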
- 4346 Views
- 2 replies
- 0 kudos
Resolved! Migrate to a new account
Hey Team, we're looking into migrating our current Databricks solution from one AWS account (us-east-1 region) to another (eu-central-1 region). I have no documentation left about how the current solution was provisioned, but I can see CloudFormation...
I ended up using the terraform-databricks-provider tool to perform an export and import of the old workspace into the new one. All that was needed was a PAT in each workspace: export from the old, sed the region, account, and PAT, and apply. This got me about 7...
- 796 Views
- 1 reply
- 0 kudos
Does using SDK API calls cost money?
When using the Databricks SDK to retrieve metadata (such as catalogs, schemas, or tables) through its built-in API endpoints, does this incur any cost similar to running SQL queries? Specifically, executing SQL queries via the API spins up a compute clu...
Hi there @Skully, you are right: since you are just fetching metadata information about catalogs, tables, etc. instead of directly interacting with compute or running any SQL queries, it doesn't cost the same as creating a compute resource. When we retrieve the metadata inform...
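To make the distinction concrete, a small sketch of metadata-only calls with the Python SDK; these hit the Unity Catalog REST API on the control plane rather than starting compute (catalog and schema names are placeholders):

```python
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # assumes workspace auth is configured

# Pure metadata reads: no warehouse or cluster is started for these calls.
for catalog in w.catalogs.list():
    print(catalog.name)

for table in w.tables.list(catalog_name="main", schema_name="default"):
    print(table.full_name, table.table_type)

# By contrast, w.statement_execution.execute_statement(...) runs SQL against
# a SQL warehouse, and that warehouse is what incurs compute cost.
```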
- 2261 Views
- 4 replies
- 1 kudos
Resolved! Enable Databricks system tables
Hi, we want to enable some system tables in our Databricks workspace using this command: curl -v -X PUT -H "Authorization: Bearer <PAT token>" "https://adb-0000000000.azuredatabricks.net/api/2.0/unity-catalog/metastores/<metastore-id>/systemsche...
While disabling some system schemas we disabled the billing system schema, and now we cannot enable it again due to this error: "billing system schema can only be enabled by Databricks". How can I re-enable the billing schema?
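For the schemas that are user-manageable, the same system-schemas endpoint the curl call above targets is also wrapped in the Python SDK, which avoids hand-building the URL; billing itself, per the error message, has to be handled by Databricks:

```python
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # assumes workspace auth is configured

metastore_id = w.metastores.current().metastore_id  # or paste the metastore ID

# See which system schemas exist and their current state.
for schema in w.system_schemas.list(metastore_id=metastore_id):
    print(schema.schema, schema.state)

# Enable a user-manageable schema, e.g. access.
w.system_schemas.enable(metastore_id=metastore_id, schema_name="access")
```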
- 1462 Views
- 1 reply
- 0 kudos
Collation problem with df.first() when different from UTF8_BINARY
I'm getting an error when I want to select first() from a dataframe when using a collation different from UTF8_BINARY. This works:
df_result = spark.sql(f"""SELECT 'en-us' AS ETLLanguageCode""")
display(df_result)
print(df_resu...
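The failing statement is cut off above, so this is only a guess at the shape of the issue; a heavily hedged workaround that has helped with similar collation problems is to cast the column back to the default collation before first()/collect() (requires a runtime with collation support, and the repro below is an assumption):

```python
from pyspark.sql import functions as F

# Hypothetical repro shape: a column carrying a non-default collation.
df = spark.sql("SELECT 'en-us' COLLATE UTF8_LCASE AS ETLLanguageCode")

# Workaround sketch: drop the custom collation before pulling rows to the
# driver, so the value travels as a plain UTF8_BINARY string.
normalized = df.select(
    F.expr("CAST(ETLLanguageCode AS STRING COLLATE UTF8_BINARY)").alias("ETLLanguageCode")
)
print(normalized.first()["ETLLanguageCode"])
```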
- 1823 Views
- 1 reply
- 0 kudos
Resolved! Can a SQL Warehouse Pro be shared across multiple workspaces
I'm currently using a SQL Warehouse Pro in one of my Databricks workspaces, and I’m trying to optimize costs. Since the Pro warehouse can be quite expensive to run, I’d prefer not to spin up additional instances in each workspace. Is there any way to ...
Hi @jfid, a SQL Warehouse Pro instance cannot be shared directly across multiple Databricks workspaces. Each workspace requires its own SQL Warehouse instance, even if the compute and data access needs are similar. This is because compute resources li...
- 3800 Views
- 2 replies
- 0 kudos
Convert Account to Self-managed
I am in the process of setting up a new Databricks account for AWS commercial. I mistakenly set up the account with the email databricks-external-nonprod-account-owner@slingshotaerospace.com to not be self-managed, and I would like for this new accoun...
Or better yet if we could delete it so I can re-create the account.