- 1967 Views
- 5 replies
- 5 kudos
Resolved! Databricks cluster pool deployed through Terraform does not have UC enabled
Hello everyone, we have a workspace with UC enabled, we already have a couple of catalogs attached, and when using our personal compute we are able to read/write tables in those catalogs. However, for our jobs we deployed a cluster pool using Terraform b...
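A likely culprit (an assumption, not the thread's confirmed resolution): clusters created from a pool are only Unity Catalog-enabled if their access mode is set explicitly. A minimal sketch with the Databricks SDK for Python, using hypothetical names and IDs, shows the equivalent of setting data_security_mode in the Terraform cluster resource:

```python
# Sketch only: a cluster drawn from an instance pool gets UC access when an
# access mode (data_security_mode) is set; pool ID and names are hypothetical.
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.compute import DataSecurityMode

w = WorkspaceClient()  # auth from environment or .databrickscfg

cluster = w.clusters.create_and_wait(
    cluster_name="uc-enabled-from-pool",
    spark_version="15.4.x-scala2.12",                # assumed LTS runtime
    instance_pool_id="<terraform-deployed-pool-id>",
    num_workers=2,
    data_security_mode=DataSecurityMode.USER_ISOLATION,  # shared UC access mode
)
print(cluster.cluster_id)
```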
- 717 Views
- 1 reply
- 0 kudos
GCP Databricks | Workspace Creation Error: Storage Credentials Limit Reached
Hi Team, we are encountering an issue while trying to create a Databricks workspace in the GCP region us-central1. Below is the error message:
Workspace Status: Failed
Details: Workspace failed to launch.
Error: BAD REQUEST: Cannot create 1...
- 0 kudos
Hi @karthiknuvepro, do you have an active support plan? Through a support ticket with us, we can request an increase of this limit.
- 1747 Views
- 4 replies
- 0 kudos
How do we get the list of users who accessed/downloaded a specific model in Unity Catalog in the last 6 months?
- 0 kudos
Hi @AnkitShah, I just tried on my end and found these 2 tables that might be useful. They do not show exactly who downloaded a model artifact, but who interacted with it: https://docs.databricks.com/en/ai-gateway/configure-ai-gateway-endpoints.html#usag...
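Alongside those tables, a hedged sketch of a complementary route: querying the Unity Catalog audit system table for events on a specific model. The request_params key and the model name below are assumptions to verify against your own audit rows:

```python
# Sketch: list who interacted with a UC-registered model in the last 6 months.
# system.access.audit is the documented audit system table; the
# 'full_name_arg' key and the model name are assumptions.
model_name = "main.default.my_model"  # hypothetical

events = spark.sql(f"""
    SELECT event_time, user_identity.email AS user, action_name
    FROM system.access.audit
    WHERE event_date >= add_months(current_date(), -6)
      AND service_name = 'unityCatalog'
      AND request_params['full_name_arg'] = '{model_name}'
    ORDER BY event_time DESC
""")
display(events)
```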
- 1912 Views
- 6 replies
- 1 kudos
Governance to restrict compute creation
Hi, cluster policies used to be an easy way to handle governance on compute. However, more and more, there seems to be no way to control many new compute features within the platform. We currently have this issue for model serving endpoints and vector...
- 1 kudos
If you are looking to restrict end users to creating only certain cluster configurations, you can do so by using the Databricks APIs. Through Python and the Databricks API, you can specify what kind of cluster configurations are allowed and also restrict users ...
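A minimal sketch of that API-based approach, using the Databricks SDK for Python to create a cluster policy; the policy name and the limits chosen here are illustrative only:

```python
# Sketch: create a cluster policy that pins the node type, caps autoscaling,
# and forces auto-termination. Values are illustrative.
import json
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

definition = {
    "node_type_id": {"type": "fixed", "value": "Standard_DS3_v2"},
    "autoscale.max_workers": {"type": "range", "maxValue": 4},
    "autotermination_minutes": {"type": "fixed", "value": 30},
}

policy = w.cluster_policies.create(
    name="restricted-compute",  # hypothetical policy name
    definition=json.dumps(definition),
)
print(policy.policy_id)
```

Users who are granted only this policy, rather than unrestricted cluster-creation rights, can then create clusters solely within these bounds.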
- 3173 Views
- 1 reply
- 4 kudos
High memory usage on Databricks cluster
In my team we see very high memory usage even when the cluster has just been started and nothing has been run yet. Additionally, memory usage never drops to lower levels: total used memory always fluctuates around 14 GB. Where is this memory usage ...
- 4 kudos
This is not necessarily an issue. Linux uses a lot of RAM for caching, but this does not mean it cannot be released for processes (dynamic memory management). Basically, the philosophy is that RAM that is not used (so actually 'free') is useless. Here is a re...
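To see this on the driver node, a small illustrative sketch comparing MemFree with MemAvailable (the kernel's estimate of memory that can be reclaimed even though it is currently "used" as cache):

```python
# Parse /proc/meminfo (Linux only) and compare "free" vs "available" memory.
def meminfo_gib():
    fields = {}
    with open("/proc/meminfo") as f:
        for line in f:
            key, value = line.split(":")
            fields[key.strip()] = int(value.split()[0]) / 1024**2  # kB -> GiB
    return fields

m = meminfo_gib()
print(f"MemTotal:     {m['MemTotal']:6.1f} GiB")
print(f"MemFree:      {m['MemFree']:6.1f} GiB  (looks alarmingly low)")
print(f"Cached:       {m['Cached']:6.1f} GiB  (reclaimable page cache)")
print(f"MemAvailable: {m['MemAvailable']:6.1f} GiB  (what is really usable)")
```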
- 1069 Views
- 1 reply
- 0 kudos
Newbie DAB question regarding wheels
I am trying to build a wheel using a DAB. It errors, saying I don't have permissions to install my wheel onto a cluster I have been given. Is it possible to just upload the wheel to a subdirectory of the /Shared directory and use it from there instead of ...
- 0 kudos
May I know the exact error you are getting on the cluster? You can use the following code to use a wheel in a shared folder:

```yaml
resources:
  jobs:
    my-job:
      name: my-job
      tasks:
        - task_key: my-task
          new_cluster: ...
```
- 1486 Views
- 2 replies
- 0 kudos
Error when Connecting Databricks Cluster to RStudio Desktop App
Hi! I am trying to connect RStudio to my Databricks cluster. I have already changed the permissions to CAN MANAGE and CAN ATTACH TO on the cluster. I have also verified that I have the correct Python version and Databricks version on my computer. This is the error...
- 0 kudos
This seems to solve the problem: https://github.com/sparklyr/sparklyr/issues/3449. Apparently sparklyr requires that Unity Catalog is enabled on the cluster in order to get the connection working right.
- 956 Views
- 1 reply
- 1 kudos
Resolved! How do I track notebooks in all-purpose compute?
I am trying to map out costs for a shared cluster used in our organization. Since Databricks does not store sessions on all-purpose compute or record who accessed the cluster, what are some possible options I can use to track which notebooks were attached...
- 1 kudos
Hi @sparkplug, You can use the audit logs and billing usage table: https://docs.databricks.com/en/admin/account-settings/audit-logs.html
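As a hedged sketch of that audit-log route: list notebook-attach events for a given cluster. The service/action names and request_params keys here are assumptions based on the documented audit-log schema, and the cluster ID is hypothetical; verify against your own system.access.audit rows:

```python
# Sketch: who attached which notebook to this shared cluster, last 6 months.
cluster_id = "0123-456789-abcde123"  # hypothetical

attachments = spark.sql(f"""
    SELECT event_time, user_identity.email AS user,
           request_params['notebookId'] AS notebook_id
    FROM system.access.audit
    WHERE event_date >= add_months(current_date(), -6)
      AND service_name = 'notebook'
      AND action_name = 'attachNotebook'
      AND request_params['clusterId'] = '{cluster_id}'
    ORDER BY event_time DESC
""")
display(attachments)
```

Joining the result against system.billing.usage on the cluster ID could then help attribute cost to those sessions.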
- 5238 Views
- 9 replies
- 1 kudos
Need to move files from one Volume to another
We recently enabled Unity Catalog on our workspace. As part of certain transformations (custom clustered data pipelines in Python), we need to move files from one volume to another. As the job itself runs as a service principal that has access to exte...
- 1 kudos
Not all job clusters work well with Volumes. I used the following type of cluster to access files from a Volume.
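For the move itself, a minimal sketch assuming the service principal has read/write access on both volumes; the paths are hypothetical, and dbutils is available on Databricks compute:

```python
# Sketch: "move" a file between UC volumes = copy, then delete the original.
src = "/Volumes/main/landing/raw_files/report.csv"       # hypothetical path
dst = "/Volumes/main/curated/processed_files/report.csv"

dbutils.fs.cp(src, dst)
dbutils.fs.rm(src)

# Plain Python file APIs also work with /Volumes paths on UC-enabled compute:
# import shutil; shutil.move(src, dst)
```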
- 596 Views
- 1 reply
- 2 kudos
Missing Permissions option for DLT pipeline
I do not see the Permissions option in the DLT pipeline page's kebab menu either; can someone help with this?
- 2 kudos
Hi @kpriya, ensure you are on the correct page where the pipeline is listed. Click on the pipeline name to go to its details. Once you are on the pipeline details page, look for the kebab menu (three vertical dots) associated with the pipeline. This...
- 1849 Views
- 2 replies
- 0 kudos
databricks bundle init is not working
I use Databricks Standard tier workspaces in Azure. When I run databricks bundle init, I get an error:
Error: failed to compute file content for {{.project_name}}/resources/{{.project_name}}.pipeline.yml.tmpl. template: :6:20: executing "...
- 0 kudos
Hi @templier2, the error you are encountering when running databricks bundle init is due to the fact that Unity Catalog is not supported on Standard tier workspaces.
- 1402 Views
- 3 replies
- 0 kudos
Resolved! CREATE EXTERNAL LOCATION read only through SQL
Hello, I have set up a storage credential with READ ONLY access to my Azure storage account (landing-zone). I want to create an EXTERNAL LOCATION using SQL, set the EXTERNAL LOCATION to read only, and use the read-only storage credential I made. I canno...
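If the read-only flag turns out not to be expressible in the SQL DDL, a hedged alternative is the Databricks SDK for Python, whose external-locations API exposes a read_only flag; the names and URL below are hypothetical:

```python
# Sketch: create a read-only external location backed by a read-only credential.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

loc = w.external_locations.create(
    name="landing_zone_ro",
    url="abfss://landing-zone@mystorageacct.dfs.core.windows.net/",
    credential_name="landing_zone_read_only_cred",
    read_only=True,  # marks the external location itself read-only
)
print(loc.name, loc.read_only)
```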
- 0 kudos
Sure, happy to help. Let me know if you need any other assistance.
- 2047 Views
- 2 replies
- 0 kudos
Resolved! Connect Azure OpenAI Service deployments from a Databricks workspace
Hi, for the company I work for, I have created an Azure OpenAI Service instance with the intention of deploying models and interacting with them from a Databricks workspace. The basic properties of my Azure OpenAI service are: Network: separate VNet for the r...
- 0 kudos
This sounds like a scenario for the Mosaic AI Gateway: Mosaic AI Gateway | Databricks
- 5657 Views
- 3 replies
- 0 kudos
How to get Workspace name with workspaceId?
I have an AWS managed Databricks instance. I am trying to get the workspace name from a workspace ID. Thank you very much for your time and assistance.
- 0 kudos
Hi All, there are a few ways to find your workspace name:
- Quickly: it's the name in the URL before .cloud, so in the URL https://devrel.cloud.databricks.com/?o=556812371150522 the name is 'devrel'
- Whilst in the notebook: spark.conf.get("spark.databricks...
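Expanding on the notebook approach with a small sketch: the workspace URL is exposed as a Spark conf, and the workspace name is its first DNS label (the conf key is the commonly used one; the example URL comes from the reply above):

```python
# Sketch: derive the workspace name from the workspace URL conf.
url = spark.conf.get("spark.databricks.workspaceUrl")
workspace_name = url.split(".")[0]
print(url)             # e.g. devrel.cloud.databricks.com
print(workspace_name)  # e.g. devrel
```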
- 3824 Views
- 7 replies
- 3 kudos
Azure basic public IP SKU retirement
With Azure announcing the retirement of the Basic public IP SKU in 2025, I'd like to understand how compute cluster workers can be moved from the Basic to the Standard SKU. We've used Terraform to deploy our Databricks environment, but I'd like to underst...
- 3 kudos
I contacted Microsoft; they spoke with their SME on the Databricks team, who confirmed they're aware of this and will be transitioning from the Basic to the Standard SKU before deprecation. An update from Microsoft will be released regarding this...
Labels:
- Access control (1)
- Apache spark (1)
- Azure (7)
- Azure databricks (5)
- Billing (2)
- Cluster (1)
- Compliance (1)
- Data Ingestion & connectivity (5)
- Databricks Runtime (1)
- Databricks SQL (2)
- DBFS (1)
- Dbt (1)
- Delta Sharing (1)
- DLT Pipeline (1)
- GA (1)
- Gdpr (1)
- Github (1)
- Partner (57)
- Public Preview (1)
- Service Principals (1)
- Unity Catalog (1)
- Workspace (2)