- 234 Views
- 5 replies
- 5 kudos
Resolved! Databricks cluster pool deployed through Terraform does not have UC enabled
Hello everyone, we have a workspace with UC enabled. We already have a couple of catalogs attached, and when using our personal compute we are able to read/write tables in those catalogs. However, for our jobs we deployed a cluster pool using Terraform b...
- 81 Views
- 1 reply
- 0 kudos
GCP Databricks | Workspace Creation Error: Storage Credentials Limit Reached
Hi Team, we are encountering an issue while trying to create a Databricks workspace in the GCP region us-central1. Below is the error message: Workspace Status: Failed. Details: Workspace failed to launch. Error: BAD REQUEST: Cannot create 1...
- 0 kudos
Hi @karthiknuvepro, do you have an active support plan? Through a support ticket with us, we can request an increase of this limit.
- 209 Views
- 2 replies
- 0 kudos
Disable 'Allow trusted Microsoft services to bypass this firewall' for Azure Key Vault
Currently, even when using a VNet-injected Databricks workspace, we are unable to fetch secrets from AKV if 'Allow trusted Microsoft services to bypass this firewall' is disabled. The secret is used in an AKV-backed secret scope, and the key vault is ...
- 0 kudos
Hi @rdadhichi, Have you set "Allow access from" to "Private endpoint and selected networks" on the firewall?
- 154 Views
- 4 replies
- 0 kudos
How do we get user list who accessed/downloaded specific model in Unity catalog for last 6 months
How do we get user list who accessed/downloaded specific model in Unity catalog for last 6 months
- 0 kudos
Hi @AnkitShah, I just tried on my end and found these two tables that might be useful. They do not show exactly who downloaded a model artifact, but who interacted with it: https://docs.databricks.com/en/ai-gateway/configure-ai-gateway-endpoints.html#usag...
- 285 Views
- 6 replies
- 0 kudos
Governance to restrict compute creation
Hi, cluster policies used to be an easy way to handle governance on compute. However, more and more, there seems to be no way to control many new compute features within the platform. We currently have this issue for model serving endpoints and vector...
- 0 kudos
If you are looking to restrict end users to creating certain cluster configurations only, you can do so using the Databricks APIs. Through Python and the Databricks API, you can specify what kinds of cluster configurations are allowed and also restrict users ...
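A minimal sketch of the idea described above, using the Cluster Policies REST API (`/api/2.0/policies/clusters/create`). The node type, worker cap, workspace URL, and token are placeholder assumptions, not values from this thread:

```python
# Hedged sketch: restrict what clusters users can create by defining a
# cluster policy through the Databricks REST API. Values are hypothetical.
import json
import urllib.request


def build_policy_definition(allowed_node_type: str, max_workers: int) -> dict:
    """Policy definition that pins the node type and caps autoscaling,
    so users cannot spin up oversized clusters."""
    return {
        "node_type_id": {"type": "fixed", "value": allowed_node_type},
        "autoscale.max_workers": {"type": "range", "maxValue": max_workers},
    }


def create_policy(workspace_url: str, token: str, name: str, definition: dict) -> bytes:
    # POST to the Cluster Policies endpoint; the definition is sent as a JSON string.
    body = json.dumps({"name": name, "definition": json.dumps(definition)}).encode()
    req = urllib.request.Request(
        f"{workspace_url}/api/2.0/policies/clusters/create",
        data=body,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:  # requires a real workspace/token
        return resp.read()


policy = build_policy_definition("Standard_DS3_v2", 4)
print(json.dumps(policy))
```

Users are then granted CAN_USE only on the policy, not unrestricted cluster-creation rights.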
- 253 Views
- 1 reply
- 3 kudos
High memory usage on Databricks cluster
In my team we see very high memory usage even when the cluster has just been started and nothing has been run yet. Additionally, memory usage never drops to lower levels: total used memory always fluctuates around 14 GB. Where is this memory usage ...
- 3 kudos
This is not necessarily an issue. Linux uses a lot of RAM for caching, but this does not mean it cannot be released to processes (dynamic memory management). Basically, the philosophy is that RAM that is not used (so actually 'free') is useless. Here is a re...
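The arithmetic behind this answer can be sketched with made-up `/proc/meminfo`-style numbers (all values below are hypothetical, in kB): the naive "used = total - free" figure counts reclaimable page cache as used, which is why an idle cluster can appear to consume ~14 GB:

```python
# Illustration with hypothetical numbers (kB, as reported in /proc/meminfo).
mem_total = 16_000_000   # hypothetical MemTotal
mem_free = 2_000_000     # hypothetical MemFree
buffers = 500_000        # hypothetical Buffers
cached = 11_500_000      # hypothetical Cached (page cache)

# Naive calculation: everything not free counts as "used".
naive_used = mem_total - mem_free

# Better estimate: buffers and page cache are reclaimable on demand,
# which is roughly what the "available" column of `free` accounts for.
effective_used = naive_used - buffers - cached

print(f"naive used:     {naive_used / 1e6:.1f} GB")   # looks alarming
print(f"effective used: {effective_used / 1e6:.1f} GB")  # actual pressure
```

With these sample figures the naive number is 14 GB while the memory actually pinned by processes is about 2 GB; the rest is cache the kernel will hand back when needed.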
- 122 Views
- 1 reply
- 0 kudos
Newbie DAB question regarding wheels
I am trying to build a wheel using a DAB. It errors, saying I don't have permissions to install my wheel onto a cluster I have been given. Is it possible to just upload the wheel to a subdirectory of the /Shared directory and use it from there instead of ...
- 0 kudos
May I know the exact error you are getting on the cluster? You can use the following configuration to use a wheel in a shared folder:

resources:
  jobs:
    my-job:
      name: my-job
      tasks:
        - task_key: my-task
          new_cluster: ...
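For completeness, a sketch of how such a task could reference a pre-uploaded wheel from a workspace folder (the package name, entry point, and wheel path below are hypothetical, not from this thread):

```yaml
# Sketch, assuming the wheel was already uploaded under /Shared/wheels/.
resources:
  jobs:
    my-job:
      name: my-job
      tasks:
        - task_key: my-task
          new_cluster: ...
          python_wheel_task:
            package_name: my_pkg      # hypothetical package
            entry_point: main         # hypothetical entry point
          libraries:
            - whl: /Workspace/Shared/wheels/my_pkg-0.1.0-py3-none-any.whl
```

This sidesteps building the wheel in the bundle, at the cost of managing the uploaded artifact yourself.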
- 655 Views
- 2 replies
- 0 kudos
Error when Connecting Databricks Cluster to RStudio Desktop App
Hi! I am trying to connect RStudio to my Databricks cluster. I have already changed the permissions to CAN MANAGE and CAN ATTACH on the cluster, and I have verified that I have the correct Python version and Databricks version on my computer. This is the error...
- 0 kudos
This seems to solve the problem: https://github.com/sparklyr/sparklyr/issues/3449. Apparently sparklyr requires that Unity Catalog be enabled on the cluster for the connection to work correctly.
- 12551 Views
- 25 replies
- 22 kudos
Resolved! Unable to login to Azure Databricks Account Console
I have a personal Azure pay-as-you-go subscription in which I have the 'Global Administrator' role. I am also the Databricks account administrator. Until two weeks ago, I was able to access the Databricks account console without any issues, but I am f...
- 22 kudos
Hi @RameshRetnasamy, thanks for your solution. I was able to log in to the Databricks Account Console but was unable to set the metastore path. Something got broken along the way. It would definitely be beneficial if we could have the same email address for ou...
- 365 Views
- 5 replies
- 0 kudos
Resolved! Updating Workspace Cluster
Hello, my organization is experiencing difficulties updating our Google Kubernetes Engine (GKE) cluster. We've reviewed the official GKE documentation for automated cluster updates, but it appears to primarily focus on AWS integrations. We haven't foun...
- 0 kudos
I apologize for the delay; I am still looking into this.
- 281 Views
- 2 replies
- 0 kudos
Databricks App in Azure Databricks with private link cluster (no Public IP)
Hello, I've deployed Azure Databricks with a standard Private Link setup (no public IP). Everything works as expected: I can log in via the private/internal network, create clusters, and manage workloads without any issues. When I create a Databricks Ap...
- 0 kudos
Hello @Behwar, did you make sure that your internal DNS is configured to map the web application workspace URL to your front-end VPC endpoint? This involves creating an A record in your internal DNS that maps the workspace URL directly to the front-e...
- 177 Views
- 1 reply
- 0 kudos
Multiple feature branches per user using Databricks Asset Bundles
I'm currently helping a team migrate to DABs from dbx, and they would like to be able to work on multiple features at the same time. What I was able to do is pass the current branch as a variable in the root_path and various names, so when the bundle...
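The branch-as-a-variable approach described above could be sketched like this in the bundle configuration (the target name, user path, and default below are assumptions for illustration):

```yaml
# Sketch, assuming one dev target; each feature branch deploys to its
# own folder because the branch name is part of root_path.
variables:
  branch:
    description: Feature branch being deployed
    default: main

targets:
  dev:
    workspace:
      root_path: /Workspace/Users/someone@example.com/.bundle/${bundle.name}/${var.branch}
```

The branch can then be injected at deploy time, e.g. `databricks bundle deploy --var="branch=$(git branch --show-current)"`.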
- 0 kudos
Deleting the DABs in Databricks, and also in my local copy under .databricks, solved this one for me.
- 201 Views
- 1 reply
- 1 kudos
Resolved! How do I track notebooks in all purpose compute?
I am trying to map out costs for a shared cluster used in our organization. Since Databricks does not store sessions on all-purpose compute, or who accessed the cluster, what are some possible options for tracking which notebooks were attached...
- 1 kudos
Hi @sparkplug, You can use the audit logs and billing usage table: https://docs.databricks.com/en/admin/account-settings/audit-logs.html
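A sketch of how the audit logs linked above might be queried for this. `system.access.audit` is the Unity Catalog system table name; the request-parameter column and the sample cluster ID are assumptions, so check them against the audit log schema:

```python
# Hedged sketch: build a query over the system audit table to list who
# interacted with one all-purpose cluster recently. Column names under
# request_params are an assumption; verify against your audit log schema.
def attach_events_query(cluster_id: str, days: int = 30) -> str:
    return f"""
        SELECT event_time, user_identity.email, action_name
        FROM system.access.audit
        WHERE request_params.cluster_id = '{cluster_id}'
          AND event_time >= current_date() - INTERVAL {days} DAYS
        ORDER BY event_time DESC
    """


# Hypothetical cluster ID; run the resulting SQL in a notebook or SQL warehouse.
print(attach_events_query("0123-456789-abcdefgh"))
```

Joining these events with the billing usage table on cluster ID and time window is one way to attribute the cluster's cost back to users.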
- 837 Views
- 9 replies
- 1 kudos
Need to move files from one Volume to other
We recently enabled Unity Catalog on our workspace. As part of certain transformations (custom clustered data pipelines in Python), we need to move files from one volume to another. As the job itself runs as a service principal that has access to exte...
- 1 kudos
Not all job clusters work well with Volumes. I used the following type of cluster to access files from a Volume.
- 147 Views
- 1 reply
- 3 kudos
Resolved! Markdown Cells Do Not Render Consistently
When I am creating a notebook in the UI editor on Databricks, markdown cells do not always render after I run them. They still appear in 'editing mode'. See the screenshot below; it should have rendered an H1. Again, this behavior is not consistent. So...
- 3 kudos
Hi @bdanielatl, thank you for reporting the issue with markdown cells not rendering consistently. This appears to be a known issue that other users have encountered as well. I will report it internally.