- 182 Views
- 1 replies
- 1 kudos
Resolved! Compute fleets on Azure Databricks
It seems that compute fleets have been supported on AWS for almost 2 years. Azure compute fleets went into preview in November 2024. Has anyone heard of how or when compute fleets will be supported on Azure Databricks?
This is currently in development, but no ETA has been shared yet by our engineering team. It might be coming soon.
- 659 Views
- 11 replies
- 0 kudos
Can't create AWS p3 instance
Hi, I'm trying to create a `p3.2xlarge` in my workspace, but the cluster fails to instantiate, specifically getting this error message: `No zone supports both the driver instance type [p3.2xlarge] and the worker instance type [p3.2xlarge]` (though I ...
I was able to start a cluster with the exact same configuration in my internal environment with no issues; I selected east-1a as the AZ to deploy. By any chance, have you engaged AWS support on this?
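One thing worth trying, as a hedged sketch (zone IDs and instance availability vary by account and region, and the `spark_version` string below is illustrative): pin the availability zone explicitly via `aws_attributes.zone_id` in the Clusters API create payload instead of letting auto-AZ pick, since only some AZs in a region carry p3 instances.

```python
# Sketch of a Clusters API 2.0 create payload that pins the AZ, for the
# case where the "No zone supports both..." error comes from auto-AZ
# landing in a zone without p3 capacity. Values are illustrative.
cluster_spec = {
    "cluster_name": "gpu-p3-test",
    "spark_version": "15.4.x-gpu-ml-scala2.12",  # illustrative GPU runtime
    "node_type_id": "p3.2xlarge",
    "driver_node_type_id": "p3.2xlarge",
    "num_workers": 1,
    "aws_attributes": {"zone_id": "us-east-1a"},  # pin instead of "auto"
}
```

The same `zone_id` field is exposed in the UI under the cluster's advanced AWS options.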
- 357 Views
- 3 replies
- 0 kudos
Mimic system table functionality at custom catalog level
Hi, I am exploring system tables. I want to have our environment-specific data in different catalogs. While it is possible to get audit and other usage info from the system tables under the system catalog, how can I achieve the same in my custom catalog that ...
Just to be clear: what you want is a set of system tables, such as audit logs, in each catalog for your environments, so that when you query those tables you only get information from your environment. In that case there is no built ...
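One workaround, as a hedged sketch rather than a built-in feature (the catalog and schema names are made up, and it assumes SELECT access to `system.access.audit` and that each environment maps to a workspace): create a view in each environment's catalog that filters the global system table by `workspace_id`.

```python
def audit_view_ddl(catalog: str, schema: str, workspace_id: int) -> str:
    """Build DDL for a view exposing only this environment's slice of the
    global audit log. One such view per environment catalog mimics a
    per-catalog system table. Names here are hypothetical."""
    return (
        f"CREATE OR REPLACE VIEW {catalog}.{schema}.audit_logs AS "
        f"SELECT * FROM system.access.audit "
        f"WHERE workspace_id = {workspace_id}"
    )

# In a notebook you would then run, e.g.:
# spark.sql(audit_view_ddl("dev_catalog", "ops", 1234567890))
```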
- 437 Views
- 1 replies
- 1 kudos
Delta lake : delete data from storage manually instead of vacuum
Hi All, We have a unique use case where we are unable to run VACUUM to clean the storage space of our Delta Lake tables. Since we have data partitioned by date, we plan to delete files older than a certain date directly from storage. Could this lead to any...
Deleting files older than a certain date directly from storage without using the VACUUM command can lead to potential issues with your Delta Lake tables. Here are the key points to consider: Corruption Risk: Directly deleting files from storage can ...
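The core risk shows up in a toy model of the safety check VACUUM performs: a data file is removed only when no retained table version references it *and* it is past the retention window, whereas manual age-based deletion applies only the age check. Function and file names below are illustrative, not Delta internals.

```python
from datetime import datetime, timedelta

def vacuum_candidates(files_on_storage, referenced_files, file_mtimes,
                      now, retention_hours=168):
    """Toy model of VACUUM's check: removable = unreferenced AND older
    than the retention window. Deleting by age alone skips the first
    condition and can remove files the current snapshot still reads,
    corrupting the table."""
    cutoff = now - timedelta(hours=retention_hours)
    return {
        f for f in files_on_storage
        if f not in referenced_files and file_mtimes[f] < cutoff
    }

storage = {"a.parquet", "b.parquet", "old.parquet"}
live = {"a.parquet", "b.parquet"}          # still referenced by the log
mtimes = {
    "a.parquet": datetime(2024, 1, 1),     # old but still referenced!
    "b.parquet": datetime(2024, 5, 31),
    "old.parquet": datetime(2024, 1, 1),   # old and orphaned
}
print(vacuum_candidates(storage, live, mtimes, now=datetime(2024, 6, 1)))
# -> {'old.parquet'}  (age-based deletion would also have removed a.parquet)
```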
- 349 Views
- 1 replies
- 1 kudos
Resolved! Where is the Open Apache Hive Metastore API?
In 2023 it was announced that Databricks had made a "Hive Metastore (HMS) interface for Databricks Unity Catalog, which allows any software compatible with Apache Hive to connect to Unity Catalog". Is this discontinued? If not, is there any documentati...
It seems that this option has been deprecated; it was a private preview but is no longer available for enrollment.
- 708 Views
- 10 replies
- 0 kudos
Resolved! Failed to add 3 workers to the compute. Will attempt retry: true. Reason: Driver unresponsive
Currently I'm trying to create a compute cluster in a workspace with PrivateLink and a custom VPC. I'm using Terraform: https://registry.terraform.io/providers/databricks/databricks/latest/docs/guides/aws-private-link-workspace After the deployment is com...
Hi @ambigus9, it looks like the connectivity test to the RDS is failing. Can you check whether a firewall is blocking the request, since the connection is not going through to the RDS?
- 356 Views
- 1 replies
- 0 kudos
Resolved! How to Retrieve Admin and Non-Admin Permissions at Workspace Level in Azure Databricks.
Hello, I am working on a project to document permissions for both admin and non-admin users across all relevant objects at the workspace level in Azure Databricks (e.g., tables, jobs, clusters, etc.). I understand that admin-level permissions might be...
In Databricks, object permissions are attached to the object itself, not to the user. Unfortunately, as of now there is no way to get all object permissions in a single built-in query. There are custom options; for example, for clusters, first run...
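As a hedged sketch of one such custom option (host, token, and IDs below are placeholders): the Permissions API exposes per-object ACLs at `GET /api/2.0/permissions/{object_type}/{object_id}`, so iterating it over a cluster (or job, etc.) listing approximates a workspace-wide permissions report for that object type, covering admins and non-admins alike.

```python
import json
import urllib.request

def permissions_url(host: str, object_type: str, object_id: str) -> str:
    """Endpoint for one securable object's ACL (Permissions API)."""
    return f"{host.rstrip('/')}/api/2.0/permissions/{object_type}/{object_id}"

def get_permissions(host: str, token: str, object_type: str, object_id: str) -> dict:
    """Fetch the ACL for one object; the response lists each principal
    and its permission level. host/token are your workspace's, not real
    values here."""
    req = urllib.request.Request(
        permissions_url(host, object_type, object_id),
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```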
- 459 Views
- 2 replies
- 1 kudos
Resolved! Databricks Connect: Enabling Arrow on Serverless Compute
I recently upgraded my Databricks Connect version to 15.4 and got set up for Serverless, but ran into the following error when I ran the standard code to enable Arrow on Pyspark: >>> spark.conf.set(key='spark.sql.execution.arrow.pyspark.enabled', val...
Gotcha, thanks! Missed it in the limitations.
- 222 Views
- 3 replies
- 0 kudos
Can we change our cloud service connected with our Databricks account
We are moving from an old AWS account to an Azure account. Is there any way I can migrate my old Databricks account to this new Azure account? I have my Databricks partner workspace access with this Databricks account. That's the reason I want to keep thi...
Unfortunately, I am not able to find a way to move the workspace. If you have an account representative within Databricks, I would suggest reaching out to see what options you may have for migrating these credits as well, if possible.
- 430 Views
- 4 replies
- 0 kudos
Get a static IP for my Databricks App
Hello, I'm trying to find out how to set up a static IP for an Azure Databricks App. I tried to set up a NAT gateway to get a static IP for the workspace, but it doesn't change anything; I still can't access my OpenAI resource even when I authorize the NaT...
Hi, I’m following up here as I have the same issue. Did the solution provided in the replies help resolve this for you?
- 282 Views
- 1 replies
- 1 kudos
Determining spill from system tables
I'm trying to optimize machine selection (D, E, or L types on Azure) for job clusters and all-purpose compute and am struggling to identify where performance is sagging on account of disk spill. Disk spill would suggest that more memory is needed. ...
For historical diagnostics, you might need to consider setting up a custom logging mechanism that captures these metrics over time and stores them in a persistent storage solution, such as a database or a logging service. This way, you can query and ...
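A hedged sketch of such a mechanism: the standard Spark monitoring REST API reports `memoryBytesSpilled` and `diskBytesSpilled` per stage, so a scheduled job can sum them per run and persist the totals (e.g., to a Delta table) to build the historical view the system tables lack today. The URL and app ID below are placeholders.

```python
import json
import urllib.request

def fetch_stages(spark_ui_url: str, app_id: str) -> list:
    """Stage-level metrics from the Spark monitoring REST API
    (GET /api/v1/applications/{app_id}/stages)."""
    url = f"{spark_ui_url}/api/v1/applications/{app_id}/stages"
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

def total_spill_bytes(stages: list) -> int:
    """Sum memory + disk spill across stages; persistently nonzero totals
    suggest the node type needs more memory per core."""
    return sum(
        s.get("memoryBytesSpilled", 0) + s.get("diskBytesSpilled", 0)
        for s in stages
    )
```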
- 924 Views
- 15 replies
- 0 kudos
Resolved! Permissions error on cluster requirements.txt installation
Hi Databricks Community, I'm looking to resolve the following error: Library installation attempted on the driver node of cluster {My cluster ID} and failed. Please refer to the following error message to fix the library or contact Databricks support. ...
Noting here for other users: I was able to resolve the issue on a shared cluster by cloning the cluster and using the clone.
- 646 Views
- 8 replies
- 3 kudos
PrivateLink Validation Error - When trying to access to Workspace
We have a workspace that was deployed on AWS customer architecture using the Terraform PrivateLink guide: https://registry.terraform.io/providers/databricks/databricks/latest/docs/guides/aws-private-link-workspace The fact is, when we disable the Public Acc...
Can you share your workspace id so I can do a validation?
- 19325 Views
- 10 replies
- 9 kudos
Resolved! Installing libraries on job clusters
Simple question: what is the way to go to install libraries on job clusters? There does not seem to be a "Libraries" tab in the UI, as opposed to regular clusters. Does that mean the only option is to use init scripts?
I am not able to select the requirements.txt file from my workspace folder; I can see the file but cannot select it. How do I overcome this problem?
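For the original question: the usual route is to declare libraries per task in the job definition rather than on the cluster, via the "Dependent libraries" option in the UI or the `libraries` field in the Jobs API, so init scripts are not the only option. A hedged sketch of a task payload, with illustrative package names and workspace paths:

```python
import json

# Sketch of a Jobs API 2.1 task that attaches libraries to its job
# cluster via the `libraries` field. Package and path values are
# illustrative, not prescriptive.
task = {
    "task_key": "etl",
    "notebook_task": {"notebook_path": "/Workspace/etl/main"},
    "job_cluster_key": "main_cluster",
    "libraries": [
        {"pypi": {"package": "great-expectations==0.18.12"}},
        {"requirements": "/Workspace/libs/requirements.txt"},
    ],
}
print(json.dumps(task, indent=2))
```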
- 173 Views
- 2 replies
- 0 kudos
Can't create cluster in AWS with p3 instance type
Hi, I'm trying to create a `p3.2xlarge` in my workspace, but the cluster fails to instantiate, specifically getting this error message: `No zone supports both the driver instance type [p3.2xlarge] and the worker instance type [p3.2xlarge]` (though I ...
Yes, sorry for the double post (I couldn't figure out how to delete this one).