- 649 Views
- 10 replies
- 0 kudos
Resolved! Failed to add 3 workers to the compute. Will attempt retry: true. Reason: Driver unresponsive
Currently I am trying to create a compute cluster on a workspace with PrivateLink and a custom VPC. I'm using Terraform: https://registry.terraform.io/providers/databricks/databricks/latest/docs/guides/aws-private-link-workspace. After the deployment is com...
Hi @ambigus9, it looks like, based on the connectivity test, the connection to the RDS is not working. Can you check whether a firewall is blocking the request, since the connection is not getting through to the RDS?
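A quick way to confirm this from the cluster side is a plain TCP test from a notebook on the driver. A minimal sketch, assuming a hypothetical RDS hostname and port (replace both with your own):

```python
import socket

# Hypothetical endpoint and port - replace with your RDS hostname and port.
host = "my-metastore.abc123.us-east-1.rds.amazonaws.com"
port = 3306

try:
    # Attempt a plain TCP connection from the driver node.
    with socket.create_connection((host, port), timeout=5):
        print(f"TCP connection to {host}:{port} succeeded")
except OSError as e:
    # A timeout or refusal here usually points to a security group, NACL,
    # or firewall rule blocking the path from the VPC to the RDS instance.
    print(f"TCP connection to {host}:{port} failed: {e}")
```

If this fails from the cluster but succeeds from another host in the same VPC, the security group or firewall rules in front of the RDS are the first place to look.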
- 342 Views
- 1 reply
- 0 kudos
Resolved! How to Retrieve Admin and Non-Admin Permissions at Workspace Level in Azure Databricks.
Hello, I am working on a project to document permissions for both admins and non-admin users across all relevant objects at the workspace level in Azure Databricks (e.g., tables, jobs, clusters, etc.). I understand that admin-level permissions might be...
In Databricks, object permissions are attached to the object itself, not to the user. Unfortunately, as of now there is no way to get all object permissions in a single built-in query. There are custom options; for example, for clusters, first run...
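One per-object approach is to walk the objects with the REST API and read their access control lists. A minimal sketch for clusters, assuming a personal access token that can read cluster ACLs (the host and token are placeholders):

```python
import requests

# Placeholder values - replace with your workspace URL and a valid PAT.
HOST = "https://adb-1234567890123456.7.azuredatabricks.net"
TOKEN = "dapi..."
headers = {"Authorization": f"Bearer {TOKEN}"}

# List all clusters in the workspace.
clusters = requests.get(f"{HOST}/api/2.0/clusters/list", headers=headers).json()

for cluster in clusters.get("clusters", []):
    # Fetch the access control list attached to each cluster object.
    acl = requests.get(
        f"{HOST}/api/2.0/permissions/clusters/{cluster['cluster_id']}",
        headers=headers,
    ).json()
    for entry in acl.get("access_control_list", []):
        principal = (
            entry.get("user_name")
            or entry.get("group_name")
            or entry.get("service_principal_name")
        )
        levels = [p["permission_level"] for p in entry.get("all_permissions", [])]
        print(cluster["cluster_name"], principal, levels)
```

The same pattern works for other object types covered by the Permissions API (jobs, for instance) by swapping the list endpoint and the permissions path.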
- 422 Views
- 2 replies
- 1 kudos
Resolved! Databricks Connect: Enabling Arrow on Serverless Compute
I recently upgraded my Databricks Connect version to 15.4 and got set up for Serverless, but ran into the following error when I ran the standard code to enable Arrow on PySpark: >>> spark.conf.set(key='spark.sql.execution.arrow.pyspark.enabled', val...
Gotcha, thanks! Missed it in the limitations.
- 217 Views
- 3 replies
- 0 kudos
Can we change our cloud service connected with our Databricks account
We are moving from an old AWS account to an Azure account. Is there any way I can migrate my old Databricks account to this new Azure account? I have my Databricks partner workspace access with this Databricks account; that's the reason I want to keep thi...
Unfortunately, I am not able to find a way to move the workspace. If you have an account representative within Databricks, I would suggest reaching out to see what options you may have to also migrate these credits if possible.
- 415 Views
- 4 replies
- 0 kudos
Get a static IP for my Databricks App
Hello, I'm trying to find out how to set up a static IP for an Azure Databricks App. I tried to set up a NAT gateway to get a static IP for the workspace, but it doesn't change anything; I still can't access my OpenAI resource even if I authorize the NAT...
Hi, I’m following up here as I have the same issue. Did the solution provided in the replies help resolve this for you?
- 270 Views
- 1 reply
- 1 kudos
Determining spill from system tables
I'm trying to optimize machine selection (D, E, or L types on Azure) for job clusters and all-purpose compute and am struggling to identify where performance is sagging on account of disk spill. Disk spill would suggest that more memory is needed. ...
For historical diagnostics, you might need to consider setting up a custom logging mechanism that captures these metrics over time and stores them in a persistent storage solution, such as a database or a logging service. This way, you can query and ...
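One concrete way to capture spill while a workload runs: the Spark monitoring REST API exposes per-stage memoryBytesSpilled and diskBytesSpilled, which you can poll and persist yourself. A minimal sketch, assuming (this is an assumption, not guaranteed on every setup) that the driver's Spark UI REST endpoint is reachable from the notebook via spark.sparkContext.uiWebUrl:

```python
import requests

# Assumption: the Spark UI REST API is reachable from the driver at uiWebUrl.
base_url = spark.sparkContext.uiWebUrl  # e.g. "http://10.0.0.5:40001"

# There is normally one application per cluster; take the first one listed.
apps = requests.get(f"{base_url}/api/v1/applications").json()
app_id = apps[0]["id"]

# Pull per-stage metrics and keep only stages that actually spilled to disk.
stages = requests.get(f"{base_url}/api/v1/applications/{app_id}/stages").json()
spilled = [
    {
        "stageId": s["stageId"],
        "name": s["name"],
        "memoryBytesSpilled": s["memoryBytesSpilled"],
        "diskBytesSpilled": s["diskBytesSpilled"],
    }
    for s in stages
    if s.get("diskBytesSpilled", 0) > 0
]

# Write this snapshot somewhere durable (e.g. a Delta table) on a schedule
# to build the historical view that the system tables don't provide today.
print(spilled)
```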
- 910 Views
- 15 replies
- 0 kudos
Resolved! Permissions error on cluster requirements.txt installation
Hi Databricks Community, I'm looking to resolve the following error: Library installation attempted on the driver node of cluster {My cluster ID} and failed. Please refer to the following error message to fix the library or contact Databricks support. ...
Noting here for other users: I was able to resolve the issue on a shared cluster by cloning the cluster and using the clone.
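For anyone who wants to do the same clone programmatically instead of through the UI, here is a hedged sketch using the Clusters REST API: fetch the existing definition and re-create it under a new name. The host, token, and cluster ID are placeholders, and only a subset of fields is copied over:

```python
import requests

# Placeholder values - replace with your workspace URL, PAT, and cluster ID.
HOST = "https://<your-workspace>.cloud.databricks.com"
TOKEN = "dapi..."
SOURCE_CLUSTER_ID = "0101-123456-abcdefgh"
headers = {"Authorization": f"Bearer {TOKEN}"}

# Fetch the existing cluster definition.
spec = requests.get(
    f"{HOST}/api/2.0/clusters/get",
    headers=headers,
    params={"cluster_id": SOURCE_CLUSTER_ID},
).json()

# Copy only fields that are valid inputs for cluster creation.
fields = [
    "spark_version", "node_type_id", "driver_node_type_id", "num_workers",
    "autoscale", "spark_conf", "spark_env_vars", "aws_attributes",
    "azure_attributes", "custom_tags", "autotermination_minutes",
    "data_security_mode",
]
new_spec = {k: spec[k] for k in fields if k in spec}
new_spec["cluster_name"] = spec["cluster_name"] + " (clone)"

# Create the clone.
resp = requests.post(f"{HOST}/api/2.0/clusters/create", headers=headers, json=new_spec)
print(resp.json())
```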
- 626 Views
- 8 replies
- 3 kudos
PrivateLink Validation Error - When trying to access to Workspace
We have a workspace that was deployed on AWS customer architecture using the Terraform PrivateLink guide: https://registry.terraform.io/providers/databricks/databricks/latest/docs/guides/aws-private-link-workspace. The issue is that when we disable the Public Acc...
Can you share your workspace id so I can do a validation?
- 19237 Views
- 10 replies
- 9 kudos
Resolved! Installing libraries on job clusters
Simple question: what is the way to go to install libraries on job clusters? There does not seem to be a "Libraries" tab in the UI, as opposed to regular clusters. Does that mean the only option is to use init scripts?
I am not able to select the requirements.txt file from my workspace folder; I can see the file but cannot select it. How do I overcome this problem?
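For the original question (installing libraries on job clusters when there is no Libraries tab), the standard route is to declare the libraries on the task itself in the job definition. A minimal sketch against the Jobs API; the host, token, notebook path, and package names are placeholders:

```python
import requests

# Placeholder values - replace with your workspace URL and a valid PAT.
HOST = "https://<your-workspace>.cloud.databricks.com"
TOKEN = "dapi..."
headers = {"Authorization": f"Bearer {TOKEN}"}

job_spec = {
    "name": "job-with-libraries",
    "tasks": [
        {
            "task_key": "main",
            "notebook_task": {"notebook_path": "/Workspace/Users/me/my_notebook"},
            "new_cluster": {
                "spark_version": "15.4.x-scala2.12",
                "node_type_id": "i3.xlarge",
                "num_workers": 2,
            },
            # Libraries are attached per task; no Libraries tab is needed.
            "libraries": [
                {"pypi": {"package": "great-expectations==0.18.12"}},
                {"whl": "/Workspace/Users/me/dist/my_package-0.1.0-py3-none-any.whl"},
            ],
        }
    ],
}

resp = requests.post(f"{HOST}/api/2.1/jobs/create", headers=headers, json=job_spec)
print(resp.json())
```

The same "libraries" block works when the job is defined through Terraform as well, so init scripts are not the only option.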
- 155 Views
- 2 replies
- 0 kudos
Can't create cluster in AWS with p3 instance type
Hi, I'm trying to create a `p3.2xlarge` in my workspace, but the cluster fails to instantiate, specifically getting this error message: `No zone supports both the driver instance type [p3.2xlarge] and the worker instance type [p3.2xlarge]` (though I ...
Yes, sorry for the double post (I couldn't figure out how to delete this one).
- 187 Views
- 2 replies
- 1 kudos
Databricks in AWS K8 cluster
Hi, I have a question. I have recently started using Databricks, and I need Databricks to be deployed on an AWS Kubernetes (K8s) cluster. Where can I find the sources?
- 129 Views
- 1 reply
- 0 kudos
Querying on multi-node cluster on AWS does not complete
Querying in isolation mode is completely fine, but when trying to run the same query on the multi-node cluster, it does not complete or error out. Any assistance troubleshooting this issue? Oh, and Happy New Year if you're reading this.
Hello John, Happy New Year to you. Can you please confirm what error message you received? Also, when you say isolation mode, do you mean a single-node cluster, or do you refer to a single-user cluster while the other is in shared access mode?
- 376 Views
- 5 replies
- 2 kudos
Resolved! S3 access credentials: Pandas vs Spark
Hi, I need to read Parquet files located in S3 into a Pandas dataframe. I configured an "external location" to access my S3 bucket and have df = spark.read.parquet(s3_parquet_file_path) working perfectly well. However, df = pd.read_parquet(s3_parquet_file_...
Yes, you understand correctly. The Spark library in Databricks uses the Unity Catalog credential model, which includes the use of "external locations" for managing data access. This model ensures that access control and permissions are centrally mana...
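In practice, the simplest way to stay inside the Unity Catalog credential model is to read with Spark and then convert to Pandas, instead of giving pandas (s3fs/pyarrow) its own AWS credentials. A minimal sketch; the path is a placeholder and the approach assumes the dataset fits in driver memory:

```python
# Placeholder path - use a path covered by your Unity Catalog external location.
s3_parquet_file_path = "s3://my-bucket/path/to/table/"

# Spark reads through the external location's credentials...
sdf = spark.read.parquet(s3_parquet_file_path)

# ...and the result is converted to a Pandas dataframe on the driver.
# Note: toPandas() collects all rows, so it only suits data that fits in memory.
pdf = sdf.toPandas()
print(pdf.shape)
```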
- 344 Views
- 2 replies
- 1 kudos
Assistance Required: Integrating Databricks ODBC Connector with Azure App Service
Hi, I have successfully established an ODBC connection with Databricks to retrieve data from the Unity Catalog in a local C# application using the Simba Spark ODBC Driver, and it is working as expected. I now need to integrate this functionality into a...
Hi @nanda_, basically what you need to do is install the Simba ODBC driver in your Azure App Service environment. Then your code should work the same way as on your local machine. One possibility is to use Windows or Linux containers on Azure App...
- 424 Views
- 1 reply
- 0 kudos
Resolved! How to add 'additionallyAllowedTenants' in Databricks config or PySpark config?
I have a multi-tenant Azure app. I am using this app's credentials to read ADLS container files from a Databricks cluster using a PySpark dataframe. I need to set the 'additionallyAllowedTenants' flag value to '*' or a specific tenant_id of the multi-ten...
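For context: the equivalent knob in the Azure Identity SDK is the additionally_allowed_tenants parameter on credential objects. A hedged sketch that applies only if the files are read through the Azure SDK (for example azure-storage-file-datalake) rather than through the Spark ABFS connector; the tenant, app, storage account, and container values are placeholders:

```python
from azure.identity import ClientSecretCredential
from azure.storage.filedatalake import DataLakeServiceClient

# Placeholder values - replace with your multi-tenant app's details.
credential = ClientSecretCredential(
    tenant_id="<home-tenant-id>",
    client_id="<app-client-id>",
    client_secret="<app-client-secret>",
    # Allow the credential to acquire tokens for other tenants as well.
    additionally_allowed_tenants=["*"],
)

# Read a file from ADLS Gen2 through the Azure SDK (not the ABFS connector).
service = DataLakeServiceClient(
    account_url="https://<storage-account>.dfs.core.windows.net",
    credential=credential,
)
file_client = service.get_file_client(
    file_system="<container>", file_path="path/to/file.parquet"
)
data = file_client.download_file().readall()
print(len(data))
```

The Spark ABFS connector is configured separately (the fs.azure.account.oauth2.* options), so this sketch does not change what the PySpark reader does.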
Labels: Access control (1), Access Delta Tables (2), ActiveDirectory (1), AmazonKMS (1), Apache spark (1), App (1), Availability (1), Availability Zone (1), AWS (5), Aws databricks (1), AZ (1), Azure (8), Azure Data Lake Storage (1), Azure databricks (6), Azure databricks workspace (1), Best practice (1), Best Practices (2), Billing (2), Bucket (1), Cache (1), Change (1), Checkpoint (1), Checkpoint Path (1), Cluster (1), Cluster Pools (1), Clusters (1), ClustersJob (1), Compliance (1), Compute Instances (1), Cost (1), Credential passthrough (1), Data (1), Data Ingestion & connectivity (6), Data Plane (1), Databricks Account (1), Databricks Control Plane (1), Databricks Error Message (2), Databricks Partner (1), Databricks Repos (1), Databricks Runtime (1), Databricks SQL (3), Databricks SQL Dashboard (1), Databricks workspace (1), DatabricksJobs (1), DatabricksLTS (1), DBFS (1), DBR (3), Dbt (1), Dbu (3), Deep learning (1), DeleteTags Permissions (1), Delta (4), Delta Sharing (1), Delta table (1), Dev (1), Different Instance Types (1), Disaster recovery (1), DisasterRecoveryPlan (1), DLT Pipeline (1), EBS (1), Email (2), External Data Sources (1), Feature (1), GA (1), Ganglia (3), Ganglia Metrics (2), GangliaMetrics (1), GCP (1), GCP Support (1), Gdpr (1), Gpu (2), Group Entitlements (1), HIPAA (1), Hyperopt (1), Init script (1), InstanceType (1), Integrations (1), IP Addresses (1), IPRange (1), Job (1), Job Cluster (1), Job clusters (1), Job Run (1), JOBS (1), Key (1), KMS (1), KMSKey (1), Lakehouse (1), Limit (1), Live Table (1), Log (2), LTS (3), Metrics (1), MFA (1), ML (1), Model Serving (1), Multiple workspaces (1), Notebook Results (1), Okta (1), On-premises (1), Partner (75), Pools (1), Premium Workspace (1), Public Preview (1), Redis (1), Repos (1), Rest API (1), Root Bucket (2), SCIM API (1), Security (1), Security Group (1), Security Patch (1), Service principal (1), Service Principals (1), Single User Access Permission (1), Sns (1), Spark (1), Spark-submit (1), Spot instances (1), SQL (1), Sql Warehouse (1), Sql Warehouse Endpoints (1), Ssh (1), Sso (2), Streaming Data (1), Subnet (1), Sync Users (1), Tags (1), Team Members (1), Thrift (1), TODAY (1), Track Costs (1), Unity Catalog (1), Use (1), User (1), Version (1), Vulnerability Issue (1), Welcome Email (1), Workspace (2), Workspace Access (1)