- 156 Views
- 3 replies
- 0 kudos
private endpoint to non-storage azure resource
I'm trying to set up an NCC and a private endpoint for a Container App environment in Azure. However, I get the following error: Error occurred when creating private endpoint rule: BAD_REQUEST: Can not create Private Link Endpoint with name databricks-x...
All the Azure subscriptions have this registered. Could this be because it is not an Azure subscription within the Databricks tenant?
- 21 Views
- 1 replies
- 1 kudos
How to find the billing of each cell in a notebook?
Suppose I have run ten different statements/tasks/cells in a notebook, and I want to know how many DBUs each of these ten tasks used. Is this possible?
Hey, I don't think it's possible to directly determine the cost of a single cell in Databricks. However, you can approximate this in two ways, depending on the type of cluster you're using, as different cluster types have different pricing model...
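One rough approach is to time each cell yourself and prorate the cluster's DBU consumption by wall-clock time. A minimal sketch, assuming placeholder rates (look up the real DBU rate for your cluster SKU and the per-DBU price on your cloud's price list):

```python
import time

# Placeholder rates (assumptions) -- substitute the real values for your
# cluster type and pricing tier.
DBU_PER_HOUR = 2.0
PRICE_PER_DBU = 0.40

def timed(label, fn, *args, **kwargs):
    """Run one 'cell' of work and estimate its prorated DBU cost."""
    start = time.time()
    result = fn(*args, **kwargs)
    hours = (time.time() - start) / 3600
    dbus = hours * DBU_PER_HOUR
    print(f"{label}: {dbus:.6f} DBUs (~${dbus * PRICE_PER_DBU:.6f})")
    return result, dbus

# Example: wrap each statement you would otherwise run directly in a cell.
total, cost = timed("task 1", sum, range(1_000_000))
```

This only attributes cost by elapsed time, so it ignores autoscaling and concurrent workloads on shared clusters; treat the numbers as estimates, not billing figures.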
- 47 Views
- 2 replies
- 0 kudos
Databricks Workspace Access and Permissions
Hi Team, The GCP Databricks URL https://accounts.gcp.databricks.com/ is linked to the GCP Billing Account. We have two clients with separate GCP Organizations: client1.example.com and client2.example.com. Both GCP Organizations share the sam...
Hi @karthiknuvepro, To isolate resources you can follow these steps: Create separate GCP projects for each client, one within each client's respective GCP Organization. This ensures that each client has isolated ...
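To make the per-client split concrete, one way is to create one workspace per client project via the Databricks Account API. A sketch under assumptions: the project IDs below are hypothetical, and the payload fields mirror the GCP workspace-creation API, so verify the exact field names against the current docs before using them.

```python
# Hypothetical sketch: one GCP project per client, one workspace per project.
def workspace_payload(client: str, project_id: str, region: str = "us-central1") -> dict:
    """Build a per-client workspace-creation payload (field names assumed)."""
    return {
        "workspace_name": f"{client}-workspace",
        "location": region,
        "cloud_resource_container": {"gcp": {"project_id": project_id}},
    }

payloads = [
    workspace_payload("client1", "client1-dbx-project"),
    workspace_payload("client2", "client2-dbx-project"),
]
```

Keeping the project ID distinct per client is what gives you billing and resource isolation, even though both workspaces hang off the same Databricks account.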
- 36 Views
- 2 replies
- 1 kudos
Driver log storage location
What directory would the driver log normally be stored in? Is it DBFS?
- 26 Views
- 0 replies
- 0 kudos
JVM Heap Memory Graph - more memory used than available
I'm analyzing the memory usage of my Spark application and I see something strange when checking the JVM Heap Memory graph (see screenshot below). Each line on the graph represents one executor. Why does the memory usage sometimes reach over 10 GB, when ...
- 27 Views
- 3 replies
- 0 kudos
GPU accelerator not matching with desired memory.
Hello, We have opted for Standard_NC8as_T4_v3, which claims to have 56 GB of memory. But when I run nvidia-smi in the notebook, it shows only ~16 GB. Why? Please let me know what is happening here. Jay
The 56 GB on Standard_NC8as_T4_v3 is host (CPU) RAM; the VM carries a single NVIDIA T4 GPU with 16 GB of GPU memory, and that GPU memory is what nvidia-smi reports. Please refer to: https://learn.microsoft.com/en-us/azure/databricks/compute/gpu
- 24 Views
- 0 replies
- 0 kudos
Error "Gateway authentication failed for 'Microsoft.Network'" While Creating Azure Databricks
Hi All, I'm encountering an issue while trying to create a Databricks service in Azure. During the setup process, I get the following error: "Gateway authentication failed for 'Microsoft.Network'". I've checked the basic configurations, but I'm not sure ...
- 159 Views
- 5 replies
- 5 kudos
Resolved! Databricks cluster pool deployed through Terraform does not have UC enabled
Hello everyone, We have a workspace with UC enabled; we already have a couple of catalogs attached, and when using our personal compute we are able to read/write tables in those catalogs. However, for our jobs we deployed a cluster pool using Terraform b...
- 33 Views
- 0 replies
- 0 kudos
Driver: how much memory is actually available?
I have a cluster where the driver type is Standard_DS3_v2 (14 GB memory, 4 cores). When I use the free -h command in the web terminal (see attached screenshot), it reports that I only have 8.9 GB of memory available on my driver. Why is that? FYI, spark.dri...
- 37 Views
- 1 replies
- 0 kudos
GCP Databricks | Workspace Creation Error: Storage Credentials Limit Reached
Hi Team, We are encountering an issue while trying to create a Databricks workspace in the GCP region us-central1. Below is the error message: Workspace Status: Failed. Details: Workspace failed to launch. Error: BAD REQUEST: Cannot create 1...
Hi @karthiknuvepro, Do you have an active support plan? Through a support ticket we can request an increase of this limit.
- 130 Views
- 2 replies
- 0 kudos
Disable 'Allow trusted Microsoft services to bypass this firewall' for Azure Key Vault
Currently, even when using a VNet-injected Databricks workspace, we are unable to fetch secrets from AKV if 'Allow trusted Microsoft services to bypass this firewall' is disabled. The secret is used in an AKV-backed secret scope, and the key vault is ...
Hi @rdadhichi, Have you set "Allow access from" to "Private endpoint and selected networks" on the firewall?
- 103 Views
- 4 replies
- 0 kudos
How do we get the list of users who accessed/downloaded a specific model in Unity Catalog over the last 6 months?
Hi @AnkitShah, I just tried on my end and found these 2 tables that might be useful. They do not show exactly who downloaded a model artifact, but who interacted with it: https://docs.databricks.com/en/ai-gateway/configure-ai-gateway-endpoints.html#usag...
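A related angle is the Unity Catalog audit log in the system tables. A minimal sketch that builds such a query, assuming system tables are enabled in your workspace; the request_params key used in the filter and the model name are assumptions, so inspect a few of your own audit rows to confirm which key carries the model's full name, then run the string with spark.sql(...):

```python
# Sketch: query system.access.audit for events on a registered model
# over the last N days. Filter key and model name are hypothetical.
def model_audit_query(model_name: str, days: int = 180) -> str:
    """Return a SQL string listing who touched the given model recently."""
    return f"""
        SELECT DISTINCT user_identity.email, action_name, event_time
        FROM system.access.audit
        WHERE event_time >= current_date() - INTERVAL {days} DAYS
          AND request_params.full_name_arg = '{model_name}'
        ORDER BY event_time DESC
    """

sql = model_audit_query("main.ml.churn_model")
```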
- 3089 Views
- 10 replies
- 2 kudos
"Azure Container Does Not Exist" when cloning repositories in Azure Databricks
Good Morning, I need some help with the following issue: I created a new Azure Databricks resource using the vnet-injection procedure (here). I then proceeded to link my Azure DevOps account using a personal access token. If I try to clone a reposito...
Hi, Also having problems with this after some IaC testing, deleting and recreating the workspace with the same name. We are working in Azure. Is the "container" referred to the container in the storage account deployed by the Databricks instance into the ma...
- 171 Views
- 6 replies
- 0 kudos
Governance to restrict compute creation
Hi, Cluster policies used to be an easy way to handle governance of compute. However, more and more, there seems to be no way to control many new compute features within the platform. We currently have this issue for model serving endpoints and vector...
If you are looking to restrict end users to certain cluster configurations only, you can do so by using the Databricks APIs. Through Python and the Databricks API, you can specify what kinds of cluster configurations are allowed and also restrict users ...
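As a sketch of the idea, here is a client-side guard that mirrors the kind of rules a cluster policy would enforce. The ALLOWED rules and node type names are hypothetical; in practice you would define a cluster policy via the Policies API or UI and grant users permission on that policy rather than validating configs yourself.

```python
# Hypothetical rule set mirroring a cluster policy.
ALLOWED = {
    "node_type_ids": {"Standard_DS3_v2", "Standard_DS4_v2"},
    "max_workers": 4,
    "autotermination_max_minutes": 120,
}

def validate(config: dict) -> list:
    """Return a list of policy violations for a requested cluster config."""
    errors = []
    if config.get("node_type_id") not in ALLOWED["node_type_ids"]:
        errors.append("node type not allowed")
    if config.get("num_workers", 0) > ALLOWED["max_workers"]:
        errors.append("too many workers")
    # Require autotermination to be enabled and not excessively long.
    if not (0 < config.get("autotermination_minutes", 0) <= ALLOWED["autotermination_max_minutes"]):
        errors.append("autotermination must be enabled and at most 120 minutes")
    return errors

violations = validate({"node_type_id": "Standard_E64s_v3", "num_workers": 10})
```

The advantage of doing this with real cluster policies instead of custom code is that enforcement happens server-side, so users cannot bypass it by calling the Clusters API directly.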
- 152 Views
- 1 replies
- 3 kudos
High memory usage on Databricks cluster
In my team we see very high memory usage even when the cluster has just been started and nothing has been run yet. Additionally, memory usage never drops to lower levels: total used memory always fluctuates around 14 GB. Where is this memory usage ...
This is not necessarily an issue. Linux uses a lot of RAM for caching, but this does not mean it cannot be released to processes (dynamic memory management). Basically, the philosophy is that RAM that is not used (so actually 'free') is useless. Here is a re...
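To illustrate the point: in `free` output, most of what monitoring dashboards count as "used" sits in the reclaimable buff/cache column, so "available" (not "free") is the number to watch. A small sketch with made-up sample figures:

```python
def parse_free_line(line: str) -> dict:
    """Parse the 'Mem:' row of `free -b` output.

    Columns: total, used, free, shared, buff/cache, available.
    """
    _, total, used, free, shared, buff_cache, available = line.split()
    return {"total": int(total), "used": int(used), "free": int(free),
            "shared": int(shared), "buff_cache": int(buff_cache),
            "available": int(available)}

# Made-up sample: "free" looks alarmingly small, yet plenty is reclaimable.
mem = parse_free_line("Mem: 16000 5500 1500 100 9000 10000")
roughly_reclaimable = mem["free"] + mem["buff_cache"]
```

So a cluster that shows ~14 GB "in use" right after startup is usually just the kernel filling its page cache; what matters is whether "available" stays healthy.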