- 393 Views
- 3 replies
- 0 kudos
Databricks Predictive Optimization
If we want to enable Databricks Predictive Optimization, is it also mandatory to enable serverless Job/Notebook Compute in our account? We already have a Serverless SQL warehouse available in our workspaces.
- 0 kudos
Thanks. This was answered in https://community.databricks.com/t5/administration-architecture/enable-predictive-optimization/td-p/98731
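For anyone finding this later, a minimal sketch of the per-schema toggle documented for predictive optimization, run from a notebook; the schema name main.sales is hypothetical, and the account-level setting still has to be enabled separately.

```python
# Sketch only: toggling predictive optimization on a Unity Catalog securable.
# `main.sales` is a hypothetical schema; catalogs accept the same syntax.
spark.sql("ALTER SCHEMA main.sales ENABLE PREDICTIVE OPTIMIZATION")

# Revert to inheriting the setting from the parent catalog / account default.
spark.sql("ALTER SCHEMA main.sales INHERIT PREDICTIVE OPTIMIZATION")
```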
- 2323 Views
- 5 replies
- 2 kudos
Ad hoc workflows - managing resource usage on shared clusters
We run a shared cluster that is used for general-purpose ad hoc analytics, which I assume is a relatively common use case to try to keep costs down. However, the technical experience of users of this cluster varies a lot, so we run into situations whe...
- 2 kudos
Hi @JameDavi_51481, were you able to figure something out? Planning a Databricks migration and realized we might need something similar too.
- 236 Views
- 2 replies
- 0 kudos
Sporadic HTTP failure with SQL Serverless (bug?)
Our SQL Serverless installation has sporadic failures connecting to our blob container in Azure. The blob container is locked down to a VNet, and we are using the private endpoint to enable serverless access. It will work fine for several hours, and then show...
- 0 kudos
I've confirmed all of that. This seems like an AI generated response. It seems more likely that Databricks rolled out a feature a week ago that is causing instability in the serverless warehouses. Any other specific things to check would be apprec...
- 574 Views
- 1 reply
- 2 kudos
Privileged Identity Management for Databricks with Microsoft Entra ID
Privileged Identity Management (PIM) can be used to secure access to critical Databricks roles with Just-in-Time (JIT) access. This approach helps organizations enforce time-bound permissions, approval workflows, and centralized auditing for sensitiv...
- 2 kudos
Thanks. However, as far as I know, Azure PIM does not work for service principals; it only applies to human user access.
- 362 Views
- 1 reply
- 0 kudos
I want to create custom tags in a cluster policy so that clusters created using that policy get those
I want to create custom tags in a cluster policy so that clusters created using this policy will automatically include those tags for billing purposes. Consider the following example: "cluster_type": {"type": "fixed", "value": "all-purpose"}, "custom_t...
- 0 kudos
Are you having any issue while running this code in the policy?
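In case a complete definition is useful for comparison, here is a minimal sketch of a policy with fixed custom tags created through the Databricks Python SDK; the policy name, tag keys, and tag values are made up for illustration.

```python
import json

from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # picks up auth from the environment or ~/.databrickscfg

# Hypothetical policy definition: pin the cluster type (as in the question)
# and force two billing tags onto every cluster created from the policy.
definition = {
    "cluster_type": {"type": "fixed", "value": "all-purpose"},
    "custom_tags.CostCenter": {"type": "fixed", "value": "FIN-1234"},
    "custom_tags.Project": {"type": "fixed", "value": "adhoc-analytics"},
}

w.cluster_policies.create(
    name="billing-tags-policy",  # hypothetical policy name
    definition=json.dumps(definition),
)
```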
- 341 Views
- 1 reply
- 0 kudos
Databricks Serverless best practices
Hi All, we are configuring Databricks serverless compute that adjusts according to the workload type, like choosing different cluster sizes such as extra small, small, large, etc., and the auto scale option. We're also looking at the average time it takes to comple...
- 0 kudos
You can refer to our Serverless Compute best practices: https://docs.databricks.com/en/compute/serverless/best-practices.html If you are referring to serverless SQL warehouses, see https://docs.databricks.com/en/compute/sql-warehouse/warehouse-beh...
- 2765 Views
- 9 replies
- 0 kudos
What is the best practice for connecting Power BI to Azure Databricks?
I referred to this document to connect Power BI Desktop and Power BI Service to Azure Databricks: Connect Power BI to Azure Databricks - Azure Databricks | Microsoft Learn. However, I have a couple of questions and concerns. Can anyone kindly help? It seems l...
- 0 kudos
Hi @AlbertWang, you have multiple options to connect Power BI with Databricks. Using cluster credentials: under the cluster details, go to Advanced Options and select JDBC/ODBC. Here, you'll find the necessary credentials, such as hostname and HTTP...
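As a side note, the Server Hostname and HTTP Path values mentioned above are the same ones the Power BI Databricks connector asks for; if you want to sanity-check them outside Power BI first, here is a minimal sketch using the databricks-sql-connector package (the hostname, HTTP path, and token below are placeholders).

```python
# Sanity-check the Server Hostname / HTTP Path pair from a plain Python client
# before wiring it into Power BI (pip install databricks-sql-connector).
from databricks import sql

with sql.connect(
    server_hostname="adb-1234567890123456.7.azuredatabricks.net",  # placeholder
    http_path="/sql/1.0/warehouses/abcdef1234567890",              # placeholder
    access_token="<personal-access-token>",                        # placeholder
) as conn:
    with conn.cursor() as cursor:
        cursor.execute("SELECT current_catalog(), current_user()")
        print(cursor.fetchone())
```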
- 257 Views
- 1 reply
- 0 kudos
Manage Account option
Hi, I have created a premium Databricks workspace on my Azure free trial account and also have the Global Administrator role on my Azure account. I have set up all the necessary configurations, like providing the role of Storage Blob Data Contributor t...
- 0 kudos
Hello @SALAHUDDINKHAN, If you are unable to see the "Manage Account" option, it is likely that you do not have the necessary account admin privileges. Please ensure you have the required permissions indicated here: https://learn.microsoft.com/en-us/a...
- 10087 Views
- 2 replies
- 1 kudos
Resolved! Any way to move the unity catalog to a new external storage location?
Dear Databricks Community, the question is about changing an existing Unity Catalog catalog to a new storage location. For example: an existing catalog (i.e. catalog1) including schemas and volumes. The catalog is based on an external location (i....
- 1 kudos
https://docs.databricks.com/ja/sql/language-manual/sql-ref-syntax-ddl-alter-location.html
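The linked page documents ALTER EXTERNAL LOCATION; as a minimal sketch, repointing an existing external location at a new URL looks like the following (the location name and abfss path are made up).

```python
# Sketch only: change the URL registered for an existing external location.
# This updates the registration; it does not move any data that already
# lives under the old path. `my_ext_location` and the abfss URL are hypothetical.
spark.sql("""
    ALTER EXTERNAL LOCATION my_ext_location
    SET URL 'abfss://newcontainer@newaccount.dfs.core.windows.net/catalog1'
""")
```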
- 449 Views
- 3 replies
- 1 kudos
cloud_infra_costs
I was looking at the system catalog and realized that there is an empty table called cloud_infra_costs. Could you tell me what this is for and why it is empty?
- 1 kudos
Thanks for replying. This makes sense. Any idea why it is empty and what to do to populate this?
- 1202 Views
- 5 replies
- 2 kudos
Ingress/Egress private endpoint
Hello, we have configured our Databricks environment with private endpoint connections injected into our VNet, which includes two subnets (public and private). We have disabled public IPs and are using Network Security Groups (NSGs) on the subnet, as...
- 2 kudos
@Fkebbati First, traffic costs in Azure are not reported as a separate resource type, but are appended to the main resource causing the traffic. If you want to distinguish them, use, for instance, the Service Name. In this case the traffic cost is appended to Databricks...
- 12027 Views
- 1 reply
- 2 kudos
Authentication for Databricks Apps
Databricks Apps allows us to define dependencies & an entrypoint to execute a Python application like Gradio, Streamlit, etc. It seems I can also run a FastAPI application and access it via an authenticated browser, which is potentially a very powerful c...
- 2 kudos
According to the response in this Reddit thread, it appears that this feature is not yet supported: https://www.reddit.com/r/databricks/comments/1g5ni8e/comment/lsceh4n/
- 908 Views
- 3 replies
- 0 kudos
Databricks (GCP) Cluster not resolving Hostname into IP address
We have #mongodb hosts that must be resolved to private internal load balancer IPs (of another cluster), and we are unable to add host aliases in the Databricks GKE cluster in order for Spark to be able to connect to MongoDB and resolve t...
- 0 kudos
Also found this - https://community.databricks.com/t5/data-engineering/why-i-m-getting-connection-timeout-when-connecting-to-mongodb/m-p/14868
- 1552 Views
- 3 replies
- 0 kudos
Data leakage risk when using the Azure Databricks workspace
Context: We are utilizing an Azure Databricks workspace for data management and model serving within our project, with delegated VNet and subnets configured specifically for this workspace. However, we are consistently observing malicious flow entries...
- 0 kudos
Hello everyone! We have worked with our security team, Microsoft, and other customers who have seen similar log messages. This log message is very misleading, as it appears to state that the malicious URI was detected within your network — this would...
- 433 Views
- 1 reply
- 0 kudos
Unable to add a Databricks permission to an existing policy
Hi, we're using the Databricks provider v1.49.1 to manage our Azure Databricks cluster and other resources. We're having an issue setting permissions with the Terraform resource "databricks_permissions", where the error indicates that the clust...
- 0 kudos
Is this cluster policy a custom policy? If you try, for testing purposes, to modify it in the UI, does it allow you to?