- 45 Views
- 2 replies
- 0 kudos
Networking configuration of Azure Databricks managed storage account
Hi all, I created an Azure Databricks workspace, and the workspace created an Azure Databricks managed storage account. The networking configuration of the storage account is "Enabled from all networks". Shall I change it to "Enabled from selected virtu...
Changing the networking configuration of your Azure Databricks managed storage account to "Enabled from selected virtual networks and IP addresses" is a good step for enhancing security. However, the "Insufficient permissions" status you are seeing f...
- 47 Views
- 2 replies
- 0 kudos
Resolved! Default schema in SQL Editor is not 'default' when unity catalog is set as default catalog
In workspace settings (Workspace admin - Advanced - Other), the default catalog for the workspace is set to something other than hive_metastore; it is set to a `Unity Catalog` catalog. The expected behaviour is copied here from the related more info panel: "Se...
Hi @Alberto_Umana, thank you for the explanation. I am marking your comment as the accepted solution, as it contains the current implementation logic and the workaround. Good to know that the more info panel is a bit misleading as of now, because the SQL Ed...
- 182 Views
- 10 replies
- 1 kudos
Error "Integrating Apache Spark with Databricks Unity Catalog Assets via Open APIs" on Azure
Great blog post: https://community.databricks.com/t5/technical-blog/integrating-apache-spark-with-databricks-unity-catalog-assets/ba-p/97533 I have attempted to reproduce this with Azure Databricks and ADLS Gen2 as the storage backend. Although I'm ab...
Thanks @dkushari. I looked at the GitHub issue you posted, but it has to do specifically with DELTA_UNSUPPORTED_SCHEMA_DURING_READ when streaming *from* a Delta table. The specific error I'm seeing is a key error for the Azure storage account hosting t...
- 35 Views
- 0 replies
- 0 kudos
How to calculate accurate usage cost for a longer contractual period?
Hi Experts! I am working on providing an accurate total cost calculation (in both DBU and USD) for my team for the whole ongoing contractual period. I've checked the following four options: Account console: Manage account - Usage - Consumption (Legacy): t...
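All of the options for this kind of question ultimately reduce to summing DBUs per SKU over a date window and multiplying by a price. As a rough illustration of that arithmetic, here is a minimal sketch over usage rows shaped loosely like the `system.billing.usage` system table; the row layout and the per-DBU prices are illustrative assumptions, not real rates.

```python
from datetime import date

# Toy usage rows: (usage_date, sku_name, dbus). Prices are made up.
USAGE = [
    (date(2024, 1, 10), "PREMIUM_JOBS_COMPUTE", 120.0),
    (date(2024, 2, 3),  "PREMIUM_SQL_COMPUTE",  40.0),
    (date(2024, 6, 1),  "PREMIUM_JOBS_COMPUTE", 10.0),
]
PRICE_USD_PER_DBU = {"PREMIUM_JOBS_COMPUTE": 0.30, "PREMIUM_SQL_COMPUTE": 0.22}

def contract_cost(usage, prices, start, end):
    """Sum DBUs and USD for rows whose usage_date falls in [start, end]."""
    dbus = usd = 0.0
    for day, sku, qty in usage:
        if start <= day <= end:
            dbus += qty
            usd += qty * prices[sku]
    return dbus, usd

dbus, usd = contract_cost(USAGE, PRICE_USD_PER_DBU,
                          date(2024, 1, 1), date(2024, 3, 31))
print(dbus, round(usd, 2))  # 160.0 44.8
```

In practice the same aggregation can be done with a `GROUP BY` over `system.billing.usage` joined to a price list, filtered on the contract's start and end dates.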
- 49 Views
- 0 replies
- 0 kudos
Will Lakehouse Federation between Databricks and Snowflake support Azure Entra ID?
Lakehouse Federation between Databricks and Snowflake looks promising, but the lack of support for Azure Entra ID as an identity provider (IdP) is a big limitation for enterprises standardized on it. Managing separate OAuth flows or using Snowflak...
- 32 Views
- 0 replies
- 0 kudos
Budget Policy - Service Principals don't seem to be allowed to use budget policies
Objective: transfer an existing DLT pipeline to a new owner (a service principal), with budget policies enabled. Steps to reproduce: created a service principal; assigned it membership of a group that is allowed to use a budget policy; ensured it has access to the ...
- 39 Views
- 0 replies
- 0 kudos
Restrict a Workspace User from Creating/Managing Databricks Jobs
Hello Databricks team, I currently have a workspace user, and I want to disable their ability to create or manage Databricks jobs entirely. Specifically, I would like to prevent the user from accessing the "Create Job" option in the Databricks UI or v...
- 60 Views
- 1 replies
- 0 kudos
PrivateLink Validation Error when trying to access the Workspace
We have a workspace that was deployed on an AWS customer-managed architecture using the Terraform PrivateLink guide: https://registry.terraform.io/providers/databricks/databricks/latest/docs/guides/aws-private-link-workspace The issue is that when we disable the Public Acc...
If you create a VM inside the same VPC as your workspace, are you able to access the workspace? Also, have you granted access to all the ports listed in the docs? https://docs.databricks.com/en/security/network/classic/privatelink.html#step-1-configure...
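From a VM in the workspace VPC, a quick TCP reachability probe against the PrivateLink endpoint covers both questions in the reply. A minimal sketch; the endpoint DNS name is a placeholder, and ports 443 (HTTPS) and 6666 (secure cluster connectivity relay) are the ones called out in the linked PrivateLink doc.

```python
import socket

def port_reachable(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Illustrative usage from inside the VPC (placeholder hostname):
# for port in (443, 6666):
#     print(port, port_reachable("<workspace-vpce-dns-name>", port))
```

If 443 succeeds but 6666 fails, the security group or NACL rules from step 1 of the guide are the usual suspects.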
- 65 Views
- 1 replies
- 0 kudos
Python User Input During Run-time
I'm new to Advana and was putting together a Python script that requires user interactions during run-time. However, the program pauses at an 'input()' command without a console cell to accept responses. Am I missing something about this instance of ...
This is a common issue because Jupyter Notebooks are not designed to handle interactive input in the same way as a standard Python script run in a terminal. In Jupyter Notebooks, the input() function does not work as expected because the notebook int...
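As a concrete illustration of the usual workaround, here is a minimal sketch of replacing `input()` with parameters supplied before the run starts. In a Databricks notebook the standard equivalent is `dbutils.widgets.text(...)` plus `dbutils.widgets.get(...)`; since `dbutils` only exists inside Databricks, this stand-alone version emulates the widget store with a plain dict (`get_param` and the widget names are hypothetical helpers for illustration).

```python
def get_param(widgets: dict, name: str, default: str) -> str:
    """Return a notebook 'widget' value, falling back to a default.

    Mirrors the dbutils.widgets.text(name, default) / dbutils.widgets.get(name)
    pattern that replaces input() in Databricks notebooks.
    """
    return widgets.get(name, default)

# Instead of: user_choice = input("Continue? [y/n] ")  # hangs in a notebook
widgets = {"continue": "y"}  # values supplied before the run starts
user_choice = get_param(widgets, "continue", "n")
print(user_choice)  # y
```

The design point is that notebook runs are non-interactive: every decision the script needs must be available up front (widgets, job parameters, or config files) rather than requested mid-run.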
- 73 Views
- 2 replies
- 0 kudos
Databricks Apps
I've created a Databricks App that is essentially a REST API in Flask. It returns a JSON object; the data is in a SQL warehouse. It works when called directly in the browser, but I want to access it via the REST API with a Bearer token. However, when I log in with a servi...
Does the service principal have permission to access all the resources involved in your app?
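If the permissions check out, the usual pattern is to trade the service principal's client ID and secret for a Bearer token via the workspace's OAuth token endpoint and send that token to the app. A hedged sketch assuming the Databricks machine-to-machine OAuth flow (`POST /oidc/v1/token` with the `client_credentials` grant); hostnames and credentials are placeholders, and a given app endpoint may require different scopes.

```python
import base64

def m2m_token_request(host: str, client_id: str, client_secret: str):
    """Build a Databricks OAuth M2M (client credentials) token request
    for a service principal. Returns (url, headers, form_data); POSTing
    them should yield a JSON body containing an access_token field."""
    url = f"https://{host}/oidc/v1/token"
    basic = base64.b64encode(f"{client_id}:{client_secret}".encode()).decode()
    headers = {
        "Authorization": f"Basic {basic}",
        "Content-Type": "application/x-www-form-urlencoded",
    }
    form_data = {"grant_type": "client_credentials", "scope": "all-apis"}
    return url, headers, form_data

# Illustrative usage (placeholders; requires the `requests` package):
# url, headers, data = m2m_token_request(
#     "adb-1234567890123456.7.azuredatabricks.net", "<client-id>", "<client-secret>")
# token = requests.post(url, headers=headers, data=data).json()["access_token"]
# requests.get("https://<your-app-url>/your-endpoint",
#              headers={"Authorization": f"Bearer {token}"})
```

The key detail is that a browser login session and a service principal token are different credentials; the app must accept the Bearer token, and the service principal needs access to the app itself plus the SQL warehouse behind it.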
- 53 Views
- 1 replies
- 1 kudos
Disable Catalog for predictive optimization
Suppose we disabled predictive optimization for a specific catalog named "CatalogXYZ" and after that enabled predictive optimization at the account level. Can the schema owner for the schema "CatalogXYZ.TestSchema" then enable the predictive optimi...
If predictive optimization is disabled for the catalog "CatalogXYZ" and then enabled at the account level, the schema owner for "CatalogXYZ.TestSchema" cannot enable predictive optimization for this schema. This is because the predictive optimization...
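The precedence described in that reply can be modeled as a small resolution function. This is a toy model of the hierarchy as described above (an explicit catalog-level disable wins over both a schema-level enable and the account default), not the product's actual implementation.

```python
def effective_po(account_enabled: bool, catalog: str, schema: str) -> bool:
    """Toy resolution of the effective predictive optimization state for
    a schema. catalog / schema take 'ENABLE', 'DISABLE', or 'INHERIT'.
    Per the reply above, a catalog-level DISABLE blocks the schema
    regardless of schema-level or account-level settings."""
    if catalog == "DISABLE":
        return False
    if schema == "ENABLE":
        return True
    if schema == "DISABLE":
        return False
    if catalog == "ENABLE":
        return True
    return account_enabled  # both levels INHERIT -> account default

# The scenario from the question: account enabled, catalog disabled,
# schema owner tries to enable -> still off.
print(effective_po(True, "DISABLE", "ENABLE"))  # False
```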
- 84 Views
- 1 replies
- 0 kudos
Downstream usage control on Serverless
Hi all, we've noticed a significant increase in our Databricks serverless usage due to downstream system activity. We would like to reduce overall serverless consumption. Please suggest possible ways and best practices we can implement to ...
Hello @Phani1, you might want to review this document: https://docs.databricks.com/en/compute/serverless/best-practices.html Let me know if you have any further questions.
- 61 Views
- 1 replies
- 0 kudos
How to only allow one git branch, one folder?
Users are able to switch branches in the Git UI. How can this be restricted so that only one branch is allowed? Also, for a sparse checkout, how can users view only one folder (without files from the root)?
Hello @amberleong, to restrict users from switching branches in the Git UI and allow only one branch, you can implement branch protection rules in your Git repository, directly from your source-control tool.
- 69 Views
- 2 replies
- 0 kudos
Okta SSO Unified login in GCP
Hi, there are versions of this question posted already, but they seem to refer to legacy features. Our organisation uses Google Workspace, provisioned via Okta as the IdP, as the first landing point, and all apps are secured behind it. We have purchased Databri...
Hello @dtb_usr, it is possible to use the Okta IdP to log into Databricks on GCP; please refer to: https://docs.gcp.databricks.com/en/admin/users-groups/scim/okta.html
- 88 Views
- 1 replies
- 0 kudos
Datadog, OpenTelemetry, and Databricks container service
We have successfully gotten Datadog agent(s) installed and running on databricks clusters via init script - this part seems to be working fine. We are working on instrumenting our jobs using the OpenTelemetry endpoint feature of the Datadog agent, wh...
The agent installations via the init script would install the agents in the Spark containers (All user workloads + spark processes run in the container). The users don't have direct access to the host machine and can't install any agents. You may nee...
Labels: Access control (1), Access Delta Tables (2), ActiveDirectory (1), AmazonKMS (1), Apache spark (1), App (1), Availability (1), Availability Zone (1), AWS (5), Aws databricks (1), AZ (1), Azure (8), Azure Data Lake Storage (1), Azure databricks (6), Azure databricks workspace (1), Best practice (1), Best Practices (2), Billing (2), Bucket (1), Cache (1), Change (1), Checkpoint (1), Checkpoint Path (1), Cluster (1), Cluster Pools (1), Clusters (1), ClustersJob (1), Compliance (1), Compute Instances (1), Cost (1), Credential passthrough (1), Data (1), Data Ingestion & connectivity (6), Data Plane (1), Databricks Account (1), Databricks Control Plane (1), Databricks Error Message (2), Databricks Partner (1), Databricks Repos (1), Databricks Runtime (1), Databricks SQL (3), Databricks SQL Dashboard (1), Databricks workspace (1), DatabricksJobs (1), DatabricksLTS (1), DBFS (1), DBR (3), Dbt (1), Dbu (3), Deep learning (1), DeleteTags Permissions (1), Delta (4), Delta Sharing (1), Delta table (1), Dev (1), Different Instance Types (1), Disaster recovery (1), DisasterRecoveryPlan (1), DLT Pipeline (1), EBS (1), Email (2), External Data Sources (1), Feature (1), GA (1), Ganglia (3), Ganglia Metrics (2), GangliaMetrics (1), GCP (1), GCP Support (1), Gdpr (1), Gpu (2), Group Entitlements (1), HIPAA (1), Hyperopt (1), Init script (1), InstanceType (1), Integrations (1), IP Addresses (1), IPRange (1), Job (1), Job Cluster (1), Job clusters (1), Job Run (1), JOBS (1), Key (1), KMS (1), KMSKey (1), Lakehouse (1), Limit (1), Live Table (1), Log (2), LTS (3), Metrics (1), MFA (1), ML (1), Model Serving (1), Multiple workspaces (1), Notebook Results (1), Okta (1), On-premises (1), Partner (53), Pools (1), Premium Workspace (1), Public Preview (1), Redis (1), Repos (1), Rest API (1), Root Bucket (2), SCIM API (1), Security (1), Security Group (1), Security Patch (1), Service principal (1), Service Principals (1), Single User Access Permission (1), Sns (1), Spark (1), Spark-submit (1), Spot instances (1), SQL (1), Sql Warehouse (1), Sql Warehouse Endpoints (1), Ssh (1), Sso (2), Streaming Data (1), Subnet (1), Sync Users (1), Tags (1), Team Members (1), Thrift (1), TODAY (1), Track Costs (1), Unity Catalog (1), Use (1), User (1), Version (1), Vulnerability Issue (1), Welcome Email (1), Workspace (2), Workspace Access (1)