- 523 Views
- 4 replies
- 0 kudos
Table level data masking issue
Hi, we are seeing the error below; does anyone know how to fix it? [RequestId=6cf95ee8-f312-47cd-846c-dcd87158c939 ErrorClass=INVALID_PARAMETER_VALUE.ROW_COLUMN_ACCESS_POLICIES_NOT_SUPPORTED_ON_ASSIGNED_CLUSTERS] Query on table with row filter or co...
Hi @Harish_Kumar_M, The error you are seeing, "Query on table with row filter or column mask not supported on assigned clusters" (INVALID_PARAMETER_VALUE.ROW_COLUMN_ACCESS_POLICIES_NOT_SUPPORTED_ON_ASSIGNED_CLUSTERS), occurs because your cluster's ac...
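The fix usually comes down to the cluster's access mode. A minimal sketch of the decision, assuming you have the cluster's `data_security_mode` value from the Clusters API; the mode names and support matrix below are a simplification, so check the current Databricks docs:

```python
# Which access modes can query tables with row filters / column masks?
# Mode strings follow the Clusters API `data_security_mode` field; this
# matrix is a simplified reading of the documented support.

# Standard (shared) isolation supports row/column policies.
_POLICY_CAPABLE = {"USER_ISOLATION"}
# Dedicated ("assigned") and no-isolation modes generally do not.
_POLICY_INCAPABLE = {
    "NONE", "SINGLE_USER", "LEGACY_SINGLE_USER",
    "LEGACY_SINGLE_USER_STANDARD", "LEGACY_PASSTHROUGH",
}

def supports_row_column_policies(data_security_mode: str) -> bool:
    """True if tables carrying row filters or column masks can be queried."""
    mode = data_security_mode.upper()
    if mode in _POLICY_CAPABLE:
        return True
    if mode in _POLICY_INCAPABLE:
        return False
    raise ValueError(f"Unknown data_security_mode: {data_security_mode!r}")
```

In practice the remedies are: switch the cluster to standard access mode, use a serverless SQL warehouse, or (on recent runtimes) check whether serverless fine-grained access control covers your dedicated cluster.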
- 480 Views
- 2 replies
- 2 kudos
Resolved! NetSuite JDBC Driver 8.10.184.0 Support
Hello, I am currently attempting to integrate NetSuite with Databricks using the NetSuite JDBC driver version 8.10.184.0. When I attempt to ingest data from NetSuite into Databricks, the job fails with a checksum error and informs ...
Hi @b_pinter, The NetSuite JDBC driver version 8.10.184.0 is indeed supported by Databricks for managed ingestion via Lakeflow Connect. The officially supported driver versions are 8.10.147.0, 8.10.170.0, and 8.10.184.0. The "JAR checksum does not ma...
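If you want to rule out a corrupted download before blaming version support, a quick local check is to hash the JAR and compare it to the checksum published alongside the driver. A sketch; the expected digest is whatever the download page lists:

```python
import hashlib

def jar_sha256(path: str) -> str:
    """SHA-256 of a local file, streamed in 1 MiB chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def checksum_matches(path: str, expected_hex: str) -> bool:
    """Compare a downloaded driver JAR against its published checksum."""
    return jar_sha256(path) == expected_hex.lower()
```

If the local digest matches the published one, the failure is on the validation side rather than the download.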
- 593 Views
- 5 replies
- 1 kudos
Resolved! Advise on "airlocking" Databricks service
Need advice: I'm building a data analysis service solution on top of Databricks and need to protect it from unauthorized data leaks, specifically file downloads. As far as I can tell, I need some sort of remote browser isolation (RBI). Is this the corr...
I do not require an "NSA-level airlock." Indeed, a malicious actor could develop a script that projects scrolled data onto the screen and records it with an external device, or more effectively, they could create a series of QR code movies to address...
- 1094 Views
- 2 replies
- 0 kudos
Resolved! Regarding - Managed vs External volumes and tables
From a creation perspective, the steps for managed and external volumes appear almost identical: both require a storage credential, both require an external location, and both point to customer-owned S3. So what exactly makes a volume "managed" vs "external"? Wh...
Hi @APJESK, You are right that the setup steps look similar on the surface, but the differences between managed and external volumes (and tables) are meaningful once you understand what Unity Catalog does with the data after creation. WHAT "MANAGED" ...
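The distinction shows up in the DDL rather than the prerequisites: a managed volume has no LOCATION clause (Unity Catalog places the files under the schema's or catalog's managed storage and deletes them on DROP), while an external volume names a path under a registered external location (UC governs access but leaves the files alone on DROP). A small illustrative helper, with hypothetical catalog/schema names in the test:

```python
from typing import Optional

def create_volume_ddl(catalog: str, schema: str, name: str,
                      external_location_path: Optional[str] = None) -> str:
    """Build a CREATE VOLUME statement.

    Managed: no LOCATION clause; Unity Catalog owns the files and removes
    them on DROP VOLUME. External: LOCATION points under a registered
    external location; UC governs access but never deletes the files.
    """
    if external_location_path is None:
        return f"CREATE VOLUME {catalog}.{schema}.{name}"
    return (f"CREATE EXTERNAL VOLUME {catalog}.{schema}.{name} "
            f"LOCATION '{external_location_path}'")
```

The same split applies to tables: the LOCATION clause (or its absence) is what decides lifecycle ownership, not the cloud bucket itself.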
- 343 Views
- 1 reply
- 0 kudos
Databricks on AWS Marketplace – Unity Catalog & S3 Access Failing with SSL “Connection reset”
Hi all, I'm facing an issue accessing AWS S3 and Unity Catalog from a Databricks AWS Marketplace workspace. Problem: whenever Databricks tries to access S3 or Unity Catalog, it fails with javax.net.ssl.SSLException: Connection reset. What works: Spark job...
Hi @asim_mirza_12, The javax.net.ssl.SSLException: Connection reset error you are seeing with Unity Catalog and S3 operations, while basic Spark jobs and curl commands work, typically points to a networking layer issue where the JVM's SSL handshake i...
- 1458 Views
- 5 replies
- 4 kudos
Best practices for 3-layer access control in Databricks
I'm designing an identity and access management model for Databricks and want to implement a clear 3-layer authorization approach. Account level: account RBAC roles (account admin, metastore admin, etc.). Workspace level: workspace roles/entitlements + workspace ACLs (c...
Hi @APJESK, Your 3-layer model (Account RBAC, Workspace ACLs, Unity Catalog privileges) is the right framework. I want to address both the overall design and the specific follow-up you posted about the Home folder and compute behavior, since those ar...
- 530 Views
- 4 replies
- 3 kudos
Resolved! How to delete an "Account Level" Storage Credential? (... I think)
This is not a production platform, but I'd like to know the answer. I suspect I have done something stupid. Using Account APIs, I created a Storage Credential. Q1: I cannot see this in a workspace, and I do not know how to see it in the account console...
Hi @ThePussCat, You’re not missing anything. This is mostly about where UC is surfaced, not about who controls it. Unity Catalog objects (including storage credentials and their workspace bindings) are metastore‑scoped, and the metastore is attached ...
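For completeness, account-level storage credentials can be managed through the Account API rather than a workspace. The sketch below only assembles the request; the endpoint path is my reading of the Account API reference, so verify it against the current docs before use:

```python
def delete_storage_credential_request(account_host, account_id,
                                      metastore_id, name, token):
    """Assemble (method, url, headers) for deleting an account-level
    storage credential. Assumed path (verify against the Account API):
    /api/2.0/accounts/{account_id}/metastores/{metastore_id}/storage-credentials/{name}
    """
    url = (f"{account_host}/api/2.0/accounts/{account_id}"
           f"/metastores/{metastore_id}/storage-credentials/{name}")
    return "DELETE", url, {"Authorization": f"Bearer {token}"}
```

Alternatively, if the credential is visible from a workspace attached to the metastore, a plain `DROP STORAGE CREDENTIAL <name>` in SQL achieves the same thing.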
- 278 Views
- 2 replies
- 0 kudos
Azure DevOps Release (CD) pipeline - Databricks tasks no longer available
Hello and happy new year, everyone. We've noticed that our Azure DevOps Release (CD) pipelines have had all of their Databricks tasks uninstalled, and we cannot find them in the marketplace anymore. The author for both is Microsoft DevLabs. We mainly rel...
Hi @BigAlThePal, As @szymon_dybczak mentioned, the "DevOps for Azure Databricks" extension by Microsoft DevLabs (which provided the "Configure Databricks CLI" and "Deploy Notebooks to Workspace" tasks) was deprecated and has since been removed from th...
- 1914 Views
- 2 replies
- 0 kudos
GitHub Actions OIDC with Databricks: wildcard subject for pull_request workflows
Hi, I'm configuring GitHub Actions OIDC authentication with Databricks following the official documentation (https://docs.databricks.com/aws/en/dev-tools/auth/provider-github). When running a GitHub Actions workflow triggered by pull_request, authenticati...
Hi @Valerio, The challenge you are running into is a common one when setting up OIDC federation for pull_request-triggered workflows. Here is a breakdown of the issue and several approaches to solve it. UNDERSTANDING THE SUBJECT CLAIM FOR PULL REQUESTS...
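The crux is that pull_request-triggered workflows receive a subject like repo:&lt;org&gt;/&lt;repo&gt;:pull_request, with no branch ref, so a policy pinned to refs/heads/main never matches. Glob-style matching (whether your federation policy supports wildcards should be confirmed against the Databricks docs) behaves like this sketch:

```python
from fnmatch import fnmatchcase

def subject_allowed(sub_claim, policy_patterns):
    """Check a GitHub OIDC `sub` claim against glob-style allow patterns.

    pull_request runs are issued 'repo:<org>/<repo>:pull_request' (no
    branch ref), so a pattern pinned to refs/heads/main never matches
    them; an explicit pull_request pattern or a narrow wildcard is needed.
    """
    return any(fnmatchcase(sub_claim, p) for p in policy_patterns)
```

The trade-off: `repo:my-org/my-repo:*` admits every trigger type from that repo, while `repo:my-org/my-repo:pull_request` admits only PR runs; prefer the narrower form where the policy language allows it.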
- 681 Views
- 7 replies
- 0 kudos
No workspace in Free Edition
Hi, I have been using the Free Edition for some time with this email ID, but for the last 3-4 days I can't see any workspace. Whenever I log in I see two account names, and no workspace is available. When I tried creating another accoun...
When I log in I am getting the above page, where there is no workspace and no way to create a new one.
- 253 Views
- 2 replies
- 2 kudos
User Management tab not showing
Hi, I created the workspace with my Contributor role from the Azure portal. However, while logged in, I cannot find the User Management tab. I am trying to set up Unity Catalog for user administration. How can I access this? Thanks
Hi @ZafarJ, This is a common point of confusion when getting started with Azure Databricks, and the answer depends on which level of user management you need. WORKSPACE-LEVEL USER MANAGEMENT As a workspace admin, you can manage users directly in your...
- 932 Views
- 6 replies
- 0 kudos
How to restrict Databricks Apps and Vector Search endpoint creation for workspace users
I am looking to restrict all workspace users' access to create Databricks Apps and Vector Search endpoints. I am aware there is no simple toggle; what is the best way to implement it?
Hi @Raman_Unifeye, You are correct that there is no single toggle to block creation of these resources today. Here is a breakdown of the proactive and detective controls available for each. VECTOR SEARCH ENDPOINTS Vector Search endpoints use access c...
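On the detective side, the system.access.audit table can surface who is creating these resources. The service_name and action_name filters below are illustrative placeholders; inspect a few real audit rows in your account to confirm the exact values:

```python
def endpoint_creation_audit_query(days: int = 7) -> str:
    """SQL over system.access.audit listing recent create events for
    Vector Search endpoints and Databricks Apps. The service_name and
    action_name filters are placeholders: confirm the exact values by
    inspecting real audit rows before relying on this."""
    return (
        "SELECT event_time, user_identity.email, service_name, action_name "
        "FROM system.access.audit "
        "WHERE action_name ILIKE '%create%' "
        "AND service_name IN ('vectorSearch', 'apps') "
        f"AND event_time >= now() - INTERVAL {days} DAYS "
        "ORDER BY event_time DESC"
    )
```

Pairing a query like this with a scheduled alert gives you a detective control while you wait for a proper preventive toggle.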
- 425 Views
- 3 replies
- 1 kudos
Identifying workloads in Azure and AWS
We are looking for some Python code that can help us: we need an overview of all Databricks workspaces, their owner names, and mainly the runtime versions that they use, across every Azure and AWS subscription that we manage. Can someone pleas...
Hi @Saurabh_kanoje, There are two complementary approaches to get an overview of all your Databricks workspaces, their owners, and the runtime versions in use across Azure and AWS. I will walk through both. APPROACH 1: SYSTEM TABLES (RECOMMENDED, NO ...
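A common pattern is to loop over workspaces with the Databricks SDK (WorkspaceClient(host=..., token=...).clusters.list() exposes each cluster's spark_version) and then normalize the runtime strings for reporting. Only the parsing step is sketched here; the SDK usage above is an assumption to verify against your setup:

```python
import re

def parse_runtime(spark_version: str):
    """Extract (major, minor) from a cluster's spark_version string,
    e.g. '14.3.x-scala2.12' -> (14, 3). Raises on unrecognized input
    (custom images and ML runtimes may need extra handling)."""
    m = re.match(r"(\d+)\.(\d+)", spark_version)
    if m is None:
        raise ValueError(f"Unrecognized runtime string: {spark_version!r}")
    return int(m.group(1)), int(m.group(2))
```

Once normalized, grouping by (major, minor) across all workspaces gives the runtime-version overview the question asks for; owner names come from the workspace metadata in the respective cloud APIs.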
- 315 Views
- 2 replies
- 0 kudos
Databricks Workspace ACL Enforcement
Databricks Workspace ACL Enforcement: How to Prevent Users from Creating Objects Outside Team Folder and Attaching to Shared Clusters? Background: I am configuring workspace-level access control in Databricks to restrict Data Engineers (DE group) to op...
Hi @APJESK, The two behaviors you are observing are both by design in how Databricks workspace ACLs work. Let me walk through each one and then cover what you can do to tighten governance. ISSUE 1: USERS CAN CREATE NOTEBOOKS IN THEIR HOME FOLDER Ever...
- 741 Views
- 2 replies
- 2 kudos
Best Practice for Sharing AI/BI Dashboards across Workspaces in the same Account
Hello everyone, I'm looking for the most efficient way to share dashboards between two workspaces (Workspace A and Workspace B) within the same Databricks account. [Current Setup] Account: single account with two workspaces (A and B). Data Governance: Bo...
Hi @Seunghyun, This is a common architecture question, and there are several approaches depending on your requirements around freshness, governance, and operational overhead. Let me address each of your questions directly and then recommend an overal...