- 1361 Views
- 1 replies
- 0 kudos
Custom hostname for Azure Databricks workspace
For a client, it's required that all applications use a subdomain of an internal hostname. Does anyone have a suggestion on how to achieve this? Options could be along the lines of a CNAME or reverse proxy, but I couldn't get that to work yet. Does Azure Datab...
Hi @gt-ferdyh, Azure Databricks doesn't natively support custom subdomains. However, to make it accessible via an internal subdomain, you can: Enable Azure Private Link for your Databricks workspace. In your internal DNS, create an A record or CNAME th...
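As a sketch of the DNS piece only (all hostnames here are hypothetical placeholders, not from the thread), the internal record the reply describes might look like:

```
; In the internal DNS zone, point the friendly name at the
; workspace URL (or the Private Link endpoint's record):
databricks.internal.contoso.com.  3600  IN  CNAME  adb-1234567890123456.7.azuredatabricks.net.
```

This only handles name resolution; as the reply notes, the workspace itself has no native support for serving under a custom hostname.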
- 1422 Views
- 2 replies
- 0 kudos
Resolved! "Aws Invalid Kms Key State" when trying to start new cluster on AWS
Hey, we just established a new environment based on AWS. Our first step was to create a cluster, but while doing so we encountered an error. We tried different policies, configurations, and instance types. All resulted in: Aws Invalid Kms Key State: Th...
Yes, I followed this document and that fixed it. Thanks.
- 1165 Views
- 2 replies
- 2 kudos
Databricks Logs
I’m trying to understand the different types of logs available in Databricks and how to access and interpret them. Could anyone please guide me on: What types of logs are available in Databricks? Where can I find these logs? How can I use these logs to ...
Hi @APJESK, in addition, job logs are very useful for monitoring and troubleshooting job failures. They can be found under Workflows. The workspace admin role is required for full access to all jobs unless access is explicitly granted to the user by the job owner/adm...
- 5759 Views
- 1 replies
- 8 kudos
Azure Databricks Multi Tenant Solution
Hello Everyone,For the past few months, we’ve been extensively exploring the use of Databricks as the core of our data warehousing product. We provide analytics dashboards to other organizations and are particularly interested in the Column-Level Sec...
Could you please let me know, or provide a link describing, how this is done in Databricks on AWS using OIDC or SAML directly as SSO providers? I have exactly the same kind of use case.
- 2332 Views
- 3 replies
- 2 kudos
Resolved! Clarification on Databricks Access Modes: Standard vs. Dedicated with Unity Catalog
Hello, I’d like to ask for clarification regarding the access modes in Databricks, specifically the intent and future direction of the “Standard” and “Dedicated” modes. According to the documentation below: https://docs.databricks.com/aws/ja/compute/acc...
Hi @Yuki Thank you, I am glad I could help!
- 1416 Views
- 2 replies
- 0 kudos
Have authenticator app set up but databricks still resorts to email mfa
Good morning, I set up an authenticator app through Settings > Profile > MFA. However, when I log out and log back into the workspace, I am still only getting the email verification to get in. Is there a way I can turn off email verification or have it d...
Hi @andreapeterson, do you have Account Admin privileges? If you do, you can enforce MFA from the account console, as shown in the attached image. If you’re using the Databricks Free Edition rather than a paid account, you can’t enforce MFA. This ...
- 1429 Views
- 1 replies
- 0 kudos
Hey @prakash4, I understand you're having trouble connecting to a remote server / getting your Databricks workspace to start. Would you be able to share a little more detail to help track down your issue? Specifically: What happens when you try to sta...
- 625 Views
- 1 replies
- 3 kudos
Resolved! Cost of SQL Warehouse scaled to 0 clusters
A SQL Warehouse can be scaled to a minimum of 0. Presumably, there is still a cost to keeping the resource active, because we also have Auto-Stop, which can completely stop the warehouse after a configurable amount of time. This cost is not documented. ...
1. Just to clarify, the minimum is 1, not 0. See https://docs.databricks.com/api/workspace/warehouses/create#min_num_clusters
2. How it works is based on concurrency. Assuming your max clusters is 2 and, based on the concurrency, we don't need the 2nd cl...
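For reference, the min_num_clusters field the reply links to is part of the warehouse create payload. A minimal illustration (name, size, and limits are placeholder values, not from the thread) might look like:

```json
{
  "name": "reporting-wh",
  "cluster_size": "Small",
  "min_num_clusters": 1,
  "max_num_clusters": 2,
  "auto_stop_mins": 10
}
```

With Auto-Stop configured this way, the warehouse shuts down entirely after 10 idle minutes rather than idling at its minimum cluster count.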
- 1502 Views
- 2 replies
- 2 kudos
Resolved! JDBC Insert Performance and Unsupported Data Types
We are reaching out regarding two observations with the Databricks JDBC driver: We’ve noticed that each INSERT query is taking approximately 1 second to execute via the JDBC driver (please refer to the attached screenshot). This seems unusually slow f...
Hi @ankit_kothiya1, please find my findings for your query below. 1. Slow INSERT performance via the Databricks JDBC driver. Observation: each INSERT query takes about 1 second via the Databricks JDBC driver, which is unusually slow for high-throughput use ...
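The usual remedy for slow per-row INSERTs is batching: submitting one statement per row pays the full round-trip overhead each time, while a batched submission amortizes it. A minimal sketch of the idea, using Python's sqlite3 as a stand-in for the Databricks connection (the batching pattern, not the driver, is the point):

```python
# Illustrative only: sqlite3 stands in for a Databricks SQL/JDBC connection.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER, payload TEXT)")

rows = [(i, f"payload-{i}") for i in range(1000)]

# Anti-pattern: one round trip per row.
# for r in rows:
#     conn.execute("INSERT INTO events VALUES (?, ?)", r)

# Better: submit the whole batch in one call.
conn.executemany("INSERT INTO events VALUES (?, ?)", rows)
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM events").fetchone()[0]
print(count)  # 1000
```

With JDBC specifically, the equivalent is `addBatch()`/`executeBatch()` on a `PreparedStatement`; for bulk loads into Databricks, staging files and using COPY INTO is generally faster still.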
- 1136 Views
- 2 replies
- 0 kudos
Guidance on Populating the cloud_infra_cost Table in System Catalog
In the system catalog, there are three tables: cloud_infra_cost, list_prices, and usage. While the list_prices and usage tables contain cost-related information, the cloud_infra_cost table is currently empty. I am using AWS cloud. Can anyone provide ...
I am opted in to the preview features, but I don't see any data in this table.
- 976 Views
- 3 replies
- 1 kudos
Principals given access to and their owners
Hi all, in a large global data platform built with Azure Databricks, I'd like to know the best practice for maintaining the list of principals to which Databricks objects (typically views) have been given access. For example, a view has been given access to a servic...
@noorbasha534 - I created a helper function/script to do this in my environment that queries the Unity Catalog system tables to generate a unique list of impacted principals/users. It takes in a list of fully qualified object names and will display ...
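A hedged sketch of the kind of helper described above: given fully qualified object names, build a query against the Unity Catalog system tables to list the principals holding grants on them. The table and column names follow `system.information_schema.table_privileges`; verify them against your workspace before relying on this.

```python
# Sketch only: assumes Unity Catalog system tables are enabled. In real use,
# pass the query to spark.sql() or a SQL warehouse, and prefer parameterized
# queries over string interpolation for untrusted input.
def grants_query(objects: list[str]) -> str:
    """Build a query listing distinct grantees on the given catalog.schema.object names."""
    preds = " OR ".join(
        f"(table_catalog = '{c}' AND table_schema = '{s}' AND table_name = '{t}')"
        for c, s, t in (name.split(".") for name in objects)
    )
    return (
        "SELECT DISTINCT grantee "
        "FROM system.information_schema.table_privileges "
        f"WHERE {preds}"
    )

q = grants_query(["main.sales.orders_v", "main.sales.customers_v"])
print(q)
```

For group grantees, the result would still need to be expanded into individual users via the account groups API or the identity system tables.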
- 2435 Views
- 8 replies
- 2 kudos
Resolved! Creating Groups with API and Python
I am working on a notebook to help me create Azure Databricks Groups. When I create a group in a workspace using the UI, it automatically creates the group at the account level and links them. When I create a group using the API, and I create the w...
I have a couple of questions regarding the token to achieve this. If I create a workspace PAT token, is it limited to only that workspace, or all the workspaces I have access to? And do my account admin privileges translate to the PAT token I create i...
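Creating the group at the account level (rather than in a workspace), as this thread is after, goes through the account SCIM endpoint. A hedged sketch; the account ID and token are placeholders, and the request is constructed but deliberately not sent:

```python
# Sketch only: targets the Azure Databricks *account-level* SCIM API so the
# group exists at the account (like the UI does), not just in one workspace.
import json
import urllib.request

account_id = "<account-id>"          # placeholder
token = "<account-admin-token>"      # placeholder; must be an account-level token

payload = {
    "schemas": ["urn:ietf:params:scim:schemas:core:2.0:Group"],
    "displayName": "data-engineers",
}

req = urllib.request.Request(
    f"https://accounts.azuredatabricks.net/api/2.0/accounts/{account_id}/scim/v2/Groups",
    data=json.dumps(payload).encode(),
    headers={
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/scim+json",
    },
    method="POST",
)
# urllib.request.urlopen(req) would send it; not executed in this sketch.
```

On the question above: a workspace PAT authenticates only against its own workspace's APIs, which is why account-level operations like this need an account-scoped credential instead.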
- 613 Views
- 1 replies
- 0 kudos
How to write files to Databricks Volumes while running code in local VS Code (without cp)
I always struggle to seamlessly use VS Code with Databricks. It's so not user-friendly. Do you also feel the same?
Hi @gowtham-talluru, if you're trying to write directly to Volumes from local code, you can use the Databricks SDK for Python. Try something like this: from databricks.sdk import WorkspaceClient; w = WorkspaceClient(); with open("local_file.csv", "rb") as ...
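A hedged completion of that idea: build the Volume destination path, then hand the local file to the SDK's Files API. The catalog/schema/volume names are placeholders, and the SDK call (which needs `databricks-sdk` installed plus workspace auth) is shown but not executed here.

```python
# Sketch only: placeholder Volume names; w.files.upload is per the
# databricks-sdk Files API.
def volume_path(catalog: str, schema: str, volume: str, filename: str) -> str:
    """Compose the /Volumes/... path Unity Catalog Volumes use."""
    return f"/Volumes/{catalog}/{schema}/{volume}/{filename}"

dest = volume_path("main", "raw", "landing", "local_file.csv")
print(dest)

# Requires `pip install databricks-sdk` and configured auth; not run here:
# from databricks.sdk import WorkspaceClient
# w = WorkspaceClient()
# with open("local_file.csv", "rb") as f:
#     w.files.upload(dest, f, overwrite=True)
```

This avoids `databricks fs cp` entirely, which is convenient when the upload is one step inside a larger local script.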
- 1372 Views
- 2 replies
- 2 kudos
Resolved! Transfer Account Ownership
I have the same issue as this previous user, who had their question resolved before an actionable solution was provided: https://community.databricks.com/t5/data-engineering/how-to-transfer-ownership-of-a-databricks-cloud-standard-account/td-p/34737I...
You are going to have a very difficult time with the transfer, as it can only be done on the back end by Databricks. Your only real option would be to have your customer create their own account and migrate the workspace assets over, outside of having...
- 891 Views
- 2 replies
- 1 kudos
users' usage report (for frontend power bi)
Hi All, I'm currently working on retrieving usage information by querying system tables. At the moment, I'm using the system.access.audit table. However, I've noticed that the list of users retrieved appears to be incomplete when compared to si...
Thank you for the reply. If I understand correctly, when using PBI DirectQuery connectivity, the identity used is not a service principal but the end user who opens the PBI dashboard, correct? Did you implement any usage report? Regards, Uri