- 3791 Views
- 1 replies
- 0 kudos
How to calculate accurate usage cost for a longer contractual period?
Hi Experts! I'm working on an accurate total cost calculation (in both DBU and USD) for my team for the whole ongoing contractual period. I've checked the following four options: Account console: Manage account - Usage - Consumption (Legacy): t...
Based on your description, the REST API for billable usage logs (Option 4) is likely the most comprehensive and reliable method for retrieving usage and cost data for the full contractual period, including potentially the missing first two months. Th...
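To make the billable-usage option concrete, here is a minimal sketch of building the download URL for the account-level billable usage log API (`GET /api/2.0/accounts/{account_id}/usage/download`). The host shown is the AWS accounts console; Azure accounts use a different host, and the account ID and month range are placeholders. Authentication (account admin credentials or an OAuth token) is omitted.

```python
# Sketch: build the billable-usage CSV download URL for a month range.
# The account ID and months are placeholders; add your own auth when fetching.
def usage_download_url(account_id: str, start_month: str, end_month: str) -> str:
    base = f"https://accounts.cloud.databricks.com/api/2.0/accounts/{account_id}/usage/download"
    return f"{base}?start_month={start_month}&end_month={end_month}&personal_data=false"

url = usage_download_url("<account-id>", "2024-01", "2024-12")
# Fetch with your HTTP client of choice; the response is CSV rows of usage
# that you can aggregate into per-month DBU and USD totals.
```

The CSV covers the full requested range, which is why this route can recover months missing from the console views.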
- 3451 Views
- 1 replies
- 1 kudos
Get managedResourceGroup from serverless
Hello, in my job I have a task where I should modify a notebook to dynamically get the environment. For example, this is how we get it: dic = {"D":"dev", "Q":"qa", "P":"prod"} managedResourceGroup = spark.conf.get("spark.databricks.xxxxx") xxxxx_Index = m...
To dynamically detect your Databricks environment (dev, qa, prod) in a serverless notebook, without relying on manual REST API calls, you typically need a reliable way to extract context directly inside the notebook. However, serverless notebooks oft...
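The mapping pattern from the question can be kept as a small pure function. The resource group name and the character index below are hypothetical; on serverless compute the Spark conf used to read the managed resource group name may not be available, in which case the value would instead come from a job parameter or widget.

```python
# Sketch of the question's pattern: map one character of the managed
# resource group name to an environment. RG name and index are examples.
ENV_MAP = {"D": "dev", "Q": "qa", "P": "prod"}

def env_from_resource_group(rg_name: str, env_char_index: int) -> str:
    return ENV_MAP.get(rg_name[env_char_index].upper(), "unknown")

env_from_resource_group("rg-D-workspace", 3)  # -> "dev"
```

Keeping the lookup in a function makes it trivial to swap the source of the name (Spark conf, widget, or job parameter) without touching the mapping logic.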
- 3201 Views
- 1 replies
- 0 kudos
Query has been timed out due to inactivity while connecting from Tableau Prep
Hi, we are experiencing a "Query timed out" error while running Tableau flows with connections to Databricks. The query history for the Serverless SQL warehouse initially shows the query as finished in Databricks, but later the query status changes to "Query has been ...
The "Query has been timed out due to inactivity" error with Tableau flows connected to Databricks Serverless SQL Warehouse is a known and intermittent issue impacting several users, even when the SQL warehouse does not auto-terminate during the proce...
- 217 Views
- 2 replies
- 2 kudos
Resolved! Learning Databricks
Hi All, I am new to Databricks and trying to learn my way around. I have experience in platform administration and platform integration and management roles. Can someone please guide me to a correct learning path around platform administration, and is t...
Hi @Saurabh_kanoje, welcome to the Databricks community! In the Databricks Academy, there's a free course called Databricks Platform Administration Fundamentals, which is a great starting point. I'd also recommend exploring the Azure, AWS and GCP Data...
- 3805 Views
- 1 replies
- 0 kudos
Windows ODBC connection error
Hi all, I've just started learning Databricks, have created a community-level workspace, and loaded a few tables. Now I'm trying to access the data from the Excel ODBC connector following the guide here: https://docs.databricks.com/en/integrations/exc...
The “Status: 500 – Internal Server Error” when connecting Databricks to Excel via the ODBC connector usually means something on the Databricks end is not properly configured, or there is an issue with the authentication flow. Here are the main troubl...
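As a checklist for the authentication side, here is a sketch of the connection-string settings commonly used with the Simba Spark ODBC driver and a personal access token (`AuthMech=3`, `UID=token`). Verify the key names against your installed driver's documentation, and note that Community Edition workspaces may not support token-based ODBC access at all.

```python
# Sketch: typical Simba Spark ODBC settings for Databricks PAT auth.
# Host, HTTP path, and token are placeholders for your workspace values.
def databricks_odbc_conn_str(host: str, http_path: str, token: str) -> str:
    return (
        "Driver=Simba Spark ODBC Driver;"
        f"Host={host};Port=443;SSL=1;ThriftTransport=2;"
        f"HTTPPath={http_path};AuthMech=3;UID=token;PWD={token}"
    )
```

A malformed `HTTPPath` or a token pasted with trailing whitespace is a frequent cause of opaque 500-style failures, so it is worth constructing the string programmatically rather than by hand.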
- 4494 Views
- 1 replies
- 0 kudos
Implementing Databricks Persona in
Hi all, I am looking to implement "persona"-based access control across multiple workspaces for multiple user groups in Azure Databricks workspaces. Specifically: I have a "DEV" workspace where the developer groups (Data Engineers and ML Engineer...
You can implement persona-based access control for Azure Databricks workspaces using Terraform and the Databricks provider, aligning with the setup you described for DEV and PROD environments. Terraform allows you to codify workspace configuration, u...
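Under the hood, granting a persona group its workspace capabilities comes down to entitlements on the group. The sketch below builds the SCIM PatchOp body used by the workspace-level Groups API to add entitlements; the entitlement names shown (`workspace-access`, `databricks-sql-access`) are documented values, while the persona grouping itself is an assumption about your setup. Terraform's `databricks_entitlements` resource produces the equivalent effect declaratively.

```python
# Sketch: SCIM PatchOp payload adding entitlements to a persona group.
def entitlements_patch(entitlements):
    return {
        "schemas": ["urn:ietf:params:scim:api:messages:2.0:PatchOp"],
        "Operations": [{
            "op": "add",
            "path": "entitlements",
            "value": [{"value": e} for e in entitlements],
        }],
    }

# e.g. a DEV "data engineer" persona that needs both workspace and SQL access:
body = entitlements_patch(["workspace-access", "databricks-sql-access"])
```

Encoding personas as entitlement sets like this keeps DEV and PROD definitions in one place, which is exactly what the Terraform approach codifies.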
- 4106 Views
- 1 replies
- 0 kudos
Programmatically setting tags for securables
Unity Catalog securable objects can be tagged with key-value pairs: https://learn.microsoft.com/en-us/azure/databricks/database-objects/tags. Is it possible to tag objects via REST API calls? I initially thought any Unity Catalog resource in the Databricks...
Hello @camilo_s, thanks for sharing the doc link and the details you observed in the UI network calls. Short answer: there isn't a documented, stable, public REST endpoint specifically for "tags on UC securables" today. You should use SQL DDL to man...
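The documented SQL DDL route is `ALTER <securable> ... SET TAGS`. A small helper like the one below (an illustrative sketch, not a Databricks API) can generate the statement to run via `spark.sql` or a SQL warehouse; the catalog/schema/table names are placeholders.

```python
# Sketch: build the documented UC tagging DDL for a securable.
def set_tag_sql(securable_type: str, fq_name: str, tags: dict) -> str:
    pairs = ", ".join(f"'{k}' = '{v}'" for k, v in tags.items())
    return f"ALTER {securable_type} {fq_name} SET TAGS ({pairs})"

stmt = set_tag_sql("TABLE", "main.sales.orders", {"pii": "true"})
# spark.sql(stmt)  # run inside a notebook/job with UC permissions on the object
```

The same `SET TAGS` / `UNSET TAGS` syntax applies to catalogs, schemas, and columns, so one helper covers programmatic tagging without depending on undocumented UI endpoints.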
- 3588 Views
- 1 replies
- 0 kudos
Delete Users that are Maintenance Readers
I am an Account Admin at Databricks (Azure) and am trying to delete users that are being offboarded. I have managed to delete most users; however, for a couple, I get the following message (see screenshot): ABORTED: Account <account> is read-only during ...
When trying to delete users in Databricks (Azure) and encountering the message "ABORTED: Account <account> is read-only during maintenance and cannot be updated," this means that your Databricks account is currently in a maintenance mode where no cha...
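Since the maintenance window is transient, the practical fix is usually to retry the delete after it passes. A generic retry-with-backoff sketch is shown below; `fn` stands in for your actual SCIM user-delete call, and the error type and message check are assumptions to adapt to your client library.

```python
import time

# Sketch: retry a transient "read-only during maintenance" failure with
# exponential backoff. Substitute your real API call and exception type.
def retry_on_maintenance(fn, attempts: int = 5, base_delay: float = 60.0):
    for i in range(attempts):
        try:
            return fn()
        except RuntimeError as exc:  # placeholder for the real API error class
            if "read-only during maintenance" not in str(exc) or i == attempts - 1:
                raise
            time.sleep(base_delay * (2 ** i))
```

With `base_delay=60` this waits 1, 2, 4, ... minutes between attempts, which comfortably outlasts a short maintenance window.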
- 4149 Views
- 1 replies
- 0 kudos
Databricks report error: unexpected end of stream, read 0 bytes from 4 (socket was closed by server)
Has anyone encountered this error and found a way to resolve it? "Unexpected end of stream, read 0 bytes from 4 (socket was closed by server)." This occurs in Databricks while generating reports. I've already adjusted wait_timeout to 28,800, and both ...
Yes, the "Unexpected end of stream, read 0 bytes from 4 (socket was closed by server)" error has been encountered by other Databricks users when generating reports with MySQL. You've already set the major MySQL timeout parameters to their maximums, w...
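Beyond the server-side `wait_timeout`, the client side of the connection matters too. MySQL Connector/J accepts `socketTimeout`, `connectTimeout`, and `tcpKeepAlive` as JDBC URL parameters; the sketch below builds such a URL for a Spark JDBC read (host and database names are placeholders).

```python
# Sketch: a MySQL JDBC URL with client-side timeout/keep-alive parameters
# (milliseconds) that help avoid mid-query socket closure on long reports.
def mysql_jdbc_url(host: str, db: str, socket_timeout_ms: int = 600_000) -> str:
    return (
        f"jdbc:mysql://{host}:3306/{db}"
        f"?socketTimeout={socket_timeout_ms}&connectTimeout=60000&tcpKeepAlive=true"
    )

# spark.read.format("jdbc").option("url", mysql_jdbc_url("db.example.com", "reports")) ...
```

`tcpKeepAlive=true` in particular helps when an intermediate firewall or load balancer silently drops idle connections, which produces exactly this "socket was closed by server" symptom.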
- 3648 Views
- 1 replies
- 0 kudos
Need to create an Identity Federation between my Databricks workspace/account and my AWS account
Hello, I need to set up identity federation between my Databricks workspace/account and my AWS account, where Databricks is already deployed. The goal is to enable easy authentication without access and secret keys, so I thought OIDC would be the s...
To set up identity federation between your Databricks workspace/account and your AWS account without using access or secret keys, you can leverage OIDC (OpenID Connect) federation. Instead of traditional SSO, what you're looking for is a model where AWS t...
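On the AWS side, the key artifact is the IAM role's trust policy allowing `sts:AssumeRoleWithWebIdentity` for tokens from your OIDC provider. Below is a sketch of that policy document; the provider ARN and audience value are assumptions you would replace with the ones from your own IAM OIDC provider registration.

```python
import json

# Sketch: IAM trust policy for web-identity federation. The condition key
# is "<provider host/path>:aud" per AWS's OIDC federation pattern.
def oidc_trust_policy(provider_arn: str, audience: str) -> str:
    provider = provider_arn.split("oidc-provider/", 1)[1]
    return json.dumps({
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": {"Federated": provider_arn},
            "Action": "sts:AssumeRoleWithWebIdentity",
            "Condition": {"StringEquals": {f"{provider}:aud": audience}},
        }],
    })
```

Once the role trusts the provider, a workload presents its OIDC token to STS and receives short-lived credentials, so no long-lived access/secret keys are ever stored.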
- 3852 Views
- 2 replies
- 0 kudos
OAuth Url and ClientId Validation
Hi, I am trying to set up an OAuth connection with Databricks, so I ask the user to enter their Workspace URL and ClientId. Once the user enters these values, I want to validate whether they are correct, so I ask them to log in by redirecting them ...
If you’re using OAuth with Databricks and want to validate both the Workspace URL and ClientId before proceeding, you’re facing an issue seen by others: when the Workspace URL is correct but the ClientId is wrong, Databricks just displays a generic e...
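One pre-check pattern is to validate the two inputs separately: first confirm the Workspace URL by fetching its OAuth metadata document, then build the authorize URL for the ClientId check. The sketch below assumes the `/oidc/.well-known/oauth-authorization-server` discovery path and `/oidc/v1/authorize` endpoint used by Databricks workspace OAuth; verify both against the docs for your cloud.

```python
from urllib.parse import urlencode

# Sketch: URL builders for a two-step pre-validation of workspace + client id.
def discovery_url(workspace_url: str) -> str:
    # A 200 response with OAuth metadata confirms the workspace URL is real.
    return f"{workspace_url.rstrip('/')}/oidc/.well-known/oauth-authorization-server"

def authorize_url(workspace_url: str, client_id: str, redirect_uri: str) -> str:
    params = urlencode({
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "response_type": "code",
        "scope": "all-apis offline_access",
    })
    return f"{workspace_url.rstrip('/')}/oidc/v1/authorize?{params}"
```

Fetching the discovery document first lets you show a precise "workspace not found" error, so the generic error page is only ever reached when the ClientId itself is wrong.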
- 3769 Views
- 1 replies
- 0 kudos
Custom Runtime marketplace
Hi! Is it possible to share a solution accelerator built on a custom runtime via the Databricks Marketplace?
Greetings @evgenyvainerman, sorry for the delayed response. Your question is not entirely clear, but I will take a swing at an answer. Short answer: yes, you can share a Solution Accelerator through Databricks Marketplace, but Marketplace...
- 4095 Views
- 1 replies
- 0 kudos
Unity Catalog Not Enabled on Job Cluster When Creating DLT in GCP Databricks
I am trying to create a Delta Live Tables (DLT) pipeline in my GCP Databricks workspace, but I am encountering an issue where Unity Catalog is not enabled on the job cluster. Steps I followed: created a DLT pipeline using the Databricks UI; selected the appropria...
The error “Unity Catalog is not enabled on this job cluster” during Delta Live Table (DLT) pipeline execution in your GCP Databricks workspace is a common issue, especially after Unity Catalog onboarding. Your troubleshooting steps cover most essenti...
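The main thing to verify in the pipeline definition is that it targets a UC catalog rather than legacy storage: a pipeline configured with a `catalog` field gets UC-enabled compute, while one configured with the legacy `storage` location does not. Below is a sketch of the relevant settings fields; the catalog, schema, and notebook path are placeholders.

```python
# Sketch: DLT pipeline settings in Unity Catalog mode. The presence of
# "catalog" (and absence of legacy "storage") selects UC-enabled compute.
pipeline_settings = {
    "name": "my_dlt_pipeline",
    "catalog": "main",       # UC catalog for published tables
    "target": "reports",     # schema within that catalog
    "libraries": [{"notebook": {"path": "/Repos/team/pipeline_nb"}}],
    "channel": "CURRENT",
}
```

If a pipeline was originally created with `storage`, it cannot be switched in place; recreating it with `catalog` set is the usual remedy after UC onboarding.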
- 3579 Views
- 1 replies
- 0 kudos
Databricks Managed RG Storage cost is High
Hi Community, how do I calculate the Databricks storage cost, and where can I see the data that is stored and charged in Databricks? I'm trying to understand the storage cost on a managed resource group, and I'm clueless about the data and where it is stored...
To calculate the storage cost for Databricks in Azure and view the data being stored and charged, you need to consider both the Databricks compute (DBUs) and the storage resources (such as Azure Data Lake Storage or Blob Storage) linked to your Datab...
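The storage portion of the bill is simple arithmetic once you know the stored volume: GB-months times the per-GB rate for the storage account's tier and redundancy. The rate below is a hypothetical hot-tier LRS figure; use your region's actual Azure Storage pricing, and remember the managed resource group's storage account (the DBFS root) is billed like any other storage account.

```python
# Back-of-envelope storage cost: stored GB-months * per-GB rate.
# 0.018 USD/GB-month is a hypothetical rate for illustration only.
stored_gb = 2048
rate_per_gb_month = 0.018
monthly_cost_usd = stored_gb * rate_per_gb_month  # ~= 36.86 USD
```

Azure Cost Management can confirm the actual figure: filter by the managed resource group and the storage account resource type to see what that line item really costs.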
- 3638 Views
- 1 replies
- 0 kudos
Lakehouse Federation - Unable to connect to Snowflake using "PEM Private Key"
Hi, I'm currently using the Lakehouse Federation feature in Databricks to run queries against a Snowflake data warehouse. Today I'm using a service credential to establish the connection (user ID & password), but I have to change it to use a private key. I tried us...
To assist with your Databricks Lakehouse Federation connection to Snowflake using a PEM private key, let's clarify the underlying issue. You mentioned that the connection works with a service credential (user ID & password) but fails when switching to the "PE...
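A very common cause of this switch failing is the key format: Snowflake key-pair authentication expects a PKCS#8 private key (`-----BEGIN PRIVATE KEY-----`), not a PKCS#1 key (`-----BEGIN RSA PRIVATE KEY-----`), and some clients want the base64 body without headers or line breaks. The helper below is an illustrative sketch of both checks, not the Databricks connector's API.

```python
# Sketch: reject PKCS#1 keys and strip PEM headers/newlines to get the
# bare base64 body, which some clients expect when pasting the key.
def pem_body(pem: str) -> str:
    if "BEGIN RSA PRIVATE KEY" in pem:
        raise ValueError("PKCS#1 key; convert to PKCS#8, e.g. openssl pkcs8 -topk8")
    lines = [l.strip() for l in pem.strip().splitlines()
             if "-----" not in l and l.strip()]
    return "".join(lines)
```

If the key was generated with a passphrase, that is a second thing to check, since an encrypted key needs the passphrase supplied alongside it.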
- Access control (1)
- Apache spark (1)
- Azure (7)
- Azure databricks (5)
- Billing (2)
- Cluster (1)
- Compliance (1)
- Data Ingestion & connectivity (5)
- Databricks Runtime (1)
- Databricks SQL (2)
- DBFS (1)
- Dbt (1)
- Delta Sharing (1)
- DLT Pipeline (1)
- GA (1)
- Gdpr (1)
- Github (1)
- Partner (57)
- Public Preview (1)
- Service Principals (1)
- Unity Catalog (1)
- Workspace (2)