Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

Saurabh_kanoje
by New Contributor
  • 179 Views
  • 2 replies
  • 2 kudos

Resolved! Learning Databricks

Hi All, I am new to Databricks and trying to learn my way around. I have experience in platform administration and platform integration and management roles. Can someone please guide me to a correct learning path around platform administration, and is t...

Latest Reply
bianca_unifeye
New Contributor III
  • 2 kudos

Hi @Saurabh_kanoje, welcome to the Databricks community! In the Databricks Academy, there's a free course called Databricks Platform Administration Fundamentals, which is a great starting point. I'd also recommend exploring the Azure, AWS and GCP Data...

1 More Replies
Vadimalk
by New Contributor II
  • 3716 Views
  • 1 reply
  • 0 kudos

Windows ODBC connection error

Hi all, I've just started learning Databricks, created a Community Edition workspace, and loaded a few tables. Now I'm trying to access the data from the Excel ODBC connector, following the guide here: https://docs.databricks.com/en/integrations/exc...

Latest Reply
mark_ott
Databricks Employee
  • 0 kudos

The “Status: 500 – Internal Server Error” when connecting Databricks to Excel via the ODBC connector usually means something on the Databricks end is not properly configured, or there is an issue with the authentication flow. Here are the main troubl...
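As a first configuration check, it can help to build the DSN-less connection string the Simba Spark ODBC driver expects and compare it field by field with what Excel is sending. A minimal sketch (stdlib only); the host, HTTP path, and token values are placeholders you replace with the values from your cluster or warehouse's JDBC/ODBC tab:

```python
# Sketch: assembling a Databricks ODBC connection string (Simba Spark driver).
# All values below are placeholders, not real credentials.
def databricks_odbc_conn_str(host: str, http_path: str, token: str) -> str:
    parts = {
        "Driver": "Simba Spark ODBC Driver",
        "Host": host,
        "Port": "443",
        "HTTPPath": http_path,
        "SSL": "1",              # TLS is required
        "ThriftTransport": "2",  # HTTP transport
        "AuthMech": "3",         # user/password auth; the user is the literal "token"
        "UID": "token",
        "PWD": token,
    }
    return ";".join(f"{k}={v}" for k, v in parts.items())

conn_str = databricks_odbc_conn_str(
    "adb-1234567890123456.7.azuredatabricks.net",  # placeholder workspace host
    "/sql/1.0/warehouses/abc123",                  # placeholder HTTP path
    "dapiXXXXXXXX",                                # placeholder personal access token
)
print(conn_str)
```

In practice you would pass this string to `pyodbc.connect(...)` or mirror the same fields in the Excel DSN dialog; a wrong `HTTPPath` or a token pasted into the wrong field is a common cause of server-side errors.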

d_kailthya
by New Contributor
  • 4382 Views
  • 1 reply
  • 0 kudos

Implementing Databricks Persona in

Hi all, I am looking to implement "persona"-based access control across multiple workspaces for multiple user groups in Azure Databricks. Specifically, I have a "DEV" workspace where the developer groups (Data Engineers and ML Engineer...

Latest Reply
mark_ott
Databricks Employee
  • 0 kudos

You can implement persona-based access control for Azure Databricks workspaces using Terraform and the Databricks provider, aligning with the setup you described for DEV and PROD environments. Terraform allows you to codify workspace configuration, u...
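One way to keep such a setup declarative is to define the persona-to-group-to-entitlement mapping as plain data and feed it to Terraform (e.g. as a `terraform.tfvars.json`). A minimal sketch; the group names and workspace labels are hypothetical, while the entitlement strings are standard Databricks entitlement names:

```python
import json

# Hypothetical persona mapping; group names and workspace labels are placeholders.
personas = {
    "data_engineers": {"workspaces": ["DEV"],  "entitlements": ["workspace-access", "allow-cluster-create"]},
    "ml_engineers":   {"workspaces": ["DEV"],  "entitlements": ["workspace-access", "allow-cluster-create"]},
    "analysts":       {"workspaces": ["PROD"], "entitlements": ["databricks-sql-access"]},
}

# Serialize for consumption by Terraform (e.g. terraform.tfvars.json).
tfvars = json.dumps({"personas": personas}, indent=2)
print(tfvars)
```

On the Terraform side you would loop over this map with `for_each` to create `databricks_group` resources and attach entitlements, so adding a persona becomes a data change rather than new HCL.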

camilo_s
by Contributor
  • 4027 Views
  • 1 reply
  • 0 kudos

Programmatically setting tags for securables

Unity Catalog securable objects can be tagged with key-value pairs (https://learn.microsoft.com/en-us/azure/databricks/database-objects/tags). Is it possible to tag objects via REST API calls? I initially thought any Unity Catalog resource in the Databricks...

Latest Reply
Louis_Frolio
Databricks Employee
  • 0 kudos

Hello @camilo_s, thanks for sharing the doc link and the details you observed in the UI network calls. Short answer: there isn't a documented, stable, public REST endpoint specifically for "tags on UC securables" today. You should use SQL DDL to man...
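The SQL DDL route is easy to drive programmatically: generate `ALTER ... SET TAGS` statements and run them through a SQL warehouse connection. A minimal sketch of the statement builder; the catalog, schema, and table names are placeholders:

```python
# Sketch: building Unity Catalog SET TAGS DDL statements programmatically.
# securable_type is e.g. "TABLE", "SCHEMA", or "CATALOG"; fqn is the object's full name.
def set_tags_ddl(securable_type: str, fqn: str, tags: dict) -> str:
    pairs = ", ".join(f"'{k}' = '{v}'" for k, v in tags.items())
    return f"ALTER {securable_type} {fqn} SET TAGS ({pairs})"

ddl = set_tags_ddl("TABLE", "main.sales.orders", {"pii": "false", "owner": "data-eng"})
print(ddl)
```

You would then execute each statement via, for example, the Databricks SQL connector or the SQL Statement Execution API, which keeps you on the documented, supported surface.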

daniel23
by New Contributor II
  • 3515 Views
  • 1 reply
  • 0 kudos

Delete Users that are Maintenance Readers

I am an Account Admin at Databricks (Azure), trying to delete users who are being offboarded. I have managed to delete most users. However, for a couple, I get the following message (see screenshot): ABORTED: Account <account> is read-only during ...

[Attachment: abort-delete.PNG]
Latest Reply
mark_ott
Databricks Employee
  • 0 kudos

When trying to delete users in Databricks (Azure) and encountering the message "ABORTED: Account <account> is read-only during maintenance and cannot be updated," this means that your Databricks account is currently in a maintenance mode where no cha...
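Since maintenance windows are transient, a practical workaround is to retry the deletion with exponential backoff. A sketch; the matched error substring and the delay schedule are assumptions you would adapt to the exact error your SCIM call returns:

```python
import time

def retry_deletion(delete_fn, attempts=5, base_delay=30.0, sleep=time.sleep):
    """Retry a user-deletion call while the account is read-only during maintenance."""
    for i in range(attempts):
        try:
            return delete_fn()
        except RuntimeError as err:
            # Retry only on the maintenance error; re-raise anything else
            # and re-raise the final failure.
            if "read-only during maintenance" not in str(err) or i == attempts - 1:
                raise
            sleep(base_delay * (2 ** i))  # 30s, 60s, 120s, ...
```

Here `delete_fn` is any callable that wraps the actual deletion (e.g. a SCIM DELETE via the Databricks SDK); the injectable `sleep` parameter just makes the helper testable.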

SolaireOfAstora
by New Contributor
  • 4058 Views
  • 1 reply
  • 0 kudos

Databricks report error: unexpected end of stream, read 0 bytes from 4 (socket was closed by server)

Has anyone encountered this error, and does anyone know how to resolve it? "Unexpected end of stream, read 0 bytes from 4 (socket was closed by server)." This occurs in Databricks while generating reports. I've already adjusted the wait_timeout to 28,800, and both ...

Latest Reply
mark_ott
Databricks Employee
  • 0 kudos

Yes, the "Unexpected end of stream, read 0 bytes from 4 (socket was closed by server)" error has been encountered by other Databricks users when generating reports with MySQL. You've already set the major MySQL timeout parameters to their maximums, w...
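Beyond server-side timeouts, the client side is worth tuning: bound the driver's socket timeouts and partition the read so no single connection streams for hours. A sketch of Spark JDBC options using MySQL Connector/J URL parameters; the host, table, partition column, and bounds are placeholders for your schema:

```python
# Sketch: Spark JDBC read options that reduce long single-stream MySQL reads.
# Host, database, table, column, and bounds below are placeholders.
jdbc_options = {
    "url": "jdbc:mysql://db-host:3306/reports?socketTimeout=600000&connectTimeout=60000",
    "dbtable": "report_source",
    "fetchsize": "10000",     # stream rows in batches instead of buffering everything
    "numPartitions": "8",     # split the read across connections
    "partitionColumn": "id",  # must be a numeric/date column
    "lowerBound": "1",
    "upperBound": "1000000",
}
# Usage inside Databricks:
#   df = spark.read.format("jdbc").options(**jdbc_options).load()
print(jdbc_options["url"])
```

Partitioning keeps each connection short-lived, which sidesteps many "socket was closed by server" failures even when server timeouts cannot be raised further.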

axelboursin
by New Contributor II
  • 3559 Views
  • 1 reply
  • 0 kudos

Need to create an Identity Federation between my Databricks workspace/account and my AWS account

Hello, I need to set up identity federation between my Databricks workspace/account and my AWS account, where Databricks is already deployed. The goal is to enable easy authentication without access and secret keys. So I thought that OIDC would be the s...

Latest Reply
mark_ott
Databricks Employee
  • 0 kudos

To set up identification between your Databricks workspace/account and your AWS account without using access or secret keys, you can leverage OIDC (OpenID Connect) federation. Instead of traditional SSO, what you’re looking for is a model where AWS t...
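On the AWS side, "AWS trusts the Databricks OIDC issuer" concretely means an IAM OIDC identity provider plus a role whose trust policy allows `sts:AssumeRoleWithWebIdentity`. A sketch of that trust policy as data; the account ID, issuer host, and audience are placeholders (the real issuer and audience values come from your Databricks account's federation settings):

```python
import json

# Placeholders: substitute your AWS account ID, the OIDC issuer host for your
# Databricks account, and the audience value configured for the federation.
oidc_provider = "oidc.example-databricks-issuer.com"
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"Federated": f"arn:aws:iam::123456789012:oidc-provider/{oidc_provider}"},
        "Action": "sts:AssumeRoleWithWebIdentity",
        "Condition": {"StringEquals": {f"{oidc_provider}:aud": "my-audience"}},
    }],
}
print(json.dumps(trust_policy, indent=2))
```

With this in place, a short-lived OIDC token from Databricks is exchanged at STS for temporary AWS credentials, so no long-lived access/secret keys are stored anywhere.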

Leo_310
by New Contributor II
  • 3759 Views
  • 2 replies
  • 0 kudos

OAuth Url and ClientId Validation

Hi, I am trying to set up an OAuth connection with Databricks, so I ask the user to enter their Workspace URL and ClientId. Once the user enters these values, I want to validate whether they are correct, so I ask them to log in by redirecting them ...

Latest Reply
mark_ott
Databricks Employee
  • 0 kudos

If you’re using OAuth with Databricks and want to validate both the Workspace URL and ClientId before proceeding, you’re facing an issue seen by others: when the Workspace URL is correct but the ClientId is wrong, Databricks just displays a generic e...
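For the redirect itself, it helps to construct the authorization URL explicitly so you control exactly which parameters the workspace sees. A sketch assuming the workspace OAuth endpoints live under `/oidc/v1/` (verify that path against the OAuth docs for your cloud); the workspace URL, client ID, and redirect URI are placeholders:

```python
from urllib.parse import urlencode

def authorize_url(workspace_url: str, client_id: str, redirect_uri: str, state: str) -> str:
    # Assumption: Databricks workspace OAuth authorize endpoint is /oidc/v1/authorize.
    params = {
        "client_id": client_id,
        "response_type": "code",
        "redirect_uri": redirect_uri,
        "scope": "all-apis offline_access",
        "state": state,  # random value you verify again on the callback
    }
    return f"{workspace_url}/oidc/v1/authorize?{urlencode(params)}"

url = authorize_url(
    "https://adb-123.4.azuredatabricks.net",  # placeholder workspace URL
    "my-client-id",                           # placeholder client ID
    "http://localhost:8020/cb",
    "xyz",
)
print(url)
```

Since a bad ClientId only surfaces as a generic error page on that endpoint, a cheap pre-check of the Workspace URL alone is to fetch the workspace's OIDC discovery document first and only then redirect, so you can at least distinguish "wrong URL" from "wrong ClientId".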

1 More Replies
evgenyvainerman
by New Contributor
  • 3682 Views
  • 1 reply
  • 0 kudos

Custom Runtime marketplace

Hi! Is it possible to share a solution accelerator on a custom runtime via the Databricks Marketplace?

Latest Reply
Louis_Frolio
Databricks Employee
  • 0 kudos

Greetings @evgenyvainerman, sorry for the delayed response. Your question is not entirely clear, but I will take a swing at providing an answer. Short answer: yes, you can share a Solution Accelerator through Databricks Marketplace, but Marketplace...

vidya_kothavale
by Contributor
  • 4004 Views
  • 1 reply
  • 0 kudos

Unity Catalog Not Enabled on Job Cluster When Creating DLT in GCP Databricks

I am trying to create a Delta Live Table (DLT) pipeline in my GCP Databricks workspace, but I am encountering an issue where Unity Catalog is not enabled on the job cluster. Steps I followed: created a DLT pipeline using the Databricks UI; selected the appropria...

Latest Reply
mark_ott
Databricks Employee
  • 0 kudos

The error “Unity Catalog is not enabled on this job cluster” during Delta Live Table (DLT) pipeline execution in your GCP Databricks workspace is a common issue, especially after Unity Catalog onboarding. Your troubleshooting steps cover most essenti...
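One thing worth inspecting is the pipeline's settings JSON: a DLT pipeline is Unity Catalog-managed when its settings carry a `catalog` field (rather than a `storage` location). A minimal sketch of such a payload; the pipeline name, catalog, schema, and notebook path are placeholders:

```python
import json

# Sketch: minimal DLT pipeline settings targeting Unity Catalog.
# Setting "catalog" (instead of "storage") is what makes the pipeline UC-managed.
pipeline_settings = {
    "name": "my_uc_pipeline",          # placeholder
    "catalog": "main",                 # placeholder UC catalog
    "target": "dlt_schema",            # placeholder target schema
    "libraries": [{"notebook": {"path": "/Repos/me/pipelines/bronze"}}],
    "clusters": [{"label": "default", "num_workers": 2}],
}
print(json.dumps(pipeline_settings, indent=2))
```

If an existing pipeline's JSON shows `storage` instead of `catalog`, it was created as a Hive-metastore pipeline, which would explain the "Unity Catalog is not enabled on this job cluster" error regardless of workspace-level UC onboarding.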

Srujanm01
by New Contributor III
  • 3466 Views
  • 1 reply
  • 0 kudos

Databricks Managed RG Storage cost is High

Hi Community, how do I calculate the Databricks storage cost, and where can I see the data that is stored and charged in Databricks? I'm trying to understand the storage cost on a managed resource group, and I'm clueless about the data and where it is stored...

Latest Reply
mark_ott
Databricks Employee
  • 0 kudos

To calculate the storage cost for Databricks in Azure and view the data being stored and charged, you need to consider both the Databricks compute (DBUs) and the storage resources (such as Azure Data Lake Storage or Blob Storage) linked to your Datab...
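For a rough sanity check, the storage piece can be estimated directly from the bytes held in the managed resource group's storage account (visible in the Azure portal under the storage account's metrics). A back-of-envelope sketch; the per-GB rate below is an example placeholder, not a quoted Azure price, so substitute your region and tier's actual rate:

```python
# Sketch: back-of-envelope monthly storage cost from bytes stored.
# PRICE_PER_GB_MONTH is an assumed example rate (USD); use your region's real rate.
PRICE_PER_GB_MONTH = 0.0184

def monthly_storage_cost(total_bytes: int) -> float:
    gb = total_bytes / (1024 ** 3)
    return round(gb * PRICE_PER_GB_MONTH, 2)

print(monthly_storage_cost(5 * 1024 ** 4))  # cost for 5 TiB stored
```

Comparing this estimate against the storage line items in Azure Cost Analysis (filtered to the managed resource group) tells you whether the charge is really blob capacity or something else, such as transactions or snapshots.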

saiV06
by New Contributor III
  • 3550 Views
  • 1 reply
  • 0 kudos

Lakehouse Federation - Unable to connect to Snowflake using "PEM Private Key"

Hi, I'm currently using the Lakehouse Federation feature on Databricks to run queries against a Snowflake data warehouse. Today I'm using a service credential to establish the connection (user id & pwd), but I have to change it to use a private key. I tried us...

Latest Reply
mark_ott
Databricks Employee
  • 0 kudos

To assist with your Databricks Lakehouse Federation to Snowflake using a PEM Private Key, let's clarify the underlying issue. You mentioned that: The connection works with a service credential (user id & password) but fails when switching to the "PE...
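One frequent culprit with PEM key auth is formatting: multi-line PEM input pasted where a single-line base64 body is expected (or vice versa). Whether your connection expects the full PEM or the stripped body is something to verify against the connection docs, but normalizing the key is easy to sketch; the key material below is a fake placeholder:

```python
# Sketch: stripping a PKCS#8 PEM private key down to its single-line base64 body
# (header/footer lines and newlines removed). The sample key is fake.
def pem_body(pem: str) -> str:
    lines = [l.strip() for l in pem.strip().splitlines()]
    return "".join(l for l in lines if l and not l.startswith("-----"))

sample = """-----BEGIN PRIVATE KEY-----
MIIBVAIBADANBg
kqhkiG9w0BAQEF
-----END PRIVATE KEY-----"""
print(pem_body(sample))
```

Also check that the key really is unencrypted PKCS#8 (`BEGIN PRIVATE KEY`, not `BEGIN RSA PRIVATE KEY` or `BEGIN ENCRYPTED PRIVATE KEY`), since Snowflake key-pair auth is picky about the container format.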

ViliamG
by New Contributor
  • 3741 Views
  • 1 reply
  • 0 kudos

MLFlow Tracking versions

Hi team, we are migrating from a self-hosted MLflow Tracking server to the Databricks-hosted one. However, there are concerns about the unclear process of version changes and releases on the Tracking server side. Is there any public information av...

Latest Reply
Louis_Frolio
Databricks Employee
  • 0 kudos

Hey @ViliamG, thanks for raising this. Here's how versioning and client compatibility work for the Databricks-hosted MLflow Tracking service, and where you can track changes publicly. What's publicly available about versions: the Databricks-hosted M...

dataminion01
by New Contributor II
  • 3776 Views
  • 1 reply
  • 0 kudos

DLT constantly failing with time out errors

DLT was working but then started getting timeouts frequently: com.databricks.pipelines.common.errors.deployment.DeploymentException: Failed to launch pipeline cluster xxxxxxxxxxxx: Self-bootstrap timed out during launch. Please try again later and con...

Latest Reply
mark_ott
Databricks Employee
  • 0 kudos

Frequent timeouts and bootstrap errors when launching Databricks Delta Live Table (DLT) pipeline clusters on AWS are usually caused by network connectivity issues, VPC misconfigurations, or resource allocation problems between Databricks' control pla...
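If you suspect egress problems, a quick check from a notebook on an ordinary cluster in the same subnet can show whether the endpoints cluster bootstrap needs are reachable on port 443. A sketch; the hostnames below are placeholders, so substitute the control-plane and regional AWS endpoints documented for your deployment region:

```python
import socket

# Sketch: TCP reachability probe for required egress endpoints.
def reachable(host: str, port: int = 443, timeout: float = 5.0) -> bool:
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # DNS failure, refusal, or timeout all count as unreachable
        return False

# Placeholder endpoint list; use your region's documented required endpoints.
for host in ["sts.amazonaws.com", "s3.us-east-1.amazonaws.com"]:
    print(host, reachable(host))
```

If any required endpoint is unreachable from the cluster subnet, the fix is usually in the VPC route tables, NAT gateway, or security-group egress rules rather than in the pipeline itself.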

Rjdudley
by Honored Contributor
  • 4080 Views
  • 2 replies
  • 1 kudos

Resolved! Lakeflow Connect: can't change general privilege requirements

I want to set up Lakeflow Connect to ETL data from Azure SQL Server (Microsoft SQL Azure (RTM) - 12.0.2000.8 Feb 9 2025) using change tracking (we don't need the data retention of CDC). In the documentation, there is a list of system tables, views ...

Latest Reply
mark_ott
Databricks Employee
  • 1 kudos

You are hitting a known limitation in Azure SQL Database: it does not allow you to grant or modify permissions directly on most system objects, such as system stored procedures, catalog views, or extended stored procedures, resulting in the error "Ms...
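For the pieces Azure SQL Database does allow, the table-level change-tracking setup reduces to a handful of T-SQL statements. A sketch of generating them (as strings you would run against the database); the database, table, and user names are all placeholders:

```python
# Sketch: T-SQL statements for change-tracking-based ingestion on Azure SQL.
# Database, table, and login names below are placeholders.
db, table, user = "mydb", "dbo.orders", "lakeflow_user"
statements = [
    f"ALTER DATABASE [{db}] SET CHANGE_TRACKING = ON "
    f"(CHANGE_RETENTION = 3 DAYS, AUTO_CLEANUP = ON)",
    f"ALTER TABLE {table} ENABLE CHANGE_TRACKING",
    f"GRANT SELECT ON {table} TO [{user}]",
    f"GRANT VIEW CHANGE TRACKING ON OBJECT::{table} TO [{user}]",
]
for s in statements:
    print(s)
```

Note that `GRANT VIEW CHANGE TRACKING` is an object-level grant on your own tables, so it works on Azure SQL Database even though grants on system objects do not.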

1 More Replies