Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

NikeshPurohit
by Databricks Partner
  • 992 Views
  • 1 reply
  • 1 kudos

Unable to connect Claude connector to Databricks MCP server (Genie)

Hello everyone, I am trying to connect Claude’s Custom Connector (Remote MCP) to a Databricks Managed MCP Server using a Genie space, but I keep encountering an authentication connection error. I would appreciate guidance on whether this is a configur...

Latest Reply
emma_s
Databricks Employee
  • 1 kudos

Hi, an update from my previous post: as of two weeks ago there is a public preview for this, and it needs to be enabled in your workspace. You also need to have the Claude IP added to Databricks if you have any IP workspace rest...

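Since the reply above points at workspace IP access restrictions, here is a minimal sketch of building the request body for the Databricks IP access lists API (`POST /api/2.0/ip-access-lists`) to allow an extra range through. The label and CIDR below are placeholders, not Claude's real addresses; substitute the ranges Anthropic publishes.

```python
import json

def build_ip_access_list(label, cidrs):
    """Build the request body for POST /api/2.0/ip-access-lists
    (an ALLOW list so the listed CIDRs can reach the workspace)."""
    return {
        "label": label,
        "list_type": "ALLOW",
        "ip_addresses": list(cidrs),
    }

# The CIDR below is a documentation placeholder (RFC 5737), not a real
# Claude address -- use the ranges Anthropic publishes instead.
payload = build_ip_access_list("claude-connector", ["203.0.113.0/24"])
print(json.dumps(payload, indent=2))
```

The body would then be POSTed to the workspace with a normal authenticated API call; note that adding an ALLOW list only matters if IP access lists are enabled on the workspace at all.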
RikL
by Databricks Partner
  • 304 Views
  • 2 replies
  • 2 kudos

Resolved! Automatic Identity Management with Nested Groups and API Access

Hi all, I’m exploring Automatic Identity Management for synchronizing nested groups in Azure Databricks. According to the documentation, this feature supports nested groups: Automatic Identity Management. However, the same article notes that groups sync...

Latest Reply
emma_s
Databricks Employee
  • 2 kudos

Hi, I've just had a look internally and there is some discussion about making this functionality available, but I can't give you a definitive idea of when that might be. In terms of workarounds, the best one I can find is to use Terracurl to make raw ...

1 More Replies
margarita_shir
by New Contributor III
  • 330 Views
  • 4 replies
  • 0 kudos

Private Route53 hosted zone for cloud.databricks.com

We have Tailscale deployed on the same VPC as our Databricks workspaces. To enable PrivateLink for some workspaces, we created a private Route53 hosted zone for cloud.databricks.com with CNAME records pointing workspace hostnames to the PrivateLink e...

Latest Reply
margarita_shir
New Contributor III
  • 0 kudos

Thank you for your response. I added a new private hosted zone in Amazon Route 53 for privatelink.cloud.databricks.com, and created an A record for us-east-1.privatelink.cloud.databricks.com pointing to the private endpoint IP. I had a question regard...

3 More Replies
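After setting up a private hosted zone like the one in this thread, a quick sanity check is whether the workspace hostname now resolves to a private (RFC 1918) address, i.e. the PrivateLink endpoint rather than the public front end. A small sketch using only the standard library; the hostname in the comment is hypothetical:

```python
import ipaddress
import socket

def resolves_to_private(ip_str):
    """True if the address is in a private range, which is what we
    expect when DNS points at the PrivateLink endpoint."""
    return ipaddress.ip_address(ip_str).is_private

# Hypothetical workspace hostname -- substitute your own deployment:
# ip = socket.gethostbyname("my-workspace.cloud.databricks.com")
# print(ip, resolves_to_private(ip))

print(resolves_to_private("10.0.12.34"))   # a private endpoint IP
print(resolves_to_private("44.201.0.10"))  # a public AWS address
```

Running this from a host inside the VPC versus outside it is a simple way to confirm the split-horizon DNS behaves as intended.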
lubiarzm1
by Contributor
  • 554 Views
  • 3 replies
  • 1 kudos

Resolved! [AZURE] Using a self-managed Storage Account instead of the default Databricks file storage

Hi Team, I’m encountering an issue with my architecture. I’m using VNet Injection and serverless with NCC. Our organization has a firewall controlling both inbound and outbound traffic. I want to restrict all traffic to private networks only, but I ran...

[Screenshot: lubiarzm1_0-1771241572642.png]
Latest Reply
SteveOstrowski
Databricks Employee
  • 1 kudos

Hi TENDK, That is expected behavior and does not necessarily mean something is wrong. Here is what is happening: When you run nslookup from inside a Databricks notebook, the notebook is executing on a cluster that sits inside the Databricks-managed V...

2 More Replies
kohei-matsumura
by Databricks Partner
  • 364 Views
  • 2 replies
  • 1 kudos

Resolved! Best practices for health monitoring

Status page: https://status.databricks.com/ REST API: https://docs.databricks.com/api/workspace/workspace/getstatus I'm trying to perform health monitoring using the above status page and API, but is this the best method? If the API returns an error ...

Latest Reply
aleksandra_ch
Databricks Employee
  • 1 kudos

Hi @kohei-matsumura , You can subscribe to the status with different methods, including a Webhook. You should provide a URL of a service which will receive POST calls every time there is a new event. You can also subscribe through Email or Slack. Che...

1 More Replies
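The reply above suggests subscribing via a webhook that receives a POST per status event. A minimal sketch of parsing such an event; the payload shape here is an assumption modeled on Statuspage-style incident webhooks, so verify it against an event your endpoint actually receives:

```python
import json

def summarize_status_event(body):
    """Pull the fields worth alerting on out of a status webhook POST.
    The {"incident": {"name": ..., "status": ...}} shape is an
    assumption -- check it against a real delivered event."""
    event = json.loads(body)
    incident = event.get("incident", {})
    return {
        "name": incident.get("name"),
        "status": incident.get("status"),
    }

sample = '{"incident": {"name": "Elevated API errors", "status": "investigating"}}'
print(summarize_status_event(sample))
```

Compared with polling the getstatus API, a webhook avoids the ambiguity the original post raises: if the API itself errors out, you cannot tell an outage from a transient network failure, whereas pushed events arrive independently of your polling path.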
bdanielatl
by New Contributor II
  • 2405 Views
  • 4 replies
  • 3 kudos

Resolved! Markdown Cells Do Not Render Consistently

When I am creating a notebook in the UI editor on Databricks, markdown cells do not always render after I run them. They still appear in 'editing mode'. See the screenshot below; it should have rendered an H1. Again, this behavior is not consistent. So...

[Screenshot: bdanielatl_0-1736861864741.png]
Latest Reply
Saf4Databricks
Contributor
  • 3 kudos

Hi @bdanielatl I had the same issue. One suggestion from Google AI, refreshing the browser, helped me resolve the issue. Another of its suggestions was to clear the browser cache.

3 More Replies
ismaelhenzel
by Contributor III
  • 352 Views
  • 2 replies
  • 2 kudos

Resolved! Terraform folder structure and states

Hi everyone, I have a few questions regarding Terraform folder structure and state management for Databricks, and I'd love to get your opinions. For context, our Databricks environment is already deployed and Unity Catalog is configured. The Terraform...

Latest Reply
Ashwin_DSA
Databricks Employee
  • 2 kudos

Hi @ismaelhenzel, In terms of your first question, Terraform automatically loads all *.tf files in the same directory, so it’s common practice to organise them by concern. For example,   envs/ prod/ backend.tf # remote state config ...

1 More Replies
wonoh90
by New Contributor
  • 250 Views
  • 1 reply
  • 0 kudos

Resolved! Accidentally Removed myself as account admin

Dear Support Team, I am currently on a 14-day free trial of the Premium Edition and have accidentally removed myself as the account administrator. As I was the only admin, I am now locked out and unable to log in to the account. Could you please help...

Latest Reply
Ashwin_DSA
Databricks Employee
  • 0 kudos

Hi @wonoh90 Thanks for reaching out and sorry you’re running into this. Since this involves restoring admin access at the account level, it’s not something the community can fix directly. You’ll need to open a support ticket with Databricks so the su...

LokeshChikuru
by Databricks Partner
  • 394 Views
  • 2 replies
  • 3 kudos

Resolved! Best practice: Using Databricks managed storage vs customer‑owned ADLS for enterprise production data

We are currently setting up Azure Databricks for enterprise analytics and wanted to validate our storage architecture against Databricks best practices. Today, we are ingesting data directly from external enterprise sources (Oracle DB, SQL Server, etc...

Latest Reply
LokeshChikuru
Databricks Partner
  • 3 kudos

@emma_s Thanks for the update and appreciate the immediate response.

1 More Replies
saiV06
by New Contributor III
  • 3493 Views
  • 1 reply
  • 0 kudos

Lakehouse Federation - Unable to connect to Snowflake using "PEM Private Key"

Hi, I'm currently using the Lakehouse Federation feature on Databricks to run queries against a Snowflake data warehouse. Today I'm using a service credential to establish the connection (user id & pwd), but I have to change it to use a private key. I tried us...

Administration & Architecture
FederatedQuery
LakehouseFederation
SnowflakeConnection
Latest Reply
pradeep_singh
Contributor III
  • 0 kudos

Can you check if you have encrypted your keys? The Snowflake JDBC driver does not support authentication with encrypted private keys: https://docs.databricks.com/aws/en/query-federation/snowflake-pem#limitations. Try generating the key with no encryp...

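Following the reply above, a quick way to tell whether the PEM you handed to the connection is encrypted is to look at its header: an encrypted PKCS#8 key begins with `BEGIN ENCRYPTED PRIVATE KEY`, and legacy PKCS#1 keys carry a `Proc-Type: 4,ENCRYPTED` header. A small heuristic sketch:

```python
def pem_key_is_encrypted(pem_text):
    """Heuristic header check: encrypted PKCS#8 keys start with
    'BEGIN ENCRYPTED PRIVATE KEY'; encrypted legacy PKCS#1 keys
    carry a 'Proc-Type: 4,ENCRYPTED' header instead."""
    return ("BEGIN ENCRYPTED PRIVATE KEY" in pem_text
            or "Proc-Type: 4,ENCRYPTED" in pem_text)

encrypted = "-----BEGIN ENCRYPTED PRIVATE KEY-----\n..."
plain = "-----BEGIN PRIVATE KEY-----\n..."
print(pem_key_is_encrypted(encrypted))
print(pem_key_is_encrypted(plain))
```

If the check comes back true, re-export the key without a passphrase (for example via OpenSSL's `pkcs8 -topk8 -nocrypt`) before retrying the federation connection.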
learti
by New Contributor III
  • 276 Views
  • 1 reply
  • 1 kudos

Resolved! Downgrade from Enterprise to Premium Plan

We upgraded to Enterprise to test something, and we now want to switch back to the Premium plan. How can I achieve this? Am I forced to create a new account and delete this one?

Latest Reply
Lu_Wang_ENB_DBX
Databricks Employee
  • 1 kudos

No need to cancel your account. There is no self-service option to downgrade your account. Please submit a support ticket and work with your Databricks account team to downgrade.

vziog
by New Contributor III
  • 821 Views
  • 4 replies
  • 0 kudos

Databricks Default Catalog with House Icon

Hi everyone, I have a question regarding workspace catalogs in Databricks with Unity Catalog. In our setup, when a new workspace is created and automatically assigned to a Unity Catalog metastore (with automatic workspace catalog creation enabled), a c...

[Screenshot: vziog_0-1772791652417.png]
Latest Reply
SteveOstrowski
Databricks Employee
  • 0 kudos

Hi @vziog, I can see the earlier suggestion about using the kebab menu Rename didn't fully solve the problem. There are actually a few distinct concepts at play here, and understanding the difference is the key. WHAT IS THE WORKSPACE CATALOG (HOUSE I...

3 More Replies
PradeepPrabha
by New Contributor III
  • 2376 Views
  • 6 replies
  • 0 kudos

Resolved! Any documentation on connectivity from an Azure SQL database to Azure Databricks

Any documentation available to connect from the Azure SQL database to Azure Databricks SQL workspace. We created a SQL warehouse personal access token for a user in a different team who can connect from his on-prem SQL DB to Databricks using the conn...

Latest Reply
PradeepPrabha
New Contributor III
  • 0 kudos

Thank you for the detailed answer

5 More Replies
PradeepPrabha
by New Contributor III
  • 368 Views
  • 2 replies
  • 1 kudos

Resolved! Any recommended way for a different app to start its dependent job based on a Databricks job?

How can we configure a job in a different Azure application to be triggered after the completion of an Azure Databricks job? Once the Databricks job is successful, the job in the third-party application hosted in Azure should start. I attempted to us...

Latest Reply
PradeepPrabha
New Contributor III
  • 1 kudos

Thank you for the detailed answer! I have tested the Azure Function approach and an Azure runbook as well; both work fine. I also tested the option of adding it as the final task with a condition of "if all other notebooks" successful, the...

1 More Replies
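One of the options discussed in this thread is a final task that fires a webhook to the downstream app once every other task has succeeded. A minimal sketch of building that POST; the endpoint URL and payload fields are hypothetical, so shape them to whatever the receiving application expects:

```python
import json
import urllib.request

def build_completion_request(url, job_name, run_id):
    """Build the POST a final notebook task could send to a downstream
    endpoint once all other tasks have succeeded. The payload fields
    are hypothetical placeholders."""
    body = json.dumps({"job": job_name, "run_id": run_id,
                       "status": "SUCCESS"}).encode()
    return urllib.request.Request(url, data=body, method="POST",
                                  headers={"Content-Type": "application/json"})

req = build_completion_request(
    "https://example.azurewebsites.net/api/on-job-done",  # hypothetical
    "nightly-etl", 12345)
print(req.full_url, req.get_method())
# To actually fire it from the final task:
# urllib.request.urlopen(req, timeout=10)
```

Combined with a "Run if: all succeeded" dependency on the final task, this only notifies the downstream system on a fully successful run.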
bradleyjamrozik
by New Contributor III
  • 21432 Views
  • 5 replies
  • 1 kudos

Resolved! databricks OAuth is not supported for this host

I'm trying to deploy using Databricks Asset Bundles via an Azure DevOps pipeline. I keep getting this error when trying to use OAuth: Error: default auth: oauth-m2m: oidc: databricks OAuth is not supported for this host. Config: host=https://<workspac...

Latest Reply
saadansari-db
Databricks Employee
  • 1 kudos

Hi @bradleyjamrozik, thank you for posting your question. You will need to use ARM_ variables to make it work. Specifically: ARM_CLIENT_ID, ARM_TENANT_ID, ARM_CLIENT_SECRET. https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth#environment-3 f...

4 More Replies
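Since the fix in this thread is supplying the three ARM_ service-principal variables, a pipeline can fail fast with a clear message instead of hitting the opaque "OAuth is not supported for this host" error. A small pre-flight sketch to run before `databricks bundle deploy`:

```python
import os

REQUIRED = ("ARM_CLIENT_ID", "ARM_TENANT_ID", "ARM_CLIENT_SECRET")

def missing_arm_vars(env=None):
    """Names of the Azure service-principal variables that are absent
    from the environment (or the mapping passed in for testing)."""
    env = os.environ if env is None else env
    return [name for name in REQUIRED if not env.get(name)]

# Simulated pipeline environment with one variable missing:
sample_env = {"ARM_CLIENT_ID": "xxx", "ARM_TENANT_ID": "yyy"}
print(missing_arm_vars(sample_env))
```

In an Azure DevOps step this would run with no argument (reading the real environment), and a non-empty result would abort the job before the bundle deploy is attempted.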