Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

SylvainW
by New Contributor III
  • 507 Views
  • 5 replies
  • 3 kudos

How to embed a databricks dashboard in an iframe when running on localhost?

Hello! I'm working on embedding one of our Databricks dashboards into an iframe, but even after adding localhost to the Approved domains list, I'm getting the message that embedding for this domain isn't possible (see below). We need to be able to devel...

Latest Reply
SylvainW
New Contributor III
  • 3 kudos

So for tunnelling, something like what is described here: https://dev.to/tahsin000/free-services-to-expose-localhost-to-https-a-comparison-5c19

4 More Replies
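To try the embed locally before a tunnel is in place, a minimal sketch that renders a test page with the dashboard in an iframe. The workspace URL, dashboard ID, and embed path below are placeholders/assumptions, not values from this thread; check the embed URL your workspace actually generates.

```python
# Minimal sketch: build a local test page that embeds a dashboard in an
# iframe. Workspace URL, dashboard ID, and the /embed/... path are
# placeholders -- substitute the embed URL shown in your workspace.
def embed_page(workspace_url: str, dashboard_id: str) -> str:
    """Return an HTML page embedding a published dashboard."""
    src = f"{workspace_url}/embed/dashboardsv3/{dashboard_id}"
    return (
        "<!DOCTYPE html><html><body>"
        f'<iframe src="{src}" width="100%" height="600"></iframe>'
        "</body></html>"
    )

html = embed_page("https://example.cloud.databricks.com", "abc123")
```

Serve the page from the host you added to the Approved domains list (per the thread, a public tunnel URL rather than localhost) and check whether the iframe loads.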
boitumelodikoko
by Valued Contributor
  • 303 Views
  • 1 reply
  • 1 kudos

Clarification on Data Privacy with ai_query Models

Hi everyone, We've had a client ask about the use of the Claude 3.7 Sonnet model (and others) in the Databricks SQL editor via the ai_query function. Specifically, they want to confirm whether any data passed to these models is ringfenced, i.e., not ...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 1 kudos

Hi @boitumelodikoko, The documentation you've provided is official confirmation by Databricks (otherwise they wouldn't put it in public documentation in the first place). Every customer that uses AI functions within Databricks should expect that any ...

RossMacrae777
by New Contributor II
  • 1651 Views
  • 4 replies
  • 1 kudos

Connecting to Databricks from Workato. JDBCDriver 500593 Communication Link failure

I'm trying to connect to Databricks from Workato, to pull data in as part of a Workato Recipe. I'm getting the following error when I test the connection: "Database bridge error: Failed since could not connect to the database - Failed to initialize p...

Latest Reply
nicolai_nqbt
New Contributor II
  • 1 kudos

Hi @WiliamRosa, I am in fact trying to connect via the recommended method of using the built-in Databricks connector. When I provide all the relevant details for the configuration I get the error mentioned in the original post. I don't understand why i ...

3 More Replies
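When debugging connections like this, it can help to double-check the JDBC URL the tool is assembling. A hedged sketch of building one: the host and HTTP path are placeholders, and the parameter names follow common Databricks JDBC driver conventions (verify them against your driver version's documentation).

```python
# Sketch of assembling a Databricks JDBC URL for a third-party tool such as
# Workato. Host and httpPath values are placeholders; parameter names are
# the usual Databricks JDBC driver ones (confirm for your driver version).
def jdbc_url(host: str, http_path: str) -> str:
    return (
        f"jdbc:databricks://{host}:443/default;"
        f"transportMode=http;ssl=1;AuthMech=3;httpPath={http_path}"
    )

url = jdbc_url("dbc-xxxx.cloud.databricks.com", "/sql/1.0/warehouses/abc")
```

A communication-link failure often means the host, port 443, or the httpPath segment is wrong, or the network path to the workspace is blocked, so checking each URL component is a cheap first step.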
toko_chi
by New Contributor II
  • 713 Views
  • 2 replies
  • 2 kudos

Resolved! Network Connectivity Configurations : "Default rules" tab not visible

Hi, I'm trying to enable static outbound IPs for Serverless Compute in order to whitelist them on an external SaaS. I created an NCC in the same AWS region as my workspace (customer-managed VPC). According to the documentation (Serverless Firewall), I expe...

Latest Reply
Advika
Databricks Employee
  • 2 kudos

Hello @toko_chi! Yes, as noted in the documentation, this feature is currently in Public Preview and requires enablement by your Databricks account team. Since you're not seeing the Default rules tab and your API response shows an empty egress_config,...

1 More Replies
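Since the diagnosis above hinges on the API response showing an empty egress_config, a small check like the following can flag the condition programmatically. The JSON shape is an assumption based on the thread, not the documented schema; treat the field names as illustrative.

```python
# Hypothetical sketch: inspect an NCC API response payload for default
# egress rules. The payload shape (egress_config / default_rules) is an
# assumption based on this thread; an empty egress_config suggests the
# Public Preview feature has not been enabled for the account.
def has_default_rules(ncc: dict) -> bool:
    egress = ncc.get("egress_config") or {}
    return bool(egress.get("default_rules"))

ncc_response = {"name": "my-ncc", "egress_config": {}}  # placeholder payload
```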
noorbasha534
by Valued Contributor II
  • 254 Views
  • 1 reply
  • 0 kudos

Unity Catalog Backup

Dear all, Has anyone explored backing up Unity Catalog? Because of gaps in the security model of Databricks, we had to make certain user groups the schema owners, and schema owners can drop the schema. I always wonder how to recover the schem...

Latest Reply
WiliamRosa
Contributor
  • 0 kudos

Hi @noorbasha534, Currently there's no native backup/restore. The pragmatic approach is: treat Unity Catalog metadata as code (Terraform / SQL DDL checked into Git), regularly export UC object definitions with the REST API, and lock down schema ownership s...

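The "export UC object definitions with the REST API" idea can be sketched as below. The list-schemas endpoint path is from the Unity Catalog REST API; the fetcher is injected so the same logic runs against a real workspace client or a test stub.

```python
import json

# Sketch of exporting Unity Catalog schema definitions for one catalog so
# they can be versioned in Git. `fetch` is any callable that GETs a
# REST path and returns the parsed JSON body (real client or stub).
def export_schemas(fetch, catalog: str) -> str:
    """Return schema definitions for one catalog as a stable JSON string."""
    payload = fetch(f"/api/2.1/unity-catalog/schemas?catalog_name={catalog}")
    return json.dumps(payload.get("schemas", []), indent=2, sort_keys=True)
```

Running this on a schedule and committing the output gives a point-in-time record of definitions, though note it captures metadata only, not the underlying table data.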
dbdev
by Contributor
  • 779 Views
  • 4 replies
  • 2 kudos

Resolved! VNet Injected Workspace trouble connecting to the Storage Account of a catalog (UC),

Hi! I have some issues with setting up my workspace and storage account within a virtual network and letting them connect to each other. Background setup (all done in Terraform): - Databricks workspace with VNet injection and Unity Catalog enabled. (3...

Latest Reply
dbdev
Contributor
  • 2 kudos

It was correctly linked. Turned out we were missing one extra private endpoint of type 'dfs'. So for our storage account we needed to create 2 private endpoints, both configured on the same subnet, with one of subresource type 'blob' and one subresource...

3 More Replies
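The resolution above (the storage account needs private endpoints for both the 'blob' and 'dfs' subresources) can be encoded as a small preflight check over whatever endpoint records you track, e.g. from Terraform state. The dict shape here is illustrative, not a Terraform or Azure schema.

```python
# Sketch: verify a UC storage account has private endpoints for both
# required subresource types, per the fix in this thread. The input is a
# list of simple dicts like {"subresource": "blob"} (illustrative shape).
REQUIRED_SUBRESOURCES = {"blob", "dfs"}

def missing_subresources(endpoints: list[dict]) -> set:
    present = {e.get("subresource") for e in endpoints}
    return REQUIRED_SUBRESOURCES - present
```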
davide-maestron
by New Contributor
  • 325 Views
  • 3 replies
  • 3 kudos

Databricks Workspace APIs

Hello, I'm trying to collect metadata about the files stored in a Workspace Repo Folder, but it looks like the "size" field is always missing. The field is listed in a few SDKs (for example, in the Python documentation: https://databricks-sdk-py.read...

Latest Reply
WiliamRosa
Contributor
  • 3 kudos

@szymon_dybczak, Help me understand if this makes sense: what if I use the Files API instead of the Workspace API to get the size?

2 More Replies
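On the Files API idea: that API exposes file metadata (including size via the Content-Length response header) on a HEAD request to /api/2.0/fs/files/{path}. Whether it covers workspace repo paths, as opposed to volume paths, is worth verifying for this use case; the parser below is just a minimal sketch of reading the size from the returned headers.

```python
# Sketch: extract a file size from HTTP response headers, e.g. the headers
# of a HEAD request to the Files API (/api/2.0/fs/files/{path}). Returns
# None when the header is absent, mirroring the missing `size` field the
# Workspace API sometimes returns.
def size_from_headers(headers: dict) -> "int | None":
    value = headers.get("Content-Length")
    return int(value) if value is not None else None
```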
alex_svyatets
by New Contributor II
  • 484 Views
  • 2 replies
  • 1 kudos

Verbose logging

Hello! I want to audit access to some Delta tables in Unity Catalog from jobs and notebooks. We enabled verbose audit logging, but not all queries that were run from notebooks are written to the system.access.audit, column_lineage, and table_lineage tables.

Latest Reply
alex_svyatets
New Contributor II
  • 1 kudos

These are queries against a registered table name.

1 More Replies
vishnuvardhan
by New Contributor II
  • 702 Views
  • 6 replies
  • 2 kudos

Getting 403 Forbidden Error When Joining Tables Across Two Unity Catalogs in the Same Workspace

Hi everyone, I'm facing an unusual issue in my Databricks environment and would appreciate your guidance. I'm using a consumer workspace with access to two different Unity Catalogs. I can successfully query tables from both catalogs individually wit...

Latest Reply
BS_THE_ANALYST
Esteemed Contributor II
  • 2 kudos

Hi @vishnuvardhan, I appreciate the information isn't directly going to answer your question, but I wanted to share what I've found. Firstly, I've not heard of a consumer Databricks workspace prior to your post. I'm not sure if you've looked into the ...

5 More Replies
eoferreira
by New Contributor
  • 423 Views
  • 2 replies
  • 4 kudos

Lakebase security

Hi team, We are using Databricks Enterprise and noticed that our Lakebase instances are exposed to the public internet. They can be reached through the JDBC endpoint with only basic username and password authentication. Is there a way to restrict acce...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 4 kudos

Hi @eoferreira, The thing is that Lakebase is still in public preview and there isn't anything in the docs regarding private connectivity. I'm quite sure they will add an option to disable public access in the near future, but for now I think it's not...

1 More Replies
fhameed
by New Contributor
  • 665 Views
  • 2 replies
  • 0 kudos

Unity Catalog federation with Snowflake-managed Iceberg table fails with Universal Format conversion

Hello all, Error: [DELTA_UNIVERSAL_FORMAT_CONVERSION_FAILED] Failed to convert the table version 3 to the universal format iceberg. Clone validation failed - Size and number of data files in target table should match with source table. srcTableSize: 7...

Latest Reply
mani_22
Databricks Employee
  • 0 kudos

@fhameed The error occurs if the Iceberg metadata written by Snowflake does not match the number of files in object storage. When attempting to read the table in Databricks, there is a verification process that checks whether the Iceberg metadata ...

1 More Replies
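The verification described in the reply, comparing files declared in the Iceberg metadata against files actually present in object storage, can be illustrated with a toy reconciliation. The inputs are placeholder path lists, not real Iceberg manifest parsing.

```python
# Toy sketch of the check described above: diff the data files declared by
# Iceberg metadata against the files listed in object storage. Real code
# would read the Iceberg manifests and list the cloud storage prefix; here
# both sides are plain lists of path strings.
def reconcile(metadata_files: list, storage_files: list) -> dict:
    meta, store = set(metadata_files), set(storage_files)
    return {
        "missing_in_storage": meta - store,      # declared but not found
        "untracked_in_storage": store - meta,    # found but not declared
    }
```

A non-empty diff on either side would trip the kind of clone-validation failure quoted in the original post.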
Marco37
by Contributor
  • 401 Views
  • 3 replies
  • 6 kudos

Resolved! Add a tag to a catalog with REST API

Good day, I have an Azure DevOps pipeline with which I provision our Databricks workspaces. With this pipeline I also create the catalogs through the Databricks API (/api/2.1/unity-catalog/catalogs). Create a catalog | Catalogs API | REST API reference ...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 6 kudos

Hi @Marco37, Unfortunately, I believe this is not currently possible through the REST API. The only options you have are via the UI or using SQL. Unfortunately, it's also not possible to do this using Terraform. There's a PR that has been open for some time, bu...

2 More Replies
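Since SQL is one of the supported routes the reply mentions, a pipeline can render the tagging DDL and execute it through a SQL warehouse instead of the REST API. A minimal sketch of building the statement; the quoting here is simplistic and illustrative only (no escaping of user input).

```python
# Sketch: render an ALTER CATALOG ... SET TAGS statement for a provisioning
# pipeline to execute via a SQL warehouse. Quoting is naive and for
# illustration only; do not feed untrusted strings through this.
def set_tags_sql(catalog: str, tags: dict) -> str:
    pairs = ", ".join(f"'{k}' = '{v}'" for k, v in tags.items())
    return f"ALTER CATALOG {catalog} SET TAGS ({pairs})"
```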
KingAJ
by New Contributor II
  • 215 Views
  • 1 reply
  • 0 kudos

Is There a Way to Test Subscriptions to the Databricks Status Page?

Hi everyone,I’ve subscribed to the Databricks Status page and set up a webhook for notifications. I’m wondering if there’s any way to test that the subscription is working correctly—perhaps by triggering a test notification or receiving a sample JSON...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 0 kudos

Hi @KingAJ, There is no such mechanism that Databricks provides, but you can try a similar approach to the one below and test it on your own: https://github.com/jvargh/Webhooks-For-Status-Alerts. So, basically this PowerShell script generates pay...

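In the spirit of the linked PowerShell approach, the same self-test can be done from Python: hand-craft a status-page-style payload and POST it to your own webhook receiver to confirm the delivery path works end to end. The payload fields below are illustrative, not the official status-page schema.

```python
import json

# Sketch: build a sample incident payload to POST at your own webhook
# receiver as a self-test. Field names are illustrative assumptions, not
# the official Databricks status-page webhook schema.
def sample_incident(name: str, status: str) -> str:
    return json.dumps({"incident": {"name": name, "status": status}})
```

Usage would be along the lines of `requests.post(webhook_url, data=sample_incident("Self-test", "investigating"), headers={"Content-Type": "application/json"})`, which exercises your receiver without waiting for a real incident.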
varni
by New Contributor III
  • 648 Views
  • 1 reply
  • 1 kudos

Resolved! Databricks Apps On Aws

Context: Migrating from Azure Databricks (Premium) to AWS Databricks (Premium) in eu-west-2 (London) with Unity Catalog attached. On Azure, Databricks Apps are available (Compute → Apps and New → App). Goal: run the same Streamlit apps on AWS. What we...

Latest Reply
varni
New Contributor III
  • 1 kudos

[Resolved] The root cause was that Databricks Apps (including Serverless Apps) are not available in all AWS regions. My primary workspace was created in eu-west-2 (London), where this feature is not supported. After creating a new workspace in eu-west...
