Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

alex_svyatets
by New Contributor II
  • 594 Views
  • 2 replies
  • 1 kudos

Verbose logging

Hello! I want to audit access to some Delta tables in Unity Catalog from jobs and notebooks. We enabled verbose audit logging, but not all queries run from notebooks are written to the system.access.audit, column_lineage, and table_lineage tables

Latest Reply
alex_svyatets
New Contributor II
  • 1 kudos

They are queries against the registered table name.

1 More Replies
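For reference, events that do land in the audit log can be inspected with a query along these lines. This is a hedged sketch: the `request_params` key `table_full_name` and the `commandText` field are assumptions about the event shape (adjust to the events you actually see), and the helper only builds the SQL text, so it runs anywhere.

```python
# Hypothetical sketch: build a query against system.access.audit for events
# that touched one table. The request_params keys used here are assumptions.
def audit_query(full_table_name: str) -> str:
    return (
        "SELECT event_time, user_identity.email, action_name, "
        "request_params.commandText AS command_text "
        "FROM system.access.audit "
        "WHERE service_name = 'unityCatalog' "
        f"AND request_params.table_full_name = '{full_table_name}' "
        "ORDER BY event_time DESC"
    )

print(audit_query("main.sales.orders"))
```

In a notebook you would pass the resulting string to `spark.sql(...)`.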
vishnuvardhan
by New Contributor II
  • 962 Views
  • 6 replies
  • 2 kudos

Getting 403 Forbidden Error When Joining Tables Across Two Unity Catalogs in the Same Workspace

 Hi everyone, I’m facing an unusual issue in my Databricks environment and would appreciate your guidance. I’m using a consumer workspace with access to two different Unity Catalogs. I can successfully query tables from both catalogs individually wit...

Latest Reply
BS_THE_ANALYST
Esteemed Contributor III
  • 2 kudos

Hi @vishnuvardhan, I appreciate this information isn't going to answer your question directly, but I wanted to share what I've found. Firstly, I hadn't heard of a consumer Databricks workspace prior to your post. I'm not sure if you've looked into the ...

5 More Replies
fhameed
by New Contributor
  • 796 Views
  • 2 replies
  • 0 kudos

Unity Catalog federation with Snowflake-managed Iceberg table fails with Universal Format conversion

Hello All, Error: [DELTA_UNIVERSAL_FORMAT_CONVERSION_FAILED] Failed to convert the table version 3 to the universal format iceberg. Clone validation failed - Size and number of data files in target table should match with source table. srcTableSize: 7...

Latest Reply
mani_22
Databricks Employee
  • 0 kudos

@fhameed The error occurs if the Iceberg metadata written by Snowflake does not match the number of files in object storage. When attempting to read the table in Databricks, there is a verification process that checks to see if the Iceberg metadata ...

1 More Replies
Marco37
by Contributor II
  • 460 Views
  • 3 replies
  • 6 kudos

Resolved! Add a tag to a catalog with REST API

Good day, I have an Azure DevOps pipeline with which I provision our Databricks workspaces. With this pipeline I also create the catalogs through the Databricks API (/api/2.1/unity-catalog/catalogs): Create a catalog | Catalogs API | REST API reference ...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 6 kudos

Hi @Marco37, unfortunately I believe this is not currently possible through the REST API. The only options you have are the UI or SQL. It's also not possible to do this using Terraform. There's a PR that has been open for some time, bu...

2 More Replies
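Since the reply points to SQL as a working route, a provisioning pipeline could fall back to generating `ALTER CATALOG ... SET TAGS` statements and executing them (for example through the SQL Statement Execution API). A minimal sketch, with the catalog name and tags as placeholders; note the helper does no escaping of untrusted input:

```python
# Build an ALTER CATALOG ... SET TAGS statement (Unity Catalog SQL syntax).
# Catalog and tag values here are illustrative placeholders.
def set_catalog_tags_sql(catalog: str, tags: dict[str, str]) -> str:
    pairs = ", ".join(f"'{k}' = '{v}'" for k, v in tags.items())
    return f"ALTER CATALOG {catalog} SET TAGS ({pairs})"

print(set_catalog_tags_sql("analytics", {"env": "prod", "owner": "platform"}))
# → ALTER CATALOG analytics SET TAGS ('env' = 'prod', 'owner' = 'platform')
```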
KingAJ
by New Contributor II
  • 237 Views
  • 1 reply
  • 0 kudos

Is There a Way to Test Subscriptions to the Databricks Status Page?

Hi everyone, I've subscribed to the Databricks Status page and set up a webhook for notifications. I'm wondering if there's any way to test that the subscription is working correctly, perhaps by triggering a test notification or receiving a sample JSON...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 0 kudos

Hi @KingAJ, there is no such mechanism that Databricks provides, but you can try a similar approach to the one below and test it on your own: https://github.com/jvargh/Webhooks-For-Status-Alerts. Basically, this PowerShell script generates pay...

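In the absence of a built-in test notification, one self-serve option is to POST a hand-made payload to your own receiver and confirm it parses. The sketch below stands up a throwaway local receiver and sends it an illustrative incident payload; the payload fields are assumptions, not an official status-page schema.

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

received = []  # payloads the fake receiver has seen

class Hook(BaseHTTPRequestHandler):
    def do_POST(self):
        body = self.rfile.read(int(self.headers["Content-Length"]))
        received.append(json.loads(body))
        self.send_response(200)
        self.end_headers()
    def log_message(self, *args):  # silence request logging
        pass

# Bind to an ephemeral local port and serve in the background.
server = HTTPServer(("127.0.0.1", 0), Hook)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Illustrative payload shape, not an official schema.
payload = {"incident": {"name": "Test incident", "status": "investigating"}}
req = urllib.request.Request(
    f"http://127.0.0.1:{server.server_port}/",
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)
urllib.request.urlopen(req)
server.shutdown()
print(received[0]["incident"]["status"])  # → investigating
```

Pointing the same POST at your real webhook endpoint exercises the whole delivery path except Databricks' sender.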
varni
by New Contributor III
  • 723 Views
  • 1 reply
  • 1 kudos

Resolved! Databricks Apps On Aws

Context: Migrating from Azure Databricks (Premium) to AWS Databricks (Premium) in eu-west-2 (London) with Unity Catalog attached. On Azure, Databricks Apps are available (Compute → Apps and New → App). Goal: run the same Streamlit apps on AWS. What we...

Latest Reply
varni
New Contributor III
  • 1 kudos

[Resolved]. The root cause was that Databricks Apps (including Serverless App) are not available in all AWS regions. My primary workspace was created in eu-west-2 (London), where this feature is not supported. After creating a new workspace in eu-west...

lexa_koszegi
by New Contributor III
  • 796 Views
  • 5 replies
  • 1 kudos

Cross-region serverless compute network access to Azure storage account

Given we have Unity Catalog and an Azure Databricks workspace with both in Azure west us region, and we want to allow serverless compute to access data in catalogs that use external locations on an Azure Storage account in west us 3, how can we get t...

Latest Reply
lexa_koszegi
New Contributor III
  • 1 kudos

Turns out we don't have a Databricks-managed NAT Gateway because our workspace is deployed in our own VNet and we have SCC enabled. I opened a ticket with Microsoft Support and will be working with them today; if we get it figured out I'll share the i...

4 More Replies
ysdtmy
by New Contributor III
  • 1494 Views
  • 6 replies
  • 2 kudos

Resolved! Delta Sharing Error from Azure Databricks - "received more than two lines"

Hello, I am trying to query a Delta table located on AWS S3 from Azure Databricks using Delta Sharing. My setup includes a Delta Sharing server running on AWS Fargate. The server itself is running correctly, and I can successfully query it from my loca...

Latest Reply
ysdtmy
New Contributor III
  • 2 kudos

I was able to connect after changing the delta-sharing-server version to 1.1.0.Thank you for your kind help!

5 More Replies
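For anyone reproducing this setup: the client side of a self-hosted Delta Sharing server is configured through a profile file. The keys below follow the documented profile format; the endpoint URL and token are placeholders for your own server.

```python
import json

# Illustrative Delta Sharing profile ("config.share"). Endpoint and token
# are placeholders, not real credentials.
profile = {
    "shareCredentialsVersion": 1,
    "endpoint": "https://my-sharing-server.example.com/delta-sharing/",
    "bearerToken": "<token>",
}
profile_json = json.dumps(profile, indent=2)
print(profile_json)
```

As the resolution in this thread shows, a matching server/client version pair matters too; pinning delta-sharing-server to 1.1.0 resolved the "received more than two lines" error here.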
debal
by New Contributor
  • 1101 Views
  • 2 replies
  • 0 kudos

Terraform Databricks Integration - specially for Unity Catalog in AWS S3

We are attempting to provision Unity Catalog using Terraform, but we're encountering issues with establishing authentication with AWS through IAM Roles and Policies. For EC2/Cluster instances, the instance profile works fine with a trust relationship ...

Latest Reply
sumit-sampang
New Contributor II
  • 0 kudos

Is this working as tested? I'm getting an error: Error: Self-referential block on index.tf line 31, in resource "aws_iam_role" "reader": 31: "arn:aws:iam::${data.aws_caller_identity.current.account_id}:role/${aws_iam_role.reader.name}" Conf...

1 More Replies
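The "Self-referential block" error comes from referencing `aws_iam_role.reader.name` inside that same role's assume-role policy. The usual workaround is to compose the role's ARN from a plain name string plus the caller's account id, so the policy no longer depends on the resource it belongs to. A sketch of the resulting trust-policy document, written in Python purely for illustration (account id and role name are placeholders):

```python
import json

def self_assume_policy(account_id: str, role_name: str) -> dict:
    """Trust policy letting a role assume itself, built from plain strings
    rather than a reference back to the role resource."""
    role_arn = f"arn:aws:iam::{account_id}:role/{role_name}"
    return {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": {"AWS": role_arn},  # no resource self-reference
            "Action": "sts:AssumeRole",
        }],
    }

print(json.dumps(self_assume_policy("123456789012", "uc-reader"), indent=2))
```

In Terraform itself this means interpolating a variable or local (e.g. `var.role_name`) in both the policy and the `aws_iam_role`'s `name`, instead of the resource attribute.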
ecwootten
by New Contributor III
  • 787 Views
  • 5 replies
  • 5 kudos

Resolved! Recent Databricks UI issue with mouse

For the past week or so, I've been having a weird UI issue with Databricks. Frequently, when I try to select text within a notebook, by dragging the mouse, it behaves as if I have clicked on a notebook within the LHS explorer pane, and loads it. So, ...

Latest Reply
ecwootten
New Contributor III
  • 5 kudos

Thanks for the tip! I'm a bit surprised that I've only just started triggering this and am now seeing it all the time. But yes, closing the explorer window is probably the way to go if it can't be fixed.

4 More Replies
LuukDSL
by New Contributor III
  • 3496 Views
  • 14 replies
  • 1 kudos

Running jobs as service principal, while pulling code from Azure DevOps

In our Dataplatform, our jobs are defined in a dataplatform_jobs.yml within a Databricks Asset Bundle, and then pushed to Databricks via an Azure Devops Pipeline (Azure Devops is where our codebase resides). Currently, this results in workflows looki...

Latest Reply
saurabh18cs
Honored Contributor II
  • 1 kudos

Hi @LuukDSL, have you tried the solution I provided above?

13 More Replies
JonnyData
by New Contributor III
  • 659 Views
  • 4 replies
  • 4 kudos

Resolved! %run command gives error on free edition

Hi, I'm testing out running one Notebook from another using the %run magic command in the Databricks Free Edition. Just really simple test stuff, but I get the following error: Failed to parse %run command: string matching regex '\$[\w_]+' expected but 'p...

Latest Reply
Advika
Databricks Employee
  • 4 kudos

Hello @JonnyData! This parsing error typically appears when the notebook path isn't in the expected format. Could you share the exact %run command you're using? Also, please ensure the path is a workspace path, either absolute (starting with /) or rel...

3 More Replies
Chinu
by New Contributor III
  • 689 Views
  • 2 replies
  • 1 kudos

Databricks SDK version installed in serverless compute differs

Hello, I've encountered an issue with my Python notebook where app.list() is failing in some serverless compute clusters but works fine in others. After investigating further, I noticed the following version differences: Working cluster: SDK version 0...

Latest Reply
Chinu
New Contributor III
  • 1 kudos

Oh, I found the doc and confirmed that the list() method was added in version 0.27.0: "Added list() method for w.apps workspace-level service." https://github.com/databricks/databricks-sdk-py/blob/v0.40.0/CHANGELOG.md Now, how can I update this to a newer versi...

1 More Replies
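Per the changelog the poster found, `w.apps.list()` needs databricks-sdk 0.27.0 or newer. When checking which version a given serverless environment ships, compare versions as integer tuples rather than strings; a small sketch:

```python
# Compare dotted version strings as integer tuples. Plain string comparison
# is a trap: "0.9.0" > "0.27.0" lexicographically.
def version_tuple(v: str) -> tuple[int, ...]:
    return tuple(int(part) for part in v.split("."))

def needs_upgrade(installed: str, required: str = "0.27.0") -> bool:
    return version_tuple(installed) < version_tuple(required)

print(needs_upgrade("0.18.0"))  # → True  (apps.list() not yet available)
print(needs_upgrade("0.40.0"))  # → False
```

On the update itself: in a notebook, `%pip install --upgrade databricks-sdk` followed by restarting Python is the usual route, though serverless base environments pin their own default versions.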
Marco37
by Contributor II
  • 2335 Views
  • 12 replies
  • 6 kudos

Resolved! Install python packages from Azure DevOps feed with service principal authentication

At the moment I install Python packages from our Azure DevOps feed with a PAT token as the authentication mechanism. This works well, but I want to use a service principal instead of the PAT token. I have created an Azure service principal and assigned it...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 6 kudos

Hi @Marco37, did you follow a guideline or documentation when you were trying to configure it? At first glance it looks incorrect. In the following line, you're trying to use the service principal as a token? If so, it definitely won't work: pip config set ...

11 More Replies
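On the mechanics behind the reply: a service principal's client id or secret is not itself a feed credential. The usual pattern is to obtain a Microsoft Entra ID access token for the Azure DevOps resource (via a client-credentials flow) and use that token as the basic-auth password in the feed's index URL. A sketch of building such a URL; the org/project/feed names and the token are placeholders:

```python
def feed_index_url(org: str, project: str, feed: str, access_token: str) -> str:
    """Azure Artifacts PyPI index URL with an access token as the password.

    Any username works for token auth; "build" is a common convention.
    """
    return (
        f"https://build:{access_token}@pkgs.dev.azure.com/"
        f"{org}/{project}/_packaging/{feed}/pypi/simple/"
    )

print(feed_index_url("myorg", "myproject", "myfeed", "<entra-token>"))
```

The resulting URL is what you would hand to `pip config set global.index-url ...` in place of the PAT-based one; note Entra tokens are short-lived, so they must be refreshed, unlike a PAT.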
Witold
by Honored Contributor
  • 4497 Views
  • 4 replies
  • 2 kudos

Databricks runtime and Java Runtime

The Databricks Runtime ships with two Java runtimes: JRE 8 and JRE 17. While the first one is used by default, you can use the environment variable JNAME to select the other JRE: JNAME: zulu17-ca-amd64. FWIW, AFAIK JNAME has been available since DBR 1...

Latest Reply
catalyst
New Contributor II
  • 2 kudos

@Witold Thanks for the original post here. Any luck with JDK 21 on DBR 17? I'm using some Java 17 features in the code alongside spark-4.0.0, which I wanted to run on DBR-17. Sadly the generic jname=zulu21-ca-amd64 did not work for me. I also tried oth...

3 More Replies
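To make the original tip concrete: JNAME is set per cluster, typically as an environment variable in the cluster spec (via the Clusters API, Terraform, or the UI's environment-variables box). A sketch of the relevant fragment; the `spark_env_vars` field name follows the Clusters API JSON, while the other values are illustrative placeholders:

```python
import json

# Illustrative cluster spec fragment selecting the JRE 17 runtime via JNAME.
# Node type, DBR version, and worker count are placeholders.
cluster_spec = {
    "cluster_name": "jre17-cluster",
    "spark_version": "14.3.x-scala2.12",
    "node_type_id": "i3.xlarge",
    "num_workers": 2,
    "spark_env_vars": {"JNAME": "zulu17-ca-amd64"},  # select JRE 17
}

print(json.dumps(cluster_spec["spark_env_vars"]))
```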