Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

552532
by New Contributor II
  • 2203 Views
  • 1 reply
  • 0 kudos

Databricks Private link connectivity for External SaaS Application

We need your guidance on completing the Private Link setup with a customer who is in the same AWS region where our application is hosted. Our customer has already enabled Private Link in their account and they are using custo...

Latest Reply
552532
New Contributor II
  • 0 kudos

Hi Databricks Support, We followed the instructions above, but we are getting an error when registering the endpoint in the customer's Databricks environment using "Registering Endpoint". Following is the error message we see: "INVALID_PARAMETER_VALUE" Endpoi...

ilarsen
by Contributor
  • 2509 Views
  • 1 reply
  • 0 kudos

When to add Users Groups or SPs from Account to Workspace

Hi community, we are using Unity Catalog, SCIM and Identity Federation, so we have users, groups and service principals at the Account level. In what scenarios do users, groups and service principals need to be explicitly added to a Workspace?

Latest Reply
kiashaa
New Contributor II
  • 0 kudos

1. If you enable Unity Catalog in a workspace, users in that workspace may be able to access the same data that users in other workspaces in your account can access. Data guardians can control who has access to what data across all workspaces from on...

alexometis
by New Contributor III
  • 8423 Views
  • 2 replies
  • 3 kudos

System Tables Preview - retention period?

The new System Tables for billing, pricing & compute look really useful and easier to consume than getting the data via the APIs. However, I can't see in the documentation: does data only start being gathered when you turn them on, or is there immediately a hi...

Latest Reply
Avvar2022
Contributor
  • 3 kudos

@Retired_mod - We are a customer of Databricks. We have a Databricks Premium workspace with Unity Catalog enabled, and we also have legacy workspaces (non-Unity enabled). I can see history is available for all workspaces (Unity and non-Unity) in the same meta st...

1 More Replies
smart5mk
by New Contributor III
  • 2235 Views
  • 0 replies
  • 0 kudos

Destination Path of Cloned Notebooks

Hi, for my project I need to get the destination paths of cloned notebooks. But when I run the query to get them: SELECT DISTINCT request_params.destinationPath FROM system.access.audit WHERE service_name = "notebook" AND action_name = 'cloneNotebook' LIMIT...

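The query in this post filters system.access.audit on service_name and action_name and projects the distinct destinationPath values. That same filter logic can be sketched in plain Python over audit rows represented as dictionaries — the sample records below are hypothetical; only the field names (service_name, action_name, request_params.destinationPath) follow the query in the post:

```python
# Sketch: replicate the post's audit-query filter in plain Python.
# The sample events are made up; real rows come from system.access.audit.
sample_events = [
    {"service_name": "notebook", "action_name": "cloneNotebook",
     "request_params": {"destinationPath": "/Users/a@example.com/clone1"}},
    {"service_name": "notebook", "action_name": "runNotebook",
     "request_params": {}},
    {"service_name": "notebook", "action_name": "cloneNotebook",
     "request_params": {"destinationPath": "/Users/a@example.com/clone1"}},
]

def clone_destinations(events):
    """Distinct destinationPath values for notebook cloneNotebook actions."""
    paths = {
        e["request_params"].get("destinationPath")
        for e in events
        if e["service_name"] == "notebook"
        and e["action_name"] == "cloneNotebook"
    }
    return sorted(p for p in paths if p is not None)

print(clone_destinations(sample_events))  # ['/Users/a@example.com/clone1']
```

Duplicates collapse because the paths are collected into a set, mirroring the DISTINCT in the SQL version.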
JDL
by New Contributor III
  • 19567 Views
  • 4 replies
  • 2 kudos

Get number of rows in delta lake table from metadata without count(*)

Hello folks, is there a way with a SQL query to get the count from Delta table metadata without doing count(*) on each table? Wondering if this information is stored in any of the INFORMATION_SCHEMA tables. I have a use case to get counts from 1000s of delt...

Latest Reply
SSundaram
Contributor
  • 2 kudos

Here is a related one: https://community.databricks.com/t5/data-engineering/how-to-get-the-total-number-of-records-in-a-delta-table-from-the/td-p/20441

3 More Replies
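On the row-count question in this thread: Delta does record per-file row counts as numRecords inside the stats field of each "add" action in the table's _delta_log, which is what allows engines to answer simple counts from metadata. A minimal sketch of summing those stats from one commit file — the JSON lines below are fabricated for illustration, and a real reader would also have to apply "remove" actions and checkpoints:

```python
import json

# Sketch: sum per-file row counts from Delta transaction-log "add" actions.
# Hypothetical single-commit log; real logs live under <table>/_delta_log/.
commit_lines = [
    '{"add": {"path": "part-0001.parquet", "stats": "{\\"numRecords\\": 120}"}}',
    '{"add": {"path": "part-0002.parquet", "stats": "{\\"numRecords\\": 80}"}}',
    '{"commitInfo": {"operation": "WRITE"}}',
]

def total_records(lines):
    """Total numRecords across the add actions in one commit file."""
    total = 0
    for line in lines:
        action = json.loads(line)
        add = action.get("add")
        if add and add.get("stats"):
            # stats is itself a JSON-encoded string inside the action.
            total += json.loads(add["stats"]).get("numRecords", 0)
    return total

print(total_records(commit_lines))  # 200
```

Note that stats collection can be disabled or partial, so a production approach should fall back to a real count when numRecords is missing.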
karthik_p
by Esteemed Contributor
  • 2236 Views
  • 3 replies
  • 0 kudos

Disaster Recovery Issue

We are trying to create disaster recovery for UC-enabled workspaces in Azure. Our UC metastores are in different regions. 1. We are trying to use Deep Clone. 2. In the source we are adding the region2 metastore as an external location. 3. We are able to do the deep clone. Problem...

Latest Reply
-werners-
Esteemed Contributor III
  • 0 kudos

Right, I get it. Actually cloning it as external seems logical to me, for the moment, as Unity cannot manage the other metastore. For the moment I would go with cloning the data and then creating an external table of that. Not ideal, but at least you hav...

2 More Replies
JDL
by New Contributor III
  • 3481 Views
  • 1 reply
  • 1 kudos

Tie Parquet files in Azure ADLS to Databricks table

Hello all, I have a Databricks Delta table with files residing in Azure Data Lake. I understand that when I create a table and load data from Databricks, it creates the respective folder and files for the table in ADLS. I am wondering if there is a reverse way to ...

Latest Reply
JDL
New Contributor III
  • 1 kudos

Thanks @Retired_mod, this is helpful.

RonMoody
by New Contributor II
  • 7521 Views
  • 1 reply
  • 0 kudos

service principal table accesses not showing up in system.audit

When we run jobs using service principals, system.audit doesn't show any table accesses (getTable). Volume access (getVolume) does show up for service principals. The same query, when run as a user, shows up in system.audit. I know system.audit is in public preview. W...

Latest Reply
RonMoody
New Contributor II
  • 0 kudos

Hi @Retired_mod, thanks so much for your reply! I was referring to https://docs.databricks.com/en/administration-guide/system-tables/audit-logs.html, which is part of the Databricks core offering and isn't related to ServiceNow's offering. I am assuming t...

cltj
by New Contributor III
  • 3065 Views
  • 1 reply
  • 1 kudos

Three-level namespace naming standard

Hi all, I have not been successful in getting a good grip on the naming conventions for the three-level namespace. Initially I learned about bronze, silver and gold, but I am confused about where to put this. The obvious choice may be to use the {catalog}...

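One way to keep a three-level naming convention consistent, whatever layout the team settles on, is to generate the fully qualified catalog.schema.table names from a small helper rather than hand-writing them. A sketch assuming one possible convention (environment as the catalog, medallion layer as the schema; all names here are hypothetical):

```python
# Sketch of one possible convention (hypothetical names): the environment is
# the catalog and the medallion layer (bronze/silver/gold) is the schema.
LAYERS = {"bronze", "silver", "gold"}

def qualified_name(env: str, layer: str, table: str) -> str:
    """Build a three-level Unity Catalog name: catalog.schema.table."""
    if layer not in LAYERS:
        raise ValueError(f"unknown layer: {layer}")
    return f"{env}.{layer}.{table}"

print(qualified_name("dev", "silver", "customers"))  # dev.silver.customers
```

Centralizing the convention in one function makes it cheap to change later, e.g. if the team decides the medallion layer belongs in the catalog instead of the schema.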
aayusha3
by New Contributor II
  • 5214 Views
  • 4 replies
  • 2 kudos

Internal error: Attach your notebook to a different compute or restart the current compute.

I am currently using a Personal Compute cluster [13.3 LTS (includes Apache Spark 3.4.1, Scala 2.12)] on GCP attached to a notebook. After running a few commands without an issue, I end up getting this error: Internal error. Attach your notebook...

Latest Reply
amandaolens
New Contributor III
  • 2 kudos

@Martin74 Same here, Martin.

3 More Replies
niklas
by Contributor
  • 4324 Views
  • 1 reply
  • 0 kudos

Resolved! Error: cannot create metastore data access

I'm in the process of enabling Databricks Unity Catalog and encountered a problem with the databricks_metastore_data_access Terraform resource: resource "databricks_metastore_data_access" "this" { provider = databricks.account-level metastore_id...

Administration & Architecture
metastore
Terraform
Unity Catalog
Latest Reply
niklas
Contributor
  • 0 kudos

Found a solution. Please see my answer on Stack Overflow: https://stackoverflow.com/questions/77440091/databricks-unity-catalog-error-cannot-create-metastore-data-access/77506306#77506306

NateJ
by New Contributor II
  • 4267 Views
  • 4 replies
  • 2 kudos

Failed to start cluster: Large docker image

I have a large Docker image in our AWS ECR repo. The image is 27.4 GB locally and 11,539.79 MB compressed in ECR. The error from the Event Log is: Failed to add 2 containers to the compute. Will attempt retry: true. Reason: Docker image pull failure JSON...

Latest Reply
amoghjain
New Contributor II
  • 2 kudos

I have a similar problem. A 10 GB image pulls fine but a 31 GB image doesn't. Both workers and drivers have 64 GB memory. I get the timeout error with "Cannot launch the cluster because pulling the docker image failed. Please double check connectivity fr...

3 More Replies