Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.


Forum Posts

by RonMoody, New Contributor II
  • 6926 Views
  • 1 reply
  • 0 kudos

service principal table accesses not showing up in system.audit

When we run jobs using service principals, system.audit doesn't show any table accesses (getTable). Volume accesses (getVolume) do show up for service principals, and the same query run as a user shows up in system.audit. I know system.audit is in public preview. W...

Latest Reply
RonMoody
New Contributor II
  • 0 kudos

Hi @Retired_mod, thanks so much for your reply! I was referring to https://docs.databricks.com/en/administration-guide/system-tables/audit-logs.html, which is part of the Databricks core offering and isn't related to ServiceNow's offering. I am assuming t...
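
For context, here is a minimal sketch of the kind of check being discussed, run from a notebook. It assumes the audit log system table is exposed as system.access.audit with the documented columns (event_time, user_identity, action_name, request_params, event_date); details may vary while the feature is in public preview.

```python
# Hedged sketch for a Databricks notebook: look for getTable / getVolume audit
# events from the last 7 days, together with the identity that made the call.
# Assumes the audit log system table is system.access.audit (public preview).
events = spark.sql("""
    SELECT
        event_time,
        user_identity.email AS principal,
        action_name,
        request_params
    FROM system.access.audit
    WHERE action_name IN ('getTable', 'getVolume')
      AND event_date >= current_date() - INTERVAL 7 DAYS
    ORDER BY event_time DESC
""")
display(events)
```
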
by cltj, New Contributor III
  • 1703 Views
  • 1 reply
  • 1 kudos

Three-level namespace naming standard

Hi all, I have not been successful in getting a good grip on naming conventions for the three-level namespace. Initially I learned about bronze, silver and gold, but I am confused about where to put this. The obvious choice may be to use the {catalog}...

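
One convention that comes up often (illustrative only, not an official standard) is to put the environment at the catalog level, the medallion layer plus domain at the schema level, and the entity at the table level. A minimal sketch with hypothetical names:

```python
# Hypothetical naming sketch: environment as the catalog, medallion layer plus
# domain as the schema, entity as the table. Requires the relevant Unity
# Catalog privileges.
spark.sql("CREATE CATALOG IF NOT EXISTS dev")
spark.sql("CREATE SCHEMA IF NOT EXISTS dev.bronze_sales")
spark.sql("""
    CREATE TABLE IF NOT EXISTS dev.bronze_sales.orders_raw (
        order_id    BIGINT,
        payload     STRING,
        ingested_at TIMESTAMP
    )
""")
```
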
by aayusha3, New Contributor II
  • 2640 Views
  • 4 replies
  • 2 kudos

Internal error: Attach your notebook to a different compute or restart the current compute.

I am currently using a personal compute cluster [13.3 LTS (includes Apache Spark 3.4.1, Scala 2.12)] on GCP attached to a notebook. After running a few command lines without an issue, I end up getting this error: Internal error. Attach your notebook...

Latest Reply
amandaolens
New Contributor III
  • 2 kudos

@Martin74 Same here, Martin.
3 More Replies
by amandaolens, New Contributor III
  • 5351 Views
  • 3 replies
  • 2 kudos

Internal error. Attach your notebook to a different compute or restart the current compute. java.lan

Internal error. Attach your notebook to a different compute or restart the current compute. java.lang.RuntimeException: abort: DriverClient destroyed at com.databricks.backend.daemon.driver.DriverClient.$anonfun$poll$3(DriverClient.scala:577) at scala...

Latest Reply
amandaolens
New Contributor III
  • 2 kudos

@Retired_mod Yeah, in one dataset there are slightly more data points; the schemas are the same. When Spark crashed I checked the memory usage, and it was around 50%.
2 More Replies
by niklas, Contributor
  • 3327 Views
  • 1 reply
  • 0 kudos

Resolved! Error: cannot create metastore data access

I'm in the process of enabling Databricks Unity Catalog and encountered a problem with the databricks_metastore_data_access Terraform resource: resource "databricks_metastore_data_access" "this" { provider = databricks.account-level metastore_id...

Administration & Architecture
metastore
Terraform
Unity Catalog
Latest Reply
niklas
Contributor
  • 0 kudos

Found a solution. Please see my answer on Stack Overflow: https://stackoverflow.com/questions/77440091/databricks-unity-catalog-error-cannot-create-metastore-data-access/77506306#77506306
by NateJ, New Contributor II
  • 3034 Views
  • 4 replies
  • 2 kudos

Failed to start cluster: Large docker image

I have a large Docker image in our AWS ECR repo. The image is 27.4 GB locally and 11539.79 MB compressed in ECR. The error from the Event Log is: Failed to add 2 containers to the compute. Will attempt retry: true. Reason: Docker image pull failure JSON...

Latest Reply
amoghjain
New Contributor II
  • 2 kudos

I have a similar problem: a 10 GB image pulls fine but a 31 GB image doesn't. Both workers and drivers have 64 GB of memory. I get the timeout error with "Cannot launch the cluster because pulling the docker image failed. Please double check connectivity fr...
3 More Replies
by Ishmael, New Contributor III
  • 4505 Views
  • 3 replies
  • 2 kudos

Connect to databricks from external non-spark cluster

Hi, I have an app/service on a non-Spark Kubernetes cluster. Is there a way to access/query a Databricks service from my app/service? I see documentation on connectors, particularly for Scala, which is the language of my app/service. Can I use these connec...

2 More Replies
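
The poster's service is written in Scala, where the Databricks JDBC driver plays the same role, but the general pattern is easiest to sketch with the Python databricks-sql-connector. The hostname, HTTP path, and token below are placeholders:

```python
# Hedged sketch: query a Databricks SQL warehouse from an external (non-Spark)
# application using the databricks-sql-connector package.
from databricks import sql

with sql.connect(
    server_hostname="adb-1234567890123456.7.azuredatabricks.net",  # placeholder
    http_path="/sql/1.0/warehouses/abcdef1234567890",              # placeholder
    access_token="dapiXXXXXXXXXXXXXXXX",                           # placeholder
) as connection:
    with connection.cursor() as cursor:
        cursor.execute("SELECT current_catalog(), current_schema()")
        print(cursor.fetchall())
```
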
by Learnit, New Contributor II
  • 2137 Views
  • 0 replies
  • 0 kudos

Databricks deployment and automation tools comparison.

Hello all, as a newcomer to Databricks, I am seeking guidance on automation within Databricks environments. What are the best practices for deployment, and how do Terraform, the REST API, and the Databricks SDK compare in terms of advantages and...

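
To make the comparison concrete, here is a minimal sketch of the SDK option using the databricks-sdk Python package; the same operation could also be expressed as a Terraform resource or a raw REST call, which is largely what the choice between the tools comes down to. Authentication is assumed to come from environment variables or a configuration profile:

```python
# Hedged sketch: list clusters in a workspace with the Databricks SDK for Python.
# Credentials are resolved from the environment (e.g. DATABRICKS_HOST and
# DATABRICKS_TOKEN) or from a configured profile.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()
for cluster in w.clusters.list():
    print(cluster.cluster_id, cluster.cluster_name, cluster.state)
```
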
by MasJamei, New Contributor II
  • 1249 Views
  • 1 reply
  • 0 kudos

Notebook Id level of uniqueness

Hi there, we know that notebook IDs are unique (https://docs.databricks.com/en/workspace/workspace-details.html), but I want to know at what level they're unique. For example, are notebook IDs unique within a workspace, or are they universally unique...

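
For anyone checking this themselves, here is a minimal sketch that retrieves a notebook's ID through the Workspace API get-status endpoint; the workspace URL, token, and notebook path are placeholders, and object_id in the response is the notebook ID the linked docs page refers to:

```python
# Hedged sketch: fetch a notebook's object_id via the Workspace API.
# Host, token, and notebook path below are placeholders.
import requests

host = "https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder
token = "dapiXXXXXXXXXXXXXXXX"                                # placeholder

resp = requests.get(
    f"{host}/api/2.0/workspace/get-status",
    headers={"Authorization": f"Bearer {token}"},
    params={"path": "/Users/someone@example.com/my_notebook"},  # placeholder
)
resp.raise_for_status()
print(resp.json()["object_id"])  # the notebook ID
```
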
by ParameshM, New Contributor
  • 1348 Views
  • 0 replies
  • 0 kudos

Ubuntu 22 ODBC Connectivity Issue with PHP - SQL error: [unixODBC][Driver Manager]Can't open lib

Dear friends, I'm having trouble connecting to Databricks via ODBC from Ubuntu 22. I followed the steps documented here: https://docs.databricks.com/en/integrations/jdbc-odbc-bi.html#odbc-linux. Here is my odbc.ini file: [ODBC Data Sources] Databricks=Datab...

Administration & Architecture
driver
ODBC
ubuntu
unix
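
The unixODBC "Can't open lib" error usually means the Driver entry in odbcinst.ini does not point at a readable driver library. Here is a minimal sketch for testing the DSN outside PHP with pyodbc; the DSN name and driver path are assumptions based on a default Simba/Databricks ODBC driver install and may differ on your system:

```python
# Hedged sketch: verify the unixODBC setup that PHP will also use.
# Assumes a DSN named "Databricks" in odbc.ini and the assumed default driver
# location /opt/simba/spark/lib/64/libsparkodbc_sb64.so in odbcinst.ini.
import os
import pyodbc

driver_path = "/opt/simba/spark/lib/64/libsparkodbc_sb64.so"  # assumed default
print("driver library exists:", os.path.exists(driver_path))
print("drivers unixODBC can see:", pyodbc.drivers())

# If the driver is registered correctly, a simple connect + query should work.
conn = pyodbc.connect("DSN=Databricks;PWD=<personal-access-token>", autocommit=True)
print(conn.cursor().execute("SELECT 1").fetchval())
```
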
by gilShin, Contributor
  • 6445 Views
  • 4 replies
  • 1 kudos

Resolved! DBT job stuck when running on databricks

Hi, I'm trying to run a dbt job on a Databricks instance. The query should be run on the same instance. When I run the job, I get to "Opening a new connection, currently in state init" and it is stuck in that phase for a long time. I'm using an IP access list wh...

Latest Reply
gilShin
Contributor
  • 1 kudos

I recreated the Databricks workspace (there was no other way to solve it). If it had been a production workspace, that would have been a disaster! I have created a VM with a static public IP and added this IP to the IP access list. Hopefully it'll become the last resort i...
3 More Replies
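
For reference, the change described in the reply (allow-listing a static egress IP so clients such as dbt can reach the workspace) can be made through the workspace IP access lists REST API. A minimal sketch with placeholder host, token, and CIDR, assuming IP access lists are already enabled for the workspace:

```python
# Hedged sketch: add a static IP/CIDR to a workspace ALLOW list via
# POST /api/2.0/ip-access-lists. Host, token, and CIDR are placeholders.
import requests

host = "https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder
token = "dapiXXXXXXXXXXXXXXXX"                                # placeholder

resp = requests.post(
    f"{host}/api/2.0/ip-access-lists",
    headers={"Authorization": f"Bearer {token}"},
    json={
        "label": "dbt-static-egress-ip",
        "list_type": "ALLOW",
        "ip_addresses": ["203.0.113.10/32"],  # placeholder CIDR
    },
)
resp.raise_for_status()
print(resp.json())
```
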
by GiggleByte, New Contributor II
  • 3054 Views
  • 3 replies
  • 0 kudos

Databricks Asset Bundles - config data needed by notebook

I have this structure: Folder-1 is the root of the Databricks bundle directory, and the "databricks.yaml" file is in this directory. Folder-1 / Folder-2 has notebooks. One of the notebooks, "test-notebook", is used for *job* configuration in the databricks.yaml file. Fo...

Latest Reply
karthik_p
Esteemed Contributor
  • 0 kudos

@GiggleByte Yes, based on a demo test that I have done, it is working as you said. The JSON-converted YAML config for the job settings needs to be placed under resources; that YAML holds the job config settings and looks similar to the REST API JSON request converted in f...
2 More Replies
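
As a concrete illustration of the pattern in the reply: configuration declared in the bundle's resources can be passed to the notebook as task parameters, which the notebook then reads with dbutils.widgets. A minimal sketch of the notebook side, where "config_path" is a hypothetical parameter name:

```python
# Hedged sketch (notebook side): read a parameter supplied by the job task that
# the bundle's resources section defines. "config_path" is a hypothetical name;
# any base_parameters key declared for the notebook task can be read this way.
dbutils.widgets.text("config_path", "")           # default for interactive runs
config_path = dbutils.widgets.get("config_path")
print(f"Using config at: {config_path}")
```
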
