Error when trying to create a cluster in Databricks
We are trying to create a cluster within the Databricks workspace, but it is generating the attached error.
- 2569 Views
- 1 reply
- 0 kudos
Hello All, I have a Databricks Delta table with files residing in Azure Data Lake. I understand that when I create a table and load data from Databricks, it creates the respective folder and files for the table in ADLS. I am wondering if there is a reverse way to ...
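If the "reverse way" being asked about is registering files that already exist in ADLS as a table, a minimal sketch is below, assuming the folder already contains Delta files and that an external location or storage credential covering the path is in place; the catalog, schema, table name, and abfss path are hypothetical placeholders.

```python
# Register an existing Delta folder in ADLS as an external table.
# All names and the abfss:// path below are hypothetical placeholders.
spark.sql("""
    CREATE TABLE IF NOT EXISTS my_catalog.my_schema.my_table
    USING DELTA
    LOCATION 'abfss://my-container@mystorageaccount.dfs.core.windows.net/path/to/delta'
""")

# Once registered, the existing files are queryable like any other table.
display(spark.sql("SELECT * FROM my_catalog.my_schema.my_table LIMIT 10"))
```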
When we run jobs using service principals, system.audit doesn't show any table accesses (getTable). Volume accesses (getVolume) do show up for service principals. The same query, when run as a user, shows up in system.audit. I know system.audit is in public preview. W...
Hi @Retired_mod, thanks so much for your reply! I was referring to https://docs.databricks.com/en/administration-guide/system-tables/audit-logs.html, which is part of Databricks' core offering and isn't related to ServiceNow's offering. I am assuming t...
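For reference, a hedged sketch of querying the audit log system table for Unity Catalog access events from a service principal; the table and column names follow the system.access.audit schema described in the linked docs, and the application ID below is a hypothetical placeholder.

```python
# Query audit events for table/volume reads; for service principals the
# user_identity.email field typically carries the application ID.
events = spark.sql("""
    SELECT event_time, service_name, action_name,
           user_identity.email AS principal,
           request_params
    FROM system.access.audit
    WHERE action_name IN ('getTable', 'getVolume')
      AND user_identity.email = '11111111-2222-3333-4444-555555555555'  -- placeholder application ID
      AND event_time >= current_timestamp() - INTERVAL 7 DAYS
    ORDER BY event_time DESC
""")
display(events)
```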
Hi all, I have not been successful in getting a good grip on the naming conventions for the three-level namespace. Initially I learned about bronze, silver and gold, but I am confused about where to put these. The obvious choice may be to use the {catalog}...
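One common (not prescriptive) way to place the medallion layers in the three-level namespace is to use the environment as the catalog and the layer as the schema; a minimal sketch with hypothetical names:

```python
# Hypothetical layout: <environment catalog>.<medallion schema>.<table>
spark.sql("CREATE CATALOG IF NOT EXISTS dev")
spark.sql("CREATE SCHEMA IF NOT EXISTS dev.bronze")
spark.sql("CREATE SCHEMA IF NOT EXISTS dev.silver")
spark.sql("""
    CREATE TABLE IF NOT EXISTS dev.bronze.sales_raw (
        id BIGINT,
        payload STRING,
        ingested_at TIMESTAMP
    )
""")
```

Other teams invert this and use the layer as the catalog ({bronze|silver|gold}.{domain}.{table}); either works as long as it is applied consistently.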
I am currently using a personal compute cluster [13.3 LTS (includes Apache Spark 3.4.1, Scala 2.12)] on GCP attached to a notebook. After running a few commands without an issue, I end up getting this error: Internal error. Attach your notebook...
I am running into an issue where "Standard_NC8as_T4_v3" and "Standard_NC4as_T4_v3" instances are behaving differently for a 30 GB custom Docker image, and I am a bit stumped. When using NC4 instances, I get a timeout, with the exact message shown below...
I'm in the process of enabling Databricks Unity Catalog and encountered a problem with the databricks_metastore_data_access Terraform resource: resource "databricks_metastore_data_access" "this" { provider = databricks.account-level metastore_id...
Found a solution. Please see my answer on Stack Overflow: https://stackoverflow.com/questions/77440091/databricks-unity-catalog-error-cannot-create-metastore-data-access/77506306#77506306
I have a large Docker image in our AWS ECR repo. The image is 27.4 GB locally and 11539.79 MB compressed in ECR. The error from the Event Log is: Failed to add 2 containers to the compute. Will attempt retry: true. Reason: Docker image pull failure. JSON...
I have a similar problem. A 10 GB image pulls fine, but a 31 GB image doesn't. Both workers and drivers have 64 GB of memory. I get the timeout error with "Cannot launch the cluster because pulling the docker image failed. Please double check connectivity fr...
We have a Unity Catalog metastore set up in storage account prod_1. Can we move this to the prod_2 storage account and delete prod_1? Also, is it possible to rename catalogs once they are created?
Hi, I have an app/service on a non-Spark Kubernetes cluster. Is there a way to access/query a Databricks service from my app/service? I see documentation on connectors, particularly for Scala, which is the language of my app/service. Can I use these connec...
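Purely as an illustration of the access pattern, a hedged sketch using the Databricks SQL Connector for Python (pip install databricks-sql-connector); from a Scala/JVM service the Databricks JDBC driver takes the same server hostname, HTTP path, and token inputs. All connection values below are hypothetical placeholders.

```python
from databricks import sql

# All values below are placeholders; real ones come from the SQL warehouse's
# Connection details tab and a secret store.
with sql.connect(
    server_hostname="adb-1234567890123456.7.azuredatabricks.net",
    http_path="/sql/1.0/warehouses/abcdef1234567890",
    access_token="dapiXXXXXXXXXXXXXXXX",
) as connection:
    with connection.cursor() as cursor:
        cursor.execute("SELECT 1 AS ping")
        print(cursor.fetchall())
```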
Hello All, As a newcomer to Databricks, I am seeking guidance on automation within Databricks environments. What are the best practices for deployment, and how do Terraform, the REST API, and the Databricks SDK compare in terms of advantages and...
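As a rough comparison point, a common rule of thumb is that Terraform suits declarative, long-lived infrastructure, while the REST API and SDK suit ad-hoc or scripted automation. The SDK route looks like the sketch below (Databricks SDK for Python, pip install databricks-sdk), assuming credentials come from the environment (DATABRICKS_HOST/DATABRICKS_TOKEN) or ~/.databrickscfg.

```python
from databricks.sdk import WorkspaceClient

# Credentials are resolved from environment variables or ~/.databrickscfg.
w = WorkspaceClient()

# Enumerate clusters and jobs as a quick sanity check of programmatic access.
for cluster in w.clusters.list():
    print(cluster.cluster_id, cluster.cluster_name, cluster.state)

for job in w.jobs.list():
    print(job.job_id, job.settings.name if job.settings else None)
```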
Hi there, We know that notebook IDs are unique (https://docs.databricks.com/en/workspace/workspace-details.html), but I want to know at what level they're unique. For example, are notebook IDs unique within a workspace, or are they universally unique...
Hi everybody, I am curious whether there is a way to raise the security level by default. Specifically, is it possible to configure the default setting to limit access to a query, rather than leaving it open? Right now, on my side, the default ...
I would like this too, or perhaps the ability to change this programmatically in bulk. The CLI, SDK, and REST endpoints don't appear to work and will not let me change this.
Dear Friends, I'm having trouble connecting to Databricks via ODBC from Ubuntu 22. I followed the steps documented here: https://docs.databricks.com/en/integrations/jdbc-odbc-bi.html#odbc-linux Here is my odbc.ini file: [ODBC Data Sources] Databricks=Datab...
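To separate a driver/DSN problem from an application problem, one hedged check is to hit the DSN with pyodbc (pip install pyodbc); with AuthMech=3 the Databricks (Simba Spark) ODBC driver expects user 'token' and a personal access token as the password. The DSN name and token below are placeholders.

```python
import pyodbc

# Hypothetical DSN name and token; with AuthMech=3 the driver expects
# UID='token' and PWD=<personal access token>.
conn = pyodbc.connect("DSN=Databricks;UID=token;PWD=dapiXXXXXXXXXXXXXXXX", autocommit=True)
cursor = conn.cursor()
cursor.execute("SELECT 1 AS ping")
print(cursor.fetchone())
conn.close()
```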
The documentation for using R in Databricks states that the session can be configured by modifying the /usr/lib/R/etc/Rprofile.site file. This works for most things; however, the repository URLs set by the `repos` option are overridden by another scrip...