- 7479 Views
- 1 reply
- 0 kudos
Service principal table accesses not showing up in system.audit
When we run jobs using service principals, system.audit doesn't show any table accesses (getTable). Volume accesses (getVolume) do show up for service principals. The same query, when run as a user, does show up in system.audit. I know system.audit is in public preview. W...
- 0 kudos
Hi @Retired_mod, thanks so much for your reply! I was referring to https://docs.databricks.com/en/administration-guide/system-tables/audit-logs.html, which is part of the Databricks core offering and isn't related to ServiceNow's offering. I am assuming t...
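For context, a minimal sketch of the kind of audit query involved here, assuming the events land in the system.access.audit system table with the documented action_name and user_identity columns; the service principal application ID is a placeholder:

```python
# Runs in a Databricks notebook, where a SparkSession named `spark` is provided.
# Filters table-access events (getTable) for one principal; the application ID
# below is a placeholder, not a real service principal.
events = spark.sql("""
    SELECT event_time, action_name, user_identity.email AS principal, request_params
    FROM system.access.audit
    WHERE action_name = 'getTable'
      AND user_identity.email = '11111111-2222-3333-4444-555555555555'
    ORDER BY event_time DESC
    LIMIT 100
""")
display(events)
```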
- 2966 Views
- 1 reply
- 1 kudos
Three-level namespace naming standard
Hi all, I have not been successful in getting a good grip on the naming conventions for the three-level namespace. Initially I learned about bronze, silver, and gold, but I am confused about where to put this. The obvious choice may be to use the {catalog}...
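One common convention for the question above (not the only one) is to keep the medallion layer in either the catalog or the schema position of the catalog.schema.table namespace. A minimal sketch of the schema-level variant, with hypothetical names:

```python
# Hypothetical naming: environment as catalog, medallion layer as schema,
# entity as table -- one of several reasonable conventions, not a standard.
spark.sql("CREATE CATALOG IF NOT EXISTS dev")
spark.sql("CREATE SCHEMA IF NOT EXISTS dev.bronze")
spark.sql("CREATE SCHEMA IF NOT EXISTS dev.silver")
spark.sql("""
    CREATE TABLE IF NOT EXISTS dev.bronze.sales_orders (
        order_id STRING,
        amount   DOUBLE
    )
""")
```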
- 5126 Views
- 4 replies
- 2 kudos
Internal error: Attach your notebook to a different compute or restart the current compute.
I am currently using a personal compute cluster [13.3 LTS (includes Apache Spark 3.4.1, Scala 2.12)] on GCP attached to a notebook. After running a few command lines without an issue, I end up getting this error: "Internal error. Attach your notebook...
- 2045 Views
- 0 replies
- 0 kudos
"Standard_NC8as_T4_v3" and "Standard_NC4as_T4_v3" instances
I am running into an issue where "Standard_NC8as_T4_v3" and "Standard_NC4as_T4_v3" instances are behaving differently for a 30 GB custom Docker image, and I am a bit stumped. When using NC4 instances, I get a timeout, with the exact message shown below...
- 4273 Views
- 1 reply
- 0 kudos
Resolved! Error: cannot create metastore data access
I'm in the process of enabling Databricks Unity Catalog and encountered a problem with the databricks_metastore_data_access Terraform resource: resource "databricks_metastore_data_access" "this" { provider = databricks.account-level metastore_id...
- 0 kudos
Found a solution. Please see my answer on Stack Overflow: https://stackoverflow.com/questions/77440091/databricks-unity-catalog-error-cannot-create-metastore-data-access/77506306#77506306
- 4196 Views
- 4 replies
- 2 kudos
Failed to start cluster: Large docker image
I have a large Docker image in our AWS ECR repo. The image is 27.4 GB locally and 11539.79 MB compressed in ECR. The error from the Event Log is: Failed to add 2 containers to the compute. Will attempt retry: true. Reason: Docker image pull failure. JSON...
- 2 kudos
I have a similar problem. A 10 GB image pulls fine but a 31 GB image doesn't. Both workers and drivers have 64 GB memory. I get the timeout error with "Cannot launch the cluster because pulling the docker image failed. Please double check connectivity fr...
- 5463 Views
- 0 replies
- 0 kudos
Is it possible to change the Azure storage account of Unity Catalog?
We have a Unity Catalog metastore set up in storage account prod_1. Can we move this to the prod_2 storage account and delete prod_1? Also, is it possible to rename catalogs once they are created?
- 5400 Views
- 3 replies
- 2 kudos
Connect to Databricks from an external non-Spark cluster
Hi, I have an app/service on a non-Spark Kubernetes cluster. Is there a way to access/query a Databricks service from my app/service? I see documentation on connectors, particularly for Scala, which is the language of my app/service. Can I use these connec...
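As an illustration of the general pattern asked about above (the question mentions Scala, but the same idea applies), here is a sketch using the Databricks SQL Connector for Python (databricks-sql-connector) against a SQL warehouse; the hostname, HTTP path, and token are placeholders:

```python
from databricks import sql  # pip install databricks-sql-connector

# All three values below are placeholders -- copy them from the warehouse's
# connection details and use a personal access token (or OAuth) from the app.
with sql.connect(
    server_hostname="adb-1234567890123456.7.azuredatabricks.net",
    http_path="/sql/1.0/warehouses/abcdef1234567890",
    access_token="dapi-REDACTED",
) as conn:
    with conn.cursor() as cursor:
        cursor.execute("SELECT 1")
        print(cursor.fetchall())
```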
- 2727 Views
- 0 replies
- 0 kudos
Databricks deployment and automation tools comparison.
Hello all, as a newcomer to Databricks, I am seeking guidance on automation within Databricks environments. What are the best practices for deployment, and how do Terraform, the REST API, and the Databricks SDK compare in terms of advantages and...
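To make the SDK option from the question above concrete, a minimal sketch using the Databricks SDK for Python (databricks-sdk), assuming authentication is already configured through environment variables or a configuration profile:

```python
from databricks.sdk import WorkspaceClient

# Picks up credentials from DATABRICKS_HOST / DATABRICKS_TOKEN or ~/.databrickscfg.
w = WorkspaceClient()

# List clusters in the workspace -- the same operations are also exposed
# by the REST API and by the Terraform provider.
for cluster in w.clusters.list():
    print(cluster.cluster_name, cluster.state)
```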
- 1719 Views
- 1 reply
- 0 kudos
Notebook ID level of uniqueness
Hi there, we know that notebook IDs are unique (https://docs.databricks.com/en/workspace/workspace-details.html), but I want to know at what level they're unique. For example, are notebook IDs unique within a workspace, or are they universally unique...
- 3510 Views
- 2 replies
- 1 kudos
Have "viewer's credential" as default setting in Query access control, instead of "owner's credential"
Hi everybody,I am curious if there is a way to raise the security level by default. Specifically if it is possible to configure the default setting to limit the access of the query, rather than granting it open, as right now from my side the default ...
- 1 kudos
I would like this too, or perhaps the ability to change this programmatically in bulk. The CLI, SDK, and REST endpoints don't appear to work and will not let me change this.
- 1904 Views
- 0 replies
- 0 kudos
Ubuntu 22 ODBC Connectivity Issue with PHP - SQL error: [unixODBC][Driver Manager]Can't open lib
Dear friends, I'm having trouble connecting to Databricks ODBC from Ubuntu 22. I followed the steps documented here: https://docs.databricks.com/en/integrations/jdbc-odbc-bi.html#odbc-linux. Here is my odbc.ini file: [ODBC Data Sources] Databricks=Datab...
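For debugging the "Can't open lib" error above, it can help to try the same connection DSN-less from Python via pyodbc; the driver path and connection keywords below are assumptions based on the Simba Spark ODBC driver that Databricks distributes, and the host, HTTP path, and token are placeholders:

```python
import pyodbc  # pip install pyodbc; requires unixODBC and the Simba Spark ODBC driver

# Assumed install path of the Simba driver and its connection keywords;
# host, HTTP path, and token are placeholders.
conn = pyodbc.connect(
    "Driver=/opt/simba/spark/lib/64/libsparkodbc_sb64.so;"
    "Host=adb-1234567890123456.7.azuredatabricks.net;"
    "Port=443;"
    "HTTPPath=/sql/1.0/warehouses/abcdef1234567890;"
    "SSL=1;"
    "ThriftTransport=2;"
    "AuthMech=3;"
    "UID=token;"
    "PWD=dapi-REDACTED",
    autocommit=True,
)
print(conn.cursor().execute("SELECT 1").fetchall())
```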
- 1551 Views
- 0 replies
- 0 kudos
Support for setting R repository URLs in Databricks
The documentation for using R in Databricks states that the session can be configured by modifying the /usr/lib/R/etc/Rprofile.site file. This works for most things; however, the repository URLs set by the `repos` option are overridden by another scrip...
- 8918 Views
- 4 replies
- 1 kudos
Resolved! dbt job stuck when running on Databricks
Hi, I'm trying to run a dbt job on a Databricks instance. The query should be run on the same instance. When I run the job, I get to: "Opening a new connection, currently in state init". It is stuck in that phase for a long time. I'm using an IP access list wh...
- 1 kudos
I recreated the Databricks workspace (there's no other way to solve that). If it were a production Databricks workspace, it would have been a disaster! I have created a VM with a static public IP and added this IP to the IP access list. Hopefully it'll become the last resort i...
- 3985 Views
- 3 replies
- 0 kudos
Databricks Asset Bundles - config data needed by notebook
I have this structure: Folder-1 is the root of the Databricks asset bundle directory, and the "databricks.yaml" file is in this directory. Folder-1/Folder-2 has notebooks. One of the notebooks, "test-notebook", is used for *job* configuration in the databricks.yaml file. Fo...
- 0 kudos
@GiggleByte Yes, based on a demo test that I have done, it is working as you said. The JSON-converted YAML config for the job settings needs to be placed under resources; that YAML holds the job config settings and looks similar to the REST API JSON request converted in f...
Label | Count |
---|---|
Access control | 1 |
Apache spark | 1 |
AWS | 5 |
Azure | 7 |
Azure databricks | 5 |
Billing | 2 |
Cluster | 1 |
Compliance | 1 |
Data Ingestion & connectivity | 5 |
Databricks Runtime | 1 |
Databricks SQL | 2 |
DBFS | 1 |
Dbt | 1 |
Delta | 4 |
Delta Sharing | 1 |
DLT Pipeline | 1 |
GA | 1 |
Gdpr | 1 |
Github | 1 |
Partner | 40 |
Public Preview | 1 |
Service Principals | 1 |
Unity Catalog | 1 |
Workspace | 2 |