- 6926 Views
- 1 reply
- 0 kudos
service principal table accesses not showing up in system.audit
When we run jobs using service principals, system.audit doesn't show any table accesses (getTable). Volume accesses (getVolume) do show up for service principals, and the same query run as a user shows up in system.audit. I know system.audit is in public preview. W...
- 0 kudos
hi @Retired_mod thanks so much for your reply! I was referring to https://docs.databricks.com/en/administration-guide/system-tables/audit-logs.html, which is part of Databricks' core offering and isn't related to ServiceNow's offering. I am assuming t...
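For anyone trying to reproduce this, a minimal sketch of the comparison query, assuming the system-tables audit schema; the service principal's application ID below is a placeholder:

```python
# Compare getTable vs getVolume rows for one principal over the last week.
# Replace the email filter with your service principal's application ID.
rows = spark.sql("""
    SELECT event_time, user_identity.email AS principal,
           action_name, request_params
    FROM system.access.audit
    WHERE service_name = 'unityCatalog'
      AND action_name IN ('getTable', 'getVolume')
      AND user_identity.email = '<service-principal-application-id>'
      AND event_date >= date_sub(current_date(), 7)
    ORDER BY event_time DESC
""")
display(rows)
```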
- 1703 Views
- 1 reply
- 1 kudos
Three-level namespace naming standard
Hi all, I have not been successful in getting a good grip on the naming conventions for the three-level namespace. Initially I learned about bronze, silver and gold, but I am confused about where to put this. The obvious choice may be to use the {catalog}...
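For what it's worth, one widely used convention puts the environment in the catalog and the medallion layer in the schema. A hedged sketch, with all names illustrative and a `dev` catalog assumed to exist:

```python
# Convention sketch: {env_catalog}.{layer}.{table}, e.g. dev.bronze.orders.
# Catalog "dev" and the table definition are illustrative, not prescriptive.
for layer in ("bronze", "silver", "gold"):
    spark.sql(f"CREATE SCHEMA IF NOT EXISTS dev.{layer}")

# Raw ingest lands in dev.bronze.orders; a curated version of the same
# entity would live at dev.gold.orders_daily.
spark.sql("CREATE TABLE IF NOT EXISTS dev.bronze.orders (id BIGINT, payload STRING)")
```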
- 2640 Views
- 4 replies
- 2 kudos
Internal error: Attach your notebook to a different compute or restart the current compute.
I am currently using a personal compute cluster [13.3 LTS (includes Apache Spark 3.4.1, Scala 2.12)] on GCP attached to a notebook. After running a few command lines without an issue, I end up getting this error: Internal error. Attach your notebook...
- 1219 Views
- 0 replies
- 0 kudos
Standard_NC8as_T4_v3" and "Standard_NC4as_T4_v3" instances
I am running into an issue where "Standard_NC8as_T4_v3" and "Standard_NC4as_T4_v3" instances are behaving differently for a 30 GB custom Docker image, and I am a bit stumped. When using NC4 instances, I get a timeout, with the exact message shown below...
- 5351 Views
- 3 replies
- 2 kudos
Internal error. Attach your notebook to a different compute or restart the current compute. java.lan
Internal error. Attach your notebook to a different compute or restart the current compute. java.lang.RuntimeException: abort: DriverClient destroyed at com.databricks.backend.daemon.driver.DriverClient.$anonfun$poll$3(DriverClient.scala:577) at scala...
- 2 kudos
@Retired_mod yeah, one dataset has slightly more data points. The schemas are the same. When Spark crashed I checked the memory usage; it was around 50%.
- 3327 Views
- 1 reply
- 0 kudos
Resolved! Error: cannot create metastore data access
I'm in the process of enabling Databricks Unity Catalog and encountered a problem with the databricks_metastore_data_access Terraform resource: resource "databricks_metastore_data_access" "this" { provider = databricks.account-level metastore_id...
- 0 kudos
Found a solution. Please see my answer on Stack Overflow: https://stackoverflow.com/questions/77440091/databricks-unity-catalog-error-cannot-create-metastore-data-access/77506306#77506306
- 3034 Views
- 4 replies
- 2 kudos
Failed to start cluster: Large docker image
I have a large Docker image in our AWS ECR repo. The image is 27.4 GB locally and 11,539.79 MB compressed in ECR. The error from the Event Log is: Failed to add 2 containers to the compute. Will attempt retry: true. Reason: Docker image pull failure. JSON...
- 2 kudos
I have a similar problem. A 10 GB image pulls fine but a 31 GB image doesn't. Both workers and drivers have 64 GB memory. I get the timeout error with "Cannot launch the cluster because pulling the docker image failed. Please double check connectivity fr...
- 4740 Views
- 0 replies
- 0 kudos
Is it possible to change the Azure storage account of Unity Catalog?
We have a Unity Catalog metastore set up in storage account prod_1. Can we move it to the prod_2 storage account and delete prod_1? Also, is it possible to rename catalogs once they are created?
- 4505 Views
- 3 replies
- 2 kudos
Connect to databricks from external non-spark cluster
Hi, I have an app/service on a non-Spark Kubernetes cluster. Is there a way to access/query a Databricks service from my app/service? I see documentation on connectors, particularly for Scala, which is the language of my app/service. Can I use these connec...
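For querying a SQL warehouse from an external, non-Spark service, one option is the Databricks SQL connector. The thread's app is Scala (where the JDBC driver plays the same role), but here is a hedged Python sketch; the hostname, HTTP path, and token are placeholders:

```python
# Requires `pip install databricks-sql-connector`. Copy the hostname and
# HTTP path from the SQL warehouse's "Connection details" page.
from databricks import sql

with sql.connect(
    server_hostname="dbc-xxxxxxxx-xxxx.cloud.databricks.com",
    http_path="/sql/1.0/warehouses/xxxxxxxxxxxxxxxx",
    access_token="dapi...",
) as conn:
    with conn.cursor() as cursor:
        cursor.execute("SELECT current_catalog(), current_user()")
        print(cursor.fetchone())
```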
- 2137 Views
- 0 replies
- 0 kudos
Databricks deployment and automation tools comparison.
Hello All, As a newcomer to Databricks, I am seeking guidance on automation within Databricks environments. What are the best practices for deployment, and how do Terraform, the REST API, and the Databricks SDK compare in terms of advantages and...
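As a rough comparison point: Terraform suits declarative, version-controlled infrastructure; the REST API is the lowest-level option; the SDK wraps that same API for scripting. A minimal sketch of the Python SDK, assuming credentials come from environment variables or a configured profile:

```python
# Requires `pip install databricks-sdk`. Auth is resolved from the
# environment (DATABRICKS_HOST / DATABRICKS_TOKEN) or a config profile.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()
for cluster in w.clusters.list():
    print(cluster.cluster_name, cluster.state)
```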
- 1249 Views
- 1 reply
- 0 kudos
Notebook ID level of uniqueness
Hi there, we know that notebook IDs are unique (https://docs.databricks.com/en/workspace/workspace-details.html), but I want to know at what level they're unique. For example, are notebook IDs unique within a workspace, or are they universally unique...
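Since notebook IDs are workspace object IDs, one hedged way to inspect them is the Workspace API via the Python SDK; the notebook path below is a placeholder:

```python
# Sketch: look up a notebook's object metadata. The returned object_id
# should match the notebook ID visible in its URL.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()
info = w.workspace.get_status("/Users/someone@example.com/my-notebook")
print(info.object_id, info.object_type)
```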
- 1348 Views
- 0 replies
- 0 kudos
Ubuntu 22 ODBC Connectivity Issue with PHP - SQL error: [unixODBC][Driver Manager]Can't open lib
Dear Friends, I'm having trouble connecting to Databricks ODBC from Ubuntu 22. I followed the steps documented at https://docs.databricks.com/en/integrations/jdbc-odbc-bi.html#odbc-linux and here is my odbc.ini file: [ODBC Data Sources] Databricks=Datab...
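"Can't open lib" from the unixODBC Driver Manager usually means it cannot locate or load the driver's `.so` file. A hedged Python check that bypasses the DSN and points at the driver library directly; the driver path shown is the usual Simba Spark install location, and the host, HTTP path, and token are placeholders:

```python
# Requires `pip install pyodbc` plus the Simba Spark ODBC driver. Verify
# the driver path first, e.g. with `ls /opt/simba/spark/lib/64/`.
import pyodbc

conn = pyodbc.connect(
    "Driver=/opt/simba/spark/lib/64/libsparkodbc_sb64.so;"
    "Host=dbc-xxxxxxxx-xxxx.cloud.databricks.com;"
    "Port=443;"
    "HTTPPath=/sql/1.0/warehouses/xxxxxxxxxxxxxxxx;"
    "SSL=1;ThriftTransport=2;AuthMech=3;"
    "UID=token;PWD=dapi...;",
    autocommit=True,
)
print(conn.cursor().execute("SELECT 1").fetchone())
```

If this loads the library but the DSN still fails, the odbcinst.ini Driver path is the likely culprit; if the library itself fails to load, check its shared-library dependencies with `ldd`.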
- 1150 Views
- 0 replies
- 0 kudos
Support for setting R repository URLs in Databricks
The documentation for using R in Databricks states that the session can be configured by modifying the /usr/lib/R/etc/Rprofile.site file. This works for most things; however, the repository URLs set by the `repos` option are overridden by another scrip...
- 6445 Views
- 4 replies
- 1 kudos
Resolved! DBT job stuck when running on databricks
Hi, I'm trying to run a DBT job on a Databricks instance. The query should be run on the same instance. When I run the job, it gets to: Opening a new connection, currently in state init. It is stuck in that phase for a long time. I'm using an IP access list wh...
- 1 kudos
I recreated the Databricks workspace (there was no other way to solve it). If it had been a production workspace, it would have been a disaster! I have created a VM with a static public IP and added this IP to the IP access list. Hopefully it'll become the last resort i...
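The workaround in that reply (a static egress IP added to the allow list) can also be scripted. A hedged sketch via the Python SDK; the label and CIDR are placeholders, and it assumes the caller is a workspace admin with IP access lists enabled:

```python
# Add a static IP to the workspace allow list so an external runner
# (e.g. a dbt VM) can reach the workspace.
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.settings import ListType

w = WorkspaceClient()
w.ip_access_lists.create(
    label="dbt-runner-vm",
    ip_addresses=["203.0.113.10/32"],
    list_type=ListType.ALLOW,
)
```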
- 3054 Views
- 3 replies
- 0 kudos
Databricks Asset Bundles - config data needed by notebook
I have this structure: Folder-1 is the root of the Databricks Asset Bundle directory, and the "databricks.yaml" file is in this directory. Folder-1 / Folder-2 has notebooks. One of the notebooks, "test-notebook", is used for *job* configuration in the databricks.yaml file. Fo...
- 0 kudos
@GiggleByte Yes, based on a demo test that I have done, it is working as you said. The JSON-converted YAML config for the job setting needs to be placed under resources; that YAML holds the job config settings and looks similar to the REST API JSON request converted in f...
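One common way to get bundle config into the notebook itself is to pass it as a notebook task parameter and read it via widgets. A hedged sketch; the parameter name "env" is illustrative and would be declared under the task's base_parameters in the bundle's job definition:

```python
# In the notebook: read a parameter the bundle's job task passes in.
# The default value only applies when running the notebook interactively.
dbutils.widgets.text("env", "dev")
env = dbutils.widgets.get("env")
print(f"Running against environment: {env}")
```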