- 2203 Views
- 1 reply
- 0 kudos
Databricks Private link connectivity for External SaaS Application
We need your guidance on completing a Private Link setup with a customer who is in the same AWS region as our application. Our customer has already enabled Private Link in their account and they are using custo...
- 0 kudos
Hi Databricks Support, we followed the instructions above, but we are getting an error when registering the endpoint in the customer's Databricks environment using "Registering Endpoint". This is the error message we see: "INVALID_PARAMETER_VALUE" Endpoi...
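For reference, a minimal sketch of registering an AWS VPC endpoint through the Databricks Account API (POST /api/2.0/accounts/{account_id}/vpc-endpoints), which can help rule out a malformed request. The error in the post is truncated, so where the registration fails is an assumption; the account ID, token, endpoint ID, and names below are placeholders:

```python
# Hedged sketch: register a customer-side AWS VPC endpoint via the
# Databricks Account API. All identifiers below are placeholders.
import requests

ACCOUNT_ID = "<databricks-account-id>"  # placeholder
TOKEN = "<account-admin-token>"         # placeholder

resp = requests.post(
    f"https://accounts.cloud.databricks.com/api/2.0/accounts/{ACCOUNT_ID}/vpc-endpoints",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "vpc_endpoint_name": "customer-frontend-endpoint",  # placeholder
        "aws_vpc_endpoint_id": "vpce-0123456789abcdef0",    # placeholder
        "region": "us-east-1",  # must match the VPC endpoint's actual region
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json())
```

A region value that does not match the endpoint's actual AWS region is one thing worth double-checking when INVALID_PARAMETER_VALUE comes back.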
- 2509 Views
- 1 reply
- 0 kudos
When to add Users Groups or SPs from Account to Workspace
Hi community, we are using Unity Catalog, SCIM and Identity Federation, so we have users, groups and service principals at the Account level. In what scenarios do users, groups and service principals need to be explicitly added to a Workspace?
- 0 kudos
1. If you enable Unity Catalog in a workspace, users in that workspace may be able to access the same data that users in other workspaces in your account can access. Data guardians can control who has access to what data across all workspaces from on...
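For context, with identity federation an account-level principal only gets access to a workspace once it is assigned to that workspace (directly or via a group). A minimal sketch using the workspace assignment API, assuming an AWS accounts host; all IDs and the token are placeholders:

```python
# Hedged sketch: assign an account-level principal (user, group, or service
# principal) to a workspace. All identifiers below are placeholders.
import requests

ACCOUNT_ID = "<account-id>"      # placeholder
WORKSPACE_ID = 1234567890        # placeholder
PRINCIPAL_ID = 987654321         # account-level principal ID (placeholder)
TOKEN = "<account-admin-token>"  # placeholder

resp = requests.put(
    f"https://accounts.cloud.databricks.com/api/2.0/accounts/{ACCOUNT_ID}"
    f"/workspaces/{WORKSPACE_ID}/permissionassignments/principals/{PRINCIPAL_ID}",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"permissions": ["USER"]},  # or ["ADMIN"]
    timeout=30,
)
resp.raise_for_status()
```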
- 8423 Views
- 2 replies
- 3 kudos
System Tables Preview - retention period?
The new System Tables for billing, pricing & compute look really useful and easier to consume than getting this via the APIs. However, I can't see in the documentation: does data only start being gathered when you turn them on, or is there immediately a hi...
- 3 kudos
@Retired_mod - We are a Databricks customer. We have a Databricks premium workspace with Unity Catalog enabled, and we also have legacy (non-Unity) workspaces. I can see history is available for all workspaces (Unity and non-Unity) in the same meta st...
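For anyone checking this on their own account, a small sketch that inspects how far back the billing system table reaches per workspace; it assumes a Databricks notebook where `spark` is predefined and follows the published system.billing.usage schema:

```python
# Hedged sketch: find the earliest and latest billing dates captured per
# workspace in the system tables.
earliest = spark.sql("""
    SELECT workspace_id,
           MIN(usage_date) AS earliest_usage,
           MAX(usage_date) AS latest_usage
    FROM system.billing.usage
    GROUP BY workspace_id
    ORDER BY workspace_id
""")
earliest.show(truncate=False)
```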
- 2235 Views
- 0 replies
- 0 kudos
Destination Path of Cloned Notebooks
Hi, for my project I need to get the destination paths of cloned notebooks. But when I run the query to get them: SELECT DISTINCT request_params.destinationPath FROM system.access.audit WHERE service_name = "notebook" AND action_name = 'cloneNotebook' LIMIT...
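The post is truncated before the actual error, but for reference here is the same query with whitespace restored and the map access made explicit; single-quoted string literals avoid ambiguity if double-quoted identifiers are enabled (notebook `spark` assumed):

```python
# Hedged sketch: distinct destination paths of cloned notebooks from the
# audit system table. request_params is a string map, so bracket access
# is the unambiguous form.
paths = spark.sql("""
    SELECT DISTINCT request_params['destinationPath'] AS destination_path
    FROM system.access.audit
    WHERE service_name = 'notebook'
      AND action_name = 'cloneNotebook'
    LIMIT 100
""")
paths.show(truncate=False)
```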
- 19567 Views
- 4 replies
- 2 kudos
Get number of rows in delta lake table from metadata without count(*)
Hello folks, is there a way with a SQL query to get the count from Delta table metadata without doing count(*) on each table? Wondering if this information is stored in any of the INFORMATION_SCHEMA tables. I have a use-case to get counts from 1000's of delt...
- 2 kudos
Here is a related one: https://community.databricks.com/t5/data-engineering/how-to-get-the-total-number-of-records-in-a-delta-table-from-the/td-p/20441
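One practical angle, sketched below: DESCRIBE DETAIL exposes numFiles and sizeInBytes but not row counts, while on recent Delta Lake / Databricks Runtime versions an unfiltered COUNT(*) is typically answered from per-file statistics in the transaction log rather than a data scan, so looping over many tables stays cheap. Table names are placeholders and the optimization assumes statistics were collected on write:

```python
# Hedged sketch: collect row counts for a list of Delta tables. On recent
# Delta versions an unfiltered COUNT(*) is served from transaction-log
# statistics, not a full file scan.
tables = ["main.sales.orders", "main.sales.customers"]  # placeholders

counts = {t: spark.sql(f"SELECT COUNT(*) AS c FROM {t}").first()["c"]
          for t in tables}

for name, n in counts.items():
    print(name, n)
```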
- 2236 Views
- 3 replies
- 0 kudos
Disaster Recovery Issue
We are trying to create disaster recovery for UC-enabled workspaces in Azure. Our UC metastores are in different regions.
1. We are trying to use Deep Clone.
2. In the source we are adding the region-2 metastore as an external location.
3. We are able to do the deep clone.
Problem...
- 0 kudos
Right, I get it. Cloning it as external seems logical to me for the moment, as Unity cannot manage the other metastore. I would go with cloning the data and then creating an external table on top of it. Not ideal, but at least you hav...
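A minimal sketch of the external deep-clone approach described in the reply, assuming the DR path is already registered as an external location in the target metastore; catalog, schema, table, and storage names are placeholders:

```python
# Hedged sketch: deep-clone a production table to DR-region storage and
# register the copy as an external table there. All names are placeholders.
spark.sql("""
    CREATE OR REPLACE TABLE dr_catalog.sales.orders_clone
    DEEP CLONE prod_catalog.sales.orders
    LOCATION 'abfss://dr@drstorageacct.dfs.core.windows.net/sales/orders_clone'
""")
```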
- 1467 Views
- 0 replies
- 0 kudos
Source data for Raw layer - APIs vs Microservice
What does everyone think about ingesting source data to the raw layer via a microservice rather than directly from the source API?
- 2404 Views
- 1 reply
- 0 kudos
Error when trying to create a cluster in Databricks
We are trying to create the cluster within the Databricks workspace, but it is generating the attached error.
- 3481 Views
- 1 reply
- 1 kudos
Tie Parquet files in Azure ADLS to Databricks table
Hello all, I have a Databricks Delta table with files residing in Azure Data Lake. I understand that when I create a table and load data from Databricks, it creates the respective folder and files for the table in ADLS. I am wondering if there is a reverse way to ...
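The question truncates, but assuming it asks how to register files that already sit in ADLS as a table, a small sketch; the path and table names are placeholders:

```python
# Hedged sketch: point a table at files that already exist in ADLS.
path = "abfss://data@mystorageacct.dfs.core.windows.net/tables/orders"  # placeholder

# Files already in Delta format: register an external table at the location.
spark.sql(f"CREATE TABLE IF NOT EXISTS main.sales.orders USING DELTA LOCATION '{path}'")

# Plain Parquet files: convert in place (metadata only), then register as above.
# spark.sql(f"CONVERT TO DELTA parquet.`{path}`")
```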
- 7521 Views
- 1 reply
- 0 kudos
service principal table accesses not showing up in system.audit
When we run jobs using service principals, system.audit doesn't show any table accesses (getTable). Volume access (getVolume) shows up for service principals. The same query, when run as a user, shows up in system.audit. I know system.audit is in public preview. W...
- 0 kudos
Hi @Retired_mod, thanks so much for your reply! I was referring to https://docs.databricks.com/en/administration-guide/system-tables/audit-logs.html which is part of the Databricks core offering and isn't related to ServiceNow's offering. I am assuming t...
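To reproduce the comparison, a sketch that pulls recent getTable/getVolume events for a single principal; it follows the documented system.access.audit columns, and the filter value is a placeholder (for service principals, user_identity.email typically carries the application ID):

```python
# Hedged sketch: recent table/volume access events for one principal.
rows = spark.sql("""
    SELECT event_time, user_identity.email, service_name, action_name
    FROM system.access.audit
    WHERE action_name IN ('getTable', 'getVolume')
      AND user_identity.email = '<sp-application-id>'  -- placeholder
    ORDER BY event_time DESC
    LIMIT 50
""")
rows.show(truncate=False)
```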
- 3065 Views
- 1 reply
- 1 kudos
Three-level namespace naming standard
Hi all, I have not been successful in getting a good grip on the naming conventions for the three-level namespace. Initially I learned about bronze, silver and gold, but I am confused where to put this. The obvious choice may be to use the {catalog}...
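One common reading of the truncated suggestion, sketched as an assumption rather than a recommendation: an environment-scoped catalog with the medallion layers as schemas, i.e. {catalog}.{layer}.{table}. All names below are illustrative:

```python
# Hedged sketch: medallion layers as schemas inside an environment catalog.
spark.sql("CREATE CATALOG IF NOT EXISTS dev")
spark.sql("CREATE SCHEMA IF NOT EXISTS dev.bronze")
spark.sql("""
    CREATE TABLE IF NOT EXISTS dev.bronze.raw_orders (
        id BIGINT,
        payload STRING
    )
""")
```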
- 5214 Views
- 4 replies
- 2 kudos
Internal error: Attach your notebook to a different compute or restart the current compute.
I am currently using a Personal Compute cluster [13.3 LTS (includes Apache Spark 3.4.1, Scala 2.12)] on GCP attached to a notebook. After running a few command lines without an issue, I end up getting this error: Internal error. Attach your notebook...
- 2089 Views
- 0 replies
- 0 kudos
"Standard_NC8as_T4_v3" and "Standard_NC4as_T4_v3" instances
I am running into an issue where "Standard_NC8as_T4_v3" and "Standard_NC4as_T4_v3" instances are behaving differently for a 30 GB custom Docker image, and I am a bit stumped. When using NC4 instances, I get a timeout, with the exact message shown below...
- 4324 Views
- 1 reply
- 0 kudos
Resolved! Error: cannot create metastore data access
I'm in the process of enabling Databricks Unity Catalog and encountered a problem with the databricks_metastore_data_access Terraform resource:
resource "databricks_metastore_data_access" "this" {
  provider = databricks.account-level
  metastore_id...
- 0 kudos
Found a solution. Please see my answer on Stack Overflow: https://stackoverflow.com/questions/77440091/databricks-unity-catalog-error-cannot-create-metastore-data-access/77506306#77506306
- 4267 Views
- 4 replies
- 2 kudos
Failed to start cluster: Large docker image
I have a large Docker image in our AWS ECR repo. The image is 27.4 GB locally and 11539.79 MB compressed in ECR. The error from the Event Log is: Failed to add 2 containers to the compute. Will attempt retry: true. Reason: Docker image pull failure JSON...
- 2 kudos
I have a similar problem. A 10 GB image pulls fine but a 31 GB image doesn't. Both workers and drivers have 64 GB memory. I get the timeout error with "Cannot launch the cluster because pulling the docker image failed. Please double check connectivity fr...
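For reference, a sketch of the Databricks Container Services cluster spec used with an ECR image via the Clusters API; it only shows the configuration shape and does not by itself address pull timeouts for very large images. Host, token, instance profile, and image URL are placeholders:

```python
# Hedged sketch: create a cluster that pulls a custom Docker image from ECR.
# All identifiers below are placeholders.
import requests

HOST = "https://<workspace-host>"  # placeholder
TOKEN = "<personal-access-token>"  # placeholder

resp = requests.post(
    f"{HOST}/api/2.0/clusters/create",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "cluster_name": "large-image-test",
        "spark_version": "13.3.x-scala2.12",
        "node_type_id": "i3.xlarge",
        "num_workers": 1,
        # Instance profile that is allowed to pull from the ECR repo.
        "aws_attributes": {"instance_profile_arn": "<instance-profile-arn>"},
        "docker_image": {"url": "<acct>.dkr.ecr.us-east-1.amazonaws.com/myrepo:tag"},
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["cluster_id"])
```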