- 3381 Views
- 1 replies
- 0 kudos
Connect to Salesforce
Curious if there's a Databricks connector for Salesforce on AWS?
There is no "Databricks" connector like the one you have in Lakehouse Federation, e.g. for Snowflake. You can use the partner ecosystem, e.g. Fivetran (https://www.fivetran.com/connectors/salesforce), to integrate Salesforce data into your Lakehouse. You also have spa...
- 4449 Views
- 1 replies
- 1 kudos
How to allocate costs per SQL query?
By using System Tables (system.billing.usage) I'm able to identify DBU usage per query, but I'm not able to identify who ran each query because that isn't part of the table. I'm also aware of the query history, where all the queries and who ran them is listed...
Thanks @Retired_mod for the reply; however, query_id is not part of the system.billing.usage table, so there is no way to join them by IDs. What my Databricks account team suggested is to join them by timestamps, since both tables contain a column like that...
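A minimal sketch of that timestamp join, assuming the query history is exposed as a table (system.query.history is used here as an assumption, and its column names are illustrative rather than authoritative):

```python
# Hedged sketch: attribute each billing-usage record to queries whose
# start time falls inside the usage window on the same warehouse.
usage = spark.table("system.billing.usage")
history = spark.table("system.query.history")  # assumed table name

joined = usage.join(
    history,
    (usage.usage_metadata.warehouse_id == history.warehouse_id)   # assumed column
    & (history.start_time >= usage.usage_start_time)
    & (history.start_time < usage.usage_end_time),
    "left",
)
# executed_by / statement_text are illustrative column names.
joined.select("usage_quantity", "executed_by", "statement_text").show()
```

The overlap join is approximate by nature; concurrent queries in the same usage window will share the window's DBUs, so any per-query allocation still needs a weighting rule (e.g. by query duration).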
- 5534 Views
- 2 replies
- 0 kudos
NPIP tunnel setup failure during launch
In AWS, we hit the following error when spinning up a SQL warehouse or personal compute. Back-end PrivateLink is enabled. Error: NPIP tunnel setup failure during launch. Please try again later and contact Databricks if the problem persists. Instance boot...
Hello, thanks for contacting Databricks Support. The error message NPIP_TUNNEL_SETUP_FAILURE indicates that bootstrap failed due to network connectivity issues between the data plane and the control plane. It seems like you have already dow...
- 1936 Views
- 0 replies
- 0 kudos
Error running history command twice using DeltaTable
Hi, I'm using Unity Catalog on Azure with a Managed Identity connected to my Storage Account. I can read and write data without issues, and interact with data using PySpark and SQL. I can display the history by running a cell with the following code: fr...
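For context, a minimal sketch of the pattern the post describes; the truncated code above presumably starts with the DeltaTable import, and the table name below is a placeholder:

```python
from delta.tables import DeltaTable

# Hypothetical Unity Catalog table name, for illustration only.
dt = DeltaTable.forName(spark, "main.default.my_table")

# The first run typically succeeds; the post reports an error when the
# same cell is executed a second time.
dt.history().select("version", "timestamp", "operation").show(truncate=False)
```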
- 5488 Views
- 2 replies
- 1 kudos
How to monitor a python wheel job with Prometheus?
Hi Community, we have a Databricks job with a single Python wheel task that runs our streaming PySpark job. The job runs on a single-node compute cluster and consumes from Kafka. Our monitoring stack is Prometheus + Grafana. I want the job's metrics to ...
Hi, I'm trying to use the metrics registry object inside a UDF, but I can't because it's not serializable due to a Lock. Our goal is to be able to count the number of messages parsed, and the number of messages we can't parse (due to exceptio...
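One commonly used workaround, offered here as an assumption rather than the thread's confirmed fix: count with Spark accumulators, which are serializable, and export the totals to Prometheus from the driver instead of touching the registry inside the UDF.

```python
# Hedged sketch: accumulators cross the serialization boundary safely;
# the parse logic below is a stand-in for the real parser.
from pyspark.sql import functions as F
from pyspark.sql.types import StringType

parsed_ok = spark.sparkContext.accumulator(0)
parse_failed = spark.sparkContext.accumulator(0)

def parse_message(raw):
    try:
        value = raw.strip()      # placeholder for real parsing
        parsed_ok.add(1)
        return value
    except Exception:
        parse_failed.add(1)
        return None

parse_udf = F.udf(parse_message, StringType())
# On the driver, periodically read parsed_ok.value / parse_failed.value
# and set Prometheus gauges from there.
```

For Structured Streaming specifically, a StreamingQueryListener on the driver is another place to surface per-batch counters without serializing anything into executors.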
- 1546 Views
- 0 replies
- 0 kudos
Handling Kafka Topics with Avro Schema
Our input data resides in a Kafka topic, and we utilize the Kafka schema registry with Avro schemas. While I can retrieve the schema from the registry, I am facing challenges creating a Spark DataFrame that correctly serializes data for streaming rea...
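A hedged sketch of one common approach: if the producer used the Confluent wire format (1 magic byte plus a 4-byte schema id before the Avro payload), strip that 5-byte header and decode with from_avro. The broker, topic, and schema below are placeholders.

```python
from pyspark.sql import functions as F
from pyspark.sql.avro.functions import from_avro

# Illustrative schema; in practice this is the JSON schema string
# fetched from the schema registry.
avro_schema = '{"type":"record","name":"Event","fields":[{"name":"id","type":"string"}]}'

raw = (spark.readStream.format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")   # placeholder
       .option("subscribe", "events")                      # placeholder
       .load())

decoded = raw.select(
    from_avro(
        F.expr("substring(value, 6, length(value) - 5)"),  # drop 5-byte Confluent header
        avro_schema,
    ).alias("event")
)
```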
- 1096 Views
- 0 replies
- 0 kudos
Scalable API/binary lookups
We sometimes process large dataframes that contain a column of IP addresses and we need to associate an Autonomous System Number (ASN) per IP address. The ASN information is provided by MaxMind in the form of a binary data file only accessible via a ...
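A hedged sketch using the open-source maxminddb reader: open the .mmdb file lazily once per executor process and look up ASNs inside a pandas UDF. The DBFS path is a placeholder.

```python
import maxminddb
import pandas as pd
from pyspark.sql import functions as F

DB_PATH = "/dbfs/data/GeoLite2-ASN.mmdb"  # hypothetical location

_reader = None

def _get_reader():
    # Opened once per executor process and reused across batches.
    global _reader
    if _reader is None:
        _reader = maxminddb.open_database(DB_PATH)
    return _reader

@F.pandas_udf("long")
def asn_for_ip(ips: pd.Series) -> pd.Series:
    reader = _get_reader()
    def lookup(ip):
        rec = reader.get(ip)
        return rec.get("autonomous_system_number") if rec else None
    return ips.map(lookup).astype("Int64")

# Usage: df = df.withColumn("asn", asn_for_ip("ip_address"))
```

Keeping the reader per-process avoids reopening the binary file per row, which is usually the scalability bottleneck with this kind of lookup.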
- 2768 Views
- 0 replies
- 0 kudos
Auto-deleted workspace on GCP
Our production Databricks workspaces were auto-deleted when the subscription in our GCP account was canceled due to a system error. We have a backup of the GCS buckets that Databricks was running on (not workspace exports). Is it possible to recove...
- 2295 Views
- 1 replies
- 0 kudos
Databricks Private link connectivity for External SaaS Application
We need your guidance on completing the Private Link setup with a Customer who is in the same AWS region where our application is hosted. Our Customer has already enabled Private Link in their account and they are using custo...
Hi Databricks Support, we followed the instructions above, but we are getting an error when registering the endpoint in the customer's Databricks environment using "Registering Endpoint". Following is the error message we see: "INVALID_PARAMETER_VALUE" Endpoi...
- 2637 Views
- 1 replies
- 0 kudos
When to add Users, Groups, or SPs from Account to Workspace
Hi community, we are using Unity Catalog, SCIM, and Identity Federation, so we have users, groups, and service principals at the Account level. In what scenarios do users, groups, and service principals need to be explicitly added to a Workspace?
1. If you enable Unity Catalog in a workspace, users in that workspace may be able to access the same data that users in other workspaces in your account can access. Data guardians can control who has access to what data across all workspaces from on...
- 8686 Views
- 2 replies
- 3 kudos
System Tables Preview - retention period?
The new System Tables for billing, pricing & compute look really useful and easier to consume than getting the data via the APIs. However, I can't see in the documentation: does data only start being gathered when you turn them on, or is there immediately a hi...
@Retired_mod - We are a customer of Databricks. We have a Databricks Premium workspace with Unity Catalog enabled, and we also have legacy (non-Unity) workspaces. I can see history is available for all workspaces (Unity and non-Unity) in the same meta st...
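A quick way to check how far back the billing data actually reaches in a given metastore is to query the earliest usage_date (system.billing.usage and this column are documented; the answer will vary by account):

```python
# Inspect the retention actually present in this metastore.
spark.sql("""
    SELECT MIN(usage_date) AS earliest_day,
           MAX(usage_date) AS latest_day
    FROM system.billing.usage
""").show()
```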
- 2306 Views
- 0 replies
- 0 kudos
Destination Path of Cloned Notebooks
Hi, for my project I need to get the destination paths of cloned notebooks. But when I run the query to get them:

```sql
SELECT DISTINCT request_params.destinationPath
FROM system.access.audit
WHERE service_name = "notebook"
  AND action_name = 'cloneNotebook'
LIMIT...
```
- 20511 Views
- 4 replies
- 2 kudos
Get number of rows in delta lake table from metadata without count(*)
Hello folks, is there a way with a SQL query to get the count from Delta table metadata without doing count(*) on each table? Wondering if this information is stored in any of the INFORMATION_SCHEMA tables. I have a use case to get counts from 1000s of delt...
Here is a related one: https://community.databricks.com/t5/data-engineering/how-to-get-the-total-number-of-records-in-a-delta-table-from-the/td-p/20441
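For reference, DESCRIBE DETAIL reads file counts and sizes straight from the Delta log without scanning data; exact row counts still need COUNT(*), which recent Delta versions can often answer from per-file statistics in the log rather than a full scan. A sketch with illustrative table names:

```python
# Loop over tables and collect log-level metadata plus row counts.
tables = ["main.sales.orders", "main.sales.customers"]  # illustrative

for t in tables:
    detail = spark.sql(f"DESCRIBE DETAIL {t}").select("numFiles", "sizeInBytes").first()
    rows = spark.sql(f"SELECT COUNT(*) AS c FROM {t}").first().c
    print(t, detail.numFiles, detail.sizeInBytes, rows)
```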
- 2363 Views
- 3 replies
- 0 kudos
Disaster Recovery Issue
We are trying to create Disaster Recovery for UC-enabled workspaces in Azure. Our UC metastores are in different regions.
1. We are trying to use Deep Clone.
2. In the source we are adding the region-2 metastore as an external location.
3. We are able to do the deep clone.
Problem...
Right, I get it. Actually, cloning it as external seems logical to me for the moment, as Unity cannot manage the other metastore. For now I would go with cloning the data and then creating an external table over it. Not ideal, but at least you hav...
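A minimal sketch of the approach described above, with placeholder catalog names and storage paths: deep clone into an external location in the DR region, which the DR metastore can then reference as an external table.

```python
# DEEP CLONE copies data files as well as metadata; LOCATION makes the
# clone external so the other metastore can register a table over it.
spark.sql("""
    CREATE OR REPLACE TABLE dr_catalog.schema1.orders_clone
    DEEP CLONE prod_catalog.schema1.orders
    LOCATION 'abfss://dr-container@drstorageacct.dfs.core.windows.net/orders_clone'
""")
```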
- 1543 Views
- 0 replies
- 0 kudos
Source data for Raw layer - APIs vs Microservice
What does everyone think about ingesting source data into the raw layer via a microservice rather than directly from the source API?