Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

Nathant93
New Contributor III
  • 3136 Views
  • 1 reply
  • 0 kudos

SQL Server OUTPUT clause alternative

After a merge or insert has happened, I am looking to get the records in that batch that had been inserted via either method, much like the OUTPUT clause in SQL Server. Does anyone have any suggestions? The only thing I can think of is to add a time...

Latest Reply
Nathant93
New Contributor III
  • 0 kudos

I've managed to do it like this:
qry = spark.sql(f"DESCRIBE history <table_name> limit 1").collect()
current_version = int(qry[0][0])
prev_version = current_version - 1
Then do an EXCEPT statement between the versions.

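The reply's version-diff trick can be sketched as below, assuming a Delta table with time travel enabled; the table name and helper functions are hypothetical, and the Spark calls assume a Databricks runtime.

```python
# Diff two Delta table versions to recover the rows the last MERGE/INSERT
# batch added, mimicking SQL Server's OUTPUT clause.

def batch_diff_query(table: str, current_version: int) -> str:
    """Build a time-travel EXCEPT query returning rows present in the
    current table version but not in the previous one."""
    prev_version = current_version - 1
    return (
        f"SELECT * FROM {table} VERSION AS OF {current_version} "
        f"EXCEPT SELECT * FROM {table} VERSION AS OF {prev_version}"
    )

def rows_added_by_last_write(spark, table: str):
    # DESCRIBE HISTORY lists the newest version first, so LIMIT 1
    # yields the current version number in the first column.
    current_version = int(
        spark.sql(f"DESCRIBE HISTORY {table} LIMIT 1").collect()[0][0]
    )
    return spark.sql(batch_diff_query(table, current_version))
```

Note one caveat of this approach: EXCEPT deduplicates, so if the batch inserted rows identical to existing ones they will not appear in the diff.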
KNYSJOA
New Contributor
  • 3770 Views
  • 4 replies
  • 0 kudos

SDK Workspace client HTTP Connection Pool

Hello. Do you know how to solve an issue with the HTTPSConnectionPool when using the SDK WorkspaceClient in a notebook via a workflow? I would like to trigger a job when some conditions are met. These conditions are evaluated using Python. I am using the SDK to trigge...

Latest Reply
Dribka
New Contributor III
  • 0 kudos

It seems like the issue you're facing with the HTTPSConnectionPool in the SDK WorkspaceClient when using it within a workflow may be related to the environment variables or credentials not being propagated correctly. When running the notebook manuall...

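One way to act on the reply's diagnosis is to resolve the host and token explicitly in the job environment and pass them to the client, rather than relying on ambient config resolution. A minimal sketch, assuming the Databricks SDK's standard environment-variable names; `make_client` is a hypothetical helper:

```python
import os

def resolve_credentials(env: dict) -> tuple:
    """Fail fast with a clear error if the job run did not inherit
    the credentials the notebook had when run manually."""
    host = env.get("DATABRICKS_HOST")
    token = env.get("DATABRICKS_TOKEN")
    if not host or not token:
        raise RuntimeError(
            "DATABRICKS_HOST/DATABRICKS_TOKEN not set in the job environment"
        )
    return host, token

def make_client():
    # Imported lazily so resolve_credentials stays usable without the SDK.
    from databricks.sdk import WorkspaceClient
    host, token = resolve_credentials(dict(os.environ))
    return WorkspaceClient(host=host, token=token)
```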
3 More Replies
Deexith
New Contributor
  • 5737 Views
  • 3 replies
  • 0 kudos

Getting this error in logs: "StatusLogger unable to locate configured LoggerContextFactory", though I am able to connect with the Databricks DB and retrieve the data for a MuleSoft integration

ERROR StatusLogger Unable to locate configured LoggerContextFactory org.mule.runtime.module.launcher.log4j2.MuleLog4jContextFactory
ERROR StatusLogger Unable to load class org.apache.logging.log4j.core.config.xml.XmlConfigurationFactory
java.lang.Class...

Latest Reply
DataBricks1565
New Contributor II
  • 0 kudos

Hi @Uppala Deexith, any update on how you fixed this issue would be greatly appreciated.

2 More Replies
CKBertrams
New Contributor III
  • 2068 Views
  • 2 replies
  • 2 kudos

Resolved! Stream failure notifications

Hi all, I have a job running three consecutive streams; when just one of them fails I want to get notified. The notification only triggers when all tasks have failed or are skipped/canceled. Does anyone have a suggestion on how to implement this?

Latest Reply
deng_dev
New Contributor III
  • 2 kudos

Hi! You can add notifications directly on tasks.

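The task-level idea in the reply can be sketched as follows: in a Jobs API 2.1 job spec, each task may carry its own `email_notifications` block, so a single failing stream task alerts immediately instead of waiting for the whole job to finish. Task names and the address here are hypothetical.

```python
def with_failure_alert(task: dict, emails: list) -> dict:
    """Return a copy of a task spec with an on-failure email alert."""
    task = dict(task)
    task["email_notifications"] = {"on_failure": emails}
    return task

# Three stream tasks, each alerting independently on failure.
job_spec = {
    "name": "three-streams",
    "tasks": [
        with_failure_alert({"task_key": f"stream_{i}"}, ["oncall@example.com"])
        for i in (1, 2, 3)
    ],
}
```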
1 More Replies
Kayla
Valued Contributor II
  • 2578 Views
  • 1 reply
  • 0 kudos

Clusters Suddenly Failing - java.lang.RuntimeException: abort: DriverClient destroyed

Clusters that we've been using without issue for weeks are now randomly failing. We're able to run a handful of cells and then get an error: "java.lang.RuntimeException: abort: DriverClient destroyed". Has anyone run into this before? Edit: I ...

nag_kanchan
New Contributor III
  • 1012 Views
  • 0 replies
  • 0 kudos

Applying SCD in DLT using 3 different tables at source

My organization has recently started using Delta Live Tables in Databricks for data modeling. One of the dimensions I am trying to model takes data from 3 existing tables in the data lake and needs to be a slowly changing dimension (SCD Type 1). This a...

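A dimension like the one described above is commonly expressed in DLT by unioning the source tables into one staging view and feeding it to `dlt.apply_changes` with `stored_as_scd_type=1`. A minimal sketch, assuming three compatible source tables; all table, key, and column names are hypothetical:

```python
def scd1_config(target: str, source: str, keys: list, sequence_col: str) -> dict:
    """Keyword arguments for dlt.apply_changes, kept separate so they
    can be inspected without a DLT runtime."""
    return {
        "target": target,
        "source": source,
        "keys": keys,
        "sequence_by": sequence_col,
        "stored_as_scd_type": 1,
    }

def register_dimension():
    # Only importable/runnable inside a DLT pipeline; `spark` is
    # provided by the pipeline runtime.
    import dlt

    @dlt.view(name="customer_staging")
    def customer_staging():
        # Union the three source tables into one change feed.
        a = spark.table("lake.customers_a")
        b = spark.table("lake.customers_b")
        c = spark.table("lake.customers_c")
        return (a.unionByName(b, allowMissingColumns=True)
                 .unionByName(c, allowMissingColumns=True))

    dlt.create_streaming_table("dim_customer")
    dlt.apply_changes(**scd1_config(
        "dim_customer", "customer_staging", ["customer_id"], "updated_at"))
```

With Type 1, the latest record per key (ordered by `sequence_by`) simply overwrites the previous one; no history rows are kept.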
Magnus
Contributor
  • 4070 Views
  • 1 reply
  • 1 kudos

FIELD_NOT_FOUND when selecting field not part of original schema

Hi, I'm implementing a DLT pipeline using Auto Loader to ingest JSON files. The JSON files contain an array called Items that contains records, and two of the fields in the records weren't part of the original schema but were added later. Auto Loa...

Data Engineering
Auto Loader
Delta Live Tables
choi_2
New Contributor II
  • 45702 Views
  • 1 reply
  • 0 kudos

Maintaining cluster and databases in Databricks Community Edition

I am using the Databricks Community Edition, but the cluster usage is limited to 2 hours and it automatically terminates. So I have to attach the cluster every time to run the notebook again. As I read other discussions, I learned it is not something...

Data Engineering
communityedition
Feather
New Contributor III
  • 10181 Views
  • 12 replies
  • 9 kudos

Resolved! DLT pipeline MLFlow UDF error

I am running this notebook via the DLT pipeline in preview mode. Everything works up until the predictions table that should be created with a registered model inferencing the gold table. This is the error: com.databricks.spark.safespark UDFException...

Latest Reply
BarryC
New Contributor III
  • 9 kudos

Hi @Feather, have you also tried specifying the version of the library?

11 More Replies
oosterhuisf
New Contributor II
  • 2116 Views
  • 1 reply
  • 0 kudos

Break production using a shallow clone

Hi, if you create a shallow clone using the latest LTS and drop the clone using a SQL warehouse (either current or preview), the source table is broken beyond repair. Data reads and writes still work, but VACUUM will remain forever broken. I've attac...

Latest Reply
oosterhuisf
New Contributor II
  • 0 kudos

To add to that: the manual does not state that this might happen.

icyflame92
New Contributor II
  • 12227 Views
  • 2 replies
  • 1 kudos

Resolved! Access storage account with private endpoint

Hi, I need guidance on connecting Databricks (not VNET injected) to a storage account with a Private Endpoint. We have a client who created Databricks with a public IP and not VNET injected. It's using a managed VNET in the Databricks managed resource g...

Data Engineering
ADLS
azure
Latest Reply
rudyevers
New Contributor III
  • 1 kudos

No, this is not possible, because the workspace is not part of the virtual network and therefore cannot access the storage over its private endpoint. It is all mentioned in the documentation: https://www.databricks.com/blog/2020/02/28/securely-access...

1 More Replies
jx1226
New Contributor III
  • 2984 Views
  • 0 replies
  • 0 kudos

Connect to storage with private endpoint from workspace EnableNoPublicIP=No and VnetInjection=No

We know that Databricks with VNET injection (our own VNET) allows us to connect to Blob Storage/ADLS Gen2 over private endpoints and peering. This is what we typically do. We have a client who created Databricks with EnableNoPublicIP=No (secure clust...

grazie
Contributor
  • 3692 Views
  • 2 replies
  • 0 kudos

Azure Databricks, migrating delta table data with CDF on.

We are on Azure Databricks over ADLS Gen2 and have a set of tables and workflows that process data from and between those tables, using change data feeds. (We are not yet using Unity Catalog or the Hive metastore, just accessing Delta tables f...

Latest Reply
grazie
Contributor
  • 0 kudos

As it turns out, due to a misunderstanding, the responses from Azure support were answering a slightly different question (about Azure Table Storage instead of Delta Tables on Blob/ADLS Gen2), so we'll try there again. However, still interested in id...

1 More Replies
