Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

by ClintHall, New Contributor II
  • 324 Views
  • 2 replies
  • 1 kudos

Resolved! Error filtering by datetime Lakehouse Federated SQL Server table

In Unity Catalog, I have a connection to a SQL Server database. When I try to filter by a datetime column using a datetime with fractional seconds, Databricks gives me this error: Job aborted due to stage failure: com.microsoft.sqlserver.jdbc.SQLServe...

Latest Reply
ClintHall
New Contributor II

Thanks, @Isi. Very helpful. It would be nice if Lakehouse Federation would do this for us (the same way that it knows SQL Server uses ISNULL where Spark SQL uses NVL). Is there a way to bring it to the devs' attention?
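
Isi's full suggestion is truncated above, so the following is a hedged illustration only, not necessarily the advice that was given: one way to avoid the failure is to keep fractional seconds out of the literal pushed down to SQL Server, for example by rounding the filter boundary to whole seconds on the Spark side. Table and column names are placeholders.

# Sketch, assuming a federated SQL Server table registered in Unity Catalog.
from datetime import datetime

cutoff = datetime(2025, 1, 1, 12, 0, 0)  # whole seconds only, no microseconds

df = (
    spark.table("sqlserver_conn.dbo.events")  # hypothetical federated table
    .filter(f"event_ts >= '{cutoff.isoformat(sep=' ')}'")
)
df.display()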

1 More Replies
by juanjomendez96, Contributor
  • 355 Views
  • 2 replies
  • 3 kudos

Resolved! Update Databricks App compute

Hello community! I have been using the new Databricks feature 'Databricks Apps' for a while. It's incredible how much effort and time we have saved by using Databricks Apps to deploy our dashboards instead of deploying them directly in our...

Latest Reply
HariSankar
Contributor III

Hey @juanjomendez96, you've explained this really well, and yes, what you're experiencing is currently one of the main limitations of Databricks Apps. Right now, these apps run on fixed managed compute controlled by Databricks. That means we, as users...

1 More Replies
by Hari_P, New Contributor II
  • 259 Views
  • 2 replies
  • 2 kudos

Sharing Databricks Notebook Functionality Without Revealing Source Code

Hi all, I have a unique scenario in Databricks and would appreciate your insights. I've developed functionality in Databricks notebooks, and I'd like to share this with other developers within the same workspace. My goal is to allow colleagues to impor...

Latest Reply
Isi
Honored Contributor III

Hey @Hari_P, I believe this doesn't exist today as a built-in feature. I reviewed the Databricks notebook permission model (docs link) and with the minimum level ("CAN READ") users already have access to view the notebook's source. The simplest and m...
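
Isi's answer is truncated above; one common packaging pattern (my own sketch with placeholder names, not necessarily the rest of that answer) is to move the shared functions out of notebooks into a Python wheel that colleagues install and import, so they never open the source notebook.

# setup.py for a hypothetical shared_utils package, built with setuptools.
from setuptools import setup, find_packages

setup(
    name="shared_utils",       # placeholder package name
    version="0.1.0",
    packages=find_packages(),  # picks up shared_utils/ with your functions
)

# Build the wheel and install it on the cluster (paths are placeholders):
#   python setup.py bdist_wheel
#   %pip install /Volumes/main/default/libs/shared_utils-0.1.0-py3-none-any.whl
#
# Colleagues then call the functionality without viewing the notebook:
#   from shared_utils import my_function  # hypothetical function

Note that a plain wheel still ships .py files, so this hides the notebook rather than truly protecting the source; compiled or obfuscated artifacts would be needed for that.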

1 More Replies
by Pratikmsbsvm, Contributor
  • 1797 Views
  • 3 replies
  • 1 kudos

Resolved! How to read and write data between 2 separate instances of Databricks

How can I read and write data between 2 separate instances of Databricks? I want to have bi-directional data read and write between Databricks A and Databricks B. They are not in the same instance. Please help.

Latest Reply
nayan_wylde
Esteemed Contributor

Here are some patterns that you can utilize: 1. If the workspaces are in different Databricks accounts or different Azure regions, the recommended approach is Delta Sharing. It is the simplest, most governed way to let A read B's tables and B read A's tables...
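
As a rough sketch of that Delta Sharing pattern (the share, recipient, table, and identifier values below are placeholders, and Databricks-to-Databricks sharing assumes Unity Catalog on both sides):

# On workspace B (the provider): expose a table through a share.
spark.sql("CREATE SHARE IF NOT EXISTS sales_share")
spark.sql("ALTER SHARE sales_share ADD TABLE b_catalog.sales.orders")

# Register workspace A as a recipient, using A's metastore sharing identifier.
spark.sql(
    "CREATE RECIPIENT IF NOT EXISTS workspace_a "
    "USING ID 'azure:westus2:12345678-aaaa-bbbb-cccc-123456789abc'"  # placeholder
)
spark.sql("GRANT SELECT ON SHARE sales_share TO RECIPIENT workspace_a")

# On workspace A (the consumer): mount the share as a local catalog.
# spark.sql("CREATE CATALOG b_shared USING SHARE provider_b.sales_share")

Repeating the same setup in the other direction gives the bi-directional access the question asks for.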

2 More Replies
by Hritik_Moon, New Contributor III
  • 666 Views
  • 6 replies
  • 3 kudos

Dynamic value input to a job

How do I pass a dynamic value to a Databricks job? I created a notebook that extracts the names of files in the catalog, and I want to pass these names as parameters to another notebook task in a job. What are the ways I can do this?
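
One standard mechanism for this (a sketch with placeholder paths and task names; the replies later in the thread may differ) is dbutils.jobs.taskValues, which lets an upstream notebook task publish values that a downstream task reads:

# Upstream notebook task (named "list_files" in the job): publish file names.
files = [f.name for f in dbutils.fs.ls("/Volumes/main/default/raw")]  # placeholder path
dbutils.jobs.taskValues.set(key="file_names", value=files)

# Downstream notebook task: read what the upstream task published.
file_names = dbutils.jobs.taskValues.get(
    taskKey="list_files",  # must match the upstream task name
    key="file_names",
    default=[],
)
for name in file_names:
    print(name)

The same values can also be referenced in downstream task parameters with the {{tasks.list_files.values.file_names}} dynamic value syntax.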

Latest Reply
Hritik_Moon
New Contributor III

Hello, I think you have posted the wrong picture for notebook1. Could you please verify once? I made some changes and it's working now. Thanks a lot.

5 More Replies
by cpollock, New Contributor III
  • 512 Views
  • 3 replies
  • 1 kudos

Resolved! Getting NO_TABLES_IN_PIPELINE error in Lakeflow Declarative Pipelines

Yesterday (10/1), starting around 12 PM EST, we started getting the following error in our Lakeflow Declarative Pipelines (LDP) process. We get this in environments where none of our code has changed. I found some info on the serverless compute abou...

Latest Reply
saurabh18cs
Honored Contributor II

Hi @cpollock, check the "Event log" and "Pipeline logs" in the Databricks UI for any clues. Also, could you please share the screenshot pasted directly into the window? The attachment isn't really working and only shows as scanning.

2 More Replies
by tushar_bansal, Contributor
  • 1296 Views
  • 21 replies
  • 17 kudos

Resolved! Copy text from the integrated Web terminal

How do I copy text from the integrated web terminal? The selection goes away as soon as I lift my finger from the mouse.

Latest Reply
tushar_bansal
Contributor

An update here. I raised a ticket and found out this is because tmux mode is enabled by default in the web terminal. You can disable tmux mode by adding `export DISABLE_TMUX=true` to the ~/.bashrc of the compute. When I asked them about the defaul...

20 More Replies
by Ritesh-Dhumne, New Contributor III
  • 239 Views
  • 3 replies
  • 1 kudos

Resolved! Dynamic value input to a job in community free edition

How do I pass a dynamic value to a Databricks job? I created a notebook that extracts the names of files in the catalog, and I want to pass these names as parameters to another notebook task in a job.

Latest Reply
Ritesh-Dhumne
New Contributor III

Thank you for the response. Will this work in the Free Edition?

2 More Replies
by Akshay_Petkar, Valued Contributor
  • 429 Views
  • 1 reply
  • 3 kudos

Resolved! How to send automated emails from Databricks notebooks based on conditions or events?

Hi everyone, I'm currently exploring how to replicate something similar to Alteryx Email Activity within Databricks. Basically, I want to send automated emails to specific users when certain conditions or events occur in a notebook workflow, for exampl...

Latest Reply
HariSankar
Contributor III

Hey @Akshay_Petkar, this is something a lot of people try to do when they move workflows from Alteryx or SSIS into Databricks. There isn't a direct "Email Activity" node like in Alteryx, but you can definitely set up automated email notifications in a...
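
HariSankar's full list of options is truncated above; as one illustrative option only, here is plain SMTP from a notebook, with placeholder host, credentials, addresses, and trigger condition:

# Sketch: send an email from a notebook when a condition is met.
import smtplib
from email.message import EmailMessage

row_count = spark.table("main.default.orders").count()  # hypothetical check

if row_count == 0:  # the event that should trigger the alert
    msg = EmailMessage()
    msg["Subject"] = "Alert: orders table is empty"
    msg["From"] = "alerts@example.com"
    msg["To"] = "team@example.com"
    msg.set_content(f"Row count was {row_count} at run time.")

    with smtplib.SMTP("smtp.example.com", 587) as server:  # placeholder host
        server.starttls()
        server.login("alerts@example.com", dbutils.secrets.get("smtp", "password"))
        server.send_message(msg)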

by Anonymous, Not applicable
  • 5847 Views
  • 7 replies
  • 5 kudos

COPY INTO command cannot recognise MAP type value from JSON file

I have a Delta table in Databricks with a single column of type map<string, string>, and I have a data file in JSON format created by Hive 3 for the table with the column of the same type. I want to load data from the file into the Databricks table using COPY IN...

Latest Reply
Y-I
New Contributor II

Use from_json(to_json({struct column}), {your schema definition}). For example: COPY INTO {table} FROM (SELECT from_json(to_json({struct column}), 'MAP<STRING, STRING>') FROM {path}) ...
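
Filling in Y-I's placeholders with hypothetical names (the table, column, and path below are assumptions, not from the thread), the full statement would look roughly like:

# Sketch: COPY INTO with the from_json(to_json(...)) cast for a MAP column.
spark.sql("""
    COPY INTO main.default.map_table
    FROM (
        SELECT from_json(to_json(props), 'MAP<STRING, STRING>') AS props
        FROM 'dbfs:/tmp/hive_json_export/'
    )
    FILEFORMAT = JSON
""")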

6 More Replies
by Abdul_Alikhan, New Contributor II
  • 1061 Views
  • 4 replies
  • 2 kudos

Resolved! In Databricks Free Edition, serverless compute is not working

I recently logged into Databricks Free Edition, but serverless compute is not working. I'm receiving the error: 'An error occurred while trying to attach serverless compute. Please try again or contact support.'

Latest Reply
LonaOsmani
New Contributor III

Hi @Abdul_Alikhan, I experienced the same yesterday when I imported some of my notebooks. I noticed that this error only appeared for imported notebooks because the environment version was 1 by default. Changing the environment version to 2 solved th...

3 More Replies
by MarkV, New Contributor III
  • 1129 Views
  • 3 replies
  • 0 kudos

DLT Runtime Values

When my pipeline runs, I need to query a table in the pipeline before I actually create another table. I need to know the target catalog and target schema for the query. I figured the notebook might run automatically in the context of the cata...

Latest Reply
SparkJun
Databricks Employee

Can you set up notebook parameters and pass them in the DLT pipeline? https://docs.databricks.com/en/jobs/job-parameters.html
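
A closely related standard mechanism, shown here as a minimal sketch with placeholder keys and table names: DLT pipeline settings accept arbitrary configuration entries that the pipeline notebook can read with spark.conf.get, which covers the target catalog/schema case from the question.

# Pipeline settings (JSON) would carry hypothetical keys such as:
#   "configuration": {"target.catalog": "main", "target.schema": "silver"}
import dlt

catalog = spark.conf.get("target.catalog")  # read from pipeline configuration
schema = spark.conf.get("target.schema")

@dlt.table(name="enriched")
def enriched():
    # Query an existing table in the resolved target location before
    # producing the new one.
    return spark.table(f"{catalog}.{schema}.lookup")  # hypothetical source table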

2 More Replies
by pokus, New Contributor III
  • 9551 Views
  • 3 replies
  • 2 kudos

Resolved! Use the DeltaLog class on a Databricks cluster

I need to use the DeltaLog class in my code to get the AddFiles dataset. I have to keep the implemented code in a repo and run it on a Databricks cluster. Some docs say to use the org.apache.spark.sql.delta.DeltaLog class, but it seems Databricks gets rid of ...

Latest Reply
NandiniN
Databricks Employee

Hi @pokus, you don't need to access it via reflection. You can access DeltaLog with spark._jvm: Unity Catalog and Delta Lake tables expose their metadata and transaction log via the JVM backend. Using spark._jvm, you can interact with DeltaLog. Thanks!

2 More Replies
by Nasd_, New Contributor II
  • 1031 Views
  • 3 replies
  • 2 kudos

Resolved! Accessing DeltaLog and OptimisticTransaction from PySpark

Hi community, I'm exploring ways to perform low-level, programmatic operations on Delta tables directly from a PySpark environment. The standard delta.tables.DeltaTable Python API is excellent for high-level DML, but it seems to abstract away the core ...

Latest Reply
NandiniN
Databricks Employee

To access the Databricks pre-installed package, use spark._jvm.com.databricks.sql.transaction.tahoe.DeltaLog; org.apache.spark.sql.delta.DeltaLog would be the OSS jar's class name.
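
A minimal sketch of that access pattern, assuming the OSS DeltaLog method names carry over to the Databricks fork (the table path is a placeholder, and these are internal classes whose surface can change between DBR versions):

# Access the Databricks-internal DeltaLog through the py4j JVM gateway.
jvm = spark._jvm
delta_log = jvm.com.databricks.sql.transaction.tahoe.DeltaLog.forTable(
    spark._jsparkSession,
    "dbfs:/path/to/delta/table",  # placeholder location
)

snapshot = delta_log.update()    # latest Snapshot (a JVM object)
print(snapshot.version())        # most recently committed version
add_files = snapshot.allFiles()  # Dataset[AddFile], as in pokus's question above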

2 More Replies
by Nasd_, New Contributor II
  • 2206 Views
  • 1 reply
  • 1 kudos

Resolved! Unable to load org.apache.spark.sql.delta classes from JVM pyspark

Hello, I'm working on Databricks with a cluster running Runtime 16.4, which includes Spark 3.5.2 and Scala 2.12. For a specific need, I want to implement my own custom way of writing to Delta tables by manually managing Delta transactions from PySpark....

Latest Reply
NandiniN
Databricks Employee

Hi @Nasd_, I believe you are trying to use OSS jars on DBR (I can infer this from the class package org.apache.spark.sql.delta.DeltaLog). The error ModuleNotFoundError: No module named 'delta.exceptions.captured'; 'delta.exceptions' is not a package can be...

