Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

DavidFrench
by New Contributor
  • 218 Views
  • 1 reply
  • 1 kudos

Resolved! Altair charts don't work in offline mode

Hello, We are running a secure Databricks environment (no internet access) within an Azure Virtual Desktop and are currently unable to get any charts to display. They work if we access the environment from outwith the AVD, but not when we access it fr...

Latest Reply
mark_ott
Databricks Employee
  • 1 kudos

Why It Works Outside AVD: When working outside your AVD setup (such as on a local machine or a cloud environment with internet access), the widget JavaScript loads successfully from the CDN, enabling chart display. Solutions for Secure, Offline Envir...

Hritik_Moon
by New Contributor III
  • 259 Views
  • 4 replies
  • 2 kudos

Resolved! call job parameter in notebook

Notebook1 has a list output (d_list) stored in a taskValue. I have provided this as input to a loop for Notebook2. These are the parameters for Notebook2. How do I get the value of the parameters file_name and file_format inside Notebook2? When I try ...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 2 kudos

Let me know if you managed to do it @Hritik_Moon in the way I described above. In case of any issues, I can prepare an end-to-end example for you.

3 More Replies
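A hedged sketch of the pattern asked about in this thread: values passed to a notebook by a for-each (loop) task arrive as notebook parameters, which are read with `dbutils.widgets.get`. Since `dbutils` only exists inside Databricks, a minimal stand-in class is used here so the snippet runs anywhere; the parameter names `file_name` and `file_format` come from the post, and the sample values are made up.

```python
# Minimal stand-in for dbutils.widgets so the pattern runs outside Databricks.
# Inside a Databricks notebook you would call dbutils.widgets.get(...) directly.
class FakeWidgets:
    def __init__(self, params):
        self._params = params

    def get(self, name):
        return self._params[name]

# Values one iteration of the for-each task would pass to Notebook2
# (sample values for illustration only).
widgets = FakeWidgets({"file_name": "sales_2024.csv", "file_format": "csv"})

# Inside Notebook2: read the parameters by the names defined on the task.
file_name = widgets.get("file_name")
file_format = widgets.get("file_format")
print(f"Processing {file_name} as {file_format}")
```

The key detail is that the parameter names used in `get` must match the parameter names configured on the notebook task in the job definition.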
Ritesh-Dhumne
by New Contributor III
  • 415 Views
  • 7 replies
  • 4 kudos

Dynamic Jobs community Edition

Hello, I tried this. Notebook1: dbutils.jobs.taskValues.set(key = "my_key", value = "hi From Notebook1"). Notebook2: X = dbutils.jobs.taskValues.get(taskKey="01", key="my_key", debugValue = "Fail"); print(X). Here I get "Fail" as output; it's not fetching m...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 4 kudos

Hi @Ritesh-Dhumne, follow my steps. I created 2 notebooks: the first one called Notebook1 with the following content, and the second one called Notebook2 with the following content that will read the value defined in Notebook1. Here's my definition of the workflow that is using...

6 More Replies
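The likely cause of the "Fail" output above is that `taskKey` in `dbutils.jobs.taskValues.get` must match the upstream task's name in the job definition (e.g. `Notebook1`), not an arbitrary id like `"01"`. A sketch of the flow, with a small stand-in replacing `dbutils.jobs.taskValues` so it runs outside Databricks (note the real `set()` infers the task key from the calling task; here it is passed explicitly):

```python
# Stand-in for dbutils.jobs.taskValues. In a real Databricks job,
# Notebook1 calls dbutils.jobs.taskValues.set(key=..., value=...) and the
# task key is inferred from the task doing the set.
class FakeTaskValues:
    def __init__(self):
        self._store = {}

    def set(self, task_key, key, value):
        self._store[(task_key, key)] = value

    def get(self, taskKey, key, debugValue=None):
        return self._store.get((taskKey, key), debugValue)

task_values = FakeTaskValues()

# Notebook1 (a task named "Notebook1" in the job definition):
task_values.set("Notebook1", key="my_key", value="hi From Notebook1")

# Notebook2: taskKey must be the upstream task's name, not "01".
X = task_values.get(taskKey="Notebook1", key="my_key", debugValue="Fail")
print(X)  # hi From Notebook1
```

With a wrong `taskKey`, `get` falls back to `debugValue`, which matches the "Fail" output described in the post.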
Ritesh-Dhumne
by New Contributor III
  • 184 Views
  • 2 replies
  • 3 kudos

Jobs and Pipeline input parameter

I wanted to extract all files in the volume I have uploaded in Notebook1, and then in Notebook2 perform basic transformations on every file, like handling missing values and nulls. I also want to store the null/dirty records separately and a clean datafram...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 3 kudos

Hi @Ritesh-Dhumne, I'm assuming that you mistakenly named Free Edition as Community, since you're using volumes, which are not available in Community Edition. I'm not sure if I've understood your approach correctly, but at first glance it seems incorre...

1 More Replies
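A minimal, illustrative take on the split described above (clean records vs. records containing nulls), using plain Python rows rather than Spark DataFrames; in a real notebook the same idea maps to two complementary filters on a DataFrame. The field names and values are made up.

```python
# Toy rows standing in for records read from files in the volume.
rows = [
    {"id": 1, "name": "alice", "amount": 10.0},
    {"id": 2, "name": None,    "amount": 5.0},   # dirty: null name
    {"id": 3, "name": "carol", "amount": None},  # dirty: null amount
]

def has_nulls(row):
    # A row is "dirty" if any field is missing/null.
    return any(value is None for value in row.values())

dirty_rows = [r for r in rows if has_nulls(r)]       # stored separately
clean_rows = [r for r in rows if not has_nulls(r)]   # the clean dataset

print(len(clean_rows), len(dirty_rows))  # 1 2
```

The same two-filter pattern keeps the dirty records auditable while the clean set flows on to downstream transformations.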
ClintHall
by New Contributor II
  • 325 Views
  • 2 replies
  • 1 kudos

Resolved! Error filtering by datetime Lakehouse Federated SQL Server table

In Unity Catalog, I have a connection to a SQL Server database. When I try to filter by a datetime column using a datetime with fractional seconds, Databricks gives me this error: Job aborted due to stage failure: com.microsoft.sqlserver.jdbc.SQLServe...

Latest Reply
ClintHall
New Contributor II
  • 1 kudos

Thanks, @Isi. Very helpful. It would be nice if Lakehouse Federation would do this for us (the same way that it knows SQL Server uses ISNULL where Spark SQL uses NVL). Is there a way to bring it to the devs' attention?

1 More Replies
juanjomendez96
by Contributor
  • 356 Views
  • 2 replies
  • 3 kudos

Resolved! Update Databricks App compute

Hello community! I have been using the new Databricks feature 'Databricks Apps' for a while. It's been incredible how much effort and time we have saved by using Databricks Apps to deploy our dashboards instead of deploying them directly in our...

Latest Reply
HariSankar
Contributor III
  • 3 kudos

Hey @juanjomendez96 ,You’ve explained this really well, and yes, what you’re experiencing is currently one of the main limitations of Databricks Apps.Right now, these apps run on fixed managed compute controlled by Databricks. That means we, as users...

1 More Replies
Hari_P
by New Contributor II
  • 259 Views
  • 2 replies
  • 2 kudos

Sharing Databricks Notebook Functionality Without Revealing Source Code

Hi All, I have a unique scenario in Databricks and would appreciate your insights. I've developed functionality in Databricks notebooks, and I'd like to share this with other developers within the same workspace. My goal is to allow colleagues to impor...

Latest Reply
Isi
Honored Contributor III
  • 2 kudos

Hey @Hari_P, I believe this doesn't exist today as a built-in feature. I reviewed the Databricks notebook permission model (docs link), and with the minimum level ("CAN READ") users already have access to view the notebook's source. The simplest and m...

1 More Replies
Pratikmsbsvm
by Contributor
  • 1798 Views
  • 3 replies
  • 1 kudos

Resolved! How to Read and Write Data Between 2 Separate Instances of Databricks

How can I read and write data between 2 separate instances of Databricks? I want to have bi-directional data read and write between Databricks A and Databricks B. They are not in the same instance. Please help.

Latest Reply
nayan_wylde
Esteemed Contributor
  • 1 kudos

Here are some patterns that you can utilize: 1. If the workspaces are in different Databricks accounts or different Azure regions, the recommended approach is Delta Sharing, the simplest, most governed way to let A read B's tables and B read A's tables...

2 More Replies
Hritik_Moon
by New Contributor III
  • 666 Views
  • 6 replies
  • 3 kudos

Dynamic value input to a job

How do I pass a dynamic value to a Databricks job? I created a notebook which will extract the names of files in the catalog, and I want to pass these names as parameters to another notebook task in a job. What are the ways I can do this?

Latest Reply
Hritik_Moon
New Contributor III
  • 3 kudos

Hello, I think you have posted the wrong picture for Notebook1. Could you please verify once? I made some changes and it's working now. Thanks a lot.

5 More Replies
cpollock
by New Contributor III
  • 516 Views
  • 3 replies
  • 1 kudos

Resolved! Getting NO_TABLES_IN_PIPELINE error in Lakeflow Declarative Pipelines

Yesterday (10/1), starting around 12 PM EST, we started getting the following error in our Lakeflow Declarative Pipelines (LDP) process. We get this in environments where none of our code has changed. I found some info on the serverless compute abou...

Latest Reply
saurabh18cs
Honored Contributor II
  • 1 kudos

Hi @cpollock, check the "Event log" and "Pipeline logs" in the Databricks UI for any clues. Also, can you please share a screenshot pasted directly into the window? The attachment is not really working and only scanning.

2 More Replies
tushar_bansal
by Contributor
  • 1301 Views
  • 21 replies
  • 17 kudos

Resolved! Copy text from the integrated Web terminal

How do I copy text from the integrated web terminal? The selection goes away as soon as I lift my finger from the mouse.

Latest Reply
tushar_bansal
Contributor
  • 17 kudos

An update here. I raised a ticket and found out this is because tmux mode is enabled by default in the web terminal. You can disable tmux mode by adding `export DISABLE_TMUX=true` to the ~/.bashrc of the compute. When I asked them about the defaul...

20 More Replies
Ritesh-Dhumne
by New Contributor III
  • 241 Views
  • 3 replies
  • 1 kudos

Resolved! Dynamic value input to a job in community free edition

How do I pass a dynamic value to a Databricks job? I created a notebook which will extract the names of files in the catalog, and I want to pass these names as parameters to another notebook task in a job.

Latest Reply
Ritesh-Dhumne
New Contributor III
  • 1 kudos

Thank you for the response. Will this work in Free Edition?

2 More Replies
Akshay_Petkar
by Valued Contributor
  • 450 Views
  • 1 reply
  • 3 kudos

Resolved! How to send automated emails from Databricks notebooks based on conditions or events?

Hi everyone,I’m currently exploring how to replicate something similar to Alteryx Email Activity within Databricks.Basically, I want to send automated emails to specific users when certain conditions or events occur in a notebook workflow  for exampl...

Latest Reply
HariSankar
Contributor III
  • 3 kudos

Hey @Akshay_Petkar, this is something a lot of people try to do when they move workflows from Alteryx or SSIS into Databricks. There isn't a direct "Email Activity" node like in Alteryx, but you can definitely set up automated email notifications in a...

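One common way to do this, sketched under assumptions (the SMTP host, sender, and recipient addresses below are placeholders, and the workspace must be able to reach an SMTP server), is Python's standard `smtplib` from within the notebook, triggered by a condition:

```python
import smtplib
from email.mime.text import MIMEText

def build_alert(subject, body, sender, recipients):
    # Build the message separately so it can be inspected without sending.
    msg = MIMEText(body)
    msg["Subject"] = subject
    msg["From"] = sender
    msg["To"] = ", ".join(recipients)
    return msg

def send_alert(msg, host, port=587, password=None):
    # STARTTLS + login is typical for authenticated SMTP relays.
    with smtplib.SMTP(host, port) as server:
        server.starttls()
        if password:
            server.login(msg["From"], password)
        server.sendmail(msg["From"], msg["To"].split(", "), msg.as_string())

# Conditional trigger inside a notebook workflow:
failed_rows = 12  # e.g. the result of a validation query
if failed_rows > 0:
    alert = build_alert(
        "Data quality alert",
        f"{failed_rows} rows failed validation",
        "alerts@example.com",            # placeholder sender
        ["data-team@example.com"],       # placeholder recipient
    )
    # send_alert(alert, host="smtp.example.com")  # uncomment with a real host
```

For simple run-level success/failure alerts (as opposed to in-notebook conditions), Databricks Jobs also has built-in email notifications that need no code at all.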
Anonymous
by Not applicable
  • 5853 Views
  • 7 replies
  • 5 kudos

COPY INTO command cannot recognise MAP type value from JSON file

I have a Delta table in Databricks with a single column of type map&lt;string, string&gt;, and I have a data file in JSON format created by Hive 3 for the table with the column of the same type. I want to load data from the file to the Databricks table using COPY IN...

Latest Reply
Y-I
New Contributor II
  • 5 kudos

Use from_json(to_json({struct column}), {your schema definition}). For example: COPY INTO {table} FROM (SELECT from_json(to_json({struct column}), 'MAP&lt;STRING, STRING&gt;') FROM {path}) ...

6 More Replies
Abdul_Alikhan
by New Contributor II
  • 1067 Views
  • 4 replies
  • 2 kudos

Resolved! In Databricks Free Edition, serverless compute is not working

I recently logged into Databricks Free Edition, but serverless compute is not working. I'm receiving the error: 'An error occurred while trying to attach serverless compute. Please try again or contact support.'

Latest Reply
LonaOsmani
New Contributor III
  • 2 kudos

Hi @Abdul_Alikhan ,I experienced the same yesterday when I imported some of my notebooks. I noticed that this error only appeared for imported notebooks because the environment version was 1 by default. Changing the environment version to 2 solved th...

3 More Replies
