Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

SCPablo
by New Contributor
  • 527 Views
  • 1 reply
  • 1 kudos

Resolved! Enable Classic (Non-Serverless) Clusters on Free Trial

Hi Databricks community, I’m using a Free Trial cloud account and currently need to create classic clusters for Spark exercises. Is there a way to enable Standard/Classic clusters in a trial workspace, or any workaround for Free Trial users? Any guid...

Latest Reply
ilir_nuredini
Honored Contributor
  • 1 kudos

Hello @SCPablo, If you are referring to the 14-day free trial account (link: https://docs.databricks.com/aws/en/getting-started/free-trial), you can create compute clusters and experiment with them. But if you are referring to the Databricks Free E...

kanikvijay9
by New Contributor III
  • 2368 Views
  • 2 replies
  • 1 kudos

Resolved! Performance Issues with Writing Large DataFrames to Managed Tables in Databricks (3.5B+ Rows)

Hi Community, I'm working on a large-scale data processing job in Databricks and facing performance and stability issues during the write operations. Here's a detailed breakdown of my use case and environment: Use Case Overview: Primary Data Frames: Firs...

Latest Reply
kanikvijay9
New Contributor III
  • 1 kudos

I found the solution. Please refer to the links below:
LinkedIn Post: https://www.linkedin.com/posts/activity-7363497408925745154-LaaL?utm_source=share&utm_medium=member_desktop&rcm=ACoAACTtno0BU78QJcWz-X3GHtKRvhXxf5fod90
Medium Blog: h...

1 More Replies
Fikrat
by Contributor
  • 1972 Views
  • 7 replies
  • 1 kudos

Resolved! Lakebridge Transpile paths

Hi there, What kind of source and target paths can I use in the Transpile command? I'm trying to run: databricks labs lakebridge transpile --source-dialect tsql --input-source and I get error: ERROR [src/databricks/labs/lakebridge.transpile] ValueErr...

Latest Reply
Fikrat
Contributor
  • 1 kudos

Also, what cloud source locations are available for Transpile: dbfs, Unity Catalog volumes, etc.?

6 More Replies
Keren
by New Contributor
  • 377 Views
  • 1 reply
  • 0 kudos

notebook_params deprecation

In the Databricks documentation for Workflows / Jobs / REST API 2.1 / Trigger a new job run, some fields in the request body are stated as deprecated, for example notebook_params and jar_params. In Jobs REST API 2.2 those fields do not appear at all. Ho...

Latest Reply
Advika
Community Manager
  • 0 kudos

Hello @Keren! Fields like notebook_params and jar_params are deprecated in Jobs REST API 2.1 and no longer appear in 2.2. While an end-of-life date has not been announced yet, it’s recommended to transition to the newer parameter fields rather than r...
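To make the migration concrete, here is a minimal sketch of how the run-now request body changes (the job id and parameter names are hypothetical, and the job would need matching job-level parameters defined in its settings for the new style to resolve):

```python
import json

# Deprecated (2.1) style: task-type-specific override fields such as
# notebook_params / jar_params.
legacy_payload = {
    "job_id": 123,  # hypothetical job id
    "notebook_params": {"run_date": "2025-01-01"},
}

# Recommended style: a single job_parameters map, usable across task types.
new_payload = {
    "job_id": 123,
    "job_parameters": {"run_date": "2025-01-01"},
}

# This body would be POSTed to /api/2.2/jobs/run-now with a bearer token;
# only the payload shape is shown here.
body = json.dumps(new_payload)
```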

flashmav
by New Contributor II
  • 444 Views
  • 1 reply
  • 0 kudos

Two developers working on the same Genie space

My teammate and I are working on the same Genie space at the same time. We’re both adding instructions, and it seems like Genie is getting confused. In this situation, how does Genie decide which instructions to follow? Will it overwrite one set of i...

Latest Reply
Advika
Community Manager
  • 0 kudos

Hello @flashmav! For now, when multiple users work on the same Genie space at the same time, changes made by one person may overwrite another’s updates if both are editing the same instructions or example queries. To avoid conflicts and ensure your u...

rs-tcs
by New Contributor II
  • 497 Views
  • 1 reply
  • 2 kudos

Resolved! Notebooks - Pasting images from clipboard

Hello, I am new to this community. I really miss the Jupyter-style feature to copy and paste images from the clipboard into markdown cells as I work on a notebook. As you might know, when an image is pasted from the clipboard, first it gets converte...

Latest Reply
Advika
Community Manager
  • 2 kudos

Hello @rs-tcs! Thanks for sharing your feedback. Yes, direct pasting from the clipboard isn’t supported at the moment. Right now, Databricks Notebooks support adding images in markdown by linking to a file/URL or by dragging and dropping an image from...

saurabh_aher
by New Contributor III
  • 1358 Views
  • 8 replies
  • 4 kudos

Resolved! Databricks SQL CREATE FUNCTION - input table name as parameter and return the complete table

Hi, I am trying to create a Databricks SQL Unity Catalog function which will take table_name as an input parameter and return the full table as output. I am getting an error, kindly help: CREATE OR REPLACE FUNCTION catalog.schema.get_table( table_name STR...

Latest Reply
BS_THE_ANALYST
Esteemed Contributor III
  • 4 kudos

@szymon_dybczak I admire that you always find the appropriate information in the documentation. I will try my best to emulate this behaviour with other posts. @saurabh_aher great workaround suggestion with a stored procedure. Lots of lessons learned ...

7 More Replies
Daan
by New Contributor III
  • 1277 Views
  • 6 replies
  • 2 kudos

Resolved! Databricks Asset Bundles: using loops

Hey, I am using DABs to deploy the job below. This code works, but I would like to use it for other suppliers as well. Is there a way to loop over a list of suppliers (['nike', 'adidas', ...]) and fill those variables so that config_nike_gsheet_to_databri...

Latest Reply
MujtabaNoori
New Contributor III
  • 2 kudos

@Daan, You can maintain a default template that holds the common configuration. When creating a job-specific configuration, you can safely merge your job-specific dictionary with the base template using the | operator.
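As a minimal sketch of that merge pattern (the job keys and supplier list are illustrative, and dict `|` requires Python 3.9+; the right-hand side wins on key conflicts):

```python
# Base template shared by all supplier jobs (illustrative keys).
base_job = {
    "max_concurrent_runs": 1,
    "tags": {"team": "data-eng"},
    "timeout_seconds": 3600,
}

def job_for(supplier: str) -> dict:
    # dict | dict returns a new dict; supplier-specific settings override
    # the template without mutating base_job.
    return base_job | {
        "name": f"config_{supplier}_gsheet_to_databricks",
        "tags": base_job["tags"] | {"supplier": supplier},
    }

jobs = {s: job_for(s) for s in ["nike", "adidas"]}
```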

5 More Replies
N_M
by Contributor
  • 3020 Views
  • 3 replies
  • 4 kudos

Access to For Each run ids from jobs rest API

Hello Community, I'm using the for_each tasks in workflows, but I'm struggling to access the job information through the Jobs APIs. In short, using the runs API (Get a single job run | Jobs API | REST API reference | Databricks on AWS), I'm able to ac...

Data Engineering
API
jobs API
Latest Reply
prabhatika
New Contributor II
  • 4 kudos

This feature would be extremely helpful in monitoring each task in the `foreachtask` task. 

2 More Replies
jtrousdale-lyb
by New Contributor III
  • 1393 Views
  • 6 replies
  • 4 kudos

Resolved! DLT pipelines - sporadic ModuleNotFoundError

When we run DLT pipelines (which we deploy via DABs), we get a sporadic issue when attempting to install our bundle's wheel file. First, in every DLT pipeline, we run as a first step a script that looks like the following: import subprocess as sp from impor...

Latest Reply
WiliamRosa
Contributor III
  • 4 kudos

If you're encountering intermittent ModuleNotFoundError when your DLT pipeline tries to install your asset bundle’s wheel file, this typically points to inconsistencies in how your dependencies are packaged or where they’re being deployed. Common cul...
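When the root cause is transient (e.g. the wheel has not finished syncing to the workspace when the pipeline's install step runs), one mitigation is to retry the install with back-off. A minimal sketch, assuming the install is done via a subprocess call as in the question (the wheel path would be your bundle's; the helper name is ours):

```python
import subprocess
import sys
import time

def run_with_retry(cmd: list[str], attempts: int = 3, wait_s: float = 5.0) -> None:
    """Run a command, retrying to ride out transient failures."""
    last_err = None
    for attempt in range(1, attempts + 1):
        result = subprocess.run(cmd, capture_output=True, text=True)
        if result.returncode == 0:
            return
        last_err = result.stderr
        time.sleep(wait_s * attempt)  # linear back-off before the next try
    raise RuntimeError(f"{cmd[0]} failed after {attempts} attempts: {last_err}")

# Hypothetical usage inside the pipeline's first step:
# run_with_retry([sys.executable, "-m", "pip", "install", "<path-to-bundle-wheel>"])
```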

5 More Replies
susanne
by Contributor
  • 930 Views
  • 3 replies
  • 4 kudos

Resolved! Asset Bundles define entire folder for source code transformation files

Hi all, I used the new Lakeflow UI to create a pipeline. Now I am struggling with the asset bundle configuration. When I create the pipeline manually I can configure the correct folder for the transformations where my SQL and Python trans...

Latest Reply
susanne
Contributor
  • 4 kudos

Hi Szymon, thanks once again for your help! It worked now with your approach. Do you maybe know why this warning is displayed after databricks bundle validate/deploy: Warning: unknown field: glob. This was one reason I thought this could not be the r...

2 More Replies
RohanIyer
by New Contributor II
  • 1445 Views
  • 1 reply
  • 3 kudos

Resolved! Azure RBAC Support for Secret Scopes

Hi there! I am using multiple Azure Key Vaults within our Azure Databricks workspaces, and we have set up secret scopes that are backed by these Key Vaults. Azure provides two authentication methods for accessing Key Vaults: Access Policies, which is c...

Latest Reply
WiliamRosa
Contributor III
  • 3 kudos

Actually, RBAC is supported for authenticating secret scopes. The thing is, when you set up the secret scope, Databricks automatically assigns permissions through access policies. With RBAC, you'll need to grant the role on your own. As a ...

YosepWijaya
by New Contributor II
  • 33715 Views
  • 7 replies
  • 2 kudos

How can I embed image to the cell using markdown or code?

I have been trying to embed an image from the dbfs location; when I run the code, the image shows as unknown or a question mark. I have tried the following code. The path of the file is dbfs:/FileStore/tables/svm.jpg displayHTML("<img src ='dbfs:/FileStore/tabl...

Data Engineering
markdown
Notebook
Latest Reply
BS_THE_ANALYST
Esteemed Contributor III
  • 2 kudos

@WiliamRosa You've stated: "1. Drag and drop images directly into Markdown cells. You can simply drag an image file from your local system into a markdown cell. Databricks will upload it automatically to your workspace directory and display it inline in...
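One workaround for the original `dbfs:/` problem (the browser cannot resolve that scheme) is to inline the image as a base64 data URI before handing it to displayHTML. A minimal sketch, assuming the file is readable from the driver via the `/dbfs/` mount (the helper name and MIME type are ours):

```python
import base64
from pathlib import Path

def image_data_uri(path: str, mime: str = "image/jpeg") -> str:
    """Encode an image file as a data URI so it renders without a
    browser-reachable URL."""
    encoded = base64.b64encode(Path(path).read_bytes()).decode("ascii")
    return f"data:{mime};base64,{encoded}"

# In a notebook cell (path per the question, via the /dbfs/ fuse mount):
# displayHTML(f"<img src='{image_data_uri('/dbfs/FileStore/tables/svm.jpg')}'/>")
```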

6 More Replies
CzarR
by New Contributor III
  • 1939 Views
  • 7 replies
  • 2 kudos

Maximum string length to pass for Databricks notebook widget

Is there a limitation on the string length to pass to a Databricks notebook widget? ADF lookup outputs about 1000 tables that I am trying to pass to the Databricks notebook via a widget parameter. ADF spends 30 mins to open the Databricks notebook and e...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 2 kudos

Hi @CzarR, Yes, there's a limitation: a maximum of 2048 characters can be input to a text widget. https://docs.databricks.com/aws/en/notebooks/notebook-limitations#databricks-widgets
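Given that cap, a common pattern is to pass a pointer (a file path or staging-table name) through the widget instead of the full list. A minimal sketch under that assumption (the path and size check are illustrative, not an ADF-specific API):

```python
import json

WIDGET_MAX = 2048  # documented limit for text widget input

def fits_in_widget(tables: list[str]) -> bool:
    """Check whether the JSON-encoded table list fits in one text widget."""
    return len(json.dumps(tables)) <= WIDGET_MAX

tables = [f"table_{i}" for i in range(1000)]
if not fits_in_widget(tables):
    # Too large for a widget: persist the list somewhere both ADF and the
    # notebook can reach, and pass only its location via the widget.
    payload_path = "/tmp/tables_to_process.json"  # illustrative location
    with open(payload_path, "w") as f:
        json.dump(tables, f)
# The notebook would then read the path from dbutils.widgets.get(...) and
# load the file instead of parsing a 2048-character string.
```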

6 More Replies
ChristianRRL
by Honored Contributor
  • 1067 Views
  • 4 replies
  • 5 kudos

Resolved! AutoLoader - Cost of Directory Listing Mode

I'm curious to get thoughts and experience on this. Intuitively, the directory listing mode makes sense to me in order to ensure that only the latest unprocessed files are picked up and processed, but I'm curious about what the cost impact of this wo...

Latest Reply
kerem
Contributor
  • 5 kudos

Hi @ChristianRRL, Autoloader ingests your data incrementally regardless of whether you are on directory listing mode or file notification mode. The key difference lies in how it discovers new files. In directory listing mode, Autoloader queries the cl...
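The two discovery modes differ in a single source option. A minimal sketch of the configurations being compared (option names per the cloudFiles source; the format and schema path are illustrative, and dict `|` requires Python 3.9+):

```python
# Shared Auto Loader source options (illustrative values).
base_options = {
    "cloudFiles.format": "json",
    "cloudFiles.schemaLocation": "/tmp/schema",  # illustrative path
}

# Directory listing mode (default): discovery cost comes from repeated
# LIST calls against the input path on every micro-batch.
listing_options = base_options | {"cloudFiles.useNotifications": "false"}

# File notification mode: discovery cost shifts to cloud queue/notification
# services, typically cheaper for very large or slowly-changing directories.
notification_options = base_options | {"cloudFiles.useNotifications": "true"}

# In a notebook these would feed the stream reader, e.g.:
# spark.readStream.format("cloudFiles").options(**listing_options).load("/input/path")
```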

3 More Replies