Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

CHorton
by Visitor
  • 67 Views
  • 3 replies
  • 2 kudos

Resolved! Calling a function with parameters via Spark ODBC driver

Hi All, I am having an issue with calling a Databricks SQL user-defined function with parameters from my client application using the Spark ODBC driver. I have been able to execute a straight SQL statement using parameters, e.g. SELECT * FROM Customer W...

Latest Reply
iyashk-DB
Databricks Employee
  • 2 kudos

Hi @CHorton, the Databricks SQL engine does not support positional (?) parameters inside SQL UDF calls. When Spark SQL parses GetCustomerData(?), the parameter is unresolved at analysis time, so you get [UNBOUND_SQL_PARAMETER]. This is not an ODBC bu...
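A minimal sketch of the behaviour described above, using pyodbc (the DSN, table, and UDF names are placeholders): the marker binds in an ordinary predicate but not inside the UDF call.

```python
import pyodbc

conn = pyodbc.connect("DSN=Databricks", autocommit=True)
cursor = conn.cursor()

# Works: positional parameter in a plain predicate.
cursor.execute("SELECT * FROM Customer WHERE CustomerId = ?", 42)

# Fails with [UNBOUND_SQL_PARAMETER]: the marker sits inside the UDF call
# and is unresolved at analysis time.
# cursor.execute("SELECT GetCustomerData(?)", 42)
```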

2 More Replies
Harun
by Honored Contributor
  • 12617 Views
  • 2 replies
  • 3 kudos

How to change the number of executor instances in Databricks

I know that Databricks runs one executor per worker node. Can I change the number of executors by adding params (spark.executor.instances) in the cluster advanced options? And can I also pass this parameter when I schedule a task, so that particular task wi...

Latest Reply
RandiMacGyver
New Contributor II
  • 3 kudos

In Databricks, the executor model is largely managed by the platform itself. On Databricks clusters, each worker node typically runs a single Spark executor, and this behavior is intentional.
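A minimal sketch of how to check this from a notebook (`spark` is the session predefined in Databricks notebooks): on most clusters the key is simply unset, because the platform pins one executor per worker.

```python
# The key is typically unset; the platform manages executor placement.
print(spark.conf.get("spark.executor.instances", "not set"))
print(spark.sparkContext.defaultParallelism)  # total cores across executors
```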

1 More Replies
by Not applicable
  • 65 Views
  • 3 replies
  • 3 kudos

Has anyone got this up and working? Federating Snowflake-managed Iceberg tables into Azure Databricks

I'm federating Snowflake-managed Iceberg tables into Azure Databricks Unity Catalog to query the same data from both platforms without copying it. I am getting a weird error message when querying the table from Databricks, and I have tried to put all nicely in...

Data Engineering
azure
Iceberg
snowflake
unity-catalog
Latest Reply
Hubert-Dudek
Databricks MVP
  • 3 kudos

Try this: in Snowflake, run SELECT SYSTEM$GET_ICEBERG_TABLE_INFORMATION('<db>.<schema>.<table>'); to retrieve the metadata location. In Databricks, create an external location that matches the metadata URI.
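A minimal sketch of those two steps (identifiers, URL, and credential name are placeholders; creating an external location also requires the appropriate Unity Catalog privileges):

```python
# Step 1 runs in Snowflake and returns the current metadata file location:
#   SELECT SYSTEM$GET_ICEBERG_TABLE_INFORMATION('<db>.<schema>.<table>');
# Step 2 runs in Databricks: ensure an external location covers that URI.
spark.sql("""
    CREATE EXTERNAL LOCATION IF NOT EXISTS snowflake_iceberg_loc
    URL 'abfss://my-container@mystorageaccount.dfs.core.windows.net/iceberg/'
    WITH (STORAGE CREDENTIAL my_storage_cred)
""")
```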

2 More Replies
liquibricks
by Contributor
  • 57 Views
  • 3 replies
  • 3 kudos

Resolved! Spark version errors in "Build an ETL pipeline with Lakeflow Spark Declarative Pipelines"

I'm trying to define a job for a pipeline using the Asset Bundle Python SDK. I created the pipeline first (using the SDK) and I'm now trying to add the Job. The DAB validates and deploys successfully, but when I run the Job I get an error: UNAUTHORIZ...

Latest Reply
mukul1409
New Contributor
  • 3 kudos

This happens because the job is not actually linked to the deployed pipeline and the pipeline id is None at runtime. When using Asset Bundles, the pipeline id is only resolved after deployment, so referencing my_pipeline.id in code does not work. Ins...
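A minimal sketch of that approach, assuming the Python support for Asset Bundles (the databricks.bundles import path and the Job/Task/PipelineTask class names are assumptions here, not a verified API): reference the pipeline through the bundle's resource key so the id is interpolated at deploy time.

```python
# Assumed import path and class names from the Python support for
# Databricks Asset Bundles; treat this as a sketch.
from databricks.bundles.jobs import Job, Task, PipelineTask

job = Job(
    tasks=[
        Task(
            task_key="refresh_pipeline",
            # "${resources.pipelines.my_pipeline.id}" is substituted by the
            # bundle during deployment; my_pipeline.id is still None here.
            pipeline_task=PipelineTask(
                pipeline_id="${resources.pipelines.my_pipeline.id}",
            ),
        )
    ],
)
```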

2 More Replies
mukul1409
by New Contributor
  • 117 Views
  • 3 replies
  • 1 kudos

Resolved! Iceberg interoperability between Databricks and external catalogs

I would like to understand the current approach for Iceberg interoperability in Databricks. Databricks supports Iceberg using Unity Catalog, but many teams also use Iceberg tables managed outside Databricks. Are there recommended patterns today for s...

Latest Reply
Yogesh_Verma_
Contributor II
  • 1 kudos

Great

2 More Replies
hnnhhnnh
by New Contributor II
  • 61 Views
  • 1 reply
  • 0 kudos

How to handle type widening (int→bigint) in DLT streaming tables without dropping the table

Setup: Bronze source table (external to DLT, CDF & type widening enabled). Source table properties: delta.enableChangeDataFeed: "true"; delta.enableDeletionVectors: "true"; delta.enableTypeWidening: "true"; delta.minReaderVersion: "3"; delta.minWrite...

Latest Reply
mukul1409
New Contributor
  • 0 kudos

Hi @hnnhhnnh, DLT streaming tables that use apply changes do not support widening the data type of key columns, such as changing an integer to a bigint, after the table is created. Even though Delta and Unity Catalog support type widening in general, DL...
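For context, a minimal sketch of the apply-changes pattern in question (table and column names are placeholders): once the target exists with an integer key, widening that key to bigint is not applied in place, so a full refresh or a new target table is needed.

```python
import dlt
from pyspark.sql.functions import col

dlt.create_streaming_table("silver_customers")

dlt.apply_changes(
    target="silver_customers",
    source="bronze_customers",   # CDF-enabled bronze source
    keys=["customer_id"],        # key column; its type cannot be widened in place
    sequence_by=col("_commit_timestamp"),
)
```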

Sunil_Patidar
by New Contributor II
  • 302 Views
  • 3 replies
  • 2 kudos

Unable to read from or write to Snowflake Open Catalog via Databricks

I have Snowflake Iceberg tables whose metadata is stored in Snowflake Open Catalog. I am trying to read these tables from the Open Catalog and write back to the Open Catalog using Databricks. I have explored the available documentation but haven’t bee...

Latest Reply
mukul1409
New Contributor
  • 2 kudos

Databricks does not currently provide official support to read from or write to Snowflake Open Catalog. Although Snowflake Open Catalog is compatible with the Iceberg REST catalog and open source Spark can work with it, this integration is not suppor...
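For comparison, a minimal sketch of the open source Spark route mentioned above (explicitly not a supported Databricks path; the URI, warehouse, and credential values are placeholders, and the iceberg-spark-runtime package must be on the classpath):

```python
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .config("spark.sql.catalog.open_catalog", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.open_catalog.type", "rest")
    .config("spark.sql.catalog.open_catalog.uri", "https://<account>.snowflakecomputing.com/polaris/api/catalog")
    .config("spark.sql.catalog.open_catalog.credential", "<client_id>:<client_secret>")
    .config("spark.sql.catalog.open_catalog.warehouse", "<catalog_name>")
    .getOrCreate()
)

# Table identifiers are placeholders.
spark.sql("SELECT * FROM open_catalog.my_schema.my_table LIMIT 10").show()
```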

2 More Replies
Loinguyen318
by New Contributor II
  • 2469 Views
  • 4 replies
  • 0 kudos

Resolved! Public DBFS root is disabled in Databricks free edition

I am using a notebook to execute a sample Spark job that writes a Delta table to DBFS on the Free Edition. However, I face an issue: I cannot access the public DBFS root after the code executes. The Spark code is: data = spark.range(0, 5); data.write.format("d...

Latest Reply
mukul1409
New Contributor
  • 0 kudos

Yes, use UC Volumes instead of DBFS. As Databricks moves toward a serverless architecture, DBFS access is increasingly restricted and is not intended for long-term or production usage. UC Volumes are a better choice than DBFS.
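A minimal sketch of the Free Edition-friendly alternatives (catalog, schema, and volume names are placeholders): a managed table for tabular data, or a Volume path for plain files.

```python
data = spark.range(0, 5)

# Tabular data: a Unity Catalog managed table instead of a DBFS path.
data.write.format("delta").mode("overwrite").saveAsTable("main.default.demo_table")

# Non-tabular files: a Volume path instead of the DBFS root.
data.write.mode("overwrite").csv("/Volumes/main/default/my_volume/demo_csv")
```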

3 More Replies
Phani1
by Databricks MVP
  • 2086 Views
  • 2 replies
  • 1 kudos

Resolved! Databricks - Calling a dashboard from another dashboard

Hi Team, can we call a dashboard from another dashboard? An example screenshot is attached. The main dashboard has 3 buttons that point to 3 different dashboards, and clicking any of the buttons should redirect to the respective dashboard.

Latest Reply
thains
New Contributor III
  • 1 kudos

I would also like to see this feature added.

1 More Replies
ciaran
by New Contributor
  • 112 Views
  • 1 reply
  • 0 kudos

Is GCP Workload Identity Federation supported for BigQuery connections in Azure Databricks?

I’m trying to set up a BigQuery connection in Azure Databricks (Unity Catalog / Lakehouse Federation) using GCP Workload Identity Federation (WIF) instead of a GCP service account key. Environment: Azure Databricks workspace; BigQuery query federation via...

Latest Reply
Hubert-Dudek
Databricks MVP
  • 0 kudos

I guess that is the only one accepted, as the docs say "Google service account key JSON".
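For reference, a minimal sketch of the documented key-based setup (the connection name is a placeholder; the option takes the service account key JSON, not a WIF configuration):

```python
spark.sql("""
    CREATE CONNECTION IF NOT EXISTS bigquery_conn TYPE bigquery
    OPTIONS (GoogleServiceAccountKeyJson '<service-account-key-json>')
""")
```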

pavelhym
by New Contributor
  • 121 Views
  • 1 reply
  • 1 kudos

Usage of MLFlow models inside Streamlit app in Databricks

I have an issue with loading a registered MLflow model into a Streamlit app inside Databricks. This is the sample code used for model load: import mlflow; from mlflow.tracking import MlflowClient; mlflow.set_tracking_uri("databricks"); mlflow.set_registry_uri...

Latest Reply
iyashk-DB
Databricks Employee
  • 1 kudos

Authentication context isn’t automatically available in Apps. Notebooks automatically inject workspace host and token for mlflow when you use mlflow.set_tracking_uri("databricks") and mlflow.set_registry_uri("databricks-uc"). In Databricks Apps, you ...
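A minimal sketch of supplying the context explicitly, assuming the app receives a host and token through its environment (the APP_* variable names are placeholders):

```python
import os
import mlflow

# Apps do not inject notebook-style credentials; pass them in explicitly.
os.environ["DATABRICKS_HOST"] = os.environ["APP_DATABRICKS_HOST"]
os.environ["DATABRICKS_TOKEN"] = os.environ["APP_DATABRICKS_TOKEN"]

mlflow.set_tracking_uri("databricks")
mlflow.set_registry_uri("databricks-uc")

# Unity Catalog model URI; names and alias are placeholders.
model = mlflow.pyfunc.load_model("models:/main.default.my_model@champion")
```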

JothyGanesan
by New Contributor III
  • 160 Views
  • 2 replies
  • 4 kudos

Resolved! Vacuum on DLT

We are currently using DLT tables as our target tables. The tables are loaded by continuous pipelines. Liquid clustering is enabled on the tables. Will VACUUM work on these tables while they are being loaded in continuous mode? How to run t...

Latest Reply
iyashk-DB
Databricks Employee
  • 4 kudos

VACUUM works fine on DLT tables running in continuous mode. DLT does automatic maintenance (OPTIMIZE + VACUUM) roughly every 24 hours if the pipeline has a maintenance cluster configured. Q: The liquid cluster is enabled in the tables. Will Vacuum wo...
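A minimal sketch of running it manually (the table name is a placeholder; the default retention checks still apply):

```python
spark.sql("VACUUM main.lakehouse.silver_orders")
```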

1 More Replies
DylanStout
by Contributor
  • 253 Views
  • 4 replies
  • 1 kudos

Catalog tag filter error

When trying to filter the catalog on "Tag", it throws an error that it failed to load values. The other filters do load. I have tried it with different computes, and I have a view that has a tag (as shown in the screenshot). I have the following privi...

[screenshots attached]
Latest Reply
Advika
Community Manager
  • 1 kudos

Hello @DylanStout! Did the suggestions shared above help resolve your concern? If so, please consider marking the response as the accepted solution. If you found a different approach that worked, sharing it would be helpful for others in the community...

3 More Replies
TFV
by New Contributor
  • 141 Views
  • 1 reply
  • 1 kudos

Regression: Dashboard slicer paste now commits invalid filter values instead of searching

Hi Team, we appear to be experiencing a recent regression in the AI/BI dashboard filter slicer behaviour. Steps to reproduce: open a dashboard containing a single-select or multi-select filter slicer; click into the slicer’s text input; paste text from the...

Latest Reply
emma_s
Databricks Employee
  • 1 kudos

Hi Tim, I can't find any mention of this internally, but I suspect it is related to this change: "Multi-select filter paste: Viewers can now copy a column of values from a spreadsheet and paste them into a multi-select filter." My recommendation w...

