Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

jeremy98
by Honored Contributor
  • 3325 Views
  • 2 replies
  • 0 kudos

if else condition task doubt

Hi community, can the if/else condition task be used like a real if condition? It seems that if the condition evaluates to False, the entire job stops. Is that the intended behaviour?

Latest Reply
mark_ott
Databricks Employee
  • 0 kudos

In Databricks workflows, the "if-else" condition and depends_on logic do not behave exactly like standard programming if-else statements. If a task depends on another task's outcome and that outcome does not match (for example, the condition is false...
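The skipped-branch behaviour described above can be sketched with a Jobs API 2.1 task list. This is a hedged, minimal example (task keys, the task value reference, and the threshold are placeholders, not the poster's actual job): the condition task evaluates once, and only the downstream task whose `outcome` matches runs; the other branch is skipped rather than failing the job.

```python
# Hedged sketch of a Jobs API 2.1 payload with an if/else condition task.
# Task keys and the "{{tasks...}}" value reference are illustrative placeholders.
def build_job_tasks(threshold="50"):
    """One condition task plus one task per outcome; the non-matching
    branch is skipped, not failed."""
    return [
        {
            "task_key": "check_rows",
            "condition_task": {
                "op": "GREATER_THAN",
                "left": "{{tasks.load.values.row_count}}",
                "right": threshold,
            },
        },
        {
            "task_key": "branch_true",
            "depends_on": [{"task_key": "check_rows", "outcome": "true"}],
        },
        {
            "task_key": "branch_false",
            "depends_on": [{"task_key": "check_rows", "outcome": "false"}],
        },
    ]

tasks = build_job_tasks()
```

Note that a task with no matching `outcome` on its dependency ends in a "skipped" state, which is why the job can appear to "stop" on the False path.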

1 More Replies
Carl_B
by New Contributor II
  • 3763 Views
  • 1 reply
  • 0 kudos

ImportError: cannot import name 'override' from 'typing_extensions'

Hello, I'm facing an ImportError when trying to run my OpenAI-based summarization script in. The error message is: ImportError: cannot import name 'override' from 'typing_extensions' (/databricks/python/lib/python3.10/site-packages/typing_extensions.py)...

Latest Reply
mark_ott
Databricks Employee
  • 0 kudos

This error is caused by a version mismatch between the OpenAI Python package and the typing_extensions library in your Databricks environment. The 'override' symbol is relatively new and only exists in typing_extensions version 4.5.0 and above; some ...
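A quick way to confirm the version mismatch before upgrading is to inspect the installed `typing_extensions` version without importing it. This is a hedged sketch (the 4.5.0 threshold follows the reply above; check the library's changelog for your exact runtime):

```python
from importlib import metadata

# Hedged check: 'override' only exists in newer typing_extensions
# (the reply above cites 4.5.0+). Inspect the installed version
# without importing the symbol itself.
def supports_override(min_version=(4, 5, 0)):
    try:
        installed = metadata.version("typing_extensions")
    except metadata.PackageNotFoundError:
        return False
    parts = tuple(int(p) for p in installed.split(".")[:3] if p.isdigit())
    return parts >= min_version

# If this prints False, upgrading in the notebook usually resolves the error:
#   %pip install --upgrade typing_extensions
#   dbutils.library.restartPython()
print(supports_override())
```

Restarting Python after the upgrade matters on Databricks, because the old module version stays loaded in the running interpreter otherwise.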

SQLBob
by New Contributor II
  • 3509 Views
  • 2 replies
  • 0 kudos

Unity Catalog Python UDF to Send Messages to MS Teams

Good Morning All - This didn't seem like such a daunting task until I tried it. Of course, it's my very first function in Unity Catalog. Attached are images of both the UDF and example usage I created to send messages via the Python requests library ...

Latest Reply
mark_ott
Databricks Employee
  • 0 kudos

You're encountering a common limitation when trying to use an external HTTP request (like the Python requests library) inside a Unity Catalog UDF in Databricks. While your code is correct for a regular notebook environment, Unity Catalog UDFs (and, s...
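Since the sandboxed UDF cannot make the outbound call, one workaround is to do the HTTP POST from regular notebook or job code instead. This is a hedged sketch using only the standard library (the webhook URL is a placeholder; Teams incoming webhooks accept a minimal JSON card with a `text` field):

```python
import json
import urllib.request

# Hedged sketch: send a Teams message from notebook/job code rather than
# from inside a Unity Catalog UDF. The webhook URL is a placeholder.
def build_teams_message(text):
    """Teams incoming webhooks accept a simple JSON body with a 'text' field."""
    return json.dumps({"text": text}).encode("utf-8")

def send_to_teams(webhook_url, text):
    req = urllib.request.Request(
        webhook_url,
        data=build_teams_message(text),
        headers={"Content-Type": "application/json"},
    )
    # Network call; run this from a notebook or job with outbound access.
    with urllib.request.urlopen(req) as resp:
        return resp.status

payload = build_teams_message("job finished")
```

A common pattern is to keep the UDF for pure computation and trigger the notification in the surrounding notebook or a downstream job task.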

1 More Replies
jash281098
by New Contributor II
  • 3027 Views
  • 2 replies
  • 0 kudos

Issues when adding keystore spark config for pyspark to mongo atlas X.509 connectivity

Steps followed - Step 1: Add an init script that copies the keystore file to the tmp location. Step 2: Add Spark config in cluster advanced options - spark.driver.extraJavaOptions -Djavax.net.ssl.keyStore=/tmp/keystore.jks -Djavax.net.ssl.keyStorePa...

Latest Reply
mark_ott
Databricks Employee
  • 0 kudos

To achieve MongoDB Atlas X.509 connectivity from Databricks using PySpark, the standard keystore configuration may fail due to certificate, configuration, or driver method issues. The recommended approach involves several key steps, including properl...
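One frequently missed detail with this setup is that the question above only sets the driver JVM options; the executors open their own connections to Atlas and need the same keystore flags. A hedged sketch of the cluster Spark config (paths and the password placeholder are illustrative, and the keystore must exist at that path on every node, e.g. via the init script):

```python
# Hedged sketch of the Spark config for X.509 keystore access.
# Both driver and executor JVMs must see the keystore; values are placeholders.
KEYSTORE_OPTS = (
    "-Djavax.net.ssl.keyStore=/tmp/keystore.jks "
    "-Djavax.net.ssl.keyStorePassword=<keystore-password> "
    "-Djavax.net.ssl.keyStoreType=JKS"
)

spark_conf = {
    "spark.driver.extraJavaOptions": KEYSTORE_OPTS,
    "spark.executor.extraJavaOptions": KEYSTORE_OPTS,  # easy to miss
}
```

These key/value pairs go into the cluster's Advanced options → Spark config, one per line, exactly as the strings above.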

1 More Replies
ShivangiB1
by New Contributor III
  • 21 Views
  • 0 replies
  • 0 kudos

DATABRICKS LAKEFLOW SQL SERVER INGESTION PIPELINE ERROR

Hey Team, I am getting the below error while creating a pipeline: com.databricks.pipelines.execution.extensions.managedingestion.errors.ManagedIngestionNonRetryableException: [INGESTION_GATEWAY_DDL_OBJECTS_MISSING] DDL objects missing on table 'coedb.dbo.so...

der
by Contributor II
  • 181 Views
  • 6 replies
  • 2 kudos

EXCEL_DATA_SOURCE_NOT_ENABLED Excel data source is not enabled in this cluster

I want to read an Excel xlsx file on DBR 17.3. On the cluster the library dev.mauch:spark-excel_2.13:4.0.0_0.31.2 is installed. The V1 implementation works fine: df = spark.read.format("dev.mauch.spark.excel").schema(schema).load(excel_file) display(df) V2...

Latest Reply
mmayorga
Databricks Employee
  • 2 kudos

Hi @der, first of all, thank you for your patience and for providing more information about your case. Regarding use of ".format(\"excel\")": I replicated your cluster config in Azure. Without installing any library, I was able to run and load the xlsx fil...

5 More Replies
GJ2
by New Contributor II
  • 10496 Views
  • 12 replies
  • 2 kudos

Install the ODBC Driver 17 for SQL Server

Hi, I am not a Data Engineer. I want to connect to SSAS. It looks like it can be connected through pyodbc; however, it looks like I need to install "ODBC Driver 17 for SQL Server" using the following command. How do I install the driver on the cluster an...

GJ2_1-1739798450883.png
Latest Reply
Coffee77
Contributor
  • 2 kudos

If you only need to interact with your cloud SQL database, I recommend you use simple code like displayed below for running select queries. To write would be very similar. Take a look here: https://learn.microsoft.com/en-us/sql/connect/spark/connecto...
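The connector approach in the linked docs avoids installing ODBC drivers entirely. A hedged sketch of what that read looks like (host, database, table, and credentials are placeholders; the `"sqlserver"` format is available on recent Databricks runtimes):

```python
# Hedged sketch: reading from a cloud SQL Server / Azure SQL database with
# the Spark "sqlserver" connector instead of pyodbc. All values below are
# placeholders for illustration.
def sqlserver_options(host, database, table, user, password):
    """Assemble the option map passed to spark.read.format("sqlserver")."""
    return {
        "host": host,
        "port": "1433",
        "database": database,
        "dbtable": table,
        "user": user,
        "password": password,
        "encrypt": "true",
    }

opts = sqlserver_options(
    "myserver.database.windows.net", "mydb", "dbo.sales", "app_user", "<secret>"
)
# On a cluster:
# df = spark.read.format("sqlserver").options(**opts).load()
# display(df)
```

For SSAS specifically (as the original question asks), this connector targets the relational database, so it is only a fit if the data is reachable through SQL Server rather than the Analysis Services endpoint.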

11 More Replies
73334
by New Contributor II
  • 3796 Views
  • 3 replies
  • 1 kudos

Dedicated Access Mode Interactive Cluster with a Service Principal

Hi, I am wondering if it is possible to set up an interactive cluster in dedicated access mode and have that user be a machine user. I've tried the cluster creation API, /api/2.1/clusters/create, and set the user name to the service principal na...

Latest Reply
Coffee77
Contributor
  • 1 kudos

It turns out that it is now possible to deploy interactive and SQL Warehouse clusters with Databricks Asset Bundles, so you can include a YAML file similar to this one to deploy that type of interactive cluster: Definition of Interactive ...
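As a rough illustration of the DAB approach for a dedicated (single-user) interactive cluster, here is a hedged sketch of a bundle resource file. The resource name, node type, runtime version, and the service-principal variable are placeholders; check the Asset Bundles resource reference for the exact keys supported in your CLI version:

```yaml
# Hedged sketch: DAB resource defining a dedicated-access interactive cluster
# assigned to a service principal. All names and values are placeholders.
resources:
  clusters:
    my_interactive_cluster:
      cluster_name: dab-interactive
      spark_version: 15.4.x-scala2.12
      node_type_id: Standard_DS3_v2
      num_workers: 1
      data_security_mode: SINGLE_USER
      single_user_name: ${var.service_principal_id}
```

Setting `single_user_name` to a service principal's application ID is what makes the dedicated cluster usable by a machine user rather than a human.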

2 More Replies
TomDeas
by New Contributor II
  • 2014 Views
  • 2 replies
  • 2 kudos

Resolved! Resource Throttling; Large Merge Operation - Recent Engine Change?

Morning all, hope you can help, as I've been stumped for weeks. Question: have there been recent changes to the Databricks query engine, or Photon (etc.), which may impact large sort operations? I have a Jobs pipeline that runs a series of notebooks which...

runhistory.JPG query1.png query2.png query_peak.JPG
Data Engineering
MERGE
Performance Optimisation
Photon
Query Plan
serverless
Latest Reply
mark_ott
Databricks Employee
  • 2 kudos

There have indeed been recent changes to the Databricks query engine and Photon, especially during the June 2025 platform releases, which may influence how large sort operations and resource allocation are handled in SQL pipelines similar to yours. S...

1 More Replies
feliximmanuel
by New Contributor II
  • 1775 Views
  • 1 reply
  • 1 kudos

Error: oidc: fetch .well-known: Get "https://%E2%80%93host/oidc/.well-known/oauth-authorization-serv

I'm trying to authenticate Databricks using WSL but am suddenly getting this error. /databricks-asset-bundle$ databricks auth login –host https://<XXXXXXXXX>.12.azuredatabricks.net Databricks Profile Name: <XXXXXXXXX> Error: oidc: fetch .well-known: Get "ht...

Latest Reply
code-vj
New Contributor
  • 1 kudos

It looks like the issue is caused by the dash before host. The command is using an en-dash (–) instead of a regular hyphen (-) — which breaks the URL parsing.Try running this instead:databricks auth login --host https://<your-instance>.azuredatabrick...
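The `%E2%80%93` in the error message is itself the giveaway: those are the three UTF-8 bytes of the en-dash, percent-encoded when the bad flag leaks into the URL. A small sketch demonstrating this:

```python
from urllib.parse import quote

# The en-dash (U+2013) is three UTF-8 bytes, so when it ends up inside a URL
# it is percent-encoded as %E2%80%93 -- exactly the prefix seen in the error
# "https://%E2%80%93host/oidc/.well-known/...".
en_dash = "\u2013"
print(quote(en_dash))  # %E2%80%93
print(quote("-"))      # - (a plain ASCII hyphen needs no escaping)
```

This often happens when a command is copied from a chat client or document that "smart-corrects" `--host` into `–host`; retyping the two hyphens by hand fixes it.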

Coffee77
by Contributor
  • 100 Views
  • 6 replies
  • 2 kudos

Resolved! Databricks Asset Bundles - High Level Diagrams Flow

Hi guys! I've been working recently on fully understanding (and helping others understand) Databricks Asset Bundles (DAB), and having fun creating some diagrams of the DAB flow at a high level. The first one shows the flow for a simple deployment in PROD, and the second one contains...

databricks_dab_deployment_prod.png databricks_dab_deployment_prod_with_tests.png
Latest Reply
Coffee77
Contributor
  • 2 kudos

I will go with only the latest version then, which can be applied to any other lower environment for QA or testing.

5 More Replies
