Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

by jpwp (New Contributor III)
  • 20966 Views
  • 12 replies
  • 8 kudos

Resolved! How to specify entry_point for python_wheel_task?

Can someone provide me an example of a python_wheel_task and what the entry_point field should be? The jobs UI help popup says this about "entry_point": "Function to call when starting the wheel, for example: main. If the entry point does not exist in...

Latest Reply
hectorfi (New Contributor III)
  • 8 kudos

Just in case anyone comes here in the future, this is roughly how Databricks executes these entry points... How do I know? I have banged my head against this wall for a couple of hours already. from importlib import metadata package_name = "some.package...
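
Expanding on the truncated snippet in the reply above: the wheel has to declare the entry point, and the task's entry_point field names it. Below is a minimal sketch of that resolution pattern using importlib.metadata; the package name and entry point are hypothetical, and this is a reconstruction of the idea, not Databricks' actual launcher code.

from importlib import metadata

package_name = "my_package"   # hypothetical distribution name built into the wheel
entry_point = "main"          # value of the task's "entry_point" field

# Find the entry point the wheel declared, e.g. via
# entry_points={"console_scripts": ["main = my_package.cli:main"]} in setup.py,
# then import the referenced function and call it with no arguments.
matches = [ep for ep in metadata.distribution(package_name).entry_points if ep.name == entry_point]
matches[0].load()()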

11 More Replies
by jwilliam (Contributor)
  • 1905 Views
  • 3 replies
  • 1 kudos

Resolved! [BUG] Databricks install WHL as JAR in Python Wheel Task?

I'm using a Python Wheel Task in a Databricks job with WHEEL dependencies. However, the cluster installed the dependencies as JAR instead of WHEEL. Is this expected behavior or a bug?

Latest Reply
AndréSalvati (New Contributor III)
  • 1 kudos

There you can see a complete template project with a Python wheel task and Databricks Asset Bundles. Please follow the instructions for deployment: https://github.com/andre-salvati/databricks-template
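
For reference, a wheel dependency is declared on the task itself with the "whl" key of "libraries". The sketch below shows the Jobs 2.1 task shape with a hypothetical package name and DBFS path; it is an illustration, not the poster's actual configuration.

# Hypothetical Jobs 2.1 task spec: the wheel is listed under "libraries" with
# the "whl" key, alongside the python_wheel_task that consumes it.
task = {
    "task_key": "run_my_wheel",
    "python_wheel_task": {
        "package_name": "my_package",
        "entry_point": "main",
    },
    "libraries": [
        {"whl": "dbfs:/FileStore/wheels/my_package-0.1.0-py3-none-any.whl"},
    ],
}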

2 More Replies
by GGG_P (New Contributor III)
  • 3973 Views
  • 3 replies
  • 0 kudos

Databricks Tasks Python wheel : How access to JobID & runID ?

I'm using Python (as a Python wheel application) on Databricks. I deploy & run my jobs using dbx. I defined some Databricks Workflows using Python wheel tasks. Everything is working fine, but I'm having issues extracting "databricks_job_id" & "databricks_ru...

Latest Reply
AndréSalvati (New Contributor III)
  • 0 kudos

There you can see a complete template project with Databricks Asset Bundles and a Python wheel task. Please follow the instructions for deployment: https://github.com/andre-salvati/databricks-template. In particular, take a look at the workflow definitio...
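
A common pattern is to pass the IDs into the wheel as task parameters via dynamic value references such as {{job.id}} and {{job.run_id}} (the exact reference names are an assumption and vary with workspace version; older workspaces used {{job_id}} / {{run_id}}), then read them inside the entry point:

# Sketch of a wheel entry point that receives the job and run IDs.
# Assumes the task's "parameters" field is set to something like:
#   ["--job-id", "{{job.id}}", "--run-id", "{{job.run_id}}"]
import argparse

def main() -> None:
    parser = argparse.ArgumentParser()
    parser.add_argument("--job-id", required=True)
    parser.add_argument("--run-id", required=True)
    args = parser.parse_args()
    print(f"databricks_job_id={args.job_id} databricks_run_id={args.run_id}")

if __name__ == "__main__":
    main()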

2 More Replies
by Bartek (Contributor)
  • 2811 Views
  • 0 replies
  • 1 kudos

How to pass all dag_run.conf parameters to python_wheel_task

I want to trigger a Databricks job from Airflow using DatabricksSubmitRunDeferrableOperator and I need to pass configuration params. Here is an excerpt from my code (the definition is not complete, only crucial properties): from airflow.providers.databricks.op...
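
A hedged sketch of that setup, assuming the operator's json field is templated so Airflow can render dag_run.conf into the payload; package name, wheel path, and cluster ID below are placeholders, not taken from the post:

from datetime import datetime
from airflow import DAG
from airflow.providers.databricks.operators.databricks import (
    DatabricksSubmitRunDeferrableOperator,
)

with DAG("trigger_databricks_wheel", start_date=datetime(2024, 1, 1), schedule=None) as dag:
    run_wheel = DatabricksSubmitRunDeferrableOperator(
        task_id="run_wheel",
        databricks_conn_id="databricks_default",
        json={
            "run_name": "wheel-from-airflow",
            "tasks": [
                {
                    "task_key": "main",
                    "python_wheel_task": {
                        "package_name": "my_package",   # hypothetical package
                        "entry_point": "main",
                        "named_parameters": {
                            # Forward the whole Airflow run conf as one JSON string.
                            "conf": "{{ dag_run.conf | tojson }}",
                        },
                    },
                    "libraries": [
                        {"whl": "dbfs:/FileStore/wheels/my_package-0.1.0-py3-none-any.whl"}
                    ],
                    "existing_cluster_id": "1234-567890-abcdefgh",
                }
            ],
        },
    )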

by jpwp (New Contributor III)
  • 4562 Views
  • 2 replies
  • 1 kudos

Resolved! Adding a dependent library to a Job task permanently adds it to the entire cluster?

Why does adding a dependent library to a Job task also permanently add it to the entire cluster? I am using Python wheels, and even when I remove the dependent library from a Job task, the wheel is still part of the cluster configuration. If I then upd...

Latest Reply
Kaniz_Fatma (Community Manager)
  • 1 kudos

If you have configured a library to install on all clusters automatically, or you select an existing terminated cluster that has libraries installed, the job execution does not wait for library installation to complete. If a job requires a specific l...
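
If a wheel has already stuck to an all-purpose cluster, one option is to mark it for removal through the Libraries API; the library is actually uninstalled on the next cluster restart. The host, token, cluster ID, and wheel path below are placeholders, not values from the post.

# Sketch: mark a wheel for uninstall on an all-purpose cluster via the
# Libraries API (takes effect after the cluster restarts).
import requests

host = "https://<your-workspace>.cloud.databricks.com"
token = "<personal-access-token>"
payload = {
    "cluster_id": "1234-567890-abcdefgh",
    "libraries": [{"whl": "dbfs:/FileStore/wheels/my_package-0.1.0-py3-none-any.whl"}],
}
resp = requests.post(
    f"{host}/api/2.0/libraries/uninstall",
    headers={"Authorization": f"Bearer {token}"},
    json=payload,
)
resp.raise_for_status()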

1 More Reply