.py script execution failed but succeeded when run in Python notebook
02-19-2025 10:07 PM
Background:
My code executes without problems when run in a Python notebook. However, the same code fails when executed from a .py script in the workspace. It seems the two execution methods do not have identical versions of the packages installed.
Error message: AttributeError: 'DeltaMergeBuilder' object has no attribute 'withSchemaEvolution'
Code:
from delta.tables import DeltaTable

# Perform the merge with schema evolution
merge_result = delta_table.alias("target") \
    .merge(
        df.alias("source"),
        f"target.{id_column_name} = source.{id_column_name}"
    ) \
    .withSchemaEvolution() \
    .whenMatchedUpdateAll() \
    .whenNotMatchedInsertAll() \
    .execute()
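To compare the two environments, here is a minimal diagnostic sketch I can run in both the notebook and the .py script; it reuses the same delta_table, df, and id_column_name variables as above and assumes the delta-spark distribution is visible to importlib (package names may differ on serverless).

from importlib.metadata import version, PackageNotFoundError
import pyspark

# Print the versions each execution method actually sees
print("pyspark:", pyspark.__version__)
try:
    print("delta-spark:", version("delta-spark"))
except PackageNotFoundError:
    print("delta-spark distribution not found")

# Check whether the merge builder exposes withSchemaEvolution at all
builder = delta_table.alias("target").merge(
    df.alias("source"),
    f"target.{id_column_name} = source.{id_column_name}",
)
print("withSchemaEvolution available:", hasattr(builder, "withSchemaEvolution"))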
For more context, the .withSchemaEvolution() method has been available since runtime 16.0 (Databricks Runtime 16.0 | Databricks Documentation). My serverless runtime is 16.1, as confirmed by running:
spark.sql("select current_version().dbr_version").show()
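For reference, a possible fallback sketch (an assumption on my part, not a confirmed fix): the older session-level schema evolution flag for MERGE, spark.databricks.delta.schema.autoMerge.enabled, predates the builder method and might work where .withSchemaEvolution() is missing.

# Fallback sketch: enable schema evolution via the session config
# instead of the withSchemaEvolution() builder method
spark.conf.set("spark.databricks.delta.schema.autoMerge.enabled", "true")

delta_table.alias("target") \
    .merge(
        df.alias("source"),
        f"target.{id_column_name} = source.{id_column_name}"
    ) \
    .whenMatchedUpdateAll() \
    .whenNotMatchedInsertAll() \
    .execute()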

