Hello everyone!
I have a pipeline in Data Factory full of notebooks. In one of the notebooks I have the following (inside a function that writes a query result to a table):
"
try:
df = spark.sql(query)
df.write.format(format).mode(method).saveAsTable(table)
except Exception as e:
(???)
"
What I want is that if the code enters the "except Exception as e:" branch, the notebook aborts (and the notebook shows as "failed" in the Data Factory pipeline).
I tried these methods:
1)
raise e
2)
raise Exception("error")
3)
assert False
4)
msg = "An error occurred while writing the query to the table: {}".format(e)
dbutils.notebook.exit(msg)
None of them worked the way I wanted: I want the notebook itself to fail. The last one did stop the remaining cells from executing, but the run still shows as "successful" in Data Factory, and I need it to show as "failed".
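For what it's worth, here is a minimal plain-Python sketch (no Spark or Data Factory involved, and risky() is just a stand-in for the failing write) of what I expected method 1) to do, i.e. a bare raise in the except block re-raising the original exception so the whole run ends with an error:

```python
def risky():
    # Stand-in for the Spark write that fails
    raise ValueError("boom")

def wrapper():
    try:
        risky()
    except Exception as e:
        print(f"caught: {e}")
        raise  # bare raise re-raises the original exception

# If nothing above wrapper() catches the exception, the script
# terminates with a traceback and a non-zero exit status.
try:
    wrapper()
except ValueError as e:
    print(f"propagated: {e}")
```

So the exception does propagate in plain Python; my problem is specifically getting that to surface as a "failed" activity in Data Factory.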
Does anyone know how I can do this?
Please and thank you!! 😄