Hi @ak4, this is expected behavior when DBIO transactional commit is enabled; it typically shows up when you update a table and then query it immediately afterwards. You could explicitly invalidate the cache or configure the disk cache. If you have a long-running ...
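For example, a minimal sketch of explicitly invalidating the cache after a write (the table name is hypothetical; spark is the session predefined in Databricks notebooks):

# Drop Spark's cached data/metadata for the table so the next query
# reads the newly committed files (hypothetical table name)
spark.sql("REFRESH TABLE main.default.events")

# Equivalent call through the catalog API
spark.catalog.refreshTable("main.default.events")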
Hi @Dnirmania, you could achieve something similar using this UDF:

%sql
CREATE OR REPLACE FUNCTION ryanlakehouse.default.column_masking(column_value STRING, groups_str STRING)
RETURNS STRING
LANGUAGE SQL
COMMENT 'Return the column value if use...
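-- (the snippet above is cut off; what follows is a minimal, hypothetical sketch of
-- how the body might be completed, assuming the built-in is_account_group_member()
-- function and a single allowed group name passed in groups_str)
RETURN IF(is_account_group_member(groups_str), column_value, '*****');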
Hi @Amodak91, you could use the %run magic command from within the downstream notebook to call the upstream notebook. This runs it in the same context, so all of its variables are accessible, including the dataframe, without needing to persist it....
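For example, a sketch assuming a hypothetical upstream notebook ./prepare_data that defines a DataFrame named df. First cell of the downstream notebook (%run must be the only content of its cell):

%run ./prepare_data

Next cell — the upstream notebook's variables, including df, are now available in this context:

display(df)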
Hi @Mangeysh, you could achieve this using the Databricks SQL Statement Execution API. I would recommend going through the docs to review its functionality and limitations, and seeing whether it serves your needs before planning to develop your own APIs.
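For example, a minimal sketch of calling the API with Python requests (the host, token, and warehouse ID are placeholders you would substitute with your own values):

import requests

HOST = "https://<workspace-instance>.cloud.databricks.com"
TOKEN = "<personal-access-token>"

# Submit a statement to a SQL warehouse and wait up to 30s for the result
resp = requests.post(
    f"{HOST}/api/2.0/sql/statements",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "warehouse_id": "<warehouse-id>",
        "statement": "SELECT 1 AS demo",
        "wait_timeout": "30s",
    },
)
resp.raise_for_status()
print(resp.json()["status"]["state"])  # e.g. SUCCEEDED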
Hi @ChristianRRL, you could get this information using the dynamic value reference {{job.trigger.type}}. In your task settings, assign it to a parameter, and then you can access it from within your notebook using dbutils widgets.
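For example, assuming you named the task parameter trigger_type (the name is hypothetical) and set its value to {{job.trigger.type}}:

# Read the task parameter that was populated from {{job.trigger.type}}
trigger_type = dbutils.widgets.get("trigger_type")
print(f"This run was triggered by: {trigger_type}")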