Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.
I want to pass some context information to the Delta Live Tables pipeline when calling it from Azure Data Factory. I know the body of the API call supports the Full Refresh parameter, but I wonder if I can add my own custom parameters and how this can be re...
In case this helps anyone: I could only use the refresh_selection parameter, setting it to [] by default. Then, in the notebook, I derived the custom parameter values from the refresh_selection value.
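As a minimal sketch of the request an ADF Web activity would issue against the documented Delta Live Tables updates endpoint (POST /api/2.0/pipelines/{pipeline_id}/updates): the host, pipeline ID, and token below are placeholders, and carrying caller context inside refresh_selection is the workaround described above, not a supported parameter channel.

    import requests

    HOST = "https://adb-0000000000000000.0.azuredatabricks.net"  # placeholder workspace URL
    PIPELINE_ID = "<pipeline-id>"  # placeholder
    TOKEN = "<personal-access-token>"  # placeholder

    # Start a pipeline update. The body accepts full_refresh and
    # refresh_selection, so any custom context has to ride along in the
    # refresh_selection list and be decoded on the notebook side.
    resp = requests.post(
        f"{HOST}/api/2.0/pipelines/{PIPELINE_ID}/updates",
        headers={"Authorization": f"Bearer {TOKEN}"},
        json={"full_refresh": False, "refresh_selection": []},
    )
    resp.raise_for_status()
    print(resp.json()["update_id"])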
I have two notebooks created for my Delta Live Table pipeline. The first is a utils notebook with functions I will be reusing for other pipelines. The second contains the actual creation of the Delta Live Tables. I added both notebooks to the pipeline...
Hi Dave, you can solve this by putting your utils into a Python file and referencing the .py file in your DLT notebook. I provided a template for the Python file below:

STEP 1:

    # import functions
    from pyspark.sql import SparkSession
    import IPython
    dbut...
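The excerpt cuts off at "dbut..."; the following is a plausible completion of the template, assuming the common pattern for recovering the spark and dbutils handles inside a plain .py module (a sketch, not the poster's confirmed code; the file name utils.py and the helper get_dbutils are illustrative):

    # utils.py: shared helpers for DLT pipelines
    from pyspark.sql import SparkSession
    import IPython

    # A notebook gets `spark` and `dbutils` injected automatically; a plain
    # .py module has to recover them itself.
    spark = SparkSession.builder.getOrCreate()

    def get_dbutils():
        # On Databricks, dbutils lives in the notebook's IPython user namespace
        return IPython.get_ipython().user_ns["dbutils"]

    dbutils = get_dbutils()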
I am building out a new DLT pipeline and have since had to rebuild it from scratch. Having deleted the old pipeline and constructed a new one, I now get this error: Table 'X' is already managed by pipeline 'Y'. As I only have the one pipeline, how would...
Rename the function under your @dlt.table decorator, for example:

    @dlt.table(
        comment="example",
        table_properties={"example": "example"},
        partition_cols=["a", "b", "c"],
    )
    def modify_this_name():
        ...

This works because @dlt.table uses the decorated function's name as the table name unless you pass name= explicitly, so a renamed function registers as a fresh table that the old pipeline no longer claims.
Failed to launch pipeline cluster 0802-171503-4m02lexd: The operation could not be performed on your account with the following error message: azure_error_code: OperationNotAllowed, azure_error_message: Operation could not be completed as it results ...
Hi there @Aaron LeBato, hope all is well! Just wanted to check in: were you able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you...
I am getting this error when running Delta Live Table pipelines pulling from a source in our sandbox's local DBFS folder. It says the user is not authorized to perform this operation, whereas I am able to see the data when I run a simple select state...
Hey there @Abhay Sudhakaran, hope all is well! Just wanted to check in: were you able to resolve your issue, and would you be happy to share the solution and mark an answer as best? Else please let us know if you need more help. We'd love to hear fr...
DLT pipeline results are published to the "Storage Location" defined as part of configuring the pipeline, e.g. https://docs.databricks.com/_images/dlt-create-notebook-pipeline.png. If an explicit Storage Location is not specified, the pipeline results ...
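As a hedged sketch of inspecting what lands there: the pipeline's event log is written as a Delta table under system/events beneath the storage location (when none is set, the default is a path under dbfs:/pipelines/; the pipeline ID below is a placeholder):

    # Placeholder; substitute your configured Storage Location or the default path
    storage_location = "dbfs:/pipelines/<pipeline-id>"

    # The DLT event log is a Delta table stored under system/events
    events = spark.read.format("delta").load(f"{storage_location}/system/events")
    events.select("timestamp", "event_type", "message").show(truncate=False)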