Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

Dave_Nithio
by Contributor
  • 6770 Views
  • 5 replies
  • 7 kudos

Resolved! Delta Live Table Pipeline with Multiple Notebooks

I have two notebooks created for my Delta Live Table pipeline. The first is a utils notebook with functions I will be reusing for other pipelines. The second contains my actual creation of the delta live tables. I added both notebooks to the pipeline...

Latest Reply
fecavalc08
New Contributor III
  • 7 kudos

Hi @Vivian Wilfred and @Dave Wilson, we solved our code reusability with Repos, pointing to our main code: sys.path.append(os.path.abspath('/Workspace/Repos/[your repo]/[folder with the python scripts]')) followed by from your_class import *. It just wor...

4 More Replies
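For context, the Repos-based reuse pattern described in the reply above can be sketched as below; the folder path and module names are hypothetical placeholders for your own repo layout:

```python
import os
import sys

# Hypothetical path to a Repos folder holding shared utility modules;
# replace it with the real path to your repo.
shared_utils_path = "/Workspace/Repos/my-repo/shared_utils"

# Make the folder importable from any notebook in the pipeline.
if shared_utils_path not in sys.path:
    sys.path.append(os.path.abspath(shared_utils_path))
```

After this runs, an import such as from your_module import YourClass resolves against the repo folder, so shared logic can live in plain .py files instead of a second notebook.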
jfvizoso
by New Contributor II
  • 6187 Views
  • 4 replies
  • 0 kudos

Can I pass parameters to a Delta Live Table pipeline at running time?

I need to execute a DLT pipeline from a Job, and I would like to know if there is any way of passing a parameter. I know you can have settings in the pipeline that you use in the DLT notebook, but it seems you can only assign values to them when crea...

Latest Reply
Mustafa_Kamal
New Contributor II
  • 0 kudos

Hi @jfvizoso, I also have the same scenario. Did you find any workaround? Thanks in advance.

3 More Replies
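For reference, the closest supported pattern at the time of this thread is to put key-value pairs under the pipeline's configuration settings and read them inside the DLT notebook with spark.conf.get(). A sketch of the settings fragment, with a hypothetical key name:

```json
{
  "name": "my_dlt_pipeline",
  "configuration": {
    "mypipeline.start_date": "2023-01-01"
  }
}
```

Inside the notebook, spark.conf.get("mypipeline.start_date") returns the value. Because the values live in the pipeline settings rather than in the job run, varying them per run means updating the pipeline settings (for example via the REST API) before starting it.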
shagun
by New Contributor III
  • 3517 Views
  • 3 replies
  • 0 kudos

Resolved! Delta live tables target schema

The first time I run my delta live table pipeline after setup, I get this error on starting it: org.apache.spark.sql.catalyst.parser.ParseException: Possibly unquoted identifier my-schema-name detected. Please con...

Latest Reply
BenTendo
New Contributor II
  • 0 kudos

This still errors on internal Databricks Spark/Python code like deltaTable.history(). @shagun wrote: The first time I run my delta live table pipeline after setup, I get this error on starting it: org.apache.spark.sql...

2 More Replies
J_M_W
by Contributor
  • 3544 Views
  • 3 replies
  • 3 kudos

Resolved! Can you use %run or dbutils.notebook.run in a Delta Live Table pipeline?

Hi there, Can you use a %run or dbutils.notebook.run() in a Delta Live Table (DLT) pipeline?When I try, I get the following error: "IllegalArgumentException: requirement failed: To enable notebook workflows, please upgrade your Databricks subscriptio...

Latest Reply
J_M_W
Contributor
  • 3 kudos

Hi all. @Kaniz Fatma, thanks for your answer. I am on the Premium pricing tier in Azure. After digging around the logs, it would seem that you cannot run magic commands in a Delta Live Table pipeline. Therefore, you cannot use %run in a DLT pipeline - w...

2 More Replies
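Since %run and dbutils.notebook.run() are rejected inside DLT, the usual workaround is to attach every notebook to the pipeline as a library, so all of them contribute tables to the same pipeline graph. A sketch of the relevant settings fragment, with hypothetical notebook paths:

```json
{
  "libraries": [
    { "notebook": { "path": "/Repos/my-repo/dlt_utils" } },
    { "notebook": { "path": "/Repos/my-repo/dlt_tables" } }
  ]
}
```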
d_meaker
by New Contributor II
  • 1474 Views
  • 3 replies
  • 0 kudos

map_keys() returns an empty array in Delta Live Table pipeline.

We are exploding a map type column into multiple columns based on the keys of the map column. Part of this process is to extract the keys of a map type column called json_map as illustrated in the snippet below. The code executes as expected when run...

Latest Reply
d_meaker
New Contributor II
  • 0 kudos

Hi @Suteja Kanuri, thank you for your response and explanation. The code I have shown above is not the exact snippet we are using. Please find the exact snippet below. We are dynamically extracting the keys of the map and then using getItem() to mak...

2 More Replies
NM447101
by New Contributor II
  • 1963 Views
  • 3 replies
  • 1 kudos

Error when creating a delta live table pipeline

INVALID_PARAMETER_VALUE: Validation failed for node_type_id, the value must be Standard_DS3_v2 (is "Standard_F8s") 

Latest Reply
Anonymous
Not applicable
  • 1 kudos

Hi @Nitya Mehta, thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers y...

2 More Replies
BigMF
by New Contributor III
  • 2725 Views
  • 2 replies
  • 1 kudos

Resolved! Can I use Widgets in a Delta Live Table pipeline

Hello, I'm pretty new to Databricks in general and Delta Live Tables specifically. My problem statement is that I'd like to loop through a set of files and run a notebook that loads the data into some Delta Live Tables. Additionally, I'd like to include...

Latest Reply
Aviral-Bhardwaj
Esteemed Contributor III
  • 1 kudos

It may be possible - in the job run file there is a configurations option where the user can add them.

1 More Replies
Aaron1234567
by New Contributor III
  • 3201 Views
  • 9 replies
  • 3 kudos

Delta Live Table Pipeline - Azure cluster fail

Failed to launch pipeline cluster 0802-171503-4m02lexd: The operation could not be performed on your account with the following error message: azure_error_code: OperationNotAllowed, azure_error_message: Operation could not be completed as it results ...

Latest Reply
Vidula
Honored Contributor
  • 3 kudos

Hi there @Aaron LeBato, hope all is well! Just wanted to check in if you were able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you...

8 More Replies
aladda
by Honored Contributor II
  • 1945 Views
  • 1 replies
  • 0 kudos
Latest Reply
aladda
Honored Contributor II
  • 0 kudos

Here's the difference between a View and a Table in the context of a Delta Live Table pipeline: Views are similar to a temporary view in SQL and are an alias for some computation. A view allows you to break a complicated query into smaller or easier-to-understan...

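The distinction in the reply above can be sketched with DLT's decorators. This sketch only runs inside a Delta Live Tables pipeline (where dlt and spark are available), and the table and column names are hypothetical:

```python
import dlt
from pyspark.sql import functions as F

@dlt.view  # an alias for a computation; recomputed on demand, not persisted
def cleaned_orders():
    return spark.read.table("raw_orders").where(F.col("amount") > 0)

@dlt.table  # materialized to the pipeline's storage and published to the target database
def daily_totals():
    return dlt.read("cleaned_orders").groupBy("order_date").agg(
        F.sum("amount").alias("total")
    )
```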
aladda
by Honored Contributor II
  • 937 Views
  • 1 replies
  • 0 kudos
Latest Reply
aladda
Honored Contributor II
  • 0 kudos

Yes. You can specify a "target" database as part of your DLT pipeline configuration to publish results to a target database in the metastore. See - https://docs.databricks.com/data-engineering/delta-live-tables/delta-live-tables-quickstart.html#publi...

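The corresponding pipeline-settings fragment looks roughly like this; the database name is a hypothetical placeholder:

```json
{
  "target": "my_analytics_db"
}
```

With target set, tables defined in the pipeline become queryable as my_analytics_db.<table_name> in the metastore.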
aladda
by Honored Contributor II
  • 815 Views
  • 1 replies
  • 0 kudos
Latest Reply
aladda
Honored Contributor II
  • 0 kudos

DLT pipeline results are published to the "Storage Location" defined as part of configuring the pipeline. Ex: https://docs.databricks.com/_images/dlt-create-notebook-pipeline.png. If an explicit Storage Location is not specified, the pipeline results ...

User16783854357
by New Contributor III
  • 828 Views
  • 1 replies
  • 1 kudos

How to run a Delta Live Table pipeline with a different runtime?

I would like to run a DLT pipeline with the 8.2 runtime.

Latest Reply
User16783854357
New Contributor III
  • 1 kudos

You can add the below JSON property to the Delta Live Table pipeline specification at the parent level: "dbr_version": "8.2"
