
DLT Pipeline Setup: What Must Be Specified?

sunnyj
New Contributor III

Hi all,
I'm preparing for the Databricks Data Engineer Associate exam and came across this question:

Which of the following must be specified when creating a new Delta Live Tables pipeline?
A. A key-value pair configuration
B. At least one notebook library to be executed
C. A path to cloud storage location for the written data
D. A location of a target database for the written data

From what I understand, some options like cloud storage paths and database names may be optional or inferred. I’d appreciate any clarification from the community or Databricks team on which of these is truly mandatory during pipeline creation.

Thanks in advance!

2 REPLIES

SP_6721
Contributor III

Hi @sunnyj ,

The one thing we must specify when creating a new Delta Live Tables pipeline is at least one notebook (or SQL/Python file) containing our DLT logic (like @dlt.table decorators or CREATE LIVE TABLE statements). That's the core of our ETL, and the pipeline won't work without it. The rest is flexible (a minimal notebook sketch follows this list):

  • Key-value configurations are optional; they're only needed if we want to customize Spark settings.
  • The storage path can be left empty; Databricks will auto-generate a default location.
  • We can set the target database in the code itself, or later from the UI or the pipeline's JSON settings.
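For reference, here's a minimal sketch of what that one required notebook library could contain. It's illustrative only: the dlt module is only importable while the notebook runs inside a DLT pipeline, and the source table name is a placeholder, not from the question.

```python
# Minimal DLT notebook sketch: two tables, one reading from the other.
# NOTE: `dlt` is only available when this runs inside a DLT pipeline, `spark`
# is provided by the Databricks runtime, and `samples.nyctaxi.trips` stands
# in for whatever source table you actually have.
import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Raw trips ingested from the source table")
def raw_trips():
    return spark.read.table("samples.nyctaxi.trips")

@dlt.table(comment="Trips with a derived fare-per-mile column")
def trips_enriched():
    return (
        dlt.read("raw_trips")
        .withColumn("fare_per_mile",
                    F.col("fare_amount") / F.col("trip_distance"))
    )
```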

tootru2breal
New Contributor II

Actually, the correct answer would be D. This one is often mistaken: you need to identify a target database (the default schema the pipeline reads from, writes to, and publishes to) in order to create the pipeline. Databricks will create a blank notebook for you, which you add logic to later. I've attached a screenshot from my work environment; I'm in Databricks daily. The key is to pay attention to what the question is actually asking. A blank notebook won't do you any good at runtime, but that's not what the question asks.

I marked up the screenshot as a reference to help you out.
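If you ever create pipelines programmatically, the same fields show up in the API. Here's a rough sketch using the databricks-sdk Python package; the name, notebook path, and config values below are placeholders for illustration, not anything from the exam question.

```python
# Sketch of creating a DLT pipeline via the Databricks Python SDK
# (assumes `pip install databricks-sdk` and configured workspace auth).
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.pipelines import NotebookLibrary, PipelineLibrary

w = WorkspaceClient()

created = w.pipelines.create(
    name="demo_dlt_pipeline",  # placeholder name
    libraries=[
        PipelineLibrary(
            notebook=NotebookLibrary(path="/Repos/me/dlt/etl")  # placeholder path
        )
    ],
    target="demo_db",  # target database/schema the pipeline publishes to
    configuration={"spark.sql.shuffle.partitions": "8"},  # optional key-value config
    # `storage` omitted: Databricks falls back to a default storage location
)
print(created.pipeline_id)
```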
