yesterday
I am updating DLT pipeline configs with the job id, run id, and run_datetime of the job, so that I can access these values inside the DLT pipeline. Below is the code I am using to do that.
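Roughly, the idea is to merge the run metadata into the pipeline's `configuration` dict before calling `w.pipelines.update`. A minimal sketch of that pattern (the `job_id`, `run_id`, and `run_datetime` values here are placeholders; in a real job they would typically arrive as task parameters, and `w` is assumed to be an authenticated `WorkspaceClient`):

```python
from datetime import datetime, timezone

# Placeholder run metadata; in a real job these would be passed in as
# task parameters rather than hard-coded.
job_id = "1234"
run_id = "5678"
run_datetime = datetime.now(timezone.utc).isoformat()

# Merge the metadata into the pipeline's configuration dict so the
# DLT pipeline can later read the values back, e.g. with
# spark.conf.get("job_id").
configuration = {
    "job_id": job_id,
    "run_id": run_id,
    "run_datetime": run_datetime,
}

# The actual update call (sketch; w is an authenticated WorkspaceClient):
# w.pipelines.update(pipeline_id=pipeline_id, configuration=configuration, ...)
```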
yesterday
Hi @ganapati ,
Can you provide your SDK version? Also, when you provided the SPN, how did you do that? Did you pass it as a string?
20 hours ago - last edited 20 hours ago
I am using databricks-sdk (0.65.0). Actually, passing run_as is not working inside w.pipelines.update. BTW, this code without run_as inside the update was working just fine for a week.
19 hours ago - last edited 19 hours ago
Hi @ganapati ,
You're passing run_as in the wrong way. This argument expects a RunAs data type, not a string. Try creating a RunAs instance and passing that as the argument.
19 hours ago
Would you know how to create a RunAs instance and pass it as an argument?
19 hours ago - last edited 19 hours ago
I guess it should look something like below. Just provide your service principal id:
from databricks.sdk.service.pipelines import RunAs

# RunAs takes the service principal's application ID
run_as_instance = RunAs(service_principal_name="your_service_principal_id")

w.pipelines.update(
    pipeline_id=pipeline_id,
    name=pipeline.name,
    libraries=pipeline.spec.libraries,
    catalog=pipeline.spec.catalog,
    target=pipeline.spec.target,
    configuration=configuration,
    development=pipeline.spec.development,
    edition=pipeline.spec.edition,
    serverless=pipeline.spec.serverless,
    run_as=run_as_instance,
)
19 hours ago
Wonderful, thanks a lot, I will try this out.
19 hours ago
Let us know if it works 🙂