How to pass secret keys using a spark_python_task
2 weeks ago
Hello community,
I was searching for a way to pass secrets to a spark_python_task. In a notebook file it's easy; you just call dbutils.secrets.get(...), as in the snippet below. But how do you do the same thing from a spark_python_task running on serverless compute?
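For example, in a notebook this is all it takes (the scope and key names here are just placeholders):

```python
# In a Databricks notebook, dbutils is available without any import,
# so reading a secret is a one-liner (scope/key names are placeholders):
db_password = dbutils.secrets.get(scope="my-scope", key="db-password")
```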
Kind regards,
2 weeks ago
ping, it's important 😞
2 weeks ago
Hi @jeremy98. To securely access secrets in a spark_python_task on serverless compute in Databricks, first create a secret scope and add your secrets to it (refer to this article). Then pass the secrets by injecting them into environment variables in the job configuration, and read them in your Python script via os.environ.
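A minimal sketch of the script side, assuming the job configuration injects the secret as an environment variable using a `{{secrets/<scope>/<key>}}` reference; the variable, scope, and key names below are placeholders, and the exact configuration key for setting environment variables depends on your compute type:

```python
import os

# The job configuration is assumed to inject the secret as an
# environment variable via a secret reference, e.g.
# (placeholder names):
#
#   "DB_PASSWORD": "{{secrets/my-scope/db-password}}"
#
# Inside the script, os.environ is a plain mapping (not a callable),
# so read the variable by key:
db_password = os.environ.get("DB_PASSWORD")
if db_password is None:
    # Fail fast with a clear message if the secret was never injected.
    raise RuntimeError("DB_PASSWORD is not set in the job environment")

# ... use db_password to build a connection string, authenticate, etc.
```

Note that the secret value then lives in the task's environment, so avoid logging it or passing it on the command line, where it could be exposed.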
You can also check out this blog for more details.

