Creating a python package that uses dbutils.secrets
02-14-2024 02:45 AM
Hello Databricks,
I want to create a Python package containing a script with a class; we give the class a scope and a key, and it returns the secret. This package will be used inside a Databricks notebook.
I want to use dbutils.secrets for this.
Q1. I have used dbutils in my code, but when I import this package in a notebook and run it, it shows "dbutils not defined". Is it not possible for the package to pick up dbutils on its own when it runs in a notebook?
Q2. I then added "from databricks.sdk.runtime import dbutils", but this throws an error saying "cannot configure default auth".
I want to build this Python package locally and run it in a Databricks notebook where it can use dbutils. What can I do?
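For illustration, the kind of module I mean looks roughly like this (class and method names are placeholders, not the real package):

```python
# Sketch of the failing pattern from Q1 (hypothetical names).
# `dbutils` is a global injected into Databricks notebooks, so a plain
# Python module never sees it -- even when the importing notebook has it.

class SecretFetcher:
    def __init__(self, scope: str, key: str):
        self.scope = scope
        self.key = key

    def fetch(self) -> str:
        # Raises NameError: `dbutils` is not defined in this module's namespace
        return dbutils.secrets.get(scope=self.scope, key=self.key)
```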
02-14-2024 04:10 AM
Hello samarth_solanki,
Is there a specific reason for passing the scope and key into a class? In Databricks, as a best practice, you can create two types of secret scopes:
Azure Key Vault-Backed Scope:
You can create a secret scope backed by Azure Key Vault, which lets you use all the secrets in the corresponding Key Vault instance from that secret scope.
Databricks-Backed Scope:
In this method, the secret scopes are managed in an internally encrypted database owned by the Databricks platform. You can create a Databricks-backed secret scope using the Databricks CLI.
This option is viable for you if you don't want to use the Azure Key Vault option.
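For example, a Databricks-backed scope and a secret can be created from the Databricks CLI (scope and key names below are placeholders; this uses the legacy CLI flag syntax, which differs in newer CLI versions):

```shell
# Create a Databricks-backed secret scope (legacy CLI syntax)
databricks secrets create-scope --scope my-scope

# Store a secret in it; the CLI prompts for the value
databricks secrets put --scope my-scope --key my-key

# In a notebook, read it back with:
#   dbutils.secrets.get(scope="my-scope", key="my-key")
```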
Thanks
03-29-2024 09:22 AM - edited 03-29-2024 09:25 AM
@samarth_solanki wrote:Hello Databricks,
I want to create a Python package containing a script with a class; we give the class a scope and a key, and it returns the secret. This package will be used inside a Databricks notebook.
I want to use dbutils.secrets for this.
Q1. I have used dbutils in my code, but when I import this package in a notebook and run it, it shows "dbutils not defined". Is it not possible for the package to pick up dbutils on its own when it runs in a notebook?
Q2. I then added "from databricks.sdk.runtime import dbutils", but this throws an error saying "cannot configure default auth".
I want to build this Python package locally and run it in a Databricks notebook where it can use dbutils. What can I do?
Hello, I have exactly the same issue (in my case I want to use dbutils.fs.ls). Did you find a solution?
08-15-2024 02:11 PM - edited 08-15-2024 02:12 PM
Same question. Did you ever get an answer for this?
I want to use Databricks secrets inside my custom Python code. This is only necessary because, according to this post (https://community.databricks.com/t5/administration-architecture/aws-secrets-manager-access/td-p/5019...), DBR > 12.2 LTS no longer allows the use of AWS Secrets Manager.
12-10-2024 07:13 AM
# Construct dbutils from the active SparkSession instead of relying on the notebook global
from pyspark.dbutils import DBUtils
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
dbutils = DBUtils(spark)
Adding this to the module file solves the problem.

