Access Databricks secrets in init script
12-10-2022 11:01 AM
We are trying to install the Databricks CLI in an init script, and to do this we need to authenticate with a Databricks token. That is not secure, because anyone with access to the cluster can get hold of the token.
We tried to inject the token into a secret scope and access it from the init script, but it is not working.
Any suggestions on how to access a secret, or inject the token securely, in init scripts?
- Labels:
- Databricks CLI
12-10-2022 01:21 PM
I don't think you need to install the CLI; the whole REST API is available from a notebook. Below is an example:
import requests

# Get the workspace hostname and a short-lived API token from the notebook context
ctx = dbutils.notebook.entry_point.getDbutils().notebook().getContext()
host_name = ctx.tags().get("browserHostName").get()
host_token = ctx.apiToken().get()

# COMMAND ----------

# List the contents of a workspace folder via the REST API
notebook_folder = '/Users/hubert.dudek@databrickster.com'
response = requests.get(
    f'https://{host_name}/api/2.0/workspace/list',
    headers={'Authorization': f'Bearer {host_token}'},
    params={'path': notebook_folder}
).json()
12-10-2022 11:13 PM
Thank you for the reply.
The reason we are installing the Databricks CLI in an init script is that we have a requirement to call a Python notebook from the init script.
By the way, I managed to access the secret from the secret scope; I probably had a typo before.
Example:
Set the following in the cluster's environment variables, with the secret reference wrapped in double curly braces:
DToken={{secrets/scope/secretKey}}
Then remove DToken from the Spark environment variables so that it is no longer available after the init script runs.
Example: sed -i '/^DToken/d' /databricks/spark/conf/spark-env.sh
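Putting those steps together, a minimal init-script sketch could look like the following. It assumes the cluster defines a secret-backed environment variable named DToken (as in the example above) and scrubs it from spark-env.sh when done; the DATABRICKS_HOST variable and the commented CLI configure step are illustrative assumptions, not part of the original post.

```shell
#!/bin/bash
# Sketch of a cluster init script that uses a secret-backed token.
# Assumption: the cluster config sets DToken={{secrets/scope/secretKey}}
# and DATABRICKS_HOST with the workspace URL.

SPARK_ENV="${SPARK_ENV:-/databricks/spark/conf/spark-env.sh}"

# Use the token while it is still available, e.g. to configure the CLI
# (hypothetical step, shown commented out):
# printf '%s\n%s\n' "$DATABRICKS_HOST" "$DToken" | databricks configure --token

# Scrub the token from spark-env.sh so it is not visible to
# notebook users after the init script finishes.
if [ -f "$SPARK_ENV" ]; then
  sed -i '/^DToken/d' "$SPARK_ENV"
fi

# Also drop it from this shell's environment.
unset DToken
```

Note that the sed line only removes the variable for processes started after the init script; anything already holding the token keeps it, which is why doing the token-dependent work inside the init script itself matters.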

