02-03-2022 03:10 PM
Is there a Spark command in Databricks that will tell me which Databricks workspace I am using? I'd like to parameterise my code so that it updates Delta Lake file paths automatically depending on the workspace (i.e. it picks up the dev workspace name in dev and the prod workspace name in prod). Is this possible?
02-03-2022 08:26 PM
You can run the following in a notebook within the workspace:
dbutils.entry_point.getDbutils().notebook().getContext().toJson()
The output contains:
browserHostName - the workspace hostname (URL)
orgId - the workspace ID
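For example, here is a minimal sketch of parsing that JSON in Python (this assumes a Databricks notebook where dbutils is available; the exact key layout, with a "tags" map holding browserHostName and orgId, may differ between runtime versions):

import json

# Fetch the notebook context as a JSON string and parse it.
ctx = json.loads(dbutils.notebook.entry_point.getDbutils().notebook().getContext().toJson())

# Read the workspace hostname and workspace (org) ID from the context tags.
workspace_host = ctx["tags"]["browserHostName"]
workspace_id = ctx["tags"]["orgId"]
print(workspace_host, workspace_id)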
2 weeks ago - last edited 2 weeks ago
How does this even make sense? Did you see how old the post is? It's confusing! I'm wondering what made you reply to this post.
2 weeks ago
To programmatically retrieve the Databricks workspace name from within a notebook, you can use Spark configuration or the notebook context. One method is to read the workspace URL using spark.conf.get("spark.databricks.workspaceUrl") and then extract the instance nameāfor example, by splitting at the first period. Alternatively, you can access the notebook context with dbutils.notebook.entry_point.getDbutils().notebook().getContext().browserHostName().toString() (or via the equivalent JSON tags), which provides the hostname corresponding to the workspace. Both approaches give you a way to dynamically capture and use the workspace identifier in your code.
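As an illustration, here is a minimal sketch of parameterising a Delta Lake path from the workspace URL (the workspace hostnames, storage account, and paths below are hypothetical examples, not taken from this thread):

# Derive the workspace instance name, e.g. "adb-1111111111111111"
# from "adb-1111111111111111.11.azuredatabricks.net".
workspace_host = spark.conf.get("spark.databricks.workspaceUrl").split(".")[0]

# Hypothetical mapping from workspace instance name to environment prefix.
env_by_workspace = {
    "adb-1111111111111111": "dev",
    "adb-2222222222222222": "prod",
}
env = env_by_workspace.get(workspace_host, "dev")

# Build an environment-specific Delta path and read from it.
delta_path = f"abfss://data@examplestorageacct.dfs.core.windows.net/{env}/sales"
df = spark.read.format("delta").load(delta_path)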