Is there a Spark command in Databricks that will tell me which Databricks workspace I am using? I'd like to parameterise my code so that it updates Delta Lake file paths automatically depending on the workspace (i.e. it picks up the dev workspace name when running in dev and the prod workspace name when running in prod). Is this possible? A sketch of what I'm after is below.
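For illustration, something like this is what I have in mind. This is a minimal sketch assuming the cluster exposes the `spark.databricks.workspaceUrl` Spark config; the workspace hostnames and storage paths are hypothetical placeholders:

```python
# Read the workspace URL from the cluster's Spark config
# (assumes spark.databricks.workspaceUrl is set on this runtime).
workspace_url = spark.conf.get("spark.databricks.workspaceUrl", "")

# Map each workspace hostname to its Delta Lake base path
# (hostnames and paths below are hypothetical placeholders).
base_paths = {
    "adb-1111111111111111.11.azuredatabricks.net": "abfss://dev@mystorage.dfs.core.windows.net/delta",
    "adb-2222222222222222.22.azuredatabricks.net": "abfss://prod@mystorage.dfs.core.windows.net/delta",
}

delta_base = base_paths.get(workspace_url)
if delta_base is None:
    raise ValueError(f"Unrecognised workspace: {workspace_url!r}")

# The same notebook code then resolves to dev or prod paths automatically.
df = spark.read.format("delta").load(f"{delta_base}/sales")
```

That way only the dictionary would need to change between environments, and the rest of the code stays identical in dev and prod.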