Hi,
The point of using a repo is to give each developer a dedicated area. If you only want a folder containing the latest version of the code, you should set up a CI/CD pipeline that packages the code and delivers it into a folder inside the Workspa...
Hello,
On my side, I always have to add the provider in each resource block. You can try that:

resource "databricks_group" "xxxxx" {
  provider     = databricks.accounts
  display_name = "xxxxx"
}

About authentication, you can also try to add: auth_type ...
Hi Sean,
There are two ways to handle secret scopes:
- Databricks-backed scopes: the scope belongs to a workspace, and you have to handle updating the secrets yourself.
- Azure Key Vault-backed scopes: the scope is backed by a Key Vault. It means that you configur...
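Whichever kind of scope you use, reading a secret from a notebook looks the same. A minimal sketch, assuming a Databricks notebook context where `dbutils` is predefined (the scope and key names below are hypothetical):

```python
# Runs inside a Databricks notebook, where `dbutils` is predefined.
# Scope and key names are hypothetical examples.
jdbc_password = dbutils.secrets.get(scope="my-scope", key="jdbc-password")

# Note: Databricks redacts the value if you try to print it in a notebook.
```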
Yes, you can:
https://docs.databricks.com/user-guide/notebooks/notebook-workflows.html#example
You will get the return value just as you would from a function.
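A minimal sketch of the pattern from that page, assuming it runs inside a Databricks notebook (the notebook name and argument below are hypothetical):

```python
# --- child notebook ("child") would end with: ---
# dbutils.notebook.exit("42")  # hands this string back to the caller

# --- caller notebook ---
# Runs "child" with a 60-second timeout and an optional arguments dict;
# the result is the string that the child passed to dbutils.notebook.exit.
result = dbutils.notebook.run("child", 60, {"input": "example"})
print(result)
```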
Hi @Emiliano Parizzi,
You could parse the timestamp after loading the file by using withColumn (cf. https://stackoverflow.com/questions/39088473/pyspark-dataframe-convert-unusual-string-format-to-timestamp).

from pyspark.sql import Row from pysp...
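As a concrete illustration of the parsing step (the sample string and format below are hypothetical; adjust the pattern to your data), here is the same idea in plain Python, with the equivalent Spark call in a comment:

```python
from datetime import datetime

# Hypothetical raw value as it might appear in the loaded file.
raw = "05-Mar-2016 10:20:30"

# Parse it with a format string that matches the unusual layout.
parsed = datetime.strptime(raw, "%d-%b-%Y %H:%M:%S")
print(parsed.isoformat())

# In PySpark the same parse is done on the whole column with withColumn,
# using Spark's (Java-style) pattern letters instead of strftime codes:
#   from pyspark.sql.functions import to_timestamp, col
#   df = df.withColumn("ts", to_timestamp(col("raw"), "dd-MMM-yyyy HH:mm:ss"))
```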