"Apache Spark programming with databricks" tutorial uses Blob storage parquet files on Azure. To access those files a sas key is used in the configuration files.
Those keys were generated 5 years ago, however they expired in the begining of this month (04/2023).
The tutorial can be accessed with this link:
https://partner-academy.databricks.com/learn/lp/123/apache-spark
And the workspace dbc file with this one:
https://labs.training.databricks.com/import/apache-spark-programming-with-databricks/v1.7.0/apache-s...
The access keys still work for AWS, but on Azure the configuration can't be run successfully since the keys have expired.
They are read-only keys, so they are written directly in the notebooks. Here is an example:
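(The account name, container, and token values below are placeholders rather than the real ones, but the shape of the configuration in the notebooks is roughly this.)

# Sketch of how the read-only SAS token is wired up in a notebook.
# "spark" is the SparkSession that Databricks notebooks provide automatically.
storage_account = "dbtrainwesteurope"   # illustrative; the real accounts follow the dbtrain{region} pattern
container       = "training"
# Made-up token: "sp=rl" is a read/list-only permission set, "se" is the expiry timestamp
sas_token = "?sv=2017-07-29&ss=b&srt=sco&sp=rl&se=2023-04-01T00:00:00Z&sig=REDACTED"

# Hadoop Azure (WASB) picks up the SAS token from this config key
spark.conf.set(
    f"fs.azure.sas.{container}.{storage_account}.blob.core.windows.net",
    sas_token
)

# Reading the course data then fails once the token's "se" date has passed
df = spark.read.parquet(
    f"wasbs://{container}@{storage_account}.blob.core.windows.net/path/to/data.parquet"
)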
The "se" parameter normally indicate the expiration date from what I found.
The keys were generated for the container "training" in the "dbtrain{region}" storage accounts.
I was not sure of the correct location for this post; please redirect it if there is a better place to get this issue resolved.