Azure Blob Storage SAS keys expired for Apache Spark tutorial

RayelightOP
New Contributor II

"Apache Spark programming with databricks" tutorial uses Blob storage parquet files on Azure. To access those files a sas key is used in the configuration files.

Those keys were generated five years ago and expired at the beginning of this month (04/2023).

The tutorial can be accessed with this link:

https://partner-academy.databricks.com/learn/lp/123/apache-spark

And the workspace dbc file with this one:

https://labs.training.databricks.com/import/apache-spark-programming-with-databricks/v1.7.0/apache-s...

The access keys still work for AWS, but on Azure the configuration can't be run successfully since the keys expired.

They are read-only keys, so they are written directly in the notebooks. From what I found, the "se" parameter of a SAS token normally indicates its expiration date.

The keys were generated for the "training" container in the "dbtrain{region}" storage accounts.
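The expiry claim above can be checked directly: a SAS token is a URL query string, and its `se` ("signed expiry") parameter is a UTC timestamp. A small sketch, using a made-up token with the expiry date described in this post:

```python
# Hypothetical sketch: reading the "se" (signed expiry) parameter out of a
# SAS token to check whether it has expired. The token below is invented;
# only its structure mirrors a real Azure SAS query string.
from datetime import datetime, timezone
from urllib.parse import parse_qs

def sas_expiry(token: str) -> datetime:
    """Return the expiry timestamp encoded in the token's 'se' parameter."""
    params = parse_qs(token.lstrip("?"))
    return datetime.strptime(
        params["se"][0], "%Y-%m-%dT%H:%M:%SZ"
    ).replace(tzinfo=timezone.utc)

token = "?sv=2019-02-02&ss=b&srt=co&sp=rl&se=2023-04-01T00:00:00Z&sig=XXXX"
expiry = sas_expiry(token)
print(expiry.isoformat())                      # 2023-04-01T00:00:00+00:00
print(expiry < datetime.now(timezone.utc))     # True once the key has expired
```

Running this against the tokens embedded in the tutorial notebooks would confirm which ones lapsed in 04/2023.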

I was not sure what the correct location for this post was; I hope it helps get this issue resolved.

1 REPLY

jose_gonzalez
Moderator

Adding @Vidula Khanna and @Kaniz Fatma for visibility to help with your request.
