ASP1.2 Error create database in Spark Programming with Databricks training
04-11-2023 06:07 AM
I'm on the Demo and Lab in the DataFrames section. I've imported the DBC into my company cluster and have run `%run ./Includes/Classroom-Setup` successfully.
When I run the first SQL command

```sql
%sql
CREATE TABLE IF NOT EXISTS events USING parquet OPTIONS (path "/mnt/training/ecommerce/events/events.parquet");
```

I get this error:

```
shaded.databricks.org.apache.hadoop.fs.azure.AzureException: hadoop_azure_shaded.com.microsoft.azure.storage.StorageException: Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
```

May I know how to resolve this issue?

04-15-2023 05:53 PM
@alex chooi :
The error message suggests an authentication failure when accessing the Azure storage account where the data is stored.
To resolve this issue, try the following steps:
- Check that the storage account key is correctly configured in your Databricks cluster. You can find the key in the Azure portal by selecting the storage account and navigating to "Access keys" under "Settings". Make sure you have copied the correct storage account name and key into the corresponding variables in your Databricks notebook.
- If you have recently regenerated the storage account key, make sure you have updated it in your Databricks notebook as well.
- Check that the storage account permits access from your Databricks cluster. In the Azure portal, select the storage account and navigate to "Firewalls and virtual networks" under "Security + networking", and make sure the IP address of your Databricks cluster is in the allowed list.
- If none of the above resolves the issue, try re-creating the storage account and re-uploading the data to it; occasionally an issue with the storage account itself causes authentication errors.
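For the first step, the key is usually wired into Spark config inside the notebook. Here is a minimal sketch of that pattern; the account name, secret scope, and secret key names below are placeholders, not values from the course:

```python
# Sketch: configuring an Azure Blob Storage account key in a Databricks
# notebook. `spark` and `dbutils` only exist inside a notebook session, so
# the string-building logic is isolated into a plain function here.

def account_key_conf(storage_account: str) -> str:
    """Spark config key that holds the access key for a Blob Storage account."""
    return f"fs.azure.account.key.{storage_account}.blob.core.windows.net"

# Inside a Databricks notebook you would then run something like:
#   spark.conf.set(
#       account_key_conf("mystorageacct"),                       # placeholder account
#       dbutils.secrets.get(scope="my-scope", key="storage-key") # placeholder secret
#   )

print(account_key_conf("mystorageacct"))
```

Storing the key in a secret scope (rather than pasting it into the notebook) is the usual practice, since notebook source is visible to anyone with workspace access.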
04-18-2023 08:01 AM
Hello!
I got exactly the same error and cannot continue with the course. Is it on our side to check these account keys and everything? Isn't that something that should have been done for us? (We are trying to access storage that was set up in the Common Notebooks to create the mounts.) My colleagues went through the course easily; everything was ready out of the box. It is quite frustrating to get stuck at this point 😞
07-10-2023 05:43 AM
Same problem here. My theory is that it is a problem with the course's notebooks, because I can easily run notebooks from other courses: in "Data Engineering with Databricks V2", for example, everything runs fine, but not in Spark Programming with Databricks, which was last updated a year ago.
Please help.
09-07-2023 12:56 AM
I had the same issue and solved it like this:
- In the Includes folder there is a Reset notebook; run its first command, which unmounts all of the mounted storage.
- Go back to the ASP 1.2 notebook and run the `%run ./Includes/Classroom-Setup` code block.
- Then run the next code block.
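The unmount step above can be sketched roughly as follows. This is a hypothetical reconstruction, not the Reset notebook's actual code: `dbutils` exists only inside a Databricks notebook, and filtering on the `/mnt/training` prefix is an assumption based on the path in the original error.

```python
# Sketch of unmounting the course's training mounts. The prefix filter is
# kept in a plain function so the selection logic can be checked anywhere;
# the dbutils calls themselves only run inside a Databricks notebook.

def training_mounts(mount_points):
    """Return the mount points the reset step would unmount (assumed prefix)."""
    return [mp for mp in mount_points if mp.startswith("/mnt/training")]

# Inside a Databricks notebook, the reset amounts to roughly:
#   for mp in training_mounts(m.mountPoint for m in dbutils.fs.mounts()):
#       dbutils.fs.unmount(mp)

print(training_mounts(["/mnt/training", "/databricks-datasets", "/mnt/other"]))
```

Re-running `Classroom-Setup` afterwards re-creates the mounts with fresh credentials, which is why this clears the stale-signature authentication error.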

