Our customer is using Azure's Blob Storage service to store large files so that we can work with them using an Azure online service such as Databricks.
We want to read and process these files with compute provided directly by Azure, without first downloading them into another Azure service such as Azure Machine Learning Studio or Databricks.
So far, we have not been able to access the data in the blob storage without downloading it into Azure Machine Learning Studio to work with it.
Moreover, none of the files we want to read is one of these types:
However, they can be read with the help of a Python extension.
How can we access the data in the blob storage without downloading it beforehand?
Is it possible to mount the blob storage to Databricks somehow, so that we can access and use the files in a Databricks notebook?
I started to experiment with Databricks.
To that end, I created two files in a blob storage container: a picture and a CSV file.
Then I mounted the blob storage container to Databricks according to these instructions. The CSV file can be accessed and displayed without downloading it to Databricks in any way. Roughly, the mount and the CSV read look like this (the account, container, secret scope, and file names are placeholders for our real ones):
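```python
# Placeholders for our real storage account, container, and secret scope
storage_account = "ourstorageaccount"
container = "ourcontainer"
mount_point = "/mnt/blobdata"

# Mount the blob storage container into DBFS
dbutils.fs.mount(
    source=f"wasbs://{container}@{storage_account}.blob.core.windows.net",
    mount_point=mount_point,
    extra_configs={
        f"fs.azure.account.key.{storage_account}.blob.core.windows.net":
            dbutils.secrets.get(scope="our-scope", key="storage-key")
    }
)

# Reading and displaying the CSV file from the mount works fine
df = spark.read.option("header", "true").csv(f"{mount_point}/test.csv")
display(df)
```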
Unfortunately, I cannot display the image.
Databricks tells me that there is "No such file or directory".
However, the file exists and the path is correct, as shown above with the CSV file. For reference, this is roughly how I try to open the picture in the notebook (using PIL; the file name is a placeholder):
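```python
from PIL import Image

# Roughly what I tried: open the picture from the same mount point that works for the CSV file
img = Image.open("/mnt/blobdata/test_picture.jpg")  # raises "No such file or directory"
```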
Can someone help me, please?
With best regards
SettlerOfCatan