Hello everyone,
I'm really new to Databricks; I just passed my Apache Spark developer certification on it, and I also have a certification in data engineering with Azure. Fancy words, but I only started doing real, deep work with these tools when I began a personal project I'm really excited about.
My issue is with accessing the storage account through Databricks using a managed identity.
Meaning:
1/ created the access connector for Databricks
- created its managed identity and gave it the delegator role on the storage account + contributor on the container.
2/ created a metastore, linked it to the Databricks access connector, and linked that to my Databricks workspace.
3/ created the storage credential and external location.
4/ I could query the container with two different methods, but not the last one.
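For step 3, this is roughly the SQL I mean (the names `my_credential` and `my_location` are placeholders, not my real ones; I created the storage credential itself from the Catalog UI, pointing it at the access connector):

```sql
-- sketch of step 3 (placeholder names): an external location on top of
-- the storage credential that wraps the access connector's managed identity
CREATE EXTERNAL LOCATION IF NOT EXISTS my_location
URL 'abfss://container@accstore.dfs.core.windows.net/'
WITH (STORAGE CREDENTIAL my_credential);

-- and the grants so my user can read files and create external tables through it
GRANT READ FILES, CREATE EXTERNAL TABLE ON EXTERNAL LOCATION my_location TO `account users`;
```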
So far, I have tried two ways that work just fine:
1/
%sql
CREATE TABLE raw.table;
COPY INTO raw.table
FROM 'abfss://container@accstore.dfs.core.windows.net/'
FILEFORMAT = CSV
COPY_OPTIONS ('mergeSchema' = 'true')
2/ works perfectly:
%python
df = spark.read.schema(schema).csv("abfss://raw@twitterdatalake.dfs.core.windows.net/", header=True, escape='"', quote='"', multiLine=True)  # inferSchema=True
3/ doesn't work:
%sql
DROP TABLE IF EXISTS raw.table;
CREATE EXTERNAL TABLE raw.table
USING CSV
OPTIONS (path "abfss://raw@accstore.dfs.core.windows.net/", header 'true', inferSchema 'true');
FileReadException: Error while reading file abfss:REDACTED_LOCAL_PART@accsstore.dfs.core.windows.net/file.csv.
Caused by: KeyProviderException: Failure to initialize configuration for storage account twitterdatalake.dfs.core.windows.net: Invalid configuration value detected for fs.azure.account.key
Caused by: InvalidConfigurationValueException: Invalid configuration value detected for fs.azure.account.key
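In case it helps with diagnosing: as I understand it, the following should resolve through the Unity Catalog external location rather than any `fs.azure.account.key` Spark config (`my_location` is a placeholder name, not my real one):

```sql
-- hedged diagnostic (placeholder name): if my understanding is right, these go
-- through Unity Catalog permissions, not through fs.azure.account.key
DESCRIBE EXTERNAL LOCATION my_location;
LIST 'abfss://raw@accstore.dfs.core.windows.net/';
```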
Yes, I know you'll ask: why do you need this particular way? I don't know; I just saw it a lot in the certification exam, so I guess it's a best practice? Furthermore, the fact that it doesn't work is really, really annoying me.
Does anyone have an idea why it doesn't work?
Thank you!
Have a good day