Hi @Kaniz Fatma,
Thank you for your reply. I followed the steps in the link to mount the ADLS container from the notebook, but I'm getting an error:
Operation failed: "This request is not authorized to perform this operation.", 403, HEAD, https://myadls2.dfs.core.windows.net/libraries/?upn=false&action=getAccessControl&timeout=90
ExecutionError                            Traceback (most recent call last)
<command-274923613237744> in <module>
      5           "fs.azure.account.oauth2.client.endpoint": "https://login.microsoftonline.com/12a3af23-a769-4654-847f-958f3d479f4a/oauth2/token"}
      6 
----> 7 dbutils.fs.mount(
      8   source = "abfss://libraries@myadls2.dfs.core.windows.net/",
      9   mount_point = "/mnt/libraries",
 
/databricks/python_shell/dbruntime/dbutils.py in f_with_exception_handling(*args, **kwargs)
    387                     exc.__context__ = None
    388                     exc.__cause__ = None
--> 389                     raise exc
    390 
    391             return f_with_exception_handling
 
ExecutionError: An error occurred while calling o360.mount.
: Operation failed: "This request is not authorized to perform this operation.", 403, HEAD, https://myadls2.dfs.core.windows.net/libraries/?upn=false&action=getAccessControl&timeout=90
	at shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.services.AbfsRestOperation.execute(AbfsRestOperation.java:246)
	at shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.services.AbfsClient.getAclStatus(AbfsClient.java:955)
	at shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.services.AbfsClient.getAclStatus(AbfsClient.java:937)
	at shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.AzureBlobFileSystemStore.getIsNamespaceEnabled(AzureBlobFileSystemStore.java:318)
	at shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.AzureBlobFileSystemStore.getFileStatus(AzureBlobFileSystemStore.java:883)
	at shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.AzureBlobFileSystem.getFileStatus(AzureBlobFileSystem.java:682)
	at com.databricks.backend.daemon.dbutils.DBUtilsCore.verifyAzureFileSystem(DBUtilsCore.scala:824)
	at com.databricks.backend.daemon.dbutils.DBUtilsCore.createOrUpdateMount(DBUtilsCore.scala:734)
	at com.databricks.backend.daemon.dbutils.DBUtilsCore.mount(DBUtilsCore.scala:776)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
	at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:380)
	at py4j.Gateway.invoke(Gateway.java:295)
	at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
	at py4j.commands.CallCommand.execute(CallCommand.java:79)
	at py4j.GatewayConnection.run(GatewayConnection.java:251)
	at java.lang.Thread.run(Thread.java:748)
FYI: the service principal has the following roles on the ADLS account: Storage Blob Data Reader, Storage Blob Data Contributor, Storage Blob Data Owner, and Owner.
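For reference, here is the full mount snippet I'm running. The secret scope and key names below are placeholders; the endpoint, source, and mount point match the traceback above:

# OAuth config for the service principal
# ("my-scope", "sp-client-id", "sp-client-secret" are placeholder secret names)
configs = {"fs.azure.account.auth.type": "OAuth",
          "fs.azure.account.oauth.provider.type": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
          "fs.azure.account.oauth2.client.id": dbutils.secrets.get(scope="my-scope", key="sp-client-id"),
          "fs.azure.account.oauth2.client.secret": dbutils.secrets.get(scope="my-scope", key="sp-client-secret"),
          "fs.azure.account.oauth2.client.endpoint": "https://login.microsoftonline.com/12a3af23-a769-4654-847f-958f3d479f4a/oauth2/token"}

# Mount the "libraries" container from the storage account
dbutils.fs.mount(
  source = "abfss://libraries@myadls2.dfs.core.windows.net/",
  mount_point = "/mnt/libraries",
  extra_configs = configs)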