- 1907 Views
- 3 replies
- 2 kudos
We already know that we can mount Azure Data Lake Gen2 with OAuth2 using this:
configs = {"fs.azure.account.auth.type": "OAuth",
           "fs.azure.account.oauth.provider.type": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
           ...
Latest Reply
Is there any update on this feature request? OAuth still does not seem to work with Azure Blob Storage. The configuration works fine for ADLS Gen2, but for Azure Blob Storage only SAS tokens and the account key appear to work.
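For context, the snippet in the question follows the standard Databricks pattern for mounting ADLS Gen2 with a service principal. A minimal sketch of the full configuration, assuming the client secret is kept in a Databricks secret scope; all angle-bracket names and the /mnt/lake mount point are placeholders:

```python
# Minimal sketch: mount ADLS Gen2 with OAuth via a service principal.
# <application-id>, <scope>, <secret-key>, <directory-id>, <container>,
# and <storage-account> are placeholders for your own values.
configs = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type":
        "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": "<application-id>",
    "fs.azure.account.oauth2.client.secret":
        dbutils.secrets.get(scope="<scope>", key="<secret-key>"),
    "fs.azure.account.oauth2.client.endpoint":
        "https://login.microsoftonline.com/<directory-id>/oauth2/token",
}

# Mount the container under DBFS; runs inside a Databricks notebook.
dbutils.fs.mount(
    source="abfss://<container>@<storage-account>.dfs.core.windows.net/",
    mount_point="/mnt/lake",
    extra_configs=configs,
)
```

Note that this OAuth provider belongs to the ABFS driver, which is why it works against ADLS Gen2 (abfss://) endpoints but not against wasbs:// Blob Storage mounts, matching what the reply observes.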
- 7349 Views
- 5 replies
- 1 kudos
I want to read data from an S3 access point. I can successfully access the data through the access point using the boto3 client:
s3 = boto3.resource('s3')
ap = s3.Bucket('arn:aws:s3:[region]:[aws account id]:accesspoint/[S3 Access Point name]')
for obj in ap.object...
Latest Reply
I'm reaching out for assistance with an issue: I'm trying to read JSON files from an S3 Multi-Region Access Point using a Databricks notebook. Reading directly from the S3 bucket presents no challenges, but through the access point I encounter an "j...
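For reference, a completed version of the boto3 snippet from the question; a sketch assuming the attached IAM credentials are allowed through the access point, with the ARN components (region, account ID, access point name) as illustrative placeholders:

```python
import boto3

# An access point ARN can be used anywhere boto3 accepts a bucket name.
s3 = boto3.resource("s3")
ap = s3.Bucket("arn:aws:s3:us-east-1:123456789012:accesspoint/my-access-point")

# List every object reachable through the access point.
for obj in ap.objects.all():
    print(obj.key)
```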
- 2571 Views
- 2 replies
- 1 kudos
I am reading data from the folder /mnt/lake/customer, where /mnt/lake is a mount path referring to ADLS Gen2. I would now like to rename the folder from /mnt/lake/customer to /mnt/lake/customeraddress without copying the data from one folder to ano...
Latest Reply
Atanu (Esteemed Contributor)
This might help, @Simhadri Raju: https://docs.databricks.com/data/databricks-file-system.html#local-file-api-limitations
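A minimal sketch of the move itself, using the paths from the question; note that, depending on the source and destination filesystems, the move may be performed as a copy followed by a delete rather than a pure metadata rename, which is the kind of limitation the linked page discusses:

```python
# Move (rename) the directory on the mount; recurse=True is required
# when the source is a directory. Paths are the ones from the question.
dbutils.fs.mv("/mnt/lake/customer", "/mnt/lake/customeraddress", recurse=True)
```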