- 4499 Views
- 3 replies
- 2 kudos
We already know that we can mount Azure Data Lake Gen2 with OAuth2 using this:
configs = {"fs.azure.account.auth.type": "OAuth",
           "fs.azure.account.oauth.provider.type": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
           ...
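For reference, the truncated dictionary above typically continues with the OAuth client id, client secret, and token endpoint keys documented for ADLS Gen2; in the sketch below the application id, directory id, and secret scope/key names are placeholders, not values from the thread:
configs = {"fs.azure.account.auth.type": "OAuth",
           "fs.azure.account.oauth.provider.type": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
           "fs.azure.account.oauth2.client.id": "<application-id>",
           "fs.azure.account.oauth2.client.secret": dbutils.secrets.get(scope="<scope-name>", key="<service-credential-key>"),
           "fs.azure.account.oauth2.client.endpoint": "https://login.microsoftonline.com/<directory-id>/oauth2/token"}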
Latest Reply
Try replacing wasbs with abfss and blob with dfs in the URI; that should work!
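In other words, the mount call would use the abfss scheme against the dfs endpoint. A minimal sketch, with placeholder container, storage account, and mount point names:
dbutils.fs.mount(
    source = "abfss://<container-name>@<storage-account-name>.dfs.core.windows.net/",
    mount_point = "/mnt/<mount-name>",
    extra_configs = configs)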
2 More Replies
- 10949 Views
- 2 replies
- 1 kudos
I want to read data from an S3 access point. I successfully accessed the data through the S3 access point using the boto3 client:
s3 = boto3.resource('s3')
ap = s3.Bucket('arn:aws:s3:[region]:[aws account id]:accesspoint/[S3 Access Point name]')
for obj in ap.object...
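The truncated loop above presumably iterates ap.objects.all(); a minimal sketch of the working boto3 side, with hypothetical region, account id, and access point name:
import json
import boto3

s3 = boto3.resource('s3')
# The ARN below is a placeholder; an access point ARN can be used wherever a bucket name is expected
ap = s3.Bucket('arn:aws:s3:us-east-1:123456789012:accesspoint/my-access-point')

# List the objects exposed through the access point and read the JSON ones
for obj in ap.objects.all():
    if obj.key.endswith('.json'):
        record = json.loads(obj.get()['Body'].read())
        print(obj.key, record)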
Latest Reply
I'm reaching out to seek assistance as I navigate an issue. Currently, I'm trying to read JSON files from an S3 Multi-Region Access Point using a Databricks notebook. While reading directly from the S3 bucket presents no challenges, I encounter an "j...
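If the goal is to have Spark read through the access point directly (rather than via boto3), hadoop-aws 3.3.2+ documents a per-bucket access point ARN setting; whether that covers a Multi-Region Access Point depends on the runtime's Hadoop and AWS SDK versions, so the sketch below, with its placeholder bucket name and ARN, is an assumption to verify rather than a confirmed fix for the error above:
# Assumes a Databricks notebook (sc available) and hadoop-aws with S3 Access Point support (3.3.2+).
# "my-data" and the ARN are placeholders; on a real cluster this is usually set once
# as spark.hadoop.fs.s3a.bucket.my-data.accesspoint.arn in the cluster Spark config.
sc._jsc.hadoopConfiguration().set(
    "fs.s3a.bucket.my-data.accesspoint.arn",
    "arn:aws:s3:us-east-1:123456789012:accesspoint/my-access-point")

df = spark.read.json("s3a://my-data/path/to/json/")
df.show(5, truncate=False)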
1 More Replies
- 4915 Views
- 1 replies
- 1 kudos
I am reading data from the folder /mnt/lake/customer, where /mnt/lake is the mount path referring to ADLS Gen 2. Now I would like to rename the folder from /mnt/lake/customer to /mnt/lake/customeraddress without copying the data from one folder to ano...
Latest Reply
Atanu
Databricks Employee
https://docs.databricks.com/data/databricks-file-system.html#local-file-api-limitations this might help @Simhadri Raju
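If the intent is a pure rename with no data movement, one option (an assumption, not something confirmed in this thread) is to call the Hadoop FileSystem rename on the mounted path, which ADLS Gen2 can satisfy as a metadata operation; the paths are the ones from the question:
# Sketch for a Databricks notebook: rename the folder through the Hadoop FileSystem API
# so the storage layer performs a rename instead of a copy followed by a delete.
jvm = spark.sparkContext._jvm
hadoop_conf = spark.sparkContext._jsc.hadoopConfiguration()

src = jvm.org.apache.hadoop.fs.Path("dbfs:/mnt/lake/customer")
dst = jvm.org.apache.hadoop.fs.Path("dbfs:/mnt/lake/customeraddress")

fs = src.getFileSystem(hadoop_conf)
print("renamed:", fs.rename(src, dst))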