- 4349 Views
- 3 replies
- 2 kudos
We already know that we can mount Azure Data Lake Gen2 with OAuth2 using this:
configs = {"fs.azure.account.auth.type": "OAuth",
"fs.azure.account.oauth.provider.type": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
...
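For context, a minimal sketch of what the full mount usually looks like with this provider; everything in angle brackets (application id, secret scope, tenant, container, account, mount name) is a placeholder of mine, not a value from the thread:

```python
# Sketch of an OAuth (service principal) mount for ADLS Gen2.
# All <...> values are hypothetical placeholders.
configs = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type":
        "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": "<application-id>",
    # In a real notebook, pull this from a secret scope instead of a literal:
    # dbutils.secrets.get(scope="<scope>", key="<key>")
    "fs.azure.account.oauth2.client.secret": "<client-secret>",
    "fs.azure.account.oauth2.client.endpoint":
        "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
}

# Inside a Databricks notebook you would then mount with:
# dbutils.fs.mount(
#     source="abfss://<container>@<account>.dfs.core.windows.net/",
#     mount_point="/mnt/<mount-name>",
#     extra_configs=configs,
# )
```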
Latest Reply
Try replacing wasbs with abfss and blob with dfs in the URI; that should work!
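The suggested rewrite is a pure string substitution on the URI; a small sketch (the container and account names are examples of mine):

```python
# Turn a wasbs:// Blob-endpoint URI into the equivalent abfss://
# Data Lake (dfs) endpoint URI, as suggested in the reply.
def wasbs_to_abfss(uri: str) -> str:
    return (uri
            .replace("wasbs://", "abfss://")
            .replace(".blob.core.windows.net", ".dfs.core.windows.net"))

print(wasbs_to_abfss(
    "wasbs://mycontainer@myaccount.blob.core.windows.net/path/to/file"))
# abfss://mycontainer@myaccount.dfs.core.windows.net/path/to/file
```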
by Woody • New Contributor II
- 1187 Views
- 1 replies
- 2 kudos
I need to log in to try Databricks and I can't... so you have an OAuth issue... I can't try Databricks at all because the country icon doesn't work, and it sends a URI issue from your front end to the back end: "Request_URI=&Geo_country_code=&Geo_country_i...
Latest Reply
Hi @Jessica Woods, thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers...
- 2436 Views
- 2 replies
- 1 kudos
Square brackets in ADLS are accepted, so why can't I list the files in the folder? I have tried escaping the square brackets manually, but then the escaped values are re-escaped from %5B to %255B and %5D to %255D. I get: URISyntaxException: Illegal ...
Latest Reply
@Joshua Stafford: The URISyntaxException error you are encountering is likely due to the fact that square brackets are reserved characters in URIs (Uniform Resource Identifiers) and need to be properly encoded when used in a URL. In this case, it ap...
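The double-escaping the question describes is easy to reproduce: percent-encoding an already-encoded value encodes the '%' itself, turning %5B into %255B. A quick illustration with the standard library:

```python
from urllib.parse import quote

# Encoding once turns the brackets into %5B / %5D.
print(quote("[0-9]"))          # %5B0-9%5D

# Encoding the already-encoded value re-escapes the '%' character,
# producing %255B / %255D -- exactly the re-escaping seen in the question.
print(quote(quote("[0-9]")))   # %255B0-9%255D
```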
- 7847 Views
- 2 replies
- 0 kudos
I am trying to read a folder of partition files where each partition is date/hour/timestamp.csv, and timestamp is the exact timestamp in ISO format, e.g. 09-2022-12-05T20:35:15.2786966Z. It seems like Spark is having issues with reading files with col...
Latest Reply
The issue was reopened again: https://issues.apache.org/jira/browse/HDFS-14762
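While HDFS-14762 is open, a common workaround (an assumption on my part, not something stated in the thread) is to write the timestamp without colons before using it in a file name, so the Hadoop Path/URI parser never sees ':' inside a path segment:

```python
from datetime import datetime, timezone

ts = datetime(2022, 12, 5, 20, 35, 15, tzinfo=timezone.utc)

# ISO format contains colons, which break Hadoop path parsing:
print(ts.isoformat())                   # 2022-12-05T20:35:15+00:00

# A colon-free variant that is safe in a path segment:
print(ts.strftime("%Y-%m-%dT%H%M%SZ"))  # 2022-12-05T203515Z
```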
- 7027 Views
- 4 replies
- 3 kudos
Latest Reply
Thanks a lot for the help! Removing the colon fixed it. Now I need to fix the Data Factory instance that writes to my storage container. Hope it's easy; Data Factory is such a hassle.
- 3743 Views
- 1 replies
- 3 kudos
I'm trying to list the number of files in an S3 bucket. I initially used "aws s3 ls <s3://>" to list the files and it worked. However, when trying to do the same using dbutils.fs.ls, I'm getting java.lang.NullPointerException: null uri host. This can be ...
Latest Reply
You might be encountering an issue with bucket naming, which I'm also getting with a bucket named something.[0-9]. See https://issues.apache.org/jira/browse/HADOOP-17241
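Dots and brackets in a bucket name are what trip up Java's URI host parsing in HADOOP-17241. A hypothetical pre-flight check (the regex is my assumption, modeled on S3's own naming guidance, not something from the thread):

```python
import re

# Flag bucket names likely to break URI parsing in Hadoop's S3A client:
# lowercase letters, digits, and hyphens only; no dots or brackets.
SAFE_BUCKET = re.compile(r"^[a-z0-9][a-z0-9-]{1,61}[a-z0-9]$")

def is_uri_safe_bucket(name: str) -> bool:
    return SAFE_BUCKET.fullmatch(name) is not None

print(is_uri_safe_bucket("my-data-bucket"))   # True
print(is_uri_safe_bucket("something.[0-9]"))  # False -- the bucket above
```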
- 1869 Views
- 0 replies
- 0 kudos
Hello, today on our workspace we access everything via mount points; we plan to change this to "abfss://" for security, governance, and performance reasons. The problem is that sometimes we interact with files using "python only" code, and apparently ...
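Plain-Python file APIs (open, os) do not understand abfss:// URIs; libraries such as fsspec with adlfs can open them, and either way the URI has to be decomposed into account, container, and path. A hypothetical parser sketch (the helper name and example values are mine, not from the thread):

```python
from urllib.parse import urlparse

# Split an abfss:// URI into the pieces a Python-side storage client
# (e.g. fsspec + adlfs, or azure-storage-file-datalake) needs.
def parse_abfss(uri: str) -> dict:
    parts = urlparse(uri)
    assert parts.scheme == "abfss", "not an abfss:// URI"
    container, _, host = parts.netloc.partition("@")
    account = host.split(".")[0]
    return {"account": account,
            "container": container,
            "path": parts.path.lstrip("/")}

print(parse_abfss("abfss://mycontainer@myaccount.dfs.core.windows.net/dir/file.csv"))
# {'account': 'myaccount', 'container': 'mycontainer', 'path': 'dir/file.csv'}
```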