I have a function that is meant to use the `cloudFiles` source to stream file contents from S3. It is configured like this:
```
stream = (
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "text")
    .option("cloudFiles.schemaLocation", MY_CHECKPOINT_PATH)
    .option("wholeText", True)
    .option("cloudFiles.fetchParallelism", 1)
    .option("cloudFiles.pathGlobFilter", "*/subdir/*")
    .load(MY_S3_PATH)
)
```
According to the Auto Loader docs, this is a valid option, but when I run this in a notebook on a DBR 11.3 LTS personal cluster, I get `CloudFilesIllegalArgumentException: Found unknown option keys: cloudFiles.pathglobfilter`
This is on an AWS deployment of Databricks. I've also tried running on a DBR 12.2 cluster, with the same result, and a number of different versions of the glob filter pattern itself, to no avail. The pattern above is the simplest one; my real use case would need something like a comma-separated list of patterns. Representative variants are sketched below.
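For concreteness, these are the kinds of patterns I substituted into the builder above (the directory names are illustrative placeholders, not my real bucket layout, and the brace form just assumes Hadoop-style glob alternation is the right way to express multiple patterns); every one of them raised the same unknown-option error:
```
# Illustrative pattern variants; the "subdir" names are placeholders.
# Each was passed as the value of cloudFiles.pathGlobFilter in the
# builder above, and each produced the same unknown-option error.
glob_variants = [
    "*/subdir/*",                 # the simple case shown above
    "**/subdir/**",               # recursive ** variant
    "{*/subdirA/*,*/subdirB/*}",  # brace alternation, closer to my real multi-pattern need
]
```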