Have you tried creating a volume on top of the external location, and using the volume in spark.read.parquet?
i.e.
spark.read.parquet('/Volumes/<catalog_name>/<schema_name>/<volume_name>/<folder_name>/<file_name>.parquet')
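For reference, a minimal sketch of the whole flow (all names here are placeholders; this assumes the external location is already registered in Unity Catalog and you have the CREATE EXTERNAL VOLUME privilege on it):

# Create an external volume over the external location's cloud path
spark.sql("""
    CREATE EXTERNAL VOLUME IF NOT EXISTS my_catalog.my_schema.my_volume
    LOCATION 's3://my-bucket/path/to/data'
""")

# Read the parquet file through the volume path instead of the raw cloud URI
df = spark.read.parquet('/Volumes/my_catalog/my_schema/my_volume/folder_name/file_name.parquet')
df.show()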
Edit: also, not sure why the Databricks community manager here said shared access mode is "in preview" and that it's "not recommended for production workloads", because this is simply not true. Shared access mode is not in preview and is completely safe for production workloads, and has been for almost two years. The only thing currently in preview on shared access mode clusters is Scala workloads.