My Databricks workspace is hosted in AWS, and I am trying to access data stored in Google Cloud Storage (GCS).
I have followed the instructions here: https://docs.databricks.com/en/connect/storage/gcs.html
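For reference, my cluster Spark config follows the properties shown in those docs and looks roughly like this (the project ID and the secret scope/key names are placeholders for my own values):

```
spark.hadoop.google.cloud.auth.service.account.enable true
spark.hadoop.fs.gs.project.id my-gcp-project
spark.hadoop.fs.gs.auth.service.account.email {{secrets/gcp/gsa_email}}
spark.hadoop.fs.gs.auth.service.account.private.key {{secrets/gcp/gsa_private_key}}
spark.hadoop.fs.gs.auth.service.account.private.key.id {{secrets/gcp/gsa_private_key_id}}
```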
I get the error "java.io.IOException: Invalid PKCS8 data." when running dbutils.fs.ls() to view the bucket contents.
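The failing call is just a plain listing of the bucket (bucket name is a placeholder):

```python
# Fails with java.io.IOException: Invalid PKCS8 data.
dbutils.fs.ls("gs://my-bucket/")
```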
Looking online, it seems the issue is that the private key needs to be encoded correctly (as raw bytes with real newlines), but I am not sure how to set this in the cluster Spark config. The example solutions I have found do the encoding in notebook code, which is not an option in a static Spark config.
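For illustration, this is the sort of notebook-level workaround I keep finding (a sketch only; the secret scope/key names are placeholders, and I have not confirmed this fixes the error):

```python
# Read the service account credentials from a Databricks secret scope.
raw_key = dbutils.secrets.get(scope="gcp", key="gsa_private_key")

# The stored key often contains literal "\n" sequences instead of real
# newlines, which supposedly is what trips the PKCS8 parser.
fixed_key = raw_key.replace("\\n", "\n")

# Set the GCS connector properties at session level instead of in the
# cluster Spark config.
spark.conf.set("fs.gs.auth.service.account.email",
               dbutils.secrets.get(scope="gcp", key="gsa_email"))
spark.conf.set("fs.gs.auth.service.account.private.key.id",
               dbutils.secrets.get(scope="gcp", key="gsa_private_key_id"))
spark.conf.set("fs.gs.auth.service.account.private.key", fixed_key)
```

Is there a way to achieve the same newline/byte encoding fix directly in the cluster Spark config, where I can't run code like this?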