java.io.IOException: Invalid PKCS8 data error when reading data from Google Storage
01-22-2024 09:56 AM
My Databricks workspace is hosted on AWS, and I am trying to access data in Google Cloud Platform.
I have followed the instructions here: https://docs.databricks.com/en/connect/storage/gcs.html
I get the error "java.io.IOException: Invalid PKCS8 data." when running dbutils.fs.ls() to view the bucket contents.
Looking online, it appears the private key needs to be encoded as bytes, but I am not sure how to set this in the cluster Spark config. Example solutions online use notebook code to do the encoding, but that's not an option in the Spark config.
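For what it's worth, a common cause of "Invalid PKCS8 data." is that the private key pasted into the config contains the literal two-character sequence `\n` instead of real newlines, so the PEM/PKCS8 parser cannot read it. Below is a minimal sketch of normalizing the key before placing it in `spark.hadoop.fs.gs.auth.service.account.private.key` (the `normalize_private_key` helper name and the truncated key value are illustrative, not from the Databricks docs):

```python
# The key as it often appears when copied verbatim from the downloaded
# service-account JSON file: backslash-n escape sequences, not newlines.
raw = r"-----BEGIN PRIVATE KEY-----\nMIIB...\n-----END PRIVATE KEY-----\n"

def normalize_private_key(raw_key: str) -> str:
    """Replace literal backslash-n sequences with real newline characters
    so the key parses as valid PEM/PKCS8."""
    return raw_key.replace("\\n", "\n")

pem = normalize_private_key(raw)
print(pem.splitlines()[0])  # -----BEGIN PRIVATE KEY-----
```

Note that `json.loads()` on the key file already converts the escapes to real newlines, which is why notebook-based solutions that read the JSON file work; the problem only appears when the raw string is pasted directly into the cluster's Spark config text box.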
01-22-2024 04:05 PM
@364488 - can you please share the Spark configs added to the cluster?
01-22-2024 07:29 PM
Hi, could you also please share the full error stack trace?

