<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: [Unity Catalog] Lack of Credential Type When GCS Interworking in Databricks in AWS Environment in Administration &amp; Architecture</title>
    <link>https://community.databricks.com/t5/administration-architecture/unity-catalog-lack-of-credential-type-when-gcs-interworking-in/m-p/153806#M5124</link>
    <description>&lt;P&gt;&lt;SPAN&gt;Hi — this is expected behavior, not a bug. Unity Catalog storage credentials in the UI are &lt;/SPAN&gt;&lt;STRONG&gt;cloud-specific to your workspace deployment&lt;/STRONG&gt;&lt;SPAN&gt;. Since your workspace runs on AWS, you only see AWS IAM Role and Cloudflare API Token. The GCP Service Account option only appears on GCP-deployed Databricks workspaces.&lt;/SPAN&gt;&lt;/P&gt;
&lt;H3&gt;&lt;STRONG&gt;How to Access GCS from an AWS Databricks Workspace&lt;/STRONG&gt;&lt;/H3&gt;
&lt;P&gt;&lt;SPAN&gt;Unity Catalog external locations don't support cross-cloud storage credentials, but you have a few options:&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;Option 1: GCS Connector + Service Account Key (most common)&lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;Upload the GCS connector JAR and authenticate using a GCP service account key stored in a Databricks secret scope:&lt;/SPAN&gt;&lt;/P&gt;
&lt;PRE&gt;&lt;CODE&gt;# Store your GCP SA key JSON in a secret scope first:
# databricks secrets put-secret --scope gcp --key sa-key --string-value '&amp;lt;json&amp;gt;'

service_account_key = dbutils.secrets.get("gcp", "sa-key")

# The connector expects a file path here, not the raw JSON,
# so write the key to a local file first
with open("/tmp/sa-key.json", "w") as f:
    f.write(service_account_key)

spark.conf.set("fs.gs.auth.type", "SERVICE_ACCOUNT_JSON_KEYFILE")
spark.conf.set("fs.gs.auth.service.account.json.keyfile", "/tmp/sa-key.json")

df = spark.read.format("parquet").load("gs://your-bucket/path/")&lt;/CODE&gt;&lt;/PRE&gt;
&lt;P&gt;&lt;SPAN&gt;On a multi-node cluster the keyfile must exist on every node, so distribute it via an init script rather than writing it only on the driver.&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;You'll need the &lt;/SPAN&gt;&lt;SPAN&gt;gcs-connector&lt;/SPAN&gt;&lt;SPAN&gt; JAR installed on your cluster (add it via cluster Libraries tab or init script).&lt;/SPAN&gt;&lt;/P&gt;
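&lt;P&gt;&lt;SPAN&gt;If you add it via the cluster Libraries tab as a Maven package, the coordinate looks like the following (the version shown is illustrative; check Maven Central for the latest hadoop3 release):&lt;/SPAN&gt;&lt;/P&gt;
&lt;PRE&gt;&lt;CODE&gt;com.google.cloud.bigdataoss:gcs-connector:hadoop3-2.2.21&lt;/CODE&gt;&lt;/PRE&gt;
&lt;P&gt;&lt;SPAN&gt;If you run into dependency conflicts with the runtime's bundled Hadoop libraries, try the shaded variant of the JAR instead.&lt;/SPAN&gt;&lt;/P&gt;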
&lt;P&gt;&lt;STRONG&gt;Option 2: GCS S3-Compatible API with HMAC Keys&lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;GCS supports S3-compatible access. Create HMAC keys in GCP, then use the S3A connector:&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;spark.conf.set("fs.s3a.endpoint", "&lt;A href="https://storage.googleapis.com" target="_blank"&gt;https://storage.googleapis.com&lt;/A&gt;")&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;spark.conf.set("fs.s3a.access.key", dbutils.secrets.get("gcp", "hmac-access-key"))&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;spark.conf.set("fs.s3a.secret.key", dbutils.secrets.get("gcp", "hmac-secret-key"))&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;df = spark.read.format("parquet").load("s3a://your-gcs-bucket/path/")&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;Option 3: Delta Sharing (if data is on a GCP Databricks workspace)&lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;If the GCS data is managed by another Databricks workspace on GCP, the cleanest approach is &lt;/SPAN&gt;&lt;A href="https://docs.databricks.com/aws/en/delta-sharing/" target="_blank"&gt;&lt;SPAN&gt;Delta Sharing&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN&gt; — share the tables from the GCP workspace and consume them in your AWS workspace. No cross-cloud credentials needed.&lt;/SPAN&gt;&lt;/P&gt;
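&lt;P&gt;&lt;SPAN&gt;On the consuming AWS workspace, a Databricks-to-Databricks share can be mounted as a regular catalog once the GCP side has created the share and added your metastore as a recipient. A minimal sketch (the provider, share, schema, and table names below are placeholders, not real objects):&lt;/SPAN&gt;&lt;/P&gt;
&lt;PRE&gt;&lt;CODE&gt;-- Run on the AWS (recipient) workspace; all names are illustrative
CREATE CATALOG IF NOT EXISTS gcs_shared USING SHARE gcp_provider.gcs_share;
SELECT * FROM gcs_shared.analytics.events LIMIT 10;&lt;/CODE&gt;&lt;/PRE&gt;
&lt;P&gt;&lt;SPAN&gt;Tables read this way remain governed by Unity Catalog on both sides.&lt;/SPAN&gt;&lt;/P&gt;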
&lt;H3&gt;&lt;STRONG&gt;Summary&lt;/STRONG&gt;&lt;/H3&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;TABLE&gt;
&lt;TBODY&gt;
&lt;TR&gt;
&lt;TD&gt;
&lt;P&gt;&lt;STRONG&gt;Approach&lt;/STRONG&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD&gt;
&lt;P&gt;&lt;STRONG&gt;Unity Catalog Governed&lt;/STRONG&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD&gt;
&lt;P&gt;&lt;STRONG&gt;Needs JAR&lt;/STRONG&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD&gt;
&lt;P&gt;&lt;STRONG&gt;Complexity&lt;/STRONG&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR&gt;
&lt;TD&gt;
&lt;P&gt;&lt;SPAN&gt;GCS Connector + SA Key&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD&gt;
&lt;P&gt;&lt;SPAN&gt;No&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD&gt;
&lt;P&gt;&lt;SPAN&gt;Yes&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD&gt;
&lt;P&gt;&lt;SPAN&gt;Medium&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR&gt;
&lt;TD&gt;
&lt;P&gt;&lt;SPAN&gt;HMAC / S3-Compatible&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD&gt;
&lt;P&gt;&lt;SPAN&gt;No&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD&gt;
&lt;P&gt;&lt;SPAN&gt;No&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD&gt;
&lt;P&gt;&lt;SPAN&gt;Low&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR&gt;
&lt;TD&gt;
&lt;P&gt;&lt;SPAN&gt;Delta Sharing&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD&gt;
&lt;P&gt;&lt;SPAN&gt;Yes&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD&gt;
&lt;P&gt;&lt;SPAN&gt;No&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD&gt;
&lt;P&gt;&lt;SPAN&gt;Low&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;/TBODY&gt;
&lt;/TABLE&gt;
&lt;P&gt;&lt;STRONG&gt;Note:&lt;/STRONG&gt;&lt;SPAN&gt; Options 1 and 2 bypass Unity Catalog governance (no external locations / storage credentials). If governance is a requirement, Delta Sharing is the recommended path.&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;Docs:&lt;/STRONG&gt;&lt;/P&gt;
&lt;UL&gt;
&lt;LI style="font-weight: 400;" aria-level="1"&gt;&lt;A href="https://docs.databricks.com/aws/en/connect/unity-catalog/cloud-storage/" target="_blank"&gt;&lt;SPAN&gt;Connect to Cloud Storage via Unity Catalog (AWS)&lt;/SPAN&gt;&lt;/A&gt;&lt;/LI&gt;
&lt;LI style="font-weight: 400;" aria-level="1"&gt;&lt;A href="https://docs.databricks.com/aws/en/delta-sharing/" target="_blank"&gt;&lt;SPAN&gt;Delta Sharing&lt;/SPAN&gt;&lt;/A&gt;&lt;/LI&gt;
&lt;LI style="font-weight: 400;" aria-level="1"&gt;&lt;A href="https://cloud.google.com/dataproc/docs/concepts/connectors/cloud-storage" target="_blank"&gt;&lt;SPAN&gt;GCS Connector for Spark&lt;/SPAN&gt;&lt;/A&gt;&lt;/LI&gt;
&lt;/UL&gt;</description>
    <pubDate>Thu, 09 Apr 2026 03:35:04 GMT</pubDate>
    <dc:creator>anuj_lathi</dc:creator>
    <dc:date>2026-04-09T03:35:04Z</dc:date>
    <item>
      <title>[Unity Catalog] Lack of Credential Type When GCS Interworking in Databricks in AWS Environment</title>
      <link>https://community.databricks.com/t5/administration-architecture/unity-catalog-lack-of-credential-type-when-gcs-interworking-in/m-p/153801#M5123</link>
      <description>&lt;P&gt;Hi, I'm using Databricks in an AWS environment and&lt;BR /&gt;I'm trying to link data from GCP GCS to Unity Catalog.&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;[Official document]&lt;/STRONG&gt;&lt;BR /&gt;I tried to set this up by following the official Databricks guide below.&lt;/P&gt;&lt;P&gt;▶&amp;nbsp;&lt;A href="https://docs.databricks.com/gcp/en/connect/unity-catalog/cloud-services/service-credentials#optional-assign-a-service-credential-to-specific-workspaces" target="_self"&gt;Create service credentials guides&lt;/A&gt;&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;[Problem situation]&lt;/STRONG&gt;&lt;BR /&gt;The document says to create credentials under Catalog → External Data → Credentials by selecting the GCP Service Account type, but my environment differs.&lt;/P&gt;&lt;P&gt;Menu mismatch: when selecting Credential Type, only AWS IAM Role and Cloudflare API Token are shown, and the GCP Service Account option described in the document is not visible at all. (Please check the attached image.)&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="스크린샷 2026-04-09 100042.png" style="width: 381px;"&gt;&lt;img src="https://community.databricks.com/t5/image/serverpage/image-id/25812i475CE4D02DF6CCB5/image-size/medium?v=v2&amp;amp;px=400" role="button" title="스크린샷 2026-04-09 100042.png" alt="스크린샷 2026-04-09 100042.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;Environment difference: our team's Databricks workspace is currently running in an AWS environment.&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;[Question]&lt;/STRONG&gt;&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;Is it normal for AWS-based workspaces to restrict creating GCP credentials via the UI?&lt;/LI&gt;&lt;LI&gt;If I need AWS-to-GCP interworking like this, do I have to force-create the credentials via the CLI or API instead of the UI, or is there a way to enable other cloud credential types in the metastore settings?&lt;/LI&gt;&lt;LI&gt;Or, if there is a separate best practice for connecting to GCS from an AWS environment, please share it.&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;&lt;STRONG&gt;[Environment Information]&lt;/STRONG&gt;&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;Runtime: 16.3&lt;/LI&gt;&lt;LI&gt;Unity Catalog: Enabled&lt;/LI&gt;&lt;LI&gt;Cloud Environment: AWS&lt;/LI&gt;&lt;/UL&gt;</description>
      <pubDate>Thu, 09 Apr 2026 01:05:48 GMT</pubDate>
      <guid>https://community.databricks.com/t5/administration-architecture/unity-catalog-lack-of-credential-type-when-gcs-interworking-in/m-p/153801#M5123</guid>
      <dc:creator>Junhyeon-Jeon_1</dc:creator>
      <dc:date>2026-04-09T01:05:48Z</dc:date>
    </item>
    <item>
      <title>Re: [Unity Catalog] Lack of Credential Type When GCS Interworking in Databricks in AWS Environment</title>
      <link>https://community.databricks.com/t5/administration-architecture/unity-catalog-lack-of-credential-type-when-gcs-interworking-in/m-p/153806#M5124</link>
      <description>&lt;P&gt;&lt;SPAN&gt;Hi — this is expected behavior, not a bug. Unity Catalog storage credentials in the UI are &lt;/SPAN&gt;&lt;STRONG&gt;cloud-specific to your workspace deployment&lt;/STRONG&gt;&lt;SPAN&gt;. Since your workspace runs on AWS, you only see AWS IAM Role and Cloudflare API Token. The GCP Service Account option only appears on GCP-deployed Databricks workspaces.&lt;/SPAN&gt;&lt;/P&gt;
&lt;H3&gt;&lt;STRONG&gt;How to Access GCS from an AWS Databricks Workspace&lt;/STRONG&gt;&lt;/H3&gt;
&lt;P&gt;&lt;SPAN&gt;Unity Catalog external locations don't support cross-cloud storage credentials, but you have a few options:&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;Option 1: GCS Connector + Service Account Key (most common)&lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;Upload the GCS connector JAR and authenticate using a GCP service account key stored in a Databricks secret scope:&lt;/SPAN&gt;&lt;/P&gt;
&lt;PRE&gt;&lt;CODE&gt;# Store your GCP SA key JSON in a secret scope first:
# databricks secrets put-secret --scope gcp --key sa-key --string-value '&amp;lt;json&amp;gt;'

service_account_key = dbutils.secrets.get("gcp", "sa-key")

# The connector expects a file path here, not the raw JSON,
# so write the key to a local file first
with open("/tmp/sa-key.json", "w") as f:
    f.write(service_account_key)

spark.conf.set("fs.gs.auth.type", "SERVICE_ACCOUNT_JSON_KEYFILE")
spark.conf.set("fs.gs.auth.service.account.json.keyfile", "/tmp/sa-key.json")

df = spark.read.format("parquet").load("gs://your-bucket/path/")&lt;/CODE&gt;&lt;/PRE&gt;
&lt;P&gt;&lt;SPAN&gt;On a multi-node cluster the keyfile must exist on every node, so distribute it via an init script rather than writing it only on the driver.&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;You'll need the &lt;/SPAN&gt;&lt;SPAN&gt;gcs-connector&lt;/SPAN&gt;&lt;SPAN&gt; JAR installed on your cluster (add it via cluster Libraries tab or init script).&lt;/SPAN&gt;&lt;/P&gt;
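&lt;P&gt;&lt;SPAN&gt;If you add it via the cluster Libraries tab as a Maven package, the coordinate looks like the following (the version shown is illustrative; check Maven Central for the latest hadoop3 release):&lt;/SPAN&gt;&lt;/P&gt;
&lt;PRE&gt;&lt;CODE&gt;com.google.cloud.bigdataoss:gcs-connector:hadoop3-2.2.21&lt;/CODE&gt;&lt;/PRE&gt;
&lt;P&gt;&lt;SPAN&gt;If you run into dependency conflicts with the runtime's bundled Hadoop libraries, try the shaded variant of the JAR instead.&lt;/SPAN&gt;&lt;/P&gt;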
&lt;P&gt;&lt;STRONG&gt;Option 2: GCS S3-Compatible API with HMAC Keys&lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;GCS supports S3-compatible access. Create HMAC keys in GCP, then use the S3A connector:&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;spark.conf.set("fs.s3a.endpoint", "&lt;A href="https://storage.googleapis.com" target="_blank"&gt;https://storage.googleapis.com&lt;/A&gt;")&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;spark.conf.set("fs.s3a.access.key", dbutils.secrets.get("gcp", "hmac-access-key"))&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;spark.conf.set("fs.s3a.secret.key", dbutils.secrets.get("gcp", "hmac-secret-key"))&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;df = spark.read.format("parquet").load("s3a://your-gcs-bucket/path/")&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;Option 3: Delta Sharing (if data is on a GCP Databricks workspace)&lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;If the GCS data is managed by another Databricks workspace on GCP, the cleanest approach is &lt;/SPAN&gt;&lt;A href="https://docs.databricks.com/aws/en/delta-sharing/" target="_blank"&gt;&lt;SPAN&gt;Delta Sharing&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN&gt; — share the tables from the GCP workspace and consume them in your AWS workspace. No cross-cloud credentials needed.&lt;/SPAN&gt;&lt;/P&gt;
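&lt;P&gt;&lt;SPAN&gt;On the consuming AWS workspace, a Databricks-to-Databricks share can be mounted as a regular catalog once the GCP side has created the share and added your metastore as a recipient. A minimal sketch (the provider, share, schema, and table names below are placeholders, not real objects):&lt;/SPAN&gt;&lt;/P&gt;
&lt;PRE&gt;&lt;CODE&gt;-- Run on the AWS (recipient) workspace; all names are illustrative
CREATE CATALOG IF NOT EXISTS gcs_shared USING SHARE gcp_provider.gcs_share;
SELECT * FROM gcs_shared.analytics.events LIMIT 10;&lt;/CODE&gt;&lt;/PRE&gt;
&lt;P&gt;&lt;SPAN&gt;Tables read this way remain governed by Unity Catalog on both sides.&lt;/SPAN&gt;&lt;/P&gt;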
&lt;H3&gt;&lt;STRONG&gt;Summary&lt;/STRONG&gt;&lt;/H3&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;TABLE&gt;
&lt;TBODY&gt;
&lt;TR&gt;
&lt;TD&gt;
&lt;P&gt;&lt;STRONG&gt;Approach&lt;/STRONG&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD&gt;
&lt;P&gt;&lt;STRONG&gt;Unity Catalog Governed&lt;/STRONG&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD&gt;
&lt;P&gt;&lt;STRONG&gt;Needs JAR&lt;/STRONG&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD&gt;
&lt;P&gt;&lt;STRONG&gt;Complexity&lt;/STRONG&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR&gt;
&lt;TD&gt;
&lt;P&gt;&lt;SPAN&gt;GCS Connector + SA Key&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD&gt;
&lt;P&gt;&lt;SPAN&gt;No&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD&gt;
&lt;P&gt;&lt;SPAN&gt;Yes&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD&gt;
&lt;P&gt;&lt;SPAN&gt;Medium&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR&gt;
&lt;TD&gt;
&lt;P&gt;&lt;SPAN&gt;HMAC / S3-Compatible&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD&gt;
&lt;P&gt;&lt;SPAN&gt;No&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD&gt;
&lt;P&gt;&lt;SPAN&gt;No&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD&gt;
&lt;P&gt;&lt;SPAN&gt;Low&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR&gt;
&lt;TD&gt;
&lt;P&gt;&lt;SPAN&gt;Delta Sharing&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD&gt;
&lt;P&gt;&lt;SPAN&gt;Yes&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD&gt;
&lt;P&gt;&lt;SPAN&gt;No&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD&gt;
&lt;P&gt;&lt;SPAN&gt;Low&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;/TBODY&gt;
&lt;/TABLE&gt;
&lt;P&gt;&lt;STRONG&gt;Note:&lt;/STRONG&gt;&lt;SPAN&gt; Options 1 and 2 bypass Unity Catalog governance (no external locations / storage credentials). If governance is a requirement, Delta Sharing is the recommended path.&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;Docs:&lt;/STRONG&gt;&lt;/P&gt;
&lt;UL&gt;
&lt;LI style="font-weight: 400;" aria-level="1"&gt;&lt;A href="https://docs.databricks.com/aws/en/connect/unity-catalog/cloud-storage/" target="_blank"&gt;&lt;SPAN&gt;Connect to Cloud Storage via Unity Catalog (AWS)&lt;/SPAN&gt;&lt;/A&gt;&lt;/LI&gt;
&lt;LI style="font-weight: 400;" aria-level="1"&gt;&lt;A href="https://docs.databricks.com/aws/en/delta-sharing/" target="_blank"&gt;&lt;SPAN&gt;Delta Sharing&lt;/SPAN&gt;&lt;/A&gt;&lt;/LI&gt;
&lt;LI style="font-weight: 400;" aria-level="1"&gt;&lt;A href="https://cloud.google.com/dataproc/docs/concepts/connectors/cloud-storage" target="_blank"&gt;&lt;SPAN&gt;GCS Connector for Spark&lt;/SPAN&gt;&lt;/A&gt;&lt;/LI&gt;
&lt;/UL&gt;</description>
      <pubDate>Thu, 09 Apr 2026 03:35:04 GMT</pubDate>
      <guid>https://community.databricks.com/t5/administration-architecture/unity-catalog-lack-of-credential-type-when-gcs-interworking-in/m-p/153806#M5124</guid>
      <dc:creator>anuj_lathi</dc:creator>
      <dc:date>2026-04-09T03:35:04Z</dc:date>
    </item>
  </channel>
</rss>

