<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Delta Sharing from Databricks to SAP BDC fails with invalid_client error in Administration &amp; Architecture</title>
    <link>https://community.databricks.com/t5/administration-architecture/delta-sharing-from-databricks-to-sap-bdc-fails-with-invalid/m-p/143889#M4724</link>
    <description>Latest reply in the topic "Delta Sharing from Databricks to SAP BDC fails with invalid_client error" in Administration &amp; Architecture; the full post appears as the corresponding item below.</description>
    <pubDate>Tue, 13 Jan 2026 12:36:06 GMT</pubDate>
    <dc:creator>4Twannie</dc:creator>
    <dc:date>2026-01-13T12:36:06Z</dc:date>
    <item>
      <title>Delta Sharing from Databricks to SAP BDC fails with invalid_client error</title>
      <link>https://community.databricks.com/t5/administration-architecture/delta-sharing-from-databricks-to-sap-bdc-fails-with-invalid/m-p/141519#M4611</link>
      <description>&lt;H1&gt;Context&lt;/H1&gt;&lt;P&gt;&lt;SPAN&gt;We are in the process of exchanging data between SAP BDC Datasphere and Databricks (Brownfield Implementation).&lt;/SPAN&gt;&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;SAP Datasphere is hosted in AWS (eu10)&lt;/LI&gt;&lt;LI&gt;Databricks is hosted in Azure (West Europe)&lt;/LI&gt;&lt;LI&gt;&lt;SPAN&gt;&lt;SPAN&gt;The BDC Connect System is located in the same region as SAP Datasphere (eu10).&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;So far, we’ve successfully:&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;Set up the BDC Connect connection from SAP Datasphere to Databricks.&lt;/LI&gt;&lt;LI&gt;&lt;SPAN&gt;Shared data from SAP to Databricks, which is now available in our catalog.&lt;/SPAN&gt;&lt;/LI&gt;&lt;LI&gt;&lt;SPAN&gt;&lt;SPAN&gt;Read data from these delta shares in Databricks notebooks.&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;&lt;span class="lia-unicode-emoji" title=":white_heavy_check_mark:"&gt;✅&lt;/span&gt; Extracting and reading SAP data in Databricks works fine (even though it is slow, which would be a question for another discussion).&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;H1&gt;The Problem&lt;/H1&gt;&lt;P&gt;&lt;SPAN&gt;When trying to exchange data from Databricks back to SAP, things break down.&amp;nbsp;We followed the official&amp;nbsp;&lt;STRONG&gt;&lt;SPAN&gt;&lt;A href="https://docs.databricks.com/aws/en/delta-sharing/sap-bdc/" target="_self"&gt;Databricks Delta Sharing Instructions&lt;/A&gt;&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;/SPAN&gt;&lt;SPAN&gt;&lt;STRONG&gt;&lt;SPAN&gt;, &lt;/SPAN&gt;&lt;/STRONG&gt;and everything worked until the last part of Step 6:&lt;/SPAN&gt;&lt;/P&gt;&lt;P class="lia-indent-padding-left-30px"&gt;"Use Delta Sharing to share and receive data between Databricks and SAP BDC"&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;Specifically, the link: &lt;STRONG&gt;"&lt;A href="https://docs.databricks.com/aws/en/delta-sharing/sap-bdc/share-to-sap" target="_self"&gt;Grant SAP Business 
Data Cloud (BDC) recipients access to Delta Sharing data shares."&lt;/A&gt;&lt;/STRONG&gt;&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;Here's what we did:&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;&lt;SPAN&gt;Created the share.&lt;/SPAN&gt;&lt;/LI&gt;&lt;LI&gt;Shared it with the BDC Connect connection we had set up earlier.&lt;/LI&gt;&lt;LI&gt;Used a notebook to describe the share in Core Schema Notation so that SAP BDC users could read it.&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;Notebook snippet (with variables filled in):&lt;/P&gt;&lt;LI-CODE lang="python"&gt;bdc_connect_client = BdcConnectClient(DatabricksClient(dbutils, "bdc-connect-test"))

share_name = "test_share"

open_resource_discovery_information = {
    "@openResourceDiscoveryV1": {
        "title": "Title to Share",
        "shortDescription": "Share Short Description",
        "description": "Share Description"
    }
}

catalog_response = bdc_connect_client.create_or_update_share(
    share_name,
    open_resource_discovery_information
)&lt;/LI-CODE&gt;&lt;P&gt;Error returned:&lt;/P&gt;&lt;LI-CODE lang="python"&gt;ValueError: Error when trying to obtain 'access_token' from JSON response: {'error_code': 'INTERNAL_ERROR', 'message': 'Failed to retrieve partner token: INTERNAL: INTERNAL_ERROR: Unable to get user object using principal context with error invalid_client. Please try again or contact support if the issue persists.', 'details': [{'@type': 'type.googleapis.com/google.rpc.RequestInfo', 'request_id': 'c81582e9-e3f0-9d59-8ee3-7c3bbb463b92', 'serving_data': ''}]}&lt;/LI-CODE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;H1&gt;What we've tried so far&lt;/H1&gt;&lt;DIV&gt;&lt;UL&gt;&lt;LI&gt;Changed the compute setup: tested both serverless (across multiple environments) and all-purpose compute using Databricks Runtime 17.3 LTS (Apache Spark 4.0.0, Scala 2.13). &lt;EM&gt;Side note: across these requests, we also tested multiple versions of the BDC Connect client library (from 1.1.4 down to 1.1.1) on the different computes.&lt;/EM&gt;&lt;/LI&gt;&lt;LI&gt;&lt;SPAN&gt;Updated the description of the share information.&lt;/SPAN&gt;&lt;/LI&gt;&lt;LI&gt;&lt;SPAN&gt;&lt;SPAN&gt;Re-ran the notebook with different values.&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;None of these attempts resolved the issue.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;H1&gt;Question&lt;/H1&gt;&lt;P&gt;Has anyone successfully set up Delta Sharing from Databricks back into SAP BDC Datasphere?&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;&lt;SPAN&gt;Is there a specific configuration required for the Core Schema Notation step?&lt;STRONG&gt;&lt;BR /&gt;&lt;/STRONG&gt;&lt;/SPAN&gt;&lt;/LI&gt;&lt;LI&gt;Could this be an authentication/BDC Connect principal context issue?&lt;/LI&gt;&lt;LI&gt;&lt;SPAN&gt;&lt;SPAN&gt;Any known workarounds for the invalid_client error when retrieving the token?&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;Any guidance, experiences, or best practices would be greatly 
appreciated!&lt;/P&gt;&lt;/DIV&gt;</description>
      <pubDate>Tue, 09 Dec 2025 15:46:15 GMT</pubDate>
      <guid>https://community.databricks.com/t5/administration-architecture/delta-sharing-from-databricks-to-sap-bdc-fails-with-invalid/m-p/141519#M4611</guid>
      <dc:creator>4Twannie</dc:creator>
      <dc:date>2025-12-09T15:46:15Z</dc:date>
    </item>
    <item>
      <title>Re: Delta Sharing from Databricks to SAP BDC fails with invalid_client error</title>
      <link>https://community.databricks.com/t5/administration-architecture/delta-sharing-from-databricks-to-sap-bdc-fails-with-invalid/m-p/141860#M4640</link>
      <description>&lt;P&gt;This is a common challenge in enterprise SAP Datasphere and Databricks integrations, particularly in brownfield, cross-cloud setups. We’ve seen multiple cases where sharing between SAP and Databricks works as expected, while the reverse path introduces additional complexity related to identity, principal context, and partner trust that is not always evident from the documentation.&lt;/P&gt;&lt;P&gt;In similar SAP BDC–Databricks implementations, the issue has usually been less about Delta Sharing mechanics and more about how authentication and BDC Connect context are established across clouds and runtimes. Once that layer is aligned, the downstream steps tend to fall into place.&lt;/P&gt;&lt;P&gt;If helpful, we’ve supported teams through comparable bidirectional sharing scenarios and can share what has worked in practice.&lt;/P&gt;</description>
      <pubDate>Mon, 15 Dec 2025 13:29:17 GMT</pubDate>
      <guid>https://community.databricks.com/t5/administration-architecture/delta-sharing-from-databricks-to-sap-bdc-fails-with-invalid/m-p/141860#M4640</guid>
      <dc:creator>Abeshek</dc:creator>
      <dc:date>2025-12-15T13:29:17Z</dc:date>
    </item>
    <item>
      <title>Re: Delta Sharing from Databricks to SAP BDC fails with invalid_client error</title>
      <link>https://community.databricks.com/t5/administration-architecture/delta-sharing-from-databricks-to-sap-bdc-fails-with-invalid/m-p/143844#M4723</link>
      <description>&lt;P class=""&gt;Thanks for sharing all the details and troubleshooting steps – that’s very helpful context. &lt;SPAN class=""&gt;​&lt;/SPAN&gt;&lt;/P&gt;
&lt;P class=""&gt;In the current notebook,&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;open_resource_discovery_information&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;is defined as a Python dict. Before calling&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;create_or_update_share, this structure must be serialized to a JSON string so the BDC Connect client can correctly parse the payload and obtain the access token. Otherwise, the request body is not in the expected format, which surfaces as an&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;invalid_client&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;/ “Failed to retrieve partner token” error on the token retrieval path.&lt;SPAN class=""&gt;​&lt;/SPAN&gt;&lt;/P&gt;
&lt;P class=""&gt;Concretely, you can adjust your code like this:&lt;/P&gt;
&lt;DIV class=""&gt;
&lt;DIV class=""&gt;
&lt;DIV class=""&gt;
&lt;DIV class=""&gt;
&lt;DIV class=""&gt;
&lt;PRE&gt;import json&lt;BR /&gt;from bdc_connect_sdk.auth import BdcConnectClient, DatabricksClient&lt;BR /&gt;&lt;BR /&gt;bdc_connect_client = BdcConnectClient(DatabricksClient(dbutils, "bdc-connect-test"))&lt;BR /&gt;&lt;BR /&gt;share_name = "test_share"&lt;BR /&gt;&lt;BR /&gt;open_resource_discovery_information = {&lt;BR /&gt;"@openResourceDiscoveryV1": {&lt;BR /&gt;"title": "Title to Share",&lt;BR /&gt;"shortDescription": "Share Short Description",&lt;BR /&gt;"description": "Share Description"&lt;BR /&gt;}&lt;BR /&gt;}&lt;BR /&gt;&lt;BR /&gt;# Serialize ORD metadata to JSON so the token and payload can be parsed correctly&lt;BR /&gt;ord_json = json.dumps(open_resource_discovery_information)&lt;BR /&gt;&lt;BR /&gt;catalog_response = bdc_connect_client.create_or_update_share(&lt;BR /&gt;share_name,&lt;BR /&gt;ord_json&lt;BR /&gt;)&lt;/PRE&gt;
&lt;/DIV&gt;
&lt;/DIV&gt;
&lt;/DIV&gt;
&lt;DIV class=""&gt;&amp;nbsp;&lt;/DIV&gt;
&lt;/DIV&gt;
&lt;/DIV&gt;
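&lt;P&gt;As a quick sanity check, this plain-Python sketch (no SDK required) verifies that the serialized payload round-trips cleanly, which is what the SDK's string-to-dict cast shown in your traceback relies on:&lt;/P&gt;

```python
import json

open_resource_discovery_information = {
    "@openResourceDiscoveryV1": {
        "title": "Title to Share",
        "shortDescription": "Share Short Description",
        "description": "Share Description",
    }
}

# Serialize the ORD metadata to a JSON string, as passed to create_or_update_share
ord_json = json.dumps(open_resource_discovery_information)

# The SDK casts a string body back to a dict before building the request,
# so the round-trip must preserve the structure exactly.
assert isinstance(ord_json, str)
assert json.loads(ord_json) == open_resource_discovery_information
```

&lt;P&gt;If these assertions pass but the call still fails, the problem is on the authentication side rather than in the payload format.&lt;/P&gt;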
&lt;P class=""&gt;This ensures the ORD metadata follows the expected JSON format used by the BDC Connect SDK and should allow the access token to be parsed correctly in your environment.&lt;SPAN class=""&gt;​&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Tue, 13 Jan 2026 09:12:43 GMT</pubDate>
      <guid>https://community.databricks.com/t5/administration-architecture/delta-sharing-from-databricks-to-sap-bdc-fails-with-invalid/m-p/143844#M4723</guid>
      <dc:creator>anshu_roy</dc:creator>
      <dc:date>2026-01-13T09:12:43Z</dc:date>
    </item>
    <item>
      <title>Re: Delta Sharing from Databricks to SAP BDC fails with invalid_client error</title>
      <link>https://community.databricks.com/t5/administration-architecture/delta-sharing-from-databricks-to-sap-bdc-fails-with-invalid/m-p/143889#M4724</link>
      <description>&lt;P&gt;Thank you for the support, but after changing the code and putting in the correct variables it is still returning the same error.&lt;/P&gt;&lt;LI-CODE lang="python"&gt;ValueError: Error when trying to obtain 'access_token' from JSON response: {'error_code': 'INTERNAL_ERROR', 'message': 'Failed to retrieve partner token: INTERNAL: INTERNAL_ERROR: Unable to get user object using principal context with error invalid_client. Please try again or contact support if the issue persists.', 'details': [{'@type': 'type.googleapis.com/google.rpc.RequestInfo', 'request_id': 'cea24364-363b-47cd-90d9-cfb933be4a42', 'serving_data': ''}]}
File &amp;lt;command-5894062492658599&amp;gt;, line 19
     16 # Serialize ORD metadata to JSON so the token and payload can be parsed correctly
     17 ord_json = json.dumps(open_resource_discovery_information)
---&amp;gt; 19 catalog_response = bdc_connect_client.create_or_update_share(
     20 share_name,
     21 ord_json
     22 )
File /local_disk0/.ephemeral_nfs/envs/pythonEnv-5973f1ae-85db-47f6-ab11-a54e8ade95c3/lib/python3.12/site-packages/bdc_connect_sdk/auth/bdc_connect_client.py:51, in BdcConnectClient.create_or_update_share(self, share_name, body)
     50 def create_or_update_share(self, share_name: str, body: str | dict[str, Any]) -&amp;gt; PutShareResponse | None:
---&amp;gt; 51     self._prepare_client_for_request(share_name)
     53     body = _cast_body_string_to_dict(body)
     55     body = self.partner_client.build_create_or_update_share_request_body(body)
File /local_disk0/.ephemeral_nfs/envs/pythonEnv-5973f1ae-85db-47f6-ab11-a54e8ade95c3/lib/python3.12/site-packages/bdc_connect_sdk/auth/bdc_connect_client.py:85, in BdcConnectClient._prepare_client_for_request(self, share_name)
     82     cert_pem, key_pem = CertificateManager().generate_self_signed_certificate()
     83     self.cert_info = CertificateInformation(cert_pem, key_pem)
---&amp;gt; 85 access_token = self.partner_client.get_access_token(self.cert_info, share_name)
     86 bdc_connect_endpoint = self.partner_client.get_bdc_connect_endpoint()
     87 tenant = self.partner_client.get_bdc_connect_tenant()
File /local_disk0/.ephemeral_nfs/envs/pythonEnv-5973f1ae-85db-47f6-ab11-a54e8ade95c3/lib/python3.12/site-packages/bdc_connect_sdk/auth/databricks_client.py:35, in DatabricksClient.get_access_token(self, cert_info, share_name)
     33 def get_access_token(self, cert_info: CertificateInformation, share_name: str) -&amp;gt; str:
     34     if self.is_brownfield_environment:
---&amp;gt; 35         return self._get_access_token_for_bdc_connect(cert_info, share_name)
     37     return self._get_access_token_for_databricks_connect(cert_info, share_name)
File /local_disk0/.ephemeral_nfs/envs/pythonEnv-5973f1ae-85db-47f6-ab11-a54e8ade95c3/lib/python3.12/site-packages/bdc_connect_sdk/auth/databricks_client.py:117, in DatabricksClient._get_access_token_for_bdc_connect(self, cert_info, share_name)
    114     raise ValueError(f"Response is not valid JSON. Status code: {response.status_code}, Content: {content_preview}") from e
    116 if "access_token" not in data:
--&amp;gt; 117     raise ValueError(f"Error when trying to obtain 'access_token' from JSON response: {data}")
    119 access_token = data.get("access_token")
    121 self._store_access_token_information(access_token)&lt;/LI-CODE&gt;&lt;P&gt;So it looks like there is a problem with the client_id that is sourced from the environment. I tried to dig into this by inspecting the bdc_connect_client with the following code:&lt;/P&gt;&lt;LI-CODE lang="python"&gt;from bdc_connect_sdk.auth import BdcConnectClient
from bdc_connect_sdk.auth import DatabricksClient

databricks_client = DatabricksClient(dbutils, "bdc-connect-test")
bdc_connect_client = BdcConnectClient(databricks_client)

bdc_connect_client.__dict__["partner_client"].__dict__&lt;/LI-CODE&gt;&lt;P&gt;It is returning the following:&lt;/P&gt;&lt;LI-CODE lang="python"&gt;{'bdc_connect_access_token_information': {'client_id': '',
                                          'share_location': '',
                                          'share_url': ''},
 'bdc_connect_endpoint': '',
 'bdc_connect_tenant': '',
 'databricks_api_token': '[REDACTED]',
 'databricks_workspace_url': 'https://westeurope.azuredatabricks.net',
 'dbutils': Package 'dbutils'. For more information, type 'dbutils.help()' in a cell.,
 'is_brownfield_environment': True,
 'recipient_name': 'bdc-connect-test'}&lt;/LI-CODE&gt;&lt;P&gt;Here we can see that the environment is correctly identified as brownfield and the workspace location is also found correctly. The empty client_id is likely just a placeholder, so nothing to worry about yet.&lt;/P&gt;&lt;P&gt;I then looked into the recipient with this code:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;%sql
DESCRIBE RECIPIENT `bdc-connect-test`;&lt;/LI-CODE&gt;&lt;P&gt;This errors with: [DELTA_SHARING_INVALID_RECIPIENT_AUTH] Illegal authentication type UNKNOWN for recipient bdc-connect-test. SQLSTATE: 28000&lt;/P&gt;&lt;P&gt;It looks like the connection between the two platforms has been configured incorrectly; can someone confirm this? If it is configured correctly, I'll have to look in a different direction for the cause of the problem.&lt;/P&gt;</description>
      <pubDate>Tue, 13 Jan 2026 12:36:06 GMT</pubDate>
      <guid>https://community.databricks.com/t5/administration-architecture/delta-sharing-from-databricks-to-sap-bdc-fails-with-invalid/m-p/143889#M4724</guid>
      <dc:creator>4Twannie</dc:creator>
      <dc:date>2026-01-13T12:36:06Z</dc:date>
    </item>
    <item>
      <title>Re: Delta Sharing from Databricks to SAP BDC fails with invalid_client error</title>
      <link>https://community.databricks.com/t5/administration-architecture/delta-sharing-from-databricks-to-sap-bdc-fails-with-invalid/m-p/143890#M4725</link>
      <description>&lt;P&gt;The error DELTA_SHARING_INVALID_RECIPIENT_AUTH refers to an invalid authorization specification when accessing Delta Sharing resources. This maps to SQLSTATE code 28000 ("invalid authorization specification") and typically occurs when the recipient's authentication with the Delta Sharing service has failed or is misconfigured.&lt;/P&gt;
&lt;P&gt;It could be because:&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;The recipient has not activated their credentials; the status may show as "Pending" in the UI until activation is completed.&lt;/LI&gt;
&lt;LI&gt;There may be an invalid recipient configuration or a missing/incorrect recipient token.&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;I'd suggest re-validating the BDC Connect client and recipient name, and working through the checks below:&lt;/P&gt;
&lt;UL class="marker:text-quiet list-disc"&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;In SAP BDC, open the Databricks connection (BDC Connect for Databricks) and confirm:
&lt;UL&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;The connection status is&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;STRONG&gt;Connected&lt;/STRONG&gt;&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;and uses the same Databricks workspace you are calling from.&lt;SPAN class="inline-flex" aria-label="Grant SAP Business Data Cloud (BDC) recipients access to Delta ..." data-state="closed"&gt;​&lt;/SPAN&gt;&lt;/LI&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;The&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;STRONG&gt;recipient&lt;/STRONG&gt;&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;name shown in BDC exactly matches the string you pass into&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;CODE&gt;DatabricksClient(dbutils, "&amp;lt;recipient-name&amp;gt;")&lt;/CODE&gt;&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;(case‑sensitive, no extra spaces)&lt;/LI&gt;
&lt;/UL&gt;
&lt;/LI&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;&lt;SPAN class="inline-flex" data-state="closed"&gt;​In Databricks, verify that&lt;/SPAN&gt;&amp;nbsp;
&lt;UL&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;You are running the notebook in the workspace that is linked to that BDC connection.&amp;nbsp;&lt;/LI&gt;
&lt;LI class="py-0 my-0 prose-p:pt-0 prose-p:mb-2 prose-p:my-0 [&amp;amp;&amp;gt;p]:pt-0 [&amp;amp;&amp;gt;p]:mb-2 [&amp;amp;&amp;gt;p]:my-0"&gt;
&lt;P class="my-2 [&amp;amp;+p]:mt-4 [&amp;amp;_strong:has(+br)]:inline-block [&amp;amp;_strong:has(+br)]:pb-2"&gt;The cluster/warehouse has access to the Unity Catalog and shares you intend to publish.&lt;/P&gt;
&lt;/LI&gt;
&lt;/UL&gt;
&lt;/LI&gt;
&lt;/UL&gt;
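&lt;P&gt;As a quick way to apply the exact-match check, here is a small plain-Python sketch (the diagnose_recipient_name helper is hypothetical, not part of the BDC Connect SDK) that catches the usual whitespace and casing mismatches before you call the client:&lt;/P&gt;

```python
def diagnose_recipient_name(configured: str, passed: str) -> str:
    """Explain why a recipient name passed to DatabricksClient might not
    match the name configured in SAP BDC. Hypothetical helper, for
    illustration only; recipient names are compared case-sensitively."""
    if configured == passed:
        return "ok"
    if configured.strip() == passed.strip():
        return "extra whitespace"
    if configured.lower() == passed.lower():
        return "case mismatch (names are case-sensitive)"
    return "different names"

print(diagnose_recipient_name("bdc-connect-test", "bdc-connect-test"))   # ok
print(diagnose_recipient_name("bdc-connect-test", "bdc-connect-test "))  # extra whitespace
```

&lt;P&gt;Any result other than "ok" points at the recipient configuration rather than the SDK call itself.&lt;/P&gt;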
&lt;P&gt;If the error persists, please open a support ticket with the SAP team, who can look into the logs and address the authentication issue.&lt;/P&gt;</description>
      <pubDate>Tue, 13 Jan 2026 13:15:03 GMT</pubDate>
      <guid>https://community.databricks.com/t5/administration-architecture/delta-sharing-from-databricks-to-sap-bdc-fails-with-invalid/m-p/143890#M4725</guid>
      <dc:creator>anshu_roy</dc:creator>
      <dc:date>2026-01-13T13:15:03Z</dc:date>
    </item>
    <item>
      <title>Re: Delta Sharing from Databricks to SAP BDC fails with invalid_client error</title>
      <link>https://community.databricks.com/t5/administration-architecture/delta-sharing-from-databricks-to-sap-bdc-fails-with-invalid/m-p/145302#M4764</link>
      <description>&lt;P&gt;After some testing and investigating, we are still at the same point as before; all the settings you mentioned are as expected. We were advised to pick it up with our Solutions Architect, and if nothing comes of that, I will raise it with the SAP team.&lt;/P&gt;</description>
      <pubDate>Mon, 26 Jan 2026 20:12:43 GMT</pubDate>
      <guid>https://community.databricks.com/t5/administration-architecture/delta-sharing-from-databricks-to-sap-bdc-fails-with-invalid/m-p/145302#M4764</guid>
      <dc:creator>4Twannie</dc:creator>
      <dc:date>2026-01-26T20:12:43Z</dc:date>
    </item>
  </channel>
</rss>

