<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Databricks integrating with ServiceNow via Lakeflow Connect for data ingestion in Administration &amp; Architecture</title>
    <link>https://community.databricks.com/t5/administration-architecture/databricks-integrating-with-servicenow-via-lakeflow-connect-for/m-p/154968#M5164</link>
    <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/176516"&gt;@emma_s&lt;/a&gt;&amp;nbsp;&lt;/P&gt;&lt;DIV&gt;&lt;P&gt;I’ve reviewed the setup and wanted to clarify the behavior I’m seeing with the ServiceNow connector and U2M OAuth.&lt;/P&gt;&lt;P&gt;The ServiceNow connection was created successfully using a &lt;STRONG&gt;U2M OAuth integration user&lt;/STRONG&gt;, and that integration user has &lt;STRONG&gt;admin permissions&lt;/STRONG&gt; in ServiceNow. The connection test succeeds without any issues.&lt;/P&gt;&lt;P&gt;However, during &lt;STRONG&gt;actual data ingestion&lt;/STRONG&gt;, it appears that the connector is &lt;STRONG&gt;not executing purely in the context of the U2M OAuth integration user&lt;/STRONG&gt;. Instead, ingestion seems to also depend on the &lt;STRONG&gt;Databricks workspace user who created or is running the pipeline&lt;/STRONG&gt;. When that workspace user does not have the required ServiceNow permissions, the ingestion returns no data. If ServiceNow permissions are granted to the workspace user, the data becomes visible.&lt;/P&gt;&lt;P&gt;I’m trying to understand whether this behavior is expected with &lt;STRONG&gt;U2M OAuth&lt;/STRONG&gt;—i.e., pipelines executing under the workspace user’s identity rather than strictly under the integration user—and whether &lt;STRONG&gt;app‑level (client credentials) authentication&lt;/STRONG&gt; is the recommended approach for unattended ingestion scenarios where execution should not depend on individual Databricks users.&lt;/P&gt;&lt;P&gt;Any clarification from the Databricks team or others who have implemented this would be helpful.&lt;/P&gt;&lt;/DIV&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
    <pubDate>Mon, 20 Apr 2026 15:09:34 GMT</pubDate>
    <dc:creator>LokeshChikuru</dc:creator>
    <dc:date>2026-04-20T15:09:34Z</dc:date>
    <item>
      <title>Databricks integrating with ServiceNow via Lakeflow Connect for data ingestion</title>
      <link>https://community.databricks.com/t5/administration-architecture/databricks-integrating-with-servicenow-via-lakeflow-connect-for/m-p/154763#M5154</link>
      <description>&lt;P&gt;We are integrating Databricks with ServiceNow via Lakeflow Connect for data ingestion and are looking for guidance on enforcing integration-user-based data access.&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;Observed behaviour&lt;/STRONG&gt;&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;U2M OAuth authentication succeeds when ServiceNow access is granted to the workspace-logged-in ADID user. However, when using the ServiceNow integration user (even with admin privileges), the source returns no data.&lt;/LI&gt;&lt;LI&gt;When the pipeline is run as the workspace user (ADID) and that user has admin privileges in ServiceNow, data is fetched successfully.&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;This suggests OAuth itself is working, but the effective ServiceNow identity used for data ingestion in Databricks differs (integration user vs. ADID user), impacting data visibility.&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;Clarification required&lt;/STRONG&gt;:&lt;/P&gt;&lt;OL&gt;&lt;LI&gt;Is there a supported way to ensure ServiceNow ingestion always uses the U2M integration user, regardless of the workspace user running the pipeline?&lt;/LI&gt;&lt;LI&gt;Does enabling “run as workspace user” force a U2M data access path by design?&lt;/LI&gt;&lt;LI&gt;What is the recommended production model for ServiceNow ingestion, and how should pipelines be configured in Databricks?&lt;/LI&gt;&lt;/OL&gt;&lt;P&gt;Our goal is to follow a consistent, Databricks-recommended best practice for production without relying on individual human ADID privileges.&lt;/P&gt;</description>
      <pubDate>Thu, 16 Apr 2026 19:32:13 GMT</pubDate>
      <guid>https://community.databricks.com/t5/administration-architecture/databricks-integrating-with-servicenow-via-lakeflow-connect-for/m-p/154763#M5154</guid>
      <dc:creator>LokeshChikuru</dc:creator>
      <dc:date>2026-04-16T19:32:13Z</dc:date>
    </item>
    <item>
      <title>Re: Databricks integrating with ServiceNow via Lakeflow Connect for data ingestion</title>
      <link>https://community.databricks.com/t5/administration-architecture/databricks-integrating-with-servicenow-via-lakeflow-connect-for/m-p/154782#M5155</link>
      <description>&lt;P&gt;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/222951"&gt;@LokeshChikuru&lt;/a&gt;&amp;nbsp;Have you checked the docs? This might help -&amp;nbsp;&lt;A href="https://docs.databricks.com/aws/en/ingestion/lakeflow-connect/servicenow-troubleshoot#-authentication-error" target="_blank"&gt;https://docs.databricks.com/aws/en/ingestion/lakeflow-connect/servicenow-troubleshoot#-authentication-error&lt;/A&gt;&lt;/P&gt;</description>
      <pubDate>Fri, 17 Apr 2026 05:57:41 GMT</pubDate>
      <guid>https://community.databricks.com/t5/administration-architecture/databricks-integrating-with-servicenow-via-lakeflow-connect-for/m-p/154782#M5155</guid>
      <dc:creator>Sumit_7</dc:creator>
      <dc:date>2026-04-17T05:57:41Z</dc:date>
    </item>
    <item>
      <title>Re: Databricks integrating with ServiceNow via Lakeflow Connect for data ingestion</title>
      <link>https://community.databricks.com/t5/administration-architecture/databricks-integrating-with-servicenow-via-lakeflow-connect-for/m-p/154801#M5158</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/175319"&gt;@Sumit_7&lt;/a&gt;&amp;nbsp;&lt;/P&gt;&lt;DIV&gt;&lt;BLOCKQUOTE&gt;&lt;P&gt;I have reviewed the configuration and do not see any issues with authentication to ServiceNow using the U2M approach (OAuth application with an Integration User).&lt;/P&gt;&lt;P&gt;However, I would like to understand &lt;STRONG&gt;which user context is used when the data fetch occurs during pipeline execution&lt;/STRONG&gt;.&lt;/P&gt;&lt;P&gt;Based on my observations, the &lt;STRONG&gt;ServiceNow integration user is not being used during pipeline execution&lt;/STRONG&gt;. As a result, no data is returned.&lt;BR /&gt;When ServiceNow admin privileges are assigned to the &lt;STRONG&gt;AD user who logged into the Databricks workspace&lt;/STRONG&gt;, the ServiceNow table data becomes visible.&lt;/P&gt;&lt;/BLOCKQUOTE&gt;&lt;/DIV&gt;</description>
      <pubDate>Fri, 17 Apr 2026 12:11:29 GMT</pubDate>
      <guid>https://community.databricks.com/t5/administration-architecture/databricks-integrating-with-servicenow-via-lakeflow-connect-for/m-p/154801#M5158</guid>
      <dc:creator>LokeshChikuru</dc:creator>
      <dc:date>2026-04-17T12:11:29Z</dc:date>
    </item>
    <item>
      <title>Re: Databricks integrating with ServiceNow via Lakeflow Connect for data ingestion</title>
      <link>https://community.databricks.com/t5/administration-architecture/databricks-integrating-with-servicenow-via-lakeflow-connect-for/m-p/154821#M5160</link>
      <description>&lt;P&gt;Hi, looking through some internal resources, it seems most likely to come down to ServiceNow-side ACLs, High Security Settings, or domain/scope restrictions overriding the admin role on the system tables the connector queries.&lt;BR /&gt;&lt;BR /&gt;Quick things to check:&lt;BR /&gt;&lt;BR /&gt;- Run this curl as the integration user against ServiceNow: GET /api/now/v2/table/sys_db_object?sysparm_query=name=&amp;lt;your_table&amp;gt;&amp;amp;sysparm_fields=super_class.name. A 403 or an empty result confirms it's a ServiceNow-side ACL issue, not the connector. Test sys_dictionary the same way.&lt;BR /&gt;- Compare ACLs on sys_db_object, sys_dictionary, and sys_glide_object between a working environment (your DEV) and the failing one; that usually surfaces the difference fast.&lt;BR /&gt;- Check for glide.security.strict or custom ACLs overriding admin.&lt;BR /&gt;- Check whether the integration user is in a different application scope or domain than the data; domain separation is not overridden by the admin role.&lt;BR /&gt;- Confirm the admin role has state = "active" on sys_user_has_role, not "requested" or "inactive".&lt;BR /&gt;- There's a workspace-level pipeline flag (ingestionPipelineServiceNowNonAdminAccessSchemaFetchEnabled) for least-privilege setups; support can enable it if needed.&lt;BR /&gt;&lt;BR /&gt;If you want to take OAuth identity off the table entirely while you debug, switching the connection to ROPC (the integration user's username and password) removes any ambiguity about who's hitting ServiceNow at runtime. The ServiceNow connector supports both:&lt;BR /&gt;&lt;A href="https://docs.databricks.com/aws/en/ingestion/lakeflow-connect/servicenow-source-setup" target="_blank"&gt;https://docs.databricks.com/aws/en/ingestion/lakeflow-connect/servicenow-source-setup&lt;/A&gt;.&lt;BR /&gt;&lt;BR /&gt;If none of the above sorts it, raise a support ticket with the curl response (status + body), the failing pipeline ID, and your workspace ID.&lt;/P&gt;
&lt;P&gt;I hope this helps.&lt;/P&gt;
&lt;P&gt;Thanks,&lt;BR /&gt;&lt;BR /&gt;Emma&lt;/P&gt;</description>
      <pubDate>Fri, 17 Apr 2026 15:23:48 GMT</pubDate>
      <guid>https://community.databricks.com/t5/administration-architecture/databricks-integrating-with-servicenow-via-lakeflow-connect-for/m-p/154821#M5160</guid>
      <dc:creator>emma_s</dc:creator>
      <dc:date>2026-04-17T15:23:48Z</dc:date>
    </item>
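    <!-- Editor's aside (not part of the thread): a minimal Python sketch of the ACL check suggested in the reply above. The instance, table, and query values are placeholders, and classify_table_check merely encodes the post's interpretation (a 403 or an empty result points at a ServiceNow-side ACL, not the connector); it is not connector code.

```python
from urllib.parse import urlencode

def table_api_url(instance: str, table: str, query: str, fields: str) -> str:
    """Build the ServiceNow Table API GET URL from the post's curl example."""
    params = urlencode(
        {"sysparm_query": query, "sysparm_fields": fields},
        safe="=",  # keep the '=' inside sysparm_query readable, as in the post
    )
    return f"https://{instance}/api/now/v2/table/{table}?{params}"

def classify_table_check(status: int, rows: list) -> str:
    """Interpret a (status, rows) pair the way the post suggests."""
    if status == 403 or (status == 200 and not rows):
        return "servicenow_acl_issue"   # ACL/High Security/domain restriction
    if status == 200:
        return "connector_side"         # data visible; look at the connector
    return "other_error"                # auth, network, etc.

# Placeholder instance and table, named for illustration only:
url = table_api_url("example.service-now.com", "sys_db_object",
                    "name=incident", "super_class.name")
```

Running the GET as the integration user (for example with curl and that user's credentials) and feeding the status plus result rows to classify_table_check reproduces the decision step the bullets describe. -->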
    <item>
      <title>Re: Databricks integrating with ServiceNow via Lakeflow Connect for data ingestion</title>
      <link>https://community.databricks.com/t5/administration-architecture/databricks-integrating-with-servicenow-via-lakeflow-connect-for/m-p/154968#M5164</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/176516"&gt;@emma_s&lt;/a&gt;&amp;nbsp;&lt;/P&gt;&lt;DIV&gt;&lt;P&gt;I’ve reviewed the setup and wanted to clarify the behavior I’m seeing with the ServiceNow connector and U2M OAuth.&lt;/P&gt;&lt;P&gt;The ServiceNow connection was created successfully using a &lt;STRONG&gt;U2M OAuth integration user&lt;/STRONG&gt;, and that integration user has &lt;STRONG&gt;admin permissions&lt;/STRONG&gt; in ServiceNow. The connection test succeeds without any issues.&lt;/P&gt;&lt;P&gt;However, during &lt;STRONG&gt;actual data ingestion&lt;/STRONG&gt;, it appears that the connector is &lt;STRONG&gt;not executing purely in the context of the U2M OAuth integration user&lt;/STRONG&gt;. Instead, ingestion seems to also depend on the &lt;STRONG&gt;Databricks workspace user who created or is running the pipeline&lt;/STRONG&gt;. When that workspace user does not have the required ServiceNow permissions, the ingestion returns no data. If ServiceNow permissions are granted to the workspace user, the data becomes visible.&lt;/P&gt;&lt;P&gt;I’m trying to understand whether this behavior is expected with &lt;STRONG&gt;U2M OAuth&lt;/STRONG&gt;—i.e., pipelines executing under the workspace user’s identity rather than strictly under the integration user—and whether &lt;STRONG&gt;app‑level (client credentials) authentication&lt;/STRONG&gt; is the recommended approach for unattended ingestion scenarios where execution should not depend on individual Databricks users.&lt;/P&gt;&lt;P&gt;Any clarification from the Databricks team or others who have implemented this would be helpful.&lt;/P&gt;&lt;/DIV&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Mon, 20 Apr 2026 15:09:34 GMT</pubDate>
      <guid>https://community.databricks.com/t5/administration-architecture/databricks-integrating-with-servicenow-via-lakeflow-connect-for/m-p/154968#M5164</guid>
      <dc:creator>LokeshChikuru</dc:creator>
      <dc:date>2026-04-20T15:09:34Z</dc:date>
    </item>
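    <!-- Editor's aside (not part of the thread): the final question turns on the difference between two standard OAuth 2.0 grants. A small sketch of that distinction, in generic RFC 6749 terms rather than anything connector-specific; whether Lakeflow Connect supports client credentials for ServiceNow is exactly what the poster is asking, so nothing here should be read as a statement about the connector.

```python
def token_request_body(grant: str, client_id: str, client_secret: str,
                       **extra: str) -> dict:
    """Build a generic OAuth 2.0 token request body.

    authorization_code (U2M): the resulting token carries a specific
    user's identity, so the source system evaluates that user's
    permissions at request time.
    client_credentials (M2M): the token represents the application
    itself, with no human user in the loop.
    """
    body = {"grant_type": grant, "client_id": client_id,
            "client_secret": client_secret}
    body.update(extra)  # grant-specific fields, e.g. code, redirect_uri
    return body

# U2M: tied to whichever user completed the login redirect.
u2m = token_request_body("authorization_code", "cid", "cs",
                         code="code_from_login", redirect_uri="https://cb")
# M2M: no user involved, so execution cannot depend on individual users.
m2m = token_request_body("client_credentials", "cid", "cs")
```

This is why unattended ingestion generally prefers a client-credentials-style identity: the effective permissions never vary with who runs the pipeline. -->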
  </channel>
</rss>

