Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Databricks integrating with ServiceNow via Lakeflow Connect for data ingestion

LokeshChikuru
Databricks Partner

We are integrating Databricks with ServiceNow via Lakeflow Connect for data ingestion and are looking for guidance on enforcing integration-user-based data access.

Observed behaviour

  • U2M OAuth authentication succeeds when ServiceNow access is granted to the workspace-logged-in ADID user. However, when the ServiceNow integration user is used (even with admin privileges), the source returns no data.
  • When the pipeline is run as the workspace user (ADID) and that user has admin privileges in ServiceNow, data is fetched successfully.

This suggests OAuth is working, but the effective ServiceNow identity used for data ingestion in Databricks differs (integration user vs. ADID user), impacting data visibility.

Clarification Required:

  1. Is there a supported way to ensure ServiceNow ingestion always uses the U2M integration user, regardless of the workspace user running the pipeline?
  2. Does enabling "run as workspace user" force a U2M data access path by design?
  3. What is the recommended production model for ServiceNow ingestion, and how should pipelines be configured in Databricks?

Our goal is a consistent, Databricks-recommended production setup that does not rely on the privileges of individual human ADID users.

4 REPLIES

Sumit_7
Honored Contributor III

LokeshChikuru
Databricks Partner

Hi @Sumit_7 

I have reviewed the configuration and do not see any issues with authentication to ServiceNow using the U2M approach (OAuth application with an Integration User).

However, I would like to understand which user context is used when the data fetch occurs during pipeline execution.

Based on my observations, the ServiceNow integration user is not being used during pipeline execution. As a result, no data is returned.
When ServiceNow admin privileges are assigned to the AD user who logged into the Databricks workspace, the ServiceNow table data becomes visible.

emma_s
Databricks Employee

Hi, looking through some internal resources, this seems most likely to come down to ServiceNow-side ACLs, High Security Settings, or domain/scope restrictions overriding the admin role on the system tables the connector queries.

Quick things to check:

- Run this curl as the integration user against ServiceNow: GET /api/now/v2/table/sys_db_object?sysparm_query=name=<your_table>&sysparm_fields=super_class.name. A 403 or an empty result confirms it's a ServiceNow-side ACL issue, not the connector. Test sys_dictionary the same way.
- Compare ACLs on sys_db_object, sys_dictionary, sys_glide_object between a working env (your DEV) and the failing one: that usually surfaces the difference fast.
- Check for glide.security.strict or custom ACLs overriding admin.
- Check whether the integration user is in a different application scope or domain than the data; domain separation is not overridden by the admin role.
- Confirm the admin role is state = "active" on sys_user_has_role, not "requested" or "inactive".
- There's a workspace-level pipeline flag (ingestionPipelineServiceNowNonAdminAccessSchemaFetchEnabled) for least-privilege setups; support can enable it if needed.
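The first check in the list above can be sketched as a small script. This is a sketch only: the instance name, table name, and credentials are placeholders, and the response-classification helper (`classify`) is an illustrative interpretation of the "403 or empty result means ServiceNow-side ACLs" rule, not part of the connector.

```python
import base64
import json
import urllib.error
import urllib.request

def probe_url(instance: str, table: str) -> str:
    # URL for the sys_db_object metadata check described above.
    return (
        f"https://{instance}.service-now.com/api/now/v2/table/sys_db_object"
        f"?sysparm_query=name={table}&sysparm_fields=super_class.name"
    )

def classify(status_code: int, records: list) -> str:
    # A 403, or a 200 with an empty result set, points at ServiceNow-side
    # ACLs rather than the connector.
    if status_code == 403 or (status_code == 200 and not records):
        return "servicenow_acl_issue"
    if status_code == 200:
        return "metadata_visible"
    return "other_error"

def run_probe(instance: str, table: str, user: str, password: str) -> str:
    # Authenticate as the *integration* user, not the workspace user.
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    req = urllib.request.Request(
        probe_url(instance, table),
        headers={"Accept": "application/json",
                 "Authorization": f"Basic {token}"},
    )
    try:
        with urllib.request.urlopen(req, timeout=30) as resp:
            records = json.load(resp).get("result", [])
            return classify(resp.status, records)
    except urllib.error.HTTPError as e:
        return classify(e.code, [])
```

Repeating the same probe against sys_dictionary (swap the table name in the path) covers the second half of the check.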

If you want to take OAuth identity off the table entirely while you debug, switching the connection to ROPC (integration user's username + password) removes any ambiguity about who's hitting ServiceNow at runtime. The ServiceNow connector supports both:
https://docs.databricks.com/aws/en/ingestion/lakeflow-connect/servicenow-source-setup.
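For reference, the ROPC flow mentioned above is a standard OAuth password grant against ServiceNow's /oauth_token.do endpoint, so whichever username is in the request body is the identity ServiceNow sees at runtime. A minimal sketch, assuming placeholder instance, client, and user values (this only illustrates the grant itself, not how Lakeflow Connect stores the connection):

```python
import json
import urllib.parse
import urllib.request

def ropc_token_request(instance: str, client_id: str, client_secret: str,
                       username: str, password: str) -> urllib.request.Request:
    # Password-grant POST: the token is issued to `username`, so runtime
    # data access is unambiguously the integration user's.
    body = urllib.parse.urlencode({
        "grant_type": "password",
        "client_id": client_id,
        "client_secret": client_secret,
        "username": username,
        "password": password,
    }).encode()
    return urllib.request.Request(
        f"https://{instance}.service-now.com/oauth_token.do",
        data=body,
        headers={"Content-Type": "application/x-www-form-urlencoded"},
    )

def fetch_token(req: urllib.request.Request) -> str:
    # Exchange the request for an access token.
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.load(resp)["access_token"]
```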

If none of the above sorts it, raise a support ticket with the curl response (status + body), the failing pipeline ID, and your workspace ID.

I hope this helps.

Thanks,

Emma

LokeshChikuru
Databricks Partner

Hi @emma_s 

I've reviewed the setup and wanted to clarify the behavior I'm seeing with the ServiceNow connector and U2M OAuth.

The ServiceNow connection was created successfully using a U2M OAuth integration user, and that integration user has admin permissions in ServiceNow. The connection test succeeds without any issues.

However, during actual data ingestion, it appears that the connector is not executing purely in the context of the U2M OAuth integration user. Instead, ingestion seems to also depend on the Databricks workspace user who created or is running the pipeline. When that workspace user does not have the required ServiceNow permissions, the ingestion returns no data. If ServiceNow permissions are granted to the workspace user, the data becomes visible.

I'm trying to understand whether this behavior is expected with U2M OAuth (i.e., pipelines executing under the workspace user's identity rather than strictly under the integration user), and whether app-level (client credentials) authentication is the recommended approach for unattended ingestion scenarios where execution should not depend on individual Databricks users.

Any clarification from the Databricks team or others who have implemented this would be helpful.