
SAP SuccessFactors

Phani1
Valued Contributor

Hi Team,

We are onboarding a new Data Product to our current Databricks Lakehouse Platform.
The first step is the foundation: get data from SAP SuccessFactors into an S3 Bronze layer, then do the initial setup of the Lakehouse and Power BI.

I see that "SAP SuccessFactors offers an OData API to retrieve information."

Could you please confirm whether this is the right approach, or guide us on the best way to retrieve the data and store it in the data lake?

1 ACCEPTED SOLUTION


Kaniz
Community Manager

Hi @Phani1, Retrieving data from SAP SuccessFactors and storing it in your Databricks Lakehouse Platform involves several considerations. 

 

Let’s break it down step by step:

 

Data Extraction from SAP SuccessFactors:

  • You’ve correctly identified that SAP SuccessFactors offers an OData API to retrieve information. This is a common approach for extracting data from cloud-based applications like SuccessFactors.
  • You can create custom Lambda functions (as you’ve mentioned) that consume the SuccessFactors API and write the data into Amazon S3. This approach works well and allows you to control the data flow.
  • Alternatively, if you’re using SAP Business Technology Platform (BTP) with the Cloud Integration Suite, consider using the AWS BTP Adapter. This adapter allows you to push data to S3 as an endpoint in your integration flow. If you’ve already integrated SuccessFactors with SAP ERP through Cloud Integration Suite, you can mo....
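The Lambda approach above can be sketched roughly as follows. This is a minimal, hedged example, not a production implementation: the API host, the `PerPerson` entity set, and the environment variable names are assumptions you would replace with your own values, and real jobs need paging over all records, error handling, and OAuth rather than basic auth where required.

```python
import base64
import json
import os
import urllib.request

# Region-specific SuccessFactors API host -- an assumption; check your data center.
API_BASE = "https://api4.successfactors.com/odata/v2"

def build_odata_url(entity: str, skip: int = 0, top: int = 500) -> str:
    """Build a paged OData query URL for a SuccessFactors entity set."""
    return f"{API_BASE}/{entity}?$format=json&$skip={skip}&$top={top}"

def fetch_page(entity: str, user: str, password: str, skip: int = 0) -> dict:
    """Fetch one page of an entity set as JSON using basic auth."""
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    req = urllib.request.Request(
        build_odata_url(entity, skip=skip),
        headers={"Authorization": f"Basic {token}", "Accept": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def lambda_handler(event, context):
    """Lambda entry point: land one raw page in the S3 bronze landing zone."""
    import boto3  # provided in the AWS Lambda runtime
    entity = "PerPerson"  # illustrative entity set -- substitute your own
    page = fetch_page(entity, os.environ["SF_USER"], os.environ["SF_PASSWORD"])
    boto3.client("s3").put_object(
        Bucket=os.environ["BRONZE_BUCKET"],
        Key=f"successfactors/{entity}/page_0.json",
        Body=json.dumps(page).encode("utf-8"),
    )
```

Landing the raw JSON unchanged keeps the Bronze layer a faithful copy of the source, so downstream fixes never require re-calling the API.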

Storing Data in the Data Lake:
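The original reply does not spell this step out. As a hedged sketch, one common Databricks pattern is to ingest the landed JSON files incrementally with Auto Loader into a Bronze Delta table; all bucket paths and the table name below are placeholders:

```python
def autoloader_options(schema_location: str) -> dict:
    """Option map for an Auto Loader (cloudFiles) JSON stream."""
    return {
        "cloudFiles.format": "json",
        "cloudFiles.schemaLocation": schema_location,  # where inferred schema is tracked
        "cloudFiles.inferColumnTypes": "true",
    }

# Inside a Databricks notebook (where `spark` is predefined):
# raw = (spark.readStream.format("cloudFiles")
#        .options(**autoloader_options("s3://my-bucket/_schemas/successfactors"))
#        .load("s3://my-bucket/successfactors/PerPerson/"))
# (raw.writeStream
#     .option("checkpointLocation", "s3://my-bucket/_checkpoints/sf_bronze")
#     .trigger(availableNow=True)
#     .toTable("bronze.successfactors_perperson"))
```

Auto Loader's checkpoint means each Lambda-landed file is ingested exactly once, even when the job reruns.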

Lakehouse Setup and Power BI Integration:

  • After storing data in S3, you can set up your Lakehouse architecture within Databricks. This involves creating tables, managing partitions, and organizing data.
  • Integrate Power BI with your Lakehouse platform to create visualizations and dashboards. Power BI can directly connect to your Lakehouse tables for reporting and analytics.
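One way to make Delta data in S3 queryable for Power BI is to register it as a table in the metastore. A minimal sketch, assuming an existing Delta folder; the table and bucket names are illustrative:

```python
def external_table_ddl(table: str, location: str) -> str:
    """DDL registering an existing Delta folder as a table Power BI can query."""
    return f"CREATE TABLE IF NOT EXISTS {table} USING DELTA LOCATION '{location}'"

# Inside Databricks:
# spark.sql(external_table_ddl("bronze.sf_perperson",
#                              "s3://my-bucket/bronze/successfactors/PerPerson"))
```

Power BI then connects to the Databricks SQL warehouse and sees `bronze.sf_perperson` like any other table.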

 
