SAP SuccessFactors

Phani1
Valued Contributor

Hi Team,

We are working on onboarding a new Data Product to our current Databricks Lakehouse Platform. The first step is the foundation, where we need to get data from SAP SuccessFactors into S3 and the Bronze layer, and then do the initial setup of the Lakehouse and Power BI.

I see that SAP SuccessFactors offers an OData API to retrieve information.

Could you please suggest whether this is the right approach, or guide us on the best way to retrieve the data and store it in the data lake?

1 ACCEPTED SOLUTION


Kaniz_Fatma
Community Manager

Hi @Phani1, retrieving data from SAP SuccessFactors and storing it in your Databricks Lakehouse Platform involves several considerations. Let’s break it down step by step:

Data Extraction from SAP SuccessFactors:

  • You’ve correctly identified that SAP SuccessFactors offers an OData API to retrieve information. This is a common approach for extracting data from cloud-based applications like SuccessFactors.
  • You can create custom AWS Lambda functions that consume the SuccessFactors API and write the data into Amazon S3 (see the sketch after this list). This approach works well and gives you control over the data flow.
  • Alternatively, if you’re using SAP Business Technology Platform (BTP) with the Cloud Integration Suite, consider using the AWS BTP Adapter. This adapter allows you to push data to S3 as an endpoint in your integration flow. If you’ve already integrated SuccessFactors with SAP ERP through Cloud Integration Suite, you can mo....
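
Below is a minimal sketch of what such a Lambda function could look like. The OData entity (PerPerson), tenant URL, credentials, and bucket name are all placeholders; adapt them to your environment, and keep real secrets in AWS Secrets Manager rather than in code.

```python
import json

import boto3
import requests

# Hypothetical values; replace with your own tenant, entity, and bucket.
BASE_URL = "https://api4.successfactors.com/odata/v2"  # your SF API server
ENTITY = "PerPerson"                                   # example OData entity
BUCKET = "my-bronze-landing-bucket"                    # target S3 bucket


def lambda_handler(event, context):
    """Pull one page of a SuccessFactors OData entity and land it in S3 as raw JSON."""
    # SuccessFactors OData v2 accepts Basic auth in the form username@companyId.
    resp = requests.get(
        f"{BASE_URL}/{ENTITY}",
        params={"$format": "json", "$top": 500},
        auth=("apiuser@COMPANY_ID", "api_password"),  # placeholder credentials
        timeout=60,
    )
    resp.raise_for_status()

    # OData v2 wraps collection results in {"d": {"results": [...]}}.
    records = resp.json()["d"]["results"]

    # Land the raw payload in the Bronze zone of the lake.
    boto3.client("s3").put_object(
        Bucket=BUCKET,
        Key=f"successfactors/{ENTITY.lower()}/batch.json",
        Body=json.dumps(records),
    )
    return {"record_count": len(records)}
```

In practice you would page through the full entity with $skip/$top (or the OData __next link) and partition the S3 keys by load date, but the control flow stays the same.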

Storing Data in the Data Lake:

Lakehouse Setup and Power BI Integration:

  • After storing data in S3, you can set up your Lakehouse architecture within Databricks: creating tables, managing partitions, and organizing data (see the sketch after this list).
  • Integrate Power BI with your Lakehouse platform to create visualizations and dashboards. Power BI can connect directly to your Lakehouse tables for reporting and analytics.
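
As a minimal sketch of the Bronze ingestion step, assuming the JSON files landed under the S3 prefix used above and that a schema named bronze already exists (both names are placeholders), a Databricks notebook cell could look like this:

```python
# Read the raw JSON batches written by the extraction job.
# "multiLine" is needed because each file holds a JSON array, not JSON lines.
raw_path = "s3://my-bronze-landing-bucket/successfactors/perperson/"

df = (spark.read
      .option("multiLine", "true")
      .json(raw_path))

# Persist as a Delta table in the Bronze layer; "bronze.sf_perperson"
# is a placeholder table name.
(df.write
   .format("delta")
   .mode("append")
   .saveAsTable("bronze.sf_perperson"))
```

Once the Bronze (and later curated) tables exist, Power BI can reach them through its built-in Databricks connector pointed at a SQL warehouse, so the data never has to leave the Lakehouse for reporting.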

 

