SAP Databricks: the engine within SAP BDC that unlocks ML/AI capabilities on SAP data
SAP data is often considered the gold standard of enterprise data, and the ability to seamlessly integrate and analyze it is essential. This blog post delves into the powerful synergy between SAP's latest platform, Business Data Cloud (BDC), and SAP Databricks, and how together these two technologies can unlock AI and machine learning potential for organizations. Specifically, we will explore how customers can run ML forecasting on their Cash Flow data products from SAP BDC using the ai_forecast AI function.
SAP's most ambitious platform is the SAP Business Data Cloud, which is built on SAP Datasphere and SAP Databricks as its core components. This platform enables organizations to maintain business context while ensuring that data is accessible for advanced analytics and AI applications. What makes the Business Data Cloud powerful is its emphasis on governance, scalability, and seamless integration, allowing teams to extract insights without compromising data quality or control.
SAP Databricks brings the power of the Databricks lakehouse architecture to the SAP ecosystem. It combines the best of data warehouses and data lakes, offering a unified platform for machine learning and Gen AI. With SAP Databricks, organizations can leverage their SAP data alongside other enterprise data for advanced analytics and ML/AI applications.
All the data products coming from different SAP lines of business (LoBs), such as S/4HANA on RISE, can be shared with SAP Databricks via Delta Sharing, an open, secure protocol for sharing live data across organizations and platforms without copying or moving it. Let's see this in action with a cash flow ML forecasting use case.
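As a sketch of what this looks like in practice: once the share is available in the workspace, a Delta-shared data product can be queried like any other table. The catalog, schema, table, and column names below are illustrative placeholders, not the actual names of SAP BDC data products.

```sql
-- Query a Delta-shared data product directly; no data is copied or moved.
-- All object and column names below are illustrative placeholders.
SELECT
  CompanyCode,
  PostingDate,
  AmountInCompanyCodeCurrency
FROM sap_bdc_share.finance.cashflow
WHERE PostingDate >= DATE '2024-01-01'
LIMIT 100;
```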
Cash flow forecasting is critical for a company because it directly impacts financial stability and operational agility. Accurate forecasts enable proactive decision-making, helping leaders anticipate liquidity shortfalls, optimize working capital, manage risks effectively, and ensure sufficient funds to support strategic initiatives and growth.
Once all the Cash Flow data products are activated in the SAP BDC cockpit, they are stored in the SAP BDC managed object store. From there, they can be shared with SAP Databricks using the Delta Sharing protocol and will appear in the catalog of your SAP Databricks workspace when you log in.
You can grant and set ACLs on these Delta-shared tables, and then start building SQL queries that call Databricks AI functions, such as ai_forecast for time-series forecasting. Here, we use it to forecast cash flow.
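As a minimal sketch of the access-control step, assuming the shared table is governed by Unity Catalog (the table and group names are hypothetical):

```sql
-- Grant read access on the Delta-shared table to an analyst group
-- (object and principal names are illustrative placeholders).
GRANT SELECT ON TABLE sap_bdc_share.finance.cashflow TO `finance_analysts`;

-- Inspect the resulting permissions
SHOW GRANTS ON TABLE sap_bdc_share.finance.cashflow;
```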
The SQL for forecasting on the Delta-shared Cash Flow data product is built around a CTE named cashflowaggregated, which aggregates the cash flow measures and passes the result, along with the forecasting parameters, to ai_forecast. In this CTE, we also join the company code data product, which gives us the master data for each company code whose cash flow we are forecasting.
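The following is a sketch of what such a query could look like, built around the cashflowaggregated CTE. All table and column names are illustrative placeholders, and the ai_forecast parameter values (horizon, time_col, value_col, group_col) would need to be adapted to the actual data product schemas.

```sql
-- Aggregate daily cash flow per company code, join the company code
-- master data, then forecast with the ai_forecast table-valued function.
-- All object and column names are illustrative placeholders.
WITH cashflowaggregated AS (
  SELECT
    cf.CompanyCode,
    cc.CompanyCodeName,
    cf.PostingDate                        AS ds,
    SUM(cf.AmountInCompanyCodeCurrency)   AS total_cashflow
  FROM sap_bdc_share.finance.cashflow     AS cf
  JOIN sap_bdc_share.finance.companycode  AS cc
    ON cf.CompanyCode = cc.CompanyCode
  GROUP BY cf.CompanyCode, cc.CompanyCodeName, cf.PostingDate
)
SELECT *
FROM AI_FORECAST(
  TABLE(cashflowaggregated),
  horizon   => '2025-12-31',
  time_col  => 'ds',
  value_col => 'total_cashflow',
  group_col => ARRAY('CompanyCode', 'CompanyCodeName')
);
```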
You can develop the above query in the SQL editor and visualize the results there directly. The cash flow forecast can also be visualized in tools like Power BI, connected to SAP Databricks serverless SQL warehouses, or written back to the SAP-managed object store and visualized in SAP Datasphere or SAP Analytics Cloud using the bidirectional Delta Sharing connector.
The combination of SAP BDC and SAP Databricks provides a powerful solution for organizations aiming to maximize the value of their SAP data. By efficiently Delta-sharing data into SAP Databricks and utilizing its advanced machine learning (ML) and artificial intelligence (AI) capabilities, businesses can gain deeper insights, automate processes, and foster innovation. This integration enables organizations to move from traditional reporting to data-driven decision-making, and it demonstrates how simple ML and AI can be with Databricks AI functions like ai_forecast.