Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

PySpark and SQL Warehouse

Mcnamara
New Contributor

If I write PySpark code and I need to get the data into Power BI, will it be possible to merge the data into one semantic model? For instance, if the pipeline were developed using SQL, it would be directly compatible with the SQL Warehouse.

1 REPLY

Walter_C
Databricks Employee

Yes, it is possible to merge data into one semantic model in Power BI even when the data is prepared with PySpark. Databricks supports integration with Power BI, allowing you to create a unified semantic model.

You can develop your data pipeline using PySpark and then load the processed data into Databricks SQL Warehouse. From there, Power BI can connect to the SQL Warehouse to access the data. This integration enables you to build a semantic model in Power BI that combines data from various sources, including those processed with PySpark.
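As a minimal sketch of that flow (the catalog, schema, table, and column names below are hypothetical), the PySpark job can persist its output as a managed Delta table in Unity Catalog, which the SQL Warehouse can then serve to Power BI:

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Transform the raw data with PySpark (source table name is illustrative)
orders = spark.read.table("main.raw.orders")
daily_sales = (
    orders
    .groupBy(F.to_date("order_ts").alias("order_date"))
    .agg(F.sum("amount").alias("total_amount"))
)

# Persist the result as a managed Delta table; once it is registered in the
# catalog, a SQL Warehouse (and therefore Power BI) can query it directly.
daily_sales.write.mode("overwrite").saveAsTable("main.analytics.daily_sales")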

The SQL Warehouse in Databricks is compatible with Power BI, and you can use it to create a semantic layer that defines tables, relationships, and metrics. This setup allows you to leverage the data processed in Databricks within Power BI, ensuring that your data pipeline remains consistent and integrated.
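For the semantic layer itself, one common pattern (again, all object names here are assumptions for illustration) is to define curated views on top of those tables so that Power BI imports a consistent set of entities to build relationships and measures on:

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# A view joining the PySpark-produced fact table with a dimension built in SQL,
# giving Power BI a single, stable object to include in the semantic model.
spark.sql("""
    CREATE OR REPLACE VIEW main.analytics.v_daily_sales AS
    SELECT s.order_date,
           c.fiscal_quarter,
           s.total_amount
    FROM main.analytics.daily_sales AS s
    LEFT JOIN main.analytics.dim_calendar AS c
      ON s.order_date = c.calendar_date
""")

Power BI can then connect to the SQL Warehouse through the Databricks connector and import or DirectQuery this view alongside data from other sources in the same semantic model.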
