Yes, it is possible to combine data into one semantic model in Power BI when the data is prepared with PySpark. Databricks integrates natively with Power BI, so you can build a unified semantic model on top of data your PySpark code produces.
You can develop your data pipeline in PySpark and write the processed results to tables (Delta tables on Databricks), then expose those tables through a Databricks SQL Warehouse. Power BI connects to the SQL Warehouse to query them, which lets you build a semantic model in Power BI that combines data from various sources, including data processed with PySpark.
Power BI has a built-in Databricks connector that works against the SQL Warehouse, and you can define a semantic layer there as views that encode tables, relationships, and metrics. That way the data processed in Databricks is reused consistently in Power BI, and your pipeline stays the single source of truth.