Hi @esi, I'm sorry, but currently, there is no direct way to import Power BI tables into Databricks as a Spark dataframe. The connectors available are designed to work in the opposite direction, i.e., to load data from Databricks into Power BI for visualization.
However, you might consider exporting your Power BI data to a format that Databricks can read, such as CSV or Parquet, and then loading that file into Databricks. This would involve using Power BI's export functionality and then using Databricks' data loading functions to import the data.
Here's an example of how you could load a CSV file into a Spark dataframe in Databricks:
```python
df = (spark.read.format('csv')
      .option('header', 'true')        # first row contains column names
      .option('inferSchema', 'true')   # let Spark detect column types
      .load('/FileStore/tables/mydata.csv'))
```
In this code, `/FileStore/tables/mydata.csv` is the path to your CSV file in Databricks FileStore.

Please note that this is a workaround and might not be suitable for all use cases, particularly if you have large amounts of data in Power BI or if you need to do this import operation frequently.
Sources:
- [Databricks PySpark API Reference](https://api-docs.databricks.com/python/pyspark/latest/pyspark.sql/dataframe.html)
- [Databricks FileStore](https://docs.databricks.com/data/filestore.html)