<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Refresh options on PBI from Databricks workflow using Azure Databricks in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/refresh-options-on-pbi-from-databricks-workflow-using-azure/m-p/103523#M41467</link>
    <description>&lt;P&gt;Hi!&lt;/P&gt;&lt;P&gt;I have a workflow that includes my medallion architecture and DLT. Currently, I have a separate notebook for refreshing my Power BI semantic model, which works based on the method described in &lt;A href="https://www.madeiradata.com/post/refresh-a-powerbi-dataset-from-azure-databricks" target="_blank" rel="noopener"&gt;Refresh a PowerBI dataset from Azure Databricks&lt;/A&gt;. However, now that I’m placing the notebook at the end of my workflow, I’m unsure how to set it all up properly. What is the best practice for connecting a PBI semantic model to a Databricks (medallion) workflow?&lt;/P&gt;&lt;P&gt;When trying it out, I get stuck at the step where I have to select a source in Power BI ("Get Data") and choose a cluster HTTP path. But since the workflow runs on a serverless cluster, I’m not sure how these fit together. Also, just to confirm: should I select "Azure Databricks" instead of just "Databricks," since my storage is in a Storage Account and not Delta Lake?&lt;/P&gt;&lt;P&gt;Additionally, I’m looking into how to manage dataset refreshes in the Power BI semantic model. Is there a way to determine whether a full refresh is needed? Why refresh everything when the update is minimal? In Analysis Services there is an "Update" process type, whereas a full process refreshes all data. How do I manage &lt;EM&gt;the type of refresh&lt;/EM&gt; in Power BI or Databricks?&lt;BR /&gt;&lt;BR /&gt;Thanks!&lt;/P&gt;</description>
    <pubDate>Mon, 30 Dec 2024 10:32:13 GMT</pubDate>
    <dc:creator>mkEngineer</dc:creator>
    <dc:date>2024-12-30T10:32:13Z</dc:date>
    <item>
      <title>Refresh options on PBI from Databricks workflow using Azure Databricks</title>
      <link>https://community.databricks.com/t5/data-engineering/refresh-options-on-pbi-from-databricks-workflow-using-azure/m-p/103523#M41467</link>
      <description>&lt;P&gt;Hi!&lt;/P&gt;&lt;P&gt;I have a workflow that includes my medallion architecture and DLT. Currently, I have a separate notebook for refreshing my Power BI semantic model, which works based on the method described in &lt;A href="https://www.madeiradata.com/post/refresh-a-powerbi-dataset-from-azure-databricks" target="_blank" rel="noopener"&gt;Refresh a PowerBI dataset from Azure Databricks&lt;/A&gt;. However, now that I’m placing the notebook at the end of my workflow, I’m unsure how to set it all up properly. What is the best practice for connecting a PBI semantic model to a Databricks (medallion) workflow?&lt;/P&gt;&lt;P&gt;When trying it out, I get stuck at the step where I have to select a source in Power BI ("Get Data") and choose a cluster HTTP path. But since the workflow runs on a serverless cluster, I’m not sure how these fit together. Also, just to confirm: should I select "Azure Databricks" instead of just "Databricks," since my storage is in a Storage Account and not Delta Lake?&lt;/P&gt;&lt;P&gt;Additionally, I’m looking into how to manage dataset refreshes in the Power BI semantic model. Is there a way to determine whether a full refresh is needed? Why refresh everything when the update is minimal? In Analysis Services there is an "Update" process type, whereas a full process refreshes all data. How do I manage &lt;EM&gt;the type of refresh&lt;/EM&gt; in Power BI or Databricks?&lt;BR /&gt;&lt;BR /&gt;Thanks!&lt;/P&gt;</description>
      <pubDate>Mon, 30 Dec 2024 10:32:13 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/refresh-options-on-pbi-from-databricks-workflow-using-azure/m-p/103523#M41467</guid>
      <dc:creator>mkEngineer</dc:creator>
      <dc:date>2024-12-30T10:32:13Z</dc:date>
    </item>
    <item>
      <title>Re: Refresh options on PBI from Databricks workflow using Azure Databricks</title>
      <link>https://community.databricks.com/t5/data-engineering/refresh-options-on-pbi-from-databricks-workflow-using-azure/m-p/103541#M41477</link>
      <description>&lt;P&gt;Hi &lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/122685"&gt;@mkEngineer&lt;/a&gt;,&lt;/P&gt;
&lt;P&gt;Have you reviewed this documentation: &lt;A href="https://learn.microsoft.com/en-us/azure/databricks/partners/bi/power-bi" target="_blank"&gt;https://learn.microsoft.com/en-us/azure/databricks/partners/bi/power-bi&lt;/A&gt;?&lt;/P&gt;
&lt;P&gt;Also, I don't think serverless compute for a notebook will work for your connection with Power BI. You might need to set up a serverless SQL warehouse instead; the document above describes how to set up the connection.&lt;/P&gt;</description>
      <pubDate>Mon, 30 Dec 2024 13:23:53 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/refresh-options-on-pbi-from-databricks-workflow-using-azure/m-p/103541#M41477</guid>
      <dc:creator>Alberto_Umana</dc:creator>
      <dc:date>2024-12-30T13:23:53Z</dc:date>
    </item>
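    <!-- Editor's sketch. The reply above covers the live connection; for the original question of triggering the semantic-model refresh as the final task of a Databricks workflow, the approach in the linked madeiradata post amounts to calling the Power BI REST API from a notebook. A minimal, hedged sketch assuming a service principal that has dataset-refresh permission on the workspace; every ID and secret below is a placeholder, and in practice the secret would come from a Databricks secret scope:

    ```python
    # Sketch: queue a Power BI semantic-model refresh from a Databricks notebook
    # task via the Power BI REST API (enhanced refresh). All IDs are placeholders.
    import json
    import urllib.parse
    import urllib.request

    PBI_API = "https://api.powerbi.com/v1.0/myorg"

    def build_refresh_request(group_id: str, dataset_id: str,
                              refresh_type: str = "automatic"):
        """Return (url, body) for the dataset-refresh endpoint.

        refresh_type "full" reprocesses everything; "automatic" lets the
        service pick the cheapest refresh that brings the model up to date,
        which addresses the "why refresh fully?" question in the post.
        """
        url = f"{PBI_API}/groups/{group_id}/datasets/{dataset_id}/refreshes"
        return url, {"type": refresh_type}

    def get_token(tenant_id: str, client_id: str, client_secret: str) -> str:
        """Client-credentials token for the Power BI API scope (service principal)."""
        data = urllib.parse.urlencode({
            "grant_type": "client_credentials",
            "client_id": client_id,
            "client_secret": client_secret,
            "scope": "https://analysis.windows.net/powerbi/api/.default",
        }).encode()
        req = urllib.request.Request(
            f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token",
            data=data,
        )
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)["access_token"]

    def trigger_refresh(token: str, group_id: str, dataset_id: str,
                        refresh_type: str = "automatic") -> None:
        """POST the refresh request; a 202 status means it was queued."""
        url, body = build_refresh_request(group_id, dataset_id, refresh_type)
        req = urllib.request.Request(
            url,
            data=json.dumps(body).encode(),
            headers={"Authorization": f"Bearer {token}",
                     "Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            assert resp.status == 202  # refresh accepted and queued
    ```

    Running this as the last notebook task of the workflow keeps the refresh downstream of the medallion/DLT tasks, so the model only refreshes after the gold tables are updated. -->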
  </channel>
</rss>

