Hi @PiotrU, there are a couple of ways you can achieve this:
Synapse Link for Dataverse:
Set up Synapse Link for Dataverse in your Azure environment.
Use the Power Apps maker portal (make.powerapps.com) to create a near-real-time data replication link from Dataverse to your chosen Delta Lake location (which must be in the same region).
Once the link is established, you can:
Read Data: Access Dataverse data in Fabric directly from Power Apps.
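Once the replicated Dataverse tables land in storage in Delta format, a Databricks notebook can read them directly. A minimal sketch, assuming a hypothetical ADLS Gen2 account, container, and table path (replace them with your actual Synapse Link export location):

```python
# PySpark sketch for a Databricks notebook: read a Dataverse table
# exported by Synapse Link for Dataverse in Delta format.
# The storage account, container, and table path below are placeholders.
table_path = (
    "abfss://dataverse-export@mystorageaccount.dfs.core.windows.net/"
    "deltalake/account"  # hypothetical path to the exported 'account' table
)

# 'spark' is the SparkSession provided automatically in Databricks notebooks
df = spark.read.format("delta").load(table_path)
df.select("accountid", "name").show(5)  # column names depend on your Dataverse schema
```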
07-01-2024 03:38 AM - edited 07-01-2024 03:39 AM
Had the same task in a previous assignment. There is no other way than starting a job through the Databricks REST API from within Power Apps. For that you need compute available to run the job, and you would most likely leverage serverless compute.
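For reference, triggering an existing job goes through the Jobs 2.1 REST API (run-now). A minimal sketch in Python; the workspace URL, token, and job ID are placeholders, and in practice you would keep the token in a secret store or use a service principal:

```python
import requests

# Sketch: trigger an existing Databricks job via the Jobs 2.1 REST API.
DATABRICKS_HOST = "https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder
TOKEN = "<personal-access-token>"  # placeholder; prefer a secret store
JOB_ID = 123                       # ID of the job configured in the workspace

resp = requests.post(
    f"{DATABRICKS_HOST}/api/2.1/jobs/run-now",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"job_id": JOB_ID},
    timeout=30,
)
resp.raise_for_status()
print("Started run:", resp.json()["run_id"])
```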
Hadn't thought of that, actually; yes, that will work! But you still need a separate Azure Function for it, which is a bit cumbersome, I would say.
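If you do go the Azure Function route, it can be a thin HTTP-triggered wrapper around the same Jobs API call, which Power Apps then invokes (for example via a custom connector or a Power Automate HTTP action). A minimal sketch using the Python v2 programming model; the route name and the app settings (DATABRICKS_HOST, DATABRICKS_TOKEN, DATABRICKS_JOB_ID) are assumptions, not an official integration:

```python
import os
import requests
import azure.functions as func

app = func.FunctionApp()

# HTTP-triggered function that Power Apps can call to start a Databricks job.
# DATABRICKS_HOST, DATABRICKS_TOKEN and DATABRICKS_JOB_ID are hypothetical
# application settings configured on the Function App.
@app.route(route="run-databricks-job", auth_level=func.AuthLevel.FUNCTION)
def run_databricks_job(req: func.HttpRequest) -> func.HttpResponse:
    host = os.environ["DATABRICKS_HOST"]
    token = os.environ["DATABRICKS_TOKEN"]
    job_id = int(os.environ["DATABRICKS_JOB_ID"])

    resp = requests.post(
        f"{host}/api/2.1/jobs/run-now",
        headers={"Authorization": f"Bearer {token}"},
        json={"job_id": job_id},
        timeout=30,
    )
    # Pass the Databricks response (run_id or error details) back to the caller.
    return func.HttpResponse(
        resp.text,
        status_code=resp.status_code,
        mimetype="application/json",
    )
```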