Hello Yeshwanth,
The use case is as follows. We have a cloud-hosted, NoSQL web application that accelerates the design process of going from multiple tables to a single "master" table. In addition to specifying the required joins and unions, the system hosts the domain-specific information needed to generate a complete information set. The end user is an analyst who needs to build a data set for a specific, short-lived project. The inputs to the workflow are a subset of the tables the user has access to in the Lakehouse, along with the project-specific information required to augment the data set with multiple sources of truth and to interpret the absence of records and field values (i.e., to fill out and interpret a "non-event"). The output is a master table that can be hosted on Databricks (when write permissions exist) or exported to a preferred BI tool (e.g., Power BI, Tableau, or Dataiku).
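To make the join and "non-event" handling concrete, here is a minimal PySpark sketch. The table names (sales, events), the join key, and the fill value are hypothetical stand-ins for the project-specific configuration, not actual schema from our system:

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Hypothetical inputs: two of the tables the analyst can read in the Lakehouse.
sales = spark.table("catalog.schema.sales")    # one source of truth
events = spark.table("catalog.schema.events")  # a second source of truth

# Left join so that customers with no matching event survive as "non-events".
master = (
    sales.join(events, on="customer_id", how="left")
         # Project-specific interpretation of absent records/field values:
         .withColumn("event_type", F.coalesce(F.col("event_type"), F.lit("none")))
)

# Materialize the master table when write permissions exist.
master.write.mode("overwrite").saveAsTable("catalog.schema.project_master")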
To that end, I would like to give our users the ability to access their data managed by Databricks. The app would use the user's read permissions to drive the design and to materialize the project-specific "master" data set.
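For the read path, a minimal sketch using the Python databricks-sql-connector, assuming the user's access token comes out of the OAuth flow described below; the host, HTTP path, and table name are placeholders:

import os
from databricks import sql  # pip install databricks-sql-connector

# Placeholders: host and HTTP path identify the user's SQL warehouse; the
# token is the user-delegated credential obtained via OAuth.
with sql.connect(
    server_hostname=os.environ["DATABRICKS_HOST"],
    http_path=os.environ["DATABRICKS_HTTP_PATH"],
    access_token=os.environ["DATABRICKS_TOKEN"],
) as connection:
    with connection.cursor() as cursor:
        # Read a table the user already has SELECT rights on; permissions
        # remain enforced on the Databricks side.
        cursor.execute("SELECT * FROM catalog.schema.sales LIMIT 100")
        for row in cursor.fetchall():
            print(row)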
Finally, we have this functionality working with Google Drive and similar services (we use OAuth2 with the PKCE flow). The idea is to stand up equivalent functionality for Databricks. One step in the process is to pre-register the application with the data-hosting service. Our app would then use the server-to-server connection, with the proof of authorization negotiated by the user, to access the user's hosted data.
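For reference, a sketch of the PKCE pieces we would reuse, which mirror what we already do for Google Drive. The client_id, redirect URI, and scopes are assumptions pending app registration; the /oidc/v1/authorize and /oidc/v1/token endpoints are, as I understand it, the workspace-level endpoints Databricks documents for user-to-machine OAuth:

import base64, hashlib, os, secrets
from urllib.parse import urlencode

# PKCE: random verifier, S256 challenge (same mechanics as our Google Drive flow).
verifier = base64.urlsafe_b64encode(os.urandom(32)).rstrip(b"=").decode()
challenge = (
    base64.urlsafe_b64encode(hashlib.sha256(verifier.encode()).digest())
    .rstrip(b"=")
    .decode()
)

# Assumed values: client_id would come from pre-registering our app with
# Databricks; redirect_uri and scopes are illustrative.
params = {
    "client_id": "<app-client-id>",
    "response_type": "code",
    "redirect_uri": "https://ourapp.example.com/callback",
    "scope": "sql offline_access",
    "state": secrets.token_urlsafe(16),
    "code_challenge": challenge,
    "code_challenge_method": "S256",
}
authorize_url = "https://<workspace-host>/oidc/v1/authorize?" + urlencode(params)
# The user approves access in the browser; our server then exchanges the
# returned authorization code (plus the verifier) at /oidc/v1/token for the
# user-delegated access token used in the read sketch above.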