Hey all.
Just wanted to make sure everyone had some up-to-date intel regarding leveraging Plotly Dash with Databricks.
Most Dash app integrations with Databricks today leverage the Databricks SQL Connector for Python. More technical details are available via: https://dash.plotly.com/dash-with-databricks-sql and a wide range of profiled customer use cases and other information is available via https://plotly.com/dash/databricks-integration/
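For reference, here's a minimal sketch of that connector pattern. It assumes the `databricks-sql-connector` package is installed and that credentials live in the standard `DATABRICKS_SERVER_HOSTNAME` / `DATABRICKS_HTTP_PATH` / `DATABRICKS_TOKEN` environment variables; the helper names are mine, not part of any library:

```python
# Sketch: querying a Databricks SQL warehouse from a Dash app.
# Connection details come from environment variables (placeholders).
import os


def build_query(table: str, limit: int = 100) -> str:
    """Build a simple SELECT for a Dash callback, with a basic
    sanity check on the table name (illustrative, not a full
    SQL-injection defense)."""
    if not table.replace(".", "").replace("_", "").isalnum():
        raise ValueError(f"unexpected table name: {table}")
    return f"SELECT * FROM {table} LIMIT {int(limit)}"


def fetch_rows(query: str):
    """Run a query against Databricks and return the rows."""
    # Imported lazily so the module loads even where the
    # connector is not installed.
    from databricks import sql
    with sql.connect(
        server_hostname=os.environ["DATABRICKS_SERVER_HOSTNAME"],
        http_path=os.environ["DATABRICKS_HTTP_PATH"],
        access_token=os.environ["DATABRICKS_TOKEN"],
    ) as conn:
        with conn.cursor() as cursor:
            cursor.execute(query)
            return cursor.fetchall()
```

In a Dash callback you'd feed the result of something like `fetch_rows(build_query("samples.nyctaxi.trips", 10))` into a `dash_table.DataTable` or a Plotly figure.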
The use cases range from simple through to sophisticated production-grade data applications (i.e. WAY more than "dashboards"), which can include leveraging SQLAlchemy/ORM for advanced workflows such as editing, writeback, and real-time/streaming ... and they span internal through to external audiences.
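For the SQLAlchemy side of that, here's a sketch of wiring an engine to Databricks, assuming the `databricks-sqlalchemy` dialect is installed; the URL shape follows that dialect's documentation, and all connection values shown are placeholders:

```python
# Sketch: connecting SQLAlchemy (and an ORM on top of it) to
# Databricks via the databricks-sqlalchemy dialect.
def databricks_url(host: str, http_path: str, token: str,
                   catalog: str = "main", schema: str = "default") -> str:
    """Build the SQLAlchemy connection URL the Databricks
    dialect expects."""
    return (
        f"databricks://token:{token}@{host}"
        f"?http_path={http_path}&catalog={catalog}&schema={schema}"
    )


def make_engine(url: str):
    # Imported lazily so this module loads even where
    # SQLAlchemy is not installed.
    from sqlalchemy import create_engine
    return create_engine(url)
```

With an engine in hand, ORM sessions give a Dash callback a natural path to writeback (INSERTs/UPDATEs on user edits) instead of read-only queries.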
NOTE: We are also seeing an increasing number of scenarios where Dash apps (also) leverage the Databricks SDK (and the Jobs API as part of that) for use cases where the Databricks back end serves as the computational engine and the Dash app plays the role of the web UI front end (vs. having to dive into a notebook). This makes it easy to parameterize and initiate computationally intensive jobs — interactively or on a scheduled basis — for ML training, what-if scenarios, simulations, etc.
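That "Dash as front end, Databricks as compute" pattern can be sketched as below, assuming the `databricks-sdk` package and a pre-existing Databricks job; the job ID and parameter names here are hypothetical placeholders:

```python
# Sketch: a Dash callback collects user inputs and triggers a
# pre-defined Databricks job via the SDK's Jobs API.
def make_notebook_params(scenario: str, n_trials: int) -> dict:
    """Jobs API notebook_params must be string-valued;
    parameter names are placeholders."""
    return {"scenario": scenario, "n_trials": str(n_trials)}


def trigger_job(job_id: int, params: dict) -> int:
    """Kick off a job run and return its run_id."""
    # Lazy import; the SDK picks up DATABRICKS_HOST /
    # DATABRICKS_TOKEN from the environment by default.
    from databricks.sdk import WorkspaceClient
    w = WorkspaceClient()
    waiter = w.jobs.run_now(job_id=job_id, notebook_params=params)
    run = waiter.result()  # blocks until the run finishes; drop for fire-and-forget
    return run.run_id
```

A Dash callback would call `trigger_job(1234, make_notebook_params("baseline", 500))` when the user clicks "Run", then poll or display the run status.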
Here's a semi-recent article providing an initial how-to for this pattern: https://plotlygraphs.medium.com/databricks-sdk-plotly-dash-the-easiest-way-to-get-jobs-done-70d44e1c... (with more to follow in the very near term).
We look forward to any/all feedback (and use cases!).
Dave (dave@plot.ly)