
Databricks Notebook Rendering Issue: IPython.lib.display.IFrame

SR_71
New Contributor II

Similar issue here: https://stackoverflow.com/questions/71336374/randomforestclassifier-explainer-dashboard-output-in-da...

Actual output (Databricks notebook): screenshot

Expected output (Jupyter notebook): screenshot

Reproducible Code Example

# pip install explainerdashboard

from sklearn.ensemble import RandomForestClassifier
from explainerdashboard import ClassifierExplainer, ExplainerDashboard
from explainerdashboard.datasets import titanic_survive, feature_descriptions

X_train, y_train, X_test, y_test = titanic_survive()
model = RandomForestClassifier(n_estimators=50, max_depth=10).fit(X_train, y_train)

explainer = ClassifierExplainer(model, X_test, y_test,
                                cats=['Deck', 'Embarked'],
                                descriptions=feature_descriptions,
                                labels=['Not survived', 'Survived'])

# mode='inline' embeds the dashboard in the notebook output
ExplainerDashboard(explainer, mode='inline',
                   importances=False,
                   model_summary=False,
                   contributions=True,
                   whatif=False,
                   shap_dependence=False,
                   shap_interaction=False,
                   decision_trees=False).run()

6 REPLIES

Abishek
Valued Contributor

Please make the following changes to your setup and code; then you will be able to get the ExplainerDashboard working.

1) Set the environment variable in the Databricks cluster configuration

For example:

DASH_REQUEST_PATHNAME_PREFIX=/driver-proxy/o/4080082044610008/1004-091920-cxkidhkx/8888

Workspace ID: 4080082044610008
Cluster ID: 1004-091920-cxkidhkx
Port number: 8888
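
If editing the cluster configuration is not convenient, a minimal alternative sketch (my own suggestion, not part of the original steps) is to set the same variable from the notebook before the dashboard is created; the workspace ID, cluster ID, and port below are the example values from step 1 and must be replaced with your own:

import os

# Assumption: Dash reads DASH_REQUEST_PATHNAME_PREFIX from the environment when
# the app is created, so set it before constructing the ExplainerDashboard.
workspace_id = "4080082044610008"    # example value, use your workspace ID
cluster_id = "1004-091920-cxkidhkx"  # example value, use your cluster ID
port = 8888                          # must match the port passed to .run()

os.environ["DASH_REQUEST_PATHNAME_PREFIX"] = (
    f"/driver-proxy/o/{workspace_id}/{cluster_id}/{port}"
)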

2) Install the explainerdashboard library

%pip install explainerdashboard

3) Sample code to validate the "dash" mode

from sklearn.ensemble import RandomForestClassifier
from explainerdashboard import ClassifierExplainer, ExplainerDashboard
from explainerdashboard.datasets import titanic_survive, feature_descriptions

X_train, y_train, X_test, y_test = titanic_survive()
model = RandomForestClassifier(n_estimators=50, max_depth=10).fit(X_train, y_train)

explainer = ClassifierExplainer(model, X_test, y_test,
                                cats=['Deck', 'Embarked'],
                                descriptions=feature_descriptions,
                                labels=['Not survived', 'Survived'])

ExplainerDashboard(explainer, mode='dash',
                   importances=False,
                   model_summary=False,
                   contributions=True,
                   whatif=False,
                   shap_dependence=False,
                   shap_interaction=False,
                   decision_trees=False).run(8888)

4) Dashboard URL

https://xxxxxxxx.databricks.com/driver-proxy/o/4080082044610008/1004-091920-cxkidhkx/8888
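
If you are unsure of the IDs that go into this URL, here is a rough sketch for assembling the path from inside the notebook (an assumption on my part; the spark.databricks.clusterUsageTags.* keys should be verified on your own cluster, and the workspace host is left as a placeholder):

# Sketch: assemble the driver-proxy URL from cluster tags (keys assumed, verify first)
port = 8888
cluster_id = spark.conf.get("spark.databricks.clusterUsageTags.clusterId")
org_id = spark.conf.get("spark.databricks.clusterUsageTags.clusterOwnerOrgId")
print(f"https://<your-workspace-host>/driver-proxy/o/{org_id}/{cluster_id}/{port}")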

SophiaEvt
New Contributor II

Hi @Abishek Subramanian, I experience the same issue and tried your solution step by step. It seems to work, but when I go to the dashboard URL it just says 'Loading...' indefinitely (I waited for an hour before killing it). I tried with both your example code and with my own dataset of only 360 records. Same issue... Any clues what could be wrong?

Abishek
Valued Contributor

Can you share the cluster URL, Spark config, and sample code?

Abishek
Valued Contributor

Environment variables: (screenshot)

Model Explainer: (screenshot)

Abishek
Valued Contributor

@Sophia Evtimova, once you add the environment variables under the Spark config you will get the dashboard.

Environment variables: (screenshot, 2023-06-22)

ChanduBhujang
New Contributor II

Hi Abishek,

I followed your steps, but I am having trouble identifying the dashboard link. How do I figure out the "dbc-dp-" part of the URL for my cluster?
