Machine Learning
Dive into the world of machine learning on the Databricks platform. Explore discussions on algorithms, model training, deployment, and more. Connect with ML enthusiasts and experts.

Forum Posts

matte
by New Contributor III
  • 12904 Views
  • 7 replies
  • 16 kudos

Resolved! Way of using pymc.model_to_graphviz in a Databricks notebook

Hi everybody, I created a simple Bayesian model using the pymc library in Python. I would like to graphically represent my model using the pymc.model_to_graphviz(model=model) method. However, it seems it does not work within a Databricks notebook, even ...

Latest Reply
Own
Contributor
  • 16 kudos

%sh apt install -y graphviz

6 More Replies
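A minimal sketch of the fix above, assuming the system graphviz binary from the %sh step plus the graphviz Python bindings (the pip install is an assumption, not part of the accepted answer), with a hypothetical toy model standing in for the original poster's model:

%sh apt-get install -y graphviz
%pip install graphviz

import pymc as pm

# Hypothetical toy model purely for illustration.
with pm.Model() as model:
    mu = pm.Normal("mu", mu=0, sigma=1)
    obs = pm.Normal("obs", mu=mu, sigma=1, observed=[0.1, -0.3, 0.2])

# Returns a graphviz Digraph; as the last expression in a notebook cell it renders inline.
pm.model_to_graphviz(model)

The %sh and %pip magics each need their own cell before the Python cell runs.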
dvmentalmadess
by Valued Contributor
  • 2245 Views
  • 1 reply
  • 2 kudos

Resolved! Store a secret only accessible to the current user

During an interactive notebook session, I want a user to be able to retrieve a secret specific to that user. I haven't decided on a storage mechanism, but I'm open to one that can scalably authorize access to a single user and that I ca...

Latest Reply
dvmentalmadess
Valued Contributor
  • 2 kudos

I ended up using Databricks Secrets as the storage mechanism after learning from my account rep that the limit is soft and we can request a higher scope limit. In this case, each user gets a dedicated scope and no other users have access.

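A rough sketch of the scope-per-user approach described in the reply, assuming the legacy Databricks CLI syntax; the scope name, key, secret value, and user email below are placeholders:

databricks secrets create-scope --scope user-alice
databricks secrets put --scope user-alice --key api-token --string-value "s3cr3t"
databricks secrets put-acl --scope user-alice --principal alice@example.com --permission READ

# In that user's notebook session:
token = dbutils.secrets.get(scope="user-alice", key="api-token")

The principal that creates the scope keeps MANAGE permission on it, so in practice an admin or automation identity would create the scopes and grant each user READ on theirs only.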
Slalom_Tobias
by New Contributor III
  • 2661 Views
  • 2 replies
  • 3 kudos

Resolved! ML Practitioner | ML 10 - Feature Store notebook | feature_store import error

the following code...
from pyspark.sql.functions import monotonically_increasing_id, lit, expr, rand
import uuid
from databricks import feature_store
from pyspark.sql.types import StringType, DoubleType
from databricks.feature_store import feature_table, ...

Latest Reply
Anonymous
Not applicable
  • 3 kudos

Hope that was an easy fix, @Tobias Cortese! Thanks for marking the "best answer"!

1 More Replies
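The fix itself isn't shown in this listing, but a common cause of this import error is running the course notebook on a standard (non-ML) runtime, where the feature store client isn't preinstalled. A hedged sketch of a workaround, assuming the databricks-feature-store PyPI package is compatible with the cluster's runtime:

# Only needed on a non-ML runtime; Databricks ML runtimes bundle the client.
%pip install databricks-feature-store

from databricks import feature_store
from databricks.feature_store import feature_table

# Instantiating the client is a quick check that the package resolved correctly.
fs = feature_store.FeatureStoreClient()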
jnjns
by New Contributor II
  • 1034 Views
  • 0 replies
  • 3 kudos

Java error when installing rasterframes

Hi all, I have followed the steps in this notebook to install rasterframes on my Databricks cluster. Eventually I am able to import the following:
from pyrasterframes import rf_ipython
from pyrasterframes.utils import create_rf_spark_session
from pyspar...

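This one shows no replies. For context only, a typical pyrasterframes setup looks roughly like the sketch below; the matching JVM-side RasterFrames assembly must also be installed on the cluster (for example as a cluster library), otherwise the Python imports succeed but the first JVM call fails, which is consistent with a Java error. Treat this as an assumption, not the thread's resolution:

%pip install pyrasterframes

from pyrasterframes import rf_ipython            # notebook display helpers
from pyrasterframes.utils import create_rf_spark_session

# Configures the active Spark session with the RasterFrames extensions;
# requires the corresponding Scala artifact on the driver and executors.
spark = create_rf_spark_session()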
orion216
by New Contributor II
  • 13656 Views
  • 5 replies
  • 2 kudos

Resolved! Keep long-running notebook alive when closing browser

I am working with Azure Databricks Jupyter notebooks and have time-consuming jobs (complex queries, model training, loops over many items, etc.). Every time I close the browser of some running notebook (or step away for a long time), even before the c...

Latest Reply
Anonymous
Not applicable
  • 2 kudos

Hey @Eric P, just wanted to check in to see if you were able to resolve your issue. If yes, would you be happy to mark an answer as best? If not, please tell us so we can help you. Thanks!

4 More Replies
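The accepted answer isn't visible in this listing, but the usual way to survive a closed browser is to run the work as a job rather than interactively, since a job run does not depend on the browser session. A rough sketch against the Jobs API 2.1 one-time submit endpoint; the host, secret scope/key, cluster ID, and notebook path are placeholders:

import requests

host = "https://<workspace>.azuredatabricks.net"
token = dbutils.secrets.get(scope="my-scope", key="pat")

payload = {
    "run_name": "long-running-training",
    "tasks": [{
        "task_key": "train",
        "existing_cluster_id": "<cluster-id>",
        "notebook_task": {"notebook_path": "/Users/me/train_model"},
    }],
}

resp = requests.post(f"{host}/api/2.1/jobs/runs/submit",
                     headers={"Authorization": f"Bearer {token}"},
                     json=payload)
resp.raise_for_status()
print(resp.json()["run_id"])  # the run continues even after the browser closes

The same submission can be made from the Jobs UI or the Databricks CLI without any code.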
mhansinger
by New Contributor II
  • 18119 Views
  • 4 replies
  • 1 kudos

Resolved! Set default "spark.driver.maxResultSize" from the notebook

Hello, I would like to set the default "spark.driver.maxResultSize" from the notebook on my cluster. I know I can do that in the cluster settings, but is there a way to set it by code? I also know how to do it when I start a Spark session, but in my ca...

Latest Reply
Anonymous
Not applicable
  • 1 kudos

Hi @Maximilian Hansinger, just wanted to check in to see if you were able to resolve your issue. If yes, would you be happy to mark the answer as best? If not, please tell us so we can help you. Thanks!

3 More Replies
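The accepted answer isn't shown here. Worth noting, with hedging: spark.driver.maxResultSize is read when the driver starts, so on Databricks it normally belongs in the cluster's Spark config rather than being set from a running notebook. The sketch below shows the cluster-level setting and how to verify it from a notebook:

# Cluster settings -> Advanced options -> Spark config (applied at cluster start):
#   spark.driver.maxResultSize 8g

# From a running notebook you can inspect the value, but setting it here after
# the driver has started generally has no effect:
print(spark.sparkContext.getConf().get("spark.driver.maxResultSize", "not explicitly set"))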