Machine Learning
Forum Posts

by anthonylavado, New Contributor III
  • 1190 Views
  • 3 replies
  • 7 kudos

Can't Add Cluster-scoped Init Script to Model Serving Cluster

Similar to this other question: https://community.databricks.com/s/question/0D58Y00008hahwuSAA/cant-edit-the-cluster-created-by-mlflow-model-serving We're using Azure Databricks, and have a model that requires a WHL to be downloaded from a private add...

Latest Reply by 939772, New Contributor III
  • 7 kudos

Has anyone had success with this? I'm trying to solve a similar issue.

2 More Replies
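For clusters that do allow cluster-scoped init scripts, the usual workaround for a dependency like this is an init script that pip-installs the wheel at startup. A minimal sketch is below; the storage URL and wheel name are hypothetical placeholders (not from the thread), and authentication to a private Azure container (e.g. a SAS token) is omitted:

```shell
#!/bin/bash
# Cluster-scoped init script sketch: install a WHL from private storage.
# The URL and wheel name are hypothetical placeholders.
set -euo pipefail

install_wheel() {
  # Auth for a private container (SAS token, instance profile, etc.)
  # would normally be needed here; it is omitted in this sketch.
  local whl_url="https://<storage-account>.blob.core.windows.net/wheels/my_model_deps-1.0-py3-none-any.whl"

  # /databricks/python/bin/pip targets the cluster's notebook Python environment
  /databricks/python/bin/pip install "$whl_url"
}

# DATABRICKS_RUNTIME_VERSION is only set on a Databricks cluster,
# so the install is skipped when this script runs anywhere else.
if [ -n "${DATABRICKS_RUNTIME_VERSION:-}" ]; then
  install_wheel
fi
```

Note that, as the thread title says, this does not help for the auto-created model serving cluster, which does not accept cluster-scoped init scripts.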
by mattsteinpreis, New Contributor III
  • 2823 Views
  • 4 replies
  • 5 kudos

Getting Py4J "Could not find py4j jar" error when trying to use pypmml, solution doesn't work

I'm trying to use pypmml in a DB notebook, but I'm getting the known `Error : Py4JError: Could not find py4j jar at` error. I've followed the solution here: https://kb.databricks.com/libraries/pypmml-fail-find-py4j-jar.html. However, this has not wor...

Latest Reply by pawelmitrus, Contributor
  • 5 kudos

I've been struggling myself with it, but after installing pypmml for Spark I can use the other library; maybe it will work for you:
  • runtime 10.4 LTS ML
  • install pypmml-spark (https://github.com/autodeployai/pypmml-spark)
  • install pmml4s-spark (org.pmml4s...

3 More Replies
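The reply's suggestion of switching to pypmml-spark can be sketched as an init script like the one below. This only covers the PyPI install; the pmml4s-spark jar (the org.pmml4s Maven coordinate mentioned in the reply) would still be attached separately as a cluster library:

```shell
#!/bin/bash
# Init script sketch based on the reply above: install pypmml-spark
# (instead of plain pypmml) so the library works on the cluster.
set -euo pipefail

install_pypmml_spark() {
  # /databricks/python/bin/pip targets the cluster's notebook Python environment
  /databricks/python/bin/pip install pypmml-spark
}

# Guard so the install only runs on an actual Databricks cluster
if [ -n "${DATABRICKS_RUNTIME_VERSION:-}" ]; then
  install_pypmml_spark
fi
```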
by Bradley, New Contributor III
  • 2285 Views
  • 7 replies
  • 3 kudos

How do I install a non-Python/R/Maven/JAR library into a cluster?

I'm trying to install a non-standard package into the cluster using the init scripts. The package I'm trying to install needs to be downloaded using wget, and uncompressed using tar. Then added to the PATH, or at least I need to know where the downlo...

Latest Reply by Bradley, New Contributor III
  • 3 kudos

Thank you for the support. Yes, I was able to find a working solution. I placed the files into the distributed file system, DBFS. For others, this can be done manually using the Databricks CLI, or using the init scripts. In this case I found it easier...

6 More Replies
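The wget-then-tar approach described in the question can be sketched as an init script like the following; the archive URL and install directory are hypothetical placeholders, not taken from the thread:

```shell
#!/bin/bash
# Init script sketch: fetch a tarball, unpack it, and expose it on PATH.
# The URL and paths below are hypothetical placeholders.
set -euo pipefail

install_tool() {
  local tool_url="https://example.com/downloads/some-tool-1.0.tar.gz"
  local target_dir="/databricks/driver/tools"

  mkdir -p "$target_dir"
  wget -q -O /tmp/some-tool.tar.gz "$tool_url"
  tar -xzf /tmp/some-tool.tar.gz -C "$target_dir"

  # Make the unpacked binaries visible to later shells and notebook processes
  echo "export PATH=\"\$PATH:$target_dir/bin\"" >> /etc/profile.d/custom-tool.sh
}

# Only run on an actual Databricks cluster
if [ -n "${DATABRICKS_RUNTIME_VERSION:-}" ]; then
  install_tool
fi
```

As the accepted answer notes, the script itself (and, if preferred, the downloaded files) can be kept in DBFS and attached to the cluster as a cluster-scoped init script.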