Command to install a Spark package from a notebook cell

JJ_LVS1
New Contributor III

Hi All, 

I need to install the spark-xml package from a notebook cell (hoping it will work on a DLT cluster).

Maven Package: com.databricks:spark-xml_2.12:0.16.0

Can anyone help me with the command to install from the notebook cell?

I'm fairly new to all this, and nothing I've found while searching has worked. I've installed it on the interactive cluster from the library install area with no issues, but I can't get a command from a cell to work.

TIA

JJ

8 REPLIES

Ajay-Pandey
Esteemed Contributor III

Hi @Jason Johnson,

Please refer to the link below; it might help:

Libraries CLI | Databricks on AWS
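
For reference, with the legacy Databricks CLI (the databricks-cli Python package), a Maven coordinate can be attached to an existing cluster with something like the following (a sketch only; <cluster-id> is a placeholder, and the CLI must already be configured with a workspace host and token):

databricks libraries install \
  --cluster-id <cluster-id> \
  --maven-coordinates com.databricks:spark-xml_2.12:0.16.0

Note that this targets a cluster that already exists, which, as the next reply points out, is the sticking point for DLT clusters created at runtime.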

JJ_LVS1
New Contributor III

Because I need this on a DLT cluster, there's no way to interface with it using the CLI, since the cluster is created at runtime. I think I can run CLI commands within a notebook cell, so I'll try to figure out how to do that. Please reply if you know how.
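
If it helps, the rough idea I'm picturing is a %sh cell along these lines (completely untested; it assumes the legacy databricks-cli package, a workspace URL, and a personal access token, all the angle-bracket values are placeholders, and it still needs a cluster ID that already exists, which may be the dealbreaker for DLT):

%sh
pip install databricks-cli
export DATABRICKS_HOST=https://<workspace-url>
export DATABRICKS_TOKEN=<personal-access-token>
databricks libraries install \
  --cluster-id <cluster-id> \
  --maven-coordinates com.databricks:spark-xml_2.12:0.16.0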

Debayan
Esteemed Contributor III

Hi, you can refer to notebook-scoped libraries: https://docs.databricks.com/libraries/notebooks-python-libraries.html. Please let us know if this helps.
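
For Python packages, the notebook-scoped pattern from that page is simply a %pip cell, for example (a generic illustration; the package name is a placeholder, and as the next reply notes, this route doesn't cover Maven coordinates like spark-xml):

%pip install <some-python-package>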

JJ_LVS1
New Contributor III

Thanks for the response. pip install doesn't work for this library, since spark-xml is a Maven (JVM) package rather than a Python package.

Anonymous
Not applicable

Hi @Jason Johnson

Hope all is well! Just wanted to check in: were you able to resolve your issue? If so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help.

We'd love to hear from you.

Thanks!

JJ_LVS1
New Contributor III

Hi, I have not solved this issue yet. This Maven spark-xml library does not install via pip, so I'm exploring options to execute the install via the Databricks CLI in a notebook cell (which I don't know how to do, so I'm still trying to make it work).

JJ_LVS1
New Contributor III

I would call this one closed. I don't see a way to make this work given how the spark-xml library functions.

Anonymous
Not applicable

Hi @Jason Johnson

I'm sorry you could not find a solution to your problem in the answers provided.

Our community strives to provide helpful and accurate information, but sometimes an immediate solution may not be available for every issue.

I suggest providing more information about your problem, such as specific error messages, error logs or details about the steps you have taken. This can help our community members better understand the issue and provide more targeted solutions.

Alternatively, you can consider contacting the support team for your product or service. They may be able to provide additional assistance or escalate the issue to the appropriate section for further investigation.

Thank you for your patience and understanding, and please let us know if there is anything else we can do to assist you.
