
Databricks default python libraries list & version

kjoth
Contributor II

We are using Databricks. How do we find out which libraries are installed by default in Databricks, and which versions are installed?

I have run pip list, but couldn't find pyspark in the returned list.
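For context, a minimal sketch of the kind of check that was run (assuming a Databricks notebook cell; pip only reports pip-managed packages, so libraries bundled with the runtime may not appear):

    import subprocess

    # Sketch: list pip-managed packages and their versions on the driver.
    # Equivalent to running `pip list` in a %sh cell or a notebook terminal.
    result = subprocess.run(["pip", "list"], capture_output=True, text=True, check=True)
    print(result.stdout)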


5 REPLIES

Prabakar
Esteemed Contributor III

Hi @karthick J, you can refer to the release notes for this:

https://docs.databricks.com/release-notes/runtime/releases.html

To find out which libraries, and which versions of those libraries, are installed on the cluster, check the release notes for the cluster's DBR version; they list the libraries that come pre-installed. (Accepted solution.)
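If it helps, here is a minimal sketch for finding out which Databricks Runtime version the cluster is on, so you know which release-notes page to check. It assumes the DATABRICKS_RUNTIME_VERSION environment variable that Databricks sets on cluster nodes:

    import os

    # Sketch: read the Databricks Runtime version (e.g. "9.1") set by Databricks on the cluster,
    # then look that version up in the release notes for the pre-installed library list.
    # DATABRICKS_RUNTIME_VERSION is assumed; it is not set on a plain local machine.
    dbr_version = os.environ.get("DATABRICKS_RUNTIME_VERSION", "not running on Databricks")
    print(f"Databricks Runtime version: {dbr_version}")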

Thanks for the info. But on this page, the pyspark Python package is not listed. Am I missing something? Thanks for helping out.

Hubert-Dudek
Esteemed Contributor III

Related to this, checking the Spark version can be useful as well (screenshot attached): pyspark on Databricks is not installed via pip, but the version will match.
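For example, a minimal sketch (assuming a Databricks notebook, where the `spark` SparkSession is predefined):

    # Sketch: both report the same Spark version, even though pip list does not show pyspark,
    # because pyspark is bundled with the Databricks Runtime rather than installed via pip.
    print(spark.version)        # `spark` is the SparkSession Databricks notebooks provide

    import pyspark
    print(pyspark.__version__)  # matches spark.version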

jose_gonzalez
Moderator

Hi @karthick J,

If you would like to see all the libraries installed in your cluster and their versions, I recommend checking the "Environment" tab. There you will be able to find all the libraries installed in your cluster.

Please follow these steps to access the Environment tab:

  • Navigate to and open your cluster view
  • Select the "Spark UI" tab
  • Select the "Environment" sub-tab inside it (screenshot of the Environment tab attached); a programmatic alternative is sketched after these steps
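As a programmatic complement to the Environment tab, a sketch using Python's standard importlib.metadata (assuming Python 3.8+, as in recent runtimes) lists the packages visible to the driver's interpreter:

    import importlib.metadata as metadata  # standard library on Python 3.8+

    # Sketch: print every Python distribution (name and version) visible to the driver,
    # similar to what pip list reports for pip-managed packages.
    for dist in sorted(metadata.distributions(), key=lambda d: (d.metadata["Name"] or "").lower()):
        print(dist.metadata["Name"], dist.version)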

Erik
Valued Contributor II

Neither pyspark nor any of the other pre-installed Python libraries show up for me when I look there.
