Administration & Architecture

Import folder (no .whl or .jar files) and run the `python3 setup.py bdist_wheel` for lib install

Dicer
Valued Contributor

I want to import the ibapi python module in Azure Databricks Notebook.

Before this, I downloaded the TWS API folder from https://interactivebrokers.github.io/#

 

I need to go through the following steps to install the API:

  1. Download and install TWS Gateway or Client
  2. Download and install Python to C:\Program Files\python... 
  3. Setup additional PATH environment variables (see pip warnings)
  4. Install wheel, i.e. type in the command line: py -m pip install wheel
  5. Download "API Latest" (or latest version that contains python) from http://interactivebrokers.github.io/ and install into the folder X as proposed (but for python the directory shouldn't matter)
  6. Go to X/.../source/pythonclient/ (cd in command line)
  7. Build a wheel, i.e. py setup.py bdist_wheel
  8. Look up the whl file in folder ./dist: should be something like ibapi-9.7x.x..
  9. Install the wheel, i.e. py -m pip install --user --upgrade dist/ibapi-9.73.7-py3-none-any.whl
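Steps 6–9 above can be sketched as shell commands. The demo package below is a stand-in so the commands can be tried anywhere; in the real API folder, `setup.py` already exists under `X/.../source/pythonclient`, so only the last four commands apply there. The path and package name are illustrative assumptions.

```shell
set -e
# Stand-in for X/.../source/pythonclient (the real folder ships its own setup.py)
SRC="${TMPDIR:-/tmp}/twsapi_demo/source/pythonclient"
rm -rf "$SRC" && mkdir -p "$SRC/demo_pkg"
touch "$SRC/demo_pkg/__init__.py"
cat > "$SRC/setup.py" <<'EOF'
from setuptools import setup
setup(name="demo-pkg", version="0.0.1", packages=["demo_pkg"])
EOF

cd "$SRC"                                   # step 6: cd into pythonclient
# bdist_wheel needs the wheel package on older setuptools
python3 -c "import wheel" 2>/dev/null || python3 -m pip install --quiet wheel
python3 setup.py bdist_wheel                # step 7: build the wheel
ls dist/                                    # step 8: the built .whl appears here
python3 -m pip install --upgrade dist/*.whl # step 9 (add --user outside a venv)
```

The same four commands (cd, build, inspect `dist/`, pip install) apply unchanged to the real `pythonclient` folder.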

Reference: https://stackoverflow.com/questions/57618117/installing-the-ibapi-package/59829707#59829707

 

I checked the Databricks library installation methods. It seems the closest option is to upload a JAR or Python .whl file.

However, the API folder does not have JAR or .whl files.

 

May I know whether there is any method that allows me to do the following in Databricks?

- import the entire folder

- cd into the folder and run `py setup.py bdist_wheel` as a shell command (this requires write access to the folder)

 

Thank you!

 

1 REPLY

arpit
Contributor III

You can upload the folder to a workspace location, cd into it, and install it from a notebook. But that would be a notebook-scoped installation. If you are looking for a cluster-scoped installation, you would need a JAR or .whl file.
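Concretely, the notebook-scoped approach above might look like this in a `%sh` cell. The workspace path is a hypothetical example; substitute wherever you actually upload the folder. This runs only inside a Databricks notebook and affects only that notebook's environment.

```shell
# Run in a %sh cell of a Databricks notebook (notebook-scoped install).
# The /Workspace path below is an assumption -- use your upload location.
cd /Workspace/Users/you@example.com/twsapi/source/pythonclient
python setup.py bdist_wheel
pip install dist/ibapi-*.whl
```

After this, `import ibapi` works in that notebook's Python cells. For a cluster-scoped install, take the .whl produced in `dist/` and upload it through the cluster's Libraries tab instead.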
