Data Engineering

'databricks-connect' is not recognized as an internal or external command, operable program or batch file on Windows

dimoobraznii
New Contributor III

Hello,

I've installed databricks-connect on Windows 10:

C:\Users\danoshin>pip install -U "databricks-connect==9.1.*"
Collecting databricks-connect==9.1.*
  Downloading databricks-connect-9.1.2.tar.gz (254.6 MB)
     |████████████████████████████████| 254.6 MB 7.6 kB/s
Collecting py4j==0.10.9
  Downloading py4j-0.10.9-py2.py3-none-any.whl (198 kB)
     |████████████████████████████████| 198 kB ...
Requirement already satisfied: six in c:\users\danoshin\appdata\local\packages\pythonsoftwarefoundation.python.3.9_qbz5n2kfra8p0\localcache\local-packages\python39\site-packages (from databricks-connect==9.1.*) (1.16.0)
Using legacy 'setup.py install' for databricks-connect, since package 'wheel' is not installed.
Installing collected packages: py4j, databricks-connect
    Running setup.py install for databricks-connect ... done
Successfully installed databricks-connect-9.1.2 py4j-0.10.9
WARNING: You are using pip version 21.2.3; however, version 21.3.1 is available.
You should consider upgrading via the 'C:\Users\danoshin\AppData\Local\Microsoft\WindowsApps\PythonSoftwareFoundation.Python.3.9_qbz5n2kfra8p0\python.exe -m pip install --upgrade pip' command.

However, I can't run it:

C:\Users\danoshin>databricks-connect configure
'databricks-connect' is not recognized as an internal or external command,
operable program or batch file.

I tried to find the executable's path and found the following, but it looks wrong too:

C:\Users\danoshin\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.9_qbz5n2kfra8p0\LocalCache\local-packages\Python39\site-packages\pyspark

This path contains databricks-connect.py.

How do I make it work on Windows?
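For readers hitting the same error: a hedged diagnostic sketch (mine, not from the thread) that checks whether pip's Scripts directory is on PATH. With the Microsoft Store Python, that directory is often missing from PATH, which produces exactly this "not recognized" error.

```python
import os
import sysconfig

# pip installs command-line entry points into the interpreter's
# Scripts directory; print where that is for this interpreter.
scripts_dir = sysconfig.get_path("scripts")
print("Scripts directory:", scripts_dir)

# cmd.exe only finds pip-installed commands if this directory is on PATH.
on_path = scripts_dir.lower() in os.environ.get("PATH", "").lower()
print("Scripts directory on PATH:", on_path)
```

If the check prints False, adding that directory to PATH (or reinstalling inside an activated virtual environment) is one way to make the command resolvable.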

3 REPLIES

Kaniz
Community Manager

Hi @dimoobraznii! My name is Kaniz, and I'm the technical moderator here. Great to meet you, and thanks for your question! Let's see if your peers in the community have an answer to your question first. Otherwise, I will get back to you soon. Thanks.

Hubert-Dudek
Esteemed Contributor III

@Dmitry Anoshin, my only ideas so far are:

  • uninstall pyspark, as it can cause conflicts,
  • run everything in administrator mode
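A minimal sketch of checking for the conflict mentioned in the first bullet (my illustration, assuming the conflict shows up as a separately importable pyspark):

```python
import importlib.util

# databricks-connect ships its own pyspark, so a separately installed
# pyspark can shadow it. Check what "import pyspark" would actually load.
spec = importlib.util.find_spec("pyspark")
if spec is None:
    print("No pyspark importable on this interpreter")
else:
    print("pyspark would load from:", spec.origin)
```

If the printed location is not the copy that came with databricks-connect, `pip uninstall pyspark` per the suggestion above would remove the conflicting install.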

-werners-
Esteemed Contributor III

@Dmitry Anoshin, that seems messed up.

The best you can do is remove databricks-connect and also uninstall any pyspark installation.

Then follow the installation guide.

It should work after following the procedure.

I use a Linux VM for this purpose, mainly to avoid Windows 10. I am looking into using VSCode with WSL2, which should work too, but I still have to test it.
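Until the reinstall sorts out PATH, one hedged workaround (a sketch, assuming pip dropped the script into the interpreter's Scripts directory, as the question's `databricks-connect.py` find suggests) is to look the script up by absolute path instead of relying on PATH:

```python
import os
import sysconfig

# Look for the databricks-connect launcher in the Scripts directory;
# the exact file name depends on how the package declares its scripts.
scripts_dir = sysconfig.get_path("scripts")
for name in ("databricks-connect.exe", "databricks-connect.py", "databricks-connect"):
    candidate = os.path.join(scripts_dir, name)
    if os.path.exists(candidate):
        print("Found launcher:", candidate)
        break
else:
    print("No databricks-connect script in", scripts_dir)
```

A found `.exe` can be run directly by its full path (e.g. `<full path> configure`); a bare `.py` script would need to be invoked through `python`.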
