'databricks-connect' is not recognized as an internal or external command, operable program or batch file (Windows)
10-27-2021 05:38 PM
Hello,
I've installed databricks-connect on Windows 10:
C:\Users\danoshin>pip install -U "databricks-connect==9.1.*"
Collecting databricks-connect==9.1.*
Downloading databricks-connect-9.1.2.tar.gz (254.6 MB)
|████████████████████████████████| 254.6 MB 7.6 kB/s
Collecting py4j==0.10.9
Downloading py4j-0.10.9-py2.py3-none-any.whl (198 kB)
|████████████████████████████████| 198 kB ...
Requirement already satisfied: six in c:\users\danoshin\appdata\local\packages\pythonsoftwarefoundation.python.3.9_qbz5n2kfra8p0\localcache\local-packages\python39\site-packages (from databricks-connect==9.1.*) (1.16.0)
Using legacy 'setup.py install' for databricks-connect, since package 'wheel' is not installed.
Installing collected packages: py4j, databricks-connect
Running setup.py install for databricks-connect ... done
Successfully installed databricks-connect-9.1.2 py4j-0.10.9
WARNING: You are using pip version 21.2.3; however, version 21.3.1 is available.
You should consider upgrading via the 'C:\Users\danoshin\AppData\Local\Microsoft\WindowsApps\PythonSoftwareFoundation.Python.3.9_qbz5n2kfra8p0\python.exe -m pip install --upgrade pip' command.
However, I can't run it:
C:\Users\danoshin>databricks-connect configure
'databricks-connect' is not recognized as an internal or external command,
operable program or batch file.
I tried to find the executable myself and found something, but it looks wrong too:
C:\Users\danoshin\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.9_qbz5n2kfra8p0\LocalCache\local-packages\Python39\site-packages\pyspark
That directory contains databricks-connect.py.
How do I make this work on Windows?
10-28-2021 01:57 AM
@Dmitry Anoshin , my only ideas so far are:
- uninstall pyspark, since it can conflict with databricks-connect,
- run everything in administrator mode.
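The clean-up suggested above could look something like the following (a sketch only; the version pin is taken from the original install, and the commands would need to run in a prompt where pip itself is on PATH):

```shell
# Remove both packages so their bundled pyspark copies cannot conflict
pip uninstall -y pyspark databricks-connect

# Reinstall the version matching the cluster runtime (9.1 in this thread)
pip install -U "databricks-connect==9.1.*"

# Then try the configure step again
databricks-connect configure
```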
10-28-2021 01:57 AM
@Dmitry Anoshin , that installation looks messed up.
The best you can do is remove databricks-connect and also uninstall any pyspark installation, then follow the installation guide again. It should work after that.
I use a Linux VM for this purpose, mainly to avoid Windows 10. I am also looking into using VS Code with WSL2, which should work too, but I still have to test it.
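One likely cause of the "not recognized" error, judging from the LocalCache path in the question, is a Microsoft Store Python: pip places console-script launchers like databricks-connect.exe in a per-user Scripts directory that is often not on PATH. A quick way to check where that directory is (standard library only; the exact path shown will differ per machine):

```python
# Print the directory where pip installs console-script launchers.
# If this directory is not on PATH, commands like `databricks-connect`
# fail with "is not recognized" even though the install succeeded.
import sysconfig

scripts_dir = sysconfig.get_path("scripts")
print(scripts_dir)
```

If the printed directory is missing from PATH, adding it (System Properties > Environment Variables, or `setx PATH`) and opening a new prompt should make the command resolvable.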

