11-02-2021 06:30 AM
I'm trying to export a csv file from my Databricks workspace to my laptop.
I have followed the steps below:
1. Installed the Databricks CLI
2. Generated a token in Azure Databricks
3. Ran: databricks configure --token
4. Entered the token: xxxxxxxxxxxxxxxxxxxxxxxxxx
5. Ran: databricks fs cp -r dbfs:/your_folder destination/your_folder
I get the below error. Can anyone help?
Error: ConnectionError: HTTPSConnectionPool(host='%3cdatabricks-instance%3e', port=443): Max retries exceeded with url: /api/2.0/workspace/get-status?path=%2FFileStore%2Fshared_uploads%2Fwwid%40 (Caused by NewConnectionError('<urllib3.connection.VerifiedHTTPSConnection object at 0x000001E8649DDC08>: Failed to establish a new connection: [Errno 11001] getaddrinfo failed'))
Accepted Solutions
11-02-2021 06:59 AM
Please check the %USERPROFILE%\.databrickscfg file on Windows; it should include:
[DEFAULT]
host = <workspace-URL>
token = <personal-access-token>
Please validate the host variable. Here is additional info on how to get it: https://docs.databricks.com/workspace/workspace-details.html#workspace-url
Please validate the CLI connection using a simple command, for example:
databricks workspace list
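In this case the error message itself points at the host: the host string the CLI tried to resolve is percent-encoded. Decoding it (a quick check in plain Python, nothing Databricks-specific) shows the literal placeholder was used instead of a real workspace URL:

```python
from urllib.parse import unquote

# Host string copied from the ConnectionError in the question
encoded_host = "%3cdatabricks-instance%3e"

# Percent-decoding reveals the un-replaced placeholder
print(unquote(encoded_host))  # -> <databricks-instance>
```

So the fix is to set host in .databrickscfg to your actual workspace URL rather than the `<databricks-instance>` placeholder from the docs.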
11-09-2021 04:02 AM
Thanks for the help, Hubert.
My host URL was not set correctly. I made the changes as suggested and that fixed the issue.
11-10-2021 05:00 AM
I am glad that it helped.

11-02-2021 08:49 AM
Depending on the file size, you can use display() in a notebook to download a CSV to your local laptop. It's possible that an admin has disabled this feature, so you may not see it.
11-09-2021 02:15 AM
Thanks Joseph, but the file I want to export is larger; it exceeds the row limit that display() can show.
11-16-2021 07:14 PM
Hi @Sarvagna Mahakali, there is an easier hack:
a) You can save the results to disk and create a hyperlink for downloading the CSV. Copy the file to this location: dbfs:/FileStore/table1_good_2020_12_18_07_07_19.csv
b) Then download it with the link: https://<yourworkspace>.cloud.databricks.com/files/trace_good_2020_12_18_07_07_19.csv
Hope it helps! 😊
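As a sketch of step (b): if the workspace serves a file at dbfs:/FileStore/<path> under https://<workspace>/files/<path> (the mapping described above), the download link can be built like this. The workspace host name here is hypothetical, and the file name is the one from step (a):

```python
def filestore_download_url(workspace_host: str, dbfs_path: str) -> str:
    """Map a dbfs:/FileStore/... path to its /files/ download URL."""
    prefix = "dbfs:/FileStore/"
    if not dbfs_path.startswith(prefix):
        raise ValueError("file must live under dbfs:/FileStore/ to be downloadable")
    return f"https://{workspace_host}/files/{dbfs_path[len(prefix):]}"

# Hypothetical workspace host + the file copied in step (a)
url = filestore_download_url(
    "myworkspace.cloud.databricks.com",
    "dbfs:/FileStore/table1_good_2020_12_18_07_07_19.csv",
)
print(url)
```

Opening the printed URL in a browser (while logged in to the workspace) should trigger the CSV download.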