11-29-2023 03:08 AM
Hi!
I have been carrying out a POC, so I created a CSV file in my workspace and tried to read its contents in a Python notebook using the techniques below, but it did not work.
Option 1:
As an alternative, I uploaded the CSV file to a blob storage account and was able to read it without any issues.
I am curious, as I believe there must be a way to read the CSV file from my workspace as well.
I would be glad if you could post here how to do so.
Thanks!
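(For reference, the blob storage read that worked was presumably along these lines; the storage account, container, and path below are placeholders, and the cluster is assumed to already have credentials configured for the storage account.)

# Read the CSV from Azure blob storage via an abfss:// URI.
# Account, container, and path are placeholders.
df_blob = spark.read.csv(
    "abfss://mycontainer@mystorageaccount.dfs.core.windows.net/poc/my_poc.csv",
    header=True,
    inferSchema=True,
)
display(df_blob)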
11-29-2023 03:55 AM
I tried the repo_file values below in both options. However, I continue to see the same exception.
11-29-2023 08:14 AM
Hi @Dev
You can read it as shown below on Databricks Runtime 12.2 and above (prefix the path with file:/).
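A minimal sketch of what that looks like, assuming a placeholder workspace path (not from the original thread):

# Read a CSV stored as a workspace file (DBR 12.2 and above).
# The file:/ prefix tells Spark to read from the driver's local filesystem,
# where workspace files are available under /Workspace.
df = spark.read.csv(
    "file:/Workspace/Users/your.name@example.com/my_poc.csv",  # placeholder path
    header=True,
    inferSchema=True,
)
display(df)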
08-05-2024 03:37 AM
This worked for me, thanks. Adding file:/ before /Workspace did the trick.
11-29-2023 08:49 AM - edited 11-29-2023 08:53 AM
@Retired_mod I copied the file path and used the same, but it didn't help. It works fine if I copy the file path from Repos, but not from the user's workspace area.
3 weeks ago
For Unity Catalog-enabled clusters with the default security permissions, I think we can no longer access files this way.
3 weeks ago
Hi @Dev,
Generally, the Spark reader APIs point to DBFS by default. To read a file from the user workspace, we need to add the 'file:/' prefix to the path.
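A short sketch of the difference, using a placeholder path:

# Without a scheme, Spark resolves the path against DBFS, so this looks for
# the file under dbfs:/ and fails if it only exists as a workspace file:
# df = spark.read.csv("/Workspace/Users/your.name@example.com/my_poc.csv", header=True)

# With the file:/ prefix, Spark reads the workspace file from the driver's
# local filesystem (placeholder path):
df = spark.read.csv(
    "file:/Workspace/Users/your.name@example.com/my_poc.csv",
    header=True,
)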
Thanks