- 51463 Views
- 25 replies
- 16 kudos
For example, I have one.py and two.py in Databricks, and I want to use a module from one.py in two.py. On my local machine I would normally do this with an import statement, like below:
two.py:
from one import module1
.
.
.
How do I do this in Databricks?
Latest Reply
This alternative worked for us: https://community.databricks.com/t5/data-engineering/is-it-possible-to-import-functions-from-a-module-in-workspace/td-p/5199
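One way the sibling-import pattern can work is to make the folder containing one.py importable before the import statement runs. Here is a minimal sketch that simulates it with a temporary folder; on Databricks the folder would instead be a Workspace or Repos path, and the file contents and names here are placeholders:

```python
import os
import sys
import tempfile

# Simulate one.py as a sibling file by writing it into a temp folder.
# On Databricks this folder would be a Workspace/Repos path (an assumption).
folder = tempfile.mkdtemp()
with open(os.path.join(folder, "one.py"), "w") as f:
    f.write("def module1():\n    return 'hello from one.py'\n")

# Once the folder is on sys.path, the same import statement used on a
# local machine works unchanged.
sys.path.append(folder)
from one import module1

print(module1())
```

In a Repos-backed notebook the repo root is typically already on sys.path, so the explicit append may be unnecessary there.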
- 9142 Views
- 1 replies
- 0 kudos
without uploading the file into DBFS? Thanks!
Latest Reply
In my opinion it doesn't make sense, but you can mount an SMB Azure file share on a Windows machine (https://learn.microsoft.com/en-us/azure/storage/files/storage-how-to-use-files-windows) and then mount the same folder on Databricks using pip install ...
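For reference, the Windows side of that linked doc mounts the share with a `net use` command. The sketch below shows only the general command shape; every angle-bracketed value is a placeholder, and the exact `/user:` domain prefix should be taken from the linked Microsoft documentation rather than from here:

```
REM Mount an Azure Files SMB share as drive Z: on Windows (placeholders only).
net use Z: \\<storage-account-name>.file.core.windows.net\<share-name> /user:localhost\<storage-account-name> <storage-account-key>
```

The Databricks side of the reply is truncated in the source, so no corresponding cluster-side command is shown here.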
- 38945 Views
- 16 replies
- 0 kudos
I couldn't find a way in the documentation to export an RDD as a text file to a local folder using Python. Is it possible?
Latest Reply
To export a file to your local desktop, the workaround is essentially to do a "Create a table in notebook" with DBFS.
The steps are:
Click on "Data" icon > Click "Add Data" button > Click "DBFS" button > Click "FileStore" folder icon in 1st pane "Sele...
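A code alternative to the UI steps above, assuming the RDD is small enough to collect to the driver: collect the rows and write them as a plain text file. On Databricks you would target a path under /dbfs/FileStore/ so the file becomes downloadable from the browser at https://<workspace-url>/files/<name>; the sketch below writes to a temp folder instead, and the rows list stands in for a real rdd.collect():

```python
import os
import tempfile

# Stand-in for rdd.collect(); on Databricks, use the real RDD and a path
# like /dbfs/FileStore/export.txt to make the file browser-downloadable.
rows = ["a,1", "b,2", "c,3"]
out_path = os.path.join(tempfile.mkdtemp(), "export.txt")
with open(out_path, "w") as f:
    f.write("\n".join(rows) + "\n")

print(open(out_path).read())
```

For large RDDs, rdd.saveAsTextFile("dbfs:/FileStore/...") avoids collecting to the driver, at the cost of producing multiple part files.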