Hi,
I am saving some files locally on my cluster and moving them to DBFS after my job finishes. These are log files from my process, so I can't write them directly to a DBFS location.
However, the dbutils.fs.cp command does not work on the shared cluster, even though the same code works on a single-user (individual) cluster. I believe this is related to how shared clusters are isolated between users.
File location: "/home/spark-daed4064-233f-446c-b9f2-5b/log.txt"
Copy command:
import os

# path gets set to /home/spark-daed4064-233f-446c-b9f2-5b/
path = os.getcwd()
new_path = f"{path}/logs.txt"

# output printed -> /home/spark-4c17311c-654a-4c71-b551-2e/logs.txt
print(new_path)

dbutils.fs.cp(new_path, "dbfs:/databricks/scripts/logs.txt")
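
For reference, one variant I can also try is prefixing the local path with the file: scheme, since dbutils.fs treats scheme-less paths as DBFS paths. This is just a sketch of that attempt, assuming the log file sits on the driver's local disk:

import os

# Sketch, assuming the file is on the driver's local filesystem:
# the file: scheme tells dbutils.fs to read a local path instead of
# resolving it against the DBFS root.
local_path = f"file:{os.getcwd()}/logs.txt"
dbutils.fs.cp(local_path, "dbfs:/databricks/scripts/logs.txt")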