Hi all,
Looking to get some help. We are on Unity Catalog in Azure. We have a requirement to write out several PNG files from Python via Matplotlib and then drop them into an ADLS Gen2 container. With Unity Catalog we can easily use dbutils.fs.cp or dbutils.fs.put for the transfer; however, the PNGs need to be written somewhere on the cluster first, before we copy them over to the ADLS Gen2 container.
The issue: dbutils cannot access every location on the cluster, and in the folders it can access we get Errno 13 (Permission denied) when trying to write the PNGs. So I am not sure where I can drop the files in order to do the copy. Here is the code snippet:
<CODE HERE TO GENERATE CHART>... FOLLOWED BY..
img_name = f'{product_level_1}-{product_level_2}-{product_level_3}-{value_type_1}-{value_type_2}.png'
plt.savefig('/databricks-datasets/'+img_name, bbox_inches='tight', format='png')
dbutils.fs.cp('/databricks-datasets/'+img_name, storage_url+img_name)
print(img_name)
So, if I call plt.savefig with just img_name, the file lands in the default workspace location, but then dbutils cannot locate it. And when I try to pick folders that dbutils can see, the write fails with permission issues.
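One approach I have been trying is writing to the driver's local /tmp and then copying with an explicit file:/ scheme so dbutils resolves the local path rather than DBFS. A minimal sketch of that, with placeholder chart data and an assumed /tmp/charts directory (the real img_name and storage_url come from the variables above):

```python
import os

import matplotlib
matplotlib.use('Agg')  # headless backend, since the cluster driver has no display
import matplotlib.pyplot as plt

# Placeholder name; in the real job this is built from the product/value variables
img_name = 'demo-chart.png'
local_dir = '/tmp/charts'  # driver-local scratch space (assumption)
os.makedirs(local_dir, exist_ok=True)
local_path = os.path.join(local_dir, img_name)

# Stand-in for the real chart-generation code
fig, ax = plt.subplots()
ax.plot([1, 2, 3], [2, 4, 8])
fig.savefig(local_path, bbox_inches='tight', format='png')
plt.close(fig)

# dbutils.fs.cp needs the explicit file:/ scheme to read a driver-local file
# (commented out here because dbutils only exists inside Databricks):
# dbutils.fs.cp(f'file:{local_path}', storage_url + img_name)
```

Does that pattern look right, or is there a better landing spot on the cluster for these intermediate files?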