by g96g • New Contributor III
- 1056 Views
- 1 replies
- 0 kudos
I have a problem with reading a file from ADLS Gen 2. I have done the mounting properly, as after executing dbutils.fs.ls('/mnt/bronze') I can see the file path. This is how I did the mounting:
# dbutils.fs.mount(
# source = "abfss://"+container_r...
Latest Reply
Hi @Givi Salu, great to meet you, and thanks for your question! Let's see if your peers in the community have an answer. Thanks.
- 6274 Views
- 2 replies
- 2 kudos
When I save files to "dbfs:/FileStore/shared_uploads/brunofvn6@gmail.com/", they don't appear anywhere in my workspace. I've tried copying the workspace path with the right mouse button and pasting it into ("my pandas dataframe").to_csv('path'), but wh...
Latest Reply
I think I discovered how to do this. It's under the Data tab in the left menu of the Databricks environment; at the top left of that menu there are two tabs, "Database Tables" and "DBFS", with "Database Tables" as the default. So it is just...
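A small sketch of the write-and-verify flow described above; the file name is a hypothetical example, and the email folder mirrors the one in the question. The key detail is that pandas writes through the local filesystem, so it needs the /dbfs FUSE prefix rather than the dbfs:/ URI:

import pandas as pd

pdf = pd.DataFrame({"a": [1, 2, 3]})

# pandas sees DBFS through the /dbfs FUSE mount, not the dbfs:/ scheme.
pdf.to_csv("/dbfs/FileStore/shared_uploads/brunofvn6@gmail.com/my_dataframe.csv", index=False)

# Confirm the file landed where expected; this same folder is what the
# Data > DBFS browser in the left menu displays.
display(dbutils.fs.ls("dbfs:/FileStore/shared_uploads/brunofvn6@gmail.com/"))

Files saved under /FileStore can also be downloaded in the browser through the workspace's /files/ URL path.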
1 More Reply
- 2381 Views
- 1 replies
- 0 kudos
Which file size is better in the target: 1 GB, 128 MB, or smaller than that? I am interested in understanding the underlying concept too.
Latest Reply
If data is primarily appended to the Delta table and the read ratio is higher than the write ratio, larger file sizes (1 GB) would be ideal. However, if your Delta table undergoes frequent upserts/merges, having files smaller than the default 1 GB ...
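A hedged sketch of how that trade-off is typically expressed in practice on Databricks, using the documented delta.targetFileSize and delta.tuneFileSizesForRewrites table properties; the table name is a hypothetical placeholder:

# Smaller target files suit merge-heavy tables; append-mostly,
# read-heavy tables can keep the larger default.
spark.sql("""
    ALTER TABLE my_schema.events SET TBLPROPERTIES (
        'delta.targetFileSize' = '128mb',
        'delta.tuneFileSizesForRewrites' = 'true'
    )
""")

# Compact existing small files toward the configured target size.
spark.sql("OPTIMIZE my_schema.events")

With delta.tuneFileSizesForRewrites enabled, Databricks automatically picks smaller file sizes for tables it detects as frequent MERGE targets, which is exactly the pattern the reply describes.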