Hi @James Owen, we haven't heard from you on the last response from @Debayan Mukherjee, and I was checking back to see if his suggestions helped you. Otherwise, if you have a solution, please do share it with the community, as it can be helpful to...
We are doing a DBFS migration. We have a folder 'user' in the root DBFS of the legacy workspace holding 5.8 TB of data. We performed an AWS CLI sync/cp from the legacy bucket to the target bucket, and then the same from the target bucket to the target DBFS. While implemen...
Thanks for the quick response. Regarding the suggested AWS DataSync approach, we have tried DataSync in multiple ways; it creates folders in the S3 bucket itself, not on DBFS, whereas our task is to copy from the bucket to DBFS. It seems that it only supports ...
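If it helps, the final hop from the target bucket onto DBFS can be done from a notebook inside the target workspace, since DBFS paths are only addressable there. A minimal sketch, assuming the bucket is mounted at a hypothetical /mnt/target-bucket:

# Hedged sketch: recursively copy from a mounted S3 bucket into the DBFS
# root 'user' folder, run from a notebook in the target workspace.
dbutils.fs.cp("dbfs:/mnt/target-bucket/user/", "dbfs:/user/", recurse=True)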
When I tried to read the file from DBFS, it threw this error: Caused by: FileReadException: Error while reading file dbfs:/.......................parquet is not a Parquet file. Expected magic number at tail [80, 65, 82, 49] but found [105, 108, 101, 115]. Bu...
Hi @KARTHICK N, what is the exact line of code you are using to read the file, and the precise path? Can you confirm whether your file is CSV or Parquet? Are you trying to read it in Python or Scala?
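For what it's worth, [80, 65, 82, 49] is ASCII for "PAR1", the footer every valid Parquet file ends with, while [105, 108, 101, 115] spells "iles", i.e. plain text. A quick diagnostic sketch through the /dbfs fuse mount, with a hypothetical path:

# Hedged diagnostic: check the last 4 bytes of the file; a valid Parquet
# file always ends with the b"PAR1" magic number.
path = "/dbfs/tmp/data.parquet"  # hypothetical path, replace with the real one
with open(path, "rb") as f:
    f.seek(-4, 2)   # 4 bytes before end of file
    tail = f.read(4)
print(tail)  # b"PAR1" for a real Parquet file; anything else suggests the copy corrupted it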
I am working on a Databricks workspace migration, where I need to copy the Databricks workspace, including DBFS, from source to target (the source and target are in different subscriptions/accounts). Can someone suggest what the approach could be to migrate D...
Hi @Aquib Javeed, we haven't heard from you on my last response, and I was checking back to see if my suggestions helped you. Otherwise, if you have a solution, please share it with the community, as it can be helpful to others.
Hi, is it possible to restrict uploading files to the DBFS root (since everyone has access)? The idea is to force users to use an ADLS2 mount with credential passthrough, for security reasons. Also, right now users use Azure Blob Explorer to interact with ADLS2...
Hi @E H, we haven't heard from you on the last response from @Arvind Ravish, and I was checking back to see if his suggestions helped you. Otherwise, if you have a solution, please share it with the community, as it can be helpful to others. Also, p...
Thanks for your kind reply. The following works for me: https://imgur.com/BmMzatI But why, as you mentioned, does the classic path below not work? https://imgur.com/Ba1a4Iv
I wanted to save my Delta tables in my Databricks database. When I call saveAsTable, I get the error message: Azure Databricks: AnalysisException: Database 'bf' not found. Yes, there is no database named "bf" in my workspace. Here is my full code: import os
i...
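In case it's useful, the usual fix for this AnalysisException is to create the database before calling saveAsTable. A minimal sketch (the table name is hypothetical):

# Create the missing database first, then save the Delta table into it.
spark.sql("CREATE DATABASE IF NOT EXISTS bf")
df.write.format("delta").mode("overwrite").saveAsTable("bf.my_table")  # hypothetical table name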
We've set up a premium workspace with a credential passthrough cluster. While it does work and can access my ADLS Gen2 storage, I can't install a library on the cluster from there, and I keep getting "Library installation attempted on the driver no...
Sorry, I can't figure this out. The link you've added is irrelevant for passthrough credentials; if we add it, the cluster won't be passthrough. Is there a way to add this just for a specific folder, while keeping passthrough for the rest?
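One workaround that may apply here is a notebook-scoped library, which installs per-notebook rather than cluster-wide and is generally permitted where cluster library installation is blocked. A sketch with a hypothetical package and version:

# In a notebook cell on the passthrough cluster (notebook-scoped install):
%pip install pandas==1.5.3  # hypothetical package and version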
Hello everyone, I accidentally created a DBFS folder named ${env]. But when I run this command: dbutils.fs.rm("/mnt/${env]") it returns this error: java.net.URISyntaxException: Illegal character in path at index 12: /mnt/$%7Benv] How can I delete it, please ...
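A possible workaround, since dbutils.fs parses its argument as a URI: go through the /dbfs fuse mount with plain Python file APIs, where the folder name is just bytes on disk. A sketch, assuming the fuse mount is available on your cluster:

# Hedged workaround: delete the oddly named folder via the /dbfs fuse mount,
# which bypasses URI parsing entirely.
import shutil
shutil.rmtree("/dbfs/mnt/${env]")  # the literal folder name, no escaping needed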
Hello all! I am following the guide https://docs.databricks.com/data/filestore.html to save a folder of static HTML content to the DBFS FileStore directory (as a sub-directory), and I have the "enable DBFS web browsing" setting on, but I still can't view the web p...
@Sergii Ivakhno In FileStore you can save files, such as images and libraries, that are accessible within HTML and JavaScript when you call displayHTML. However, when you try to access the link directly, it will download the file to your local desktop.
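For reference, files under dbfs:/FileStore are served at the /files/ path inside a notebook, so an image saved there can be embedded in rendered HTML instead of being downloaded. A small sketch with a hypothetical file name:

# Copy a hypothetical image into FileStore, then embed it with displayHTML;
# inside a notebook, /files/ maps to dbfs:/FileStore/.
dbutils.fs.cp("dbfs:/tmp/logo.png", "dbfs:/FileStore/images/logo.png")
displayHTML('<img src="/files/images/logo.png" width="200">')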
Problem statement:
Source file format: .tar.gz
Avg size: 10 MB
Number of tar.gz files: 1,000
Each tar.gz file contains around 20,000 CSV files.
Requirement: Untar the tar.gz files and write the CSV files to blob storage / an intermediate storage layer for further...
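One way to approach this, sketched under the assumption that the archives sit under a hypothetical dbfs:/mnt/raw/tars/ and should land under /mnt/staging/csv/: distribute one archive per Spark task and extract through the /dbfs fuse mount.

# Hedged sketch: untar ~1,000 .tar.gz archives in parallel, one per task.
import os
import tarfile

def untar(archive_path):
    # Extract each archive into its own subfolder to avoid name collisions.
    local = archive_path.replace("dbfs:", "/dbfs")
    out_dir = "/dbfs/mnt/staging/csv/" + os.path.basename(local).replace(".tar.gz", "")
    os.makedirs(out_dir, exist_ok=True)
    with tarfile.open(local, "r:gz") as tar:
        tar.extractall(out_dir)

archives = [f.path for f in dbutils.fs.ls("dbfs:/mnt/raw/tars/") if f.path.endswith(".tar.gz")]
sc.parallelize(archives, len(archives)).foreach(untar)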
Hi all,
I am using saveAsTextFile() to store the results of a Spark job in the folder dbfs:/FileStore/my_result.
I can access the different "part-xxxxx" files using the web browser, but I would like to automate the process of downloading all fil...
This works well if the file is stored in FileStore. However, if it is stored in the mnt folder, you will need something like this: https://community.cloud.databricks.com/dbfs/mnt/blob/<file_name>.csv?o=<your_number_here> Note that this will prompt you for yo...
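If the goal is a single download rather than many part files, one hedged option is to merge them into one file under /FileStore first, so a single /files/ link fetches everything. A sketch using the folder from the question:

# Merge the part-xxxxx files into one file so a single link downloads it all.
parts = sorted(p.path for p in dbutils.fs.ls("dbfs:/FileStore/my_result")
               if p.name.startswith("part-"))
with open("/dbfs/FileStore/my_result.txt", "wb") as out:  # fuse-mount path
    for part in parts:
        with open(part.replace("dbfs:", "/dbfs"), "rb") as f:
            out.write(f.read())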
I want to create an external function using CREATE FUNCTION (External) and expose it to users of my SQL endpoint. Although this works from a SQL notebook, if I try to use the function from a SQL endpoint, I get "User defined expression is not supporte...
Hi folks, I have installed and configured the Databricks CLI on my local machine. I tried to copy a local file from my personal computer to a dbfs:/ path using dbfs cp. I can see the file is copied from local, but it is only visible locally; I am not able to ...
Hi, could you try saving the file from your local machine to the dbfs:/FileStore location?
# Put local file test.py to dbfs:/FileStore/test.py
dbfs cp test.py dbfs:/FileStore/test.py
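Once the copy has run, it may help to confirm from a notebook that the file actually landed on DBFS, for example:

# Verify the upload from a notebook: list FileStore and peek at the file.
display(dbutils.fs.ls("dbfs:/FileStore/"))
print(dbutils.fs.head("dbfs:/FileStore/test.py"))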