04-12-2023 02:45 PM
I have SQL warehouse endpoints that work fine when querying from applications such as Tableau, but running the included sample query against a running endpoint from the Query Editor in the workspace returns "Unable to upload to DBFS Query". I'm not sure where to start looking to solve this.
04-15-2023 05:43 PM
@Marvin Ginns:
The "Unable to upload to DBFS Query" error typically occurs when the Databricks Query Editor is unable to upload the query results to the Databricks File System (DBFS). This can happen for various reasons, such as network connectivity issues, insufficient DBFS storage, or authentication errors.
Here are some troubleshooting steps that you can try:
1. Check your network connectivity: Make sure that you have a stable internet connection and that you are able to access other websites or services without issues.
2. Check your DBFS storage: Verify that you have enough DBFS storage to store the query results. You can check your current DBFS storage usage by running the following command in a notebook cell (a rough alternative is sketched after this list):
%fs df
3. If you are running low on storage, you can try deleting some unnecessary files or increasing your DBFS storage limit.
4. Verify your credentials: Make sure that you have the necessary permissions to access the SQL warehouse endpoint and the DBFS. If you are using a service principal to authenticate, verify that the principal has the required permissions to access the endpoint and the DBFS.
5. Check your query: Verify that your query is syntactically correct and that it is not returning an excessive amount of data. Large query results may take a long time to upload to the DBFS and can cause the "Unable to upload to DBFS Query" error.
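As a rough, hedged alternative for estimating usage (the dbfs:/FileStore prefix is only an example), you can total file sizes under a DBFS path from a Python notebook cell:

# Minimal sketch: estimate DBFS usage by summing file sizes under a prefix.
# dbutils is available in Databricks notebooks; the prefix is an example only,
# and recursing over a very large tree can be slow.
def dbfs_usage_bytes(path="dbfs:/FileStore"):
    total = 0
    for entry in dbutils.fs.ls(path):
        if entry.isDir():
            total += dbfs_usage_bytes(entry.path)
        else:
            total += entry.size
    return total

print(f"~{dbfs_usage_bytes() / 1024 ** 3:.2f} GiB under dbfs:/FileStore")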
04-17-2023 11:10 AM
Hi Suteja, thanks for your feedback. See my comments below and let me know if you can assist further!
1) Check your network connectivity: Make sure that you have a stable internet connection and that you are able to access other websites or services without issues.
Connectivity is confirmed since I can connect to the SQL endpoint via Partner Connect and load the data into any number of tools, while at the same time receive this error in Query Editor immediately afterwards. This is also affecting anyone else using Query Editor from any location in the U.S.
2, 3) Check your DBFS storage: Verify that you have enough DBFS storage to store the query results. You can check your current DBFS storage usage by running the following command in a notebook cell:
I receive the following error when running that command in a notebook, but I connected to the driver web terminal and ran 'df -h', and DBFS disk space does not appear to be an issue.
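As a small hedged aside, the same check can be run from a Python notebook cell on the driver instead of the web terminal:

# Sketch: run the same disk-space check on the driver from a notebook cell.
import subprocess
print(subprocess.run(["df", "-h"], capture_output=True, text=True).stdout)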
4) Verify your credentials: Make sure that you have the necessary permissions to access the SQL warehouse endpoint and the DBFS. If you are using a service principal to authenticate, verify that the principal has the required permissions to access the endpoint and the DBFS.
I am an admin and receive the same message, but our end-users are using Query Editor in the Workspace via their AAD credentials (Azure Databricks). End-users have access to hive_metastore.default and can view table sample data via Data Explorer. Users can also load data via Partner Connect with their user tokens in tools like Power BI, so their warehouse permissions and hive permissions do not seem to be an issue. I'm not aware of any explicit DBFS volume permissions that are required on the managed storage account where the root is located.
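If it helps, a hedged way to double-check object-level grants from a notebook (the table name is a placeholder, and this assumes table access control is in effect) would be:

# Sketch: list grants on the table the end-users are querying.
# hive_metastore.default.sample_table is a placeholder; substitute the real table.
spark.sql("SHOW GRANTS ON TABLE hive_metastore.default.sample_table").show(truncate=False)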
5) Check your query: Verify that your query is syntactically correct and that it is not returning an excessive amount of data. Large query results may take a long time to upload to the DBFS and can cause the "Unable to upload to DBFS Query" error.
Here's an example of a query that is generating the error. Let me know if there are other suggestions or where I can find logs related to DBFS or the SQL endpoint. Thx!
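One more hedged idea tied to suggestion 5: re-run the same query from a Python notebook with an explicit LIMIT (the table name below is a placeholder, not the actual failing query) to see whether result size, rather than the Query Editor's upload step, is the problem:

# Sketch: reproduce the Query Editor query from a notebook with a bounded result set.
# The table name is a placeholder for whatever fails in the Query Editor.
df = spark.sql("SELECT * FROM hive_metastore.default.sample_table LIMIT 100")
display(df)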
04-26-2023 09:05 AM
This was resolved by opening a ticket with MSFT.
08-01-2024 09:49 AM
I also opened a ticket with MSFT and this was the response:
"I have confirmed with Core Databricks team and below is the root cause of the issue,
One of the product features (serverless notebooks) uses server-side copy operation in Azure blob storage to process the command results. After enabling private DBFS, the authentication to cloud storage in the control plane switches from access keys to AAD tokens generated on behalf of a Managed Identity (Access Connector). AAD tokens were scoped such that server-side copy would not work. Correcting the scope of AAD tokens addresses the issue. Hence they confirmed that the future deployments should not have any issues"
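For anyone verifying this on an older deployment, a heavily hedged smoke test (paths are examples only, and dbutils.fs.cp from a notebook does not exercise the control plane's server-side copy path) is to write, copy, and read a small file on the DBFS root to confirm basic operations against the root storage account succeed:

# Sketch: basic write/copy/read smoke test against DBFS root storage.
# Paths are examples; this is not the same code path as the control plane's server-side copy.
src = "dbfs:/tmp/dbfs_upload_smoke_test.txt"
dst = "dbfs:/tmp/dbfs_upload_smoke_test_copy.txt"
dbutils.fs.put(src, "hello", overwrite=True)
dbutils.fs.cp(src, dst)
print(dbutils.fs.head(dst))
dbutils.fs.rm(src)
dbutils.fs.rm(dst)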
04-15-2023 11:34 PM
Hi @Marvin Ginns,
Thank you for posting your question in our community! We are happy to assist you.
To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your question?
This will also help other community members who may have similar questions in the future. Thank you for your participation and let us know if you need any further assistance!