Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Tableau Prep to Databricks Error

Karsten
Visitor

Hi all,

When writing from Tableau Prep to Databricks on Azure, we receive the following error message:

[Databricks][Hardy] (52) Error communicating with the service: 403 

Interestingly, reading from Databricks works without any issues. The Databricks user has all necessary permissions and no issues are logged on this page. A temporary table is successfully created and dropped.

What is going wrong here?

3 REPLIES

amirabedhiafi
New Contributor II

Hello @Karsten !

I don't think it's a read permission issue, because Tableau Prep's write path is different from its read path.

Tableau Prep writes database output in stages: it generates rows, writes them to a temporary table/staging area, then moves the data into the final destination table.

So the fact that the temp table is created and dropped only proves the first part works, not the final write or replace step.

https://help.tableau.com/current/prep/en-us/prep_save_share.htm

Databricks output from Tableau Prep supports Create and Replace only; Append is not currently supported.
If you are using Append, switch to Create or Replace, or handle the append inside Databricks with a notebook or job.
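If Append is what you need, one common workaround is to let Tableau Prep replace a staging table and have a small Databricks job copy those rows into the real target. Here's a minimal sketch; the table names are hypothetical examples, and the `spark` session in the comment is assumed to exist in a notebook:

```python
# Sketch of the append workaround: Tableau Prep overwrites a staging
# table (Create/Replace), then a Databricks notebook/job appends the
# fresh rows into the final table. Table names are hypothetical.

def build_append_sql(staging_table: str, target_table: str) -> str:
    """Return the SQL that copies the freshly replaced staging table
    into the final table, so Tableau never needs Append support."""
    return f"INSERT INTO {target_table} SELECT * FROM {staging_table}"

# In a notebook you would then run (assuming a `spark` session):
#   spark.sql(build_append_sql("main.sales.stage_orders", "main.sales.orders"))
```

Scheduling that notebook as a job right after the Tableau Prep flow finishes keeps the append logic entirely on the Databricks side.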

Don't forget that with Unity Catalog (UC), reading only needs SELECT, but writing needs more:

  • USE CATALOG

  • USE SCHEMA

  • MODIFY on the target table for writing

  • CREATE TABLE on the schema if Tableau creates the table

  • MANAGE or ownership if Tableau drops or replaces an existing table
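To make the checklist above concrete, here is a small sketch that emits the corresponding GRANT statements a catalog admin could run. The principal, catalog, schema, and table names are hypothetical placeholders:

```python
# Generate the Unity Catalog GRANT statements for the write privileges
# listed above. All names below are hypothetical examples; replace them
# with your own principal and three-level namespace.

def uc_write_grants(principal: str, catalog: str, schema: str, table: str) -> list:
    fq_schema = f"{catalog}.{schema}"
    fq_table = f"{fq_schema}.{table}"
    return [
        f"GRANT USE CATALOG ON CATALOG {catalog} TO `{principal}`",
        f"GRANT USE SCHEMA ON SCHEMA {fq_schema} TO `{principal}`",
        f"GRANT MODIFY ON TABLE {fq_table} TO `{principal}`",
        f"GRANT CREATE TABLE ON SCHEMA {fq_schema} TO `{principal}`",
    ]

for stmt in uc_write_grants("tableau-prep-sp", "main", "sales", "orders"):
    print(stmt)
```

Dropping or replacing an existing table additionally needs ownership (or MANAGE), which is assigned rather than granted, so it isn't in the list.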

https://docs.databricks.com/aws/en/tables/tables-concepts

The user or service principal must also have at least CAN USE on the SQL warehouse.

 https://kb.databricks.com/dbsql/jobs-in-sql-warehouse-returning-403-error

Tableau requires the Databricks ODBC driver 2.8.2 or newer for this output scenario, and Databricks output support is tied to Tableau Cloud (or publishing to Tableau Cloud) from version 2025.3.

If this answer resolves your question, could you please mark it as "Accept as Solution"? It will help other users quickly find the correct fix.

Senior BI/Data Engineer | Microsoft MVP Data Platform | Microsoft MVP Power BI | Power BI Super User | C# Corner MVP

Hello @amirabedhiafi!

Thanks for your fast reply. I'll forward this information to my colleague (I'm only the Tableau guy).

We have the newest driver, and my user supposedly has admin privileges. But let's see.

One more question: I'm wondering why there are no errors logged under my user account in DBKS. Shouldn't there be some entries there?

Hi again !

When I check your screenshot, I can see that Tableau activity is actually logged under your user and the staging steps seem to succeed.

In other words, Tableau creates a temporary volume, uploads a CSV into /Volumes/..., and drops the volume again. So I think the issue may not be the initial SQL connection or staging table creation.

Can you please filter the query history for failed Tableau queries around the exact timestamp?

If you don't find any failed SQL query, the error may be happening outside Query History, possibly during the final write step, UC volume access, the external location, or simply the ODBC/REST call.
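If clicking through the UI is tedious, the same filter can be applied programmatically via the Query History REST API (`GET /api/2.0/sql/history/queries`). The sketch below only builds the request payload; the timestamps are placeholder epoch-millisecond values, and you would supply your own workspace host and token when actually sending it:

```python
# Build the filter payload for the Databricks Query History REST API
# (GET /api/2.0/sql/history/queries) to list only FAILED queries in a
# time window. Timestamps are placeholders (epoch milliseconds).
import json

def failed_query_filter(start_ms: int, end_ms: int) -> dict:
    return {
        "filter_by": {
            "statuses": ["FAILED"],
            "query_start_time_range": {
                "start_time_ms": start_ms,
                "end_time_ms": end_ms,
            },
        },
        "max_results": 100,
    }

payload = json.dumps(failed_query_filter(1700000000000, 1700003600000))
# Send this payload to https://<workspace-host>/api/2.0/sql/history/queries
# with a bearer token, e.g. using `requests`.
```

If that returns nothing around the 403 timestamp, that's a strong hint the failure happens before any SQL is submitted.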

If this answer resolves your question, could you please mark it as "Accept as Solution"? It will help other users quickly find the correct fix.

Senior BI/Data Engineer | Microsoft MVP Data Platform | Microsoft MVP Power BI | Power BI Super User | C# Corner MVP