
PowerBI "Token expired while fetching results: TEAuthTokenExpired."

pauloquantile
New Contributor III

Hi everyone,

We are currently running into a big challenge with loading data into PowerBI. I need some advice!

To give a bit of context: we introduced Databricks instead of Azure Synapse for a client of ours. We are currently busy moving all the PowerBI reports to read from Databricks instead of Azure Synapse. Everything was fine and working well with smaller datasets, but for the larger and most important ones we run into the "Token expired while fetching results: TEAuthTokenExpired." error. This occurs when refreshing the data online or locally.

- All the PowerBI datasets are authenticated using OAuth.

- Some of the datasets need to load more than 350m records. I know this isn't a best practice, but loading the data from Azure Synapse did work, even though it took more than 5 hours. For now we are focusing on just replacing the data source with Databricks and optimizing the PowerBI reports later.

- Is there any way to increase the token duration when fetching results via PowerBI?
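For reference on the token-duration question: the OAuth credential PowerBI uses against Azure Databricks is a Microsoft Entra ID access token, so its lifetime is governed by Entra ID token policies (typically 60 to 90 minutes by default), not by a Databricks setting. A minimal Python sketch to check how long a given token actually lives, assuming you can obtain the raw bearer token, e.g. via az account get-access-token, is to decode the JWT's exp claim:

```python
import base64
import json
import time

def seconds_until_expiry(bearer_token: str) -> float:
    """Decode the JWT payload and report seconds until the token expires."""
    payload_b64 = bearer_token.split(".")[1]      # a JWT is header.payload.signature
    payload_b64 += "=" * (-len(payload_b64) % 4)  # restore base64 padding
    claims = json.loads(base64.urlsafe_b64decode(payload_b64))
    return claims["exp"] - time.time()

# Example (placeholder token): paste the output of
#   az account get-access-token --resource 2ff814a6-3304-4ab8-85cb-cd0e6f879c1d
# (the GUID is the well-known Azure Databricks resource ID)
# print(seconds_until_expiry("eyJ0eXAi..."))
```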


I have not been able to find a working solution on the Databricks forum or elsewhere on the internet, so please help me out!

[Attached screenshot: pauloquantile_0-1693824616520.png]


6 REPLIES

Hi @Retired_mod ,


Thank you for the response! The problem isn't in the queries; it is just the volume of the data. Loading it simply takes hours and hours.


Where can I find some of the resources or tutorials about the OAuth token?

I am curious whether requesting a new access token would help us in this case, since the refresh fails partway through, e.g. when loading 200m records. If, say, 100m records have already been loaded, I can imagine that refreshing the token might cause problems because the state of the load could be lost.
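One pattern that sidesteps that worry entirely is fetching the table in independent key-range chunks, so that no single query has to outlive the token: each chunk is its own short query, and a failed chunk can be retried without redoing the rest. A rough Python sketch of the idea using the databricks-sql-connector package (hostname, HTTP path, table, key column, and chunk size are all placeholders):

```python
from databricks import sql  # pip install databricks-sql-connector

SERVER_HOSTNAME = "adb-1234567890123456.7.azuredatabricks.net"  # placeholder
HTTP_PATH = "/sql/1.0/warehouses/abc123def456"                  # placeholder
ACCESS_TOKEN = "..."  # PAT, or a freshly issued OAuth token per chunk

TABLE = "catalog.schema.big_table"  # placeholder table
KEY = "id"                          # placeholder numeric key column
TOTAL_ROWS = 350_000_000
CHUNK = 10_000_000  # size each chunk to finish well inside the token lifetime

def fetch_chunk(lo: int, hi: int):
    # A fresh connection per chunk means each slice can run under a newly
    # issued token, so no single query outlives the token.
    with sql.connect(server_hostname=SERVER_HOSTNAME,
                     http_path=HTTP_PATH,
                     access_token=ACCESS_TOKEN) as conn:
        with conn.cursor() as cur:
            cur.execute(f"SELECT * FROM {TABLE} "
                        f"WHERE {KEY} >= {lo} AND {KEY} < {hi}")
            return cur.fetchall()

for lo in range(0, TOTAL_ROWS, CHUNK):
    rows = fetch_chunk(lo, lo + CHUNK)
    # ... write rows to a staging file / local store before the next chunk ...
```

The same idea might be approximated inside PowerBI by splitting one huge table into several range-filtered queries that are appended after load, though whether each gets its own token depends on how the connector schedules them.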


Thanks!

Thank you! I will take a look at it tomorrow.

I just can't imagine that Databricks isn't doing something about this problem, because we can't be the only ones who want to load in a lot of data!

I know it isn't best practice, and we would like to aggregate more in the DWH, but selling a client a product that can't run data loads of more than an hour is a serious risk for the product, if you ask me.

pauloquantile
New Contributor III

Our current solution to this problem is using a Personal Access Token as the authentication method. I did stumble upon a problem where a dataset scheduled via PowerBI fell back to OAuth authentication; I am still checking whether that issue persists.


If this works, we will use OAuth for the datasets that take less than an hour and the Personal Access Token for datasets that take longer. I am also looking into whether I can generate a personal access token via a service principal instead of a user.
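For the service-principal idea, one route that should work is Databricks' OAuth machine-to-machine flow: obtain a workspace access token for the service principal with its client credentials, then call the token API with it so the resulting PAT is owned by the SP rather than a user. A rough Python sketch (host, client ID, and secret are placeholders; double-check the endpoints and whether your workspace allows SPs to create PATs):

```python
import requests

HOST = "https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder
CLIENT_ID = "..."      # the service principal's client (application) ID
CLIENT_SECRET = "..."  # its Databricks OAuth secret

# Step 1: OAuth machine-to-machine token for the service principal
resp = requests.post(
    f"{HOST}/oidc/v1/token",
    auth=(CLIENT_ID, CLIENT_SECRET),
    data={"grant_type": "client_credentials", "scope": "all-apis"},
)
resp.raise_for_status()
sp_token = resp.json()["access_token"]

# Step 2: mint a PAT owned by the service principal (not by any user)
resp = requests.post(
    f"{HOST}/api/2.0/token/create",
    headers={"Authorization": f"Bearer {sp_token}"},
    json={"lifetime_seconds": 30 * 24 * 3600,  # e.g. 30 days
          "comment": "PowerBI scheduled refresh"},
)
resp.raise_for_status()
print(resp.json()["token_value"])  # store this securely; shown only once
```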

@pauloquantile Hi, Paulo. We also have 350+ million records and I am facing the same issue. Is there any workaround for this?

I mentioned the workaround in this thread on 09/07/2023!

viralpatel
New Contributor II

@Retired_mod Is there any further update from Databricks that could help here, or is the workaround @pauloquantile mentioned the only solution?
