09-04-2023 03:51 AM - edited 09-04-2023 03:54 AM
Hi everyone,
We are running into a big challenge at the moment with loading data into Power BI. I need some advice!
To give a bit of context: we introduced Databricks instead of Azure Synapse for a client of ours. We are currently busy moving all the Power BI datasets to read from Databricks instead of Azure Synapse. Everything was fine and working well with smaller datasets, but for the larger and most important ones we run into the "Token expired while fetching results: TEAuthTokenExpired." error. This occurs when refreshing the data online as well as locally.
- All the Power BI datasets are authenticated using OAuth.
- Some of the datasets need to load more than 350m records. I know this isn't a best practice, but loading the data from Azure Synapse was possible, even though it took more than 5 hours. We are now focusing on just replacing the data source with Databricks and optimizing the Power BI datasets later.
- Is there any way to increase the token duration when fetching results via Power BI?
I am not able to find a working solution on the Databricks forum or anywhere else online, please help me out!
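To make the failure mode concrete, below is a rough Python sketch (the hostname, HTTP path, token and table name are placeholders, not our real values) of what such a refresh effectively does against the Databricks SQL warehouse; when the access token expires partway through fetching the result, the whole refresh fails with the error above:

```python
# Rough sketch of the long-running extract behind a Power BI refresh.
# Hostname, HTTP path, token and table are placeholders (assumptions).
from databricks import sql  # databricks-sql-connector

with sql.connect(
    server_hostname="adb-0000000000000000.0.azuredatabricks.net",  # placeholder
    http_path="/sql/1.0/warehouses/0123456789abcdef",              # placeholder
    access_token="<oauth-access-token-or-pat>",                    # short-lived OAuth token in our case
) as conn:
    with conn.cursor() as cur:
        cur.execute("SELECT * FROM catalog.schema.big_table")      # placeholder, ~350m rows
        total = 0
        while True:
            rows = cur.fetchmany(100_000)  # results are pulled in chunks for hours
            if not rows:
                break
            total += len(rows)
            # If the access token expires during this loop, the fetch aborts with
            # "Token expired while fetching results: TEAuthTokenExpired"
            # and the whole refresh has to start over.
        print(f"Fetched {total} rows")
```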
09-04-2023 08:46 AM
Hi @Retired_mod,
Thank you for the response! The problem isn't in the queries; it is just the volume of the data. Loading the data simply takes hours and hours.
Where can I find some of the resources or tutorials about the OAuth token?
I am curious whether requesting a new access token would help us in this case, since it fails while loading e.g. 200m records. When you have already loaded e.g. 100m records, I can imagine that refreshing the token might cause problems because the state of the load could be lost?
Thanks!
09-04-2023 08:57 AM
Thank you! I will take a look at it tomorrow.
I just can't imagine that Databricks isn't doing something about this problem, because we can't be the only ones who want to load in a lot of data!
I know it isn't a best practice, and we would like to aggregate more in the DWH, but selling a client a product that can't load data for more than an hour is a serious risk for the product, if you ask me.
09-07-2023 07:25 AM
Currently our solution to this problem is to use a Personal Access Token as the authentication method. I did run into the problem that when the dataset refresh was scheduled via Power BI, it fell back to OAuth authentication. Still checking whether the problem persists.
If this works, we will use OAuth for the datasets that take less than an hour and the personal access token for datasets that take more than an hour. I am also looking into whether I can generate a personal access token via a service principal instead of a user.
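For the service principal route, this is roughly what I am experimenting with: a hedged sketch that mints a PAT on behalf of a service principal via the Databricks Token Management API (this needs workspace admin rights; the workspace URL, admin token and application ID are placeholders):

```python
# Hedged sketch: create a Databricks PAT on behalf of a service principal
# using the Token Management API. Requires workspace admin permissions.
# Workspace URL, admin token and application ID below are placeholders.
import requests

WORKSPACE_URL = "https://adb-0000000000000000.0.azuredatabricks.net"  # placeholder
ADMIN_TOKEN = "<admin PAT or Azure AD token>"                         # placeholder
SP_APPLICATION_ID = "<service principal application (client) ID>"     # placeholder

resp = requests.post(
    f"{WORKSPACE_URL}/api/2.0/token-management/on-behalf-of/tokens",
    headers={"Authorization": f"Bearer {ADMIN_TOKEN}"},
    json={
        "application_id": SP_APPLICATION_ID,
        "comment": "Power BI refresh for large datasets",
        "lifetime_seconds": 90 * 24 * 3600,  # e.g. 90 days, if workspace policy allows
    },
    timeout=30,
)
resp.raise_for_status()
token_value = resp.json()["token_value"]
# Use token_value as the data source credential in Power BI instead of a
# token tied to a personal user account.
print("Token created (store it securely):", token_value[:8] + "...")
```

That way the token would be owned by the service principal rather than an individual user, so it doesn't break when someone leaves the team.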
06-17-2024 09:24 PM - edited 06-18-2024 06:36 AM
@pauloquantile Hi, Paulo. We also have 350+ million records and I am facing the same issue. Is there any workaround for this?
06-23-2024 11:45 PM
I mentioned it in the thread on 09/07/2023!
07-23-2024 11:48 PM
@Retired_mod Is there any further update from Databricks that could help here, or is the workaround @pauloquantile mentioned the only solution?