Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.
I have tried all the answers from the internet and Stack Overflow many times. I had already created the config section before this step, and it passed, but the step below is not executing.
We were getting this problem when using directory-scoped SAS tokens. While I know there are a number of potential issues that can cause this problem, one potential explanation is an undocumented Spark setting needed on the ...
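The post above is truncated, so the specific setting is not visible. For context, here is a minimal sketch of the usual SAS configuration for ABFS access, assuming ADLS Gen2 and a fixed (non-rotating) token; the account name and secret scope/key are placeholders:

```python
# Minimal sketch: fixed SAS token auth for ADLS Gen2 over abfss://.
# "mystorageacct" and the secret scope/key are placeholders; in a Databricks
# notebook, `spark` and `dbutils` are predefined.
storage_account = "mystorageacct"

spark.conf.set(
    f"fs.azure.account.auth.type.{storage_account}.dfs.core.windows.net",
    "SAS")
spark.conf.set(
    f"fs.azure.sas.token.provider.type.{storage_account}.dfs.core.windows.net",
    "org.apache.hadoop.fs.azurebfs.sas.FixedSASTokenProvider")
spark.conf.set(
    f"fs.azure.sas.fixed.token.{storage_account}.dfs.core.windows.net",
    dbutils.secrets.get(scope="my-scope", key="sas-token"))
```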
Hello, I'm trying to use the CDM connector for Spark, but I can't connect to the Azure storage account when using the connector. I mounted a container of the storage account with a SAS token. When I try to read CDM data from a (mounted) storage acco...
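For reference, a sketch of mounting a container with a SAS token via dbutils.fs.mount, roughly as the question describes; the container, account, mount point, and secret names are assumptions:

```python
# Sketch: mount a Blob Storage container using a SAS token.
# All names below are placeholders.
container = "mycontainer"
account = "mystorageacct"

dbutils.fs.mount(
    source=f"wasbs://{container}@{account}.blob.core.windows.net",
    mount_point=f"/mnt/{container}",
    extra_configs={
        f"fs.azure.sas.{container}.{account}.blob.core.windows.net":
            dbutils.secrets.get(scope="my-scope", key="sas-token")
    },
)
```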
Hi @Martijn de Bruijn, great to meet you, and thanks for your question! Let's see if your peers in the community have an answer to your question. Thanks.
Hi, I am trying to connect to the Storage Account using a SAS token and receive this error: Unable to load SAS token provider class: java.lang.IllegalArgumentException (more in the attached picture). I couldn't find anything on the web for this error. I also ...
@Retko Okter: It seems that there is an issue with the SAS token provider class. This error can occur when the SAS token is not correctly formatted or is invalid. Here are some steps you can try to resolve the issue: Verify that the SAS token is corre...
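The reply above is truncated; as a hedged illustration of its first step, one way to check which provider class is configured and whether the token actually grants access (account and container names are placeholders):

```python
# Sketch: "Unable to load SAS token provider class" usually means the class
# named in fs.azure.sas.token.provider.type cannot be found or instantiated.
# Inspect the configured value for typos; the account name is a placeholder.
account = "mystorageacct"
provider_key = f"fs.azure.sas.token.provider.type.{account}.dfs.core.windows.net"
print(spark.conf.get(provider_key, "not set"))

# The fixed-token provider bundled with hadoop-azure has this exact name:
#   org.apache.hadoop.fs.azurebfs.sas.FixedSASTokenProvider

# Once the configuration is in place, a simple access check:
display(dbutils.fs.ls(f"abfss://mycontainer@{account}.dfs.core.windows.net/"))
```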
In my scenario, the new data coming in are the current, valid records. Any records that are not in the new data should be labeled "Gone", any matching records should be labeled "Updated", and finally, any new records should be added. So in sum...
Detecting deletions does not work out of the box. The MERGE statement will evaluate the incoming data against the existing data; it will not check the existing data against the incoming data. To mark deletions, you will have to specifically update tho...
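The reply is truncated, but on recent runtimes this pattern can be expressed in one statement: Delta Lake 2.3+ (Databricks Runtime 12.2 LTS and above) supports WHEN NOT MATCHED BY SOURCE. A sketch, assuming a Delta table named target keyed by id with a status column, and a staged view incoming holding the new snapshot (all names are placeholders):

```python
# Sketch: flag updates, inserts, and disappearances in a single MERGE.
# Requires Delta Lake 2.3+ / DBR 12.2 LTS+ for WHEN NOT MATCHED BY SOURCE.
spark.sql("""
    MERGE INTO target AS t
    USING incoming AS s
      ON t.id = s.id
    WHEN MATCHED THEN
      UPDATE SET t.status = 'Updated'          -- record still present: refresh
    WHEN NOT MATCHED THEN
      INSERT (id, status) VALUES (s.id, 'New') -- brand-new record
    WHEN NOT MATCHED BY SOURCE THEN
      UPDATE SET t.status = 'Gone'             -- absent from new data: mark it
""")
```

On older runtimes without WHEN NOT MATCHED BY SOURCE, the same effect takes a separate UPDATE against target rows whose keys are missing from incoming.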
I am trying to convert SAS files to CSV in Azure Databricks. The SAS files are in Azure Blob Storage. I am able to mount the Azure blob in Databricks, but when I read from it, it shows no files even though there are files in the Blob. Has anyone done this...
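As a sketch of the conversion itself (separate from the empty-mount issue), one can read .sas7bdat files with the open-source spark-sas7bdat package and write CSV; the package choice and paths here are assumptions:

```python
# Sketch: convert a SAS dataset to CSV, assuming the saurfang:spark-sas7bdat
# package is installed on the cluster. Paths are placeholders; an empty
# listing on the mount usually points at a wrong source path or a SAS token
# without list/read permissions.
df = (spark.read
      .format("com.github.saurfang.sas.spark")
      .load("/mnt/mycontainer/data/input.sas7bdat"))

(df.write
   .option("header", True)
   .mode("overwrite")
   .csv("/mnt/mycontainer/data/output_csv"))
```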