- 4214 Views
- 3 replies
- 1 kudos
Hi All, quick question: is this a correct data flow pattern: Databricks -> Az SQL -> Tableau? Or does it have to go through ADLS: Databricks -> ADLS -> Az SQL -> Tableau? Also, is it better to leverage the Databricks lakehouse SQL warehouse capability as ...
Latest Reply
I would not call it 'better' per se. A lakehouse is a more modern approach to a classic data warehouse, using flexible distributed cloud compute, cheap storage, and open file formats. If you have an existing environment, which works well, that is heavi...
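For the first pattern asked about (Databricks -> Az SQL, no ADLS hop), a minimal sketch of the JDBC write leg looks like the following. The server, database, table, and credential names are hypothetical placeholders, and the actual `df.write.jdbc` call is shown commented out since it needs a live Spark session and SQL server:

```python
# Sketch of the "Databricks -> Azure SQL" leg, assuming a Spark DataFrame `df`
# already exists in the notebook session. All names below are hypothetical.

jdbc_url = (
    "jdbc:sqlserver://myserver.database.windows.net:1433;"  # hypothetical host
    "database=mydb;encrypt=true;loginTimeout=30;"
)

connection_properties = {
    "user": "etl_user",            # hypothetical; prefer a Databricks secret scope
    "password": "<from-secret>",   # never hard-code credentials
    "driver": "com.microsoft.sqlserver.jdbc.SQLServerDriver",
}

# In a Databricks notebook, this single write pushes curated data straight to
# Azure SQL -- no intermediate ADLS landing step is required for this pattern:
# df.write.jdbc(url=jdbc_url, table="dbo.sales_curated",
#               mode="overwrite", properties=connection_properties)
```

The ADLS hop (Databricks -> ADLS -> Az SQL) is only needed when something else, such as Azure Data Factory, does the load into SQL; Spark itself can write directly over JDBC.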
- 8160 Views
- 1 replies
- 0 kudos
Hi community! I was in a Databricks webinar, and one of the participants said, "Delta Live Tables seems to have some limitations when using with Unity Catalog. Is the idea to get parity with Hive?" and someone answered, "DLT + Unity Catalog combination h...
- 30492 Views
- 1 replies
- 1 kudos
Hashes are commonly used in SCD2 merges to determine whether data has changed, by comparing the hashes of the new rows in the source with the hashes of the existing rows in the target table. PySpark offers multiple hashing functions, such as: MD5...
Latest Reply
Hi @Retired_mod, thank you for your comprehensive answer. What is your opinion on the trade-off between using a hash like xxHASH64, which returns a LongType column and thus would offer good performance when there is a need to join on the hash column, v...
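The trade-off discussed above can be sketched outside Spark with plain Python: Spark's `md5` yields a 32-character hex string (128 bits) per row, while `xxhash64` yields a 64-bit integer that is cheaper to store and join on, at the cost of a larger collision probability. The column names and rows below are hypothetical, and `hashlib` stands in for the Spark functions purely for illustration:

```python
import hashlib

# Unit separator avoids accidental collisions such as ("ab","c") vs ("a","bc"),
# mirroring the delimiter argument of Spark's concat_ws.
SEP = "\x1f"

def row_hash(row: dict, tracked_cols: list) -> str:
    """Concatenate tracked columns and hash them -- the usual
    md5(concat_ws(...)) pattern for SCD2 change detection."""
    payload = SEP.join("" if row[c] is None else str(row[c]) for c in tracked_cols)
    return hashlib.md5(payload.encode("utf-8")).hexdigest()

def row_hash64(row: dict, tracked_cols: list) -> int:
    """64-bit variant (the xxHASH64 trade-off): a compact integer key that
    joins fast, but with a smaller output space than 128-bit MD5."""
    return int(row_hash(row, tracked_cols)[:16], 16)  # truncation is illustrative only

# Hypothetical source and target rows for one business key:
tracked = ["name", "city"]
target_row       = {"name": "Alice", "city": "Oslo"}
source_unchanged = {"name": "Alice", "city": "Oslo"}
source_changed   = {"name": "Alice", "city": "Bergen"}

assert row_hash(source_unchanged, tracked) == row_hash(target_row, tracked)  # no change -> skip
assert row_hash(source_changed, tracked)   != row_hash(target_row, tracked)  # change -> expire + insert
```

In the merge itself, equal hashes mean the row is skipped, while a differing hash closes the current SCD2 record and inserts a new version.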
- 1882 Views
- 0 replies
- 0 kudos
Hello, I am using Delta Live Tables to store data and then trying to save them to ADLS. I've specified the storage location of the Delta Live Tables in my Delta Live Tables pipeline. However, when I check the files that are saved in ADLS, they are cor...