- 3133 Views
- 1 replies
- 0 kudos
I am getting this type of error: "com.databricks.spark.sqldw.SqlDWConnectorException: Exception encountered in Azure Synapse Analytics connector code" while loading data into Synapse. I am able to load other tables with the same connection. Not sur...
Latest Reply
The error message indicates that an exception occurred in the Azure Synapse Analytics connector code while loading data into Synapse. This can happen for various reasons, such as issues with the table structure, data type compatibilit...
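For reference, a minimal sketch of the write path this connector uses; the JDBC URL, tempDir, and table name below are placeholders, and maxStrLength is one option worth checking when the failure looks data-type related (string columns map to NVARCHAR(256) by default):

```python
# Sketch of a Synapse write with the Databricks Synapse connector.
# `df` is the DataFrame being loaded; URL, tempDir, and table are placeholders.
(df.write
   .format("com.databricks.spark.sqldw")
   .option("url", "jdbc:sqlserver://<server>.database.windows.net:1433;database=<db>")
   .option("tempDir", "abfss://<container>@<account>.dfs.core.windows.net/synapse-tmp")
   .option("forwardSparkAzureStorageCredentials", "true")
   .option("maxStrLength", "4000")  # widen string columns if NVARCHAR(256) truncates
   .option("dbTable", "dbo.sensor_data")
   .mode("append")
   .save())
```

Comparing these options against the tables that do load successfully can help isolate whether the problem is in the connector configuration or in that one table's schema.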
by JH • New Contributor II
- 8817 Views
- 0 replies
- 0 kudos
Hi Databricks,
We want to understand a vulnerability issue - CVE-2023-22946.
Does this CVE affect Azure Databricks users? Does it affect the data plane? Is it managed by Databricks? Should end users do anything to mitigate it if this issue is a shared ...
by Teja07 • New Contributor II
- 2829 Views
- 1 replies
- 0 kudos
We would like to know the number of DBUs utilized at the cluster level/workspace level for our workspace hosted in Azure. We were able to see the amount spent but not the number of DBUs. If we get any idea of the number of DBUs, it would help us think about reserved DB...
Latest Reply
@Mani Teja G We don't have a screen like the AWS one for Azure, but in Azure you can monitor everything in Cost Management --> Cost Analysis and select the Databricks tags.
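If the workspace has Unity Catalog system tables enabled, DBUs can also be aggregated directly in a notebook; this is a sketch, and availability of the system.billing.usage table depends on your account setup:

```python
# Sketch: DBUs per day, per SKU, and per cluster from the billing system table.
# Assumes system.billing.usage is enabled for the account.
from pyspark.sql import functions as F

usage = spark.table("system.billing.usage")

dbus = (usage
        .where(F.col("usage_unit") == "DBU")
        .groupBy("usage_date", "sku_name",
                 F.col("usage_metadata.cluster_id").alias("cluster_id"))
        .agg(F.sum("usage_quantity").alias("dbus"))
        .orderBy("usage_date"))

display(dbus)  # display() is available in Databricks notebooks
```

Grouping by workspace_id instead gives the workspace-level view asked about above.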
- 1520 Views
- 1 replies
- 1 kudos
Is there a way to get the Databricks units (DBUs) for an existing Azure Databricks workspace?
Latest Reply
I'm keen to know this as well.
- 7031 Views
- 4 replies
- 7 kudos
I have sensor data coming into Azure Event Hub and need some help deciding how best to ingest it into the Data Lake and Delta Lake:
Option 1:
Azure Event Hub > Databricks Structured Streaming > Delta Lake (bronze)
Option 2:
Azure Event Hub > e...
Latest Reply
If a batch job is possible and you need to process the data, I would probably use: Azure Event Hub (events after the previous job run) > Databricks job processing as a DataFrame > save the DataFrame to Delta Lake. No streaming or capturing is needed in that case.
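One way to get those "events after the previous job run" semantics without a long-running stream is a scheduled Structured Streaming job with a one-shot trigger; a sketch, assuming the azure-eventhubs-spark connector is installed on the cluster, with the connection string and paths as placeholders:

```python
# Scheduled job that drains only the events that arrived since the last run,
# tracked via the streaming checkpoint, then stops.
# Assumes the azure-eventhubs-spark connector; the connection string and
# paths below are placeholders.
conn_str = "<event-hub-connection-string>"
eh_conf = {
    # the connector expects the connection string encrypted with its helper
    "eventhubs.connectionString":
        sc._jvm.org.apache.spark.eventhubs.EventHubsUtils.encrypt(conn_str),
}

events = (spark.readStream
          .format("eventhubs")
          .options(**eh_conf)
          .load()
          .selectExpr("CAST(body AS STRING) AS body", "enqueuedTime"))

(events.writeStream
 .format("delta")
 .option("checkpointLocation", "/mnt/lake/_checkpoints/sensors_bronze")
 .trigger(once=True)  # process what's new since the previous run, then stop
 .start("/mnt/lake/bronze/sensors"))
```

The checkpoint replaces manual offset bookkeeping, so each run picks up exactly where the previous one left off.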