Solace to Azure Data Lake Storage
08-08-2024 08:25 PM
Hi Team,
What is the most effective method for performing data ingestion from Solace to Azure Data Lake Storage (ADLS) utilizing an Azure Databricks notebook? Any recommendations would be greatly appreciated.
Regards,
Phani
Labels:
- Delta Lake
2 REPLIES
3 weeks ago
Hi Phani,
You can use our Spark Micro Integration to bridge the stream from Solace to Databricks.
Refer to the link below for more details.
https://solace.com/integration-hub/apache-spark/
Regards,
Suresh Palaka
3 weeks ago
Here is a sample script to invoke the connector:
val struct_stream = spark.readStream.format("solace")
.option("host", "")
.option("vpn", "")
.option("username", "")
.option("password", "")
.option("queue", "")
.option("connectRetries", 3)
.option("reconnectRetries", 3)
.option("batchSize", 10000)
.option("partitions", 1)
.load()
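
To land the data in ADLS, the stream returned above can then be written out with a streaming sink. Below is a minimal sketch, assuming the cluster already has credentials configured for the storage account (for example via a service principal) and using a Delta sink; the abfss:// paths, checkpoint location, and trigger interval are placeholders to adjust for your environment.

import org.apache.spark.sql.streaming.Trigger

// Minimal sketch: write the Solace stream into ADLS Gen2 as a Delta table.
// Replace <container> and <storage-account> with your own values.
val query = struct_stream.writeStream
  .format("delta")
  .option("checkpointLocation", "abfss://<container>@<storage-account>.dfs.core.windows.net/checkpoints/solace")
  .outputMode("append")
  .trigger(Trigger.ProcessingTime("1 minute"))     // micro-batch interval; tune as needed
  .start("abfss://<container>@<storage-account>.dfs.core.windows.net/raw/solace")

query.awaitTermination()

From there you can read the Delta table in the notebook or register it in the metastore for downstream consumption.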

