04-27-2023 10:27 AM
Hi Experts,
We have a use case where a batch load creates a DataFrame at the end, and we now want to push that data to an Azure Storage Queue/Message Queue so that a REST API can later read the messages from the queue and process them accordingly.
Is this possible? If not, what would be the best approach to follow here?
Thanks.
- Labels:
  - Azure databricks
  - AzureStorage
  - MessageQueue
Accepted Solutions
04-28-2023 10:32 AM
@Sanjoy Sen:
Yes, it is possible to load the data from a DataFrame into an Azure Storage Queue/Message Queue. Here's one possible approach:
- Convert each row of the DataFrame into a JSON string using the toJSON() method.
- Use the Azure Storage Queue client library for Python to create a queue client object and send each JSON string as a message to the queue. You will need to authenticate with the Azure storage account using a connection string or credentials.
Here's some example code to get you started:
from azure.storage.queue import QueueClient

# toJSON() returns each row already serialized as its own JSON string
json_rows = df.toJSON().collect()

# Create a queue client object, authenticating with the storage account connection string
queue_client = QueueClient.from_connection_string(
    conn_str="<your-connection-string>",
    queue_name="<your-queue-name>",
)

# Send each row's JSON string as a separate message on the queue
for row in json_rows:
    queue_client.send_message(row)
In the code above, replace <your-connection-string> with the connection string for your Azure storage account and <your-queue-name> with the name of your queue. Note that toJSON() already produces JSON strings, so each row can be sent as-is; re-serializing a row with json.dumps() would double-encode the payload. Also keep in mind that an individual queue message is limited to 64 KB, and make sure to handle any errors or exceptions that may occur while sending messages.
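For the consuming side mentioned in the question (a REST API reading the messages later), a minimal sketch with the same library could look like the following; the messages_per_page and visibility_timeout values are illustrative choices, not requirements, and the placeholders are the same as above:

import json
from azure.storage.queue import QueueClient

queue_client = QueueClient.from_connection_string(
    conn_str="<your-connection-string>",
    queue_name="<your-queue-name>",
)

# Fetch up to 32 messages per page; each message stays hidden from other
# consumers for 60 seconds while it is being processed
for msg in queue_client.receive_messages(messages_per_page=32, visibility_timeout=60):
    try:
        record = json.loads(msg.content)  # one DataFrame row per message
        # ... process the record here ...
        queue_client.delete_message(msg)  # delete only after successful processing
    except Exception:
        # Leave the message on the queue; it becomes visible again after the
        # visibility timeout and will be retried
        pass

Deleting a message only after it has been processed successfully gives you at-least-once delivery, so the downstream processing should be written to be idempotent.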
04-28-2023 09:26 AM
@Suteja Kanuri, looking for your input here. Thanks.
04-28-2023 09:59 PM
@Suteja Kanuri, excellent! I would like to thank you once again.

