Loading data from dataframe to Azure Storage Queue/Message Queue.

sensanjoy
Contributor

Hi Experts,

We have a use case where a batch load produces a DataFrame at the end, and we now want to load this data into an Azure Storage Queue/Message Queue so that a REST API can later read the data/messages from the queue and process them accordingly.

Is this possible? If not, what would be the best approach to follow here?

Thanks.

1 ACCEPTED SOLUTION


Anonymous
Not applicable

@Sanjoy Sen​ :

Yes, it is possible to load the data from a DataFrame into an Azure Storage Queue/Message Queue. Here's one possible approach:

  1. Convert each row of the DataFrame into a JSON string using the toJSON() method.
  2. Use the Azure Storage Queue client library for Python to create a queue client object and send each JSON string as a message to the queue. You will need to authenticate with the Azure storage account using a connection string or credentials.

Here's some example code to get you started:

from azure.storage.queue import QueueClient

# Convert each DataFrame row to a JSON string and collect to the driver
json_rows = df.toJSON().collect()

# Create a queue client object from the storage account connection string
queue_client = QueueClient.from_connection_string(
    conn_str="<your-connection-string>",
    queue_name="<your-queue-name>",
)

# Send each JSON row as a message to the queue.
# toJSON() already returns serialized JSON strings, so there is no need
# to call json.dumps() again (doing so would double-encode the payload).
for row in json_rows:
    queue_client.send_message(row)

In the above code, replace <your-connection-string> with the connection string for your Azure storage account and <your-queue-name> with the name of your queue. Keep in mind that collect() brings all rows to the driver, so this pattern suits modest batch sizes, and that an individual Azure Storage Queue message is limited to 64 KiB. Also, make sure to handle any errors or exceptions that may occur during the message sending process.
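If sending one message per row is too chatty, one option is to pack several rows into each message while respecting the 64 KiB limit. The helper below is a minimal sketch (the function name `batch_rows` and the usage with `queue_client` are illustrative, not part of the Azure SDK): it groups the per-row JSON strings from toJSON() into JSON-array messages that stay under a configurable byte limit.

```python
import json

# Azure Storage Queue messages are capped at 64 KiB.
MAX_MESSAGE_BYTES = 64 * 1024

def batch_rows(json_rows, limit=MAX_MESSAGE_BYTES):
    """Yield JSON-array strings, each at most `limit` bytes when UTF-8 encoded.

    `json_rows` is an iterable of already-serialized JSON strings, e.g. the
    result of df.toJSON().collect().
    """
    batch = []
    size = 2  # account for the enclosing "[" and "]"
    for row in json_rows:
        row_size = len(row.encode("utf-8")) + 1  # +1 for a separating comma
        if batch and size + row_size > limit:
            yield "[" + ",".join(batch) + "]"
            batch, size = [], 2
        batch.append(row)
        size += row_size
    if batch:
        yield "[" + ",".join(batch) + "]"
```

Each yielded string can then be passed to queue_client.send_message(...), and the consuming REST API can json.loads() a message back into a list of rows. Note the sketch does not split a single oversized row; rows larger than the limit would still need special handling.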


3 REPLIES

sensanjoy
Contributor

@Suteja Kanuri​  looking for your input here. Thanks.


@Suteja Kanuri​ excellent, and I would like to thank you once again.
