Can we use Spark Structured Streaming to read/write data from MySQL? I can't find an example.

sarvesh
Contributor III

If someone can link me an example where a stream is used to read from or write to MySQL, please do.

1 ACCEPTED SOLUTION

Hubert-Dudek
Esteemed Contributor III

Writing (as a sink) is possible without problems via foreachBatch.

I use it in production: a stream auto-loads CSVs from the data lake and writes each batch to SQL with foreachBatch (inside the foreachBatch function you get a temporary DataFrame with the micro-batch's records, which you can write to any JDBC or ODBC target).

Here are more details:

https://docs.databricks.com/spark/latest/structured-streaming/foreach.html
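
A minimal sketch of this pattern (the host, credentials, table, and paths below are placeholders, and it assumes the MySQL JDBC driver is available on the cluster):

```python
# Sketch: Auto Loader reads CSVs from the lake, and foreachBatch writes
# each micro-batch to MySQL over JDBC. All names below are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

def write_to_mysql(batch_df, batch_id):
    # batch_df is an ordinary (non-streaming) DataFrame here, so the
    # regular batch JDBC writer is available.
    (batch_df.write
        .format("jdbc")
        .option("url", "jdbc:mysql://<host>:3306/<database>")
        .option("dbtable", "<table>")
        .option("user", "<user>")
        .option("password", "<password>")
        .option("driver", "com.mysql.cj.jdbc.Driver")
        .mode("append")
        .save())

(spark.readStream
    .format("cloudFiles")                                 # Databricks Auto Loader
    .option("cloudFiles.format", "csv")
    .option("cloudFiles.schemaLocation", "/tmp/schema")   # placeholder path
    .load("/mnt/datalake/input/")                         # placeholder path
    .writeStream
    .foreachBatch(write_to_mysql)
    .option("checkpointLocation", "/tmp/checkpoint")      # placeholder path
    .start())
```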

Reading a stream from MySQL is not the best architecture and is officially not supported. Theoretically you could create a custom receiver, but a better idea is to also publish whatever you save to MySQL to Kafka or some other broker/queue.

An easy workaround (though it only makes sense if you get fewer than about one new record per second) is to use Azure Logic Apps: a new record in MySQL acts as the trigger and appends the data to Event Hubs, which Databricks then reads as a stream.
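
On the read side, a minimal sketch of that suggested architecture, consuming the records from a broker instead of from MySQL directly (Azure Event Hubs also exposes a Kafka-compatible endpoint, so the same code applies; the broker address and topic are placeholders):

```python
# Sketch: read the records as a stream from Kafka (or Event Hubs via its
# Kafka-compatible endpoint) rather than from MySQL directly.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.getOrCreate()

events = (spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "<broker>:9092")  # placeholder
    .option("subscribe", "<topic>")                      # placeholder
    .option("startingOffsets", "latest")
    .load())

# Kafka delivers key/value as binary; cast the value to string before parsing.
query = (events
    .select(col("value").cast("string").alias("payload"))
    .writeStream
    .format("console")
    .option("checkpointLocation", "/tmp/kafka-checkpoint")  # placeholder
    .start())
```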


4 REPLIES

Kaniz
Community Manager

Hi @sarvesh! My name is Kaniz, and I'm the technical moderator here. Great to meet you, and thanks for your question! Let's see if your peers in the community have an answer to your question first; otherwise I will get back to you soon. Thanks.

sarvesh
Contributor III

Thank you for taking some time out to reply. As of now I have not found a structured way.


Great example of your use case, @Hubert Dudek
