Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

How to prevent SQL queries in 2 notebooks from reading the same row from a table?

KrishZ
Contributor

I have an SQL query to select and update rows in a table.

I do this in batches of 300 rows (select 300, update the selected 300, select a new 300, update those, and so on), roughly as sketched below.

I run this query in 2 different notebooks concurrently to speed up my processing.

Can someone tell me how to prevent the same row from being selected by the SQL query in both notebooks?
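For reference, a minimal sketch of the loop each notebook runs (illustrative only; `my_table`, `id`, and `processed` are placeholder names, and I'm assuming a Delta table so UPDATE is supported):

# Rough sketch of the per-notebook loop (not the actual code);
# `my_table`, `id`, and `processed` are placeholder names, and the
# table is assumed to be a Delta table so UPDATE works.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

while True:
    # Select the next batch of 300 unprocessed rows.
    batch = spark.sql("SELECT id FROM my_table WHERE processed = false LIMIT 300")
    ids = [row.id for row in batch.collect()]
    if not ids:
        break  # nothing left to process

    # ... per-row processing happens here ...

    # Mark the selected rows as processed.
    id_list = ",".join(str(i) for i in ids)
    spark.sql(f"UPDATE my_table SET processed = true WHERE id IN ({id_list})")

Running this same loop in two notebooks at once is where the overlap comes from: both SELECTs can return the same 300 rows before either UPDATE lands.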

4 REPLIES

PriyaAnanthram
Contributor III

It's not too clear what you're trying to do, but I'd try to break the dataset up logically, maybe using a column like date, and then process the partitions separately. Hope that helps.

PriyaAnanthram
Contributor III

Maybe a row number column that you add could help you run it in batches of 300.
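For example, something along these lines, a minimal sketch assuming a Delta table `my_table` with an `id` column (all names are placeholders):

# Sketch: assign a stable row number once, then read fixed ranges of 300.
# `my_table`, `my_table_numbered`, and `id` are placeholder names.
from pyspark.sql import SparkSession
from pyspark.sql.functions import row_number
from pyspark.sql.window import Window

spark = SparkSession.builder.getOrCreate()

# One-time step: persist a row number so batch boundaries are stable.
numbered = (spark.table("my_table")
            .withColumn("rn", row_number().over(Window.orderBy("id"))))
numbered.write.format("delta").mode("overwrite").saveAsTable("my_table_numbered")

# Each pass reads one contiguous range of 300 rows.
batch_size = 300
start = 1  # e.g. notebook 1 starts at 1, notebook 2 at 500001
batch = spark.sql(f"""
    SELECT * FROM my_table_numbered
    WHERE rn >= {start} AND rn < {start + batch_size}
""")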

KrishZ
Contributor

Yup, that's exactly my current plan, but there's an issue here. I'll explain:

I have 1 million rows in my table

Let's say I give row 1 to row 500,000 to the 1st notebook,

and

row 500,001 to row 1,000,000 to the 2nd notebook.

What if the data is such that the first half (rows 1 to 500,000) takes a tenth of the processing time of the second half (rows 500,001 to 1,000,000)?

You can see how this means Notebook 1 could finish way before Notebook 2. Ideally both notebooks should run for roughly the same amount of time so the whole job finishes as fast as possible.

Hence, predetermining the dataset for each notebook is not efficient. Each notebook should dynamically pick up a new batch (of 300 rows) as soon as it finishes its current batch. But my problem is that both notebooks might end up picking up the same batch. One idea I'm considering is sketched below.
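The idea is to add a claim column and let Delta's write-conflict detection arbitrate between the notebooks. A rough sketch, where `my_table`, `id`, `claimed_by`, and `processed` are placeholder names, and I'm assuming that conflicting concurrent UPDATEs on the Delta table raise an exception that can simply be retried:

# Sketch of a claim-a-batch pattern. Assumes a Delta table `my_table` with
# placeholder columns `id`, `claimed_by` (NULL = unclaimed) and `processed`,
# and that Delta's optimistic concurrency makes one of two conflicting
# UPDATEs fail, so the loser can back off and retry.
import time
import uuid
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
worker_id = str(uuid.uuid4())  # unique id for this notebook run

def claim_batch(batch_size=300, max_retries=5):
    """Try to mark up to `batch_size` unclaimed rows as ours, then read them back."""
    for attempt in range(max_retries):
        # Both notebooks may see overlapping candidates here...
        candidates = spark.sql(f"""
            SELECT id FROM my_table
            WHERE claimed_by IS NULL AND processed = false
            LIMIT {batch_size}
        """).collect()
        if not candidates:
            return None  # nothing left to claim
        id_list = ",".join(str(r.id) for r in candidates)
        try:
            # ...but the claimed_by IS NULL guard plus Delta's write-conflict
            # detection means each row ends up claimed by at most one worker.
            spark.sql(f"""
                UPDATE my_table
                SET claimed_by = '{worker_id}'
                WHERE claimed_by IS NULL AND id IN ({id_list})
            """)
            return spark.sql(f"""
                SELECT * FROM my_table
                WHERE claimed_by = '{worker_id}' AND processed = false
            """)
        except Exception:
            time.sleep(2 ** attempt)  # likely a write conflict: back off, retry
    return None

while True:
    batch = claim_batch()
    if batch is None:
        break  # no unclaimed rows remain
    # ... process the rows in `batch` here ...
    spark.sql(f"UPDATE my_table SET processed = true WHERE claimed_by = '{worker_id}'")

Each notebook would run this same loop with its own worker_id, so whichever one is free simply claims the next 300 rows instead of working from a fixed range.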

Let me know if that makes sense 🙂

Anonymous
Not applicable

Hi @Krishna Zanwar,

Hope all is well! Just wanted to check in to see if you were able to resolve your issue. If so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help.

We'd love to hear from you.

Thanks!
