
How to control the order of execution of two thread accessing the same shared data

In my example, thread 1 and thread 2 both write to the same shared resource x, and the main thread reads the value. I want the result to be 24, i.e. thread 2 runs before thread 1.

How can I control that? I tried defining thread 2 before thread 1 and printing the result after joining the two threads, and the result was 24, but this result is not guaranteed since the threads work in parallel.

    #include <iostream>
    #include <thread>

    int x = 10;

    void mainThread1() {
        x *= 2;
    }

    void mainThread2() {
        x += 2;
    }

    int main() {
        std::thread th1(mainThread1);
        std::thread th2(mainThread2);

        // unsynchronized read while the threads may still be running:
        // randomly prints 20 or 22 (or another value), never reliably 24
        std::cout << x << std::endl;

        th1.join();
        th2.join();
    }
asked Oct 23 '25 by DoctorX

1 Answer

Presumably you are looking for something other than the trivial answer here: join the first thread before starting the second one. You are showing a simplified example to illustrate a more general situation.
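For completeness, the trivial approach just runs the threads one after the other. A rough sketch, reusing x, mainThread1, and mainThread2 from the question's code:

    int main() {
        std::thread th2(mainThread2); // start the thread that must run first
        th2.join();                   // wait for it to finish
        std::thread th1(mainThread1); // only now start the other thread
        th1.join();
        std::cout << x << std::endl;  // always prints 24: (10 + 2) * 2
    }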

this result is not guaranteed since the threads work in parallel

This is correct. Issues related to the relative ordering of operations between multiple threads, and to when the effects of one execution thread become "visible" in another, fall into a rather broad, complicated, and extensive C++ domain called "synchronization".

Inter-thread synchronization employs mutexes, condition variables, and atomics. Just like every other large C++ topic, it is impractical to describe everything about synchronization in a brief answer on Stack Overflow; that would take pages. Therefore, I'll just provide a capsule summary of how this simple use case would handle synchronization, and refer you to your favorite C++ textbook or reference material for all further gory details. The basic outline of how to do this here is:

  1. Define a mutex, a condition variable, and a bool flag, in addition to x.
  2. The first execution thread locks the mutex, updates x, sets the bool flag, and notifies the condition variable, then releases the mutex.
  3. The second execution thread locks the mutex, then waits on the condition variable, with the condition being flag == true.
  4. The second execution thread updates x, and releases the mutex.

This is the classical solution that guarantees the order of execution between the two execution threads that you are looking for. The above answers how to guarantee that the results of

thread2 then thread1

are produced in the shown code.
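
One way to fill in that outline is sketched below. The flag name thread2_done, the lambda predicate, and moving the print after the joins are illustration choices, not part of the question's code:

    #include <condition_variable>
    #include <iostream>
    #include <mutex>
    #include <thread>

    int x = 10;
    std::mutex m;
    std::condition_variable cv;
    bool thread2_done = false; // has thread 2 finished its update?

    void mainThread1() {
        std::unique_lock<std::mutex> lock(m);
        cv.wait(lock, [] { return thread2_done; }); // block until thread 2 has updated x
        x *= 2;                                     // runs second: (10 + 2) * 2 == 24
    }

    void mainThread2() {
        {
            std::lock_guard<std::mutex> lock(m);
            x += 2;                                 // runs first
            thread2_done = true;
        }
        cv.notify_one();                            // wake the waiting thread
    }

    int main() {
        std::thread th1(mainThread1);
        std::thread th2(mainThread2);

        th1.join();
        th2.join();

        std::cout << x << std::endl;                // always prints 24
    }

Note that main still has to join both threads before reading x; the mutex and condition variable only order the two writers relative to each other.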

answered Oct 25 '25 by Sam Varshavchik


