
Sharing Data (Queue, Pipe, Manager)

Why sharing is different

Each process has its own memory space; unlike threads, nothing is shared by default.

So you can’t safely update a global variable in one process and expect other processes to see the change. Data has to move through an explicit channel.

Use a Queue to send messages/results.

mp_queue.py
from multiprocessing import Process, Queue
 
 
def worker(q: Queue, x: int) -> None:
    q.put(x * x)
 
 
if __name__ == "__main__":
    q = Queue()
 
    procs = [Process(target=worker, args=(q, i)) for i in range(5)]
    for p in procs:
        p.start()
 
    results = [q.get() for _ in procs]  # drain before join(): a worker may not exit until its queued data is consumed
 
    for p in procs:
        p.join()
 
    print(sorted(results))
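For long-lived workers that handle many tasks, a common pattern is to put a sentinel value on the queue once per worker to tell it to stop. A minimal sketch (the `STOP` sentinel and `run` helper are illustrative names, not part of the example above):

```python
# Hypothetical sketch: a sentinel on the task queue tells each worker to exit,
# so one process can handle many tasks instead of one task per process.
from multiprocessing import Process, Queue

STOP = None  # sentinel: a value the workers will never receive as real work


def worker(tasks: Queue, results: Queue) -> None:
    while True:
        item = tasks.get()
        if item is STOP:  # sentinel received: stop looping
            break
        results.put(item * item)


def run(n_workers: int = 2, n_tasks: int = 6) -> list:
    tasks, results = Queue(), Queue()
    procs = [Process(target=worker, args=(tasks, results)) for _ in range(n_workers)]
    for p in procs:
        p.start()
    for i in range(n_tasks):
        tasks.put(i)
    for _ in procs:  # one sentinel per worker
        tasks.put(STOP)
    out = [results.get() for _ in range(n_tasks)]
    for p in procs:
        p.join()
    return sorted(out)


if __name__ == "__main__":
    print(run())  # [0, 1, 4, 9, 16, 25]
```

Results still arrive in arbitrary order, which is why they are sorted before returning.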

Pipe

Pipe is a two-way (duplex) connection between exactly two endpoints.

mp_pipe.py
from multiprocessing import Process, Pipe
 
 
def worker(conn):
    conn.send("hello")
    conn.close()
 
 
if __name__ == "__main__":
    parent, child = Pipe()
    p = Process(target=worker, args=(child,))
    p.start()
    print(parent.recv())
    p.join()
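Because the connection is duplex, the parent can also send a request and read the reply on the same ends. A small sketch (the request/reply protocol here is just an illustration):

```python
# Hypothetical sketch: both ends of a Pipe can send and receive,
# so the parent issues a request and the child replies on the same connection.
from multiprocessing import Process, Pipe


def worker(conn) -> None:
    request = conn.recv()       # wait for the parent's request
    conn.send(request.upper())  # reply over the same connection
    conn.close()


def run() -> str:
    parent, child = Pipe()  # duplex by default
    p = Process(target=worker, args=(child,))
    p.start()
    parent.send("ping")
    reply = parent.recv()
    p.join()
    return reply


if __name__ == "__main__":
    print(run())  # PING
```

Note that a Pipe connects only two endpoints; with several senders the messages can interleave, which is what Queue is for.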

Manager (shared dict/list)

Manager starts a server process and hands out proxy objects (dict, list, Lock, and more) that multiple processes can safely share.

mp_manager.py
from multiprocessing import Process, Manager
 
 
def worker(shared, i):
    shared[i] = i * i
 
 
if __name__ == "__main__":
    with Manager() as manager:
        shared = manager.dict()
 
        procs = [Process(target=worker, args=(shared, i)) for i in range(5)]
        for p in procs:
            p.start()
        for p in procs:
            p.join()
 
        print(dict(shared))
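The example above is safe because each worker writes to its own key. A read-modify-write on one shared value is a race, so pair the Manager with a lock. A sketch under that assumption (the `bump` and `run` helpers are illustrative names):

```python
# Hypothetical sketch: incrementing one shared counter is a read-modify-write,
# so the Manager's Lock is used to serialize it across processes.
from multiprocessing import Process, Manager


def bump(shared, lock, times: int) -> None:
    for _ in range(times):
        with lock:  # without this, concurrent increments can be lost
            shared["count"] = shared["count"] + 1


def run(n_procs: int = 4, times: int = 50) -> int:
    with Manager() as manager:
        shared = manager.dict({"count": 0})
        lock = manager.Lock()
        procs = [Process(target=bump, args=(shared, lock, times)) for _ in range(n_procs)]
        for p in procs:
            p.start()
        for p in procs:
            p.join()
        return shared["count"]


if __name__ == "__main__":
    print(run())  # 200
```

Every access to a proxy object is a round trip to the manager's server process, which is why holding the lock for the whole read-modify-write is both necessary and cheap relative to the proxy overhead itself.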

Guidance

  • Prefer passing data via a Queue when possible.
  • Use Manager for simple shared state; every access goes through a server process, so it is much slower than local memory.

πŸ§ͺ Try It Yourself

Exercise 1 – Start a Process

Exercise 2 – Process Pool map()

Exercise 3 – Multiprocessing Queue
