
Multiprocessing Share Unserializable Objects Between Processes

There are three questions that could be considered duplicates (but they are too specific):

By answering this question, all three other questions can be answered.
Hopefully I make myself clear:

Once I have created an object in some process spawned by multiprocessing:

  1. How do I pass a reference to that object to another process?

  2. (not so important) How do I make sure that this process does not die while I hold a reference?

Example 1 (solved)

from concurrent.futures import ThreadPoolExecutor

def f(v):
    return lambda: v * v

if __name__ == '__main__':
    with ThreadPoolExecutor(1) as e:  # works with ThreadPoolExecutor
        l = list(e.map(f, [1, 2, 3, 4]))
        print([g() for g in l])  # [1, 4, 9, 16]
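For context (not part of the original example): the same code fails with ProcessPoolExecutor because the returned lambda cannot be pickled for transfer between processes. A quick sketch demonstrating that:

```python
import pickle

def f(v):
    # Same closure as in Example 1: a local lambda capturing v.
    return lambda: v * v

try:
    pickle.dumps(f(2))  # lambdas and local functions are not picklable
    picklable = True
except Exception:
    picklable = False

print(picklable)  # False
```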

Example 2

Suppose `f` returns an object with mutable state. This identical object should be accessible from other processes.

Example 3

I have an object which has an open file and a lock - how do I grant access to other processes?
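A minimal sketch of what such an object might look like (the class name `LockedWriter` is made up for illustration), showing that it cannot simply be pickled and sent to another process:

```python
import os
import pickle
import tempfile
import threading

class LockedWriter:
    """Hypothetical stand-in for Example 3: holds an open file and a lock."""
    def __init__(self, path):
        self._file = open(path, "a")
        self._lock = threading.Lock()

    def write(self, text):
        with self._lock:
            self._file.write(text)
            self._file.flush()

fd, path = tempfile.mkstemp()
os.close(fd)
writer = LockedWriter(path)

try:
    pickle.dumps(writer)  # fails: neither file handles nor locks pickle
    picklable = True
except TypeError:
    picklable = False

print(picklable)  # False
```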


I am not just looking for a way to make this specific error disappear, or for a solution to this specific use case. The solution should be general enough to share unmovable objects between processes. The objects can potentially be created in any process. A solution that makes all objects movable and preserves identity would also be good.

Any hints are welcome; any partial solution or code fragment that points at how to implement a solution is worth something, so we can create a solution together.

Here is an attempt to solve this but without multiprocessing:


What do you want the other processes to do with the references?

The references can be passed to any other process created with multiprocessing (duplicate 3). One can access attributes and call the reference. Accessed attributes may or may not be proxies.

What's the problem with just using a proxy?

Maybe there is no problem, but a challenge. My impression was that a proxy has a manager, that a manager has its own process, and so the unserializable object must be serialized and transferred (partially solved with StacklessPython/fork).
Also, proxies exist only for special objects - it is hard but not impossible to build a proxy for all objects (solvable).

Solution? - Proxy + Manager?

Eric Urban showed that serialization is not the problem. The real challenge lies in Examples 2 & 3: the synchronization of state. My idea for a solution would be to create a special proxy class for a manager. This proxy class

  1. takes a constructor for unserializable objects

  2. takes a serializable object and transfers it to the manager process.

  3. (problem) according to 1., the unserializable object must be created in the manager process.


Most of the time, it is not really desirable to pass a reference to an existing object to another process. Instead, you create the class you want to share between processes:

class MySharedClass:
    # stuff...

Then you make a proxy manager like this:

import multiprocessing.managers as m
class MyManager(m.BaseManager):
    pass # Pass is really enough. Nothing needs to be done here.

Then you register your class on that Manager, like this:

MyManager.register("MySharedClass", MySharedClass)

Then, once the manager is instantiated and started with manager.start(), you can create shared instances of your class with manager.MySharedClass(). This should work for all needs. The returned proxy works exactly like the original object, except for some exceptions described in the documentation.