andrea crotti
We need to be able to reload code on a live system. This live system has a daemon process that is always running, but it spawns many short-lived subprocesses with multiprocessing.

I found a way to reload the code successfully, as you can see from this test case:
    import unittest
    from os import path, remove
    from multiprocessing import Process

    def func():
        from . import a
        print(a.ret())

    class TestMultiProc(unittest.TestCase):
        def setUp(self):
            open(path.join(cur_dir, 'a.py'), 'w').write(old_a)

        def tearDown(self):
            remove(path.join(cur_dir, 'a.py'))

        def test_reloading(self):
            """Starting a new process gives a different result."""
            p1 = Process(target=func)
            p2 = Process(target=func)
            p1.start()
            p1.join()  # Process.join() returns None, so there is no result to capture
            open(path.join(cur_dir, 'a.py'), 'w').write(new_a)
            remove(path.join(cur_dir, 'a.pyc'))
            p2.start()
            p2.join()
As long as I import the code inside the function and make sure to remove the .pyc files, everything seems to work.

Are there any problems with this approach that I'm not seeing, or is it safe? Is there a better way?
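To illustrate, here is a minimal, self-contained sketch of the same technique: the parent process never imports the target module, so each child process reads the current source from disk when it imports it. The module name `a` and the functions `worker`/`run_demo` are made up for this example; it writes a throwaway module to a temp directory rather than touching real files, and it uses the fork start method so the worker does not need to live in an importable module.

```python
import os
import sys
import tempfile
from multiprocessing import get_context

def worker(mod_dir, q):
    """Runs in the child: import the module fresh and report its result."""
    sys.path.insert(0, mod_dir)  # only affects this child process
    import a  # hypothetical throwaway module written by run_demo() below
    q.put(a.ret())

def run_demo():
    """Write a module, run it in a subprocess, rewrite it, run again."""
    ctx = get_context('fork')  # fork keeps the example self-contained
    mod_dir = tempfile.mkdtemp()
    mod_path = os.path.join(mod_dir, 'a.py')
    q = ctx.Queue()
    results = []
    for body in ("def ret():\n    return 'old'\n",
                 "def ret():\n    return 'reloaded'\n"):
        with open(mod_path, 'w') as f:
            f.write(body)
        # Each child starts with 'a' absent from sys.modules, so the
        # import inside worker() picks up whatever is on disk right now.
        p = ctx.Process(target=worker, args=(mod_dir, q))
        p.start()
        p.join()
        results.append(q.get())
    return results

if __name__ == '__main__':
    print(run_demo())  # ['old', 'reloaded']
```

Note that on Python 3 the bytecode cache lives in `__pycache__/` rather than an `a.pyc` next to the source, and the import system invalidates it automatically when the source file's size or mtime changes, so deleting it by hand is usually unnecessary there.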