alexanderhiam edited this page May 31, 2012 · 2 revisions

One of the many great things about using single-board Linux computers for hardware expansion is the ability to run processes asynchronously. This can easily be achieved by running PyBBIO programs in the background and managing them with jobs (see here for one of the many tutorials on this process), but sometimes it is more desirable to run multiple processes from the same program, especially if they need to share information. Luckily, Python offers a straightforward way to do this with the built-in multiprocessing module, which provides ways to create and manage separate processes, as well as shared queues and pipes for communicating between them.

One caveat of using the multiprocessing library is that you usually need a way to stop your child processes when the main process wants to stop, or else they could block your program from exiting, or even become runaway daemon processes. The SafeProcess library provides a wrapper for Python's multiprocessing.Process class which acts exactly the same, but is automatically terminated during PyBBIO's cleanup routine.
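The idea behind such a wrapper can be sketched with the standard library alone. This is only an illustrative approximation, not PyBBIO's actual implementation; the class and method names here are hypothetical:

```python
import atexit
import multiprocessing

# Hypothetical sketch of the SafeProcess idea: subclass
# multiprocessing.Process and register a cleanup handler so the
# child is terminated when the parent process exits.
class AutoCleanupProcess(multiprocessing.Process):
  def __init__(self, *args, **kwargs):
    multiprocessing.Process.__init__(self, *args, **kwargs)
    # Run cleanup() during the parent's exit routine:
    atexit.register(self.cleanup)

  def cleanup(self):
    # Stop the child if it is still running:
    if self.is_alive():
      self.terminate()
      self.join()
```

PyBBIO hooks the equivalent step into its own cleanup routine rather than atexit, so the child is stopped whenever the program exits through stop().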

SafeProcess may be used in one of two ways. The simplest is to pass it a function and tell it to start. It will then fork a child process which calls the function once and then exits. If the target function contains an infinite loop, the child process will sit happily in the background running it until the program is stopped. An example of this can be seen in PyBBIO/examples/SafeProcess_test.py.
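Since SafeProcess wraps multiprocessing.Process, the pass-a-function pattern can be sketched with the standard library directly; the target function here is just a placeholder:

```python
import multiprocessing

def report(name):
  # Placeholder target function; a PyBBIO sketch would do its
  # hardware work here instead:
  print("child process %s started" % name)

p = multiprocessing.Process(target=report, args=("worker",))
p.start()  # forks a child that calls report("worker") once, then exits
p.join()   # wait for the child to finish
```

In a PyBBIO program you would construct a SafeProcess the same way and skip the manual join(), letting the cleanup routine handle shutdown.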

The second way to use the SafeProcess class is to inherit from it, overriding the run() method with the code to be executed when the process is started. One thing SafeProcess provides that multiprocessing.Process does not is the config() method. This may be overridden to handle any initialization that doesn't require arguments, without having to override __init__(). For example:

class MyProcess(SafeProcess):
  def config(self):
    self.value = 3

  def run(self):
    while(True):
      print self.value
      self.value += 1
      delay(1000)

If arguments are required, the __init__() method may be overridden, but it must call SafeProcess.__init__():

class MyProcess(SafeProcess):
  def __init__(self, start_value):
    self.value = start_value
    SafeProcess.__init__(self)

  def run(self):
    while(True):
      print self.value
      self.value += 1
      delay(1000)

You would then create an instance of the MyProcess class and tell it to start, so altogether you might have:

from bbio import *
from SafeProcess import *

class MyProcess(SafeProcess):
  def __init__(self, start_value):
    self.value = start_value
    SafeProcess.__init__(self)

  def run(self):
    while(True):
      print self.value
      self.value += 1
      delay(1000)


def setup():
  my_process = MyProcess(100)
  my_process.start()

def loop():
  print "the main loop still executes"
  delay(5000)
  stop()

run(setup, loop)

In any of the above cases, the child process can be killed manually with the SafeProcess.terminate() method.
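For illustration, here is the same terminate() call on a plain multiprocessing.Process, the class SafeProcess inherits it from (the spinning target is just a stand-in for any long-running task):

```python
import multiprocessing
import time

def spin():
  # Stand-in for a long-running child task:
  while True:
    time.sleep(0.1)

p = multiprocessing.Process(target=spin)
p.start()
time.sleep(0.5)    # let the child get going
p.terminate()      # the same method SafeProcess inherits
p.join()           # reap the terminated child
```

Note that terminate() stops the child abruptly, so it should not be used if the child might be holding a lock or writing shared data at the time.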

To see SafeProcess in action, check out the EventLoop class in the EventIO library.

Be sure to read up on the multiprocessing module for details on how to share information between processes.
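As a starting point, a minimal sketch of sharing data through a multiprocessing.Queue (the producer function and its values are just placeholders):

```python
import multiprocessing

def producer(queue):
  # Child process pushes its results into the shared queue:
  for i in range(3):
    queue.put(i * 10)

q = multiprocessing.Queue()
p = multiprocessing.Process(target=producer, args=(q,))
p.start()
readings = [q.get() for _ in range(3)]  # blocks until each item arrives
p.join()
```

The same Queue object can be handed to a SafeProcess subclass through its __init__() arguments, giving the parent and child a channel that survives the fork.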