Inter Process Communication in Python

 

Python Multiprocessing

Parallel processing is getting more and more attention nowadays. If you are not yet familiar with parallel processing, Wikipedia is a good place to start.

As CPU manufacturers keep adding more and more cores to their processors, writing parallel code is a great way to improve performance. Python provides the multiprocessing module to let us write such code.

To understand the main motivation of this module, we have to know some basics of parallel programming. After reading this article, we hope you will have gathered some knowledge on the topic.

Python Multiprocessing Process, Queue and Locks

There are plenty of classes in the Python multiprocessing module for building a parallel program. Among them, three basic classes are Process, Queue and Lock, and they are enough to build a simple parallel program.

But before describing those, let us start the topic with some simple code. To make a parallel program useful, you have to know how many cores there are in your PC. The multiprocessing module lets you find that out. The following simple code prints the number of cores in your PC.


import multiprocessing

print("Number of cpu : ", multiprocessing.cpu_count())

The output may vary for your PC. For me, the number of cores is 8.

(Screenshot: output of the cpu_count() example)

Python multiprocessing Process class

The Python multiprocessing Process class is an abstraction that sets up another Python process, gives it code to run, and gives the parent application a way to control its execution.

There are two important methods that belong to the Process class: start() and join().

First, we need to write a function that will be run by the process. Then we need to instantiate a Process object.

When we create a Process object, nothing happens until we tell it to start running via the start() method. Then the process runs and does its work. After that, we wait for it to finish via the join() method, which blocks until the process has completed.

Without the join() call, the parent program does not wait for the child, so it may move on (or even exit) while the child process is still running, leaving it behind as a lingering process.

So if you create many processes and never join or terminate them, you may run short of system resources, and you may need to kill them manually.
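
If a process really does get stuck, the Process class also provides terminate() for stopping it by force. Below is a minimal sketch of that clean-up step; the busy() function is just a made-up stand-in for a task that never finishes on its own.


from multiprocessing import Process
import time


def busy():
    # hypothetical stand-in for a task that never finishes on its own
    while True:
        time.sleep(1)


if __name__ == "__main__":
    p = Process(target=busy)
    p.start()
    time.sleep(2)     # let the child run for a moment
    p.terminate()     # stop the child by force
    p.join()          # then join to clean it up
    print('still alive?', p.is_alive())   # prints False

Note that terminate() should be a last resort, because the child gets no chance to clean up after itself.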

One important thing: if you want to pass any arguments to the process's target function, you need to use the args keyword argument. The following code will be helpful for understanding the usage of the Process class.


from multiprocessing import Process


def print_func(continent='Asia'):
    print('The name of continent is : ', continent)

if __name__ == "__main__":  # ensures this block runs only when the script is executed directly
    names = ['America', 'Europe', 'Africa']
    procs = []
    proc = Process(target=print_func)  # instantiating without any argument
    procs.append(proc)
    proc.start()

    # instantiating process with arguments
    for name in names:
        # print(name)
        proc = Process(target=print_func, args=(name,))
        procs.append(proc)
        proc.start()

    # complete the processes
    for proc in procs:
        proc.join()

The output of the above code will be:
(Screenshot: output of the Process class example)

Python multiprocessing Queue class

If you have basic knowledge of computer data structures, you probably already know about the queue.

The Python multiprocessing module provides a Queue class that is exactly a First-In-First-Out data structure. It can store any pickleable Python object (though simple ones are best) and is extremely useful for sharing data between processes.

Queues are especially useful when passed as a parameter to a Process's target function so that the process can consume data. Using the put() function we can insert data into the queue, and using get() we can retrieve items from it. See the following code for a quick example.


from multiprocessing import Queue

colors = ['red', 'green', 'blue', 'black']
cnt = 1
# instantiating a queue object
queue = Queue()
print('pushing items to queue:')
for color in colors:
    print('item no: ', cnt, ' ', color)
    queue.put(color)
    cnt += 1

print('\npopping items from queue:')
cnt = 0
while not queue.empty():
    print('item no: ', cnt, ' ', queue.get())
    cnt += 1

(Screenshot: output of the Queue example)

Python multiprocessing Lock Class

The task of the Lock class is quite simple. It allows code to claim a lock so that no other process can execute the protected code until the lock has been released. So the Lock class has mainly two tasks: one is to claim the lock and the other is to release it. To claim the lock, the acquire() function is used, and to release it, the release() function is used.
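
Since the combined example later in this article does not need a Lock, here is a separate minimal sketch of acquire() and release() protecting a shared counter; the deposit() function and the balance Value are made up for illustration.


from multiprocessing import Lock, Process, Value


def deposit(balance, lock):
    # hypothetical worker: each increment is wrapped in acquire()/release()
    for _ in range(100):
        lock.acquire()
        balance.value += 1
        lock.release()


if __name__ == "__main__":
    balance = Value('i', 0)   # shared integer, starts at 0
    lock = Lock()
    procs = [Process(target=deposit, args=(balance, lock)) for _ in range(2)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    print('Final balance:', balance.value)   # 200, since the updates never overlap

The lock can also be used as a context manager (with lock:), which acquires and releases it automatically.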

Python multiprocessing example

In this Python multiprocessing example, we will merge all our knowledge together.

Suppose we have some tasks to accomplish. To get them done, we will use several processes. So we will maintain two queues: one will contain the tasks, and the other will contain the log of completed tasks.

Then we instantiate the processes to complete the tasks. Note that the multiprocessing Queue class is already synchronized, which means we don't need the Lock class to keep multiple processes from accessing the same queue object at the same time. That's why we don't use the Lock class in this example.

Below is the implementation, where we add tasks to the queue, then create the processes and start them, and then use join() to wait for them to finish. Finally, we print the log from the second queue.


from multiprocessing import Lock, Process, Queue, current_process
import time
import queue # imported for using queue.Empty exception


def do_job(tasks_to_accomplish, tasks_that_are_done):
    while True:
        try:
            '''
                try to get a task from the queue. get_nowait() will
                raise queue.Empty if the queue is empty;
                get(False) would do the same.
            '''
            task = tasks_to_accomplish.get_nowait()
        except queue.Empty:
            break
        else:
            '''
                if no exception has been raised, add the task completion 
                message to tasks_that_are_done queue
            '''
            print(task)
            tasks_that_are_done.put(task + ' is done by ' + current_process().name)
            time.sleep(.5)
    return True


def main():
    number_of_task = 10
    number_of_processes = 4
    tasks_to_accomplish = Queue()
    tasks_that_are_done = Queue()
    processes = []

    for i in range(number_of_task):
        tasks_to_accomplish.put("Task no " + str(i))

    # creating processes
    for w in range(number_of_processes):
        p = Process(target=do_job, args=(tasks_to_accomplish, tasks_that_are_done))
        processes.append(p)
        p.start()

    # wait for all processes to complete
    for p in processes:
        p.join()

    # print the output
    while not tasks_that_are_done.empty():
        print(tasks_that_are_done.get())

    return True


if __name__ == '__main__':
    main()

Depending on the number of tasks, the code will take some time to produce the output. The output of the above code will vary from run to run.

(Screenshot: output of the combined multiprocessing example)

Python multiprocessing Pool

Python multiprocessing Pool can be used for parallel execution of a function across multiple input values, distributing the input data across processes (data parallelism). Below is a simple Python multiprocessing Pool example.


from multiprocessing import Pool

import time

work = (["A", 5], ["B", 2], ["C", 1], ["D", 3])


def work_log(work_data):
    print(" Process %s waiting %s seconds" % (work_data[0], work_data[1]))
    time.sleep(int(work_data[1]))
    print(" Process %s Finished." % work_data[0])


def pool_handler():
    p = Pool(2)
    p.map(work_log, work)


if __name__ == '__main__':
    pool_handler()

The image below shows the output of the above program. Notice that the pool size is 2, so two executions of the work_log function happen in parallel. When one of them finishes, the pool picks up the next argument, and so on.

(Screenshot: output of the Pool example)
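
One more point worth noting: map() also collects whatever each call returns and hands the results back in the same order as the inputs. A minimal sketch, using a made-up square() worker:


from multiprocessing import Pool


def square(n):
    # hypothetical worker that returns a result instead of just printing
    return n * n


if __name__ == "__main__":
    with Pool(2) as p:
        results = p.map(square, [1, 2, 3, 4])
    print(results)   # [1, 4, 9, 16] -- results keep the input order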

 

############################INTER PROCESS COMMUNICATION
#####PIPE

import multiprocessing

def sender(conn, msgs):
    """
    function to send messages to the other end of the pipe
    """
    for msg in msgs:
        conn.send(msg)
        print("Sent the message: {}".format(msg))
    conn.close()

def receiver(conn):
    """
    function to print the messages received from the other
    end of the pipe
    """
    while True:
        msg = conn.recv()
        if msg == "END":
            break
        print("Received the message: {}".format(msg))

if __name__ == "__main__":
    # messages to be sent
    msgs = ["hello", "hey", "hru?", "END"]

    # creating a pipe
    parent_conn, child_conn = multiprocessing.Pipe()

    # creating new processes
    p1 = multiprocessing.Process(target=sender, args=(parent_conn, msgs))
    p2 = multiprocessing.Process(target=receiver, args=(child_conn,))

    # running processes
    p1.start()
    p2.start()

    # wait until processes finish
    p1.join()
    p2.join()

 

#####QUEUE

import multiprocessing

def square_list(mylist, q):
    """
    function to square a given list
    """
    # append squares of mylist to the queue
    for num in mylist:
        q.put(num * num)

def print_queue(q):
    """
    function to print queue elements
    """
    print("Queue elements:")
    while not q.empty():
        print(q.get())
    print("Queue is now empty!")

if __name__ == "__main__":
    # input list
    mylist = [1, 2, 3, 4]

    # creating multiprocessing Queue
    q = multiprocessing.Queue()

    # creating new processes
    p1 = multiprocessing.Process(target=square_list, args=(mylist, q))
    p2 = multiprocessing.Process(target=print_queue, args=(q,))

    # running process p1 to square the list
    p1.start()
    p1.join()

    # running process p2 to get queue elements
    p2.start()
    p2.join()

 

#####SHARED MEMORY

import multiprocessing

def square_list(mylist, result, square_sum):
    """
    function to square a given list
    """
    # write squares of mylist into the shared result array
    for idx, num in enumerate(mylist):
        result[idx] = num * num

    # square_sum value
    square_sum.value = sum(result)

    # print result Array
    print("Result(in process p1): {}".format(result[:]))

    # print square_sum Value
    print("Sum of squares(in process p1): {}".format(square_sum.value))

if __name__ == "__main__":
    # input list
    mylist = [1, 2, 3, 4]

    # creating an Array of int data type with space for 4 integers
    result = multiprocessing.Array('i', 4)

    # creating a Value of int data type
    square_sum = multiprocessing.Value('i')

    # creating new process
    p1 = multiprocessing.Process(target=square_list, args=(mylist, result, square_sum))

    # starting process
    p1.start()

    # wait until process is finished
    p1.join()

    # print result array
    print("Result(in main program): {}".format(result[:]))

    # print square_sum Value
    print("Sum of squares(in main program): {}".format(square_sum.value))