Fundamentals of Microsoft .NET Programming: Multiprocessing

Looking for Parallelism

Some problems have naturally parallel solutions. For example, consider the Mandelbrot set shown in Figure 2-1. To produce this image, the program considers each pixel in the result separately. For that pixel, it performs a series of calculations that do not involve the other pixels in any way. This program can make as many threads as it wants, and each can work independently to calculate a color for its own pixel.

The only place where the threads need to interact is when they copy their results into the single final image. Even there, the threads don’t need to use locks because each thread works with a different pixel and doesn’t need to look at or modify the other pixels.

Figure 2-1 Displaying the Mandelbrot set is an embarrassingly parallel task.
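
To make the idea concrete, here is a minimal C# sketch of the per-pixel approach. This is not code from the book; the image size, iteration limit, and coordinate mapping are illustrative assumptions, and it uses the Task Parallel Library's Parallel.For to hand rows of pixels to different threads.

using System;
using System.Threading.Tasks;

class MandelbrotSketch
{
    static void Main()
    {
        // Illustrative sizes; each pixel's result depends only on its own coordinates.
        const int width = 800, height = 600, maxIterations = 1000;
        int[,] counts = new int[width, height];

        // Hand each row of pixels to whatever thread is available.
        Parallel.For(0, height, y =>
        {
            for (int x = 0; x < width; x++)
            {
                // Map the pixel to a point c in the complex plane.
                double cr = -2.0 + 3.0 * x / width;
                double ci = -1.5 + 3.0 * y / height;

                // Iterate z = z * z + c until it escapes or we give up.
                double zr = 0, zi = 0;
                int count = 0;
                while (count < maxIterations && zr * zr + zi * zi <= 4)
                {
                    double temp = zr * zr - zi * zi + cr;
                    zi = 2 * zr * zi + ci;
                    zr = temp;
                    count++;
                }

                // Each thread writes only to its own pixels, so no lock is needed.
                counts[x, y] = count;
            }
        });

        Console.WriteLine("Center pixel escape count: " + counts[width / 2, height / 2]);
    }
}

A real program would map each count to a color and copy it into a bitmap, but the structure is the same: the threads share the result array without ever locking it because no two threads touch the same pixel.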

This kind of algorithm, which is naturally parallel, is sometimes called embarrassingly parallel. Other embarrassingly parallel problems include ray tracing, generating frames for an animated movie (which may also involve ray tracing), some artificial intelligence approaches such as genetic algorithms, and random heuristics where the program picks a random solution and evaluates its effectiveness.

Even if a problem isn’t embarrassingly parallel, you may be able to come up with a workable parallel solution. For example, consider the earlier problem of adding up 2 million numbers. The simple solution of making two threads each add up half of the numbers doesn’t work because they spend a huge amount of time competing for access to the value total.
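
To see where the time goes, here is an illustrative C# sketch of that simple approach (the array contents and names are assumptions, not the book's code). Every pass through the loop acquires the lock that protects total, so the two threads spend most of their time waiting on each other.

using System;
using System.Threading;

class NaiveParallelSum
{
    static long total = 0;
    static readonly object totalLock = new object();

    // Each thread adds up its half of the array, locking the shared
    // total on every iteration: about 2 million locks in all.
    static void AddRange(int[] values, int start, int finish)
    {
        for (int i = start; i <= finish; i++)
        {
            lock (totalLock)
            {
                total += values[i];
            }
        }
    }

    static void Main()
    {
        int[] values = new int[2000000];
        for (int i = 0; i < values.Length; i++) values[i] = 1;

        var thread1 = new Thread(() => AddRange(values, 0, values.Length / 2 - 1));
        var thread2 = new Thread(() => AddRange(values, values.Length / 2, values.Length - 1));
        thread1.Start(); thread2.Start();
        thread1.Join(); thread2.Join();

        Console.WriteLine("Total = " + total);
    }
}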

However, suppose each thread added up its own subtotal and only copied the result into total when it was finished. The new thread code looks like this:

subtotal = 0
For i = start To finish
    Get subtotal
    Calculate result = subtotal + value[i]
    Save result In subtotal
Next i
Lock total
    Get total
    Calculate result = total + subtotal
    Save result In total
End Lock

In this version of the program, the loop where most of the work occurs contains no locks, so the threads can work independently. Only at the end do the threads need to lock the value total. The previous version of the program required 2 million locks, and those locks prevented the threads from running in parallel. This version uses only two locks and the threads can execute the vast majority of their calculations in parallel, so this version will be much faster.
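
Here is how the pseudocode might look in C#, reusing the same illustrative setup as the earlier sketch. Each thread keeps its subtotal in a local variable, so the loop needs no locks, and the shared total is locked exactly once per thread.

using System;
using System.Threading;

class SubtotalParallelSum
{
    static long total = 0;
    static readonly object totalLock = new object();

    static void AddRange(int[] values, int start, int finish)
    {
        // The subtotal is local to this thread, so the loop needs no lock.
        long subtotal = 0;
        for (int i = start; i <= finish; i++)
        {
            subtotal += values[i];
        }

        // Lock the shared total only once, at the very end.
        lock (totalLock)
        {
            total += subtotal;
        }
    }

    static void Main()
    {
        int[] values = new int[2000000];
        for (int i = 0; i < values.Length; i++) values[i] = 1;

        var thread1 = new Thread(() => AddRange(values, 0, values.Length / 2 - 1));
        var thread2 = new Thread(() => AddRange(values, values.Length / 2, values.Length - 1));
        thread1.Start(); thread2.Start();
        thread1.Join(); thread2.Join();

        Console.WriteLine("Total = " + total);
    }
}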

This example shows how the approach you use can determine whether a program benefits from multithreading. The key to making this solution work is keeping locks to a minimum. A good multithreaded application uses as few locks as possible, avoids making threads contend for other scarce resources such as disk drives, and keeps each thread's calculations as separate as possible for as long as possible. Ideally, a thread's contribution is merged into the overall solution only at the end of the thread's calculations.