When not to use MPI

In some situations MPI may not be the most appropriate communications library. First, programs with irregular communication patterns are often difficult to express in MPI's message-passing model, because every send must be paired with a matching receive. MPI's asynchronous communication operations can make this kind of program more manageable, but other communication libraries, such as SHMEM, may be better suited to these problems.
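To illustrate the point, the following sketch uses MPI's asynchronous (nonblocking) operations to cope with an irregular pattern in which a receiver does not know in advance who will send to it. The peer list is hypothetical, chosen only to create an irregular-looking pattern; a real application would typically need a termination protocol rather than the fixed count used here.

```c
/* Sketch: handling an irregular communication pattern with MPI's
 * nonblocking operations.  Each rank sends to a hypothetical set of
 * peers and drains whatever arrives via MPI_Iprobe, since receivers
 * cannot pre-post one receive per known sender.
 * Build with: mpicc irregular.c -o irregular */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, size;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    /* Hypothetical irregular pattern: each rank sends its rank number
     * to (rank+1)%size and (rank+3)%size. */
    int peers[2] = { (rank + 1) % size, (rank + 3) % size };
    int payload = rank;
    MPI_Request sends[2];
    for (int i = 0; i < 2; i++)
        MPI_Isend(&payload, 1, MPI_INT, peers[i], /* tag */ 0,
                  MPI_COMM_WORLD, &sends[i]);

    /* Because the shifted pattern is symmetric, every rank happens to
     * receive exactly two messages; probe until both have arrived. */
    int received = 0;
    while (received < 2) {
        int flag;
        MPI_Status st;
        MPI_Iprobe(MPI_ANY_SOURCE, 0, MPI_COMM_WORLD, &flag, &st);
        if (flag) {
            int data;
            MPI_Recv(&data, 1, MPI_INT, st.MPI_SOURCE, 0,
                     MPI_COMM_WORLD, MPI_STATUS_IGNORE);
            printf("rank %d received %d from rank %d\n",
                   rank, data, st.MPI_SOURCE);
            received++;
        }
    }
    MPI_Waitall(2, sends, MPI_STATUSES_IGNORE);
    MPI_Finalize();
    return 0;
}
```

Even in this small example, matching sends to receives requires extra machinery (probing, a way to know when to stop); in a one-sided model such as SHMEM the senders could simply deposit their data directly.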

Second, some domain-specific applications can be implemented better with an API tailored to the domain. For example, Global Arrays is a good match for array-based or linear-algebra-based computations.

Dynamic programs are another area for which MPI may not be suitable. MPI implementations generally require all processes involved in the computation to start at the same time; participants cannot join or leave the computation arbitrarily as it progresses. While the forthcoming MPI-2 standard will support such dynamic programs, in the HPVM 1.9 release Fast Messages (see section Fast Messages) is the only communications library that supports them.
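For comparison, the forthcoming MPI-2 standard defines MPI_Comm_spawn for starting new processes after a computation is under way. The sketch below assumes an MPI-2-capable implementation (not available in HPVM 1.9), and the worker executable name "worker" is hypothetical.

```c
/* Sketch of MPI-2 dynamic process creation.  Requires an MPI-2
 * implementation; this is NOT supported by the MPI in HPVM 1.9. */
#include <mpi.h>

int main(int argc, char **argv)
{
    MPI_Comm workers;  /* intercommunicator to the spawned processes */
    MPI_Init(&argc, &argv);

    /* Launch 4 copies of a (hypothetical) "worker" program mid-run;
     * the parent and the new processes then communicate over the
     * "workers" intercommunicator. */
    MPI_Comm_spawn("worker", MPI_ARGV_NULL, 4, MPI_INFO_NULL,
                   /* root */ 0, MPI_COMM_SELF, &workers,
                   MPI_ERRCODES_IGNORE);

    MPI_Finalize();
    return 0;
}
```

Note that even with MPI_Comm_spawn, process creation is collective and explicit; it still does not let arbitrary participants attach to or detach from a running computation the way Fast Messages permits.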
