next up previous contents
Next: Conclusion Up: Adventures in Parallel Logic Previous: Granularity Control

Further Issues

Exploiting parallelism in logic programming languages needs to take into account the current trends in the evolution of this programming paradigm. In particular, some of the most relevant topics are the following.

Constraint Logic Programming: logic programming is extended with capabilities to handle constraints over different domains. Originally introduced by Lassez [70] and subsequently applied in a wide variety of frameworks (e.g. [52, 11]), constraint logic programming is quickly becoming a fundamental feature of most implementations of logic programming. The interaction between parallelism and constraint handling is currently an important topic of research, and many problems are still open.
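To make the idea concrete, the following sketch shows a finite-domain constraint problem of the kind a CLP(FD) goal such as X + Y #= 10, X #< Y would state declaratively. The sketch is in Python rather than a Prolog-based system, and it enumerates candidate assignments naively; real CLP implementations instead interleave constraint propagation with search.

```python
from itertools import product

def solve_fd(domains, constraints):
    """Naive finite-domain solver: enumerate every assignment and
    keep those satisfying all constraints.  (An illustration only;
    CLP systems prune domains by propagation instead.)"""
    names = list(domains)
    for values in product(*(domains[n] for n in names)):
        env = dict(zip(names, values))
        if all(c(env) for c in constraints):
            yield env

# X + Y = 10 and X < Y, with X and Y ranging over 1..9.
solutions = list(solve_fd(
    {"X": range(1, 10), "Y": range(1, 10)},
    [lambda e: e["X"] + e["Y"] == 10,
     lambda e: e["X"] < e["Y"]],
))
# solutions: X/Y pairs (1,9), (2,8), (3,7), (4,6)
```

The point of the example is the separation between stating constraints and searching for solutions; in a parallel setting, both the search and the propagation steps are candidate sources of parallelism.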

Concurrency: concurrency can be defined as the ability to express actions which are logically simultaneous. This term is often confused with parallelism (i.e., gaining speed-up by executing actions simultaneously), although the two notions are inherently different--concurrent systems need not be parallel, and parallel systems need not be concurrent.

Concurrency has been shown [92, 63] to be a powerful programming instrument, allowing one to write programs capable of coroutining and reactive behaviour.
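A minimal illustration of the distinction, sketched in Python with generators rather than a concurrent logic language: a producer and a consumer are logically simultaneous and coroutine with each other, yet everything runs on a single processor, so there is concurrency but no parallelism.

```python
def producer(log):
    """Yield values one at a time, suspending after each so that
    control passes back to the consumer (coroutining)."""
    for i in range(3):
        log.append(("produce", i))
        yield i

def consumer(source, log):
    """Consume each value as soon as the producer yields it."""
    for x in source:
        log.append(("consume", x))

trace = []
consumer(producer(trace), trace)
# trace strictly alternates produce/consume steps:
# [("produce", 0), ("consume", 0), ("produce", 1), ...]
```

The interleaved trace shows the reactive, demand-driven behaviour the text refers to: the producer advances only when the consumer asks for the next value.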

Data Parallelism: another important classification of the forms of parallelism is the one that distinguishes Control Parallelism from Data Parallelism [7].

Broadly speaking, Control Parallelism (often also referred to as MIMD parallelism) arises when different operations (or functions, procedures) are applied to different data items in parallel. No a-priori synchronization exists between the parallel computations--i.e., any form of synchronization needs to be explicitly programmed.
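As a rough sketch of control parallelism in Python (not tied to any particular parallel logic system): two different operations run on different data, and the only synchronization is the explicitly programmed wait on the results.

```python
from concurrent.futures import ThreadPoolExecutor

def square(x):
    return x * x

def negate(x):
    return -x

# Control (MIMD) parallelism: two *different* operations are
# launched independently; nothing synchronizes them until we
# explicitly wait on their futures.
with ThreadPoolExecutor(max_workers=2) as pool:
    f1 = pool.submit(square, 7)
    f2 = pool.submit(negate, 7)
    results = (f1.result(), f2.result())  # explicit synchronization
# results == (49, -7)
```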

Data Parallelism, also known as SIMD/SPMD parallelism, arises when the same operation is applied in parallel to several different data items. This involves a sort of lock-step execution pattern where, in one step, the desired operation is concurrently applied to the different data items.
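By contrast, data parallelism can be sketched as a parallel map: the same operation is applied to every element, and the map construct itself both distributes the work and acts as the implicit barrier, so no per-task synchronization is written by hand.

```python
from concurrent.futures import ThreadPoolExecutor

def double(x):
    return 2 * x

# Data (SPMD) parallelism: the *same* operation is applied to all
# elements; map distributes the work and returns results in order,
# supplying the lock-step behaviour implicitly.
with ThreadPoolExecutor(max_workers=4) as pool:
    doubled = list(pool.map(double, [1, 2, 3, 4]))
# doubled == [2, 4, 6, 8]
```

It is this fixed, regular structure that makes data parallelism amenable to the more efficient, lower-overhead implementations discussed below.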

In many cases data parallelism can be seen as a special instance of control parallelism, but the chief advantage of data parallelism is its more specialized nature, which makes it amenable to more efficient implementations incurring considerably less overhead.

An ideal parallel system should provide features for efficiently supporting data parallelism, identifying its occurrences and reducing the overhead as much as possible.


Tue Mar 19 14:37:09 MST 1996