Parallel processing


Parallel processing is the ability to carry out multiple operations or tasks simultaneously. The term is used in the contexts of both human cognition, particularly in the ability of the brain to simultaneously process incoming stimuli, and in parallel computing by machines.

Parallel Processing by the Brain

Parallel processing is the ability of the brain to simultaneously process incoming stimuli of differing quality. This is most important in vision, as the brain divides what it sees into four components: color, motion, shape, and depth. Each component is analyzed individually and then compared with stored memories, which helps the brain identify what is being viewed. The brain then combines all of these into the single image that you see and comprehend, a continual and seamless operation. Some experimental psychologists have linked parallel processing to the Stroop effect.

Parallel Processing in Computers

Parallel processing in computing is the simultaneous use of more than one CPU or processor core to execute a program or multiple computational threads. Ideally, parallel processing makes programs run faster because there are more engines (CPUs or cores) running them. In practice, it is often difficult to divide a program in such a way that separate CPUs or cores can execute different portions without interfering with each other. Many computers have just one CPU, but multi-core processor chips are becoming the norm, and there are even computers with thousands of CPUs.
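The idea of dividing a program so that separate cores execute different portions can be sketched with Python's standard `multiprocessing` module. The function names and chunking scheme here are illustrative, not from the original text: the input range is split into one chunk per worker, each worker process sums its chunk independently, and the partial results are combined at the end.

```python
from multiprocessing import Pool

def partial_sum(bounds):
    """Sum the squares of the integers in the half-open range [lo, hi)."""
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

def parallel_sum_of_squares(n, workers=4):
    """Split [0, n) into chunks and sum the squares on several processes."""
    step = n // workers
    # Each worker gets a disjoint chunk; the last chunk absorbs the remainder.
    chunks = [(i * step, (i + 1) * step if i < workers - 1 else n)
              for i in range(workers)]
    with Pool(workers) as pool:
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    print(parallel_sum_of_squares(10_000))
```

Because the chunks are disjoint, the workers never interfere with each other; problems that decompose this cleanly are the easy case the paragraph alludes to.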

With single-CPU, single-core computers, it is still possible to perform parallel processing by connecting the computers in a network. However, this type of parallel processing requires sophisticated distributed processing software.

Note that parallelism differs from concurrency. Concurrency, a term used in the operating-systems and databases communities, refers to the property of a system in which multiple tasks remain logically active and make progress at the same time by interleaving their execution, thereby creating an illusion of simultaneously executing instructions.[1] Parallelism, on the other hand, is a term typically used by the supercomputing community to describe executions that physically run simultaneously, with the goal of solving a problem in less time or solving a larger problem in the same time. Parallelism exploits concurrency.[1]

Parallel processing is also called parallel computing. In the quest for cheaper computing alternatives, parallel processing provides a viable option: sophisticated distributed computing software can make effective use of idle processor cycles across a network.

References

  • Myers, David G. Psychology. 6th ed. New York: Worth, 2001.
  • Firuziaan, M., and O. Nommensen. "Parallel Processing via MPI & OpenMP." Linux Enterprise, 10/2002.
  • Computer Chronicles: Parallel Processing (1986). Last accessed 2010-11-20.
