What is parallel programming?
Parallelism, in the context of computer science, is the ability to break a program into parts that can run independently of one another. Because the parts do not depend on each other, they can execute simultaneously or out of order, and the result will be the same as if they had run one after another.
Concurrency is the ability of an algorithm or program to make progress on more than one task at a time. The concept is related to parallel processing, but concurrency emphasizes managing many independent jobs that may be doing different things at the same time, rather than splitting one job into identical pieces.
Concurrent programs can be difficult to write because independent tasks must coordinate their access to shared resources. The famous Dining Philosophers Problem is a classic thought experiment that illustrates the complexity of resource sharing and concurrency.
Modern multitasking operating systems are concurrent: they run many different programs at the same time. As computer hardware becomes cheaper, running complex jobs on clusters of machines becomes more practical. Several programming languages have been designed with concurrency and parallelism in mind, including Go.