Prev: stopping a multiprocessing.managers.BaseManager nicely (looks like a hack)
Next: Call for papers: SETP-10, USA, July 2010
From: Paweł Banyś on 7 Mar 2010 19:18

Hello,

I have already read about Python and multiprocessing, which allows using many processors. The idea is to split a program into separate tasks and run each of them on a separate processor. However, I want to run a Python program doing a single simple task on many processors, so that their cumulative power is available to the program as if there were one huge CPU instead of many separate ones. Is it possible? How can it be achieved?

Best regards,

Paweł
From: Diez B. Roggisch on 7 Mar 2010 19:28

On 08.03.10 01:18, Paweł Banyś wrote:

> Hello,
>
> I have already read about Python and multiprocessing which allows using
> many processors. The idea is to split a program into separate tasks and
> run each of them on a separate processor. However I want to run a Python
> program doing a single simple task on many processors so that their
> cumulative power is available to the program as if there was one huge
> CPU instead of many separate ones. Is it possible? How can it be achieved?

That's impossible to answer without knowing anything about your actual task. Not everything is parallelizable, and some algorithms suffer penalties if parallelization is overdone. So in essence, what you've read already covers it: if your "simple task" is divisible into several independent sub-tasks that don't need serialization, multiprocessing is your friend.

Diez
From: Gib Bogle on 7 Mar 2010 19:49

Paweł Banyś wrote:

> .... How can it be achieved?

Very carefully.
From: Steven D'Aprano on 7 Mar 2010 20:08

On Mon, 08 Mar 2010 01:18:13 +0100, Paweł Banyś wrote:

> Hello,
>
> I have already read about Python and multiprocessing which allows using
> many processors. The idea is to split a program into separate tasks and
> run each of them on a separate processor. However I want to run a Python
> program doing a single simple task on many processors so that their
> cumulative power is available to the program as if there was one huge
> CPU instead of many separate ones. Is it possible? How can it be
> achieved?

Try Parallel Python:

http://www.parallelpython.com/

I haven't used it, but it looks interesting.

However, the obligatory warning against premature optimization: any sort of parallel execution (including even lightweight threads) is hard to build and much harder to debug. You should make sure that the potential performance benefits are worth the pain before you embark on the job: are you sure that the naive, single-process version isn't fast enough?

-- Steven
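Steven's advice to check the single-process version first can be done with the stdlib timeit module before any parallel code is written. A small sketch, where `work` is a placeholder standing in for the real task:

```python
import timeit

def work():
    # Placeholder for the real single-process task.
    return sum(n * n for n in range(100000))

# Time the naive version first: if this is already fast enough,
# parallelizing may not be worth the added complexity.
elapsed = timeit.timeit(work, number=10)
print("10 runs: %.3f seconds" % elapsed)
```

Only if the measured baseline is genuinely too slow, and the task divides into independent pieces, does reaching for multiprocessing or Parallel Python pay off.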
From: Martin P. Hellwig on 7 Mar 2010 21:08

On 03/08/10 00:18, Paweł Banyś wrote:

> Hello,
>
> I have already read about Python and multiprocessing which allows using
> many processors. The idea is to split a program into separate tasks and
> run each of them on a separate processor. However I want to run a Python
> program doing a single simple task on many processors so that their
> cumulative power is available to the program as if there was one huge
> CPU instead of many separate ones. Is it possible? How can it be achieved?
>
> Best regards,
>
> Paweł

As far as I know, the Python VM (CPython) will not analyze your code and automatically spread parts over different processing units. I did read, two years or so ago, that AMD was looking into something that does just what you describe at the CPU level, that is, presenting itself as one logical CPU while underneath there are multiple physical ones. I wouldn't hold my breath waiting for it, though.

-- mph