From: Allen Fowler on 4 Sep 2009 12:06

Hello,

I have a list of tasks/items that I want handed off to threads/processes to complete. (I would like to stick with processes if I can, since there is some CPU work here.) Each task involves some calculations and a call to a remote server over urllib2/HTTP. The time to complete each task varies from 1 to 20 seconds, depending on a number of factors including variable delay on the remote server.

I would like to:

1) Have a maximum of 20 "in-flight" tasks. (Thus worker processes?)
2) Not overload the external server that each task is calling: no more than 3 new tasks per second. More "waiting" tasks may be OK; I need to test it.
3) Have certain tasks in my list processed in the correct order. (I guess the assignment logic must somehow tag those to be done by the same worker?)

Do any of you have suggestions? Can someone point me in the direction of sample code for this?

Thank you, :)
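[A minimal sketch of requirement 2, the "no more than 3 new tasks per second" throttle, as a token-bucket-style limiter. This is not from the thread; `RateLimiter` is a hypothetical helper, and it uses `time.monotonic()`, which assumes a modern Python rather than the 2009-era interpreter the poster was likely running.]

```python
import time

class RateLimiter:
    """Allow at most `rate` acquisitions per second, blocking the caller
    until the next slot opens. Hypothetical helper, not from the thread."""

    def __init__(self, rate):
        self.interval = 1.0 / rate          # seconds between permitted starts
        self.next_time = time.monotonic()   # earliest time of next acquisition

    def acquire(self):
        now = time.monotonic()
        if now < self.next_time:
            # Too soon: sleep until the next slot.
            time.sleep(self.next_time - now)
        # Schedule the following slot one interval after this one.
        self.next_time = max(self.next_time, now) + self.interval

# Usage: call acquire() before submitting each task to the pool.
limiter = RateLimiter(rate=3)   # at most ~3 new tasks per second
```

Each call to `acquire()` either returns immediately (if a slot is free) or sleeps just long enough to keep submissions at the configured rate; "waiting" tasks simply block in `acquire()` rather than hitting the server.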
From: Aahz on 10 Sep 2009 10:37

In article <mailman.974.1252080796.2854.python-list(a)python.org>, Allen Fowler <allen.fowler(a)yahoo.com> wrote:

> 1) Have a maximum of 20 "in-flight" tasks. (Thus worker processes?)

Good bet.

> 3) Have certain tasks in my list processed in the correct order. (I guess
> the assignment logic must somehow tag those to be done by the same worker?)

The simpler way to do this would be to bundle these tasks into a single queue object that contains a list of tasks. Each worker iterates over the list of tasks that it receives, which could be a single task.

--
Aahz (aahz(a)pythoncraft.com)           <*>         http://www.pythoncraft.com/

"To me vi is Zen. To use vi is to practice zen. Every command is a koan. Profound to the user, unintelligible to the uninitiated. You discover truth every time you use it." --reddy(a)lion.austin.ibm.com
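[A sketch of the bundling idea Aahz describes, using `multiprocessing.Pool`: tasks that must run in order travel together as one list, so a single worker executes them in sequence, while independent tasks go in singleton bundles. The function names (`run_bundle`, `submit_bundles`) and the `task * 2` placeholder work are assumptions for illustration; the real task body would be the calculations plus the urllib2/HTTP call.]

```python
import time
from multiprocessing import Pool

def run_bundle(bundle):
    # A bundle is a list of tasks that must run in order; handing the whole
    # list to one worker preserves the sequence within the bundle.
    out = []
    for task in bundle:
        # Placeholder for the real work (calculations + the urllib2/HTTP call).
        out.append(task * 2)
    return out

def submit_bundles(bundles, max_workers=20, rate_per_sec=3):
    # max_workers caps the number of in-flight bundles (requirement 1);
    # the sleep crudely throttles submissions to ~rate_per_sec new
    # bundles per second (requirement 2).
    pool = Pool(processes=max_workers)
    pending = []
    for bundle in bundles:
        pending.append(pool.apply_async(run_bundle, (bundle,)))
        time.sleep(1.0 / rate_per_sec)
    pool.close()
    pool.join()
    return [p.get() for p in pending]

if __name__ == "__main__":
    # [2, 3, 4] must run in order, so it travels as one bundle;
    # the other tasks are independent singleton bundles.
    print(submit_bundles([[1], [2, 3, 4], [5], [6]]))
    # prints [[2], [4, 6, 8], [10], [12]]
```

Note that `apply_async` results come back in submission order via the `pending` list even though workers finish at different times, and ordering is only guaranteed *within* a bundle, not across bundles.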