From: Crown Royal on 15 Sep 2010 08:17

A client has a web server, and ever since we updated .NET from 2.0 to 3.5, the default application pool stops after a while. Users get a "Service Unavailable" error; once the default application pool is started again, everything is back to normal. I've searched all over for a fix, and although it seems like a common problem, no one has an answer.

Can anyone provide help? Thx
From: Brian Cryer on 15 Sep 2010 09:24

"Crown Royal" <CrownRoyal(a)discussions.microsoft.com> wrote in message news:9692F827-2C6A-4B22-A88B-0CE6CD84C67B(a)microsoft.com...
> A client has a webserver, and ever since we updated .net from 2.0 to 3.5, the
> default apps pool stops after a while. The users get a service unavailable,
> if the default apps pool is started, then everything is back to normal. I've
> searched all over to find an answer to fix this, and although it seems like a
> common problem, noone has an answer.
>
> Can anyone provide help?

A couple of thoughts:

1. Don't mix applications that target different versions of .NET in the same pool. This might account for your problem.

2. (A variant of the above.) Current wisdom is to have a separate application pool for each application. If you move each application to its own pool, does the problem go away? IIS 7 creates a separate application pool by default for each application, whereas IIS 6 doesn't.

My guess is that if you give each application its own application pool, either the problem will go away or you will find that one application is misbehaving and crashing - although if that were the case I'd hope to see some evidence of it in the event logs.

Hope this helps.
--
Brian Cryer
http://www.cryer.co.uk/brian
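[For reference: a minimal sketch of how the separation Brian describes could be scripted on IIS 7 with appcmd.exe; it is not part of the original posts. The pool name "LegacyV2Pool" and the application path "Default Web Site/MyApp" are placeholders. Note that .NET 3.5 runs on the v2.0 CLR, so the pool's managed runtime version remains v2.0.]

    rem List the existing application pools and which applications use them
    %windir%\system32\inetsrv\appcmd list apppools
    %windir%\system32\inetsrv\appcmd list apps

    rem Create a dedicated pool (.NET 3.5 still uses the v2.0 runtime)
    %windir%\system32\inetsrv\appcmd add apppool /name:"LegacyV2Pool" /managedRuntimeVersion:v2.0

    rem Move one application into the new pool ("Default Web Site/MyApp" is a placeholder)
    %windir%\system32\inetsrv\appcmd set app "Default Web Site/MyApp" /applicationPool:"LegacyV2Pool"

[On IIS 6 appcmd isn't available, so the same separation would be done in IIS Manager by creating a new application pool and reassigning the application on its Home Directory tab.]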