From: J K on 1 Apr 2010 02:15

Hey guys,

I work on a medium-to-large-scale distributed web application. Issues that need troubleshooting come up often. Currently, sysadmins have to grep logs on as many as 100 machines to find useful information. Of course this can be done via bash, but that is slow and error-prone. I would like to implement an easier way.

It seems that there are three options:

1. Write logs over the network to some "central" location.
2. Write logs into a database.
3. Develop some remote search capability to search or index all the logs.

Option 1 seems okay, but there would have to be redundancy in that "central" location. Option 2 would incur a performance hit. Option 3 seems like the best, but the indexing capability may be complicated.

Does anyone know of any libraries that fit options 1, 2, or 3? (We use log4j, so compatibility would be nice.)

I'd love to hear about how others have addressed this issue. Anyway, love to hear thoughts on this!

Thanks in advance.
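[For option 1, log4j 1.x ships a SocketAppender that forwards events to a remote collector, and it can run alongside a local file appender as a fallback. A minimal properties sketch, where the hostname, port, and paths are placeholders:]

```properties
# Root logger writes both locally and to a central collector.
log4j.rootLogger=INFO, local, central

# Local rolling file (always on, survives network outages).
log4j.appender.local=org.apache.log4j.RollingFileAppender
log4j.appender.local.File=/var/log/myapp/app.log
log4j.appender.local.MaxFileSize=10MB
log4j.appender.local.MaxBackupIndex=5
log4j.appender.local.layout=org.apache.log4j.PatternLayout
log4j.appender.local.layout.ConversionPattern=%d %-5p [%t] %c - %m%n

# Central collector; SocketAppender ships serialized events, so no layout is needed.
log4j.appender.central=org.apache.log4j.net.SocketAppender
log4j.appender.central.RemoteHost=loghost.example.com
log4j.appender.central.Port=4560
log4j.appender.central.ReconnectionDelay=10000
```

[On the receiving side, log4j's bundled org.apache.log4j.net.SimpleSocketServer can listen on the port and re-log events with its own configuration.]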
From: Kevin McMurtrie on 1 Apr 2010 04:03

In article <-JOdnakvCblipCnWnZ2dnUVZ_s-dnZ2d(a)giganews.com>, J K <user(a)compgroups.net/> wrote:
> [snip: original post quoted in full]

1) You'll need to switch to local files if the central location is unresponsive. Might as well use local files to start with.

2) Does your DBA seem like the kind of person who could kill a man? It really matters here.

3) Yes. Have a system that pulls logfiles and processes them. It can go offline for days with no impact on server performance, and that makes it a much cheaper system to maintain.

If you need realtime searching, just script the grep. You do have an accurate list of active servers for the script to use, right?

--
I won't see Google Groups replies because I must filter them as spam
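[The "pull the logfiles and process them" approach can be sketched in plain Java. The class name, directory layout, and pattern below are made up for illustration; it assumes the files have already been copied to one machine:]

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Pattern;
import java.util.stream.Stream;

// Scans a directory of pulled-in logfiles and collects lines matching a regex.
public class LogGrep {

    public static List<String> grep(Path logDir, String regex) throws IOException {
        Pattern pattern = Pattern.compile(regex);
        List<String> hits = new ArrayList<>();
        try (Stream<Path> files = Files.walk(logDir)) {
            files.filter(Files::isRegularFile).forEach(f -> {
                try {
                    for (String line : Files.readAllLines(f)) {
                        if (pattern.matcher(line).find()) {
                            hits.add(f.getFileName() + ": " + line);
                        }
                    }
                } catch (IOException e) {
                    // Skip unreadable files rather than aborting the whole scan.
                    System.err.println("skipping " + f + ": " + e);
                }
            });
        }
        return hits;
    }

    public static void main(String[] args) throws IOException {
        // Usage: java LogGrep /var/log/pulled "ERROR|Exception"
        for (String hit : grep(Paths.get(args[0]), args[1])) {
            System.out.println(hit);
        }
    }
}
```

[A real version would want streaming reads for large files and parallel scanning, but since the scan runs off the production boxes, it can be slow without hurting the servers.]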
From: RedGrittyBrick on 1 Apr 2010 04:29

On 01/04/2010 07:15, J K wrote:
> [snip]
>
> It seems that there are three options:
>
> 1. Write logs over the network to some "central" location.
> 2. Write logs into a database
> 3. Develop some remote search capability to search or index all the logs

My Java hat isn't firmly enough seated on my head - so the first things that popped into my mind were ...

1. Syslog
2. Ugh.
3. When faced with a very similar task, Larry Wall invented Perl.

--
RGB
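[The syslog suggestion and log4j aren't mutually exclusive: log4j 1.x includes a SyslogAppender, so option 1 can ride on existing syslog infrastructure. A minimal sketch, where the host and facility are placeholders:]

```properties
log4j.rootLogger=INFO, syslog
log4j.appender.syslog=org.apache.log4j.net.SyslogAppender
log4j.appender.syslog.SyslogHost=loghost.example.com
log4j.appender.syslog.Facility=LOCAL0
log4j.appender.syslog.layout=org.apache.log4j.PatternLayout
log4j.appender.syslog.layout.ConversionPattern=%-5p %c - %m%n
```

[Note that SyslogAppender uses UDP, so delivery is best-effort; the local-file fallback still matters.]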
From: Donkey Hottie on 1 Apr 2010 04:26

On 1.4.2010 11:03, Kevin McMurtrie wrote:
> [snip: quoted post and reply]
>
> If you need realtime searching, just script the grep. You do have an
> accurate list of active servers for the script to use, right?

All this brings Novell Sentinel to my mind. It is not a programming library, though, but a quite expensive system for processing the various logs. One helluva system it is!

--
Q: What's the difference between a dead dog in the road and a dead lawyer in the road?
A: There are skid marks in front of the dog.
From: RedGrittyBrick on 1 Apr 2010 04:47
On 01/04/2010 09:29, RedGrittyBrick wrote:
> On 01/04/2010 07:15, J K wrote:
>> [snip: original post quoted in full]
>
> My Java hat isn't firmly enough seated on my head - so the first things
> that popped into my mind were ...
>
> 1. Syslog
> 2. Ugh.
> 3. When faced with a very similar task, Larry Wall invented Perl.

<presses hat down>

http://logging.apache.org/chainsaw/ ?

--
RGB
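[Chainsaw is the log4j project's GUI log viewer. One simple way to produce files it can open is to have each server write events with XMLLayout; a minimal sketch, with placeholder paths:]

```properties
# Each event is written as a log4j XML fragment; Chainsaw can open such files directly.
log4j.rootLogger=DEBUG, xmlfile
log4j.appender.xmlfile=org.apache.log4j.FileAppender
log4j.appender.xmlfile.File=/var/log/myapp/app.xml
log4j.appender.xmlfile.layout=org.apache.log4j.xml.XMLLayout
```

[Chainsaw can also receive events live over the network from a SocketAppender, which would cover the "remote search" use case without pulling files at all.]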