From: Michael Powe on
Hello,

I am tasked with writing an application to process some large text
files, i.e. > 1 GB. The input will be CSV and the output will be in the
format of an IIS web server log.

I've done this sort of thing before. In the past, I've just
brute-forced it, with a BufferedReader and BufferedWriter handling the
input/output line by line.

I have a little time to complete this project and I'd like to build
something more efficient, something that won't peg the CPU for an hour.

My thought was to have a read thread and a write thread and create a
buffer into which some amount of input would be written; and then, when
a threshold was reached, the data would be written out.

Is this a good idea? Are there better ways to manage this?

And finally, I need pointers as to how I would create such a buffer.
The threaded read/write part I can do.

Thanks for any help.

mp

--
Michael Powe michael(a)trollope.org Naugatuck CT USA
Re graphics: A picture is worth 10K words -- but only those to describe
the picture. Hardly any sets of 10K words can be adequately described
with pictures.
From: rossum on
On Wed, 10 Feb 2010 06:28:14 -0500, Michael Powe
<michael+gnus(a)trollope.org> wrote:

>I am tasked with writing an application to process some large text
>files, i.e. > 1 GB. The input will be CSV and the output will be in the
>format of an IIS web server log.
>
>My thought was to have a read thread and a write thread and create a
>buffer into which some amount of input would be written; and then, when
>a threshold was reached, the data would be written out.
>
>Is this a good idea? Are there better ways to manage this?
If the input is a CSV file then the logical unit is presumably a
record, either as a raw line of text or (partly) parsed.

Create a queue. The read thread adds records to the queue; the
write thread pulls them off.
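A bounded BlockingQueue from java.util.concurrent gives you exactly
this for free. A minimal sketch in modern Java (the queue capacity,
the poison-pill sentinel, and the transform method are placeholders,
not something from this thread; the real transform would map CSV
fields to IIS log fields):

```java
import java.io.*;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class QueuePipeline {
    // Sentinel record marking end of input.
    private static final String POISON = "\u0000EOF";

    public static void process(Reader in, Writer out) throws InterruptedException {
        // Bounded queue: the reader blocks when the writer falls behind.
        BlockingQueue<String> queue = new ArrayBlockingQueue<>(1024);

        // Read thread: push one record (line) at a time onto the queue.
        Thread reader = new Thread(() -> {
            try (BufferedReader br = new BufferedReader(in)) {
                String line;
                while ((line = br.readLine()) != null) {
                    queue.put(line);
                }
                queue.put(POISON);
            } catch (IOException | InterruptedException e) {
                throw new RuntimeException(e);
            }
        });

        // Write thread: pull records off the queue and write them out.
        Thread writer = new Thread(() -> {
            try (BufferedWriter bw = new BufferedWriter(out)) {
                while (true) {
                    String record = queue.take();
                    if (record.equals(POISON)) break;
                    bw.write(transform(record));
                    bw.newLine();
                }
            } catch (IOException | InterruptedException e) {
                throw new RuntimeException(e);
            }
        });

        reader.start();
        writer.start();
        reader.join();
        writer.join();
    }

    // Placeholder transform: joins CSV fields with spaces.
    static String transform(String csvLine) {
        return String.join(" ", csvLine.split(","));
    }

    public static void main(String[] args) throws Exception {
        StringWriter out = new StringWriter();
        process(new StringReader("a,b,c\n1,2,3"), out);
        System.out.print(out);
    }
}
```

The bounded capacity is the important design choice: put() blocks when
the queue is full, so a fast reader can never run arbitrarily far
ahead of a slow writer.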

rossum

From: Tom Anderson on
On Wed, 10 Feb 2010, Michael Powe wrote:

> My thought was to have a read thread and a write thread and create a
> buffer into which some amount of input would be written; and then, when
> a threshold was reached, the data would be written out.
>
> Is this a good idea?

I'm slightly skeptical. If the processing is simple, then most of the time
will be spent doing IO even with a simple implementation. Adding threads
to overlap IO and processing might not be a big win. You could try writing
a sequential version of the program (with sufficiently large buffers - a
few megabytes, maybe?), then measuring how fast it runs - if the total
input and output data rate is close to your storage subsystem's capacity,
then no amount of programming cleverness will make it much faster.
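A sketch of that sequential baseline, with the buffer size as a knob to
experiment with (the pass-through copy loop stands in for the real
CSV-to-IIS conversion, and the 4 MB figure is just an example):

```java
import java.io.*;

public class SequentialBaseline {
    // Run the simple one-thread version with explicitly sized buffers and
    // time it; if the data rate is near disk speed, threads won't help much.
    public static long run(Reader in, Writer out, int bufSize) throws IOException {
        long start = System.nanoTime();
        try (BufferedReader br = new BufferedReader(in, bufSize);
             BufferedWriter bw = new BufferedWriter(out, bufSize)) {
            String line;
            while ((line = br.readLine()) != null) {
                bw.write(line);          // real code: convert the CSV record here
                bw.newLine();
            }
        }
        return (System.nanoTime() - start) / 1_000_000;  // elapsed millis
    }

    public static void main(String[] args) throws IOException {
        // A few megabytes of buffer, as suggested above; tune and measure.
        long ms = run(new FileReader(args[0]), new FileWriter(args[1]),
                      4 * 1024 * 1024);
        System.out.println("elapsed: " + ms + " ms");
    }
}
```

Divide the file size by the elapsed time; if that rate is close to what
the disk can sustain, stop optimizing.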

If, OTOH, there's significant headroom above the rate you reach, then
using threads as you describe would be a good thing to try. Either that or
non-blocking IO via the NIO package, but I think you'd get decent results
from threads.

> And finally, I need pointers as to how I would create such a buffer. The
> threaded read/write part I can do.

You could try java.io.PipedInputStream and PipedOutputStream. If you want
a bigger buffer, you could grab the code for these from OpenJDK and modify
it. Mind you, circular buffers are a pretty standard bit of programming,
so there will be dozens of other implementations and descriptions out
there on the web.
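For what it's worth, since Java 6 the piped classes accept a buffer
size directly, so grabbing the OpenJDK source may not be necessary. A
minimal sketch of two threads joined by a pipe (the record contents
and the 64K size are made up for the demo):

```java
import java.io.*;
import java.util.*;

public class PipedExample {
    // Producer thread writes lines into the pipe; the calling thread
    // reads them back out, blocking when the pipe's buffer is empty.
    public static List<String> roundTrip(String... lines) throws Exception {
        PipedWriter sink = new PipedWriter();
        PipedReader source = new PipedReader(sink, 1 << 16);  // 64K char pipe buffer

        Thread producer = new Thread(() -> {
            try (PrintWriter out = new PrintWriter(sink)) {
                for (String line : lines) {
                    out.println(line);
                }
            }  // closing the writer gives the reader end-of-stream
        });
        producer.start();

        List<String> received = new ArrayList<>();
        try (BufferedReader in = new BufferedReader(source)) {
            String line;
            while ((line = in.readLine()) != null) {
                received.add(line);
            }
        }
        producer.join();
        return received;
    }

    public static void main(String[] args) throws Exception {
        System.out.println(roundTrip("a,b,c", "1,2,3"));
    }
}
```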

tom

--
It's rare that you're simply presented with a knob whose only two
positions are "Make History" and "Flee Your Glorious Destiny." --
Tycho Brahe
From: Roedy Green on
On Wed, 10 Feb 2010 06:28:14 -0500, Michael Powe
<michael+gnus(a)trollope.org> wrote, quoted or indirectly quoted someone
who said :

>I've done this sort of thing before. In the past, I've just
>brute-forced it, with a BufferedReader and BufferedWriter handling the
>input/output line by line.

There is quite a bit of CPU work in parsing a CSV file. Try
http://mindprod.com/products1.html#CSV
and give it a 64K buffer before you go to a lot of work cooking up
something exotic.
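On the parsing-cost point: if the input has no quoted fields, even a
hand-rolled splitter built on indexOf avoids per-line regex overhead.
A minimal sketch (it deliberately ignores quoted fields containing
commas, which a real CSV parser such as the one linked above must
handle):

```java
import java.util.*;

public class CsvSplit {
    // Split an unquoted CSV line on commas using indexOf; no regex involved.
    public static List<String> split(String line) {
        List<String> fields = new ArrayList<>();
        int start = 0, comma;
        while ((comma = line.indexOf(',', start)) >= 0) {
            fields.add(line.substring(start, comma));
            start = comma + 1;
        }
        fields.add(line.substring(start));  // last field (possibly empty)
        return fields;
    }
}
```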
--
Roedy Green Canadian Mind Products
http://mindprod.com

Every compilable program in a sense works. The problem is with your unrealistic expectations about what it will do.
From: EJP on
On 10/02/2010 10:28 PM, Michael Powe wrote:
> I have a little time to complete this project and I'd like to build
> something more efficient, that won't peg the CPU for an hour.

Fix your code. It only takes a few seconds to read a file of practically
any size. In my experience the only way you can take an hour to process
any file on modern equipment is if you read the whole file into memory
via concatenation of Strings and then process it, which is the wrong
approach from every possible point of view. Process a line at a time.
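The anti-pattern EJP describes is quadratic: each String concatenation
copies everything accumulated so far. A small illustration of the wrong
and right shapes (the pass-through write stands in for the real
conversion):

```java
import java.io.*;

public class LineAtATime {
    // Wrong: accumulate the whole file by String concatenation - O(n^2) copying.
    public static String slurpByConcat(Reader in) throws IOException {
        BufferedReader br = new BufferedReader(in);
        String all = "";
        String line;
        while ((line = br.readLine()) != null) {
            all += line + "\n";   // copies 'all' on every iteration
        }
        return all;
    }

    // Right: handle each record as it arrives; memory use stays constant.
    public static int processLineAtATime(Reader in, Writer out) throws IOException {
        BufferedReader br = new BufferedReader(in);
        BufferedWriter bw = new BufferedWriter(out);
        int count = 0;
        String line;
        while ((line = br.readLine()) != null) {
            bw.write(line);       // real code: convert the record here
            bw.newLine();
            count++;
        }
        bw.flush();
        return count;
    }
}
```

On a 1 GB input the first version does on the order of a terabyte of
copying; the second touches each byte roughly once.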