From: David Eather on 28 Feb 2010 23:10

On 7/01/2010 4:27 AM, bmearns wrote:
> My question is based on the concept of collecting partially-random
> data in a good hash function (like sha256) as a means of accumulating
> the random bits. For instance, if the data has an entropy rate of 0.5
> Shannons per bit, we might collect 512 bits of this data into sha256,
> and then assume (perhaps generously) that the digest is 256 bits of
> pure entropy.
>
> As I understand it, this idea is sound in general (notwithstanding
> the difficulties of getting a good entropy-rate estimation), but what
> if the entropy rate is very low? Say for instance our data only has 1
> Shannon per billion bits. If we collect 256 billion bits into our
> hash, can we make the same assumption about the output, or will there
> come a point where some of the earlier entropy will "fall out" of the
> hash, or rather, be pushed out by the massive amount of data we've
> collected?
>
> Thanks,
> -Brian

I can think of one situation where this could happen. If you rehash the
data pool every time new entropy is added, and each addition contributes
less than approximately 0.6 bits of entropy (i.e., your TRNG is broken
and the other entropy-generating functions are disabled or faulty), then
yes, entropy falls out of the hash function. The amount falling out per
rehash might be about 0.605 bits minus the entropy added, but I have no
proof of that.
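[A minimal sketch, not from the thread, contrasting the two pooling styles under discussion using Python's hashlib: accumulating samples into a single running SHA-256 state versus replacing the pool with a fresh hash on every addition. The class and function names, the 256-bit extraction threshold, and the comments about per-rehash loss are illustrative assumptions, not anyone's production design.]

import hashlib

class EntropyPool:
    """Accumulate low-entropy samples into one running SHA-256 state and
    release a digest only once the caller's entropy estimate reaches the
    digest size (256 bits)."""

    def __init__(self):
        self._hash = hashlib.sha256()
        self._estimated_bits = 0.0

    def add(self, sample: bytes, est_entropy_bits: float) -> None:
        # Feeding more data never pushes earlier entropy out of the state:
        # the 256-bit chaining value retains (up to) 256 bits of it.
        self._hash.update(sample)
        self._estimated_bits += est_entropy_bits

    def extract(self) -> bytes:
        if self._estimated_bits < 256:
            raise RuntimeError("pool not full: %.1f bits estimated"
                               % self._estimated_bits)
        digest = self._hash.digest()
        # Seed the next pool with the old digest so unreleased entropy
        # is carried forward rather than discarded.
        self._hash = hashlib.sha256(digest)
        self._estimated_bits = 0.0
        return digest


def rehash_pool(pool: bytes, sample: bytes) -> bytes:
    """The style David's caveat applies to: replace the pool with
    SHA-256(pool || sample) on every addition. Each rehash sends the
    256-bit state through a (roughly) random function, which is not a
    bijection, so if the sample carries less entropy than that mapping
    loses, the pool's entropy can drift downward over many iterations."""
    return hashlib.sha256(pool + sample).digest()

[Usage of the first style would be pool.add(sample, 0.5) per noisy byte, then pool.extract() once 256 estimated bits have been gathered; the hash only condenses entropy, it never creates it.]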