From: Maaartin on
On Mar 29, 3:48 pm, Tom St Denis <t...(a)iahu.ca> wrote:
> With all respect to DJB I think the more likely reason for 8/1 is his
> lack of faith in his design more than "speed doesn't matter."  Which
> is ironic given how dismissive of HMAC he is (hint: if he invented a
> fast hash it would only serve to make HMAC fast).

I don't think so. As far as I remember, he tends to exploit the given
rules down to the last nitpicking detail, and since he was allowed to
submit a slow design, he did. After the first round he submitted a
tweak making it 16 times faster simply by changing the parameters to
16/32.

IMHO, he says that "speed simply doesn't matter" because "anyone who
really cares about speed shouldn't be using HMAC anyway; other MACs
are faster and inspire more confidence", reminding us in this way of
his Poly1305.
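For reference, the HMAC construction under discussion runs the message through the underlying hash twice (once with an inner-padded key, once with an outer-padded key), which is exactly why a faster hash would make HMAC faster. A minimal sketch using only Python's standard library:

```python
import hashlib
import hmac

def hmac_sha256(key: bytes, msg: bytes) -> bytes:
    """HMAC(K, m) = H((K xor opad) || H((K xor ipad) || m))."""
    block = 64  # SHA-256 block size in bytes
    if len(key) > block:
        key = hashlib.sha256(key).digest()  # long keys are hashed first
    key = key.ljust(block, b"\x00")         # then zero-padded to a block
    ipad = bytes(b ^ 0x36 for b in key)
    opad = bytes(b ^ 0x5C for b in key)
    inner = hashlib.sha256(ipad + msg).digest()
    return hashlib.sha256(opad + inner).digest()

# Matches the stdlib implementation:
assert hmac_sha256(b"k", b"m") == hmac.new(b"k", b"m", hashlib.sha256).digest()
```

Both hash invocations go through the same compression function, so the MAC's speed is tied directly to the hash's speed.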

> You have to keep in mind a lot of what DJB says at times may be
> factually correct but also totally irrelevant.

I think in this thread
http://www.ecrypt.eu.org/stream/phorum/read.php?1,883
he was plain wrong, but I don't know enough, could somebody check it?
From: Kristian Gjøsteen on
Paul Rubin <no.email(a)nospam.invalid> wrote:
>Is there a way to digitally sign a large file without using an SHA-like
>hash function?

I would not be surprised if it is possible to create some sort of digital
signature scheme that does not essentially contain a collision resistant
hash function.

(We can turn a deterministic signature scheme into a hash family,
but the hash family does not have to be collision resistant, even if
distinct messages should lead to distinct signatures.)

>I'm wondering why digital signatures aren't another exception: is
>it because they're not used or checked that often, or because
>there's a way to do them without hashing?

I don't have any numbers, but I believe the time required for hashing is
typically very small compared to the time required for the number theory.
Therefore, you do not care that much about the speed of the hash.
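The rough proportions are easy to check on any machine (absolute numbers will vary, and the modulus below is an arbitrary stand-in, not a real key): hashing a megabyte with SHA-256 versus one 2048-bit modular exponentiation, the latter standing in for the public-key arithmetic of an RSA-style signature.

```python
import hashlib
import time

data = b"x" * (1 << 20)  # 1 MiB of data to hash

t0 = time.perf_counter()
hashlib.sha256(data).digest()
t_hash = time.perf_counter() - t0

# One modular exponentiation with a 2048-bit modulus, standing in for
# the "number theory" half of a signature (toy numbers, illustration only).
n = (1 << 2047) + 1  # arbitrary 2048-bit odd modulus
d = (1 << 2046) + 3  # arbitrary full-size exponent
m = 0xDEADBEEF

t0 = time.perf_counter()
pow(m, d, n)
t_exp = time.perf_counter() - t0

print(f"hash 1 MiB: {t_hash:.4f}s, 2048-bit modexp: {t_exp:.4f}s")
```

For short-to-moderate messages the exponentiation dominates, which is the point being made; only once the input grows large does the hash's cost become visible.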

--
Kristian Gjøsteen
From: Greg Rose on
In article <hor1ns$a9b$1(a)orkan.itea.ntnu.no>,
Kristian Gjøsteen <kristiag+news(a)math.ntnu.no> wrote:
>Paul Rubin <no.email(a)nospam.invalid> wrote:
>>Is there a way to digitally sign a large file without using an SHA-like
>>hash function?
>
>I would not be surprised if it is possible to create some sort of digital
>signature scheme that does not essentially contain a collision resistant
>hash function.

It's rare for me to disagree with Kristian, but I
think collision resistance is pretty much the most
important requirement for a hash function in a
digital signature scheme. If collisions are easy,
producing alternative signed documents is also
easy.
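The forgery argument can be demonstrated with a deliberately crippled hash (truncated to 16 bits so collisions are trivial to brute-force; the "signature" here is just the digest, standing in for the private-key operation a real hash-then-sign scheme would apply to it):

```python
import hashlib

def weak_hash(msg: bytes) -> bytes:
    # Deliberately crippled 16-bit hash so a collision is easy to find.
    return hashlib.sha256(msg).digest()[:2]

# Brute-force a collision between two distinct messages.
seen = {}
for i in range(1 << 17):
    m = b"document #%d" % i
    h = weak_hash(m)
    if h in seen:
        m1, m2 = seen[h], m
        break
    seen[h] = m

# A hash-then-sign scheme only ever sees the digest, so anything
# "signed" over m1's digest verifies for m2 as well.
sig = weak_hash(m1)
assert m1 != m2
assert sig == weak_hash(m2)  # the alternative document is accepted
```

With a 16-bit digest a collision appears after a few hundred messages on average; the same attack against a full-width, collision-resistant hash is what the security of the whole scheme rests on being infeasible.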

>>I'm wondering why digital signatures aren't another exception: is
>>it because they're not used or checked that often, or because
>>there's a way to do them without hashing?
>
>I don't have any numbers, but I believe the time required for hashing is
>typically very small compared to the time required for the number theory.
>Therefore, you do not care that much about the speed of the hash.

Because the public key operation is so dominant,
the hash operation is negligible on even moderate
amounts of data. But since the amount of data is
variable, it can always be made big enough to
matter, and speed becomes important again.

Greg.
--
From: Thomas Pornin on
According to Tom St Denis <tom(a)iahu.ca>:
> I just don't think DJB has a clue how to really push a body of work to
> Industry.

To be fair, I know of no way to do such a push which would work if the
pusher is a lone individual, regardless of his academic glory. Usually,
most of industry tries to follow standards, and standard committees are
ruled by big companies and bigger institutions. To establish a new
standard, one very big company, or a few big companies, first implement
it, then lobby for it. It would be quite unrealistic of DJB to try to
push new standards by himself, and I do not think he tries anyway.


> He did a bit with the 224-bit curve but didn't apply any of it to the
> 192/256 or 521 bit curves.

His work on the P-224 NIST curve was mostly a testbed for the square
root extraction method he had published at that time. That method used
precomputed tables to speed up square roots in GF(p) where p = 1 mod 4.
Among the standard NIST curves, only P-224 uses such a field. A fast
square root speeds up point (de)compression. This is a gain only if the
rest of the computation is fast, so he wrote a fast implementation using
the hardware features which would yield the best performance at that
time, i.e. the x87 FPU: x86 processors were still 32-bit, with no more
than 7 usable registers, and the FPU provided more horsepower. Nowadays,
with 64-bit x86 architectures, fast implementations would use either the
16 "plain" registers, or possibly play tricks with SSE2, not the FPU,
and it would not be expressed with 'double' types in C.
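The field distinction matters because for p ≡ 3 (mod 4) a square root costs a single exponentiation, so no table tricks are needed there. A sketch with plain Python integers, using the NIST P-256 prime (which is ≡ 3 mod 4, unlike P-224's):

```python
# NIST P-256 prime; it is ≡ 3 (mod 4), so a square root of a quadratic
# residue a is simply a^((p+1)/4) mod p. P-224's prime is ≡ 1 (mod 4),
# where this shortcut fails and Tonelli-Shanks (or Bernstein's
# precomputed-table method) is needed instead.
p = 2**256 - 2**224 + 2**192 + 2**96 - 1
assert p % 4 == 3

a = pow(1234567, 2, p)        # a known quadratic residue mod p
r = pow(a, (p + 1) // 4, p)   # candidate square root: one exponentiation
assert pow(r, 2, p) == a
```

Point decompression is exactly this operation: recovering the y-coordinate from x via a square root of the curve equation's right-hand side, which is why a fast square root speeds it up.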

To be brief, he was not interested in "speeding up standard curves",
but rather in "showing how fast one can get, with the help of his
research".


--Thomas Pornin
From: Paul Rubin on
ggr(a)nope.ucsd.edu (Greg Rose) writes:
>>I would not be surprised if it is possible to create some sort of digital
>>signature scheme that does not essentially contain a collision resistant
>>hash function.
> It's rare for me to disagree with Kristian, but I think collision
> resistance is pretty much the most important requirement for a hash
> function in a digital signature scheme.

Right, but are we overlooking the possibility of a signature scheme that
doesn't use a hash function the way we're thinking of it, or at least
doesn't use all of the security characteristics that an SHA-3 candidate
is required to have? Maybe there is some primitive P that is unsuitable
for some SHA applications, but is faster than SHA-3 and can still give
secure signatures under some specific scheme designed for use with P.

> But since the amount of data is variable, it can always be made big
> enough to matter, and speed becomes important again.

Yes, I'm thinking of large files. The question came up because
I just downloaded a 30GB Wikipedia dump and checking the md5 took
a while.
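Checking a file that size at least doesn't require holding it in memory: reading in fixed-size chunks keeps the footprint flat while the hash state accumulates. A sketch (the dump filename is hypothetical):

```python
import hashlib

def file_digest(path: str, algo: str = "md5", chunk: int = 1 << 20) -> str:
    """Hash a large file in 1 MiB chunks so memory use stays constant."""
    h = hashlib.new(algo)
    with open(path, "rb") as f:
        while True:
            block = f.read(chunk)
            if not block:
                break
            h.update(block)
    return h.hexdigest()

# Hypothetical dump filename, for illustration:
# print(file_digest("enwiki-latest-pages-articles.xml"))
```

The wall-clock time is then bound by disk throughput and the hash's per-byte cost, which is where a faster hash genuinely pays off.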