From: Kristian Gjøsteen on
Simon Johnson <simon.johnson(a)gmail.com> wrote:
>With AES we have no idea whether the proof of security that we've got
>for the current collection of attacks will resist an unknown attack.
>
>With BBS we know there is only ever one attack: integer factorisation.
>That's it; anything that breaks BBS must also factor integers.

Consider the following two statements:

We have no idea if there are unknown, efficient methods that
solve the "AES problem" (ie how to distinguish the AES permutation
from a random permutation).

We have no idea if there are unknown, efficient methods that
solve the factoring problem.

Now spot the difference.
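
To make the "AES problem" concrete, here is the distinguishing game in
miniature. A toy 8-bit keyed permutation stands in for AES (purely
illustrative; every name in this Python sketch is made up), so it is
self-contained:

    import random

    def toy_cipher(k):
        # stand-in for AES under key k; multiplication by an odd
        # constant mod 256 is a permutation of the 8-bit blocks
        return lambda x: (x * (2 * k + 1) + k) % 256

    def game(distinguisher):
        if random.randrange(2):              # world 1: the cipher, random key
            oracle, world = toy_cipher(random.randrange(256)), 1
        else:                                # world 0: a truly random permutation
            perm = random.sample(range(256), 256)
            oracle, world = perm.__getitem__, 0
        return distinguisher(oracle) == world    # did it guess the world?

    # An attacker "solves the problem" by winning noticeably more than
    # half the time; blind guessing wins exactly half:
    wins = sum(game(lambda o: random.randrange(2)) for _ in range(10000))
    print(wins / 10000)                      # ~0.5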

One might argue about differences in pedigree, but the fact is that a
huge effort by a lot of very clever people has produced very impressive
results, and yet both problems still seem essentially hard. In other
words, there's evidence that these problems are _really_ hard.

>[...]
>Again, at the risk of repeating myself ad-nauseum: We don't know if
>there is a brutal unpublished or undiscovered attack on AES. However,
>we do know that any attack that breaks BBS must also factor integers.
>There are a potentially *infinite* number of ways to break AES. There
>is exactly one way to break BBS.

Note that there are potentially many ways to solve the factoring problem.
Some of them may involve solving the "BBS problem", although I don't
think that is a promising line of inquiry.
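
For readers who haven't seen it: the BBS generator is just repeated
squaring modulo a Blum integer, outputting the low bit each step. A
minimal Python sketch with toy parameters - real moduli are thousands
of bits:

    p, q = 10007, 10039          # toy Blum primes, both = 3 (mod 4)
    n = p * q

    def bbs_bits(seed, count):
        # seed should be coprime to n; square once so we start
        # at a quadratic residue
        x = (seed * seed) % n
        bits = []
        for _ in range(count):
            x = (x * x) % n      # x_{i+1} = x_i^2 mod n
            bits.append(x & 1)   # output the least significant bit
        return bits

    print(bbs_bits(123456, 16))

Solving the "BBS problem" means predicting these bits better than
guessing, which is where the factoring connection comes in.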

--
Kristian Gjøsteen
From: Peter Fairbrother on
Mok-Kong Shen wrote:
> Peter Fairbrother wrote:
>
>> Let's say a perfectly strong cipher cannot be broken by any method
>> easier than brute force.
>>
>> Any attack which needs less than brute force weakens the cipher.
>>
>> I assume that the first cipher is perfectly strong.
>>
>> If the second cipher does some of the work of a brute-force attack, then
>> the combo can be broken with less effort (on the part of the attacker)
>> than a brute-force attack.
>
> In a previous post (13.01) I wrote :"if one cascades two algorithms
> A and B and A and B are of different nature and the keys employed for
> the two are random and independent, then the combined strength is
> evidently greater than either of the components (assuming, of course,
> that neither component has strength 0)". Your second cipher is not
> 'independent' of the first, being specially adapted to the first, I
> would consider.

Possibly it's not independent, but that's not what you said - you said
they were "of different nature".

Does being of different nature *guarantee* independence? No.

Which is one reason, among many, why we don't automatically consider
double encryption to be *guaranteed* to be more secure than single
encryption, even with independent keying.

It's fairly easy to show that if the cipherings are independent (which
can happen even if the same cipher is used for both cipherings) then the
combo is likely [*] to be stronger - but it's very hard to show that the
cipherings are in fact independent.

For instance, we don't know *for sure* that triple DES is stronger than
single DES. It seems overwhelmingly likely, and we assume it is, but we
do not know for sure.
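
In the same vein: the reason we use triple DES rather than double DES
in the first place is the meet-in-the-middle attack, which with one
known plaintext/ciphertext pair recovers both keys of a two-cipher
cascade in roughly 2 x 2^k work instead of 2^(2k). A toy Python sketch,
with made-up 16-bit-block, 8-bit-key ciphers standing in for the real
thing:

    BLOCK = 65536                      # toy 16-bit block

    def enc(x, k):                     # made-up cipher, 8-bit key k
        return ((x + k) * (2 * k + 1)) % BLOCK

    def dec(y, k):                     # its inverse (Python 3.8+ modular inverse)
        return (y * pow(2 * k + 1, -1, BLOCK) - k) % BLOCK

    k1, k2 = 77, 200                   # the "unknown" independent keys
    pt = 12345
    ct = enc(enc(pt, k1), k2)          # the cascade

    # meet in the middle: 2^8 encryptions forward plus 2^8 decryptions
    # backward, instead of 2^16 joint key guesses
    forward = {enc(pt, a): a for a in range(256)}
    candidates = [(forward[dec(ct, b)], b)
                  for b in range(256) if dec(ct, b) in forward]
    print((k1, k2) in candidates)      # True; a few false positives may remain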


(
[*] at first glance it might seem that the combo will always be
stronger, rather than just likely to be - but there are a few catches.
Very long subject, and not immediately relevant, but to give an example:
suppose a cipher is designed so that it never outputs the plaintext as
ciphertext, no matter what plaintext or key is used. A combo might well
negate this property (a toy demonstration follows after this aside).

Whether it's a good property or not is a different question, as is
whether the negation means that the ciphers are not independent - to
answer which means you have to strictly define independence, which I
have tried to avoid doing.

And it's only an example - as I said, it's a very long subject!
)
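
To make the fixed-point example above concrete: let each component add
a nonzero key byte mod 256, so neither cipher ever maps a plaintext to
itself, yet for some (independently chosen) key pairs the cascade fixes
every point. A two-line toy in Python:

    def e(x, k):                 # toy cipher: never outputs x when k != 0
        return (x + k) % 256

    k1, k2 = 100, 156            # keys that happen to sum to 256
    assert all(e(e(x, k1), k2) == x for x in range(256))   # combo returns the plaintext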

> On the other hand, it seems at least intuitively strange why being
> the 'first' is important.

I can only refer you back to:

U. M. Maurer and J. L. Massey, Cascade ciphers: The importance of being
first, Journal of Cryptology, vol. 6, no. 1, pp. 55–61, 1993.

www.isiweb.ee.ethz.ch/archive/massey_pub/pdf/BI434.pdf

I don't feel it would be right for me to proselytize their theory, as I
disagree with it.

But there is some sense in it, and the order in which the ciphers are
employed most definitely can matter - that much I do agree with.

There's a demonstration of this pretty early on in the paper.

-- Peter Fairbrother
From: Peter Fairbrother on
Kristian Gjøsteen wrote:

> Consider the following two statements:
>
> We have no idea if there are unknown, efficient methods that
> solve the "AES problem" (ie how to distinguish the AES permutation
> from a random permutation).


To rephrase that somewhat: is there a method of breaking AES which on
average needs fewer resources than brute force?

And the answer is yes - but we (probably) don't know what it is.


(Existence proof: consider, for instance, the enormous precomputed
table mapping AES_k(0) to k for every key k. Given a single
chosen-plaintext query, it recovers the key with one lookup. Such a
method exists, and on average costs far less than brute force - we just
can't construct or store it.)

-- Peter Fairbrother
From: Thomas Pornin on
According to Tom St Denis <tom(a)iahu.ca>:
> You also can't sign with Rabin [directly].

Actually you can. Rabin himself published in 1979 two algorithms,
one for asymmetric encryption and one for signatures. For once,
the Wikipedia page is both concise and informative:
http://en.wikipedia.org/wiki/Rabin_signature_algorithm

Rabin signature verification is even faster than RSA signature
verification (a modular square vs a modular cube). The signature
generation is somewhat more complex, since for a given random padding,
the resulting value may fail to be a square; a naive implementation will
notice it when computing the square root, which entails a loop and
basically about four times the work (modulo n = pq, about one number in
4 is a square). A less naive implementation will use a Jacobi symbol
computation, which is faster than extracting a square root.

A Rabin signature is also slightly longer than an RSA signature, to
account for the extra padding. But that extra padding can be very short;
e.g., a single byte will be sufficient with overwhelming probability.
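
To illustrate the signing loop described above, here is a minimal
Python sketch of Rabin signing with a one-byte random pad. It assumes
p = q = 3 (mod 4) so square roots are cheap; the toy primes and the
hash-to-integer step are placeholders, not a real padding scheme:

    import hashlib, os

    p, q = 10007, 10039              # toy primes, both = 3 (mod 4)
    n = p * q

    def h(msg, pad):                 # placeholder hash-to-integer step
        return int.from_bytes(hashlib.sha256(msg + pad).digest(), 'big') % n

    def is_square(a, prime):         # Euler's criterion (a Jacobi symbol is faster)
        return pow(a, (prime - 1) // 2, prime) == 1

    def sign(msg):
        while True:                  # about 4 tries on average, as noted above
            pad = os.urandom(1)      # a single random byte of padding
            m = h(msg, pad)
            if is_square(m, p) and is_square(m, q):
                sp = pow(m, (p + 1) // 4, p)   # sqrt mod p, since p = 3 (mod 4)
                sq = pow(m, (q + 1) // 4, q)   # sqrt mod q
                s = (sp * q * pow(q, -1, p) + sq * p * pow(p, -1, q)) % n  # CRT
                return pad, s

    def verify(msg, pad, s):         # one modular squaring, vs RSA's cube
        return pow(s, 2, n) == h(msg, pad)

    pad, s = sign(b"hello")
    print(verify(b"hello", pad, s))  # True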


The real problem with Rabin signatures is that there is no equivalent to
PKCS#1. PKCS#1 makes RSA implementation easy: it tells where each byte
should go, without any ambiguity. Someone implementing Rabin signatures
must produce his own detailed specification, something which is rarely
done correctly, if at all.


--Thomas Pornin
From: unruh on
On 2010-01-18, Joseph Ashwood <ashwood(a)msn.com> wrote:
> "Simon Johnson" <simon.johnson(a)gmail.com> wrote in message
> news:6efb7e29-9634-4e1c-8f7f-ee61a7cafa37(a)v25g2000yqk.googlegroups.com...
>> Given the fact that Euclid,
>> Euler, Gauss, Riemann and a whole other bunch of pre-eminent
>> mathematicans were unable to find an efficient factoring algorithm
>> should give us the warm fuzzies about BBS.
>
> Given the fact that factoring is nearly polynomial. Given that the

How in the world is sub-exponential nearly polynomial?
exp(c (ln N)^(1/3) (ln ln N)^(2/3)) is not a polynomial in any
definition I know of. While it grows more slowly than an exponential in
the length (length = ln N), it is still far from polynomial. Now it
happens that for the lengths actually used, it does not increase
terribly fast, but that is NOT "nearly polynomial". Since you spend
most of this post telling others that they are just wrong, you should
make sure that you are not "just wrong" as well.
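
To put numbers on "far from polynomial", plug modulus sizes into the
commonly quoted GNFS heuristic L(N) = exp((64/9)^(1/3) (ln N)^(1/3)
(ln ln N)^(2/3)), ignoring the o(1) term and constant factors - a
back-of-envelope Python estimate only:

    from math import log

    def gnfs_work_bits(modulus_bits):
        # exponent of the heuristic GNFS cost, converted to bits of work
        ln_n = modulus_bits * log(2)
        work = (64 / 9) ** (1 / 3) * ln_n ** (1 / 3) * log(ln_n) ** (2 / 3)
        return work / log(2)

    for bits in (512, 1024, 2048):
        print(bits, round(gnfs_work_bits(bits)))   # ~64, ~87, ~117 bits of work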


> algorithms keep getting better. Given that hardware is quickly outpacing the
> normal limitations to factoring. Given that BBS is therefore nearly broken,
> noone should have warm fuzzies about BBS.

BBS with a modulus of length 5 was broken long, long ago. BBS with
length 10^5 is far from broken. Which are you talking about?


>
> So to summarize:
> BBS is nearly broken for usable key sizes

Key sizes which are "usable" scale with the speed and size of
computers, and much more favourably than does the breaking. So what are
you talking about?

> Your claim that it can be used for every portion is directly false.
> Joe
>