From: unruh on
On 2010-01-18, Peter Fairbrother <zenadsl6186(a)zen.co.uk> wrote:
> Mok-Kong Shen wrote:
>> Peter Fairbrother wrote:
>>
>
>> On the other hand, it seems at least intuitively strange why being
>> the 'first' is important.
>
> I can only refer you back to:
>
> U. M. Maurer and J. L. Massey, Cascade ciphers: The importance of being
> first, Journal of Cryptology, vol. 6, no. 1, pp. 55-61, 1993.
>
> www.isiweb.ee.ethz.ch/archive/massey_pub/pdf/BI434.pdf
>
> I don't feel it would be right for me to proselytize their theory, as I
> disagree with it.
>
> But there is some sense in it, and the order in which the ciphers are
> employed most definitively can matter - that much I do agree with.
>
> There's a demonstration of this pretty early on in the paper.

The argument they present is essentially the one I presented a while
ago, namely that if the cascade weakened the cypher then the attacker
could add his own cascade to the encrypted material and thus have an
attack on the cypher, meaning it was that weak to start off with. But
the key assumption they make is that the cascade adds a negligible
amount of work, i.e. that C2(C1(M,k1),k2) takes essentially the same
amount of work as C1(M,k1) alone. Your "counterexamples" have C2 take
much more work than C1.
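
To make the construction concrete, here is a minimal Python sketch of
the cascade C2(C1(M,k1),k2) with independently chosen keys. The two
"ciphers" are toy XOR keystreams built from SHA-256 in counter mode,
and the function names are mine; this only illustrates the composition
and its cost, not real security.

import hashlib

def keystream(key: bytes, label: bytes, n: int) -> bytes:
    # Toy keystream: SHA-256 of (key, label, counter), truncated to n bytes.
    out = bytearray()
    ctr = 0
    while len(out) < n:
        out += hashlib.sha256(key + label + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return bytes(out[:n])

def c1(msg: bytes, k1: bytes) -> bytes:
    # First cipher: XOR with a keystream derived from k1.
    ks = keystream(k1, b"C1", len(msg))
    return bytes(a ^ b for a, b in zip(msg, ks))

def c2(msg: bytes, k2: bytes) -> bytes:
    # Second cipher: XOR with an independent keystream derived from k2.
    ks = keystream(k2, b"C2", len(msg))
    return bytes(a ^ b for a, b in zip(msg, ks))

def cascade(msg: bytes, k1: bytes, k2: bytes) -> bytes:
    # The cascade costs roughly the work of C1 plus C2; the "negligible
    # extra work" assumption above is that this is comparable to C1 alone.
    return c2(c1(msg, k1), k2)

ct = cascade(b"attack at dawn", b"key one", b"key two")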


From: Mok-Kong Shen on
Peter Fairbrother wrote:
> Mok-Kong Shen wrote:

>> In a previous post (13.01) I wrote :"if one cascades two algorithms
>> A and B and A and B are of different nature and the keys employed for
>> the two are random and independent, then the combined strength is
>> evidently greater than either of the components (assuming, of course,
>> that neither component has strength 0)". Your second cipher is not
>> 'independent' of the first, being specially adapted to the first, I
>> would consider.
>
> Possibly it's not independent, but that's not what you said - you said
> they were "of different nature".
>
> Does being of different nature *guarantee* independence? No.
>
> Which is one reason, among many, why we don't automatically consider
> double encryption to be *guaranteed* to be more secure than single
> encryption, even with independent keying.
>
> It's fairly easy to show that if the cipherings are independent (which
> can happen even if the same cipher is used for both cipherings) then the
> combo is likely [*] to be stronger - but it's very hard to show that the
> cipherings are in fact independent.
>
> For instance, we don't know *for sure* that triple DES is stronger than
> single DES. It seems overwhelmingly likely, and we assume it is, but we
> do not know for sure.

Yes. Indeed there are problems of 'definitions'. What constitutes
'different nature'? That's ambiguous. Also, as I mentioned, determining
e.g. the 'independence' of keys seems impossible in the absolute
(pedantic) sense. What is workable in practice is probably some
'practical' probabilistic sense or, more plainly, common sense. In that
sense, maybe DES and AES could be considered independent (or of
different nature), I think.

M. K. Shen
From: Joseph Ashwood on
"unruh" <unruh(a)wormhole.physics.ubc.ca> wrote in message
news:slrnhl9g5v.ok8.unruh(a)wormhole.physics.ubc.ca...
> On 2010-01-18, Joseph Ashwood <ashwood(a)msn.com> wrote:
>> "Simon Johnson" <simon.johnson(a)gmail.com> wrote in message
>> news:6efb7e29-9634-4e1c-8f7f-ee61a7cafa37(a)v25g2000yqk.googlegroups.com...
>>> Given the fact that Euclid,
>>> Euler, Gauss, Riemann and a whole other bunch of pre-eminent
>> mathematicians were unable to find an efficient factoring algorithm
>>> should give us the warm fuzzies about BBS.
>>
>> Given the fact that factoring is nearly polynomial. Given that the
>
> How in the world is sub-exponential nearly polynomial?

A reasonable argument is:
> exp(ln(N)^(1/3)) is not a polynomial in any definition I know of.
> While it is slower than exponential in the length( length=ln(N)) it is
> still far from polynomial. Now it happens that for the lengths actually
> used, it does not increase terribly fast, but that is NOT "nearly
> polynomial"

Based on that it is not polynomial, but it is nearly polynomial. The
category of sub-exponential/super-polynomial is relatively narrow, and quite
close to polynomial. The rate of advance on the asymptotic is very slow, and
most of the recent advances run out of steam fairly quickly, but the
asymptotics dominate rather quickly at the lengths already in use. As
progress inevitably happens on the factoring problem, one of two things will
happen: either the lower bound will rise (it is currently only linear) or
the upper bound will fall (currently too complex to write here). Either of
these would help in really pinning down the complexity. Right now, though,
factoring is nearly polynomial.
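
As a rough illustration of how the sub-exponential asymptotic behaves at
the sizes under discussion, here is a small Python sketch that evaluates
the usual GNFS heuristic L_N[1/3, c] = exp(c (ln N)^(1/3) (ln ln N)^(2/3))
with c = (64/9)^(1/3), dropping the o(1) term, for a few modulus lengths.
The output is only a ballpark asymptotic, not an operation count or a
cost model.

import math

def gnfs_log2_cost(bits: int) -> float:
    # L_N[1/3, (64/9)^(1/3)] with the o(1) term ignored, returned as a
    # power of two so growth with modulus size is easy to compare.
    ln_n = bits * math.log(2)
    c = (64 / 9) ** (1 / 3)
    ln_cost = c * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3)
    return ln_cost / math.log(2)

for bits in (768, 1024, 10240):
    print(f"{bits:>6}-bit modulus: roughly 2^{gnfs_log2_cost(bits):.0f} work")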

>> algorithms keep getting better. Given that hardware is quickly outpacing
>> the
>> normal limitations to factoring. Given that BBS is therefore nearly
>> broken,
>> no one should have warm fuzzies about BBS.
>
> BBS with modulus of length 5 was broken long long ago. BBS with length
> 10^5 is far from broken. Which are you talking about?

I'm not convinced 10kbit factoring is "far from broken." Certainly it would
be substantially more difficult than the 768-bit factoring recently
completed, but it is not that far out of reach. The 768-bit factorization
was done publicly; the computing farms of major corporations are notably
larger than the one used for that effort. They are not yet the million or so
times more powerful that would be needed, but the margin is not wide enough
for my taste.

Beyond this the usability factor is important. BBS with a 1024-bit modulus
is very slow, a few kbyte/sec (each squaring only gives 1 bit), which is
already too slow for most purposes. Increasing the modulus to 10kbit would
make it unusable in most circumstances.
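
For reference, a minimal sketch of the generator being discussed: the BBS
state is squared modulo N at every step and only the least significant
bit is emitted, which is why the throughput is so low. The tiny Blum
primes below are for illustration only; a real modulus would be 1024 bits
or more.

def bbs_bits(seed: int, n: int, count: int):
    # Blum-Blum-Shub: x -> x^2 mod n, emit the low bit of each state.
    x = (seed * seed) % n          # start from a quadratic residue mod n
    for _ in range(count):
        x = (x * x) % n            # one modular squaring per output bit
        yield x & 1

# Toy Blum primes (both prime and congruent to 3 mod 4), far too small
# for any real use, chosen only so the example runs instantly.
p, q = 11, 19
n = p * q
print(list(bbs_bits(seed=3, n=n, count=16)))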

>> So to summarize:
>> BBS is nearly broken for usable key sizes
>
> Key sizes which are "useable" scales with the speed and size of
> computers, and much more favourably than does the breaking. So what are
> you talking about?

I don't find even 1024-bit BBS's few kbyte/sec to be usable in most
circumstances, and 1024 bits is borderline right now, within reach of major
corporations within a few years. There is also the problem that, after
compensating for the growth of file sizes, the overall complexity of
en/decryption may be growing more quickly than the complexity of the
attack. File sizes, and with them the en/decryption cost, have increased
dramatically, and this drags on the ability to grow the modulus size. I
don't have the exact metrics in front of me, so I don't know whether this is
enough to completely negate the asymptotic change, but it will affect the
ability to use BBS and should be considered seriously.
Joe

From: Paul Rubin on
Peter Fairbrother <zenadsl6186(a)zen.co.uk> writes:
> To rephrase that somewhat: is there a method of breaking AES which on
> average needs less resources than brute force?
>
> And the answer is yes - but we (probably) don't know what it is.

Why do you think the answer is yes?
From: David Eather on
Joseph Ashwood wrote:
> "unruh" <unruh(a)wormhole.physics.ubc.ca> wrote in message
> news:slrnhl9g5v.ok8.unruh(a)wormhole.physics.ubc.ca...
>> On 2010-01-18, Joseph Ashwood <ashwood(a)msn.com> wrote:
>>> "Simon Johnson" <simon.johnson(a)gmail.com> wrote in message
>>> news:6efb7e29-9634-4e1c-8f7f-ee61a7cafa37(a)v25g2000yqk.googlegroups.com...
>>>
>>>> Given the fact that Euclid,
>>>> Euler, Gauss, Riemann and a whole other bunch of pre-eminent
>>>> mathematicians were unable to find an efficient factoring algorithm
>>>> should give us the warm fuzzies about BBS.
>>>
>>> Given the fact that factoring is nearly polynomial. Given that the
>>
>> How in the world is sub-exponential nearly polynomial?
>
> A reasonable argument is:
>> exp(ln(N)^(1/3)) is not a polynomial in any definition I know of.
>> While it is slower than exponential in the length( length=ln(N)) it is
>> still far from polynomial. Now it happens that for the lengths actually
>> used, it does not increase terribly fast, but that is NOT "nearly
>> polynomial"
>
> Based on that it is not polynomial, but it is nearly polynomial. The
> category of sub-exponential/super-polynomial is relatively narrow, and
> quite close to polynomial. The rate of advance on the asymptotic is very
> slow, and most of the recent advances run out of steam fairly quickly,
> but the asymptotics dominate rather quickly at the lengths already in
> use. As progress inevitably happens on the factoring problem, one of two
> things will happen: either the lower bound will rise (it is currently
> only linear) or the upper bound will fall (currently too complex to
> write here). Either of these would help in really pinning down the
> complexity. Right now, though, factoring is nearly polynomial.
>
>>> algorithms keep getting better. Given that hardware is quickly
>>> outpacing the
>>> normal limitations to factoring. Given that BBS is therefore nearly
>>> broken,
>>> no one should have warm fuzzies about BBS.
>>
>> BBS with modulus of length 5 was broken long long ago. BBS with length
>> 10^5 is far from broken. Which are you talking about?
>
> I'm not convinced 10kbit factoring is "far from broken." Certainly it
> would be substantially more difficult than the 768-bit factoring
> recently completed, but it is not that far out of reach. The 768-bit
> factorization was done publicly; the computing farms of major
> corporations are notably larger than the one used for that effort. They
> are not yet the million or so times more powerful that would be needed,
> but the margin is not wide enough for my taste.
>
> Beyond this the usability factor is important. BBS with a 1024-bit
> modulus is very slow, a few kbyte/sec (each squaring only gives 1 bit),
> which is already too slow for most purposes. Increasing the modulus to
> 10kbit would make it unusable in most circumstances.
>
>>> So to summarize:
>>> BBS is nearly broken for usable key sizes
>>
>> Key sizes which are "useable" scales with the speed and size of
>> computers, and much more favourably than does the breaking. So what are
>> you talking about?
>
> I don't find even 1024-bit BBS's few kbyte/sec to be usable in most
> circumstances, and 1024 bits is borderline right now, within reach of
> major corporations within a few years. There is also the problem that,
> after compensating for the growth of file sizes, the overall complexity
> of en/decryption may be growing more quickly than the complexity of the
> attack. File sizes, and with them the en/decryption cost, have increased
> dramatically, and this drags on the ability to grow the modulus size. I
> don't have the exact metrics in front of me, so I don't know whether
> this is enough to completely negate the asymptotic change, but it will
> affect the ability to use BBS and should be considered seriously.
> Joe

You could always hash the state of the BBS and use the hash output for a
speedup.
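
A minimal sketch of that suggestion, assuming the hash can be treated as
a random function: square the BBS state as usual, but emit a hash of the
whole state at each step, giving about 32 bytes per squaring instead of
one bit. Note that this trades the original BBS security reduction for a
heuristic argument about the hash.

import hashlib

def hashed_bbs(seed: int, n: int, blocks: int):
    # Iterate x -> x^2 mod n as in plain BBS, but output SHA-256 of the
    # whole state instead of a single bit per squaring.
    x = (seed * seed) % n
    byte_len = (n.bit_length() + 7) // 8
    for _ in range(blocks):
        x = (x * x) % n
        yield hashlib.sha256(x.to_bytes(byte_len, "big")).digest()

# Toy parameters only; a real modulus would be a large Blum integer.
n = 11 * 19
stream = b"".join(hashed_bbs(seed=3, n=n, blocks=4))
print(stream.hex())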