From: Kristian Gjøsteen on
Simon Johnson <simon.johnson(a)gmail.com> wrote:
>AES is provably secure against a limited set of attacks. A new attack
>could break it tomorrow. With BBS there is only one attack: factoring.
>It's true that somebody could devise a much faster factoring algorithm
>but there is strong circumstantial evidence suggesting that this is
>unlikely.
>
>At any rate, the case is much, much stronger than the comparable case
>for AES.
>
>I don't think BBS gets enough love in the crypto community. It's very
>simple and very secure; it's a truly beautiful algorithm.

Suppose we can factor a 2048-bit modulus in x years today. Suppose that,
tomorrow, I discover how to break BBS with a 2048-bit modulus in y years,
and y < x. How fast can we factor a 2048-bit modulus tomorrow?

--
Kristian Gjøsteen
From: Simon Johnson on
> 1. Why should we expect factoring to be harder than breaking block
> ciphers like AES?  Factoring is generally believed to not be NP-hard and
> someone might discover a P-time factoring algorithm tomorrow.  

It probably doesn't actually matter which complexity class factoring
is in. The thing to remember about big-oh notation is that it measures
the cost as the problem size tends to infinity. What really matters is
the cost at the problem sizes we're actually using!

For example, a few years ago a prime-proving algorithm was shown to
have a run-time complexity of n^12. From a big-oh point of view, that
makes it the fastest algorithm yet discovered for verifying primes.
Yet, it made practically no dent on cryptography. Everyone choosing
primes for RSA stuck with the Rabin-Miller test and variants thereof.

Why? Because at the problem sizes required for cryptography, Rabin-
Miller's complexity is substantially smaller than the n^12. Yes,
eventually it loses out to the n^12 algorithm, but that happens at
problem sizes way too big for cryptography [1].
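
For anyone who hasn't seen it, here is a minimal sketch of the
probabilistic Rabin-Miller test in Python (illustrative only; the
round count is arbitrary):

import random

def rabin_miller(n, rounds=40):
    """Return False if n is composite, True if n is probably prime
    (error probability at most 4**-rounds)."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):
        if n % p == 0:
            return n == p
    d, s = n - 1, 0
    while d % 2 == 0:      # write n - 1 as 2^s * d with d odd
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False   # a witnesses that n is composite
    return True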

Let's say for the sake of argument that you're right. Factoring *is*
in P, and it has a complexity identical to that prime-proving
algorithm's: n^12. BBS would still be completely secure, because this
algorithm would be slower than GNFS on the moduli we're trying to
factor [2].

It would be brilliant if somebody showed factoring to be in P, but,
like the prime-proving algorithm above, its impact on cryptography
could be extremely limited.

> Factoring
> is in NP/\co-NP, a more constricted complexity class than typical block
> ciphers, which don't have to be in co-NP.

Has anyone proved that AES belongs to NP? Has anyone proved this for
*any* block cipher? Who's to say they aren't really in P, like BBS
could be?

Really, that was the point of my original post. If you don't care
about speed, you can spend CPU cycles reducing the number of security
assumptions. AES could be felled by attacks that haven't even been
invented yet. However, thanks to BBS's reduction to factoring, we know
the only way BBS can be broken is by a radical advance in factoring.

I think that conclusively shows that BBS has a better proof of
security than AES, does it not?

>
> 2. Why is it astonishing that we know BBS reduces to factoring but we
> don't know the same for RSA?  BBS doesn't try to be a trapdoor function
> like RSA.  RSA performs a much harder trick and I'd think it
> unsurprising that its complexity is harder to analyze.

By astonishing I meant: "Wow, it has a better proof of security than
RSA." I didn't mean that I'm astonished it reduces to factoring.

Besides, you *can* use BBS as a trapdoor. You can publish N to the
world and anyone can encrypt using that modulus. They can send you the
ciphertext, and if you know the factors you can recover the
plaintext.
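
That is essentially the Blum-Goldwasser cryptosystem. Here is a toy
sketch in Python (the primes are far too small to be secure; a real
implementation needs large random Blum primes and proper message
encoding; pow(x, -1, m) needs Python 3.8+):

p, q = 499, 547    # toy Blum primes, both congruent to 3 mod 4
N = p * q          # the public modulus; p and q are the trapdoor

def keystream(x, t):
    """Emit t least-significant bits from successive BBS states and
    return the state that follows them."""
    bits = []
    for _ in range(t):
        bits.append(x & 1)
        x = pow(x, 2, N)
    return bits, x

def encrypt(msg_bits, seed):
    x1 = pow(seed, 2, N)    # first state: a square modulo N
    ks, x_final = keystream(x1, len(msg_bits))
    return [m ^ k for m, k in zip(msg_bits, ks)], x_final

def decrypt(cipher, x_final):
    """Knowing p and q lets us run the squaring map backwards."""
    t = len(cipher)
    rp = pow(x_final, pow((p + 1) // 4, t, p - 1), p)
    rq = pow(x_final, pow((q + 1) // 4, t, q - 1), q)
    x1 = (rp * q * pow(q, -1, p) + rq * p * pow(p, -1, q)) % N  # CRT
    ks, _ = keystream(x1, t)
    return [c ^ k for c, k in zip(cipher, ks)]

msg = [1, 0, 1, 1, 0, 0, 1, 0]
c, x_final = encrypt(msg, seed=123456)
assert decrypt(c, x_final) == msg

Anyone can encrypt knowing only N; recovering the keystream from
x_final requires taking modular square roots, which is as hard as
factoring N.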

You can actually build an entire cryptosystem around a single
security assumption: factoring is hard. It's possible to protect
privacy, do key exchange and produce signatures based on that one
axiom.

Compare that to the number of assumptions required to ensure my TLS
session with Gmail is secure:

* Discrete logs are hard.
* There is no short-cut to the Diffie-Hellman problem.
* AES is secure.
* SHA-256 is secure.

If any one of these assumptions is false then the security of the
session is compromised.

This is why reduction proofs are important. They allow us to increase
security, at the expense of speed, by reducing the number of eggs in
our basket.

Cheers,

Simon

[1] I'm in danger of being called out for comparing apples with
oranges. The n^12 algorithm proves that numbers are prime, while
Rabin-Miller is generally only used to find numbers that are prime
with very high probability. That said, I don't think it damages my
argument too much.

[2] This may be wrong, since I've not worked the numbers out
carefully. But a crude check with the standard heuristic GNFS running
time, exp((64/9)^(1/3) (ln N)^(1/3) (ln ln N)^(2/3)), constants
ignored, suggests it is the case:
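
from math import log, log2

n = 2048                      # modulus size in bits
lnN = n * log(2)              # ln N for N ~ 2^2048

poly_bits = 12 * log2(n)      # hypothetical n^12 algorithm
gnfs_bits = ((64 / 9) ** (1 / 3) * lnN ** (1 / 3)
             * log(lnN) ** (2 / 3)) / log(2)

print("n^12 : 2^%.0f" % poly_bits)    # 2^132
print("GNFS : 2^%.0f" % gnfs_bits)    # about 2^117

So at 2048 bits the hypothetical n^12 algorithm would cost about 2^132
operations against roughly 2^117 for GNFS; it only overtakes GNFS at
much larger moduli.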
From: Simon Johnson on

> Also, speed/efficiency DOES matter.  Crypto is used in places where
> 4000MIPS processors are not common :-)

If you read the post carefully, I do pay homage to speed. Efficiency
matters a great deal in a great number of applications.

However, if you decide to completely ignore this important constraint
you can get a much better proof of security than is otherwise
possible.

Let me give you a particularly extreme example. I need to start by
defining the threat model:

Alice and Bob want to communicate privately over the Internet. Alice
and Bob agree that once somebody is willing to break into their
house, the game is up. Somebody can bug their computer, rubber-hose
them and who knows what else? Their privacy will not withstand a home
invasion.

Therefore, the principal attack vector is Eve. Eve is a cracker who
wishes to compromise their secure communications, but the only way she
can do this is by attacking Alice and Bob's machines over the Internet.

Is there a cipher that is unconditionally secure that defeats Eve?
Surprisingly, the answer is yes. There is a completely unbreakable
cipher that can protect their privacy.

The key to seeing this is to realise that there is an asymmetry
between the bandwidth of Alice and Bob's hard disks and Eve's
bandwidth to the machines she wants to attack. We can use this
asymmetry to define an instance of Maurer's cipher [1] that is
unconditionally secure.

For those who don't want to read the original paper: Maurer's cipher
involves generating a giant, non-secret randomiser. The secret key is
simply a pointer into the randomiser. Security is perfect provided
the attacker can't access a significant proportion of the randomiser.
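
A minimal sketch of the single-pointer version in Python (my own toy
rendering, not Maurer's exact construction; it assumes both parties
share a large randomiser file and that reads never run off its end):

def keystream(randomiser_path, pointer, length):
    """Read keystream bytes starting at the secret pointer. The
    randomiser file itself is public; only the pointer is secret."""
    with open(randomiser_path, "rb") as f:
        f.seek(pointer)
        return f.read(length)

def encrypt(plaintext, randomiser_path, pointer):
    ks = keystream(randomiser_path, pointer, len(plaintext))
    return bytes(x ^ k for x, k in zip(plaintext, ks))

# Decryption is the same operation: XOR the ciphertext with the
# keystream read from the same secret pointer.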

Suppose that Alice and Bob agree on a randomiser of 31.5 terabytes.
Also assume that the upstream bandwidth of each of their machines is
500 kB per second. If Eve were able to read the randomiser on both
machines simultaneously, she could download at most 1 MB per second.
It would take her a shade under a year to get a complete copy of the
randomiser.
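
Checking the arithmetic:

randomiser = 31.5e12              # bytes in the randomiser
eve_rate = 2 * 500e3              # both machines leaking at 500 kB/s
days = randomiser / eve_rate / 86400
print(days)                       # ~364.6: a shade under a year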

If you rotated the randomiser every six months, Eve could never get
enough of it to break a message, even in principle.

Now, you might ask why exactly you would prefer this construction
over the one-time pad (OTP). After all, you have to generate 31.5
terabytes of random data every six months for this to be usable.

The answer is that the OTP is much more brittle than Maurer's cipher.
Compromise of the pad means compromise of security. However, with
Maurer-style constructions, compromise of the randomiser does not
mean compromise of the plaintext.

Suppose that, to increase the key length, you had two pointers instead
of one and you XORed the bytes at those pointers to produce the key-
stream. We already know that this construction is very resistant to
attack [2]. Even if Eve got a complete copy of the randomiser,
computing a table of all possible XORed pointers for a pad of this
size would require more storage than has ever been created in the
history of humanity.
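
The two-pointer variant is an equally small change to the sketch
above:

def keystream2(randomiser_path, ptr_a, ptr_b, length):
    """XOR the bytes at two secret pointers to form the keystream,
    squaring the effective key space for the same randomiser."""
    with open(randomiser_path, "rb") as f:
        f.seek(ptr_a)
        a = f.read(length)
        f.seek(ptr_b)
        b = f.read(length)
    return bytes(x ^ y for x, y in zip(a, b))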

This is a deployable, real-world cipher that could be used today and
that is completely secure against a very realistic threat model. It is
completely secure against passive attackers, and it degrades
gracefully to the security of a normal cipher when the randomiser is
fully known. That is probably the strongest security proof of any
cipher to date.

The only drawback is that it requires 31.5 terabytes of random data
every six months :)

My point is that if you *decide* efficiency isn't important to you,
you can suddenly do a much better job of getting provable security.

Cheers,

Simon

[1] - http://www.springerlink.com/content/pj8k74e6gtwrvxkh/
[2] - http://www.schneier.com/paper-maurer-stream.ps.gz



From: Kristian Gjøsteen on
Simon Johnson <simon.johnson(a)gmail.com> wrote:
>For example, a few years ago a prime-proving algorithm was shown to
>have a run-time complexity of n^12. From a big-oh point of view, that
>makes it the fastest algorithm yet discovered for verifying primes.
>Yet, it made practically no dent on cryptography. Everyone choosing
>primes for RSA stuck with the Rabin-Miller test and variants thereof.
>
>Why? Because at the problem sizes required for cryptography, Rabin-
>Miller's complexity is substantially smaller than the n^12. Yes,
>eventually it loses out to the n^12 algorithm, but that happens at
>problem sizes way too big for cryptography [1].

You're wrong. Miller-Rabin is O(n^5), hence faster than AKS for
(probably) all n, and certainly asymptotically. The big idea about
AKS is that it is deterministic, while Miller-Rabin can only be made
deterministic if you assume the GRH.

>[...]
>Has anyone proved that AES belongs to NP?

Huh? AES in NP?

>[...]
>This is why reduction proofs are important. They allow us to increase
>security, at the expense of speed, by reducing the number of eggs in
>our basket.

I would rather advance an economic argument for provable security.

We have available a fixed amount of analysis time. We could spend it
directly on cryptosystem analysis, but that would either give very few
cryptosystems, or very little analysis time per cryptosystem. Or we
can spend most of the analysis time on a small collection of problems,
and then spend the remaining time to build cryptosystems that reduce
to those problems. Building a provably secure cryptosystem is really
quite cheap, compared to analysing a hard problem.

--
Kristian Gjøsteen
From: Phoenix on
Well, is encrypting twice much more secure?

Yes or no?

Alvo