From: D Yuniskis on
Hi Oliver,

Oliver Betz wrote:
> Hello Don,
>
>>>> I am leary of encryption as, historically, it has always managed
>>> You mean "leery"? So you are biased due to lack of knowledge?
>> On the contrary, I've used encryption successfully in past
>> projects. I've also used obfuscation. The above statement
>> merely addresses the reality of encryption techniques
>> failing as technology improves. In the 1970's, crypt(1)
>> was strong enough to deter deliberate password attacks.
>> Nowadays, you can break an even more secure password
>> in minutes using hardware you buy at your local department
>> store.
>
> I didn't expect that the watermarking you are looking for would have
> to resist for decades. As far as I understand, the goal is to identify
> preproduction samples.

The goal is to identify the *source* of any counterfeit
products that find their way onto the market. When you are
looking at products with 5-10+ man-year development efforts,
it is *quite* attractive for a class II thief to take the
"cheap road" and just copy your design intact. Especially
with all the pac-rim houses operating currently -- where
developing "from scratch" something with more than a man-year
or two (software... forget the hardware, packaging, etc.
as that is pretty straightforward to copy) just doesn't make
sense in markets that move as fast as today's.

>> "Anyone with FIRST HAND experience deploying watermarking
>> technologies? Comments re: static vs dynamic techniques?"
>>
>> But, as is so often the case, instead of answering the
>> question *asked*, folks reply with "Why do you want to
>> do *that*? Why don't you do THIS instead?"
>
> I gave _exactly_ a solution for your request, and I have first hand
> experience (besides I didn't yet embed any individual IDs, but that's
> trivial).

I've gone back through all of your posts. All I see is the
idea of using encryption. Have I missed something?

From the examples you gave, you aren't using a secure processor.
Rather, just using encryption as an "opaque envelope" to allow
you to distribute your executables without them being easily
inspected. This is no different than allowing the device to
fetch its updates via SSL, etc. -- it just hides the content
of the message WHILE IN TRANSIT (e.g., you can put Update5.zip
on your web site for folks to download without worrying about
*who* is downloading it).

To include a watermark in each different executable, you would
have to either:
- ensure each user is only "given" an (encrypted) executable
with a watermarked image having his *unique* fingerprint
(if the user could get his hands on *another* encrypted
instance of the binary FINGERPRINTED FOR SOME OTHER USER,
then the value of the fingerprint/watermark is lost -- how
can you assert that *his* instance was the instance used
to produce the counterfeit since he could "load" anyone
else's encrypted instance [i.e., all devices share a common
crypto key])
- ensure each user (device) has a unique crypto key such that
his instance (again, avoiding the term "copy") of the encrypted
binary is accessible to him, alone (i.e., to avoid the ambiguity
mentioned above). In which case, there is no need for the
binary to be "watermarked" as the device itself is already
uniquely identifiable -- by the unique crypto key (one would
still want to avoid having identical binaries as that would
probably provide an avenue for differential cryptanalysis
on the part of an attacker).

And, as I said before, this just protects the binary "during
transport". It does nothing for the binary once it resides
*in* the device(s) in question. (e.g., the binary is already
in prototypes distributed "from the manufacturer" so what does
encryption buy them?)
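
To make the first option concrete, here's roughly what the
"fingerprint the image, then encrypt per customer" step looks like.
Just a sketch -- the reserved-block offset, the sizes and the names
are all invented, and a real fingerprint would want to be scattered
through the image rather than sitting in one obvious block:

#include <cstddef>
#include <cstdint>
#include <cstring>
#include <vector>

// Hypothetical layout: the build reserves 16 bytes of padding at a
// known offset; each customer's image gets a unique fingerprint
// written there before it is encrypted with that customer's key.
constexpr std::size_t kMarkOffset = 0x100;   // assumed reserved block
constexpr std::size_t kMarkSize   = 16;

void embed_fingerprint(std::vector<std::uint8_t>& image,
                       const std::uint8_t (&mark)[kMarkSize])
{
    std::memcpy(image.data() + kMarkOffset, mark, kMarkSize);
    // ...recompute whatever image checksum/CRC the loader expects,
    // then hand the buffer to the per-customer encryption step.
}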

>> As I've said in the past, "assume I am competent". Do you
>> require *your* employers to justify each decision *they*
>> make when it is handed down to you?
>
> Why should they "justify"? But I even dare to tell our customers if
> they want something different from what they need, or if their decision
> might be based on wrong assumptions.

When firms have a couple more commas in their income statements
than me, I tend to assume they are doing *something* right! :>
From: D Yuniskis on
Hi George,

George Neuner wrote:
> On Tue, 20 Apr 2010 15:42:13 -0700, D Yuniskis
> <not.going.to.be(a)seen.com> wrote:
>
>> ... in C++, you could probably play games with the definitions
>> of the vtables without the developer ever needing to know
>> that is happening (or, *will* happen after code "release").
>
> AFAIK, you can't easily mess with a vtable except in a leaf class. The
> order of functions is determined partly by standard (new is 1st,
> delete is 2nd, etc.) and partly by the total ordering of definitions
> as seen over the entire class hierarchy.

Yes. But, if you analyze the class hierarchies, you can
tweak even the vtables of the base classes as long as you ensure
each derived class is "compensated accordingly".

Recall, each time you swap a pair of vtable entries, you get
a bit of "fingerprint"/"watermark". With a fair number
of classes, you can quickly get hundreds of bits. (I'm not
sure how sparsely populated "watermark space" needs to be in
order to make it "non-trivial" for one watermarked image
to be converted to another... I guess it would depend on
the techniques being used). For preproduction prototypes,
I suspect 100 bits would be more than enough (assuming each
bit isn't trivial to "flip").
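
A hand-rolled stand-in for what the compiler hack would have to do
(the names, the single bit and the index constants are all made up
for illustration; in the real scheme the compiler would patch the
call sites itself):

#include <cstdio>

// Swapping a pair of "vtable" slots encodes one watermark bit; the
// index constants are what the (hypothetical) compiler hack would
// also patch into every call site so dispatch still resolves to the
// right method.
using Method = void (*)();

void start() { std::puts("start"); }
void stop()  { std::puts("stop");  }

constexpr bool kWatermarkBit = true;            // per-build choice

constexpr Method table[2] = { kWatermarkBit ? stop  : start,
                              kWatermarkBit ? start : stop  };
constexpr int START = kWatermarkBit ? 1 : 0;    // call-site indexes
constexpr int STOP  = kWatermarkBit ? 0 : 1;

int main()
{
    table[START]();     // always calls start(), whichever layout
    table[STOP]();      //   was chosen for this particular build
}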

> Also, there is no standard vtable implementation - some compilers use
> arrays, others use hash tables. And in an array implementation, it's
> possible for complicated classes to have more than 1 vtable.

Yes. This approach would require hacking the compiler
(i.e., *a* compiler). That makes it less attractive.
But, I don't see any way of manipulating tables that
the developer wouldn't otherwise be aware of.
From: Oliver Betz on
Hello Don,

[...]

> From the examples you gave, you aren't using a secure processor.

Well, your answer to my question:

|Have you any numbers about the cost to get the content of a flash
|microcontroller if it's "copy protection" is used? For example, we are
|using Freescale 9S08, S12, Coldfire V2 and I could also imagine to use
|a STM32.

...was somewhat vague. Therefore I wrote: "If you tell me what it
costs to get a Flash ROM image from one of these, we can continue the
effort / benefit discussion". This statement is still valid. If you
don't know how "secure" such a single chip device is, it makes no
sense to discuss its suitability.

BTW my question whether your product uses external memory at all is
also unanswered. External memory would rule out the encryption
approach.

[...]

>To include a watermark in each different executable, you would
>have to either:
>- ensure each user is only "given" an (encrypted) executable
>with a watermarked image having his *unique* fingerprint

This also applies to the other methods you discuss in this thread.

[...]

>>> As I've said in the past, "assume I am competent". Do you
>>> require *your* employers to justify each decision *they*
>>> make when it is handed down to you?
>>
>> Why should they "justify"? But I even dare to tell our customers if
>> they want something different from what they need, or if their decision
>> might be based on wrong assumptions.
>
>When firms have a couple more commas in their income statements
>than me, I tend to assume they are doing *something* right! :>

From my experience, the correlation between income or company size and
smartness is lower than many people might expect.

Oliver
--
Oliver Betz, Munich
despammed.com might be broken, use Reply-To:
From: George Neuner on
On Mon, 03 May 2010 14:14:59 -0700, D Yuniskis
<not.going.to.be(a)seen.com> wrote:

>George Neuner wrote:
>
>> Also, there is no standard vtable implementation - some compilers use
>> arrays, others use hash tables. And in an array implementation, it's
>> possible for complicated classes to have more than 1 vtable.
>
>Yes. This approach would require hacking the compiler
>(i.e., *a* compiler). That makes it less attractive.
>But, I don't see any way of manipulating tables that
>the developer wouldn't otherwise be aware of.

Definitely a compiler level hack - I don't see any simple way to do it
after the fact. Swapping vtable entries would also require changing
method call sites - the indexes or hash keys would need to match.

One thing I forgot to mention is that some [really good] C++ compilers
can incorporate the whole vtable hierarchy into each class so that the
class stands alone. This allows any method to be dispatched with a
single lookup regardless of the position of its implementing class in
the hierarchy.
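
A crude illustration of that flattened layout, with hand-written
function-pointer tables standing in for what the compiler actually
generates (names made up):

#include <cstdio>

// Every class's table carries a slot for every method in the
// hierarchy, so dispatch is always a single indexed load -- no
// walking of parent tables.
struct Obj;
using Method = void (*)(Obj*);

enum Slot { DRAW = 0, RESIZE = 1, SLOT_COUNT };

struct Obj { const Method* vtbl; };

void base_draw(Obj*)   { std::puts("Base::draw");    }
void base_resize(Obj*) { std::puts("Base::resize");  }
void deriv_draw(Obj*)  { std::puts("Derived::draw"); }

// Both tables are complete: Derived overrides draw and re-uses the
// inherited resize implementation in its own table.
constexpr Method base_vtbl[SLOT_COUNT]  = { base_draw,  base_resize };
constexpr Method deriv_vtbl[SLOT_COUNT] = { deriv_draw, base_resize };

int main()
{
    Obj d{deriv_vtbl};
    d.vtbl[DRAW](&d);      // one lookup -> Derived::draw
    d.vtbl[RESIZE](&d);    // one lookup -> inherited Base::resize
}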

George
From: George Neuner on
On Mon, 03 May 2010 14:08:38 -0700, D Yuniskis
<not.going.to.be(a)seen.com> wrote:

>I'm just hoping to find something that can
>be done "alongside" the development instead of injecting
>something into the process. Writing clear code is hard
>enough for most folks. Adding deliberate obfuscation
>just makes it that much more fragile and vulnerable to
>error. We used to call the "tamper-proofing" activity
>"RE-bugging" -- an indication of how hard it was to
>get it right -- and always did it *after* the executable
>was known to operate correctly... *two* test cycles! :<
>Obviously, if you can come up with a scheme that can be
>surreptitiously used *during* development, then the
>developer can actually debug "production code" instead of
>having to add this second "post-processing" step.

I write and hack compilers for fun, I've hacked them for business, and
I was part of a team that wrote a compiler for a user-programmable
DSP+FPGA signal/image processing PC board. Hacking compilers is
tricky business and it is all too easy to wind up with an unreliable
tool.

I understand the reluctance to mess with a working executable, but I
think adding a post production step with a custom tool is preferable
to mucking with compilers. Maybe I've been blessed by good people,
but IME it isn't all that hard to get someone to commit to using a
provided macro or template system. Trusting them to extend or
maintain it is a different issue, but again, IME it hasn't been a big
deal.
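
The sort of helper I mean is something like this (a sketch only --
WATERMARK_BITS as a per-build -D define and the Marked<> name are
inventions, and whether a given optimizer preserves the difference
between the two sequences is something you'd have to check per
compiler):

#include <cstdint>

// Marked<Index>::add() picks one of two equivalent code sequences
// based on bit <Index> of a build-time constant, so each customer's
// build differs in harmless ways while computing the same results.
#ifndef WATERMARK_BITS
#define WATERMARK_BITS 0xA5u    // unique per customer/build
#endif

template <unsigned Index,
          bool Bit = (((WATERMARK_BITS >> Index) & 1u) != 0)>
struct Marked {
    static std::uint32_t add(std::uint32_t a, std::uint32_t b)
    { return a + b; }
};

template <unsigned Index>
struct Marked<Index, true> {
    // same result via carry-save form; different generated code
    static std::uint32_t add(std::uint32_t a, std::uint32_t b)
    { return (a ^ b) + ((a & b) << 1); }
};

// usage:  total = Marked<3>::add(total, sample);

The developers just use the helper everywhere; which bit pattern a
given build carries never has to be visible to them.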


I've never needed to obfuscate an executable, but I've dealt with very
flexible and adaptive programs and I understand your issues with other
developers and reliability.

I was principal developer for 3 different lines of industrial QA apps,
2 of which are FDA approved for food and pharmaceutical production as
well as general industrial use. These apps were developed and
maintained by teams of 3-8 people over the 10 years I was involved
with them. These programs have hundreds of runtime options: for
equipment enumeration and interfacing, for customizing the operator
UI, for inspection tuning, performance tuning, logging, security, etc.
Despite nearly every operation being conditional, they are still
required to have near perfect inspection reliability (zero false
negatives, less than 0.1% false positives) and 99.9% uptime.

[Knock wood, I've never once had a production system crash in the
field due to my software. Hardware reliability I can't control, but
my software can be as perfect as my manager allows. 8-)

The pharma apps are my masterpieces, but my claim to fame is compact
discs. If you bought any kind of pre-recorded CD or DVD - music,
game, program, etc. - between 1995 and 2001, the odds are about 50%
that it passed through one of my QA apps during production.

George