From: Lizanne Pires on
clear all
pack all
close all;
clc;
i = imread('cameraman.tif');
imshow(i),title('original image');
x = im2double(i);
ir = reshape(x,1,(256*256));
k =0.0002; %scaling factor
watermark1 = [ 0 0 1 1 1 1 0 0, %original watermark
0 1 0 0 0 0 1 0,
0 0 0 0 0 0 0 0,
1 0 1 0 0 0 0 1,
1 0 1 0 0 0 0 1,
1 0 1 1 1 1 0 1,
0 1 0 0 0 0 1 0,
0 0 0 0 0 0 0 0];
x = im2double(i);
watermarkfinal= reshape(watermark1,1,64);
irfinal =0;
seq = 0;
pn = 0;

for l = 1:64
rand('seed',l) %generating pn sequence
pn = rand(256);
seq=reshape(pn,1,65536);
if watermarkfinal(1,l) == 0
duck= k * seq;
irfinal = duck + ir;
else
duck = seq;
irfinal = duck +ir;
end
end

final = reshape(irfinal,256,256);
figure,imshow(final),title('Watermarked image');

%detection---

finalun = reshape(final,1,65536);
varsim = 0;

for l = 1:64
rand('seed',l)
pn1 = rand(256);
seq1=reshape(pn1,1,65536);

sim(l) = (seq1 .* finalun)/sqrt(seq1.*seq1); % similarity function to calculate correlation

meansim = mean(sim(l));
if meansim>= 0.4655
varsim(l) = 1;
else
varsim(l) = 0;

end
end

varsim1 = reshape(varsim,8,8);
figure,imshow(varsim1),title('extracted watermark');
figure,imshow(watermark1),title('original watermark');

above is the code for CDMA spread spectrum method for watermarking of images
according to the link http://arxiv.org/ftp/arxiv/papers/0909/0909.3554.pdf.
we would be grateful if anyone can give us an idea on how to improve the code so tha the extracted watermark and original watermark are the same.
Thank you



From: Walter Roberson on
Lizanne Pires wrote:

> above is the code for CDMA spread spectrum method for watermarking of
> images

The above statement rings bogosity bells for me.


CDMA just means Carrier Detect Multiple Access. CDMA describes any networked
system in which there are data access points connected to a common "in use"
sensing mechanism. CDMA says nothing about the physical layer describing the
kind of wiring (if wire is even used!) nor about the data signaling mechanism,
nor about the encoding of bauds of transmission. For example, 10 and 100
megabit ethernet over CAT4 / 5/ 6 cables transmit data using a 6/5 encoding of
bits into 12.5 and 125 megabit per second asynchronous pulse trains
(respectively), but gigabit ethernet and 10 gigabit ethernet use trellis
encoding (32/24 is what springs to mind at the moment) of four simultaneous
synchronous channels operating at different frequencies with nominal transfer
rates of 125 megabits and 1250 megabits per second (respectively).

Thus, it is impossible to construct a single algorithm that will transform
data into "spread spectrum" over all possible CDMA systems. Especially
considering that gigabit and higher ethernet automatically choose trellis
constellations with the aim of spreading the spectrum.

You can implement CDMA using sets of flutes (for the carrier) and drums (for
the data transmission) with Morse Code as your data encoding mechanism. How
your algorithm is going to manage to get "spread spectrum" out of that is
beyond me.