From: Baar Boris on
Hello!
I want to take a picture of a uniformly colored (white) surface from a known distance (my webcam is perpendicular to the objects), detect all the objects on it, and convert their sizes from pixels to cm/mm.
I managed to detect the objects and get their properties using "regionprops", but I am NOT able to convert the measured values (position of the centroid, length/width) to real values (cm).
I measured the number of pixels equivalent to 1 cm at several distances between 15 and 30 cm, and a plot of these results showed me that the change is linear in the 20-30 cm region.
So I calculated the equation of the straight line describing the change in pixel count and applied it, but the resulting object size in mm was not accurate at all: the error was up to 1 cm for a 5 cm long object (picture taken from 25 cm).

I am almost certain that I am doing something wrong with the pixel-to-cm conversion.

Any suggestions are welcome.
From: Walter Roberson on
Baar Boris wrote:
> Hello! I want to take a picture of a uniformly colored (white) surface
> from a known distance (my webcam is perpendicular to the objects),
> detect all the objects on it, and convert their sizes from pixels to
> cm/mm. I managed to detect the objects and get their properties using
> "regionprops", but I am NOT able to convert the measured values (position
> of the centroid, length/width) to real values (cm).
> I measured the number of pixels equivalent to 1 cm at several distances
> between 15 and 30 cm, and a plot of these results showed me that the
> change is linear in the 20-30 cm region.

If the viewing angle and the distance from the lens to the focal plane are fixed,
then by simple trigonometry, the linear distances in the image should be
inversely proportional to the distance to the surface (presuming the objects on
the surface have negligible height.)
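A quick numeric sketch of that relationship (the focal length in pixels, f_px, is something that would have to be calibrated for the camera in question; all the numbers here are made up for illustration):

```matlab
% Similar-triangles (pinhole) conversion for a camera perpendicular to
% the surface: size_cm = size_px * Z / f_px, with Z the distance to the
% surface and f_px the focal length expressed in pixels.  All values
% below are illustrative examples, not measurements.
f_px    = 700;                 % focal length in pixels (example)
Z       = 25;                  % distance to the surface, cm
size_px = 140;                 % measured object length in pixels
size_cm = size_px * Z / f_px   % -> 5 cm with these example numbers
```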

However, as you changed the distance to the surface, you probably had to
refocus, which would have changed the distance between your lens and your
focal plane. I do not remember exactly what the correction for that would be;
I suspect it would be either a 1/L relationship or a 1/(sin(theta)*L) relationship.

In turn, the above is not really sufficiently general, in that it ignores the
effect of "depth of field". It could be that your depth of field was about +/-
5 cm around 25 cm with a particular focal adjustment, and that outside that
range you (or your equipment) refocused, introducing a non-linearity into the
test results.

> So I calculated the equation of the straight line equivalent to change
> in pixel number , applied the result but the resulting object size in mm
> was not accurate at all, up to 1cm for a 5cm length object (picture
> taken from 25cm).
>
> I am almost certain that I am doing something wrong with the pixel-to-cm
> conversion.

Possibly, but your method sounds reasonable for a given distance and fixed
focus. If you post your code, we can look for anything obviously wrong.
From: Baar Boris on
Thanks for the quick response,


% declare the shared variables as global here too; otherwise
% translateNdisplay sees empty values for dst and dstV
global h2image dst dstV;

dist = str2double(get(handles.distance,'String'));
% equations of the pixel/cm ratio for horizontal (dst) and vertical (dstV) measurements
dst  = 29.469 - 1.480*(dist-29);
dstV = 28.105 - 1.385*(dist-29);

% extract data
hfull = getsnapshot(vid);
tmp = mat2gray(hfull);
threshold = graythresh(tmp);
h2image = im2bw(tmp, threshold);
axes(handles.video2);
hold off;
BW = ~h2image;
imshow(BW);

[B,L] = bwboundaries(BW, 'noholes');
STATS = regionprops(L, 'all');

% clear the boxes that display the readings
set(handles.coord1,'String','');
set(handles.stats,'String','');

for i = 1:numel(STATS)
    if (STATS(i).MinorAxisLength > 20) && (STATS(i).MajorAxisLength < 640)
        centroid = STATS(i).Centroid;
        bounding = STATS(i).BoundingBox;

        hold on;
        % plot the object number on the image
        text('String',strcat('\color{red}',num2str(i)), ...
            'Position',[centroid(1) centroid(2)],'HorizontalAlignment','center');

        % plot the bounding box
        rectangle('Position',bounding,'EdgeColor','r');
        % plot an ellipse to check Minor/MajorAxisLength
        % (ellipse() is a File Exchange function, not a built-in)
        ellipse(STATS(i).MajorAxisLength/2, STATS(i).MinorAxisLength/2, ...
            -STATS(i).Orientation*pi/180, centroid(1), centroid(2), 50);
    end
end
translateNdisplay(STATS,handles);


function translateNdisplay(data,handles)
global h2image dst dstV;
set(handles.coord1,'String','');
set(handles.coord2,'String','');
for i = 1:numel(data)
    if (data(i).MinorAxisLength > 10)
        ss = size(h2image);
        % display pixel coordinates; reference: camera position = center of the image
        set(handles.coord1,'String',cat(1,get(handles.coord1,'String'), ...
            {strcat(num2str(i),'|| ',num2str(data(i).Centroid))}));
        transX = (data(i).Centroid(1)-320)/dst;
        transY = (data(i).Centroid(2)-240)/dstV;
        coords = [num2str(i) ' ' num2str(transX) ' ' num2str(transY)];

        % size in cm, taking into account that the object is not in the
        % middle of the image (currently commented out):
        %set(handles.stats,'String',cat(1,get(handles.stats,'String'), ...
        %    {strcat(num2str(i),' || ',num2str(data(i).MajorAxisLength/sqrt(transX^2+dst^2)), ...
        %    '<->',num2str(data(i).MinorAxisLength/sqrt(transY^2+dst^2)), ...
        %    '(',num2str(data(i).Orientation),')')}));

        % area display, used to take measurements at different distances
        set(handles.stats,'String',cat(1,get(handles.stats,'String'), ...
            {strcat(num2str(i),' || ',num2str(data(i).Area))}));

        % size in cm, neglecting that the object is not in the center of
        % the image (currently commented out):
        %set(handles.stats,'String',cat(1,get(handles.stats,'String'), ...
        %    {strcat(num2str(i),' || ',num2str(data(i).MajorAxisLength/dst), ...
        %    '<->',num2str(data(i).MinorAxisLength/dst), ...
        %    '(',num2str(data(i).Orientation),')')}));

        % display coordinates (in cm)
        set(handles.coord2,'String',cat(1,get(handles.coord2,'String'),{coords}));
    end
end




When calculating the change of the pixel/cm ratio I took into account the area of an object: I calculated how many pixels fit into 1 cm^2, and taking the sqrt() of that gave me the pixel equivalent of 1 cm.

I wonder whether it would be possible to relate the focal length of the camera to the pixel/cm ratio as a function of the distance. I would like it to work over the 15-30 cm interval, so that no refocusing is needed.
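For example, under a pinhole model the ratio would be f_px/Z, so a single f_px could be fitted from calibration measurements (a sketch; the distances and ratios below are made-up example numbers, not my real data):

```matlab
% Sketch: under a pinhole model, pxPerCm = f_px / Z, where f_px is the
% focal length in pixels and Z the distance in cm.  Given calibration
% pairs (Z, measured pixels per cm), f_px can be fitted by least squares.
Z       = [20 22 24 26 28 30];              % distances, cm (examples)
pxPerCm = [42.0 38.2 35.0 32.3 30.0 28.0];  % measured pixels per cm (examples)

invZ = 1 ./ Z;
% scalar least-squares fit of pxPerCm = f_px * invZ
f_px = (pxPerCm * invZ') / (invZ * invZ');

% predicted ratio at any distance in the interval, assuming no refocusing:
ratioAt25 = f_px / 25;   % pixels per cm at 25 cm
```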
From: Walter Roberson on
Baar Boris wrote:

> dist=str2double(get(handles.distance,'String')); %equation of the
> pixel/cm variance for longitudinal and vertical (dstV) measurments
> dst=29.469-1.480*(dist-29);
> dstV=28.105-1.385*(dist-29);

That disturbs me: the implication is that either your focal plane is not
parallel to your surface, or else that your lens has significant distortion
along one axis. Either case would make your calculation non-viable.

The difference *is* a lot: it ranges from about 2.7 pixels/cm at 15 cm distance
to about 1.3 pixels/cm at 30 cm distance. The place where the two equations
would come into congruence is around 43 cm distance, which is 1.5 to 3 times
your target distances. That is too much of a difference to ignore.

I do see the notes in your code that the image is off-center, but that would
not affect the linearity.

I think you need to track down why your horizontal and vertical scaling are
not the same to within measurement error.


> I wonder whether it would be possible to relate the focal length of the
> camera to the pixel/cm ratio as a function of the distance. I would like
> it to work over the 15-30 cm interval, so that no refocusing is needed.


It should theoretically be possible, but once the calculation is done you still
need to determine the depth of field for your lens; considering the measurements
quoted above, I believe it unlikely that you can fit 15 to 30 cm within a
single fixed focus with your lens. Such a thing _is_ possible for a
sufficiently small imaging area, since "15 to 30 cm" is a subset of "any distance
past X is effectively infinite", but the formulae you have above hint to me
that you are not facing that situation.

30 cm is about the extreme closest capability of typical consumer cameras, and
that is with a quite narrow depth of field. To get 15 to 30 cm in a single
depth of field without refocusing and with linear distance-calculation
response, I believe that you would need a lens that was relatively close in
size to the imaging area -- e.g., a large lens with a big imaging area, or a
fairly small lens with a small imaging area.

Are your lens and imaging system "pinhole-like", such as etched into a circuit?
Reaching _way_ back in my memory, it seems to me that if so, you could
have significant "fringing" (slit diffraction); as you did not mention anything
like that, I would tend to expect that you are using a more standard optical
system and that you have depth-of-field problems.
From: Baar Boris on
I don't really understand how I should take into account the depth of field of my camera.
I made another series of test measurements in the 20-40 cm range; I set my focus at 25 cm and did not change it. This time I used a square as the test object. I calculated its area, and taking the sqrt of it I got the pixel equivalent of one side; dividing by the real length gave me the cm/px ratio. This time the graph showed that the change was linear in the 30-40 cm range (you said 30 cm should be the minimum distance). The results... well, they were worse than before.
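Each measurement in the series looks roughly like this (a sketch; BW is the thresholded binary image from the code I posted earlier, and the 5 cm side length is just an example value):

```matlab
% One calibration measurement: the test square is the only object in BW
% (the binary image produced by the thresholding step posted earlier).
% sideCm is the real side length of the square; 5 cm is an example.
sideCm  = 5;
stats   = regionprops(BW, 'Area');   % area of the square in pixels
sidePx  = sqrt(stats(1).Area);       % pixel equivalent of one side
pxPerCm = sidePx / sideCm;           % pixels per cm at this distance
```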
I am using a Labtec webcam, and I need to locate objects on a surface (maximum object size 6 cm) and have them picked up by a robot arm based on the measured data.

The camera will be mounted on the end-effector of the robot; the arm will go to a designated place, take a picture, let the user choose the object to pick up, and then pick up the selected object.