From: Jean-Philip Dumont on
Hi,

I'm looping "i" from 0 to 0.5 in steps of 0.01 (0:0.01:0.5).

For every "i", I put the maximum likelihood value I calculated into a matrix. To get valid indices, I multiply "i" by a hundred and add 1 (100*i + 1). This should give me indices of 1, 2, 3, 4, ..., 51, which lets me put each calculated value in the right place.
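To show just the index math, here is a sketch in Python for illustration (MATLAB uses the same double-precision arithmetic; the matrix and the likelihood calculation itself are left out):

```python
# Python sketch of the indexing scheme described above; only the
# index computation is shown, not the likelihood calculation.
indices = []
for step in range(51):           # 51 steps, matching 0:0.01:0.5
    i = step * 0.01              # i = 0.00, 0.01, ..., 0.50
    indices.append(100 * i + 1)  # intended to be exactly 1, 2, ..., 51
```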

But something weird happens during this really simple multiplication.

When i = 0.14, 100*0.14 + 1 gives me 15.0000 instead of just 15. Because of this, when the loop hits i = 0.14, I get an error saying "Subscript indices must either be real positive integers or logicals." The same thing happens at i = 0.28, 0.29 and 0.33, but nowhere else.
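The same thing can be reproduced in Python for illustration, since it uses the same IEEE 754 double precision as MATLAB (0.14 has no exact binary representation, so the product is not exactly 14):

```python
# 0.14 cannot be stored exactly in binary floating point, so the
# result of 100*0.14 + 1 is slightly off from 15.
x = 100 * 0.14 + 1
print(x)        # 15.000000000000002 -- MATLAB's short format shows 15.0000
print(x == 15)  # False, so it is rejected as a subscript
```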

Can someone help me with this and tell me why it does that?

Thank you very much!