From: Marco on
Hi everyone.

I have the following question and would be glad if someone could help me.

Let x_obs and x_sim be, respectively, observations and simulations of wave heights for October 2008, sampled with a 3-hour time step.

Both time series are time-referenced, meaning that both files have two columns. For instance, x_obs is:

date-hour      wave height (m)
733123.125     3.5
733123.250     1.5
733123.500     2.1
733123.750     2.2
...

The first column reports the date and hour in MATLAB's datenum format.
The second column reports the value recorded at that date and hour.
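Just to illustrate what I mean by datenum format (a quick hypothetical check, not part of my data processing):

t = 733123.125;    % first timestamp from the example above
datestr(t)         % prints the corresponding calendar date and hour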

Now, I want to retrieve a numerical measure of the discrepancy between observations and simulations. Let

RMS ('root mean square') = sqrt( mean( (x_obs - x_sim)^2 ) )     (1)

and assume also that the simulations might have some gaps (i.e., for some dates and hours there are missing data).

Question:
Is there any way of applying the user-defined function RMS in (1) to both series, aligned according to the datenum timestamps, so that missing data are skipped?

The solution should look something like this:

rms = RMS(x_obs(t), x_sim(t))
or, alternatively,
rms = RMS(x_obs, x_sim, t_sim, t_obs)
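What I have in mind is something along these lines (just a sketch, assuming each series is loaded as an N-by-2 matrix [datenum, value] and that missing data show up either as absent rows in x_sim or as NaN values):

function rms = RMS(x_obs, x_sim)
% Keep only the timestamps present in BOTH series (exact datenum matches;
% in practice the timestamps may need rounding to a common precision first)
[t_common, i_obs, i_sim] = intersect(x_obs(:,1), x_sim(:,1));
obs = x_obs(i_obs, 2);
sim = x_sim(i_sim, 2);
% Drop pairs where either value is NaN ('missing data')
ok  = ~isnan(obs) & ~isnan(sim);
obs = obs(ok);
sim = sim(ok);
% Root-mean-square difference as in (1)
rms = sqrt(mean((obs - sim).^2));
end

which would then be called as rms = RMS(x_obs, x_sim); on the two loaded matrices. But I am not sure this is the right way to do the alignment.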

Thanks
Kind regards from Lisbon
Marco