From: TideMan on
On Apr 29, 4:11 am, "Andy " <andrew.h...(a)uea.ac.uk> wrote:
> Thanks for your replies.
>
> I don't think those methods would be quite right. There is some irregularity in the data (instrument downtime etc.), so I'd prefer a mean over a particular minute rather than a running mean.
> Any thoughts? Cheers, Andy

Well, that's what my algorithm does.
It takes the mean of each minute of data, producing a result every
minute.
That's not a running mean.
It's a block average, which in signal-processing terms is a form of
"decimation".

If you want to account for bad data, just replace the bad samples with
NaN and use the same algorithm: any minute that contains bad data will
then come out as NaN.
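A minimal sketch of that per-minute averaging, assuming a gap-free record at exactly 10 samples per minute (my reconstruction, not necessarily TideMan's exact code):

nPerMin = 10;                                % assumed sampling rate: 10 samples/minute
y = y(1:floor(numel(y)/nPerMin)*nPerMin);    % trim any partial final minute
Y = reshape(y(:), nPerMin, []);              % one column per minute
minuteMean = mean(Y, 1).';                   % a NaN anywhere in a minute makes that mean NaN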
From: Bruno Luong on
"Andy " <andrew.hind(a)uea.ac.uk> wrote in message <hr5foj$q39$1(a)fred.mathworks.com>...
> A very straightforward question, I think; I've just not been able to find a solution.
>
> I have fairly high-frequency (~10 measurements/minute) multiparameter environmental (temperature etc.) data that I need to condense. At the moment I think I would like a mean value for each parameter every minute. How should I go about doing this?

% Time stamps, in seconds (~10 samples/minute)
t = 6 + zeros(1,100);        % nominal 6-second spacing
t = t + 2.5*rand(size(t));   % add random jitter to the spacing
t = cumsum(t);               % cumulate the spacings into time stamps

% Generate some synthetic data
y = t/10+sin(t/10)+0.1*rand(size(t));

% Bin edges separated by 1 minute
dt = 60;                                  % bin width, in seconds
tedge = t(1):dt:t(end)+dt;                % edges covering the whole record
tmid = (tedge(1:end-1)+tedge(2:end))/2;   % midpoints of the edges (for plotting)

% Mean within each 1-minute interval
[~, bin] = histc(t, tedge);   % bin(i) = index of the interval containing t(i)
meany = accumarray(bin(:), y(:)) ./ accumarray(bin(:), 1);   % per-bin sum / per-bin count

% Check: raw data (blue) against the 1-minute means (red circles)
plot(t, y, 'b', tmid, meany, 'ro')
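
% On newer MATLAB releases histc is listed as "not recommended";
% assuming R2015b or later, discretize returns the same bin indices:
bin = discretize(t, tedge);   % interval index for each sample (NaN if outside the edges)
meany = accumarray(bin(:), y(:)) ./ accumarray(bin(:), 1);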

% Bruno
From: Andy on
Finally got back to this... thanks very much for your help!
Cheers, Andy