From: Oluwa KuIse on
Hello,
I would like my program to be able to increase or decrease the time step as appropriate. The main program is:
t = 0; % initialize time
T = 2000; % simulation time in seconds
del_t = 1; % arbitrarily chosen; as the program develops we will get a better first estimate
grid_size = 5; % assume the DEM has a resolution of 5 meters
g = 9.81; % gravitational acceleration
H_o = 0.001*ones(11,10); % fluid depth
U_o = zeros(11,10); % x-direction flux
V_o = zeros(11,10); % y-direction flux
Z = 10*ones(11,10); % sample topography
R = 0.01*ones(11,10); % rainfall
I = 0.001*ones(11,10); % infiltration

while t < T
    [H_c,U_c,V_c] = operator(H_o,U_o,V_o,Z,R,I,del_t,grid_size); % this calls the function "operator"
    t = t + del_t;
end

The error is defined as norm((H_c - H_o)./H_o) and the tolerance is 1e-3.
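In MATLAB terms I would compute this as something like the following (naming the tolerance Tol is my own choice):

Tol = 1e-3; % tolerance
Error = norm((H_c - H_o)./H_o); % relative change between the new and old solutions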
The pseudocode I am having difficulty translating into MATLAB is:

if Error < 0.5*Tol
    Accept the current solution, and increase the step size (del_t) at the next step
elseif Error is between 0.5*Tol and Tol
    Accept the current solution, and hold the step size (del_t) at the next step
elseif Error is between Tol and 2*Tol
    Accept the current solution, but decrease the step size (del_t) at the next step
elseif Error > 2*Tol
    Throw the current results away, cut the step size in half, and retry
end

How do I implement this algorithm in the context of my program in MATLAB?
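Here is a rough sketch of what I think the loop might look like. I have guessed at doubling del_t when the error is small and halving it otherwise, and I invented a cap max_del_t so the step cannot grow without bound; I am not sure the accept/reject structure (using continue to retry a rejected step) is the right way to do it:

t = 0;
Tol = 1e-3;
max_del_t = 10; % my own guess at an upper bound on the step size

while t < T
    % trial step with the current del_t
    [H_c,U_c,V_c] = operator(H_o,U_o,V_o,Z,R,I,del_t,grid_size);
    Error = norm((H_c - H_o)./H_o);

    if Error > 2*Tol
        % step rejected: throw the result away, halve del_t, and retry
        del_t = del_t/2;
        continue;
    end

    % step accepted: keep the solution and advance time
    H_o = H_c; U_o = U_c; V_o = V_c;
    t = t + del_t;

    if Error < 0.5*Tol
        del_t = min(2*del_t, max_del_t); % increase the step for the next iteration
    elseif Error <= Tol
        % between 0.5*Tol and Tol: hold del_t unchanged
    else
        del_t = del_t/2; % between Tol and 2*Tol: decrease the step for the next iteration
    end
end

Does that structure look right, or is there a cleaner way to keep the previous solution while retrying a rejected step?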
Thank you,
Michael