From: seliz Karadogan on
Hi,

I am trying to use the Neural Network Toolbox. My code looks like this:

X: input vectors (96*10000)
T: target vectors (11*10000)
X_test: test inputs (96*10000)

net = newff(X,T,[10],{'tansig' 'softmax'},'traingd','learngd')

[net,tr] = train(net,X,T);

out = sim(net,X_test);

1) Although I force the network to use the 'softmax' transfer function for the output layer, 'out' after the simulation ('sim') does not give results between 0 and 1. Why?

2) Although I feed in input vectors with 96 elements and force an output of 11 elements, in the net architecture numInputs is 1 and numOutputs is also 1. What is wrong?

3) When I simulate the network, 'nntraintool' opens. There is a 'Validation Checks' part. What is that?

Thanks

Regards
Seliz
From: Mark on

"seliz Karadogan" <slzgk(a)yahoo.com> wrote in message
> net = newff(X,T,[10],{'tansig' 'softmax'},'traingd','learngd')

The output of NEWFF from layer 2 has processing functions which automatically map the output range of SOFTMAX to the range of T. To see the processing functions and their settings:

net.outputs{2}.processFcns
net.outputs{2}.processSettings

To ensure that SIM returns the layer 2 SOFTMAX values without any mapping, remove the processing functions:

net.outputs{2}.processFcns = {};
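
For example, a minimal sketch using the variables from the original post (untested):

net = newff(X,T,10,{'tansig' 'softmax'},'traingd','learngd');
net.outputs{2}.processFcns = {};    % remove the output mappings before training
[net,tr] = train(net,X,T);
out = sim(net,X_test);              % raw SOFTMAX outputs, each value in [0,1]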
From: Greg Heath on
On Jan 11, 12:43 pm, "seliz Karadogan" <sl...(a)yahoo.com> wrote:
> I am trying to use the Neural Network Toolbox. My code looks like this:
>
> X: input vectors (96*10000)
> T: target vectors (11*10000)
> X_test: test inputs (96*10000)
>
> net = newff(X,T,[10],{'tansig' 'softmax'},'traingd','learngd')

There are much better learning algorithms. See the documentation:

doc newff
help newff
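
For example (just a sketch; TRAINSCG and TRAINLM are the usual alternatives listed there, and scaled conjugate gradient tends to handle a network of this size well; the SOFTMAX derivative issue discussed below still applies):

net = newff(X,T,10,{'tansig' 'softmax'},'trainscg');   % scaled conjugate gradient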

> [net,tr] = train(net,X,T);
>
> out = sim(net,X_test);
>
> 1) Although I force the network to use the 'softmax' transfer function
> for the output layer, 'out' after the simulation ('sim') does not give
> results between 0 and 1. Why?

For some silly reason MATLAB does not offer SOFTMAX as an
activation function option during backpropagation. Again, see the
newff documentation.

As far as I can determine, it is because MATLAB did not define
the derivative function DSOFTMAX. As I have written in previous
posts, there is a trivial fix:

1. Define two new functions: SOFTMAXGH and DSOFTMAXGH
2. SOFTMAXGH is the same as SOFTMAX except that its derivative
is defined as DSOFTMAXGH
3. DSOFTMAXGH is the same as DLOGSIG (see the sketch below)
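
A rough sketch of that fix, assuming an older toolbox version in which a transfer function reports the name of its derivative via the 'deriv' call (untested; save as softmaxgh.m and dsoftmaxgh.m):

% --- softmaxgh.m ---
function a = softmaxgh(n)
% SOFTMAXGH Same as SOFTMAX, but with a derivative defined for backprop.
if ischar(n) && strcmp(n,'deriv')
    a = 'dsoftmaxgh';    % tell backprop which derivative function to use
else
    a = softmax(n);      % identical forward computation
end

% --- dsoftmaxgh.m ---
function d = dsoftmaxgh(n,a)
% DSOFTMAXGH Same as DLOGSIG (the trivial fix in step 3).
d = dlogsig(n,a);

Then select it with

net.layers{2}.transferFcn = 'softmaxgh';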

> 2) Although I feed in input vectors with 96 elements and force an
> output of 11 elements, in the net architecture numInputs is 1 and
> numOutputs is also 1. What is wrong?

Don't know.

> 3) When I simulate the network 'nntraintool' opens. There is a
> 'Validation Checks' part. What is that?

Don't know.

Hope this helps.

Greg
From: Greg Heath on
On Jan 11, 12:43 pm, "seliz Karadogan" <sl...(a)yahoo.com> wrote:
> Hi,
>
> I am trying to use the Neural Network Toolbox. My code looks like this:
>
> X: input vectors (96*10000)
> T: target vectors (11*10000)
> X_test: test inputs (96*10000)
>
> net = newff(X,T,[10],{'tansig' 'softmax'},'traingd','learngd')

It is doubtful that you need 96 input variables to separate 11 classes.
It is doubtful that you need 10^4 input vectors for learning.
However, you may need more than H = 10 hidden nodes.

See my post on pretraining advice.

With an I-H-O = 96-10-11 MLP there are

Neq = Ntrn*O = 11*Ntrn training equations to solve for
Nw = (I+1)*H+(H+1)*O = 970+121 = 1091 unknown weights

Since Neq = 11,000 >> Nw = 1,091,
using Ntrn = 10^3 is a reasonable choice for starters.
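
In MATLAB terms, just to check the arithmetic:

I = 96; H = 10; O = 11; Ntrn = 1000;
Nw  = (I+1)*H + (H+1)*O    % 1091 unknown weights
Neq = Ntrn*O               % 11000 training equations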

Use STEPWISEFIT on 3 or more training subsets of size
Ntrn = 10^3 to design linear and quadratic models. The
results should indicate how many input variables
appear to be redundant or irrelevant.
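
For example, a rough sketch with one-vs-rest linear models (assumes the Statistics Toolbox; the random-subset indexing is hypothetical):

Ntrn  = 1000;
idx   = randperm(10000);
Xs    = X(:,idx(1:Ntrn))';            % STEPWISEFIT wants observations in rows
Ts    = T(:,idx(1:Ntrn))';
inAny = false(1,size(Xs,2));
for k = 1:size(Ts,2)                  % one linear model per class
    [b,se,pval,inmodel] = stepwisefit(Xs,Ts(:,k),'display','off');
    inAny = inAny | inmodel;          % inputs selected by any class model
end
sum(inAny)                            % how many inputs appear useful

Repeating this on 3 or more random subsets shows which inputs are selected consistently.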

Hope this helps.

Greg