From: Kadi on
"Kadi " <n_kadambari(a)yahoo.com> wrote in message <hr9jjd$3ej$1(a)fred.mathworks.com>...
> Greg Heath <heath(a)alumni.brown.edu> wrote in message <34b9e249-cac9-4183-a3a2-57820f6382e8(a)r9g2000vbk.googlegroups.com>...
> > PLEASE DO NOT TOP POST
> > POST YOUR REPLIES WITHIN AND/OR AFTER THE PREVIOUS POST.
> >
> > On Apr 27, 12:38 pm, "Kadi " <n_kadamb...(a)yahoo.com> wrote:
> > > Hi
> > >
> > > Instead of using newpnn, I tried using newrb for creating the network parameters. By adjusting the goal, spread, MN and DF I was able to achieve good classification but I do have one question. Is the 'net' obtained from newrb a trained network?
> >
> > Yes
> >
> > > and can I use the train command to train this network?
> >
> > No.
> >
> > Hope this helps.
> >
> > Greg
>
> Hi
>
> I want to use 'softmax' transfer function instead of 'purelin' for the last layer of the radial basis network because I think, and correct me if I am wrong, that softmax would give out Bayes posterior probabilities. I would implement this as follows
>
> net_rb = newrb(P,T,goal,spread,MN,DF);
> net_rb.layers{2}.transferFcn = 'softmax';
>
> Is this the right way of doing it? Can we force the transfer function of the second layer to be softmax? I was looking at some of your old posts, and in one of them you said MATLAB 2007 does not allow softmax to be used as the transfer function of the output layer. Does this hold true for the 2009b version too?
>
> Thanks
> Kadi

I have been reading more into this, and I think I now understand what you meant when you said that MATLAB does not allow the use of 'softmax' as the transfer function for the output layer of pnn or rbf models. It looks like the output layer transfer function for rbf is always fixed as 'purelin' and can only be changed to 'softmax' after training.
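For example, I would expect something like this to work after the design step (untested sketch; net_rb and P are the same as in my earlier post):

net_rb.layers{2}.transferFcn = 'softmax';  % replace purelin after newrb has run
Y = sim(net_rb, P);                        % simulate with the new output transfer function
sum(Y, 1)                                  % should be a row of ones if softmax took effect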

I looked at the net object generated by the newrb command, and the functions section of its display shows:
functions:

adaptFcn: (none)
divideFcn: (none)
gradientFcn: (none)
initFcn: (none)
performFcn: 'mse'
plotFcns: {}
trainFcn: (none)

The trainFcn is listed as (none), which is why I was wondering whether the network obtained from newrb is trained, or whether I have to train it separately using the train command.
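For what it's worth, I can simulate the returned network directly without ever calling train (sketch; P is the same training input as before):

Y = sim(net_rb, P);            % works immediately, so newrb seems to return a usable network
[dum, class] = max(Y, [], 1);  % predicted class = row index of the largest output per column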

Thanks
Kadi
From: Greg Heath on
On Apr 28, 11:18 am, "Kadi " <n_kadamb...(a)yahoo.com> wrote:
> Greg Heath <he...(a)alumni.brown.edu> wrote in message <34b9e249-cac9-4183-a3a2-57820f638...(a)r9g2000vbk.googlegroups.com>...
> > PLEASE DO NOT TOP POST
> > POST YOUR REPLIES WITHIN AND/OR AFTER THE PREVIOUS POST.
>
> > On Apr 27, 12:38 pm, "Kadi " <n_kadamb...(a)yahoo.com> wrote:
> > > Hi
>
> > > Instead of using newpnn, I tried using newrb for creating the network parameters. By adjusting the goal, spread, MN and DF I was able to achieve good classification but I do have one question. Is the 'net' obtained from newrb a trained network?
>
> > Yes
>
> > > and can I use the train command to train this network?
>
> > No.
>
> > Hope this helps.
>
> > Greg
>
> Hi
>
> I want to use 'softmax' transfer function instead of 'purelin' for the last layer of the radial basis network because I think, and correct me if I am wrong, that softmax would give out Bayes posterior probabilities.

As long as the targets are {0,1}, purelin, logsig and softmax
would all yield estimates of the class probabilities conditional
on the input.

However, purelin does not guarantee 0 <= y <= 1, and logsig
does not satisfy sum(y) = 1.
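A quick numeric illustration of those constraints (a sketch;
n is just an arbitrary net input for a 3-class output layer):

n = [2; -1; 0.5];   % arbitrary output-layer net input, one column = one case
purelin(n)          % unbounded: components can fall outside [0,1]
logsig(n)           % each component in (0,1), but the column does not sum to 1
softmax(n)          % each component in (0,1) and the column sums to 1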

>I would implement this as follows
>
> net_rb = newrb(P,T,goal,spread,MN,DF);
> net_rb.layers{2}.transferFcn = 'softmax';
>
> Is this the right way of doing it? Can we force the transfer function of the second layer to be softmax? I was looking at some of your old posts, and in one of them you said MATLAB 2007 does not allow softmax to be used as the transfer function of the output layer. Does this hold true for the 2009b version too?

AFAIK neither softmax nor crossentropy can be used in any
MATLAB training algorithm without modifying source code.
Although I have posted instructions on how to define dsoftmax
for use with gradient training, it obviously doesn't apply to newrb.

Search the group archives on:

greg heath dsoftmax
greg heath softmax
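If you only need outputs that behave like probabilities, one crude
post-processing alternative (a sketch, not a substitute for training
with softmax) is to clip and renormalize the purelin outputs:

Y = sim(net_rb, P);                       % purelin outputs: rough posterior estimates
Y = max(Y, 0);                            % clip negative estimates to zero
Y = Y ./ repmat(sum(Y,1), size(Y,1), 1);  % renormalize columns (assumes a positive entry per column)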

Hope this helps.

Greg