From: Greg Heath on
CORRECTED FOR THE HEINOUS SIN OF TOP POSTING

On Apr 23, 3:51 pm, "Kadi " <n_kadamb...(a)yahoo.com> wrote:
> Greg Heath <he...(a)alumni.brown.edu> wrote in message
<ac672c79-2db1-41ed-abf7-831246e9e...(a)g11g2000yqe.googlegroups.com>...
> > On Apr 23, 3:04 pm, "Kadi " <n_kadamb...(a)yahoo.com> wrote:
> > > Well, what I am actually trying to do is: I have input data (P) and the target class information (T). I built a PNN model using the newpnn command of the NN toolbox, which uses the data to build a neural network model. I used the entire data set for validation and found, by trial and error, the spread value that gives the best classification. Then, instead of using the entire data set for training and validation, I implemented cross-validation. I am getting good results, but when I used new, unknown data that doesn't exactly match any of the samples in the training data for validation, the results were not good at all. And I was told that the PNN is not really trained: the training set input vectors simply become the first-layer weights, and the second-layer weights are set to the target vectors in the training set. So I tried to train the network obtained from the newpnn command using the train function. But I am not sure if we can use the train command with newpnn, because all of the examples that I have seen used newff instead of newpnn.
>
> > The only training that can be done on NEWPNN is to
> > change the spread. There is no special function for that.
>
> > Try NEWRB. It is better. However, it still has deficiencies.
> > Search on
>
> > greg heath RBFNN

> Thanks for the immediate response. Just one more question. Is
> newrb better than newff?

The MLP and RBF are universal approximators. Which one is better
for a particular data set depends on the data. Typically, some
knowledge of the data, via prior information and visual displays,
leads the designer to favor one over the other. In general, trial
and error is king.

The MATLAB versions of both NEWRB and NEWFF have annoying
limitations on which I have commented in previous posts. Most
notable is the inability to use the cross-entropy performance
function and the softmax activation function. In addition, NEWRB
would be considerably improved if a nonempty initial hidden-node
configuration could be specified and hidden nodes were not
constrained to have identical spreads.
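For illustration, a minimal trial-and-error comparison of the two might
look like the sketch below. The variable names (Ptest, the hidden-layer
size of 10, and the design parameter values) are illustrative
assumptions, not recommendations; P and T are your input and target
matrices as in your earlier post.

```matlab
% Hypothetical sketch: compare an RBF (newrb) against an MLP (newff)
% on the same data. P = training inputs, T = targets, Ptest = held-out inputs.
goal   = 0.01;   % MSE goal for newrb
spread = 1.0;    % candidate spread; vary by trial and error
MN     = 40;     % maximum number of hidden neurons
DF     = 5;      % display frequency

net_rb = newrb(P, T, goal, spread, MN, DF);  % RBF: trained as it is created
Y_rb   = sim(net_rb, Ptest);                 % evaluate on held-out data

net_ff = newff(P, T, 10);                    % MLP with 10 hidden neurons
net_ff = train(net_ff, P, T);                % MLP must be trained explicitly
Y_ff   = sim(net_ff, Ptest);                 % compare Y_rb vs Y_ff
```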

Hope this helps.

Greg.

From: Kadi on
Thanks for the info. That was helpful

Kadi

From: Kadi on
Hi

Instead of using newpnn, I tried using newrb to create the network. By adjusting the goal, spread, MN, and DF I was able to achieve good classification, but I do have one question. Is the 'net' obtained from newrb a trained network? And can I use the train command to train this network?

Thanks
Kadi

From: Greg Heath on
PLEASE DO NOT TOP POST
POST YOUR REPLIES WITHIN AND/OR AFTER THE PREVIOUS POST.

On Apr 27, 12:38 pm, "Kadi " <n_kadamb...(a)yahoo.com> wrote:
> Hi
>
> Instead of using newpnn, I tried using newrb for creating the network parameters. By adjusting the goal, spread, MN and DF I was able to achieve good classification but I do have one question. Is the 'net' obtained from newrb a trained network?

Yes

> and can I use the train command to train this network?

No.
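(A sketch of the intended workflow, under the assumption that Pnew
stands in for your new, unseen data: newrb adds hidden neurons one at
a time, solving for the output-layer weights at each step, until the
MSE goal or MN is reached, so the returned net is already trained.)

```matlab
% The net returned by newrb is ready to use; just simulate it:
Y = sim(net, Pnew);                          % outputs for new inputs

% To "retrain", do not call train; instead recreate the network
% with different design parameters, e.g. a smaller spread:
net2 = newrb(P, T, goal, spread/2, MN, DF);
```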

Hope this helps.

Greg
From: Kadi on

Hi

I want to use the 'softmax' transfer function instead of 'purelin' for the last layer of the radial basis network because I think, and correct me if I am wrong, that softmax would give out Bayes posterior probabilities. I would implement this as follows:

net_rb = newrb(P,T,goal,spread,MN,DF);
net_rb.layers{2}.transferFcn = 'softmax';

Is this the right way of doing it? Can we force the transfer function of the second layer to be a softmax function? I was looking at some of your old posts, and in one of them you said MATLAB 2007 does not allow softmax to be used as the transfer function of the output layer. Does this hold true for the 2009b version too?

Thanks
Kadi