From: Guillaume Ardaud on 29 Apr 2010 13:17

Hello all,

I'm quite new to neural networks and MATLAB, and I'm trying to get the hang of it. I'm stuck on one problem, however. I need to build a network that recognizes colors from an input consisting of the numerical proportions of red, green and blue light in the color. The output is a digit between 0 and 8, which indicates which 'standard color' the given input is closest to: 0 = white, 1 = black, 2 = red, 3 = green, 4 = blue, 5 = yellow, 6 = orange, 7 = indigo, 8 = violet.

So here is my input training data:

    RGBInput = [0.33 0 1 0 0 0.5 0   0.35 0.3;
                0    0 0 1 0 0.5 0.2 0    0.7;
                0.33 0 0 0 1 0   0   0.65 0.7]

And my output training data:

    RGBOutput = [0 1 2 3 4 5 6 7 8]

So, for example, the 5th column, [0; 0; 1], corresponds to 4, which is blue.

I am using NNTool; I have entered my input data and my target data. I then create a feed-forward backprop network with RGBInput and RGBOutput as its input and target, TRAINLM as training function, LEARNGDM as adaption learning function and MSE as performance function. I set 2 layers, the hidden one being TANSIG with 30 neurons and the output one being LOGSIG. I create the network, go into its properties, and train it over 1000 epochs with a min_grad of 1e-100. The other values are left at their defaults.

After training, I run a simulation using RGBInput, and the results don't correspond at all. Here are the results of my last simulation:

    [4.0003 4 4.004 4 4.2458 4 4.0001 4.2664 4]

which is not what's expected at all. What am I doing wrong? Do I need to set weights or biases, or to adapt the network first? I've been trying for the past 2 hours or so and can't wrap my mind around it. I understand perceptrons fine, but I have a hard time dealing with feed-forward backprop networks.

Thanks a lot for any help!
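For reference, here is my best guess at the script equivalent of what I set up in NNTool (I actually used the GUI, so the exact calls here are an assumption on my part):

    RGBInput = [0.33 0 1 0 0 0.5 0   0.35 0.3;
                0    0 0 1 0 0.5 0.2 0    0.7;
                0.33 0 0 0 1 0   0   0.65 0.7];
    RGBOutput = [0 1 2 3 4 5 6 7 8];

    % feed-forward backprop net: 30 tansig hidden units, logsig output,
    % trainlm training, learngdm adaption learning, mse performance
    net = newff(RGBInput, RGBOutput, 30, {'tansig','logsig'}, ...
                'trainlm', 'learngdm', 'mse');
    net.trainParam.epochs = 1000;      % train over 1000 epochs
    net.trainParam.min_grad = 1e-100;  % min_grad as described above
    net = train(net, RGBInput, RGBOutput);
    sim(net, RGBInput)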
From: David Young on 29 Apr 2010 15:18

I think the problem lies in the default "early stopping" behaviour of the net, which causes it to train on only a subset of your data. I tried the following:

    RGBInput = [0.33 0 1 0 0 0.5 0   0.35 0.3;
                0    0 0 1 0 0.5 0.2 0    0.7;
                0.33 0 0 0 1 0   0   0.65 0.7];
    Coutput = [0 1 2 3 4 5 6 7 8];

    net = newff(RGBInput, Coutput, 5);
    net.trainParam.epochs = 100;
    net.divideParam.trainRatio = 1;   % use all inputs for training
    net.divideParam.valRatio = 0;     % and none for validation
    net.divideParam.testRatio = 0;    % or testing
    net = train(net, RGBInput, Coutput);
    result = sim(net, RGBInput)

which produces the right answer to high precision. It probably doesn't need as many as 100 epochs to do this, and it may not need as many as 5 units in the hidden layer either, but you can fiddle with these.

There may be a simpler way to turn off early stopping than the fiddly business above of setting the ratios. Another approach, which I haven't tried, might be to leave early stopping switched on but replicate your data and targets many times so that the net samples all of the cases. To guarantee that it sees every case you would probably have to change divideFcn to something other than dividerand, so it's still fiddly.

(As an aside: it seems a pity that the default training parameters set up this complex early stopping behaviour with validation and testing. This surely must trip lots of people up since, as this example clearly demonstrates, it doesn't work for the kind of small examples people use when they are learning about neural nets and verifying their expectations. I would have thought that most users would prefer something that just implemented the basic algorithm, giving the result they would expect from reading a textbook or paper, with the option to add in early stopping or other elaborations as needed. Similarly, the appearance of a graphical display by default indicates a toolbox philosophy very different to the one I'd adopt. Maybe that's just me - I guess the MathWorks knows their customers better than I do.)
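P.S. If your toolbox version supports it, a possibly simpler way to switch off data division altogether (untested by me, so treat it as a sketch) is to clear the division function rather than setting the three ratios:

    net = newff(RGBInput, Coutput, 5);
    net.divideFcn = '';              % no train/validation/test split at all
    net.trainParam.epochs = 100;
    net = train(net, RGBInput, Coutput);
    result = sim(net, RGBInput)

Since the net's outputs are continuous, you can then snap them to the nearest colour index with round(result).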
From: Guillaume Ardaud on 30 Apr 2010 03:46

Hello David,

Thank you so much for your answer! It solves my problem, and other ones I've been having too. I was using a MATLAB neural network document written by my university, and it totally skipped this part of the process. I'm guessing the default early stopping behaviour was implemented in MATLAB after the document was written...

Thanks again!