
I'm using this code to build a neural network and train it to classify images:

net = newff(p,t,15,{},'traingd');     % 15 hidden nodes, gradient descent
net.divideParam.trainRatio = 70/100;  % Adjust as desired
net.divideParam.valRatio = 15/100;    % Adjust as desired
net.divideParam.testRatio = 15/100;   % Adjust as desired
net.trainParam.epochs = 10000;
net.trainParam.goal = 0.01;
net.trainParam.show = 25;
net.trainParam.time = inf;
net.trainParam.min_grad = 1e-10;
net.trainParam.max_fail = 10;
net.trainParam.sigma = 5.0e-5;   % used by trainscg; ignored by traingd
net.trainParam.lambda = 5.0e-7;  % used by trainscg; ignored by traingd
net.trainParam.mu_max = 1e-20;   % used by trainlm; ignored by traingd
net.trainParam.lr = 0.001;

% Train and Apply Network
[net,tr] = train(net,p,t);
outputs = sim(net,p);

% Plot
plotperf(tr)
plotfit(net,p,t)
plotregression(t,outputs)

But my performance never goes below 0.5. I tried doing PCA on the data, but I think something is wrong in the code. Is it possible to change the initial value of the performance shown in nntraintool?
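For reference, here is a minimal sketch of how PCA preprocessing is usually wired in with this toolbox (assuming the legacy `mapstd`/`processpca` functions; the 0.02 variance threshold is only an illustrative value):

```matlab
% Sketch: reduce input dimensionality with PCA before training.
% processpca drops principal components contributing less than
% maxfrac of the total variance (legacy Neural Network Toolbox API).
[pn, std_ps] = mapstd(p);                 % zero mean, unit variance per row
[ptrans, pca_ps] = processpca(pn, 0.02);  % keep components above 2% variance

% Train on ptrans instead of p; apply the same transforms to new inputs:
% xnew_t = processpca('apply', mapstd('apply', xnew, std_ps), pca_ps);
```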

Thank you, Paulo

1 Answer


It's hard to say without having your data, but in my experience with neural nets, one of a few things is usually happening:

  1. You don't have enough hidden nodes to represent your data
  2. Your learning rate (time step) is too high
  3. Your error space is complicated due to your data and you're reaching lots of local minima. This is a similar but slightly different way of saying 1.
  4. Your data is degenerate, in that you have training samples with different labels but exactly the same features.

If 1, increase the number of hidden nodes. If 2, decrease the learning rate. If 3, try a better initialization, e.g. Nguyen-Widrow initialization (this used to be in the function initnw). If 4, figure out why your data is like this and fix it.
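The fixes above can be sketched roughly as follows (legacy Neural Network Toolbox API; the node count, learning rate, and two-layer assumption are illustrative, not prescriptive):

```matlab
% 1: more hidden nodes; 2: smaller learning rate
net = newff(p, t, 40, {}, 'traingd');
net.trainParam.lr = 0.0001;

% 3: Nguyen-Widrow initialization per layer (assumes a two-layer net)
net.initFcn = 'initlay';
net.layers{1}.initFcn = 'initnw';
net.layers{2}.initFcn = 'initnw';
net = init(net);  % re-initialize the weights

% 4: check for degenerate samples (identical feature vectors; if any
% duplicates carry different labels, no net can separate them)
u = unique(p', 'rows');
if size(u, 1) < size(p, 2)
    warning('Duplicate feature vectors found; check their labels.');
end
```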

Thanks to @sazary for pointing out some details about initnw being the default when you create a new network with newff or newcf.


5 Comments

Thank you for your help, Chris. I'm training with 1, 5, 10, and 15 hidden nodes and still no improvement. I will try your suggestions.
Did you try lowering net.trainParam.lr?
Yes, I did. This may sound silly, but do you think it has to do with the weights? Twice, out of nowhere, the network worked without me changing anything.
It's likely that you're reaching some local minima due to the complexity of your search space. Did you try to initialize the network with something like initnw? In my experience, this method produces more consistent results.
Also, it is automatically called when you use newff. From MATLAB's help: "You can create a standard network that uses initnw by calling newff or newcf."
