Hello. I tried to implement the backpropagation algorithm from http://robocraft.ru/blog/algorithm/560.html, but after training the network I found that its outputs are not even close to the expected values. Where did I make a mistake? Or is the algorithm in the linked article wrong? Please help me figure this out.
double OpenNNL::_changeWeightsByBP(double * trainingInputs, double *trainingOutputs, double speed, double sample_weight)
{
    double * localGradients = new double[_neuronsCount];
    double * outputs = new double[_neuronsCount];
    double * derivatives = new double[_neuronsCount];

    calculateNeuronsOutputsAndDerivatives(trainingInputs, outputs, derivatives);

    // local gradients of the output layer: target - actual output
    for(int j=0;j<_neuronsPerLayerCount[_layersCount-1];j++)
    {
        localGradients[indexByLayerAndNeuron(_layersCount-1, j)] = trainingOutputs[j]
                - outputs[indexByLayerAndNeuron(_layersCount-1, j)];
    }

    // propagate the local gradients back through the hidden layers
    if(_layersCount > 1)
    {
        for(int i=_layersCount-2;i>=0;i--)
        {
            for(int j=0;j<_neuronsPerLayerCount[i];j++)
            {
                localGradients[indexByLayerAndNeuron(i, j)] = 0;

                for(int k=0;k<_neuronsPerLayerCount[i+1];k++)
                {
                    localGradients[indexByLayerAndNeuron(i, j)] +=
                            _neuronsInputsWeights[indexByLayerNeuronAndInput(i+1, k, j)]
                            * localGradients[indexByLayerAndNeuron(i+1, k)];
                }
            }
        }
    }

    // update the weights of the first layer from the training inputs
    for(int j=0;j<_neuronsPerLayerCount[0];j++)
    {
        for(int k=0;k<_inputsCount;k++)
        {
            _neuronsInputsWeights[indexByLayerNeuronAndInput(0, j, k)] += speed
                    * localGradients[indexByLayerAndNeuron(0, j)]
                    * derivatives[indexByLayerAndNeuron(0, j)]
                    * trainingInputs[k];
        }
    }

    // update the weights of the remaining layers
    for(int i=1;i<_layersCount;i++)
    {
        for(int j=0;j<_neuronsPerLayerCount[i];j++)
        {
            for(int k=0;k<_neuronsPerLayerCount[i-1];k++)
            {
                _neuronsInputsWeights[indexByLayerNeuronAndInput(i, j, k)] += speed
                        * localGradients[indexByLayerAndNeuron(i, j)]
                        * derivatives[indexByLayerAndNeuron(i, j)]
                        * outputs[indexByLayerAndNeuron(i, j)];
            }
        }
    }

    delete[] localGradients;
    delete[] outputs;
    delete[] derivatives;
}
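For comparison, here is the textbook backpropagation step I am trying to reproduce, written as a small standalone toy example with one hidden layer and sigmoid activations. This is not the OpenNNL code; all the names and numbers below are made up just to show the update rule (delta of the output layer uses the error times the derivative, hidden deltas sum the weighted deltas of the next layer, and each weight is changed by learning rate * delta of the receiving neuron * the input feeding that weight):

#include <vector>
#include <cmath>
#include <cstdio>

// Toy network: nIn inputs -> nHid hidden (sigmoid) -> nOut outputs (sigmoid).
// Weights are stored as w[neuron][input]; biases are omitted for brevity.
static double sigmoid(double x) { return 1.0 / (1.0 + std::exp(-x)); }

int main()
{
    const int nIn = 2, nHid = 2, nOut = 1;
    const double speed = 0.5;                      // learning rate

    std::vector<std::vector<double>> wHid = {{0.15, 0.20}, {0.25, 0.30}};
    std::vector<std::vector<double>> wOut = {{0.40, 0.45}};

    std::vector<double> x = {0.05, 0.10};          // training inputs
    std::vector<double> target = {0.01};           // training outputs

    // Forward pass.
    std::vector<double> hid(nHid), out(nOut);
    for (int j = 0; j < nHid; ++j) {
        double net = 0.0;
        for (int k = 0; k < nIn; ++k) net += wHid[j][k] * x[k];
        hid[j] = sigmoid(net);
    }
    for (int j = 0; j < nOut; ++j) {
        double net = 0.0;
        for (int k = 0; k < nHid; ++k) net += wOut[j][k] * hid[k];
        out[j] = sigmoid(net);
    }

    // Output-layer local gradients: delta = (t - o) * f'(net), with f'(net) = o*(1-o).
    std::vector<double> deltaOut(nOut);
    for (int j = 0; j < nOut; ++j)
        deltaOut[j] = (target[j] - out[j]) * out[j] * (1.0 - out[j]);

    // Hidden-layer local gradients: delta_j = f'(net_j) * sum_k w_kj * delta_k.
    std::vector<double> deltaHid(nHid);
    for (int j = 0; j < nHid; ++j) {
        double sum = 0.0;
        for (int k = 0; k < nOut; ++k) sum += wOut[k][j] * deltaOut[k];
        deltaHid[j] = hid[j] * (1.0 - hid[j]) * sum;
    }

    // Weight updates: dw = speed * delta(receiving neuron) * input to that weight.
    for (int j = 0; j < nOut; ++j)
        for (int k = 0; k < nHid; ++k) wOut[j][k] += speed * deltaOut[j] * hid[k];
    for (int j = 0; j < nHid; ++j)
        for (int k = 0; k < nIn; ++k) wHid[j][k] += speed * deltaHid[j] * x[k];

    std::printf("network output after the forward pass: %f\n", out[0]);
    return 0;
}

In my OpenNNL version I tried to follow the same scheme, only with flat arrays and the indexByLayerAndNeuron / indexByLayerNeuronAndInput helpers, so I suspect the mistake is somewhere in how I apply the deltas or pick the inputs during the weight update.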
Also, that article does not explain how to adjust the neurons' biases (thresholds). Can someone tell me how to do this? My current guess is sketched below.
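From what I understand, the bias is usually treated as one more weight whose input is always 1, so inside _changeWeightsByBP I would add something like the following. This is only my guess, and _neuronsBiases is a hypothetical array that does not exist in my code yet (indexed the same way as localGradients and derivatives):

    // My guess at the bias update: the bias behaves like a weight with a
    // constant input of 1.0, so it gets the same delta as the other weights
    // of the neuron. If the bias is subtracted (a threshold) rather than
    // added in the forward pass, the sign here would have to flip.
    for (int i = 0; i < _layersCount; i++)
    {
        for (int j = 0; j < _neuronsPerLayerCount[i]; j++)
        {
            _neuronsBiases[indexByLayerAndNeuron(i, j)] += speed
                    * localGradients[indexByLayerAndNeuron(i, j)]
                    * derivatives[indexByLayerAndNeuron(i, j)]
                    * 1.0; // the bias "input" is the constant 1
        }
    }

Is this the right way to do it?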
If you need the complete code, it is here: https://github.com/NicholasShatokhin/OpenNNL