Good day!

Am I implementing the backpropagation method correctly? It seems to me that it updates the neuron weights incorrectly.

Variables:

    int inputs[3];               // input neurons
    double hidN[2];              // hidden-layer neurons
    double output;               // output neuron
    double weight_1[3][2], weight_1_2[3][2]; // weights for the hidden layer
    double weight_2[2], weight_2_2[2];       // weights for the output neuron
    double actual_predict;       // current prediction
    double learning_rate;        // equals 0.08
    double error_layer_2, gradient_layer_2, weights_delta_layer_2;
    double error_layer_1[2], gradient_layer_1[2], weights_delta_layer_1[2];
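The code below calls a `sigmoid` helper that is not shown in the question; a minimal sketch, assuming the standard logistic function is meant:

```cpp
#include <cmath>

// Logistic sigmoid used as the activation function:
// sigmoid(x) = 1 / (1 + e^(-x))
double sigmoid(double x) {
    return 1.0 / (1.0 + std::exp(-x));
}

// The derivative can be expressed through the output value itself,
// which is what the gradient_* terms in train() rely on:
// sigmoid'(x) = s * (1 - s), where s = sigmoid(x)
double sigmoid_derivative(double s) {
    return s * (1.0 - s);
}
```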

Implementation:

    void train(double expected_predict) {
        // makes a prediction
        for (int i = 0; i < 2; i++)
            hidN[i] = sigmoid((inputs[0] * weight_1[0][i]) + (inputs[1] * weight_1[1][i]) + (inputs[2] * weight_1[2][i]));
        output = sigmoid((hidN[0] * weight_2[0]) + (hidN[1] * weight_2[1]));
        actual_predict = output;

        // adjusts the weights of the hidden layer (hidN)
        error_layer_2 = actual_predict - expected_predict;
        gradient_layer_2 = actual_predict * (1 - actual_predict);
        weights_delta_layer_2 = error_layer_2 * gradient_layer_2;
        for (int i = 0; i < 2; i++) // most likely the mistake is here :(
            weight_2[i] = weight_2[i] - hidN[i] * weights_delta_layer_2 * learning_rate;

        // adjusts the weights of the input neurons (inputs)
        for (int i = 0; i < 2; i++) {
            error_layer_1[i] = weights_delta_layer_2 * weight_2[i];
            gradient_layer_1[i] = hidN[i] * (1 - hidN[i]);
            weights_delta_layer_1[i] = error_layer_1[i] * gradient_layer_1[i];
        }
        for (int i = 0; i < 3; i++) // or here
            for (int j = 0; j < 2; j++)
                weight_1[i][j] = weight_1_2[i][j] - inputs[i] * weights_delta_layer_1[j] * learning_rate;
    }

Algorithm source: https://www.youtube.com/watch?v=HA-F6cZPvrg

Full code question: I can’t correctly implement backpropagation

  • Please at least specify the dimensions and types of weight_1, inputs, weight_1_2, error_layer_1, hidN, and which source the algorithm itself was taken from. - Unick
  • Why don't you use the sigmoid function for the output layer? And what is the structure of your neural network? - Unick
  • My mistake; I'm just blindly trying combinations :( - Roman Khudoberdin
  • There are plenty of oddities in the code, but no obvious mistakes offhand... - AnT
  • @AnT for example? - Roman Khudoberdin

1 answer

The line with the error:

 weight_1[i][j] = weight_1_2[i][j] - inputs[i] * weights_delta_layer_1[j] * learning_rate; 

Corrected line:

 weight_1[i][j] = weight_1[i][j] - inputs[i] * weights_delta_layer_1[j] * learning_rate;
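The update must read from and write to the same `weight_1` array; reading from the stale copy `weight_1_2` means the first-layer weights never accumulate their gradient steps. As a sanity check, here is a minimal, self-contained sketch of the corrected `train()` (the initial weight values are hypothetical, chosen only to make the example runnable; the 3-2-1 layout mirrors the question's variables). It also computes the hidden-layer deltas *before* updating `weight_2`, so the error is propagated through the pre-update output weights, as standard backpropagation prescribes:

```cpp
#include <cmath>

// Minimal 3-2-1 network mirroring the question's variables.
// Initial values below are hypothetical, for illustration only.
double inputs[3] = {1.0, 0.0, 1.0};
double hidN[2];
double output;
double weight_1[3][2] = {{0.1, -0.2}, {0.3, 0.4}, {-0.5, 0.2}};
double weight_2[2] = {0.3, -0.1};
double learning_rate = 0.08;

double sigmoid(double x) { return 1.0 / (1.0 + std::exp(-x)); }

void train(double expected_predict) {
    // Forward pass.
    for (int i = 0; i < 2; i++)
        hidN[i] = sigmoid(inputs[0] * weight_1[0][i]
                        + inputs[1] * weight_1[1][i]
                        + inputs[2] * weight_1[2][i]);
    output = sigmoid(hidN[0] * weight_2[0] + hidN[1] * weight_2[1]);

    // Output-layer delta.
    double error_layer_2 = output - expected_predict;
    double gradient_layer_2 = output * (1.0 - output);
    double weights_delta_layer_2 = error_layer_2 * gradient_layer_2;

    // Hidden-layer deltas: computed with weight_2 as it was during
    // the forward pass, i.e. before it is updated below.
    double weights_delta_layer_1[2];
    for (int i = 0; i < 2; i++) {
        double error_layer_1 = weights_delta_layer_2 * weight_2[i];
        double gradient_layer_1 = hidN[i] * (1.0 - hidN[i]);
        weights_delta_layer_1[i] = error_layer_1 * gradient_layer_1;
    }

    // Weight updates: each array is read from and written to itself.
    for (int i = 0; i < 2; i++)
        weight_2[i] -= hidN[i] * weights_delta_layer_2 * learning_rate;
    for (int i = 0; i < 3; i++)
        for (int j = 0; j < 2; j++)
            weight_1[i][j] -= inputs[i] * weights_delta_layer_1[j] * learning_rate;
}
```

With the fix, repeated calls on a fixed example should drive `output` toward the target, which is easy to verify by checking that the error shrinks over iterations.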