I am trying to implement a simple feedforward neural network, trained with error backpropagation, but the result it gives after training tends to 0.5 — after 1,000 epochs of learning and even after 10,000. The network has 3 inputs, 2 neurons in the hidden layer, and 1 output neuron.
If I skip training and make a prediction with the freshly generated weights, the answer ranges from 0.5 to 1.
Here is the code as it stands at the moment.
/*var weight_1 = [[0.79, 0.44, 0.43], [0.85, 0.43, 0.29]],
  weight_2 = [[0.5, 0.52]],*/
var weight_1 = randArr(2, 3),
    weight_2 = randArr(1, 2),
    learning_rate = 0.05,
    data = [
      [[0,0,0], 0], [[0,0,1], 1], [[0,1,0], 1], [[0,1,1], 0],
      [[1,0,0], 1], [[1,0,1], 0], [[1,1,0], 0], [[1,1,1], 1]
    ];

console.log("---------Before training---------");
console.groupCollapsed("Prediction before");
predictSet(data);
console.groupEnd();

// Random rows-by-cols weight matrix.
function randArr(rows, cols) {
  var arr = [];
  for (var i = 0; i < rows; i++) {
    arr[i] = [];
    for (var j = 0; j < cols; j++) {
      arr[i][j] = Math.random();
    }
  }
  return arr;
}

// Sigmoid activation.
function activation(x) {
  return 1 / (1 + Math.exp(-x));
}

// Dot product of two vectors.
function sumMultVector(A, B) {
  var C = 0;
  for (var i = 0; i < A.length; i++) {
    C += A[i] * B[i];
  }
  return C;
}

// Activations of one layer for the given inputs.
function getWeightLayer(inputs, weight) {
  var outputs = [];
  for (var i = 0; i < weight.length; i++) {
    outputs[i] = activation(sumMultVector(inputs, weight[i]));
  }
  return outputs;
}

function predict(inputs) {
  var outputs_1 = getWeightLayer(inputs, weight_1);
  var outputs_2 = getWeightLayer(outputs_1, weight_2);
  return outputs_2[0];
}

function train(inputs, expected) {
  var hidden = getWeightLayer(inputs, weight_1);
  var output = getWeightLayer(hidden, weight_2)[0];

  var error_2 = output - expected;
  var delta_2 = error_2 * output * (1 - output);

  for (var j = 0; j < 2; j++) {
    // Hidden-layer delta, computed before weight_2 is touched.
    var error_1 = weight_2[0][j] * delta_2;
    var delta_1 = error_1 * hidden[j] * (1 - hidden[j]);

    // The gradient for an output weight is delta_2 times the hidden
    // activation feeding that weight.
    weight_2[0][j] -= hidden[j] * delta_2 * learning_rate;

    for (var i = 0; i < 3; i++) {
      weight_1[j][i] -= inputs[i] * delta_1 * learning_rate;
    }
  }
}

// Comparator for a rough shuffle via sort().
function compareRandom() {
  return Math.random() - 0.5;
}

function trainSet(data) {
  for (var k = 0; k < 5000; k++) {
    data.sort(compareRandom);
    for (var i = 0; i < data.length; i++) {
      train(data[i][0], data[i][1]);
    }
  }
}

function predictSet(data) {
  for (var i = 0; i < data.length; i++) {
    console.log("predicted:", predict(data[i][0]), "expected:", data[i][1]);
  }
}

trainSet(data);
console.log("---------After training---------");
console.groupCollapsed("Prediction after");
predictSet(data);
console.groupEnd();
console.groupCollapsed("Weights");
console.log("From the input layer to the hidden layer: ", weight_1);
console.log("From the hidden layer to the output layer: ", weight_2);
console.groupEnd();

My problem is that every result this algorithm produces is close to 0.5 and fairly random, no matter what inputs I pass in.
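Since a slipped factor in the delta rule is a common cause of this symptom, one sanity check is to compare the analytic gradient of a single sigmoid neuron against finite differences. This is a standalone sketch with made-up weights and names, not taken from the code above; it uses the same squared-error loss E = 0.5 * (y - t)^2 that the update rule above implies:

```javascript
// Finite-difference check of the delta-rule gradient for one sigmoid neuron.
function sigmoid(x) { return 1 / (1 + Math.exp(-x)); }

function forward(w, x) {
  var s = 0;
  for (var i = 0; i < w.length; i++) s += w[i] * x[i];
  return sigmoid(s);
}

var w = [0.5, -0.3, 0.8]; // arbitrary example weights
var x = [1, 0, 1];        // arbitrary example input
var target = 1;

// Analytic gradient: dE/dw[i] = (y - t) * y * (1 - y) * x[i]
var y = forward(w, x);
var delta = (y - target) * y * (1 - y);
var analytic = x.map(function (xi) { return delta * xi; });

// Numeric gradient via central differences on E = 0.5 * (y - t)^2
var eps = 1e-5;
var numeric = w.map(function (wi, i) {
  var wPlus = w.slice();  wPlus[i] += eps;
  var wMinus = w.slice(); wMinus[i] -= eps;
  var ePlus  = 0.5 * Math.pow(forward(wPlus, x) - target, 2);
  var eMinus = 0.5 * Math.pow(forward(wMinus, x) - target, 2);
  return (ePlus - eMinus) / (2 * eps);
});

console.log(analytic, numeric); // the two vectors should agree closely
```

If the two vectors agree, the delta rule itself is implemented correctly and the problem lies elsewhere (architecture, updates, or data).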
It would also help if someone has a similar but working neural network. Then I could reproduce its training results from fixed initial weights and use them to check the correctness of my own network.
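For comparison, here is a minimal feedforward network that does learn 3-bit parity with plain backpropagation. It is a sketch rather than a drop-in fix, and it deliberately differs from my setup in two assumed ways: it adds bias weights and uses 4 hidden neurons instead of 2, since 2 bias-free hidden units are likely too few for this problem; the learning rate of 0.5 and the epoch count are also just guesses to tune:

```javascript
// Minimal 3-4-1 sigmoid network for 3-bit parity, trained with plain
// backpropagation. Bias weights and the hidden-layer size of 4 are
// assumptions, not part of the original question's architecture.
function sigmoid(x) { return 1 / (1 + Math.exp(-x)); }

var data = [
  [[0,0,0], 0], [[0,0,1], 1], [[0,1,0], 1], [[0,1,1], 0],
  [[1,0,0], 1], [[1,0,1], 0], [[1,1,0], 0], [[1,1,1], 1]
];

var H = 4, learningRate = 0.5;
// wHidden[j] holds 3 input weights plus a trailing bias weight.
var wHidden = [], wOut = [];
for (var j = 0; j < H; j++) {
  wHidden[j] = [];
  for (var i = 0; i < 4; i++) wHidden[j][i] = Math.random() * 2 - 1;
  wOut[j] = Math.random() * 2 - 1;
}
wOut[H] = Math.random() * 2 - 1; // output bias

function forward(x) {
  var hidden = [];
  for (var j = 0; j < H; j++) {
    var s = wHidden[j][3]; // start from the bias weight
    for (var i = 0; i < 3; i++) s += wHidden[j][i] * x[i];
    hidden[j] = sigmoid(s);
  }
  var sOut = wOut[H];
  for (var k = 0; k < H; k++) sOut += wOut[k] * hidden[k];
  return { hidden: hidden, out: sigmoid(sOut) };
}

function train(x, target) {
  var f = forward(x);
  var deltaOut = (f.out - target) * f.out * (1 - f.out);
  for (var j = 0; j < H; j++) {
    // Hidden delta uses the output weight before it is updated.
    var deltaHidden = deltaOut * wOut[j] * f.hidden[j] * (1 - f.hidden[j]);
    for (var i = 0; i < 3; i++)
      wHidden[j][i] -= learningRate * deltaHidden * x[i];
    wHidden[j][3] -= learningRate * deltaHidden; // bias input is 1
    wOut[j] -= learningRate * deltaOut * f.hidden[j];
  }
  wOut[H] -= learningRate * deltaOut; // output bias
}

function totalError() {
  var e = 0;
  for (var i = 0; i < data.length; i++) {
    var d = forward(data[i][0]).out - data[i][1];
    e += 0.5 * d * d;
  }
  return e;
}

var errorBefore = totalError();
for (var epoch = 0; epoch < 20000; epoch++)
  for (var i = 0; i < data.length; i++)
    train(data[i][0], data[i][1]);
var errorAfter = totalError();
console.log("error before:", errorBefore, "after:", errorAfter);
```

Starting it from fixed instead of random weights would make two runs directly comparable, step by step. It usually drives the total squared error close to zero, though plain gradient descent can occasionally stall in a local minimum.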