Hello, machine learning experts. I have a question about overfitting of classifiers: how can I tell that a model is overfitting, and what techniques can be used to avoid overfitting? If there are articles or good literature on the topic, please share them — I'm not very well versed in machine learning.

Closed as too broad by andreymal, aleksandr barakin, Kromster, Viktorov, cheops Jan 6 at 5:31.

Please edit the question so that it describes a specific problem in enough detail to determine an appropriate answer. Avoid asking several questions at once. See "How to ask a good question?" for guidance. If the question can be reworded to fit the rules set out in the help center, edit it.

  • Did the articles and literature that search engines return for the query "overfitting" somehow not suit you? — andreymal
  • @andreymal, I can't find anything — Alexey Navalny

1 answer

Alexey, overfitting often happens simply because the training sample is too small, so the neural network easily "memorizes" it instead of trying to find any general regularity. To combat this, it is often enough to use a larger amount of data to train the network.
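A minimal sketch of how to see this in practice (not from the answer above — the dataset, model, and split sizes here are illustrative assumptions): overfitting shows up as a large gap between accuracy on the training data and accuracy on held-out validation data. Here an unconstrained decision tree is trained on a deliberately tiny synthetic sample using scikit-learn.

```python
# Hedged illustration: detecting overfitting via the train/validation gap.
# The generated dataset and the choice of DecisionTreeClassifier are
# assumptions for demonstration, not part of the original answer.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# A tiny sample with many features is easy to memorize.
X, y = make_classification(n_samples=60, n_features=20, random_state=0)
X_train, X_val, y_train, y_val = train_test_split(
    X, y, test_size=0.5, random_state=0
)

# An unconstrained tree fits the training sample perfectly...
overfit = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)
train_acc = overfit.score(X_train, y_train)  # essentially 1.0
val_acc = overfit.score(X_val, y_val)        # noticeably lower

print(f"unconstrained tree: train={train_acc:.2f}, val={val_acc:.2f}")

# ...while a regularized tree (limited depth) trades some training
# accuracy for a smaller train/validation gap.
regularized = DecisionTreeClassifier(max_depth=2, random_state=0).fit(
    X_train, y_train
)
print(
    f"depth-2 tree:       train={regularized.score(X_train, y_train):.2f}, "
    f"val={regularized.score(X_val, y_val):.2f}"
)
```

A wide gap between the two scores is the practical symptom the question asks about; shrinking it is what more training data, regularization, and similar techniques are for.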

There is a lot of information on Google; look there if you want to dig deeper.