I cannot understand the difference between this working version:

    a = int(input())
    b = int(input())
    i = 1
    while i % a != 0 or i % b != 0:
        i += 1
    print(i)

and this one, which goes into an infinite loop:

    a = int(input())
    b = int(input())
    i = 2
    while a % i != 0 or b % i != 0:
        i += 1
    print(i)

Logically, both loops should lead to the same result and then terminate...

  • Instead of int(input()) it would be better to use specific values :) - gil9red
  • Say a = 5, b = 3. - freak-js
  • "Logically, both loops should lead to the same result" - by that very logic, this is exactly what should not happen. - Enikeyschik
  • I've figured it out now. The sleepless night is just bearing its fruit. - freak-js

2 answers

In the first example, you take the remainder of dividing the variable i by the variables a and b:

 while i % a != 0 or i % b != 0: 

In the second example, you are trying to find the remainder of dividing the variables a and b by the variable i:

 while a % i != 0 or b % i != 0: 

Here the condition stays true on every iteration: once i grows past both a and b, a % i is simply a and b % i is simply b, both non-zero. The loop can only stop if some i >= 2 divides both a and b, i.e. if they share a common divisor greater than 1; otherwise it runs forever.
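For example, with the values a = 5 and b = 3 suggested in the comments, a bounded trace shows that the condition never becomes false (the cut-off below is added only for this demonstration; the original loop has none):

    # minimal sketch: trace the second loop with coprime inputs
    a, b = 5, 3                  # values suggested in the comments above
    i = 2
    while a % i != 0 or b % i != 0:
        i += 1
        if i > 100:              # cut-off added only for the demo
            print("no i ever divides both 5 and 3 - the real loop never stops")
            break
    else:
        print(i)                 # reached only if some i divides both a and b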

    I slept on it and realized with horror how disastrous inattention can be. As one good man said: "The best debugger is a good night's sleep." If anyone runs into a similar case, here is the answer: dividing the variable i by the inputs always terminates, while dividing the inputs by i terminates only when a and b share a common divisor greater than 1 (for example, when a == b).
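    For anyone comparing the two versions side by side, here is a minimal sketch; the function names and the limit parameter are mine, added purely for illustration:

        def smallest_common_multiple(a, b):
            # first version: i is divided by the inputs; terminates at the LCM,
            # which always exists (at the latest at i == a * b)
            i = 1
            while i % a != 0 or i % b != 0:
                i += 1
            return i

        def smallest_common_divisor(a, b, limit=10_000):
            # second version: the inputs are divided by i; terminates only if
            # a and b share a divisor >= 2, otherwise it would loop forever,
            # so a limit is added here purely for demonstration
            i = 2
            while a % i != 0 or b % i != 0:
                i += 1
                if i > limit:
                    return None
            return i

        print(smallest_common_multiple(5, 3))   # 15
        print(smallest_common_divisor(5, 3))    # None: 5 and 3 are coprime
        print(smallest_common_divisor(4, 6))    # 2: a common divisor exists even though a != b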