For some reason the Random.Range() method with an interval from 0 to 1 doesn't work for me at all — it always returns 0.
Here is the code, and below is the result of its execution:
for (int i = 0; i < 100; i++) { Debug.Log(Random.Range(0, 1)); }
The Random.Range method has two overloads: float Random.Range(float min, float max) and int Random.Range(int min, int max). In your case you pass two ints, so the overload that returns an int is used. Since the int overload returns numbers from the interval [min, max) — the upper bound is exclusive — the only integer it can return is 0. To get real numbers between 0 and 1, pass floats instead (note that the float overload is inclusive at both ends, i.e. [min, max]):
for (int i = 0; i < 100; i++) { Debug.Log(Random.Range(0f, 1f)); }

Source: https://ru.stackoverflow.com/questions/429239/
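The same int-vs-float pitfall exists in plain .NET, so it can be checked outside Unity. Here is a minimal console sketch using System.Random (not Unity's Random — the names and behavior differ, but the trap is analogous): Next(0, 1) has an exclusive upper bound and can only ever yield 0, while NextDouble() returns reals in [0, 1).

```csharp
using System;

class RangeDemo
{
    static void Main()
    {
        var rng = new Random();

        // Integer overload: the upper bound is exclusive, so Next(0, 1)
        // can only ever return 0 — analogous to Unity's Random.Range(0, 1).
        for (int i = 0; i < 5; i++)
            Console.WriteLine(rng.Next(0, 1));   // always prints 0

        // Floating point: NextDouble() returns a real number in [0, 1).
        for (int i = 0; i < 5; i++)
            Console.WriteLine(rng.NextDouble());
    }
}
```

Unlike Unity's float Random.Range, which is inclusive of both bounds, NextDouble() never returns exactly 1.0.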