
Hello

I have this code:

    int start = 2;
    int len = 5;
    for (int i = 0; i < len; i++)
    {
        decimal d = 1 / start;
        Console.WriteLine(@"1/" + start + " = " + d);
        start = start * 2;
    }

Logically, the output should be:

    1/2 = 0.5
    1/4 = 0.25
    1/8 = 0.125
    ...

Why does the division 1/2 print 0 when the result is stored in a decimal?

How can I fix it?

Marked as a duplicate by Grundy, pavel, Mirdin, Alexey Shimansky, default locale Sep 5 '17 at 6:42.

A similar question was asked earlier and has already been answered. If the existing answers are not exhaustive, please ask a new question.

  • Because the decimal has nothing to do with it here: you are using integer division, since both operands are integers. - Grundy
  • Because 1 and 2 are integers. - pavel
  • Either 1) double start = 2.0; or 2) d = 1.0 / start; - andy.37
  • Thank you very much, I replaced 1 with 1.0d and it all worked! Thanks! - Sharpogay Vasily
  • The d suffix makes it a double; if you want a decimal, put m: decimal d = 1m / start; - Grundy
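
As a side note, here is a minimal sketch of the literal-based fixes suggested in the comments above (the variable names and the standalone-snippet form are just for illustration):

    int start = 2;

    // Option 1: a double literal forces floating-point division
    double asDouble = 1.0 / start;    // 0.5

    // Option 2: the m suffix makes the literal a decimal, so the division is done in decimal
    decimal asDecimal = 1m / start;   // 0.5

    Console.WriteLine(asDouble);
    Console.WriteLine(asDecimal);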

1 answer

Division of two integer types gives an integer result. Change your code to:

    decimal d = (decimal)1 / start;
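
For completeness, here is the whole loop from the question with that cast applied (a minimal sketch; nothing else changed):

    int start = 2;
    int len = 5;
    for (int i = 0; i < len; i++)
    {
        // The cast makes one operand a decimal, so the division is decimal, not integer
        decimal d = (decimal)1 / start;
        Console.WriteLine(@"1/" + start + " = " + d);
        start = start * 2;
    }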

Result:

    1/2 = 0.5
    1/4 = 0.25
    1/8 = 0.125
    1/16 = 0.0625
    1/32 = 0.03125

  • Thank you very much, I will read the answer now. - Sharpogo Vasily
  • Why create an int and then cast it to decimal with (decimal)1? You can just say right away that 1 is a decimal, for example var d = 1m / start; - Vadim Prokopchuk
  • @VadimProkopchuk There could be a variable in place of the 1, so I gave a universal answer. Options that specify the literal's type are in the comments to the question. - mirypoko
  • oops, did not notice :) - Vadim Prokopchuk