Tell me, how can I set the size of an image in WinForms in millimeters?

The image must keep its physical size regardless of the screen's size and resolution. Using a fixed percentage of the screen area is not an option.

  • And what do millimeters have to do with it? Screen resolution is always in pixels, and percentages always work correctly. Why do you need millimeters? Look at the monitor settings, where the dots per inch are listed, convert that to millimeters, and you get your values. - Dex
  • How do you imagine that working? :) The computer takes a ruler, measures how many millimeters your screen has, then takes the image and adjusts its size? :) - Sergey
  • @Sergey, actually this does make some sense. For example, a picture will look different on a 1920×1080 TV with a 40-inch diagonal than on a monitor with the same resolution but a 22-inch diagonal. Though I can hardly imagine the use case myself. Still, it can be done easily enough, I just don't know how to do it in C#. - Dex
  • I wouldn't put such an overkill resolution on a 40-inch monitor. And I honestly can't imagine an urgent need to match the on-screen size of an image in millimeters. - Sergey
  • What do you mean? - Dex


2 Answers

Think about this:

  1. Get the screen resolution (1280 × 800).
  2. Get the number of pixels per inch (96 dpi ≈ 3.779 dots per millimeter); accordingly, the physical size of the monitor in millimeters is (1280 / 3.779 ≈ 338.71) by (800 / 3.779 ≈ 211.69), roughly 33.9 × 21.2 cm.
  3. Now take the size of the image, say 200 × 200 pixels.
  4. On our screen that works out to 52.92 × 52.92 mm.
  5. Turn the values obtained into percentages, 15.65% × 25.08%, and apply them back to pixels (though I'm not sure about this step; you could also convert back to pixels some other way; see the sketch after this list).
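
If you do want to go through millimeters, the conversion itself is simple: 1 inch = 25.4 mm, and Windows reports pixels per inch through Graphics.DpiX / DpiY. A minimal sketch (the helper name and usage line are mine; also note that the reported DPI is the logical setting, not the physical density of the panel, which is exactly what the comments warn about):

    using System;
    using System.Drawing;
    using System.Windows.Forms;

    static class MmToPixels
    {
        // Converts a size given in millimeters to pixels for a control,
        // using the DPI that Windows reports (logical, not physical!).
        public static Size ToPixels(Control control, float widthMm, float heightMm)
        {
            using (Graphics g = control.CreateGraphics())
            {
                // 1 inch = 25.4 mm; DpiX / DpiY are pixels per inch.
                int widthPx  = (int)Math.Round(widthMm  / 25.4f * g.DpiX);
                int heightPx = (int)Math.Round(heightMm / 25.4f * g.DpiY);
                return new Size(widthPx, heightPx);
            }
        }
    }

    // Usage, e.g. a 50 × 50 mm picture box:
    // pictureBox1.Size = MmToPixels.ToPixels(pictureBox1, 50f, 50f);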

And now look here: 200 / (1280/100) = 15.625% and 200 / (800/100) = 25%. Incredible, isn't it? Not really, once you remember there is such a thing as "calculation error", and guess how much error five division operations introduce compared to two.
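
To make the "calculation error" point concrete, here is a tiny sketch using the rounded intermediate values from the list above:

    using System;

    // Direct route: two divisions, exact result.
    double direct = 200.0 / (1280.0 / 100.0);               // 15.625 %

    // Roundabout route through the rounded millimeter values above.
    double screenWidthMm = 338.71;                          // 1280 px / 3.779 dpmm, rounded
    double imageWidthMm  = 52.92;                           // 200 px / 3.779 dpmm, rounded
    double viaMm = imageWidthMm / (screenWidthMm / 100.0);  // ≈ 15.62 %

    Console.WriteLine($"{direct:F3}% vs {viaMm:F3}%");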

To be fair, if you read your question closely, the plan really looks like this:

  1. Define a constant that describes an "ideal" monitor, i.e. one on which your picture has exactly the right size (store its dimensions).
  2. Determine the actual size of the picture on another screen.
  3. Now calculate whether it should be shrunk or enlarged relative to the ideal monitor.

Something like this; you can refine it, this is just the first thing that came to mind. What is still worth formalizing is how to compute the "ideal monitor" values instead of hard-coding a constant, but nothing comes to mind so far. A rough sketch of the scaling step is below.
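
For illustration, a rough sketch of that idea, assuming the "ideal" monitor is simply described by a DPI constant (the 96 dpi baseline and the helper are my own assumptions, not part of the question):

    using System;
    using System.Drawing;
    using System.Windows.Forms;

    static class IdealMonitorScaling
    {
        // DPI of the "ideal" monitor on which the picture has its intended size.
        const float IdealDpi = 96f;

        // Scales a size designed for the ideal monitor to the actual screen's DPI.
        public static Size Scale(Control control, Size designedSize)
        {
            using (Graphics g = control.CreateGraphics())
            {
                float scaleX = g.DpiX / IdealDpi;
                float scaleY = g.DpiY / IdealDpi;
                return new Size(
                    (int)Math.Round(designedSize.Width  * scaleX),
                    (int)Math.Round(designedSize.Height * scaleY));
            }
        }
    }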

    The wiki can help here :)

    P.S.: Dex has it right in his comment on the question.

    P.P.S.: if a printer is involved, it seems you can set the PPI yourself and do the corresponding calculations; a sketch of printing in millimeters is below.
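
    One way to do that calculation is to let GDI+ draw in millimeters directly when printing, via Graphics.PageUnit. A small sketch with PrintDocument (the file name and sizes are placeholders):

        using System.Drawing;
        using System.Drawing.Printing;

        var doc = new PrintDocument();
        doc.PrintPage += (sender, e) =>
        {
            // Switch the page coordinate system to millimeters.
            e.Graphics.PageUnit = GraphicsUnit.Millimeter;
            using (var image = Image.FromFile("picture.png"))
            {
                // Draw the image as a 50 × 50 mm rectangle, 10 mm from the top-left corner.
                e.Graphics.DrawImage(image, new RectangleF(10f, 10f, 50f, 50f));
            }
        };
        doc.Print();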