Here is a good explanation of this:
What is “font size”? It is not the “height of the largest letter,” as one might assume.
The font size is a conventional value that is baked into the font itself.
It is usually slightly larger than the distance from the top of the tallest letter to the bottom of the lowest one; the assumption is that any letter, or any combination of letters, fits within this height. At the same time, the descenders of letters such as p and g can extend below it. That is why the line height is usually set slightly larger than the font size.
So the font size cannot be measured with any standard formula; moreover, different fonts reserve different amounts of this extra spacing.
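As a minimal illustration of why line height is made larger than the font size (the class name and values here are hypothetical):

.text {
    font-size: 16px;   /* the font's nominal "conventional unit" */
    line-height: 1.3;  /* ≈ 20.8px, slightly taller so the tails of p, g, y have room */
}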
I usually pick the font size by trial and error: in the debugger I find the px value that visually matches a given value in vh or vw, and then derive a conversion coefficient from it. For example, if 14px = 2.8vh, the coefficient is 14 / 2.8 = 5, so 20px / 5 = 4vh.
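Applying that coefficient of 5 to a few sizes might look like this (the selectors are made up for illustration):

h1    { font-size: 6vh;   }  /* 30px / 5 = 6vh   */
p     { font-size: 2.8vh; }  /* 14px / 5 = 2.8vh */
small { font-size: 2.4vh; }  /* 12px / 5 = 2.4vh */

Note that the coefficient holds for the viewport size you matched against; the point of vh is that the result then scales with the viewport.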
UPD.
The “debugger” used here to compare fonts is just the browser’s regular developer tools, for example in Chrome. Write the CSS like this:
.class {
    font-size: 14px;   /* the reference value */
    font-size: 2.8vh;  /* the candidate; the later declaration wins */
}
Then toggle the checkbox next to the lower declaration on and off in the debugger, adjusting the vh value until you stop noticing any difference between the two states.
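Once the toggling no longer produces a visible change, the px line can be dropped, leaving only the viewport-relative value (again, the class name is illustrative):

.class { font-size: 2.8vh; }

Alternatively, keep both declarations: browsers that do not understand vh will ignore the second line and fall back to the px value.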