Minimizing relative error

Suppose you know a number is between 30 and 42. You want to guess the number while minimizing how wrong you could be in the worst case. Then you’d guess the midpoint of the two ends, which gives you 36.

But suppose you want to minimize the worst case relative error instead. If you chose 36, as above, and the correct answer turned out to be 30, then your relative error would be (36 − 30)/30 = 1/5.

But suppose you had guessed 35 instead. The numbers in the interval [30, 42] furthest from 35 are of course the end points, 30 and 42. In either case the relative error would be 1/6: 5/30 at the low end and 7/42 at the high end.
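
Here’s a quick numerical check of those two guesses, a minimal Python sketch (worst_relative_error is just an illustrative helper, using the fact that the worst case sits at an endpoint):

```python
def worst_relative_error(guess, a, b):
    # For a <= guess <= b, the relative error |guess - t| / t over t in [a, b]
    # is largest when t is one of the endpoints.
    return max((guess - a) / a, (b - guess) / b)

print(worst_relative_error(36, 30, 42))  # 0.2     = 1/5, worst case at t = 30
print(worst_relative_error(35, 30, 42))  # 0.1666… ≈ 1/6, same error at both ends
```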

Let’s look at the general problem for an interval [a, b]. It seems the thing to do is to pick x so that the relative error is the same whether the true value is at either extreme, a or b. In that case

(x − a) / a = (b − x) / b

and if we solve for x we get

x = 2ab / (a + b).

In other words, the worst case error is minimized when we pick x to be the harmonic mean of a and b.
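
As a sanity check, the sketch below computes the harmonic mean of 30 and 42 and scans a grid of candidate guesses to confirm that none of them does better on worst case relative error (the function names are only illustrative):

```python
def worst_relative_error(guess, a, b):
    # Worst case over true values in [a, b]; attained at an endpoint.
    return max((guess - a) / a, (b - guess) / b)

def harmonic_mean(a, b):
    return 2 * a * b / (a + b)

a, b = 30, 42
h = harmonic_mean(a, b)  # 35.0

# Scan a fine grid of guesses in [a, b] and keep the one with the
# smallest worst-case relative error.
grid = [a + i * (b - a) / 100_000 for i in range(100_001)]
best = min(grid, key=lambda x: worst_relative_error(x, a, b))

print(h, worst_relative_error(h, a, b))        # 35.0  0.16666...
print(best, worst_relative_error(best, a, b))  # ≈ 35.0, essentially the same error
```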


Related post: Where to wait for an elevator

4 thoughts on “Minimizing relative error”

  1. Hi John, another way to present this is to take the log of all values. The relative error in the original value space becomes the absolute error in the log value space. The value that minimizes that error is the mean of log values, which, once mapped to the original value space, is the harmonic mean.

  2. If you take the mean of the log values, you get the geometric mean, not the harmonic mean. That minimizes the worst-case ratio, but not the relative error, i.e. it doesn’t involve a subtraction.

    In more detail, say the true value is x and your guess is x_hat. The geometric mean minimizes max(x, x_hat) / min(x, x_hat). But the relative error is abs((x – x_hat)/x).
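
    A small numerical sketch makes the distinction concrete (the helper names are only illustrative):

    ```python
    import math

    a, b = 30, 42
    geo = math.sqrt(a * b)     # geometric mean, about 35.50
    har = 2 * a * b / (a + b)  # harmonic mean, 35.0

    def worst_ratio(x):
        # Worst-case multiplicative error max(x, t) / min(x, t) over t in {a, b}.
        return max(x / a, b / x)

    def worst_rel_error(x):
        # Worst-case relative error |x - t| / t over t in {a, b}.
        return max((x - a) / a, (b - x) / b)

    print(worst_ratio(geo), worst_ratio(har))          # geometric mean has the smaller ratio
    print(worst_rel_error(geo), worst_rel_error(har))  # harmonic mean has the smaller relative error
    ```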
