Mentally multiply by π

This post will give three ways to multiply by π taken from [1].

Simplest approach

Here’s a very simple observation about π:

π ≈ 3 + 0.14 + 0.0014.

So if you need to multiply by π, it’s enough to multiply by 3 and by 14. Once you’ve multiplied by 14, you can reuse your work.

For example, to compute 4π, you’d compute 4 × 3 = 12 and 4 × 14 = 56. Then

4π ≈ 12 + 0.56 + 0.0056 = 12.5656.

The correct value is 12.56637… and so the error is 0.00077.

First refinement

Now of course π = 3.14159… and so the approximation above is wrong in the fourth decimal place. But you can squeeze out a little more accuracy with the observation

π ≈ 3 + 0.14 + 0.0014 + 0.00014 = 3.14154.

Now if we redo our calculation of 4π we get

4π ≈ 12 + 0.56 + 0.0056 + 0.00056 = 12.56616.

Now our error is 0.00021, which is 3.6 times smaller.

Second refinement

The approximation above is based on an underestimate of π. We can improve it a bit by adding half of our last term, based on

π ≈ 3 + 0.14 + 0.0014 + 0.00014 + 0.00014/2 = 3.14161.

So in our running example,

4π ≈ 12 + 0.56 + 0.0056 + 0.00056 + 0.00028 = 12.56644,

which has an error of 0.00007, which is three times smaller than above.
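If you’d like to verify these estimates, here’s a quick Python check of the three approximations to 4π:

```python
import math

pi4 = 4 * math.pi                # 12.56637...

simple = 12 + 0.56 + 0.0056      # 4 times (3 + 0.14 + 0.0014)
first  = simple + 0.00056        # first refinement: add 4 * 0.00014
second = first + 0.00028         # second refinement: add half of that

for approx in (simple, first, second):
    print(approx, abs(approx - pi4))
```

The errors come out to roughly 0.00077, 0.00021, and 0.00007, as above.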


[1] Trevor Lipscombe. Mental mathematics for multiples of π. The Mathematical Gazette, Vol. 97, No. 538 (March 2013), pp. 167–169

Handy approximation for roots of fractions

This post will discuss a curious approximation with a curious history.

Approximation

Let x be a number near 1, written as a fraction

x = p / q.

Then define s and d as the sum and difference of the numerator and denominator.

s = p + q
d = p − q

Since we are assuming x is near 1, s is large relative to d.

We have the following approximation for the nth root of x.

ⁿ√x ≈ (ns + d) / (ns − d).

This comes from a paper written in 1897 [1]. At the time there was great interest in approximations that are easy to carry out by hand, and this formula would have been very convenient.

The approximation assumes x is near 1. If it is not, you can multiply x by a number whose root you know to bring it near 1. There is an example below.

Examples

Positive d

Let’s find the cube root of x = 112/97. We have n = 3, p = 112, q = 97, s = 209, and d = 15. The approximation says

∛x ≈ 642/612 = 107/102 = 1.049019…

while the exact value is 1.049096… .

Negative d

The value of d might be negative, as when x = 31/32. If we want to find the fifth root, n = 5, p = 31, q = 32, s = 63, and d = −1.

⁵√x ≈ 314/316 = 157/158 = 0.9936708…

while the exact value is 0.9936703… .

x not near 1

If x is not near 1, you can make it near 1. For example, suppose you wanted to compute the square root of 3. Since 17² = 289, 300/289 is near 1. You could find the square root of 300/289, then multiply the result by 17/10 to get an approximation to √3.
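Here is a short Python sketch of the approximation, reproducing the examples above (the function name root_approx is mine):

```python
from fractions import Fraction

def root_approx(p, q, n):
    """Approximate the nth root of p/q by (ns + d)/(ns - d)."""
    s, d = p + q, p - q
    return Fraction(n * s + d, n * s - d)

print(float(root_approx(112, 97, 3)))         # cube root of 112/97
print(float(root_approx(31, 32, 5)))          # fifth root of 31/32
print(float(root_approx(300, 289, 2)) * 1.7)  # sqrt(3) via 300/289
```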

History

The author refers to this approximation as Mercator’s formula, presumably after Gerardus Mercator (1512–1594) [2] of map projection fame. A brief search did not turn up this formula, because Mercator’s projection drowns out Mercator’s formula in search results.

The author says a proof is given in Hutton’s Tracts on Mathematics, Vol 1. I tracked down this reference, and the full title in all its 19th century charm is

TRACTS
ON
MATHEMATICAL
AND
PHILOSOPHICAL SUBJECTS,
COMPRISING,
AMONG NUMEROUS IMPORTANT ARTICLES,
THE THEORY OF BRIDGES,
WITH SEVERAL PLANS OF IMPROVEMENT,
ALSO,
THE RESULTS OF NUMEROUS EXPERIMENTS ON
THE FORCE OF GUNPOWDER,
WITH APPLICATIONS TO
THE MODERN PRACTICE OF ARTILLERY.
IN THREE VOLUMES
BY CHARLES HUTTON, LL.D. AND F.R.S. &c.
Late Professor of Mathematics in the Royal Military Academy, Woolwich.

Hutton’s book looks interesting. You can find it on Archive.org. Besides bridges and gunpowder, the book has a lot to say about what we’d now call numerical analysis, such as ways to accelerate the convergence of series. Hutton’s version of the formula above does not require that x be near 1.


[1] Ansel N. Kellogg. Empirical Formulæ for Approximate Computation. The American Mathematical Monthly, Vol. 4, No. 2 (February 1897), pp. 39–49.

[2] Mercator’s projection is so familiar that we may not appreciate what a clever man he was. We can derive his projection now using calculus and logarithms, but Mercator developed it before Napier developed logarithms or Newton developed calculus. More on that here.

Doomsday 2024

John Conway’s “Doomsday” rule observes that every year, the dates 4/4, 6/6, 8/8, 10/10, 12/12, 5/9, 9/5, 7/11, and 11/7 fall on the same day of the week. In 2024 all these dates fall on a Thursday.

These dates are easy to memorize because they break down into doubled pairs of even digits—4/4, 6/6, 8/8, 10/10, and 12/12—and symmetric pairs of odd digits—5/9 and 9/5, 7/11 and 11/7. Because of their symmetry, this set of dates is the same whether interpreted according to the American or European convention.

In addition, the last day of February falls on “Doomsday.” Since 2024 is a leap year, this will be February 29.
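A quick check with Python’s datetime module confirms that all of these dates land on the same weekday in 2024:

```python
from datetime import date

doomsdays = [(4, 4), (6, 6), (8, 8), (10, 10), (12, 12),
             (5, 9), (9, 5), (7, 11), (11, 7), (2, 29)]
days = {date(2024, m, d).strftime("%A") for m, d in doomsdays}
print(days)  # {'Thursday'}
```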

Related post: Mentally calculating the day of the week

Mentally approximating factorials

Suppose you’d like to have a very rough idea how large n! is for, say, n less than 100.

If you’re familiar with such things, your Pavlovian response to “factorial” and “approximation” is Stirling’s approximation. Although Stirling’s approximation is extremely useful—I probably use it every few weeks—it is not conducive to mental calculation.

The cut point

It’s useful to know that 24! ≈ 10²⁴ and 25! ≈ 10²⁵.

Said another way, the curves for n! and 10ⁿ cross approximately midway between 24 and 25. To the left of the crossing, n! < 10ⁿ and to the right of the crossing n! > 10ⁿ.

So, for example, if you hear someone refer to permutations of the English alphabet, you know the number of permutations is going to be north of 10²⁶.

Left of the cut

Suppose you want to estimate n! for n < 24. You know n! < 10n, but maybe you’d like to be a little more precise.

I’ll suppose you know n! for n up to 6. The approximation

log₁₀ n! ≈ n − 2

has an absolute error of less than 1.5 for n = 7, 8, 9, …, 23.

Right of the cut

For n = 26, 27, 28, …, 100 the approximation

log₁₀ n! ≈ 7n/4 − 20

has an absolute error less than 3.2.

Note that calculating 7n/4 as n + n/2 + n/4 is probably easier than calculating (7n)/4.
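Both rules of thumb are easy to verify by brute force:

```python
import math

# left of the cut: log10(n!) = n - 2, roughly, for n = 7, ..., 23
for n in range(7, 24):
    assert abs(math.log10(math.factorial(n)) - (n - 2)) < 1.5

# right of the cut: log10(n!) = 7n/4 - 20, roughly, for n = 26, ..., 100
# (the worst error, near n = 56, is just over 3)
for n in range(26, 101):
    assert abs(math.log10(math.factorial(n)) - (7 * n / 4 - 20)) < 3.2
```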


How Mr. Benjamin squares numbers

This post is a sequel to the post How Mr. Bidder calculated logarithms published a few days ago. As with that post, this post is based on an excerpt from The Great Mental Calculators by Steven B. Smith.

Smith’s book says Arthur Benjamin squares large numbers using the formula

n² = (n + a)(n − a) + a²

where a is chosen to make the multiplication easier, i.e. to make n + a or n − a a round number. The method is then applied recursively to compute a², and the process terminates when you get to a square you have memorized. There are nuances to using this method in practice, but that’s the core idea.
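Here’s a sketch of the core idea in Python. The rounding rule and the assumption that squares below 100 are memorized are mine, not necessarily Benjamin’s:

```python
def mental_square(n):
    """Square n using n^2 = (n + a)(n - a) + a^2, recursing on a^2."""
    if n < 100:
        return n * n                  # small squares assumed memorized
    base = 10 ** (len(str(n)) - 1)    # e.g. 1000 for a 4-digit number
    a = n % base
    if a > base // 2:
        a = base - a                  # round to the nearest multiple of base
    return (n + a) * (n - a) + mental_square(a)

print(mental_square(4273))  # 18258529
```

For 4273 this computes 4000 × 4546 = 18,184,000, presumably the source of the intermediate value 184 that Benjamin mentions below, then adds 273² computed the same way.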

The Great Mental Calculators was written in 1983 when Benjamin was still a student. He is now a mathematics professor, working in combinatorics, and is also well known as a mathemagician.

Major system

Smith quotes Benjamin giving an example of how he would square 4273. Along the way he needs to remember 184 as an intermediate result. He says

The way I remember it is by converting 184 to the word ‘dover’ using the phonetic code.

I found this interesting because I had not heard of anyone using the Major system (“the phonetic code”) in real time. This system is commonly used to commit numbers to long-term memory, but you’d need to be very fluent in the system to encode and decode a number in the middle of a calculation.

Maybe a lot of mental calculators use the Major system, or some variation on it, during calculations. Most calculators are not as candid as Benjamin in explaining how they think.


How Mr. Bidder calculated logarithms

George Parker Bidder (1806–1878) was a calculating prodigy. One of his feats was mentally calculating logarithms to eight decimal places. This post will explain his approach.

I’ll use “log” when the base of the logarithm doesn’t matter, and add a subscript when it’s necessary to specify the base. Bidder was only concerned with logarithms base 10.

Smooth numbers

If you wanted to calculate logarithms, a fairly obvious strategy would be to memorize the logarithms of small prime numbers. Then you could calculate the logarithm of a large integer by adding the logarithms of its factors. This was indeed Bidder’s approach. Since he was calculating logarithms base 10, he could turn any number into an integer by shifting the decimal point, then subtract the number of places he had shifted after taking the log of the integer.

Numbers with only small prime factors are called “smooth.” The meaning of “small” depends on context, but we’ll say numbers are smooth if all the prime divisors are less than 100. Bidder knew the logs of all numbers less than 100 and the logs of some larger numbers.

Rough numbers

But what to do with numbers that have a large prime factor? In this case he used the equation

log(a + b) = log(a(1 + b/a)) = log(a) + log(1 + b/a).

So if a number n doesn’t factor into small primes, but it is close to a number a that does factor into small primes, you’re left with the problem of finding log(1 + b/a) where b = n − a and the fraction b/a is small.

At this point you may be thinking that now you could use the fact that

log(1 + x) ≈ x

for small x. However, Bidder was interested in logarithms base 10, and the equation above is only valid for natural logarithms, logarithms base e. It is true that

logₑ(1 + x) ≈ x

but

log₁₀(1 + x) ≈ log₁₀(e) x = 0.43429448 x.

Bidder could have used 0.43429448 x as his approximation, but instead he apparently [1] used a similar approximation, namely

log(1 + b/a) ≈ log(1 + 10⁻ᵐ) 10ᵐ b/a

where b/a is between 10⁻ᵐ⁻¹ and 10⁻ᵐ. This approximation is valid for logarithms with any base, though Bidder was interested in logs base 10 and had memorized log₁₀ 1.01, log₁₀ 1.001, log₁₀ 1.0001, etc. [2]

Finding nearby smooth numbers

To get eight significant figures, the fraction b/a must be on the order of 0.001 or less. But not every number n whose log Bidder might want to calculate is so close to a smooth number. In that case Bidder might multiply n by a constant k to find a number so that kn is close to a smooth number, take the log of kn, then subtract log k.

Smith says in [1] “For example, to obtain log 877, he would multiply 877 by 13, which gives 11,401, or 600 × 19 + 1.” Then he could calculate

log(877) = log(13 × 877) − log(13) = log(11400) + log(1 + 1/11400) − log(13)

and use his algorithm for approximating log(1 + 1/11400).

I can’t imagine thinking “Hmm. 877 isn’t close enough to a smooth number, but if I multiply it by 13 it will be.” But apparently Bidder thought that way.
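Here is the whole calculation for log 877 in Python. The memorized values below are my assumptions about what Bidder would have known, rounded to eight decimal places:

```python
import math

# memorized logs of small primes
log10_2, log10_3 = 0.30102999, 0.47712125
log10_13, log10_19 = 1.11394335, 1.27875360

# memorized log10(1 + 10^-4), as in the table in note [2]
log10_1p0001 = 0.00004342

# 13 * 877 = 11401 = 11400 * (1 + 1/11400), and 11400 = 2 * 3 * 19 * 100
log10_11400 = log10_2 + log10_3 + log10_19 + 2

x = 1 / 11400                        # between 10^-5 and 10^-4
correction = log10_1p0001 * 1e4 * x  # the interpolation, with m = 4

approx = log10_11400 + correction - log10_13
print(approx, math.log10(877))  # agree to about six decimal places
```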

Related posts

Here is a simple way to approximate logarithms to a couple significant figures without having to memorize anything.

See also this post on mentally calculating other functions to similar accuracy.

Notes

[1] This equation comes from The Great Mental Calculators by Steven B. Smith. Bidder’s methods are not entirely known, and Smith prefaces the approximation above by saying “it appears that Bidder’s approximation was …”.

[2]

|-------+-------------|
|     x | log_10(1+x) |
|-------+-------------|
| 10^-2 |  0.00432137 |
| 10^-3 |  0.00043408 |
| 10^-4 |  0.00004342 |
| 10^-5 |  0.00000434 |
| 10^-6 |  0.00000043 |
| 10^-7 |  0.00000004 |
|-------+-------------|

Mentally calculating the day of the week in 2023

Mentally calculating the day of the week will be especially easy in 2023. The five-step process discussed here reduces to three steps in 2023.

One of the steps involves leap years, and 2023 is not a leap year. Another step involves calculating and adding in the “year share,” and the year share for 2023 is zero. So the five steps reduce to these three steps.

  1. Start with a constant corresponding to the month.
  2. Add the day of the month.
  3. Take the remainder by 7.

The constants for each month are as follows.

    | January | 6 | February | 2 | March     | 2 |
    | April   | 5 | May      | 0 | June      | 3 |
    | July    | 5 | August   | 1 | September | 4 |
    | October | 6 | November | 2 | December  | 4 |

You have to come up with your own mnemonics for memorizing this table. As I commented here,

It’s typically much easier to memorize something than to come up with a mnemonic that other people would find acceptable. No matter how natural your mnemonics sound to you, they usually sound like nonsense to anyone else.

Examples

To find the day of the week for July 4, 2023 we start with 5 for July, add 4, and get 9, which leaves a remainder of 2 when divided by 7. Days of the week are numbered starting with Sunday as 0, so 2 corresponds to Tuesday.

Pearl Harbor Day, December 7, will be on a Thursday next year because 4 + 7 leaves a remainder of 4 when divided by 7.
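The three steps translate directly to code. Here’s a sketch, with days numbered from Sunday = 0:

```python
MONTH_CONSTANTS = [6, 2, 2, 5, 0, 3, 5, 1, 4, 6, 2, 4]

def day_of_week_2023(month, day):
    """Day of the week in 2023, 0 = Sunday."""
    return (MONTH_CONSTANTS[month - 1] + day) % 7

print(day_of_week_2023(7, 4))   # 2, Tuesday
print(day_of_week_2023(12, 7))  # 4, Thursday
```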

Doomsday

An alternative to the method above is John Conway’s “Doomsday” rule. This rule observes that every year, the dates 4/4, 6/6, 8/8, 10/10, 12/12, 5/9, 9/5, 7/11, and 11/7 fall on the same day of the week. These dates are easy to memorize because they break down into doubled pairs of even digits—4/4, 6/6, 8/8, 10/10, and 12/12—and symmetric pairs of odd digits—5/9 and 9/5, 7/11 and 11/7.

For 2023 the Doomsday dates all fall on Tuesday. You can bootstrap your way to calculating other days of the week starting from these landmarks.

You can add “March 0”, i.e. the last day of February, to the list. In 2023 this is February 28 though of course it is sometimes February 29. And when March 0 is February 28, you can add February 0, i.e. January 31, to the list of Doomsday dates.

The Doomsday rule is less work up front than the rule above, but it’s more work in the long run.

Beyond 2023

In general you have to compute the year share in order to find the day of the week of a date, though in 2023 the year share is zero. This post gives multiple ways to compute the year share in your head.


Mental hash function

A few years ago I wrote about Manuel Blum’s proposed method for mentally computing a secure hash function. He proposed using this method as a password manager, using the hash of a web site’s name as the password for the site.

I first wrote about Blum’s method on the Heidelberg Laureate Forum blog, then wrote some footnotes to the post here.

NB: This method is insecure for reasons Rob Shearer describes in the comments.

Algorithm

Blum’s algorithm requires creating and memorizing two things: a random map from letters to digits, and a random permutation of digits.

Let f be a function that maps the set {A, B, C, …, Z} to the set {0, 1, 2, …, 9} created by a random number generator, and let g be a random permutation of {0, 1, 2, …, 9}.

Start with the text you want to hash. Then use f to convert the text to a sequence of digits, replacing each character c with f(c). Label this sequence of digits

a₀ a₁ a₂ … aₙ.

To create the first digit of your hash value, add the first and last digits of your sequence of digits, take the last digit of the sum, and apply the permutation g to it. In symbols, the first digit of the output, b₀, is given by

b₀ = g( (a₀ + aₙ) % 10 )

where a % 10 is the remainder when a is divided by 10.

For the rest of the digits, add the previous output to the current input, take the last digit, and apply the permutation. That is, for k > 0,

bₖ = g( (bₖ₋₁ + aₖ) % 10 ).

A few years ago Blum recommended using at least 12 characters.

Python code

Here’s some Python code to implement the method above and make all the details explicit.

    from numpy.random import default_rng
    
    rng = default_rng(20220516)
    
    # f: a random map from the letters a-z to the digits 0-9
    fdata = rng.integers(10, size=26)
    def f(ch):
        ch = ch.lower()
        return fdata[ord(ch) - ord('a')]
    
    # g: a random permutation of the digits, applied by replacing a
    # digit with its successor in the shuffled sequence gdata
    gdata = rng.permutation(10)
    def g(n):
        i = list(gdata).index(n)
        return gdata[(i+1) % 10]
    
    def hash(text):
        digits = [f(c) for c in text]
        N = len(text)
        out = [0] * N
        # first output digit combines the first and last input digits
        out[0] = g((digits[0] + digits[-1]) % 10)
        # each later digit combines the previous output with the current input
        for i in range(1, N):
            out[i] = g((out[i-1] + digits[i]) % 10)
        return "".join(str(d) for d in out)
    
    print(hash("hello"))

This prints 26752.

Is mental cryptography hopeless?

It’s been eight years since I talked to Manuel Blum. Perhaps he realized years ago that this system is flawed. But I still find his initial question interesting: are there encryption algorithms that humans can execute mentally that cannot be easily broken by computers?

It would be interesting if such an algorithm could at least put up a good fight. Cryptographic algorithms are considered broken if they can be compromised given a few weeks of computing. Could a mental hashing algorithm at least withstand an hour of computer attack?

Mentally calculating the day of the week

In my previous post I mentioned John Conway’s Doomsday rule for calculating the day of the week for any date. This method starts off very simple, but gets more complicated when you actually use it.

This post will present an alternative method that’s easier to use in practice and can be described more succinctly.

Here’s the algorithm.

  1. Take the last two digits of the year and add the number of times 4 divides that number.
  2. Add a constant corresponding to the month.
  3. Add the day of the month.
  4. Subtract 1 in January or February of a leap year.
  5. Take the remainder by 7.

The result is a number that tells you the day of the week.

To be even more succinct, let y be the number formed by the last two digits of the date. Let d be the day of the month. Let L equal 1 if the year is a leap year and the date is in January or February and 0 otherwise. Then the algorithm above can be written as

(y + ⌊y/4⌋ + d − L + month constant) % 7.

Here ⌊x⌋ is the floor of x, the greatest integer no greater than x, and x % 7 is the remainder when x is divided by 7.

Update: There are multiple ways to compute the year share, i.e. y + ⌊y/4⌋, that are equivalent mod 7 and easier to carry out mentally than the direct approach.

I’ve deliberately left out a couple details above. What are these month constants, and how does the number at the end give you a day of the week?

Customizing for the 21st century

I learned the method above in the 20th century, and the rule was optimized for the 20th century. You had to subtract 1 for dates in the 21st century.

Also, when I learned the method, it numbered the days of the week with Sunday = 1, Monday = 2, etc. Now I would find it easier to start with Sunday = 0.

Here’s an updated table of month constants that eliminates the need to adjust for dates in the 20th century, and numbers the days of the week from 0.

    | January | 6 | February | 2 | March     | 2 |
    | April   | 5 | May      | 0 | June      | 3 |
    | July    | 5 | August   | 1 | September | 4 |
    | October | 6 | November | 2 | December  | 4 |

If you’d like to number Sunday as 1 rather than 0, add 1 to all the numbers above.

The article I learned this method from came with suggested mnemonics for remembering the month constants. But when you change the constants like I’ve done here, you have to come up with new mnemonics.

Examples

I’m writing this post on Saturday, May 7, 2022. Let’s verify that the method gives the correct day of the week today.

Start with 22, and add 5 because ⌊22/4⌋ = 5. The number for May is 0, so there’s nothing to add for the month. We add 7 because today’s the 7th. This gives us

22 + 5 + 7 = 34

which gives a remainder of 6 mod 7, so this is day 6. Since we started with 0 for Sunday, 6 is Saturday.

Next, let’s do January 15, 2036. We get

36 + 9 + 6 + 15 − 1 = 65

which is 2 mod 7, so January 15 will fall on a Tuesday in 2036. Note that we subtracted 1 because 2036 will be a leap year and our date is in January.

If we want to look at Christmas 2036, we have

36 + 9 + 4 + 25 = 74

which is 4 mod 7, so December 25 will fall on a Thursday in 2036. We didn’t subtract 1 because although 2036 is a leap year, we’re looking at a date after February.
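If you’d like to check your mental arithmetic, here is the full algorithm in Python for 21st-century dates, with days numbered from Sunday = 0:

```python
MONTH_CONSTANTS = [6, 2, 2, 5, 0, 3, 5, 1, 4, 6, 2, 4]

def day_of_week(year, month, day):
    """Day of the week for a date in 2000-2099, 0 = Sunday."""
    y = year % 100
    L = 1 if y % 4 == 0 and month <= 2 else 0  # leap-year correction
    return (y + y // 4 + MONTH_CONSTANTS[month - 1] + day - L) % 7

print(day_of_week(2022, 5, 7))    # 6, Saturday
print(day_of_week(2036, 1, 15))   # 2, Tuesday
print(day_of_week(2036, 12, 25))  # 4, Thursday
```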

Other centuries

For dates in the 20th century, add 1.

For dates in the 22nd century, subtract 1.

If for some reason you’re reading this article in the 22nd century, make your own table of month constants by subtracting 1 from the numbers in the table above.

More details

When I was at University of Texas, humanities majors had to take a class we called “math for poets.” The class was a hodgepodge of various topics, and the most popular topic when I taught the class was this method. Here’s a handout I gave the class.

That handout is a little slower paced than this blog post, and goes into some explanation for why the method works. (The method was meant to motivate modular arithmetic, one of the topics on the syllabus, so explaining why the method works was secretly the whole point of teaching it.)

The handout is still valid. You can follow it verbatim, or you can replace the table of month constants with the one above and ignore the 21st century correction step.


John Conway and mental exercise rituals

Drawing of John Conway with horned sphere

John Horton Conway (1937–2020) came up with an algorithm in 1973 for mentally calculating what day of the week a date falls on. His method, which he called the “Doomsday rule,” starts from the observation that every year, the dates 4/4, 6/6, 8/8, 10/10, 12/12, 5/9, 9/5, 7/11, and 11/7 fall on the same day of the week [1], what Conway called the “doomsday” of that year. That’s Monday this year.

Once you know the doomsday for a year you can bootstrap your way to finding the day of the week for any date that year. Finding the doomsday is a little complicated.

Conway had his computer set up so that it would quiz him with random dates every time he logged in.

Mental exercises

Recently I’ve been thinking about mental exercise rituals, similar to Conway having his computer quiz him on dates. Some people play video games or solve Rubik’s cubes or recite poetry.

Curiously, what some people do for a mental warm-up others do for a mental cool-down, such as mentally reviewing something they’ve memorized as a way to fall asleep.

What are some mental exercise rituals that you’ve done or heard of other people doing? Please leave examples in the comments.

More on Conway

The drawing at the top of the page is a sketch of Conway by Simon Frazer. The strange thing coming out of Conway’s head in the sketch is Alexander’s horned sphere, a famous example from topology. Despite appearances, the boundary of Alexander’s horned sphere is topologically equivalent to a sphere.

Conway was a free-range mathematician, working in a wide variety of areas, ranging from the classification of finite simple groups to his popular Game of Life. Of the 26 sporadic groups, three are named after Conway.

Here are some more posts where I’ve written about John Conway or cited something he wrote.

[1] Because this list of dates is symmetric in month and day, the rule works on both sides of the Atlantic.