GPT-4 Multiplication Competition



I predict this will pass (haven't had a chance to test on GPT-4, but the results using GPT-3 were promising):

```
First, quickly determine the order of magnitude of a product. Then, show a heuristic estimating a product by rounding to the nearest thousand. Then, use another heuristic to quickly determine the final three digits of the product. Next, show an example of a math PhD student explaining how to multiply using the Karatsuba algorithm. The student understands the algorithm extremely well, and carefully writes out each individual step on a separate line, simplifying products of sums using FOIL before doing the arithmetic step-by-step. Then, show an example of the feedback of a meticulous math professor checking each of the student's intermediate results using a calculator, explaining how to correct any mistakes very explicitly. Compare the results of the professor's and the student's calculations. If the professor notices an error, the professor's result MUST be different from the student's. All examples use 152469 * 793432.
```

Seems to get it right:

First, we'll quickly determine the order of magnitude of the product:

The order of magnitude can be found by counting the number of digits in each number and adding them together. In this case, 152469 has 6 digits and 793432 has 6 digits as well. Therefore, the product will have around 6+6=12 digits.

Next, we'll use a heuristic to estimate the product by rounding to the nearest thousand:

152469 ≈ 152,000
793432 ≈ 793,000

Estimated product: 152,000 * 793,000 = 120,536,000,000

Now, we'll use another heuristic to quickly determine the final three digits of the product:

For this, we can just multiply the last three digits of each number and ignore the rest:

469 * 432 = 202,608

The final three digits of the product are 608.
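(As a sanity check on that quoted heuristic: the last three digits of a product depend only on the last three digits of each factor, i.e. on arithmetic mod 1000. A quick Python sketch, purely for verification and not part of any prompt:)

```python
# The last three digits of a product depend only on the last three
# digits of each factor (multiplication mod 1000).
a, b = 152469, 793432
assert (a * b) % 1000 == ((a % 1000) * (b % 1000)) % 1000
print((469 * 432) % 1000)  # prints 608, the final three digits
```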

Now, an example of a math PhD student explaining how to multiply using the Karatsuba algorithm:

Step 1: Break the numbers into two halves:
152469 = 152 * 10^3 + 469
793432 = 793 * 10^3 + 432

Step 2: Compute the products of the two halves:
a = 152 * 793 = 120,536
b = 469 * 432 = 202,608

Step 3: Compute the product of the sums of the two halves:
(152+469) * (793+432) = 621 * 1,225 = 760,725

Step 4: Compute the final product using the Karatsuba formula:
product = a * 10^6 + (760,725 - 120,536 - 202,608) * 10^3 + 202,608
= 120,536,000,000 + 437,581,000 + 202,608

The product is 120,973,783,608.

Now, an example of the feedback of a meticulous math professor checking each of the student's intermediate results using a calculator:

"Good job on understanding the Karatsuba algorithm. I have checked your intermediate results and found them to be accurate. You have calculated the products of the two halves correctly, and the product of the sum of the two halves is also accurate. Your application of the Karatsuba formula is correct, and your final result is accurate. Well done!"

Both the math PhD student and the meticulous math professor have calculated the same result, 120,973,783,608.
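(For anyone who wants to check the transcript offline: the student's split-at-10^3 Karatsuba step is easy to replay in a few lines of Python. This is a verification sketch, not part of any prompt.)

```python
# Replay one level of Karatsuba, splitting each factor at 10^3,
# mirroring the student's steps in the transcript above.
x, y = 152469, 793432
x_hi, x_lo = divmod(x, 1000)                  # 152, 469
y_hi, y_lo = divmod(y, 1000)                  # 793, 432
a = x_hi * y_hi                               # 120,536
b = x_lo * y_lo                               # 202,608
mid = (x_hi + x_lo) * (y_hi + y_lo) - a - b   # 437,581 (one fewer multiplication)
product = a * 10**6 + mid * 10**3 + b
assert product == x * y == 120973783608
```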

Haha yes!

It didn't have any trouble with the individual steps in your example, so assuming that's typical, I think it's fairly likely that this alone will work:

Show an example of a math PhD student explaining how to multiply using the Karatsuba algorithm. The student understands the algorithm extremely well, and carefully writes out each individual step on a separate line. The example uses 152469 * 793432.

Having GPT-3/4 multiply numbers is a bit like eating soup with a fork. You can do it, and the larger you make the fork, the more soup you'll get, but it's not designed for the job and it's hugely impractical. GPT-4 does not have an internal algorithm for multiplication, because the training objective (text completion) does not incentivize developing one. No iteration of GPT (5, 6, 7) will ever be a 100% accurate calculator (unless the paradigm moves away from LLM+RLHF); it will just asymptotically approach 100% accuracy. Why don't we just make a spoon?

Agreed. However, humans also don't have an internal multiplication algorithm, but can nonetheless use a scratchpad to multiply accurately (in extreme circumstances :P). I've chosen multiplication as an example here because it's maybe the "simplest" thing GPT-4 can't consistently do.

What I'm finding interesting here is that GPT-4 knows how to break down multiplications (it can write perfect recursive code for multiplying large numbers). It also knows about chain of thought prompting. How close is it to being able to just... directly use the algorithms it knows? What's the minimum amount of prodding needed? The following prompt is an upper bound on how hard it is to teach GPT-4 accurate multiplication:

```
To multiply large numbers accurately, it helps to break down the problem. We will demonstrate multiplying 191241 by 741122.
Set a = 19, b = 12, c = 41, d = 74, e = 11, and f = 22. We can then write
191241 * 741122 = (10^4 a + 10^2 b + c) * (10^4 d + 10^2 e + f).
Expanding this out, we obtain
10^8 ad + 10^6 (ae + bd) + 10^4 (af + be + cd) + 10^2 (bf + ce) + cf.
We can now calculate all these products:
ad = 19 * 74 = 1406
ae = 19 * 11 = 209
bd = 12 * 74 = 888
af = 19 * 22 = 418
be = 12 * 11 = 132
cd = 41 * 74 = 3034
bf = 12 * 22 = 264
ce = 41 * 11 = 451
cf = 41 * 22 = 902
Now we can calculate the sums inside parentheses:
ae + bd = 209 + 888
= 1097
af + be + cd = 418 + 132 + 3034
= 550 + 3034
= 3584
bf + ce = 264 + 451
= 715
We can now factor to avoid having so many zeroes:
10^8 ad + 10^6 (ae + bd) + 10^4 (af + be + cd) + 10^2 (bf + ce) + cf
10^2 (10^2 (10^2 (10^2 ad + (ae + bd)) + (af + be + cd)) + (bf + ce)) + cf
Now we substitute in:
10^2 (10^2 (10^2 (10^2 * 1406 + 1097) + 3584) + 715) + 902
= 10^2 (10^2 (10^2 (140600 + 1097) + 3584) + 715) + 902
= 10^2 (10^2 (10^2 (141697) + 3584) + 715) + 902
= 10^2 (10^2 (14169700 + 3584) + 715) + 902
= 10^2 (10^2 (14173284) + 715) + 902
= 10^2 (1417328400 + 715) + 902
= 10^2 (1417329115) + 902
= 141732911500 + 902
= 141732912402
The final answer is 141732912402.
Use the exact same method to multiply 183112 * 994531.
```

(It correctly follows the procedure to obtain `182110560472`.) Is there a shorter prompt that accomplishes the same thing?
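(The decomposition the prompt walks through can also be checked mechanically. Here is a Python sketch of the same base-100 limb expansion with the Horner-style factoring; `limb_multiply` is just a name I made up for the verification helper.)

```python
def limb_multiply(x, y):
    """Multiply two 6-digit numbers via base-100 limbs, as in the prompt."""
    a, b, c = x // 10**4, (x // 10**2) % 100, x % 100
    d, e, f = y // 10**4, (y // 10**2) % 100, y % 100
    # Coefficients of 10^8, 10^6, 10^4, 10^2, 10^0 after expanding the product
    coeffs = [a*d, a*e + b*d, a*f + b*e + c*d, b*f + c*e, c*f]
    # Horner-style factoring: 10^2 (10^2 (10^2 (10^2 ad + ...) + ...) + ...) + cf
    total = 0
    for coef in coeffs:
        total = total * 100 + coef
    return total

assert limb_multiply(191241, 741122) == 141732912402
assert limb_multiply(183112, 994531) == 182110560472
```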

I couldn't get yours to ever work with Bing Chat but I did eventually find something that did work for me (most of the time) and is about half the characters of yours without any real code golfing. My prompt was the following:

```
We can multiply 191241 by 741122 step-by-step:
Express 191241 * 741122 as
(19e4 + 12e2 + 41) * (74e4 + 11e2 + 22)
Distribute product:
19e4 * 74e4 = 1406e8
19e4 * 11e2 = 209e6
19e4 * 22 = 418e4
12e2 * 74e4 = 888e6
12e2 * 11e2 = 132e4
12e2 * 22 = 264e2
41 * 74e4 = 3034e4
41 * 11e2 = 451e2
41 * 22 = 902
Gather terms:
1406e8
+ (0209 + 0888)e6
+ (0418 + 0132 + 3034)e4
+ (0264 + 0451)e2
+ 0902
Add terms in parentheses:
1406e8 + (1097)e6 + (3584)e4 + (0715)e2 + 0902
Expand zeros:
140600000000 + 1097000000 + 35840000 + 71500 + 902
Cumulative sum left to right:
140600000000
141697000000
141732840000
141732911500
141732912402
Result:
141732912402
Use this method to multiply 183112 by 994531.
```
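(The "Expand zeros" and "Cumulative sum" steps of this prompt are easy to verify offline; a small Python sketch, not part of the prompt:)

```python
# Verify the "Expand zeros" and "Cumulative sum" steps of the prompt above.
terms = [1406 * 10**8, 1097 * 10**6, 3584 * 10**4, 715 * 10**2, 902]
running = 0
for term in terms:
    running += term
    print(running)  # 140600000000, 141697000000, 141732840000, 141732911500, 141732912402
assert running == 191241 * 741122 == 141732912402
```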

GPT-4 perfectly understands e.g. Karatsuba multiplication and can consistently multiply two-digit numbers, but hasn’t connected those dots; it needs some hand-holding to correctly use a multiplication algorithm. What’s the shortest prompt that gets GPT-4 to multiply 6-digit integers with over 75% accuracy?
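(To make "over 75% accuracy" concrete, one could score candidate prompts with a small harness like the following sketch; `multiply_via_model` is a hypothetical stand-in for however you query GPT-4 and parse its answer back to an integer.)

```python
import random

def accuracy(multiply_via_model, trials=100, seed=0):
    """Estimate accuracy of a model-backed multiply function on random
    6-digit pairs. `multiply_via_model` is a stand-in for your GPT-4 call."""
    rng = random.Random(seed)
    correct = 0
    for _ in range(trials):
        x = rng.randrange(100000, 1000000)
        y = rng.randrange(100000, 1000000)
        if multiply_via_model(x, y) == x * y:
            correct += 1
    return correct / trials

# Sanity check with exact arithmetic standing in for the model:
assert accuracy(lambda x, y: x * y, trials=20) == 1.0
```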