As a computer programmer, sometimes I have to deal with data-size limits when working with numbers, and I have to be careful of "OVERFLOW ERRORS" where my data is too large to process. It sounds a bit complex, but it's simple enough to understand with a few examples.

FOR EXAMPLE: I have a variable which is defined in my program as being an UNSIGNED BYTE. In computers, this is 8 bits of binary data (0s and 1s), and represents the range from 0 to 255. For those who don't understand computers at all, let's just say I have an egg-crate which can hold from 0 to 255 eggs in it...
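For the curious, the "8 bits = 0 to 255" claim can be checked directly in C (a minimal sketch; `uint8_t` is the standard fixed-width unsigned byte type, and the constant name is mine):

```c
#include <limits.h>
#include <stdint.h>

/* An unsigned byte is 8 bits of binary data, so it can represent
   2^8 = 256 distinct values: 0 through 255.  UINT8_MAX is the
   "full egg crate" -- the largest value the type can hold. */
enum { BITS_PER_UNSIGNED_BYTE = sizeof(uint8_t) * CHAR_BIT };  /* 8 */
```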

Now, in one crate, I have 200 eggs. In the second crate, I have 150 eggs. (UNSIGNED BYTE variables, in my programs, but visualize it as egg crates, if you want.)

Now, to find the average number of eggs in the crates, we simply add the numbers together and divide by 2.

The problem with computers is that sometimes we solve things step by step. First, we add the numbers together and store them in a new crate (variable): Crate3count = Crate1count + Crate2count. Then we divide by 2: Answer = Crate3count / 2

Where the problem arises is when we try to put 350 eggs into Crate3count, which only holds 255 eggs... It generates an "OVERFLOW ERROR" and causes our program to crash. We're putting more eggs into that third crate than it was built to hold....
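Here's a minimal C sketch of exactly that failure (the function name is mine, not anything standard). One caveat: whether you get a crash or just a silently wrong answer depends on the language; in C, storing 350 into an unsigned byte quietly wraps around modulo 256 instead of raising an overflow error:

```c
#include <stdint.h>

/* Step-by-step average, exactly as described above:
   first add, then divide.  The intermediate sum is the problem. */
uint8_t naive_average(uint8_t a, uint8_t b) {
    uint8_t sum = a + b;   /* 200 + 150 = 350 wraps to 350 - 256 = 94 */
    return sum / 2;        /* 94 / 2 = 47, nowhere near the true 175 */
}
```

With a = 200 and b = 150 this returns 47 instead of 175 -- the "crate" silently spilled.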

So, here's a solution I discovered which works without overflowing the bounds. The problem is, I can't prove that it works 100% of the time.

Instead of (A + B) / 2 to get the solution, I've found that A + (B - A) / 2 will give the proper results, without causing overflow issues with the variables. My question is: Can someone show proof that these two statements will always be functionally equivalent?
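Here is that trick as a C sketch (function name mine). I've used a common variant that orders the two operands first, so the smaller is always subtracted from the larger; that way the difference stays nonnegative and every intermediate value fits in 0..255:

```c
#include <stdint.h>

/* Overflow-safe average: a sketch of the A + (B - A) / 2 idea.
   Ordering the operands first keeps the difference nonnegative,
   so no intermediate result can exceed 255. */
uint8_t safe_average(uint8_t a, uint8_t b) {
    uint8_t lo = (a < b) ? a : b;
    uint8_t hi = (a < b) ? b : a;
    return lo + (hi - lo) / 2;   /* hi - lo is at most 255 */
}
```

With a = 200 and b = 150 this gives 150 + 50 / 2 = 175, the correct average, and the intermediate sum 350 never appears anywhere.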

In this case, let's plug in the information from our dataset:

(A + B) / 2

(200 + 150) / 2

350 / 2

175

OR

A + (B - A) / 2

200 + (150 - 200) / 2

200 + (-50) / 2

200 + (-25)

175

This seems like a simple enough algebra problem to me, but its solution eludes me every time I try to prove it. Can someone show me proof that (A + B) / 2 = A + (B - A) / 2?

Experience shows me that it works. My poor brain, however, just can't seem to figure out WHY it works. (And if this isn't a problem for Advanced Algebra people, feel free to move it to wherever it belongs. I simply posted here because it seemed like advanced algebra to me.)
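"Experience shows it works" can at least be upgraded to an exhaustive check: bytes only have 256 possible values, so a sketch like this (function name mine) can compare both formulas for every pair, computing the reference average in a plain int so nothing can overflow. I've restricted it to pairs with a <= b, because C's truncating division can make the two sides differ by 1 when b - a is negative and odd:

```c
/* Compare (a + b) / 2 with a + (b - a) / 2 for every pair of byte
   values with a <= b, doing all arithmetic in int so nothing wraps.
   Returns 1 if the two formulas agree on every such pair, 0 otherwise. */
int formulas_agree_for_all_bytes(void) {
    for (int a = 0; a <= 255; a++) {
        for (int b = a; b <= 255; b++) {
            if ((a + b) / 2 != a + (b - a) / 2)
                return 0;   /* found a mismatch */
        }
    }
    return 1;
}
```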

Thanks in advance, and kindly don't laugh at me if this is something so simple it can be shown in two obvious steps and I'm just obliviously overlooking the solution.
