If a² = b² + c², then how do I calculate a = b + c?

Indranil

Junior Member · Joined Feb 22, 2018 · Messages: 220

We all know a = √(b² + c²) = b + c, but I want to clarify the method of getting a = b + c.
I have done the following:
a² = b² + c², so a = √(b² + c²) = (b²)¹⁄² + (c²)¹⁄² = b^(2 · ¹⁄²) + c^(2 · ¹⁄²) = b + c. Am I correct? Please check my steps.
 
Please post the complete text of the problem.
If a² = b² + c², it does not follow that a = b + c. E.g., the sides of a right triangle satisfy a² = b² + c², but definitely not a = b + c.
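A quick numeric check makes the counterexample concrete (a Python sketch, just to illustrate the 3-4-5 right triangle mentioned above):

```python
# 3-4-5 right triangle: the squared relation holds, the linear one does not.
a, b, c = 5, 4, 3

print(a**2 == b**2 + c**2)  # True:  25 == 16 + 9
print(a == b + c)           # False: 5 != 7
```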
 
I have done the following:
a² = b² + c², so a = √(b² + c²) = (b²)¹⁄² + (c²)¹⁄² = b^(2 · ¹⁄²) + c^(2 · ¹⁄²) = b + c
Am I correct?
a² = b² + c², a = √b² + c²

HOW d'heck did you get that?

Simple example: a=5, b=4, c=3

a^2 = b^2 + c^2
25 = 16 + 9
25 = 25

a = √b² + c²   (as typed, the radical covers only b², so this reads a = b + c²)
5 = 4 + 3²
5 = 4 + 9
5 = 13

Suggestion: start checking your work using examples....

Anyhooooo: if a^2 = b^2 + c^2, then a = sqrt(b^2 + c^2).
Have you not been exposed to right triangles yet?
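That advice can be followed mechanically; here is a small Python sketch (the helper name `check` is just for illustration) that compares the correct value √(b² + c²) with the claimed value b + c:

```python
import math

def check(b, c):
    """Return (sqrt(b² + c²), b + c) so the two can be compared."""
    return math.sqrt(b**2 + c**2), b + c

# The two values disagree for every ordinary right triangle:
for b, c in [(4, 3), (12, 5), (1, 1)]:
    a, claimed = check(b, c)
    print(f"b={b}, c={c}: sqrt(b² + c²) = {a:.4f}, but b + c = {claimed}")
```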
 
We all know a = √(b² + c²) = b + c[?], but I want to clarify the method of getting a = b + c.
I have done the following:
a² = b² + c², so a = √(b² + c²) = (b²)¹⁄² + (c²)¹⁄² = b^(2 · ¹⁄²) + c^(2 · ¹⁄²) = b + c. Am I correct? Please check my steps.

This is the false idea I was correcting in the other thread. It is not true that \(\displaystyle \sqrt{a + b} = \sqrt{a} + \sqrt{b}\). Many beginners imagine it to be true, but it is false! They will never get anywhere in algebra until they learn that what feels right is not always right. You must only do what you have been taught is valid.

It is true that \(\displaystyle \sqrt{a \cdot b} = \sqrt{a} \cdot \sqrt{b}\); the root of a product is the product of the roots. But this is not true for addition.

Your claimed proof is based on a more general assumption: that the power of a sum is the sum of the powers. It is not true that \(\displaystyle (a + b)^n = a^n + b^n\), which you are assuming when you say "√(b² + c²) = (b²)¹⁄² + (c²)¹⁄²", since you are implying that "(b² + c²)¹⁄² = (b²)¹⁄² + (c²)¹⁄²".
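The contrast between the two rules is easy to demonstrate numerically (a Python sketch; the values 9 and 16 are arbitrary nonnegative choices):

```python
import math

x, y = 9.0, 16.0

# Root of a product IS the product of the roots (for nonnegative x, y):
print(math.sqrt(x * y), math.sqrt(x) * math.sqrt(y))   # 12.0 12.0

# Root of a sum is NOT the sum of the roots:
print(math.sqrt(x + y), math.sqrt(x) + math.sqrt(y))   # 5.0 7.0
```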
 
We all know a = √(b² + c²) = b + c, but I want to clarify the method of getting a = b + c.
I have done the following:
a² = b² + c², so a = √(b² + c²) = (b²)¹⁄² + (c²)¹⁄² = b^(2 · ¹⁄²) + c^(2 · ¹⁄²) = b + c. Am I correct? Please check my steps.
What puzzles me when students say this is: if it were true, why would we not simply state the Pythagorean theorem as a + b = c?
 
This is the false idea I was correcting in the other thread. It is not true that \(\displaystyle \sqrt{a + b} = \sqrt{a} + \sqrt{b}\). Many beginners imagine it to be true, but it is false! They will never get anywhere in algebra until they learn that what feels right is not always right. You must only do what you have been taught is valid.

It is true that \(\displaystyle \sqrt{a \cdot b} = \sqrt{a} \cdot \sqrt{b}\); the root of a product is the product of the roots. But this is not true for addition.

Your claimed proof is based on a more general assumption: that the power of a sum is the sum of the powers. It is not true that \(\displaystyle (a + b)^n = a^n + b^n\), which you are assuming when you say "√(b² + c²) = (b²)¹⁄² + (c²)¹⁄²", since you are implying that "(b² + c²)¹⁄² = (b²)¹⁄² + (c²)¹⁄²".
Do you mean that
1. a = √(b² + c²), i.e. a = (b² + c²)¹⁄², but not a = √b² + √c² or a = (b²)¹⁄² + (c²)¹⁄²
2. If a² = b² + c², then a ≠ b + c (not possible)
 
Do you mean that
1. a = √(b² + c²), i.e. a = (b² + c²)¹⁄², but not a = √b² + √c² or a = (b²)¹⁄² + (c²)¹⁄²
2. If a² = b² + c², then a ≠ b + c (not possible)

Correct.

If a = √(b² + c²), then you can't conclude that a = b + c.

In fact, the only way these can both be true is if b or c is zero.

As others have pointed out, you can easily determine this by trying examples. You don't need to guess at it.
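For instance, a brute-force Python sketch over small nonnegative values confirms that equality occurs exactly when b or c is zero:

```python
import math

# Among nonnegative b and c, sqrt(b² + c²) == b + c holds exactly when
# b == 0 or c == 0.
for b in range(4):
    for c in range(4):
        equal = math.isclose(math.sqrt(b**2 + c**2), b + c)
        assert equal == (b == 0 or c == 0)
print("equality holds exactly when b == 0 or c == 0")
```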
 
Do you mean that
1. a = √(b² + c²), i.e. a = (b² + c²)¹⁄², but not a = √b² + √c² or a = (b²)¹⁄² + (c²)¹⁄²
2. If a² = b² + c², then a ≠ b + c (not possible)
\(\displaystyle (a + b)^2 = (a + b)(a + b) = a(a + b) + b(a + b) \implies\)

\(\displaystyle (a + b)^2 = a^2 + ab + ba + b^2 = a^2 + ab + ab + b^2 = a^2 + 2ab + b^2.\)

Any questions so far?

\(\displaystyle a \ne 0 \text { and } b \ne 0 \implies ab \ne 0 \implies 2ab \ne 0 \implies\)

\(\displaystyle a^2 + 2ab + b^2 \ne a^2 + b^2 \implies (a + b)^2 \ne a^2 + b^2.\)

Still following?

\(\displaystyle \therefore a \ne 0 \text { and } b \ne 0 \implies \sqrt{(a + b)^2} \ne \sqrt{a^2 + b^2}\)

\(\displaystyle \text {THUS, } a + b \ne \sqrt{a^2 + b^2} \text { if } a \ne 0 \text { and } b \ne 0.\)
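The expansion step above can also be checked numerically (a Python sketch with arbitrary sample values):

```python
# (a + b)² expands to a² + 2ab + b², which differs from a² + b² by 2ab.
for a, b in [(1, 2), (3, -4), (0.5, 0.5), (0, 5)]:
    assert (a + b)**2 == a**2 + 2*a*b + b**2
    # Equality with a² + b² occurs only when the cross term 2ab vanishes:
    assert ((a + b)**2 == a**2 + b**2) == (2*a*b == 0)
print("(a + b)² == a² + b² only when 2ab == 0")
```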

EDIT: To expand a bit on what Dr. Peterson wrote.
You can prove, for any pair of real numbers, the truth of:

\(\displaystyle a = 0 \text { or } b = 0 \implies \sqrt{a^2 + b^2} = |a| + |b|.\)

It is false, however, that for all real numbers a and b,

\(\displaystyle a = 0 \text { or } b = 0 \implies \sqrt{a^2 + b^2} = a + b.\)
 