Definition in maths

Canario

Hello,


I have a very general question. Is a definition something that we adopt because it is convenient, rather than something that can be deduced?


Thanks for answering.
 
Your question doesn't make sense. A "definition" simply means you are labeling a concept. That has nothing to do with being "deduced".
 
That is what I wanted to know. I was unsure whether definitions are made for convenience or whether they can be deduced. Thank you so much.
 
The short answer is yes, but I find it difficult to stop at the short answer most of the time, so maybe the following will be of interest.


Definitions are the starting point of a system of mathematics. A definition is a label we all agree on so that we can discuss the same thing. For example, one definition of a real number is a value that represents a quantity along a continuous line. Now that we know what a real number is, we can discuss the properties of the real numbers. Here we don't prove a real number, we define it.

The next step up from definitions is axioms (postulates). These are statements that we agree to accept without proof. Generally they appear obvious. An example is the completeness of the real numbers, which can be stated in several equivalent ways; one common form is the least-upper-bound property. Here we don't prove the completeness of the real numbers, we just accept it.
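
For reference, the least-upper-bound form can be written in LaTeX roughly as follows (which equivalent statement gets taken as the axiom varies from textbook to textbook):

\[
\text{If } S \subseteq \mathbb{R},\; S \neq \varnothing,\; \text{and } S \text{ is bounded above, then } \sup S \text{ exists in } \mathbb{R}.
\]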

The next step up from this is lemmas and theorems. These start out as conjectures, which must be proved. One of the interesting things in mathematics is that, depending on which axioms you use for your system of mathematics, what was an axiom in one system may become a lemma (or theorem) in a different but equivalent system.

Enough for now.
 
Hi Canario:

Your question makes sense to me, also. I'll add to Ishuda's comments.

In mathematics, definitions are not always set in stone. Definitions may change when moving from one area of mathematics to another.

Here are some examples.

In beginning algebra, mathematicians define signed numbers as positive or negative. Positive numbers lie to the right of zero, on the Real number line, and negative numbers lie to the left. We define zero as neither positive nor negative. Yet, other mathematicians define zero as a signed number, and they treat zero as either positive or negative depending on what's convenient. (If memory serves, treating zero as a negative number helps out in electrical engineering.)

In beginning algebra, mathematicians define the set of Natural numbers as the familiar numbers used for counting common objects: {1,2,3,4,5,...}. We then define the set of Whole numbers as {0,1,2,3,4,5,...}. In other words, the set of Whole numbers is what we get when we add zero to the set of Natural numbers. Yet other mathematicians count zero as a Natural number, so (for them) the definition of Natural number is different. (Some mathematicians deny the existence of the set of Whole numbers, even though that definition appears in a gazillion classrooms and textbooks.)
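
In set-builder notation (notation varies by author; the symbol W below is just shorthand for the Whole numbers described above, not a universal convention), the two conventions look like this:

\[
\mathbb{N} = \{1, 2, 3, \dots\}, \qquad W = \mathbb{N} \cup \{0\} = \{0, 1, 2, 3, \dots\},
\]

while authors who count zero as a Natural number simply take \(\mathbb{N} = \{0, 1, 2, 3, \dots\}\) and have no separate name for the Whole numbers.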

In intermediate algebra, the expression 0^0 is left undefined. This is because applying the properties of exponents to 0^0 leads to a contradiction: 0^0 = 0^(1-1), and 0^(1-1) may be rewritten as 0^1 times 0^(-1). However, the expression 0^(-1) represents 1/0, and this result (dividing by zero) demonstrates why 0^0 is undefined. Yet, in precalculus and calculus, the expression 0^0 is defined to mean 1, despite contradicting the properties of exponents. If we did not change the definition of 0^0 from undefined to 1, other stuff (e.g., binomial expansion, the derivative power rule) would break down at x=0.
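
As a small worked illustration of the binomial-expansion point (a sketch only; take n to be any positive integer), setting x = 0 in the expansion of (x+1)^n forces the value of 0^0:

\[
(x+1)^n = \sum_{k=0}^{n} \binom{n}{k} x^k
\quad\Longrightarrow\quad
1 = (0+1)^n = \sum_{k=0}^{n} \binom{n}{k}\, 0^k = \binom{n}{0}\, 0^0 = 0^0,
\]

since every term with k >= 1 contains a factor of zero. The expansion returns the correct value at x = 0 only if 0^0 is read as 1, and the same convention keeps the derivative power rule d/dx(x^1) = 1*x^0 equal to 1 at x = 0.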

I could go on, but we're on the Beginning Algebra board, so I'm trying to stick to introductory definitions. My point is this: Be prepared to be flexible!

Cheers :D
 
Thank you Quaid and Ishuda. Quaid, what you told me about 0^0 is very interesting.
 