Numbers

 
     
Numbers are probably the most fundamental of all the objects of mathematical study, yet they remain remarkably little understood. The idea of a number was among the first concepts abstracted from the real-world activity of counting. Although a number seems an obvious concept, it has proved remarkably difficult (if not impossible) to define exactly what one is, something which perhaps underlines the fundamental nature of the idea. At its most basic, a number is something applied to a set of objects which are being counted. This is not a definition, and it leaves several points ambiguous. On this view, is zero a number? (It was not recognized as necessary, in the West, until the late Middle Ages.) Some schools among the ancient Greeks even denied that one was a number.

It soon becomes clear, even without an understanding of what a number is, that the numbers used for counting are not enough for all the purposes to which they are put. In the counting numbers (technically known as the whole numbers, though mathematicians still disagree about whether zero is a whole number), there are subtractions of the form a - b which have no solution: which whole number is 2 - 4, for example? To get around this, the whole numbers were supplemented by the integers, which include all the ‘owing’ numbers of this form (that is, the negative numbers). This turned out to be merely the first step in a process of expansion which led in turn to the rational numbers, algebraic numbers, real numbers and complex numbers.
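
To make that first expansion step concrete, here is one standard construction (a sketch in modern notation, not something spelled out in the article itself): an integer can be defined as an equivalence class of pairs of whole numbers, the pair (a, b) standing for the difference a - b.

\[
(a,b) \sim (c,d) \iff a + d = b + c, \qquad [(a,b)] + [(c,d)] = [(a+c,\ b+d)]
\]

Under this definition the problematic 2 - 4 becomes the class [(2, 4)] = [(0, 2)], that is, the integer -2, and every subtraction of whole numbers now has an answer.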

Towards the end of the 19th century, mathematicians seeking to bring more rigour into mathematics in general became concerned that there was no rigorous definition of number. The problem particularly exercised Bertrand Russell (1872 - 1970). In his conception of mathematics as a branch of logic, he wanted to characterize numbers as sets, as the basis for the Principia Mathematica he was writing with Alfred North Whitehead (1861 - 1947). It is possible to express the idea ‘the set y has exactly n elements’ as a sentence of symbolic logic; Russell simply defined the number n to be the class of all sets with exactly n elements (this is not a circular definition, because the logical sentence does not actually use the number n). Russell's definition, while obviously related to the intuitive basis of the concept of number, suffers from several drawbacks which eventually proved fatal. Each number is a class, too big actually to be a set itself. A set of numbers is therefore a set of classes, which violates Russell's own theory of types: a class is a more complex notion than a set, so classes can contain sets but sets should not contain classes. The definition also makes the operations of arithmetic very difficult to define, since the relationship between the sentences for two different numbers is very complex. This meant, for example, that the proof of 1 + 1 = 2 given in the Principia takes hundreds of pages, hardly a desirable state of affairs.
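
For example (an illustrative formula in modern notation, not one quoted from the Principia), the idea ‘the set y has exactly 2 elements’ can be written without mentioning the number 2 at all:

\[
\exists a\, \exists b\, \bigl( a \neq b \;\wedge\; \forall x\, ( x \in y \leftrightarrow ( x = a \vee x = b ) ) \bigr)
\]

On Russell's definition, the number 2 is then simply the class of all sets y satisfying this sentence.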

Russell's definition was finally dropped in favour of a far less concrete way of looking at numbers. Giuseppe Peano (1858 - 1932), another member of the logicist school, came up with an axiomatization of arithmetic. It takes the notions of 0 and successor (the operation of going from one number to the next) as basic to the concept of number. His axioms mean that in any system in which some point can be designated as 0 and a notion of succession can be defined satisfying the rules he laid down (the most important of which is induction), it is possible to define the operations of addition and multiplication in a unique way, and they then behave exactly as you would expect. Peano's approach considerably simplifies proofs of the fundamental results of arithmetic in any such system. In effect, his axioms divorce the mathematical concept of number from the intuitive one, since all such systems can equally well be taken actually to be the numbers. (In modern set theory, it is usual to single out one particularly simple such system, in which 0 is taken to be the empty set and the successor of n is the set containing all the members of n together with n itself as a member.) SMcL
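
To make that parenthetical remark concrete, here is a minimal sketch in Python (purely illustrative; the names zero, successor, predecessor, add and mul are ours, not part of any library) of the set-theoretic system just described, with addition and multiplication defined by Peano-style recursion on the second argument.

# 0 is the empty set; the successor of n is n together with {n}.
# Python frozensets stand in for pure sets.
def zero():
    return frozenset()                    # 0 = {}

def successor(n):
    return n | frozenset([n])             # succ(n) = n together with {n}

def predecessor(n):
    # In succ(k) = k plus {k}, the member k is the one with the most elements.
    return max(n, key=len)

def add(m, n):
    # m + 0 = m;  m + succ(k) = succ(m + k)
    return m if n == zero() else successor(add(m, predecessor(n)))

def mul(m, n):
    # m * 0 = 0;  m * succ(k) = (m * k) + m
    return zero() if n == zero() else add(mul(m, predecessor(n)), m)

one = successor(zero())
two = successor(one)
print(add(one, one) == two)               # True: a one-line set-theoretic 1 + 1 = 2

In this representation each numeral n literally has n members, so Python's len() converts it back to an ordinary integer, and the equality test above is simply equality of sets.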