
# Notation and communication

There are many major mathematical discoveries, but only those which can be understood by others lead to progress. The ease with which mathematical concepts can be used and understood, in turn, depends on their notation.

For example, work with numbers is clearly hindered by poor notation. Try multiplying two numbers together in Roman numerals: what is MLXXXIV times MMLXIX? Addition, of course, is a different matter, and here Roman numerals come into their own; merchants, who did most of their arithmetic by adding figures, were reluctant to give up using Roman numerals.
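The difficulty can be made concrete with a short sketch: in practice, multiplying Roman numerals means converting to place-value integers, multiplying, and converting back. The function and the numerals below are my illustration, not from the text (MMLXIX, i.e. 2069, stands in as a well-formed example).

```python
def roman_to_int(s: str) -> int:
    """Convert a Roman numeral to an integer using the subtractive rule."""
    values = {"I": 1, "V": 5, "X": 10, "L": 50, "C": 100, "D": 500, "M": 1000}
    total = 0
    for i, ch in enumerate(s):
        v = values[ch]
        # A smaller value written before a larger one (e.g. IV) is subtracted.
        if i + 1 < len(s) and values[s[i + 1]] > v:
            total -= v
        else:
            total += v
    return total

a = roman_to_int("MLXXXIV")  # 1084
b = roman_to_int("MMLXIX")   # 2069
print(a * b)                 # the product is easy only after conversion
```

Nothing in the numerals themselves helps with the multiplication; the whole computation happens in the positional system, which is precisely the point about notation.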

What are other examples of notational problems? The best known is probably the notation for the calculus used by Leibniz and Newton. Leibniz's notation led more easily to extending the ideas of the calculus, while Newton's notation, although well suited to describing velocity and acceleration, had much less potential when functions of two variables were considered. British mathematicians, who patriotically used Newton's notation, put themselves at a disadvantage compared with the continental mathematicians who followed Leibniz.
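To make the contrast concrete (the formulas below are my illustration, not from the text): Leibniz wrote the derivative as a ratio of differentials, which extends naturally to partial derivatives of a function of two variables, while Newton's dot marks only differentiation with respect to time.

```latex
% Leibniz: derivative as a ratio of differentials
\frac{dy}{dx}, \qquad \frac{d^2 y}{dx^2}
% which extends naturally to a function z(x, y) of two variables:
\frac{\partial z}{\partial x}, \qquad \frac{\partial z}{\partial y}
% Newton: dots over the fluent, implicitly with respect to time t
\dot{y} = \frac{dy}{dt}, \qquad \ddot{y} = \frac{d^2 y}{dt^2}
```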

Let us think for a moment how dependent we all are on mathematical notation and convention. Ask any mathematician to solve ax = b and you will be given the answer x = b/a. I would be very surprised if you were given the answer a = b/x, but why not? We are, often without realising it, using a convention that letters near the end of the alphabet represent unknowns while those near the beginning represent known quantities.
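The convention can be spelled out in a few lines; the numeric values here are illustrative only, using exact rational arithmetic so that the check ax = b is precise.

```python
from fractions import Fraction

# Known quantities: letters from the start of the alphabet (a, b).
a = Fraction(3)
b = Fraction(12)

# Unknown: a letter from the end of the alphabet. Solving ax = b for x
# gives the answer every mathematician offers, x = b/a.
x = b / a
assert a * x == b  # the solution indeed satisfies ax = b

# Symmetrically one could just as well solve for a given x (a = b/x);
# only convention makes that reading feel surprising.
print(x)  # 4
```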

It was not always like this: Harriot used a as his unknown as did others at this time. The convention we use (letters near the end of the alphabet representing unknowns) was introduced by Descartes in 1637. Other conventions have fallen out of favour, such as that due to Viète who used vowels for unknowns and consonants for knowns.

Of course ax = b contains other conventions of notation which we use without noticing them. For example, the sign "=" was introduced by Recorde in 1557. Also, ax is used to denote the product of a and x; this is the most efficient notation of all, since nothing has to be written!

# Brilliant discoveries?

It is quite hard to appreciate the brilliance of major mathematical discoveries. They often appear as isolated flashes of brilliance, although in fact they are the culmination of work by many, often less able, mathematicians over a long period.

For example, the controversy over whether Newton or Leibniz discovered the calculus first can easily be answered: neither did, since Newton certainly learnt the calculus from his teacher Barrow. Of course I'm not suggesting that Barrow should receive the credit for discovering the calculus; I'm merely pointing out that the calculus comes out of a long period of progress starting with Greek mathematics.

Now we are in danger of reducing major mathematical discoveries to no more than the luck of who was working on a topic at "the right time". This too would be completely unfair (although it does go some way towards explaining why two or more people often discovered something independently around the same time). There is still the flash of genius in the discoveries, often coming from a deeper understanding or from seeing the importance of certain ideas more clearly.
