My interest is in the future because I am going to spend the rest of my time there.
Unknown
As the chart shows, between 1979 and 2000 the number of transistors per chip and processor performance increased steadily while transistor size shrank from 6 down to 0.18 microns. For comparison, a human hair is about 100 microns thick. It is worth stressing that the tendency illustrated above still continues.
Taking Computers for Granted
In 1965, three years before Intel was founded, Gordon Moore noticed that microchip capacity* seemed to double every 18 to 24 months. This rate of increase later became known as Moore's Law. We compiled a list of every major Intel processor family and recorded the corresponding number of transistors per chip. The resulting graph demonstrates that Moore's Law is alive and well. What is the future of Moore's Law?
Andy Grove, former Intel CEO**, predicted that Intel would ship a processor with one billion*** transistors in 2011, which is in line with Moore's Law. Other industry experts see silicon technology reaching its physical limits**** at around 2017. The implications of the continued viability of Moore's Law are profound. In addition to the fact that our increasingly computerized economy will become even more productive, other technologies such as voice recognition, virtual reality, and artificial intelligence begin to appear possible. And speaking of profound, if Moore's Law were somehow to survive into 2030, the processor would then surpass the computational power of the human brain.
Gordon Moore was one of the founders of Intel
* number of transistors on a chip
** Chief Executive Officer
*** a thousand millions - 1,000,000,000 (US usage, now often in Britain)
**** number of transistors per unit of chip surface area
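The doubling rule described above is easy to turn into arithmetic. The sketch below projects transistor counts under Moore's Law; the starting point (the Intel 4004 of 1971, with roughly 2,300 transistors) and the two-year doubling period are illustrative assumptions, not figures from the text.

```python
# A minimal sketch of Moore's Law as stated above: chip capacity
# doubles every 18-24 months. Base year, base count (Intel 4004,
# ~2,300 transistors in 1971) and a 2-year doubling period are
# assumptions chosen for illustration.

def transistors(year, base_year=1971, base_count=2300, doubling_years=2):
    """Projected transistors per chip under Moore's Law."""
    doublings = (year - base_year) / doubling_years
    return base_count * 2 ** doublings

for y in (1971, 1981, 1991, 2001, 2011):
    print(f"{y}: ~{transistors(y):,.0f} transistors")
```

Under these assumptions the projection for 2011 lands in the low billions, the same order of magnitude as Grove's one-billion-transistor prediction.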
This optimistic prediction could conclude our brief, though far from complete, history of computers. Yet the question of even the near future of these machines is not settled, since the scientific literature offers «slightly» different opinions. We cannot help sharing some extracts with you:
For more than forty years manufacturers have crammed more and more devices, particularly transistors, into microprocessors. The current process for packing them is a photography-like technique called DUVL (Deep-Ultraviolet Lithography). Every technology has its limits; according to estimates, DUVL will reach them around 2005. At that point Moore's Law will no longer hold, and engineers will have to look for other technologies to increase the density of devices on a chip. There are several possible ways to achieve this.
The first, traditional way is to improve DUVL towards higher resolution. This would be possible provided new high-resolution equipment were designed. The question is: what are the prospects, and how long will it work? In other words, is the game worth the candle? Some scientists who doubt this path of technological progress are looking for possible alternatives, such as biological (DNA) and quantum computers. DNA is an abbreviation for Deoxyribonucleic Acid, the component of every cellular organism that stores the basic genetic code. DNA computers will have many advantages over today's computers:
• they will be much smaller;
• they will store much more information;
• they will perform calculations simultaneously, allowing them to solve in hours complex mathematical problems that might take contemporary computers hundreds of years to complete. «The tear-drop sized DNA computer will be more powerful than the world's most powerful supercomputer».
Our computers work by manipulating «0»s and «1»s. Quantum computers will be able to encode information the same way, or in states in between «0» and «1». In such machines, atoms work together to serve as both computer memory and microprocessor. These computers will be millions of times more powerful than today's most powerful supercomputers! What is more, if this happened, people could wear them like a watch. Wearable computers will be built into people's clothing or jewellery. Voice and handwriting recognition software will make it possible to interact with such computers without a keyboard or mouse. It is quite interesting, isn't it?