In 1965 semiconductor pioneer Gordon Moore predicted that the number of transistors contained on a computer chip would double every year. This prediction is now known as Moore's Law, and it has proven broadly accurate: the number of transistors and the computational speed of microprocessors currently double approximately every 18 months. Components continue to shrink in size while becoming faster, cheaper, and more versatile.
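The growth rate described above is simple compound doubling, which can be sketched in a few lines of Python. The starting transistor count and time span below are illustrative, not taken from any particular chip.

```python
def transistor_count(initial: int, months: int, doubling_period: float = 18.0) -> int:
    """Project a transistor count forward, doubling every `doubling_period` months."""
    return round(initial * 2 ** (months / doubling_period))

# Over one decade (120 months), an 18-month doubling period implies
# 2**(120/18), or roughly 100-fold, growth.
print(transistor_count(1_000_000, 120))  # roughly 100 million
```

Shortening the doubling period from Moore's original 12 months to 18 months only changes the exponent; the growth remains exponential either way.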
With their increasing power and versatility, computers simplify day-to-day life. Unfortunately, as computer use becomes more widespread, so do the opportunities for misuse. Computer hackers, people who illegally gain access to computer systems, often violate privacy and can tamper with or destroy records. Programs called viruses or worms can replicate and spread from computer to computer, erasing information or causing malfunctions. Other individuals have used computers to electronically embezzle funds and alter credit histories (see Computer Security). New ethical issues also have arisen, such as how to regulate material on the Internet and the World Wide Web. Long-standing issues, such as privacy and freedom of expression, are being reexamined in light of the digital revolution. Individuals, companies, and governments are working to solve these problems through informed conversation, compromise, better computer security, and regulatory legislation.
Computers will continue to become more advanced, and they will also become easier to use. Improved speech recognition will make operating a computer simpler. Virtual reality, the technology of interacting with a computer using all of the human senses, will also contribute to better interfaces between humans and computers. Standards for virtual-reality programming languages, such as the Virtual Reality Modeling Language (VRML), are currently in use or under development for the World Wide Web.
Other exotic models of computation are also being developed, including biological computing, which uses living organisms; molecular computing, which uses molecules with particular properties; and DNA computing, which uses deoxyribonucleic acid (DNA), the basic unit of heredity, to store data and carry out operations. These possible future computational platforms are, so far, limited in their abilities or strictly theoretical. Scientists investigate them because of the physical limits of miniaturizing circuits embedded in silicon, as well as the limits imposed by the heat that even the tiniest transistors generate.
Intriguing breakthroughs occurred in the area of quantum computing in the late 1990s. Quantum computers under development use components of a chloroform molecule (a combination of carbon, hydrogen, and chlorine atoms) and a variation of the nuclear magnetic resonance technique that underlies the medical procedure called magnetic resonance imaging (MRI) to compute at a molecular level. Scientists use a branch of physics called quantum mechanics, which describes the behavior of subatomic particles (the particles that make up atoms), as the basis for quantum computing. Quantum computers may one day be thousands to millions of times faster than current computers for certain problems, because they take advantage of the laws that govern the behavior of subatomic particles. These laws allow a quantum computer, in effect, to explore many possible answers to a query at once. Future uses of quantum computers could include code breaking and searching large databases. Theorists in chemistry, computer science, mathematics, and physics are now working to determine the possibilities and limitations of quantum computing.
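The sense in which a quantum register "holds" many answers at once can be illustrated with a toy state-vector simulation: n qubits are described by 2**n complex amplitudes, and applying a Hadamard gate to every qubit of the all-zero state produces an equal superposition over all 2**n basis states. This is an educational sketch, not an efficient simulator, and the function name is purely illustrative.

```python
import math

def hadamard_all(n: int) -> list[float]:
    """Return the state vector after applying a Hadamard gate to each of n |0> qubits."""
    amp = 1.0 / math.sqrt(2 ** n)   # equal amplitude for every basis state
    return [amp] * (2 ** n)

state = hadamard_all(3)              # 3 qubits -> 8 amplitudes, one per basis state
probs = [a * a for a in state]       # squared amplitudes are measurement probabilities
print(len(state), round(sum(probs), 6))
```

Note the catch that the text's "examine all answers" phrasing glosses over: measuring this register collapses it to a single random basis state, so practical quantum algorithms must arrange for the amplitudes to interfere so that correct answers become likely.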
Communications between computer users and networks will benefit from new technologies such as broadband communication systems, which can carry significantly more data, faster and more conveniently, to and from the vast interconnected databases that continue to grow in number and type.
3.2.6. Computer telephony
Computer telephony is the technique of coordinating the actions of telephone and computer systems. The technology has existed in commercial form since the mid-1980s, but it was exploited only in a few niche markets, particularly in large call centers, where call volumes easily justified the cost of complex custom-built systems. In the 1990s, however, several factors have combined to significantly simplify computer-telephone systems and to increase the marketplace's interest in computer telephony. International standards for interconnecting telephone and computer systems have been defined, notably the Computer-Supported Telephony Application (CSTA) call-modeling and protocol standards from ECMA. Mass-market application programming interface (API) specifications have been heavily promoted by major market players such as Microsoft and Novell, and are gaining rapid acceptance. Voice-processing technologies have advanced steadily, providing advanced features and high port densities at attractive prices. Public networks are offering more and more services that enable computer-telephone applications, such as Calling Line ID. Most important, the world economy is doing business over the telephone at an increasing rate, prompting business organizations to look for ways to make this process more efficient and economical.
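The kind of call modeling that CSTA standardizes can be sketched as an event-driven state machine: the telephone switch reports call-state events, and the computer application reacts to them. The class, method, and state names below are illustrative inventions for this sketch, not identifiers from the actual CSTA specification.

```python
from enum import Enum, auto

class CallState(Enum):
    IDLE = auto()
    OFFERED = auto()      # call presented to a device, not yet answered
    CONNECTED = auto()
    CLEARED = auto()

class Call:
    """Tracks one call as the switch reports events to the application."""

    def __init__(self, calling_line_id: str):
        # e.g. supplied by the network's Calling Line ID service
        self.calling_line_id = calling_line_id
        self.state = CallState.IDLE

    def offer(self) -> None:
        self.state = CallState.OFFERED

    def answer(self) -> None:
        if self.state is CallState.OFFERED:
            self.state = CallState.CONNECTED

    def clear(self) -> None:
        self.state = CallState.CLEARED

# A call-center application might use the Calling Line ID to pop up the
# caller's record on the agent's screen before the call is answered.
call = Call("+1-555-0100")
call.offer()
call.answer()
print(call.state.name)   # CONNECTED
```

The value of a shared call model is that the application logic above stays the same regardless of which vendor's switch generates the events.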
Public and private telephone systems provide real-time information paths between two or more parties. Traditionally, these information paths have taken the form of voice connections, originally through hardwired analog circuitry but later through an increasingly broad range of technologies such as radio transmission, digital signal encoding, and fiber. Over time, these transmission paths were also exploited for non-voice applications such as facsimile and data transmission.
At first, each non-voice application required a distinct set of dedicated "terminal equipment", the telephony term for any user device connected to the telephone network. Facsimile machines conversed only with other facsimile machines, computer devices sent data files only to other computer devices, and so forth. But in the 1990s, these disparate sets of equipment have begun to overlap, and the general-purpose computer has emerged as the point of intersection.
Computers can now send and receive every kind of information that passes through the telephone network: They can act as facsimile machines; they can interact with human speakers through voice synthesis and recognition; and of course they can send and receive data in many formats. It is this intersection, with the general-purpose computer serving as the interface point, that makes computer telephony so intriguing and potentially valuable to the marketplace.
Computer telephony today is characterized by a wealth of choices and options. Computer-telephone applications can be small or large in scope, simple or complex in operation. Any single feature can be implemented in a staggering number of ways, with implementation choices on both the telephone and computing sides of the equation. There is no right or wrong way to build a computer-telephone system.
With all of these choices, it is essential that the systems practitioner become knowledgeable about both computing and telephony, and begin to learn ways in which these two systems environments can be linked together.
Competence in both disciplines will become essential as the two technologies become even more closely integrated. The current focus on linkage between discrete telephone and computing systems is just a transition phase. Very soon, the distinction between telephone switches and LAN servers will disappear, as hybrid telephony servers are brought to market containing both switching and application-interface functions.
Computer telephony is at an important turning point: The necessary elements of the technology have been developed; now we need to educate large numbers of insightful practitioners who can put it to productive use.