






Origins

The term microcomputer came into popular use after the introduction of the minicomputer, although Isaac Asimov used the term microcomputer in his short story "The Dying Night" as early as 1956 (published in The Magazine of Fantasy and Science Fiction in July that year).[5] Most notably, the microcomputer replaced the many separate components that made up the minicomputer's CPU with one integrated microprocessor chip. The French developers of the Micral N (1973) filed their patents with the term "Micro-ordinateur", a literal equivalent of "Microcomputer", to designate the first solid state machine designed with a microprocessor. In the USA, the earliest models such as the Altair 8800 were often sold as kits to be assembled by the user, and came with as little as 256 bytes of RAM, and no input/output devices other than indicator lights and switches, useful as a proof of concept to demonstrate what such a simple device could do.[6] However, as microprocessors and semiconductor memory became less expensive, microcomputers in turn grew cheaper and easier to use:

Increasingly inexpensive logic chips such as the 7400 series allowed cheap dedicated circuitry for improved user interfaces such as keyboard input, instead of simply a row of switches to toggle bits one at a time.

Use of audio cassettes for inexpensive data storage replaced manual re-entry of a program every time the device was powered on.

Large, cheap arrays of silicon logic gates in the form of read-only memory and EPROMs allowed utility programs and self-booting kernels to be stored within microcomputers. These stored programs could automatically load further, more complex software from external storage devices without user intervention (a minimal sketch of such a boot sequence follows this list), forming an inexpensive turnkey system that did not require a computer expert to understand or operate the device.

Random access memory became cheap enough to afford dedicating approximately 1-2 kilobytes of memory to a video display controller frame buffer, for a 40x25 or 80x25 text display or blocky color graphics on a common household television. This replaced the slow, complex, and expensive teletypewriter that was previously common as an interface to minicomputers and mainframes.
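The memory figure above follows directly from the character-cell arithmetic: a 40x25 text screen has 1,000 character positions and an 80x25 screen has 2,000, at roughly one byte of RAM per character cell. The short Python sketch below is illustrative only; the function name and the one-byte-per-cell assumption are mine, and real display controllers of the era often stored additional attribute or color bytes as well.

    # Rough frame-buffer size for a character-cell text display.
    # Assumes one byte of RAM per character cell; real display controllers
    # of the era sometimes stored extra attribute/color bytes as well.
    def text_framebuffer_bytes(columns: int, rows: int, bytes_per_cell: int = 1) -> int:
        return columns * rows * bytes_per_cell

    for cols, rows in [(40, 25), (80, 25)]:
        size = text_framebuffer_bytes(cols, rows)
        print(f"{cols}x{rows} text display: {size} bytes (~{size / 1024:.1f} KB)")
    # Prints 1000 bytes (~1.0 KB) for 40x25 and 2000 bytes (~2.0 KB) for 80x25,
    # consistent with the "approximately 1-2 kilobytes" figure above.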
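Similarly, the self-booting behaviour described in the read-only-memory item above amounts to a small fixed ROM routine that copies a program from external storage into RAM and then transfers control to it. The Python below is a toy simulation of that idea, not any particular machine's firmware; every name in it (read_from_storage, rom_boot, LOAD_ADDRESS) is hypothetical.

    # Toy illustration of a "turnkey" boot: a ROM routine loads a program
    # image from external storage into RAM, then jumps to its entry point.
    RAM = bytearray(48 * 1024)     # pretend the machine has 48 KB of RAM
    LOAD_ADDRESS = 0x0100          # hypothetical address where the program is placed

    def read_from_storage(sector: int, count: int) -> bytes:
        """Stand-in for reading 'count' bytes from cassette tape or floppy disk."""
        return bytes([sector % 256] * count)   # dummy data for the sketch

    def rom_boot() -> int:
        """Minimal boot sequence: copy a program into RAM and report its entry point."""
        image = read_from_storage(sector=0, count=256)
        RAM[LOAD_ADDRESS:LOAD_ADDRESS + len(image)] = image
        return LOAD_ADDRESS        # a real ROM would jump here; the sketch just returns it

    print(f"Booted; control would transfer to address {rom_boot():#06x}")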

All these improvements in cost and usability resulted in an explosion in the popularity of microcomputers during the late 1970s and early 1980s. A large number of computer makers packaged microcomputers for use in small business applications. By 1979, companies such as Cromemco, Processor Technology, IMSAI, North Star Computers, Southwest Technical Products Corporation, Ohio Scientific, Altos Computer Systems, Morrow Designs and others produced systems designed either for resourceful end users or for consulting firms to deliver business systems such as accounting, database management, and word processing to small businesses. This gave businesses that could not afford to lease a minicomputer or buy time-sharing service the opportunity to automate business functions, without (usually) hiring full-time staff to operate the computers. A representative system of this era would have used an S-100 bus, an 8-bit processor such as the Intel 8080 or Zilog Z80, and either the CP/M or MP/M operating system.

The increasing availability and power of desktop computers for personal use attracted the attention of more software developers. In time, as the industry matured, the market for personal computers standardized around IBM PC compatibles running DOS, and later Windows. Modern desktop computers, video game consoles, laptops, tablet PCs, and many types of handheld devices, including mobile phones, pocket calculators, and industrial embedded systems, may all be considered examples of microcomputers according to the definition given above.



Description

Monitors, keyboards and other devices for input and output may be integrated or separate. Computer memory in the form of RAM, and at least one other, less volatile memory storage device, are usually combined with the CPU on a system bus in a single unit. Other devices that make up a complete microcomputer system include batteries, a power supply unit, a keyboard, and various input/output devices used to convey information to and from a human operator (printers, monitors, human interface devices). Microcomputers are designed to serve only one user at a time, although they can often be modified with software or hardware to serve more than one user concurrently. Microcomputers fit well on or under desks or tables, so that they are within easy reach of users. Bigger computers like minicomputers, mainframes, and supercomputers occupy large cabinets or even dedicated rooms.

A microcomputer comes equipped with at least one type of data storage, usually RAM. Although some microcomputers (particularly early 8-bit home micros) performed tasks using RAM alone, some form of secondary storage is normally desirable. In the early days of home micros, this was often a data cassette deck (in many cases an external unit). Later, secondary storage (particularly floppy disk and hard disk drives) was built into the microcomputer case.

Colloquial use of the term

Everyday use of the expression "microcomputer" (and in particular the "micro" abbreviation) declined significantly after the mid-1980s and has become uncommon since 2000.[7] The term is most commonly associated with the first wave of all-in-one 8-bit home computers and small-business microcomputers (such as the Apple II, Commodore 64, BBC Micro, and TRS-80). Although an increasingly diverse range of modern microprocessor-based devices fits the definition of "microcomputer" (or perhaps because of it), they are no longer referred to as such in everyday speech.

In common usage, "microcomputer" has been largely supplanted by the term "personal computer" or "PC", first coined in 1959,[8] which describes a computer designed to be used by one individual at a time. IBM first promoted the term "personal computer" to differentiate itself from other microcomputers, often called "home computers", as well as from IBM's own mainframes and minicomputers[citation needed]. However, following its release, the IBM PC itself was widely imitated, as was the term[citation needed]. The component parts were commonly available to producers, and the BIOS was reverse-engineered through clean-room design techniques. IBM PC compatible "clones" became commonplace, and the terms "personal computer", and especially "PC", stuck with the general public.

Since the advent of microcontrollers (monolithic integrated circuits containing RAM, ROM, and a CPU on a single chip), the term "micro" has more commonly been used to refer to those devices.

