Boyle’s law, formulated by the English scientist Robert Boyle, states that for a gas at constant temperature, the product of pressure and volume is a constant. This relationship means that pressure increases as volume decreases, and vice versa. On a graph of pressure against volume, the product of the two quantities should be equal at every point along a line of constant temperature.
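As a minimal numerical sketch (the function name, units, and values are illustrative), the law lets one pressure-volume state predict another at the same temperature:

```python
# Boyle's law: for a fixed amount of gas at constant temperature,
# P1 * V1 = P2 * V2. Halving the volume doubles the pressure.

def boyle_pressure(p1, v1, v2):
    """Pressure after an isothermal volume change, from P1*V1 = P2*V2."""
    return p1 * v1 / v2

# Example: a gas at 100 kPa in 2.0 L, compressed to 1.0 L
p2 = boyle_pressure(100.0, 2.0, 1.0)
print(p2)  # 200.0 (kPa): the pressure doubles
```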
In the 17th century experimenters discovered how to create a vacuum, something that Aristotle had declared impossible. This called attention to the ancient theory of Democritus, who had assumed that his atoms moved in a void. The French philosopher and mathematician René Descartes and his followers developed a mechanical view of matter in which the size, shape, and motion of minute particles explained all observed phenomena. Most natural philosophers and iatrochemists at this time assumed that gases had no chemical properties, hence their attention was centered on the physical behavior of gases. A kinetic-molecular theory of gases began to develop. Notable in this direction were the experiments of Robert Boyle, the English physicist and chemist whose studies of the “spring of the air” (elasticity) led to the formulation of what became known as Boyle's law, a generalization of the inverse relation between pressure and volume of a gas (see Gases).
PHLOGISTON: THEORY AND EXPERIMENT
While natural philosophers were thus speculating on mathematical laws, early chemists in their laboratories were attempting to use chemical theories to explain the very real chemical reactions they were observing. The iatrochemists paid particular attention to sulfur and the theories of Paracelsus. In the second half of the 17th century, the German physician, economist, and chemist Johann Joachim Becher built a system of chemistry around this principle. He noted that when organic matter burned, a volatile material seemed to leave the burning substance. His disciple, Georg Ernst Stahl, made this the central point of a theory that survived in chemical circles for nearly a century.
Stahl assumed that when anything burned, its combustible part was given off to the air. This part he called phlogiston, from the Greek word for “flammable.” The rusting of metals was analogous to combustion and therefore also involved loss of phlogiston. Plants absorbed the phlogiston from the air and thus were rich in it. Heating the calxes, or oxides, of metals with charcoal restored phlogiston to them. It followed from this that the calx was an element, and the metal a compound. This theory is almost exactly the reverse of the modern concept of oxidation-reduction (see Chemical Reaction), but it involves the cyclic transfer of a substance—even if in the wrong direction—and some observed phenomena could be explained by it. However, recent studies of the chemical literature of the period show that the phlogiston explanation had only minor influence among chemists until it was attacked by the wealthy amateur French chemist Antoine Laurent Lavoisier in the last quarter of the 18th century.
The 18th Century
At about the same time, another observation led to advances in the understanding of chemistry. As more and more chemicals were studied, chemists saw that certain substances combined more easily with, or had a greater affinity for, a given chemical than did others. Elaborate tables were drawn up showing relative affinities when different chemicals were brought together. Use of these tables made it possible to predict many chemical reactions before testing them in the laboratory.
All these advances led in the 18th century to the discovery of new metals and their compounds and reactions. Qualitative and quantitative analytical methods began to be developed, and the science of analytical chemistry was born. Nonetheless, as long as the part played by gases was believed to be only physical, the full scope of chemistry could not be recognized.
The chemical study of gases, generally called “airs,” became important after the British physiologist Stephen Hales developed the pneumatic trough to collect and measure the volume of gases released from solids heated in a closed system, with the gas collected over water. The pneumatic trough became a valuable device for the collection and study of gases uncontaminated by ordinary air. The study of gases advanced rapidly and led to a new level of understanding of many different gases.
The first real understanding of the role of gases in chemistry came in Edinburgh in 1756, when the British chemist Joseph Black published his studies on the reactions of magnesium and calcium carbonates (see Carbonates). When these compounds were heated, they gave off a gas and left a residue of what Black called calcined magnesia, or lime (the oxides). The latter reacted with “alkali” (sodium carbonate) to regenerate the original salts. Thus, the gas carbon dioxide, which Black called fixed air, took part in chemical reactions (was “fixed,” as he said). The idea that a gas could not enter a chemical reaction was overthrown, and soon a number of new gases were recognized as being distinct substances.
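In modern notation (which Black himself did not use, and which is offered here only as a reconstruction), the decompositions he studied, and the regeneration of the carbonate by slaked lime, can be written as:

```latex
% Heating the carbonates drives off "fixed air" (carbon dioxide):
\mathrm{MgCO_3} \xrightarrow{\;\text{heat}\;} \mathrm{MgO} + \mathrm{CO_2}
\qquad
\mathrm{CaCO_3} \xrightarrow{\;\text{heat}\;} \mathrm{CaO} + \mathrm{CO_2}
% Lime, slaked in water to Ca(OH)2, regenerates the carbonate from "alkali":
\mathrm{Ca(OH)_2} + \mathrm{Na_2CO_3} \longrightarrow \mathrm{CaCO_3} + 2\,\mathrm{NaOH}
```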
The British physicist Henry Cavendish isolated “flammable air” (hydrogen) in the next decade. He also introduced the use of mercury instead of water as the confining liquid over which gases were collected, making it possible to collect water-soluble gases. This variant was used extensively by the British chemist and theologian Joseph Priestley, who collected and studied almost a dozen new gases. Priestley's most important discovery was oxygen, and he quickly realized that this gas was the component of ordinary air that was responsible for combustion and made animal respiration possible. However, he reasoned that combustible substances burned more energetically in this gas, and metals formed calxes more readily, since it was devoid of phlogiston. Hence, the gas accepted the phlogiston present in the combustible substance or the metal more readily than ordinary air, which was already partially filled with phlogiston. He named this new gas “dephlogisticated air” and defended that belief to the end of his life.
Meanwhile chemistry had been making rapid progress in France, particularly in the laboratory of Lavoisier. He was troubled by the fact that metals gained weight when heated in air, even though they were presumably losing phlogiston.
In 1774 Priestley visited France and told Lavoisier about his discovery of dephlogisticated air. Lavoisier quickly saw the significance of this substance, and the way was opened for the chemical revolution that established modern chemistry. He used the name “oxygen,” meaning acid former.
The Birth of Modern Chemistry
Lavoisier showed by a series of brilliant experiments that air contains about 20 percent oxygen and that combustion is due to the combination of a combustible substance with oxygen. When carbon is burned, fixed air (carbon dioxide) is produced. Phlogiston therefore does not exist. The phlogiston theory was soon replaced by the view that oxygen from the air combines with the components of the combustible substance to form oxides of the component elements. Lavoisier used the laboratory balance to give quantitative support to his work. He defined elements as substances that could not be decomposed by chemical means and firmly established the law of the conservation of mass. He replaced the old system of chemical names (which was still based on alchemical usage) with the rational chemical nomenclature used today, and he helped to found the first chemical journal. After his death on the guillotine in 1794, his colleagues continued his work in establishing modern chemistry. A little later the Swedish chemist Jöns Jakob Berzelius proposed symbolizing atoms of the elements by the initial letters or pairs of letters from their names.
THE 19TH AND 20TH CENTURIES
By the beginning of the 19th century the precision of analytical chemistry had improved to such an extent that chemists were able to show that the simple compounds with which they worked contained fixed and unvarying amounts of their constituent elements. In certain cases, however, more than one compound could be formed between the same elements. At the same time the French chemist and physicist Joseph Gay-Lussac showed that the volume ratios of reacting gases were small whole numbers (which implies the interaction of discrete particles, later shown to be atoms). A major step in explaining these facts was the chemical atomic theory of the English scientist John Dalton in 1803.
Dalton assumed that when two elements combined, the resulting compound contained one atom of each. In his system, water could be given a formula corresponding to HO. He arbitrarily assigned to hydrogen the atomic weight of 1 and could then calculate the relative atomic weight of oxygen. Applying this principle to other compounds, he calculated the atomic weights of other elements and drew up a table of the relative atomic weights of all the then known elements. His theory contained many errors, but the idea was correct, and a precise quantitative value could then be assigned to the mass of each atom.
The major weaknesses in Dalton's theory were that he did not account for the law of multiple proportions and made no distinction between atoms and molecules. Thus, he could not distinguish between the possible formulas for water HO and H2O2, nor could he explain why the density of water vapor, with its assumed formula HO, was less than that of oxygen, assumed to have the formula O. The solution to these problems was found in 1811 by the Italian physicist Amedeo Avogadro. He suggested that the numbers of particles in equal volumes of gases at the same temperature and pressure were equal and that a distinction existed between molecules and atoms. When oxygen combined with hydrogen, a double atom of oxygen (a molecule in our terms) was split, each oxygen atom then combining with two hydrogen atoms, giving the molecular formula of H2O for water and O2 and H2 for molecules of oxygen and hydrogen.
Unfortunately, Avogadro's ideas were overlooked for nearly 50 years, and during this time great confusion prevailed among chemists in their calculations. It was not until 1860 that the Italian chemist Stanislao Cannizzaro reintroduced Avogadro's hypotheses. By this time chemists had found it more convenient to take the atomic weight of oxygen, 16, as the standard to which to relate the atomic weights of all the other elements instead of taking the value 1 for hydrogen, as Dalton had done. The molecular weight of oxygen, 32, was then used universally and, expressed in grams, was called the gram molecular weight of oxygen, or more simply, 1 mole of oxygen. Chemical calculations were standardized, and fixed formulas written.
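The arithmetic behind this standardization can be sketched numerically (a minimal illustration using the familiar rounded values, not Dalton's own imperfect data):

```python
# Water is roughly 8 parts oxygen to 1 part hydrogen by mass.
# The assumed formula decides oxygen's relative atomic weight (H = 1):
mass_ratio_O_to_H = 8.0

weight_O_dalton = mass_ratio_O_to_H * 1    # Dalton's HO: one H per O   -> 8
weight_O_avogadro = mass_ratio_O_to_H * 2  # Avogadro's H2O: two H per O -> 16

# With O = 16, molecular oxygen O2 weighs 32; 32 g of O2 is 1 mole.
grams_of_O2 = 64.0
moles_of_O2 = grams_of_O2 / 32.0           # -> 2.0 moles
print(weight_O_dalton, weight_O_avogadro, moles_of_O2)
```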
The old problem of the nature of chemical affinity remained unsolved. For a time it appeared that the answer might lie in the newly discovered field of electrochemistry. The discovery in 1800 of the voltaic pile, the first true battery, gave chemists a new tool, which led to the discovery of such metals as sodium and potassium. It seemed to Berzelius that positive and negative electrostatic forces might hold elements together; at first his theories were generally accepted. As chemists prepared and studied more new compounds and reactions in which electrical forces did not seem to be involved (the nonpolar compounds), the problem of affinity was shelved for a time.
New Fields of Chemistry
The most striking advances in chemistry in the 19th century were in the field of organic chemistry (see Chemistry, Organic). The structural theory, which gave a picture of how atoms were actually put together, was nonmathematical, but employed a logic of its own. It made possible the prediction and preparation of many new compounds, including a large number of important dyes, drugs, and explosives that gave rise to great chemical industries, especially in Germany.
At the same time, other branches of chemistry made their appearance. Stimulated by the advances in physics then being made, some chemists sought to apply mathematical methods to their science. Studies of reaction rates led to the development of kinetic theories that had value both for industry and for pure science. The recognition that heat was due to motion on the atomic scale, a kinetic phenomenon, led to the abandonment of the idea that heat was a specific substance (termed caloric) and initiated the study of chemical thermodynamics (see Thermodynamics). Continuation of electrochemical studies led the Swedish chemist Svante August Arrhenius to postulate the dissociation of salts in solution to form ions carrying electrical charges. Studies of the emission and absorption spectra of elements and compounds became important to both chemists and physicists (see Spectroscopy; Spectrum). In addition, fundamental research in colloid and photochemistry was begun. By the end of the 19th century, studies of this type were combined into the field known as physical chemistry (see Chemistry, Physical).
Inorganic chemistry also required organization. The number of new elements being discovered continued to grow, but no method of classification had been developed that could bring order to their reactions. The independent development of the periodic law by the Russian chemist Dmitry Ivanovich Mendeleyev in 1869 and the German chemist Julius Lothar Meyer in 1870 eliminated this confusion and indicated where new elements would be found and what their properties would be (see Elements, Chemical; Periodic Law).
At the end of the 19th century chemistry, like physics, seemed to have reached a stage in which no striking new fields remained to be developed. This view changed completely with the discovery of radioactivity. Chemical methods were used in isolating new elements such as radium, in the separation of the new class of substances known as isotopes, and in the synthesis and isolation of the new transuranium elements. The new picture of the actual structure of atoms obtained by physicists solved the old problem of chemical affinity and explained the relation between polar and nonpolar compounds. See Nuclear Chemistry.
The other major advance for chemistry in the 20th century was the foundation of biochemistry. This began with the simple analysis of body fluids; methods were then rapidly developed for determining the nature and function of the most complex cell constituents. By midcentury biochemists had unraveled the genetic code and explained the function of the gene, the basis of all life; the field had grown so vast that its study had become a new science, molecular biology. See also Genetics.
Recent Research in Chemistry
Recent advances in biotechnology and materials science are helping to define the frontiers of chemical research. In biotechnology, sophisticated analytical instruments have made it possible to initiate an international effort to sequence the human genome. Success in this project will likely completely change the nature of such fields as molecular biology and medicine. Materials science, an interdisciplinary combination of physics, chemistry, and engineering, is guiding the design of advanced materials and devices. A recent example is the discovery of high-temperature superconductors, ceramic compounds whose superconducting transition temperatures lie above 77 K (-196° C/-321° F), the boiling point of liquid nitrogen (see Superconductivity). Characterization of surfaces is being advanced by the invention of the scanning tunneling microscope, which can provide images of certain surfaces with atomic-scale resolution. See Microscope.
Even in conventional fields of chemical research, new, more powerful analytical tools are providing unprecedented detail of chemicals and their reactions. For example, laser techniques are providing snapshots of gas-phase chemical reactions on the femtosecond (a millionth of a billionth of a second) time scale. A new form of carbon, called buckminsterfullerene, which has the shape of a soccer ball and the chemical formula C60, has been isolated from the soot produced by graphite electrodes. This compound and its chemistry have been characterized with astonishing rapidity using the vast array of analytical techniques currently available. Certain alkali metal salts of this compound have even been found to be superconducting.
The Chemical Industry
The growth of chemical industries and the training of professional chemists had an interestingly shared history. Until about 150 years ago chemists were not trained professionally. Chemistry was advanced by the work of those who were interested in the subject, but who made no systematic effort to train new workers in the field. Physicians and wealthy amateurs often hired assistants, only some of whom continued their masters' work.
Early in the 19th century, however, this haphazard system of chemical education changed. Many provincial universities were established in Germany, a country with a long tradition of research. A research center in chemistry was set up at Giessen by the German chemist Justus Liebig. This first teaching laboratory became so successful that it drew students from all over the world; other German universities soon followed.
A large group of young chemists was thus trained just at the time when chemical industries were beginning to exploit new discoveries. This exploitation had its start during the Industrial Revolution; the Leblanc process for the production of soda, for example—one of the first large-scale production processes—was developed in France in 1791 and was commercialized in England beginning in 1823. The laboratories of such growing industries were able to employ the newly trained chemistry students and also to use university professors as consultants. This interplay between the universities and the chemical industry benefited both of them, and the accompanying rapid growth of the organic chemical industry toward the end of the 19th century created the great German dye and pharmaceutical trusts that gave Germany scientific predominance in the field until World War I.
After the war, the German system was introduced into all the industrial nations of the world, and chemistry and chemical industries progressed even more rapidly. Among some of the more recent industrial developments, increasing use has been made of enzymatic reaction processes (see Enzyme), mainly because of the low costs and high yields that can be achieved. Industries are at present studying production methods using genetically altered microorganisms for industrial purposes (see Genetic Engineering).
Chemistry and Society
Chemistry has had an enormous influence on human life. In earlier periods chemical techniques were used to isolate useful natural products and to find new ways to employ them. In the 19th century techniques were developed for synthesizing completely new substances that were either better than the natural ones or could completely replace them more cheaply. As the complexity of synthesized compounds increased, wholly new materials with novel uses began to appear. Plastics and new textiles were developed, and new drugs conquered whole classes of disease. At the same time, what had been entirely separate sciences began to be drawn together. Physicists, biologists, and geologists had developed their own techniques and ways of looking at the world, but now it became evident that each science, in its own way, was the study of matter and its changes. Chemistry lay at the base of each of them. The resulting formation of such interscientific disciplines as geochemistry or biochemistry has stimulated all of the parent sciences.
The progress of science in recent years has been spectacular, although the benefits of this progress have not been without some corresponding liabilities. The most obvious dangers come from radioactive materials, with their potential for producing cancers in exposed individuals and mutations in their children. It has also become apparent that the accumulation in plant and animal cells of pesticides once thought harmless, or of by-products from manufacturing processes, often has damaging effects. These dangerous materials have been manufactured in enormous amounts and dispersed widely, and it has become the task of chemistry to discover the means by which these substances can be rendered harmless. This is one of the greatest challenges science will have to meet.
Robot, computer-controlled machine that is programmed to move, manipulate objects, and accomplish work while interacting with its environment. Robots are able to perform repetitive tasks more quickly, cheaply, and accurately than humans. The term robot originates from the Czech word robota, meaning “compulsory labor.” It was first used in the 1921 play R.U.R. (Rossum's Universal Robots) by the Czech novelist and playwright Karel Capek. The word robot has been used since to refer to a machine that performs work to assist people or work that humans find difficult or undesirable.
The concept of automated machines dates to antiquity with myths of mechanical beings brought to life. Automata, or manlike machines, also appeared in the clockwork figures of medieval churches, and 18th-century watchmakers were famous for their clever mechanical creatures.
Feedback (self-correcting) control mechanisms were used in some of the earliest robots and are still in use today. An example of feedback control is a watering trough that uses a float to sense the water level. When the water falls past a certain level, the float drops, opens a valve, and releases more water into the trough. As the water rises, so does the float. When the float reaches a certain height, the valve is closed and the water is shut off.
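The float-valve trough can be sketched as a tiny simulation of on/off (bang-bang) feedback; the function name and all numbers here are illustrative, not taken from any real device:

```python
# Minimal simulation of the float-valve trough: a bang-bang (on/off)
# feedback controller that keeps the water level within a band.

def simulate_trough(level, low=4.0, high=10.0, outflow=1.0, inflow=3.0, steps=50):
    """Track the water level; the 'float' opens the valve below `low`
    and closes it once the level reaches `high`."""
    valve_open = False
    history = []
    for _ in range(steps):
        if level < low:
            valve_open = True        # float drops, valve opens
        elif level >= high:
            valve_open = False       # float rises, valve shuts
        level += (inflow if valve_open else 0.0) - outflow
        history.append(level)
    return history

levels = simulate_trough(level=8.0)
# Feedback keeps the level cycling within a bounded band
print(min(levels), max(levels))
```

The level oscillates rather than settling exactly, which is characteristic of on/off control; the Watt governor described below achieves smoother regulation by acting proportionally to the error.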
The first true feedback controller was the Watt governor, invented in 1788 by the Scottish engineer James Watt. This device featured two metal balls connected to the drive shaft of a steam engine and also coupled to a valve that regulated the flow of steam. As the engine speed increased, the balls swung out due to centrifugal force, closing the valve. The flow of steam to the engine was decreased, thus regulating the speed.
Feedback control, the development of specialized tools, and the division of work into smaller tasks that could be performed by either workers or machines were essential ingredients in the automation of factories in the 18th century. As technology improved, specialized machines were developed for tasks such as placing caps on bottles or pouring liquid rubber into tire molds. These machines, however, had none of the versatility of the human arm; they could not reach for objects and place them in a desired location.
The development of the multijointed artificial arm, or manipulator, led to the modern robot. A primitive arm that could be programmed to perform specific tasks was developed by the American inventor George Devol, Jr., in 1954. In 1975 the American mechanical engineer Victor Scheinman, while a graduate student at Stanford University in California, developed a truly flexible multipurpose manipulator known as the Programmable Universal Manipulation Arm (PUMA). PUMA was capable of moving an object and placing it with any orientation in a desired location within its reach. The basic multijointed concept of the PUMA is the template for most contemporary robots.
HOW ROBOTS WORK
The inspiration for the design of a robot manipulator is the human arm, but with some differences. For example, a robot arm can extend by telescoping—that is, by sliding cylindrical sections one over another to lengthen the arm. Robot arms also can be constructed so that they bend like an elephant trunk. Grippers, or end effectors, are designed to mimic the function and structure of the human hand. Many robots are equipped with special-purpose grippers to grasp particular devices such as a rack of test tubes or an arc-welder.
The joints of a robotic arm are usually driven by electric motors. In most robots, the gripper is moved from one position to another, changing its orientation. A computer calculates the joint angles needed to move the gripper to the desired position in a process known as inverse kinematics.
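For a planar arm with only two joints, inverse kinematics has a closed-form solution; the sketch below (a simplified illustration, not any particular robot's controller, with all names and values invented) finds one of the two mirror-image solutions and checks it with forward kinematics:

```python
import math

def two_link_ik(x, y, l1, l2):
    """Joint angles (shoulder, elbow) in radians placing the tip of a
    two-link planar arm at (x, y); returns one of the two solutions."""
    d2 = x * x + y * y
    # Law of cosines gives the elbow angle
    cos_elbow = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if not -1.0 <= cos_elbow <= 1.0:
        raise ValueError("target out of reach")
    elbow = math.acos(cos_elbow)
    shoulder = math.atan2(y, x) - math.atan2(l2 * math.sin(elbow),
                                             l1 + l2 * math.cos(elbow))
    return shoulder, elbow

def forward(shoulder, elbow, l1, l2):
    """Forward kinematics, used here to verify the solution."""
    x = l1 * math.cos(shoulder) + l2 * math.cos(shoulder + elbow)
    y = l1 * math.sin(shoulder) + l2 * math.sin(shoulder + elbow)
    return x, y

s, e = two_link_ik(1.2, 0.5, l1=1.0, l2=1.0)
print(forward(s, e, 1.0, 1.0))  # approximately (1.2, 0.5)
```

Real manipulators with six or more joints generally require numerical methods and must choose among multiple solutions, but the principle is the same: work backward from the desired gripper position to the joint angles.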
Some multijointed arms are equipped with servo, or feedback, controllers that receive input from a computer. Each joint in the arm has a device to measure its angle and send that value to the controller. If the actual angle of the arm does not equal the computed angle for the desired position, the servo controller moves the joint until the arm's angle matches the computed angle. Controllers and associated computers also must process sensor information collected from cameras that locate objects to be grasped, or from touch sensors on grippers that regulate the grasping force.
Any robot designed to move in an unstructured or unknown environment will require multiple sensors and controls, such as ultrasonic or infrared sensors, to avoid obstacles. Robots, such as the National Aeronautics and Space Administration (NASA) planetary rovers, require a multitude of sensors and powerful onboard computers to process the complex information that allows them mobility. This is particularly true for robots designed to work in close proximity with human beings, such as robots that assist persons with disabilities and robots that deliver meals in a hospital. Safety must be integral to the design of human service robots.
USES FOR ROBOTS
In 1995 about 700,000 robots were operating in the industrialized world. Over 500,000 were used in Japan, about 120,000 in Western Europe, and about 60,000 in the United States. Many robot applications are for tasks that are either dangerous or unpleasant for human beings. In medical laboratories, robots handle potentially hazardous materials, such as blood or urine samples. In other cases, robots are used in repetitive, monotonous tasks in which human performance might degrade over time. Robots can perform these repetitive, high-precision operations 24 hours a day without fatigue. A major user of robots is the automobile industry. General Motors Corporation uses approximately 16,000 robots for tasks such as spot welding, painting, machine loading, parts transfer, and assembly. Assembly is one of the fastest growing industrial applications of robotics. It requires higher precision than welding or painting and depends on low-cost sensor systems and powerful inexpensive computers. Robots are used in electronic assembly where they mount microchips on circuit boards.
Activities in environments that pose great danger to humans, such as locating sunken ships, cleanup of nuclear waste, prospecting for underwater mineral deposits, and active volcano exploration, are ideally suited to robots. Similarly, robots can explore distant planets. NASA's Galileo, an unpiloted space probe, reached Jupiter in 1995 and performed tasks such as determining the chemical content of the Jovian atmosphere.
Robots are being used to assist surgeons in installing artificial hips, and very high-precision robots can assist surgeons with delicate operations on the human eye. Research in telesurgery uses robots under the remote control of expert surgeons; such systems may one day perform operations on distant battlefields.
IMPACT OF ROBOTS
Robotic manipulators create manufactured products that are of higher quality and lower cost. But robots can cause the loss of unskilled jobs, particularly on assembly lines in factories. New jobs are created in software and sensor development, in robot installation and maintenance, and in the conversion of old factories and the design of new ones. These new jobs, however, require higher levels of skill and training. Technologically oriented societies must face the task of retraining workers who lose jobs to automation, providing them with new skills so that they can be employable in the industries of the 21st century.
Automated machines will increasingly assist humans in the manufacture of new products, the maintenance of the world's infrastructure, and the care of homes and businesses. Robots will be able to make new highways, construct steel frameworks of buildings, clean underground pipelines, and mow lawns. Prototypes of systems to perform all of these tasks already exist.
One important trend is the development of microelectromechanical systems, ranging in size from centimeters to millimeters. These tiny robots may be used to move through blood vessels to deliver medicine or clean arterial blockages. They also may work inside large machines to diagnose impending mechanical problems.
Perhaps the most dramatic changes in future robots will arise from their increasing ability to reason. The field of artificial intelligence is moving rapidly from university laboratories to practical application in industry, and machines are being developed that can perform cognitive tasks, such as strategic planning and learning from experience. Increasingly, diagnosis of failures in aircraft or satellites, the management of a battlefield, or the control of a large factory will be performed by intelligent computers.
Factory System, working arrangement whereby a number of persons cooperate to produce articles of consumption. Today the term factory generally refers to a large establishment employing many people involved in mass production of industrial or consumer goods. Some form of the factory system, however, has existed since ancient times.
Pottery works have been uncovered in ancient Greece and Rome. In various parts of the Roman Empire factories manufactured glassware and bronze ware and other similar articles for export as well as for domestic consumption. In the Middle Ages, large silk factories were operated in the Syrian cities of Antakya and Tyre; and in Europe, during the late medieval period, textile factories were established in several countries, notably in Italy, Flanders (now Belgium), France, and England.
During the Renaissance, the advance of science, contact with the New World, and the development of new trade routes to the Far East stimulated commercial activity and the demand for manufactured goods and thereby promoted industrialization. In western Europe and particularly in England, during the 16th and 17th centuries, many factories were created to produce such goods as paper, firearms, gunpowder, cast iron, glass, items of clothing, beer, and soap. Although heavy machinery, operated by water power in some places, was used in a few establishments, the industrial processes were generally carried on by means of hand labor and simple tools. In contrast to modern mechanized plants with assembly lines, the factories were merely large workshops where each laborer functioned independently. Nor were factories the most usual place of production; although some workers used their employer's tools and worked on the premises, most manufacturing was done under the domestic, or putting-out, system, by which workers received the raw materials, worked in their own homes, returned the finished articles, and were paid for their labor.
DEVELOPMENT OF THE FACTORY SYSTEM
The factory system, which eventually replaced the domestic system and became the characteristic method of production in modern economies, began to develop in the late 18th century, when a series of inventions transformed the British textile industry and marked the beginning of the Industrial Revolution. Among the most important of these inventions were the flying shuttle patented (1733) by John Kay, the spinning jenny (1764) of James Hargreaves, the water frame for spinning (1769) of Sir Richard Arkwright, the spinning mule (1779) of Samuel Crompton, and the power loom (1785) of Edmund Cartwright. These inventions mechanized many of the hand processes involved in spinning and weaving, making it possible to produce textiles much more quickly and cheaply. Many of the new machines were too large and costly to be used at home, however, and it became necessary to move production into factories.
One of the major technological breakthroughs early in the Industrial Revolution was the invention of a practical steam engine. When textile factories first became mechanized, only water power was available to operate the machinery; the factory owner was forced to locate the establishment near a water supply, sometimes in an isolated and inconvenient area far from a labor supply. After 1785, when a steam engine was first installed in a cotton factory, steam began to replace water as power for the new machinery. Manufacturers could build factories closer to a labor supply and to markets for the goods produced. The development of the steam locomotive and steamship in the early 19th century made it possible to ship factory-built products to distant markets more rapidly and economically, thus encouraging industrialization.
The Arkwright method of spinning was introduced into the U.S. in 1790 by Samuel Slater, a former apprentice in a British mechanized textile factory who started a factory in Pawtucket, Rhode Island. From that time on, mechanized textile factories sprang up throughout New England. In 1814, at a cotton mill established by the American industrialist Francis Cabot Lowell in Waltham, Massachusetts, all the steps of an industrial process were, for the first time, combined under one roof; here, cotton entered the factory as raw fiber and emerged as finished goods ready for sale.
Textiles, particularly cotton goods, were the major factory-made products during the early 19th century. Meanwhile, new machinery and techniques were being invented that made it possible to extend the factory system to other industries. The American inventor Eli Whitney, who stimulated textile manufacturing in the U.S. by inventing the cotton gin in 1793, made an equally important, if not greater, contribution to the factory system by developing the idea of using interchangeable parts in making firearms. Interchangeable parts, with which Whitney began experimenting in 1798, eventually made it possible to produce firearms by assembly line techniques, rather than custom work, and to repair them quickly with premade parts. The idea of interchangeable parts was applied to the manufacture of timepieces from about 1820 on. Then, in the 1850s, at Waltham, Massachusetts, automatic machinery was used for the first time to make watches by a consecutive process in a single factory. Thus, by the middle of the 19th century, American factories had begun to develop the outstanding feature of the modern factory system: mass production of standardized articles.
The garment industries were revolutionized by the sewing machine, patented in 1846 by the American inventor Elias Howe, and underwent a tremendous expansion during the 1860s. Spurred by the urgent demand for uniforms during the American Civil War, clothing manufacturers developed standardized sizes, a prerequisite for mass production of ready-made garments. At the same time, the military demand for shoes stimulated the creation of shoe-sewing machinery to mass-produce footwear.
As the 20th century began, the factory system of production prevailed throughout the United States and most of Western Europe. It reached its greatest European development in Germany, England, the Netherlands, and Belgium, which became, to a great extent, importers of food and raw materials and exporters of factory-made commodities. In 1913 Henry Ford, the pioneer automobile manufacturer, made an immense contribution to the expansion of the factory system in the U.S. when he introduced assembly line techniques to automobile production in the Ford Motor plant. In time the factory system spread to the Orient, where cheap labor attracted capital from the industrialized countries of the West. Japan, which had begun to industrialize in the late 19th century, rapidly became the foremost industrial power of Asia and a serious competitor of the Western nations.
The general trend of development of the factory system has been toward larger establishments with greater capital investment per worker. In the U.S. the number of manufacturing establishments actually declined from about 500,000 in 1899 to about 325,000 in the early 1980s, but the number of workers employed increased greatly, as did the value added to the economy by manufacture. By the mid-1980s, however, many factories felt the impact of serious problems in manufacturing industries, especially in the production of textiles, steel, automobiles, machine tools, and electrical equipment. Of major concern was the proliferation of cheap foreign imports. Cuts in these industries have led to relocation of businesses and factory closings, with accompanying loss of jobs and even economic devastation in some regions.
Other important trends have been the rise to leadership positions of professional managers who treat factory organization and operation as a science, and the development and use of increasingly sophisticated equipment in modern factory operation. Some machines, aided by computers, semiconductors, and other technological innovations of the mid-20th century, are so nearly self-regulating that an entire factory may be kept running by a few people operating sets of controls. This method of production, called automation, has brought many economic changes, which eventually may be as basic as those resulting from the Industrial Revolution.
WORKING CONDITIONS IN FACTORIES
The introduction of the factory system had a profound effect on social relationships and living conditions. In earlier times the feudal lord and the guildmaster both had been expected to take some responsibility for the welfare of the serfs, apprentices, and journeymen who worked under them (see Feudalism; Guild). By contrast, the factory owners were considered to have discharged their obligations to employees with the payment of wages; thus, most owners took an impersonal attitude toward those who worked in their factories. This was in part because no particular strength or skill was required to operate many of the new factory machines. The owners of the early factories often were more interested in hiring a worker cheaply than in any other qualification. Thus they employed many women and children, who could be hired for lower wages than men. These low-paid employees had to work for as long as 16 hours a day; they were subjected to pressure, and even physical punishment, in an effort to make them speed up production. Since neither the machines nor the methods of work were designed for safety, many fatal and maiming accidents resulted. In 1802 the exploitation of pauper children led to the first factory legislation in England. That law, which limited a child's workday to 12 hours, and other legislation that followed were not strictly enforced.
The workers in the early mill towns were not in a position to act in their own interest against the factory owners. The first cotton mills were located in small villages where all the shops and inhabitants depended on a single factory for their livelihood. Few dared to challenge the will of the person who owned such a factory and controlled the lives of the workers both on and off the job. The long hours of work and low wages kept a laborer from leaving the community or being otherwise exposed to outside influences. Later, when factories were located in larger cities, the disadvantages of the mill town gave way to such urban evils as overcrowded sweatshops and slums. In addition, the phenomenon of the business cycle began to manifest itself, subjecting industrial laborers to the frequent threat of unemployment.
REFORMS AND CHANGES
By the early 19th century the condition of workers under the factory system had aroused concern. One who called for reform was Robert Owen, a British self-made capitalist and cotton mill owner, who tried to set an example by transforming a squalid Scottish mill town called New Lanark into a model industrial community between 1815 and 1828. At New Lanark, wages were higher and hours shorter, young children were kept out of the factory and sent to school, and employee housing was superior by the standards of the day; yet the mill operated at a substantial profit. In Owen's day modern trade unions were beginning to develop in the British Isles, and he sought to organize them into a national movement. His aim was to improve working conditions as well as effect basic social and economic reforms. In his concern for the increasing differences between capital and labor, Owen was joined by such economic theorists as the Frenchmen Charles Fourier, Claude Henri de Saint-Simon, and Pierre Joseph Proudhon and the Germans Karl Marx and Friedrich Engels, each of whom analyzed the processes of modern industrial society and proposed social and industrial reforms.
In time, organized protest forced owners to correct some of the worst abuses. Workers agitated for and obtained the right to vote, and they established political parties and labor unions. The unions, after a considerable struggle and frequent setbacks, won important concessions from management and government, including the right to organize workers in factories and to represent them in negotiations (see Trade Union; Trade Unions in the United States). Furthermore, issues and problems germane to the factory system came to figure prominently in the formulation of modern political and economic theory (see Labor Relations). In the Soviet Union, the factory became a social and political, as well as an industrial, unit (see Socialism; Union of Soviet Socialist Republics).
One of the important and often overlooked consequences of the factory system was its promotion of the emancipation of women. The factory created wage-earning opportunities for women, enabling them to become economically independent. Thus, industrialization began to change the family relationship and the status of women. See Women, Employment of.
The inspection of factories by state agencies began in England in the early 19th century in response to public protest against the working conditions for women and child laborers. Later, wherever the factory system spread, governments eventually adopted regulations against unhealthful and dangerous conditions. Thus, a factory code became standard in every industrialized country. These codes provided for restrictions on child labor and hours of work, regulation of sanitary conditions, installation of safety devices and the enforcement of safety standards, medical supervision, adequate ventilation, the elimination of sweatshops, and the establishment of minimum wages. One important regulating agency was the International Association of Factory Inspection, established in 1886 by Canada and 14 states of the U.S. The International Labor Organization, acting in cooperation first with the League of Nations and later with the United Nations, correlated the regulation of factory conditions throughout the world.
In the U.S., the federal government is responsible for regulating the working conditions in factories and most other places of employment. Prior to the early 1970s, each of the states regulated the inspection of factories within its own borders. In 1970 the Occupational Safety and Health Administration (OSHA) was established as an agency of the U.S. Department of Labor. OSHA gradually took over the regulation of health and safety standards in the workplace. Although some states still maintain their own inspection plans, all are monitored by OSHA to ensure that stringent standards are maintained. A citation is issued for each violation and a fine may be imposed for a serious infraction. Factory inspection also includes examination of payrolls and employment records. Any establishment covered by the Fair Labor Standards Act is subject to review by the Wage and Hour Division of the Labor Department to ascertain whether employers are complying with regulations.
Manufacturing, producing goods that are necessary for modern life from raw materials. The word manufacture comes from the Latin manus (hand) and facere (to make). Originally manufacturing was accomplished by hand, but most manufacturing operations today are highly mechanized and automated.
There are three main processes involved in virtually all manufacturing: assembly, extraction, and alteration. Assembly is the combination of parts to make a product. For example, an airplane is assembled when the manufacturer puts together the engines, wings, and fuselage. Extraction is the process of removing one or more components from raw materials, such as obtaining gasoline from crude oil. Alteration is modifying or molding raw materials into a final product—for example, sawing trees into lumber.
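The three-way classification above can be summarized in a short sketch. The enumeration follows the article's definitions; the product-to-process mapping is illustrative only, reusing the article's own examples.

```python
from enum import Enum

# The three basic manufacturing processes described above, modeled as an
# enumeration. The descriptions paraphrase the article's definitions.
class Process(Enum):
    ASSEMBLY = "combining parts into a product"
    EXTRACTION = "removing components from raw materials"
    ALTERATION = "modifying raw materials into a final product"

# Illustrative mapping of products to the process that chiefly makes them,
# using the article's examples.
EXAMPLES = {
    "airplane": Process.ASSEMBLY,    # engines, wings, and fuselage joined
    "gasoline": Process.EXTRACTION,  # separated from crude oil
    "lumber": Process.ALTERATION,    # trees sawed into boards
}

for product, process in EXAMPLES.items():
    print(f"{product}: {process.name.lower()} - {process.value}")
```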
Science and engineering are required to develop new products and to create new manufacturing methods, but there are other factors involved in the manufacturing process. Legal matters, such as obtaining operating permits and meeting industrial safety standards, must be adhered to. Economic considerations, such as competition, worldwide markets, and tariffs, control to some degree what prices are set for manufactured goods and what inventories are needed.
HISTORY OF MANUFACTURING
Manufacturing has existed as long as civilizations have required goods: bricks to build the Mesopotamian city of Erech (Uruk), clay pots to store grain in ancient Greece, or bronze weapons for the Roman Empire. In the Middle Ages, silk factories operated in Syria, and textile mills were established in Italy, Belgium, France, and England. New routes discovered from Europe to the Far East and to the New World during the Renaissance (14th century to 17th century) stimulated demand for manufactured goods to trade. Factories were built to produce gunpowder, clothing, cast iron, and paper. The manufacturing of these goods was primarily done by hand labor, simple tools, and, rarely, by machines powered by water.
The Industrial Revolution began in England in the middle of the 18th century when the first modern factories appeared, primarily for the production of textiles. Machines, to varying degrees, began to replace the workforce in these modern factories. The cotton gin, created by the American inventor Eli Whitney in 1793, mechanically removed cotton fibers from the seed and increased production. In 1801 Joseph Jacquard, a French inventor, created a loom that used cards with punched holes to automate the placement of threads in the weaving process. The development of the steam engine as a reliable power source by Thomas Newcomen, James Watt, and Richard Trevithick in England, and in America by Oliver Evans, enabled factories to be built away from water sources that had previously been needed to power machines. From the 1790s to the 1830s, more than 100,000 power looms and 9 million spindles were put into service in England and Scotland (see Factory System; Industrial Revolution).
In addition to inventing the cotton gin, Eli Whitney made another contribution to the factory system in 1798 by proposing the idea of interchangeable parts. Interchangeable parts make it possible to produce goods quickly because repairs and assembly can be done with previously manufactured, standard parts rather than with costly custom-made ones. This idea led to the development of the assembly line, where a product is manufactured in discrete stages. When one stage is complete, the product is passed to another station where the next stage of production is accomplished. In 1913 the American industrialist Henry Ford and his colleagues first introduced a conveyor belt to an assembly line for flywheel magnetos, a type of simple electric generator, more than tripling production. The assembly line driven by a conveyor belt was then implemented to manufacture the automobile body and motors.
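The discrete-stage flow described above can be sketched as a simple pipeline in which every product visits a fixed sequence of stations in order. The station names below are hypothetical, chosen only to echo Ford's magneto line.

```python
# A minimal sketch of an assembly line: each product passes through a fixed
# sequence of stations, and each station performs one discrete stage of
# production. Station names are invented for illustration.
STATIONS = ["mount core", "wind coil", "attach magnets", "test output"]

def run_line(products):
    """Pass every product through each station in order, recording the work."""
    history = {}
    for product in products:
        steps = []
        for station in STATIONS:
            steps.append(f"{station} -> {product}")
        history[product] = steps
    return history

log = run_line(["magneto-1", "magneto-2"])
print(len(log["magneto-1"]))  # each product passes through 4 stations
```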
Labor unions, associations of workers whose goal is to improve their economic conditions, originated in the craft guilds of 16th-century Europe. The modern labor movement, however, did not start until the late 19th century, when reliable railroad systems were developed. Railroads brought materials from diverse locations for final manufacturing and assembly and created a large demand for industrial labor. Labor unions gained enormous strength after World War II (1939-1945), when the United States had both high inflation and a huge population of factory workers. This combination forced labor unions to negotiate for better contracts and wages, and they achieved significant influence in industry. Today fewer manufacturing jobs and the trend for factories to relocate to foreign countries have combined to diminish the strength of organized labor (see Trade Unions).
Military Operations and Manufacturing
When the United States joined the Allies against Hitler in World War II, the country was in its 11th year of economic depression, 17 percent of the workforce was unemployed, and manufacturers were unprepared to mobilize for wartime production. President Franklin Delano Roosevelt succeeded in motivating the industrial complex to invest in new manufacturing facilities through a combination of generous business contracts, tax laws, and patriotism. By 1943 manufacturing capacity had increased dramatically: 10,000 military airplanes were produced a month, and it took only 69 days to build a warship. When World War II ended, the United States was the leading producer of manufactured goods. After the war, part of this vast military manufacturing capacity was converted to create consumer items such as automobiles, furniture, and televisions.
The development of the Cold War between Communist and non-Communist powers was accompanied by a buildup of manufactured weapons such as fighter airplanes and bombers, submarines, missiles, and nuclear weapons. The shift to a military manufacturing base accelerated the development of space science and advanced electronics, particularly integrated circuitry, which would eventually become the processing engine for the modern personal computer. Computers, in turn, have helped increase the productivity of modern manufacturing plants because they enable automated design, production, and record keeping (see Computer-Aided Design/Computer-Aided Manufacture).
Types of Manufacturing
Manufacturing processes can produce either durable or nondurable goods. Durable goods are products that exist for long periods of time without significant deterioration, such as automobiles, airplanes, and refrigerators. Nondurable goods are items that have a comparatively limited life span, such as clothing, food, and paper.
Iron and Steel Manufacture
Iron manufacturing originated about 3500 years ago when iron ore was accidentally heated in the presence of charcoal. The oxygen-laden ore was reduced to a product similar to modern wrought iron.
Today, iron is made from ore in blast furnaces. Oxygen and other elements are removed when the ore is mixed with coke (a material that contains mostly carbon) and limestone and then blasted with hot air. The gases formed by the burning materials combine with the oxygen in the ore and reduce the ore to iron. This molten iron still contains many impurities, however. Steel is manufactured by first removing these impurities and then adding elements, predominantly carbon, in a controlled manner. Steels contain up to about 2 percent carbon; in general, higher carbon content produces a harder, stronger steel. The steel is then shaped into bars, plates, sheets, and such structural components as girders (see Iron and Steel Manufacture).
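The carbon limits mentioned above can be illustrated with a rough classification by carbon content. The 2 percent boundary follows the text; the other thresholds are assumptions added for illustration, and real metallurgical boundaries are more nuanced.

```python
# Rough classification of iron-carbon alloys by carbon content (percent by
# weight). The 2 percent upper bound for steel follows the article; the
# other cut-offs are illustrative assumptions.
def classify_alloy(carbon_percent):
    if carbon_percent <= 0.008:
        return "nearly pure iron"
    elif carbon_percent <= 2.0:
        return "steel"
    else:
        return "cast iron"

print(classify_alloy(0.4))  # steel
print(classify_alloy(3.5))  # cast iron
```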
Textile Manufacture
Raw fibers of cotton, wool, or synthetic materials such as nylon and polyester go through a complex series of processes to form fabrics for apparel, home furnishings, and biomedical, recreation, and aerospace products. In most cases, loose tufts of fiber are straightened, and the thick ropelike slivers are thinned for spinning. In the spinning process, the fibers are twisted to add strength. Synthetic fibers are generally made in a continuous string, but sometimes they go through a texturing process to give them a natural appearance. These twisted fibers, known as yarns, are then woven or knitted into fabrics. Weaving is a process that interlaces two sets of yarns, the warp and filling, in a variety of patterns that impart design and different physical characteristics. Knitting is a technique that loops yarns together to form fabric. The fabrics are then dyed, and finishes applied (see Textiles).
Lumber Manufacture
The lumber industry converts trees into construction materials or the precursor material for pulp and paper. Trees are harvested, debarked, then sawed into usable shapes such as boards and slabs. The lumber is graded for use and quality and then dried in large kilns, or ovens. Lumber is manufactured into boards, plywood, composition board, or paneling. Pulp wood for paper is sent directly to the manufacturer without sawing or drying (see Lumber Industry).
Automobile Manufacture
The automobile was the first major manufactured item built by a mass production system using cost-effective assembly line techniques. Today, before an automobile reaches its final assembly point, subsystems, such as the engine, transmission, electrical components, and chassis, are fabricated from raw materials in other specialized facilities. The metallic automobile body parts are stamped and welded together by robots into a unibody, or one-piece, construction. This body is then dipped in a succession of chemical baths that rustproof and provide undercoat and paint treatments. During the final assembly, conveyor systems direct all of the components to stations along the production route. The engine, transmission, fuel tank, radiator, electrical systems, body panels and doors, suspension system, tires, and interior accessories are fastened to the chassis. Rigid quality-control standards at every step ensure that the completed vehicle is safe and built to specifications (see Automobile Industry).
Aerospace Manufacture
The aerospace industry manufactures airplanes, rockets, and missiles, among other technologies. The first airplanes were constructed from wood and fabrics; modern airplanes are built from aluminum alloys, titanium, plastics, and advanced textile-reinforced composite materials. As in automobile manufacturing, components such as engines and landing gear are manufactured in separate facilities and then assembled with the wings, rudders, and fuselage to produce the finished airplane. Final assembly is conducted on an assembly line, where the partially manufactured airplane is moved from station to station.
Rockets are built on an individual basis. Rocket casings are created by winding high-strength carbon fibers and epoxy resins onto a cylindrical shape. The epoxy hardens and encapsulates the fibers to produce a strong, lightweight material. Solid rocket fuel is put into the body of the rocket. Thrust nozzles and exit cones are then added along with electronic guidance systems and payloads.
Petrochemical Manufacture
Petrochemicals are manufactured from naturally occurring crude oils and gases. Once removed from the earth, the crude oil is refined into gasoline, heating oil, kerosene, plastics, textile fibers, coatings, adhesives, drugs, pesticides, and fertilizers. Crude oil contains thousands of natural organic chemicals. These are separated by distilling, or boiling off, the compounds at different temperatures. Gases such as methane, ethane, and propane are also released. Methane, when combined with nitrogen and pressurized and heated, yields ammonia, an important ingredient in fertilizers. Simple plastic materials, such as polyethylene and polypropylene, are manufactured by first heating ethane and propane gases and then rapidly cooling them to alter their chemical structure (see Petroleum).
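The separation-by-boiling-point idea described above can be sketched as sorting compounds into fractions. The boiling points below are approximate textbook values, and the fraction cut-off temperatures are assumptions for illustration, not refinery specifications.

```python
# Sort petroleum compounds into fractions by boiling point (degrees C).
# Boiling points are approximate; the fraction cut-offs are illustrative
# assumptions, not refinery specifications.
COMPOUNDS = {
    "propane": -42,
    "butane": -1,
    "gasoline-range naphtha": 90,
    "kerosene-range distillate": 200,
    "heating-oil-range distillate": 300,
}

def fraction_for(boiling_point):
    if boiling_point < 25:
        return "gas"
    elif boiling_point < 150:
        return "light liquid"
    else:
        return "heavy liquid"

for name, bp in sorted(COMPOUNDS.items(), key=lambda kv: kv[1]):
    print(f"{bp:>5} C  {fraction_for(bp):<12}  {name}")
```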
Manufacturing systems today are designed to recycle many of their components. For example, in the automotive industry, excess steel and aluminum can become scrap stock for new metal, rubber tires can be chopped and mixed with asphalt for new roadways, and engine starters can be remanufactured and sold again. Recycling for newer materials, such as composites (combinations of materials designed with superior physical and mechanical properties), has yet to be developed, however.
Emission control will be a critical issue for future manufacturers. Smoke scrubbers must remove dangerous gases and particulates from industrial plant discharges, and manufacturing facilities that dump chemicals into rivers must develop methods of eliminating or reusing these waste products.
Automated factories, which offer substantial economic advantages, have become the norm. Most automobile engines are manufactured using robotic tools and handling systems that deliver the engine to various machining sites. Computers with sophisticated inventory tracking programs make it possible for items to be assembled and delivered at the manufacturing facility only as they are needed. In demand-activated manufacturing, each sale prompts a computer to schedule the manufacture of a replacement for the unit sent to the customer.
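The demand-activated scheme described above amounts to a pull system: each sale immediately triggers a replacement order, so stock is replenished only as it is consumed. The class, item names, and quantities below are invented for illustration.

```python
# Minimal sketch of demand-activated (pull) manufacturing: every unit sold
# schedules the manufacture of one replacement unit, so stock returns to its
# original level once the scheduled work is completed.
class PullInventory:
    def __init__(self, stock):
        self.stock = stock     # units on hand
        self.work_orders = []  # scheduled replacement builds

    def sell(self, item):
        """Ship one unit and immediately schedule its replacement."""
        self.stock -= 1
        self.work_orders.append(f"build replacement for {item}")

    def complete_order(self):
        """Finish the oldest scheduled build, restoring one unit of stock."""
        self.work_orders.pop(0)
        self.stock += 1

inv = PullInventory(stock=10)
inv.sell("engine starter")
print(inv.stock, len(inv.work_orders))  # 9 1
inv.complete_order()
print(inv.stock, len(inv.work_orders))  # 10 0
```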
Engineers use computers to help them design new products efficiently. The Boeing 777 jet, for example, was developed in record time by having its entire design and manufacturing systems created on a computer database rather than using traditional blueprints.
Aviation, term applied to the science and practice of flight in heavier-than-air craft, including airplanes, gliders, helicopters, ornithopters, convertiplanes, and VTOL (vertical takeoff and landing) and STOL (short takeoff and landing) craft (see Airplane; Glider; Helicopter). These are distinguished from lighter-than-air craft, which include balloons (free, usually spherical; and captive, usually elongated), and dirigible airships (see Airship; Balloon).
Operational aviation is grouped broadly into three classes: military aviation, commercial aviation, and general aviation. Military aviation includes all forms of flying by the armed forces—strategic, tactical, and logistical. Commercial aviation embraces primarily the operation of scheduled and charter airlines. General aviation embraces all other forms of flying such as instructional flying, crop dusting by air, flying for sport, private flying, and transportation in business-owned airplanes, usually known as executive aircraft.
Centuries of dreaming, study, speculation, and experimentation preceded the first successful flight. The ancient legends contain numerous references to the possibility of movement through the air. Philosophers believed that it could be accomplished by imitating the wing motions of birds, and by using smoke or other lighter-than-air media. The first form of aircraft made was the kite, about the 5th century BC. In the 13th century, the English monk Roger Bacon conducted studies that led him to the conclusion that air could support a craft in the same manner that water supports boats. At the beginning of the 16th century, Leonardo da Vinci gathered data on the flight of birds and anticipated developments that subsequently became practical. Among his important contributions to the development of aviation were his invention of the airscrew, or propeller, and the parachute. He conceived three different types of heavier-than-air craft: an ornithopter, a machine with mechanical wings designed to flap like those of a bird; a helicopter, designed to rise by the revolving of a rotor on a vertical axis; and a glider, consisting of a wing fixed to a frame on which a person might coast on the air. Leonardo's concepts involved the use of human muscular power, quite inadequate to produce flight with the craft that he pictured. Nevertheless, he was important because he was the first to make proposals grounded in scientific observation.
THE 19TH CENTURY
The practical development of aviation took various paths during the 19th century. The British aeronautical engineer and inventor Sir George Cayley was a farsighted theorist who proved his ideas with experiments involving kites and controlled and human-carrying gliders. He designed a combined helicopter and horizontally propelled aircraft and deserves to be called the father of aviation. The British scientist Francis Herbert Wenham used a wind tunnel in his studies and foresaw the use of multiple wings placed one above the other. He was also a founding member of the Royal Aeronautical Society of Great Britain. Makers and fliers of models included the British inventors John Stringfellow and William Samuel H