Each fall almost 50 million young Americans walk through the doorways of about 100,000 elementary and secondary schools for the start of a new school year. Filling classrooms from kindergarten to the 12th grade, they attend classes for an average of five hours a day, five days a week, until the beginning of the following summer.
These students are part of one of the most ambitious undertakings in the history of education: the American effort to educate an entire national population. The goal is—and has been since the early decades of the republic—to achieve universal literacy and to provide individuals with the knowledge and skills necessary to promote both their own welfare and that of the general public. Though this goal has not yet been fully achieved, it remains an ideal toward which the American educational system is directed. The progress that has been made is notable both for its scope and for the educational methods developed in the process of achieving it.
About 85 percent of American students attend public schools (schools supported by American taxpayers). The other 15 percent attend private schools, for which their families choose to pay special attendance fees. Four out of five private schools in the United States are run by churches, synagogues or other religious groups. In such schools, religious teachings are a part of the curriculum, which also includes the traditional academic courses of reading, mathematics, history, geography and science. (Religious instruction is not given in public schools.)
The combined expenses of both education systems, public and private, exceed $190,000 million a year. From that point of view, American education is a powerful consumer. Who decides how many of these thousands of millions of dollars should be used annually for teachers' salaries, new computers or extra books? Private schools that meet state standards use the fees they collect as they think best. But where public taxes are involved, spending is guided by boards of education (policymakers for schools) at the state and/or district level. The same thing is true for decisions about the school curriculum, teacher standards and certification, and the overall measurement of student progress.
EDUCATION—A LOCAL MATTER
From Hawaii to Delaware, from Alaska to Louisiana, each of the 50 states in the United States has its own laws regulating education. From state to state, some laws are similar; others are not. For example:
• All states require young people to attend school. (The age limits vary: 32 states require attendance to age 16; eight, to age 18; the rest set other limits.) Thus, every child in America is guaranteed up to 13 years of education. This is true regardless of a child's race, religion, sex, learning problems, physical handicap or inability to speak English.
• Some states play a strong role in the selection of learning material for their students. For example, state committees may decide which publishers' textbooks can be purchased with state funds. In other cases, decisions about buying instructional material are left entirely to local school officials.
Americans have a strong tendency to educate their children about major public concerns—problems such as environmental pollution, nuclear issues, neighborhood crime and drugs. Responding to public pressure, boards of education in different areas often add courses on various relevant issues to the elementary and secondary school curriculum.
WHAT AN AMERICAN STUDENT LEARNS
American students pass through several levels of schooling—and thus, several curricula—on their way to a high school diploma. They attend:
• Elementary School. In statistical reports published by the federal government, "elementary school" usually means grades kindergarten (K) through 8. But in some places, the elementary school includes only grades K-6. And sometimes grades 4, 5 and 6 make up what is called a "middle grade" school. (Many Americans refer to the elementary grades as "grammar school.")
• Secondary School. Again, in statistical reports, "secondary school" generally means grades 9-12. These grades are popularly called "high school." However, in many districts, "junior high school" includes grades 7-9. And when grades 7-9 are included with the 10th, 11th and 12th grades, all six are said to form a "senior high school."
Although there is no national curriculum in the United States, certain subjects are taught in all K to 12 systems across the country.
Almost every elementary school provides instruction in these subjects: mathematics; language arts (a subject that includes reading, grammar, composition and literature); penmanship; science; social studies (a subject that includes history, geography, citizenship and economics); music; art; and physical education. In many elementary schools, courses in the use of computers have been introduced. And in some cases, a second language (other than English) is offered in the upper elementary grades.
Most secondary schools offer the same "core" of required subjects: English, mathematics, science, social studies and physical education. But school boards differ greatly from one district to another in the amount of class time they want high school students to spend on these basic subjects. In some high schools, for example, students must complete three years of mathematics before graduation. The national average is lower.
Students are guided by school counselors in choosing electives, which can range from specialized academic to vocational subjects. For example, high schools offer more than one year—in most cases, several years—of math, science and the other core subjects. After they complete the required units in these core areas (for example, one year of American history), students can take additional units as electives (perhaps a year of European history and a year of world political issues).
Other elective courses vary from school to school. Some high schools specialize in particular types of subjects—business education, or industrial trades, or foreign languages, for example. A student planning to be a physician would want to attend a school offering many electives in science.
A CRACK IN THE SYSTEM?
By the early 1980s, the most popular electives were physical education, music performance, remedial (basic) English grammar and composition, driver education, health, "shop" (construction and repair of tools and machinery), marriage training and home economics (home care).
The trend in electives was clearly not toward academic subjects. This was the issue Americans debated with some concern in the early 1980s. The opportunity for elective courses in high school satisfies some ideals that are very important to Americans:
• The opportunity to get an education that prepares a person for his or her life's work—whether in computer science, office work, agriculture or a trade.
• The opportunity to pursue and study one's own interests—whether child development, political science or speaking a foreign language.
• The opportunity to discover one's own talents and perfect them—whether in music, creative writing or ceramics.
The vision of school as the place for satisfying such goals is not a new one, but, until the 1950s, school boards made most decisions about which curricula would best prepare students for a productive life after high school. The trend of the 1960s and 1970s was to offer more and more choices to students. By the 1980s, American parents and educators were taking a second look at this practice. One reason for this concern was that allowing more free choice to students seemed linked to another trend that had also emerged in the previous two decades—the slow but steady decline of American students' average scores in standardized tests of mathematics, reading and science.
There was no mistaking the evidence. Nationwide testing services used at different grade levels and college entrance examinations demonstrated the drop in student scores. College administrators and business executives began to complain that high school graduates needed remedial courses in reading, mathematics and composition. About 99 percent of adult Americans reported in the 1980 census that they could read and write. But critics claimed that about 13 percent of America's 17-year-olds were "functionally illiterate." That is, they were unable to deal successfully with everyday demands such as reading printed instructions, filling out a job application, etc.
This was gloomy news. In the American mind, schools are a guarantee that the next generation will be informed, self-reliant citizens. Was the system failing some children? Every possible cause for the decline in average scores was examined and written about in the newspapers in the early 1980s. Publishers were blamed for producing textbooks that were too easy. The makers of standardized tests were criticized for using poor questions. Television was blamed for the effect of its uninspiring programs. (In a recent year, it was estimated that Americans between the ages of 6 and 19 years watched television for an average time of 25 hours a week.) School boards were criticized for not paying teachers enough to keep them in the field of education. And parents were accused of not making sure that their children did their homework.
It was easy, at the moment, to overlook how much the American education system had accomplished since its origin, 350 years earlier.
THE PURITANS AND EDUCATION
Americans trace the origins of their nation to the English colonists (settlers) who came to the eastern coast of North America in the early 17th century. The largest group of these first colonists, the Puritans, founded the Massachusetts Bay Colony in 1630. Like others who followed them to America, the Puritans sought the freedom to practice their religion—a freedom they could not enjoy in their native country. They found this freedom in the small towns and villages they built on the edge of the forest in Massachusetts.
One of the things the Puritans believed was that every person should be able to read the Bible. One hundred percent literacy seemed like a dream in the 17th century. Within just a few years after their arrival, they took steps to set up a system of education in their colony:
• In 1634, they opened a "Latin grammar" school, a school for those who wanted to prepare for college.
• In 1636, Harvard College was founded for the training of religious ministers.
• In 1634 and 1638, the Puritans passed laws declaring that all property could be taxed for the common good, which included the support of schools.
• In 1642 and 1647, the Bay Colony passed laws requiring all parents to provide reading education for their children.
Thus, in less than 20 years, the Puritans introduced two practices that still influence American youth: compulsory education for all children, and public taxation for schools. The situation was different in other British colonies in North America. In Pennsylvania, for example, where there were several different religious groups, decisions about education were left to the leaders of each church. In southern colonies such as Virginia, those who could afford tutors hired them for their sons (and sometimes for their daughters). The older sons of wealthy landowners were sent to England for their education. Occasionally, a landowner might allow a literate adult to teach reading to the children of poor whites and, perhaps, a few blacks. But mostly, custom forbade teaching the children of slaves to read.
Throughout the colonies, young men and women could receive an education in reading by becoming apprentices in a small business. It had been a practice in England to have young boys and girls live with the families of those for whom they worked (bakers, printers, etc.). In return for a youth's work, the business owner promised to teach him or her to read, as well as how to do a craft (bake or print, for example). This practice was brought to North America.
EDUCATION IN A NEW NATION
During the 17th and 18th centuries, the English continued to develop new settlements along the eastern seaboard of the continent. (Swedish, Dutch, German, and other European immigrants also settled in these colonies.) Each colony developed its own economy, its own style of local government and its own religious and cultural traditions. Most religious groups were Protestant.
On July 4, 1776, the 13 colonies issued a Declaration of Independence, and went to war for their freedom from England. They won the war for independence in 1781, and negotiated a favorable treaty in 1783. But it took until 1789 for them to shape a unified national government. The shape and power of this new government, described in the Constitution, were determined after many debates and compromises. The new United States was to be a federal republic—a union of states with a strong central government representing all the people.
The states did not easily give up their own political powers to this new central government. In fact, the 10th Amendment was added to the Constitution to guarantee that "powers not delegated to the United States by the Constitution...[would be] reserved to the [governments of the] States..." One of these reserved powers was the right of each state to provide for the education of its people.
Actually, at the end of the 18th century, elementary education throughout the United States was in local hands. State governments were allowing local districts (small towns and villages) to set up and run their own elementary schools. Most often, these schools were in one-room buildings, with one teacher for all the students who could attend. The teacher, who was hired by a committee of citizens, had to teach what the local community expected.
What kind of education did Americans want for their children in 1800? At the very least, they wanted each child to learn to write his or her name, to do simple arithmetic, to learn the local rules of conduct. Most of all, they wanted their children to learn to read.
The first colonists had believed that literacy was important to the preservation of religious freedom. Americans in the early 1800s also believed that the ability to read was important to preserving a democratic republic. Thomas Jefferson, third president of the United States, argued that Americans should be given an excellent education. He felt that this was the only way to guarantee "the preservation... of liberty."
Several leaders in Jefferson's time urged the formation of a national system of education with uniform standards for schools in all the states. But none of these plans was ever tried. Americans wanted the best education for their children, but they feared the effects of giving its direction to the national government.
EARLY CHALLENGES, EARLY REFORMS
And so the future was decided: Education in the United States was to remain in the hands of state and local governments. But while the national government had no role to play in shaping American education, many national events did.
The 19th century was a time of great change within the United States:
• Both the population and the land area of the nation grew rapidly after 1800. By 1850, there were 31 states in the nation and more than 23 million people. New states were added to the American nation as its people pushed westward.
• After the middle of the 19th century, steel, railroad and other giant industries increased rapidly in America. This industrialization led to the growth of large cities—especially in northern states.
• Immigration to the United States swelled to almost two million in the 1840s, and continued to increase for the rest of the century. (Between 1820 and 1920, about 33 million people immigrated to America.)
• After 1840, the immigrant population included the first large numbers of Catholics; after 1870, many of America's immigrants did not speak English. Tensions often grew out of the cultural differences between newcomers and those who had lived in America for a longer time.
• Between 1861 and 1865, a tragic civil war tore America apart. Northern (Union) and Southern (Confederate) armies fought one another at a terrible cost of lives and property. One of the outcomes of this war was the end of slavery.
All of these events and changes placed enormous pressure on the practice of education in America. From the 1830s, reformers such as Horace Mann (in Massachusetts) had been trying to improve the quality of schools in each state. The reformers wanted especially to do three things: (1) to train better teachers; (2) to shift some decision-making from school districts back to the state government; and (3) to increase student enrollment in elementary ("common") schools. The combined pressures of population moves, industrialization, urbanization and cultural tensions occasionally stalled these reform efforts. Still, there were certain ideas that refused to die in the young and changing nation. As each new group of immigrants arrived on American soil, they caught the spark of these ideas and made them their own. The Declaration of Independence, read aloud at every Fourth of July celebration, reminded Americans: "We hold these Truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty, and the Pursuit of Happiness."
Increasingly, Americans believed that a basic education was an unalienable right, too. Because of prejudice, some groups were forced to wait longer than others for the exercise of this right: blacks, Native Americans (Indians), young women of every race. Nevertheless, improvements in educational opportunity continued to be made—in both the 19th and the 20th centuries:
• Between the 1820s and 1860s, public high schools were opened in city after city throughout the United States. Until the 20th century, the purpose of these schools was to offer Latin and other academic courses to students preparing for college.
• In 1852, Massachusetts passed the first compulsory education law in the new nation. It affected young people six to 16 years old. Over the next few decades, all other states in the country passed similar laws.
• In 1917, the Congress passed the Smith-Hughes Act to provide federal money to high schools that devoted at least half their time to vocational education. This act gave a big boost to courses in specific trades and industries.
(It also showed one of the ways the federal government would influence education in the states—through funding.)
CURRICULUM IN THE 20TH CENTURY
By the early 20th century, compulsory education was in place in the United States. The old dream of universal literacy seemed about to come true. More than that, the schools were becoming centers for the Americanization of immigrants from all over the world. But there was still one major issue under debate— the matter of school curricula.
Some reformers in the early 1900s wanted secondary schools to offer "modern" subjects such as history, science and English composition. They also urged that high schools offer elective courses. Philosopher John Dewey and educator Francis Wayland Parker went beyond suggestions about courses. Dewey and Parker wanted to see changes in the methods of education and even in its goals. They felt, for example, that American schools should help develop the special abilities of each child and that subject matter should be adjusted to the child's innate capacities. They disliked formalized teaching methods, arguing that each school should be a "little community" in which students were filled with a "spirit of service" toward the interests of the whole group. Dewey's suggestions about self-directed training and about dealing with each child's abilities were widely adopted. These methods were practiced in the laboratory school at the University of Chicago between 1896 and 1904 and became known as "Progressive Education." Individualized instruction within a classroom is still an important idea in American education, and "learning by doing" remains a highly respected approach to child development.
In 1918, a committee of the National Education Association, a teachers' organization, reported seven new goals for secondary education. The report urged schools to educate students in (1) health, (2) the command of fundamental skills, (3) family living, (4) vocational skills, (5) citizenship, (6) the proper use of leisure time and (7) ethical (moral) conduct. School boards across the nation responded eagerly to these suggestions. New courses were added to the curriculum: history, geography, science and English. Sports and other after-school activities became an unofficial part of the school curriculum. Teachers were now being asked to prepare children for every aspect of life.
LEARNING TO BE WORLD CITIZENS
After 1920, K to 12 education in America remained very much the same until World War II. That tragic event introduced changes that affected every institution in America—including the schools. American parents—especially young couples who married in the late 1940s—wanted their children to be educated for the post-war world. At the same time, American blacks and other minority groups demanded educational opportunities equal to those of whites. In 1954, the Supreme Court ruled that the practice of segregating blacks into separate schools was unconstitutional. By 1955, the United States was a nuclear power, a member of the United Nations, and a competitor with the Soviet Union for world influence. American jobs were changed by new technology (especially by the computer), and American businesses spread around the globe. Television brought the faces of presidents, entertainers and people from all over the world into America's homes each evening. Discoveries by scientists opened new secrets of the stars and of the atom. Between 1950 and 1960, more new knowledge was developed than in all of the world's history before 1950.
Schools were asked not only to teach this new information, but to help students ask their own questions about it. The "inquiry" method of learning, focusing on solving problems rather than memorizing facts, became popular. More science courses were added to the curriculum, some as a result of the orbiting of the first man-made satellite, Sputnik, by the Soviet Union in 1957. The federal government began to spend millions for the development of new science curricula and for training teachers to use them. (Federal spending would spread to other fields, too, especially for programs to aid students with learning difficulties. By the early 1980s, the federal government was spending about eight to 10 thousand million dollars annually on elementary and secondary education.)
But a good secondary education was no longer enough for many Americans. In one school district after another, parents insisted on high school programs that would prepare their sons and daughters for admission to a university. More and more Americans viewed the university as the doorway to a medical or law degree, a position in government, or a management position in a major business office.
TOWARD THE 21ST CENTURY
The late 1960s and early 1970s—years of the American involvement in the Vietnam War— were difficult for all Americans. Drug abuse became a problem, even for teenagers. Despite the rapid development of new, well-equipped schools, students began to "drop out" of (leave) high school before graduation. Average test scores declined.
Were young people disillusioned by the war; were they confused by conflicting values in the world they saw every day on TV? In an effort to help students deal with their problems, schools added more "attractive" courses and increased their counseling services. But leading educators across the country claimed that American schools were drifting away from their chief task: to stimulate, challenge and educate students.
The Department of Education, established in 1979 by consolidating smaller federal agencies, sponsored the National Commission on Excellence in Education to examine the question. In 1983, the Commission made several recommendations: lengthen the school day and year; establish a new core curriculum for all students (four years of English; three years each of math, science and social studies; one-half year of computer science); and raise the standards of performance in each subject.
Once more, across the nation, school boards responded. Course requirements were tightened.
As a result of the concern for excellence, test scores for American children are once again on the rise and many schools are returning to basic educational principles.
Schools also face new challenges. They must deal with a large influx of immigrant children, many of whom do not speak adequate English. They must meet community demands that the curriculum reflect the culture of the children attending the school. They must make sure that students develop basic job skills and, in many cases, they must meet the diverse needs of non-traditional students, such as teenaged mothers.
Schools are meeting these challenges in ways that reflect the diversity of the U.S. educational system. They are training large numbers of teachers of English as a second language and, in some communities, setting up bilingual schools. Many communities are working with teachers and school administrators to move the school curriculum away from the traditional European-centered approach and to include more African, Asian and other studies. They believe that this will help raise the self-esteem of children whose ancestors came from places other than Europe and that it will also teach children of European ancestry an appreciation of other cultures. And since schools are, for many people, their only point of contact with the government, some communities have started "one-stop-shopping" schools, which offer counseling, child care, health services and other social services. Schools are also working to teach "thinking skills" to the 70 percent of U.S. students who do not go on to higher education.
These students, in order to advance beyond entry-level jobs, need to be able to cope with today's technologically advanced workplace. In the words of a recent report by the Commission on Achieving Necessary Skills:
"A strong back, the willingness to work and a high school diploma were once all that was necessary to make a start in America. They are no longer. A well developed mind, a continued willingness to learn and the ability to put knowledge to work are the new keys to the future of our young people, the success of our business and the economic well-being of the nation."
In 1989, President Bush and the governors of all 50 states set six basic educational goals to be achieved by the end of the century. They are:
• That all children will start school ready to learn
• That 90 percent of all high school students will graduate
• That all students will achieve competence in core subjects at certain key points in their education careers
• That American students will be first in the world in math and science achievement
• That every adult American will be literate and have the skills to function as a citizen and as a worker
• That all schools will be free of drugs and violence and offer a disciplined environment conducive to learning.
Two years later, in 1991, the government issued a "report card," assessing progress toward those goals. The report found that progress had been made toward most of those goals, but that everyone—schools, students and the government—would have to work very hard in order to meet all of the goals by the end of the century.
Suggestions for Further Reading
Cremin, Lawrence A. American Education: The National Experience, 1783-1876. New York: Harper and Row, 1982.
Cremin, Lawrence A. The Transformation of the School: Progressivism in American Education, 1876-1957. New York: Random House, 1964.
Johnson, James A., and others, eds. Foundations of American Education: Readings. 5th ed. Newton, MA: Allyn & Bacon, 1981.
The National Commission on Excellence in Education. A Nation at Risk: The Imperative for Educational Reform. Washington: Superintendent of Documents, U.S. Government Printing Office, 1983.
Rippa, S. Alexander. Education in a Free Society: An American History. 3rd ed. New York: David McKay Co., 1976.
HIGHER EDUCATION
Out of more than three million students who graduate from high school each year, about one million go on for "higher education." Simply by being admitted into one of the most respected universities in the United States, a high school graduate achieves a degree of success. A college at a leading university might receive applications from two percent of these high school graduates, and then accept only one out of every ten who apply. Successful applicants at such colleges are usually chosen on the basis of (a) their high school records; (b) recommendations from their high school teachers; (c) the impression they make during interviews at the university; and (d) their scores on the Scholastic Aptitude Tests (SATs).
The system of higher education in the United States is complex. It comprises four categories of institutions: (1) the university, which may contain (a) several colleges for undergraduate students seeking a bachelor's (four-year) degree and (b) one or more graduate schools for those continuing in specialized studies beyond the bachelor's degree to obtain a master's or a doctoral degree; (2) the four-year undergraduate institution—the college—most of which are not part of a university; (3) the technical training institution, at which high school graduates may take courses ranging from six months to four years in duration and learn a wide variety of technical skills, from hair styling through business accounting to computer programming; and (4) the two-year, or community, college, from which students may enter many professions or may transfer to four-year colleges or universities.
Any of these institutions, in any category, might be either public or private, depending on the source of its funding. There is no clear or inevitable distinction in the quality of education offered between publicly and privately funded institutions. However, this is not to say that all institutions enjoy equal prestige or that there are no material differences among them.
Many universities and colleges, both public and private, have gained reputations for offering particularly challenging courses and for providing their students with a higher quality of education. The great majority are generally regarded as quite satisfactory. A few other institutions, conversely, provide only adequate education, and students attend classes, pass examinations and graduate as merely competent, but not outstanding, scholars and professionals. The factors determining whether an institution is one of the best or one of lower prestige are: the quality of its teaching faculty; the quality of its research facilities; the amount of funding available for libraries, special programs, etc.; and the competence and number of applicants for admission, i.e., how selective the institution can be in choosing its students. All of these factors reinforce one another.
In the United States it is generally recognized that there are more and less desirable institutions in which to study and from which to graduate. The more desirable institutions are generally—but not always—more costly to attend, and having graduated from one of them may bring distinct advantages as the individual seeks employment opportunities and social mobility within the society. Competition to get into such a college prompts a million secondary school students to take the SATs every year. But recently, emphasis on admissions examinations has been widely criticized in the United States because the examinations tend to measure only competence in mathematics and English. In defense of using the examinations as criteria for admission, administrators at many universities say that the SATs provide a fair way of deciding whom to admit when they have 10 or 12 applicants for every first-year student seat.
WHY AMERICANS GO TO COLLEGE
The United States leads all industrial nations in the proportion of its young men and women who receive higher education. Why is this? What motivates a middle-income family with two children to take loans for up to $120,000 so that their son and daughter can attend private universities for four years? Why would both parents in a low-income family take jobs to support their three children at a state university—each at an annual cost of $4,000? Why should a woman in her forties quit her job and use her savings to enroll for the college education she did not receive when she was younger?
Americans place a high value on higher education. This is an attitude that goes back to the country's oldest political traditions. People in the United States have always believed that education is necessary for maintaining a democratic government. They believe that it prepares the individual for informed, intelligent political participation, including voting.
Before World War II, a high school education seemed adequate for satisfying most people's needs, but the post-war period produced dozens of complex new questions for Americans, including issues such as the use of atomic power, experiments in splitting genes, space programs and foreign aid. Americans rarely express a direct vote on such complex matters, but the representatives they elect do decide such issues. In recent years, as a result, many Americans have begun to regard a college education as necessary preparation for dealing with such questions as informed voters.
In addition to idealistic reasons for going to college, however, most Americans are concerned with earning a good (or better) income. For some careers—law, medicine, education, engineering—a college education is a necessary first step. Some careers do not require going to college, but many young Americans believe that having a degree will help them obtain a higher salary on their first job. Today, that first job is likely to involve handling information: More than 60 percent of Americans now work as teachers, computer programmers, secretaries, lawyers, bankers, and in other jobs involving the discovery, exchange and use of data (facts). A high school diploma is not sufficient preparation for most such employment.
SELECTING A COLLEGE OR UNIVERSITY
In addition to learning about a school's entrance requirements (and its fees), Americans have a lot of questions to think about when they choose a university or college. They need to know:
• What degrees does the school offer? How long does it take to earn one? At the undergraduate (college) level, a four-year "liberal arts" course of study is traditionally offered, leading to a bachelor of arts (B.A.) degree in such subjects as history, languages and philosophy. (The term "liberal arts" comes from artes liberales, a Latin expression for the free, or human, arts and skills. In the time of the Roman Empire, these were skills and arts that only a free person—not a slave—could acquire.) Many liberal arts colleges also offer a bachelor of science (B.S.) degree in physics, chemistry or other scientific subjects. A technical training institution offers courses of varying length in such subjects as agriculture or business skills, and community college studies last two years.
Graduate schools in America award master's and doctor's degrees in both the arts and sciences. (The term "doctor" comes from the Latin word docere, meaning "to teach.") The courses for most graduate degrees can be completed in two to four years. But if a graduate program requires original research, a student could spend many additional months or even years in a university library or laboratory.
• What curricula does a college or university offer? What are the requirements for earning a degree? In an American university, each college and graduate school has its own curriculum. At the undergraduate level, there may be some courses that every student has to take (for example, classes in world history, math, writing or research). But students do select their "major" (the field in which they want their degree), plus a specific number of "electives" (courses that are not required but that students may choose). The National Institute of Education, a government agency, reports that a total of more than 1,000 majors are offered in America's colleges and universities. The combined electives available in these schools probably amount to a number in the tens of thousands.
Typically, an undergraduate student has to earn a certain number of "credits" (about 120) in order to receive a degree at the end of four years of college. Credits are earned by attending lectures (or lab classes) and by successfully completing assignments and examinations. One credit usually equals one hour of class per week in a single course. A three-credit course in biology could involve two hours of lectures plus one hour in a science lab, every week. A course may last 10 to 16 weeks—the length of a "semester."
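As a rough worked example (using the typical figures above—about 120 total credits, eight semesters in four years, and one credit per weekly class hour—figures that vary from school to school):

120 credits ÷ 8 semesters = 15 credits per semester

That works out to about five 3-credit courses at a time, or roughly 15 hours of class per week.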
• Is the college or university a public institution (operated by a state or local government) or a private one? If it is private, is it a religious school? The United States does not have a national (federal) school system, but each of the 50 states operates its own university, and so do some large city governments. (The government does grant degrees in the schools it operates for professional members of the armed services— for example, the United States Naval Academy at Annapolis, Maryland.)
About 25 percent of all schools of higher education in the United States are privately operated by religious organizations. Most are open to students of different faiths, but in some religious schools all students are required to attend religious services. There are also privately owned schools with no religious connection.
Both public and private colleges depend on three sources of income: student tuitions, endowments (gifts made by wealthy benefactors) and government funding. Some endowments are very large: Harvard, Princeton and Yale Universities have more than a thousand million dollars each. Public institutions receive a larger portion of public tax monies than do private schools.
• How large is the school? There are many small American colleges—some with fewer than 100 students. But the larger universities tend to keep attracting more and more students. By the mid-1980s, at least seven universities had total enrollments of over 100,000 each. (One of the seven, the State University of New York, has more than 60 campuses in different parts of the state.)
Why do the large universities flourish? Until recent years, a major answer to this question was: They offer the best libraries and facilities for scientific research. Access to a "mainframe" (very large) computer and to modern laboratories attracts leading scientists to the faculties of such schools. And students enroll to study with the experts. Research programs continue to be important to the reputation of America's universities. But in recent years, the percentage of advanced degrees awarded in the "pure" (research) sciences has declined. The same has been true for the liberal arts. Students continue to seek the largest, most respected universities—but for new and different programs.
TRENDS IN DEGREE PROGRAMS
During the 1970s and 1980s, there was a trend away from the traditional liberal arts. Instead, students were choosing major fields that prepared them for specific jobs. In 1987, 56 percent of all four-year bachelor's degrees were conferred in business and management, computer and information science, education, engineering, the health professions, and public affairs. Only 13 percent of the degrees were conferred in the traditional arts and sciences.
But some observers believe this trend toward pre-professionalism may be ending, and that students are switching back to traditional areas of study. They cite the fact that bachelor's degrees in mathematics have risen since the low point they reached in 1981. Bachelor's degrees in English and literature, foreign languages, history, and physics have also seen an upswing.
In many ways, this new popularity of the liberal arts is a return to the early traditions of American education.
TRADITIONS IN EDUCATION
When the colonies that eventually became the United States of America were settled in the 1600s, the world already had some very old universities. The University of Al-Azhar in Cairo was then more than 600 years old. Italy had had its University of Bologna for centuries. Oxford and Cambridge in England and the University of Paris were founded in the 12th century.
European colleges were an offspring of universities. The first colleges were opened in Paris in the 15th century as residence halls for university students. Usually, all the students in one residence studied the same subject (for example, law, medicine or theology). The term "college" gradually came to mean a place for studying a specific subject (for example, law). And thus, colleges developed into schools.
Both institutions—colleges and universities—were an important part of life in England when its people began to migrate to North America. Within the first 25 years of the Massachusetts Bay Colony, more than 100 graduates of Oxford and Cambridge joined its founders. It was natural for these early colonists to set up in America the same institutions that they had known in their native country. And—since many colonists came to America for religious freedom—it is not surprising that their first colleges trained young men to be ministers. Harvard College was founded for this purpose in 1636; so were William and Mary College (Virginia) in 1693, and Yale (Connecticut) in 1701. By the time the colonies won their independence from Britain in 1783, six more colleges had been added: Princeton (New Jersey), Pennsylvania, Columbia (New York), Brown (Rhode Island), Rutgers (New Jersey) and Dartmouth (New Hampshire). All are active, respected universities today.
NEW WORLD COLLEGES
The colonies prospered in the 18th century. Men and women who left England and other European countries as poor people became landowners and traders on American shores. In Europe, college was regarded as a place for the elite (members of the wealthy "upper class"), but in early colonial America no rigid traditions of class existed. So those who could afford it sent their sons (but not their daughters) to a colonial college. Not all these sons, however, went on to the religious ministry. By the middle of the 18th century, only half the graduates of American colleges were becoming ministers. The other 50 percent usually chose careers as lawyers, doctors and businessmen.
What did colonial colleges teach? As in Europe, Latin and Greek were basic subjects. So were philosophy and the study of religion. But, responding to the interest of the "new student" in the New World (as America was then called), colleges introduced "modern" subjects, too. Students read and discussed the new political ideas of England's John Locke and France's Montesquieu. They were given a taste of geography. A few colonial colleges even offered courses in the so-called "practical" subjects—surveying, navigation, husbandry (farming), commerce (trade) and government.
But the basic goals and methods of 18th-century academic education did not change in colonial colleges. These colleges still followed the models set down by Oxford and Cambridge: They were dedicated to forming their students' characters and to handing down the knowledge of previous generations. They did not offer to lead their students in doing fresh research or in adding new ideas to what the world already knew. Even after the independence of the United States in 1783, this model of higher education would continue in the United States for most of the next century.
DEMOCRACY AND EDUCATION
By the time of George Washington's inauguration as the first president in 1789, several very powerful ideas had worked their way into American thinking. Inspiring documents had attended the birth of the new nation: Thomas Paine's pamphlet, "Common Sense," the Declaration of Independence, the Constitution of the United States and the Federalist Papers (essays in which the new Constitution was discussed). Reading and debating the contents of these works was an education in itself. Americans became deeply conscious of the principles of democracy and of the proper role of government in a republic.
The two principles of excellence in education and popular control of government were sometimes difficult to keep in balance. For example, when the founders of the new nation urged more education for all citizens, Americans applauded the idea. But when Washington and his first five successors proposed opening a national university in the nation's capital, the Congress said no. The people's representatives feared giving too much power to the new central government. Decisions about education, they decided, should continue to be made by each state and locality.
THE 19TH CENTURY
The 19th century hit the United States like a series of strong, gusting winds. If these winds had a common direction, it was westward: Millions of Europeans sailed west across the Atlantic to live in the new nation. And millions of these newcomers joined the descendants of earlier immigrants in a westward trek across the North American continent. As pioneers, they planted homes, farms, towns and colleges as they moved toward the Pacific Ocean. Most of these new colleges were poor, but they accepted almost everyone who had the time and interest to apply. And with this development, a crack appeared in the European model.
Another crack appeared with the admission of women into college. The first three women to receive their B.A.'s from an American school graduated from Oberlin College, Ohio, in 1841. But Oberlin—which had admitted all applicants regardless of race or sex since the 1830s—was an exception. Most colleges in the first half of the 19th century refused women applicants. It was also considered improper for women to attend the same classes as men. (Even in Oberlin, women were not allowed to attend an evening class demonstrating the use of telescopes for observing stars.) These attitudes changed slowly. Vassar (New York), the first American college founded for women, did not open until 1865. Wellesley, Smith (both in Massachusetts) and a few others followed within the next 35 years.
The most unusual change in American higher education may have begun with an unusual law—the Land Grant College Act of 1862. Under this law, the federal government gave huge tracts of public land to the states for the development of agricultural and technical colleges. The states sold the land and used the money to build these colleges. The Land Grant Act marked the beginning of federal influence on higher education—an influence based on financial aid. It also was the beginning of another trend: Land grant colleges became deeply involved in researching new methods of scientific farming.
In 1869 Harvard's new president, Charles Eliot, reorganized his college—the nation's oldest—into a university. He raised Harvard's entrance requirements, added new courses (including electives), and toughened Harvard's standards for awarding degrees. A few years later, the Johns Hopkins University opened in Maryland, followed more than a decade later by the University of Chicago (Illinois) and Stanford University (California). These new research-oriented institutions introduced graduate school programs (a level of education European nations had had for some time). By the beginning of the 20th century, almost all the other characteristics of American higher education were in place:
• A number of graduate and undergraduate schools began to specialize (focus on just one field of study). "Normal Schools," for example, were founded to prepare those who wanted to be teachers.
• Many colleges and universities that had been operated by religious groups were now simply private—or even public—schools.
• Most colleges and universities were coeducational (open to men and women). In the years following the end of slavery, black Americans, too, began to attend colleges and universities. (But it would take many more years to erase school segregation—the practice of educating blacks and whites in separate schools.)
Despite all these changes, however, higher education in the United States was still regarded as something for a sort of elite: the most talented, the wealthy, or at least those who could afford not to work full-time while they attended college or a university.
EDUCATION FOR ALL
In 1944 Congress passed the Servicemen's Readjustment Act, soon popularly called the "GI Bill of Rights." ("GI," at the time, was a nickname for the American soldier. The nickname came from an abbreviation for "Government Issue"—the uniforms and other articles "issued" to a soldier.) The Act promised financial aid, including aid for higher education, to members of the armed forces after the end of World War II.
The war ended in the following year. The prediction had been that 600,000 war veterans would apply for aid for education. By 1955, more than two million veterans of World War II and of the Korean War had used their GI Bill of Rights to go to college. Many of these veterans were from poor families. Thirty percent were married when they applied for college aid; 10 percent had children. More than a few had to work part-time while they took courses. It was difficult, but these veterans believed that a college degree (which they could not afford on their own) would improve their chances for a good job in the post-war economy. Some went to liberal arts colleges; others to technical and professional institutions. Their outstanding success in all these schools forced everyone connected with higher education to rethink its purpose and goals. Within just a few years, American veterans had changed the image of who should go to college.
In post-war America, other groups sought their place on America's campuses, too. The enrollment of women in higher education began to increase. Racial segregation in elementary and secondary education ended, and thus blacks achieved an equal opportunity to get into any college of their choice.
By the 1960s, some colleges introduced special plans and programs to equalize educational opportunities—at every level, for all groups. Some of these plans were called "affirmative action programs." Their goal was to make up for past inequality by giving special preference to members of minorities seeking jobs or admission to college. (In the United States, the term "minority" has two meanings, often related: (a) A minority is any ethnic or racial group that makes up a small percentage of the total population; (b) The term also suggests a group that is not the dominant political power.) Some colleges, for example, sponsored programs to help minority students prepare for college while still in high school.
By the 1970s, the United States government stood firmly behind such goals. It required colleges and universities receiving public funds to practice some form of affirmative action. But when colleges began to set quotas (fixed numbers) of minority students to be admitted, many Americans (including minority citizens) protested. They felt that this was another form of discrimination.
As with most (but not all) problems in American public life, the conflict was resolved by change and compromise. Colleges continued to serve the goal of affirmative action—but in less controversial ways. One large university, for example, announced a new policy: It would seek to admit students who would add diverse talents to the student body. It thus dealt with all applicants—minorities included—on a basis that was not restricted to high school performance and entrance tests, but which took into account the talents, voluntary activities and "life experience" of the student.
What success did these efforts have? American college students are an increasingly diverse group. In 1987, 54 percent were women. Women received 51 percent of the bachelor's and master's degrees awarded that year, and 35 percent of the doctorates and professional degrees. But not all groups are doing so well.
Although 59 percent of the students who graduated from high school in 1988 enrolled in college that same year, only 45 percent of the African-American high school graduates went on to college. Educators and others are working to increase that percentage.
U.S. colleges and universities are also enrolling a higher percentage of non- traditional students—students who have worked for several years before starting college or students who go to school part-time while holding down a job. In 1987, 41 percent of college students were 25 years of age or older and 43 percent were part-time students.
HIGHER EDUCATION—TOMORROW
Can America's colleges and universities rest on their accomplishments? About 12 million students currently attend schools of higher education in America. They are students in a society that believes in the bond between education and democracy. They have at their disposal great libraries (Harvard alone has more than 10 million volumes); the latest in technology; and faculties with a tradition of research accomplishments. (The world's first electronic computer was developed at the University of Pennsylvania, for example.) They are free to pursue their interests, to develop their talents, and to gain professional rank.
Still, many Americans are not satisfied with the condition of higher education in their country.
Perhaps the most widespread complaint has to do with the college curriculum as a whole, and with the wide range of electives in particular. In the mid-1980s, the Association of American Colleges (AAC) issued a report that called for teaching a body of common knowledge to all college students. According to the AAC report, this common core of subjects should include science and the study of cultural differences (as well as basic literacy). A somewhat similar report, "Involvement in Learning," was issued by the National Institute of Education (NIE). In its report, the NIE concluded that the college curriculum has become "excessively vocational" and work-related. The report also warned that college education may no longer be developing in students "the shared values and knowledge" that traditionally bind Americans together. A serious charge: Is it true?
For the moment, to some degree, it probably is. Certainly, some students complete their degree work without a course in Western civilization—not to mention other world cultures. Others leave college without having studied science or government. As one response, many colleges have begun reemphasizing a core curriculum that all students must master.
On the other hand, many students and some professors have charged that university curricula are too "Euro-centered," that they emphasize European culture at the expense of the cultures of Africa, Asia or Latin America, for example. This has led to a movement toward "multiculturalism," or the addition to the curriculum in many institutions of courses on such subjects as African literature or on the contributions of women to society. Some traditionalists argue that this trend has gone too far.
Such problems are signs that American higher education is changing, as it has throughout its history. And as in the past, this change may be leading in unexpected directions: The Puritans set up colleges to train ministers. But their students made their mark as the leaders of the world's first Constitutional democracy. The land grant colleges were founded to teach agriculture and engineering to the builders of the American West. Today, many of these colleges are leading schools in the world of scientific research. American universities were established to serve a rather small elite. In the 20th century, GIs, women and minorities claimed their right to be educated at these same universities. The full impact of this change is probably yet to be seen.
Americans have always had a stake in "making the system work." They have especially critical reasons for doing so in the field of education. People in the United States today are faced with momentous questions: "What is America's proper role as the world's oldest Constitutional democracy; its largest economy; its first nuclear power?"
Americans cherish their right to express opinions on all such issues. But the people of the United States are also painfully aware of how complex such issues are. To take part in dealing with new problems, most Americans feel they need all the information they can get. Colleges and universities are the most important centers of such learning. And whatever improvements may be demanded, their future is almost guaranteed by the American thirst to advance and be well-informed. In fact, the next change in American education may be a trend for people to continue their education in college—for a lifetime.
Suggestions for Further Reading
Barnes, Gregory A. The American University: A World Guide. Philadelphia: ISI Press, 1984.
Ben-David, Joseph. American Higher Education: Directions Old and New. The Carnegie Commission on Higher Education. New York: McGraw-Hill, 1972.
The College Board. The College Handbook. New York: Scribner, issued annually.
Peterson's Annual Guide to Undergraduate Study: Guide for Colleges.