U.S. Culture

expanded funding for universities. The federal government began to provide

substantial amounts of money for university research programs through

agencies such as the National Science Foundation, and later through the

National Institutes of Health and the departments of Energy and Defense.

At the same time, the government began to focus on providing equal

educational opportunities for all Americans. Beginning with the GI Bill,

which financed educational programs for veterans, and later in the form of

fellowships and direct student loans in the 1960s, more and more Americans

were able to attend colleges and universities.

During the 1960s the federal government also began to play more of a role

in education at lower levels. The Great Society programs of President

Lyndon Johnson developed many new educational initiatives to assist poor

children and to compensate for disadvantage. Federal money was funneled

through educational institutions to establish programs such as Head Start,

which provides early childhood education to disadvantaged children. Some

Americans, however, resisted the federal government’s increased presence

in education, which they believed contradicted the long tradition of

state-sponsored public schooling.

By the 1980s many public schools were receiving federal subsidies for

textbooks, transportation, breakfast and lunch programs, and services for

students with disabilities. This funding enriched schools across the

country, especially inner-city schools, and affected the lives of millions

of schoolchildren. Although both federal funding and federal supervision

increased to guarantee an equitable distribution of funds, the government

did not exercise direct control over the academic programs schools offered

or over academic decisions. During the 1990s,

the administration of President Bill Clinton urged the federal government

to move further in exercising leadership by establishing academic

standards for public schools across the country and by evaluating schools

through testing.

Concerns in Elementary Education

The United States has historically contended with the challenges that come

with being a nation of immigrants. Schools are often responsible for

modifying educational offerings to accommodate immigrants. Early schools

reflected many differences among students and their families but were also

a mechanism by which to overcome these differences and to forge a sense of

American commonality. Common schools, or publicly financed elementary

schools, were first introduced in the mid-19th century in the hopes of

creating a common bond among a diverse citizenship. By the early 20th

century, massive immigration from Europe caused schools to restructure and

expand their programs to more effectively incorporate immigrant children

into society. High schools began to include technical, business, and

vocational curricula to accommodate the various goals of their more diverse

populations. The United States continues to be concerned about how to

incorporate immigrant groups.

The language in which students are taught is one of the most significant

issues for schools. Many Americans have become concerned about how best to

educate students who are new to the English language and to American

culture. As children of all ages and from dozens of language backgrounds

seek an education, most schools have adopted some variety of bilingual

instruction. Students are taught in their native language until their

knowledge of English improves, which is often accomplished through an

English as a Second Language (ESL) program. Some people have criticized

these bilingual programs for not encouraging students to learn English

more quickly, or at all. Some Americans fear that English will no longer

provide a uniform basis for American identity; others worry that immigrant

children will have a hard time finding employment if they do not become

fluent in English. In response to these criticisms, voters in California,

the state that has seen the largest influx of recent immigrants, passed a

law in 1998 requiring that all children attending public schools be taught

in English and prohibiting more than one year of bilingual instruction.

Many Americans, including parents and business leaders, are also alarmed

by what they see as inadequate levels of student achievement in subjects

such as reading, mathematics, and science. On many standardized tests,

American students lag behind their counterparts in Europe and Asia. In

response, some Americans have urged the adoption of national standards by

which individual schools can be evaluated. Some have supported more

rigorous teacher competency standards. Another response that became

popular in the 1990s is the creation of charter schools. These schools are

directly authorized by the state and receive public funding, but they

operate largely outside the control of local school districts. Parents and

teachers enforce self-defined standards for these charter schools.

Schools are also working to incorporate computers into classrooms. The

need for computer literacy in the 21st century has put an additional

strain on school budgets and local resources. Schools have struggled to

catch up by providing computer equipment and instruction and by making

Internet connections available. Some companies, including Apple Computer,

Inc., have provided computer equipment to help schools meet their

students’ computer-education needs.

Concerns in Higher Education

Throughout the 20th century, Americans have attended schools to obtain the

economic and social rewards that come with highly technical or skilled

work and advanced degrees. However, as the United States became more

diverse, people debated how to bring different groups, such as women and

minorities, into higher education. Blacks have historically been excluded

from many white institutions, or were made to feel unwelcome. Since the

19th century, a number of black colleges have existed to compensate for

this broad social bias, including federally chartered and funded Howard

University. In the early 20th century, when Jews and other Eastern

Europeans began to apply to universities, some of the most prestigious

colleges imposed quotas limiting their numbers.

Americans tried various means to eliminate the most egregious forms of

discrimination. In the early part of the century, "objective" admissions

tests were introduced to counteract the bias in admissions. Some educators

now view admissions tests such as the Scholastic Aptitude Test (SAT),

originally created to simplify admissions testing for prestigious private

schools, as disadvantageous to women and minorities. Critics of the SAT

believed the test did not adequately account for differences in social and

economic background. Whenever something as subjective as ability or merit

is evaluated, and when the rewards are potentially great, people hotly

debate the fairest means of evaluating it.

Until the middle of the 20th century, most educational issues in the

United States were handled locally. After World War II, however, the

federal government began to assume a new obligation to assure equality in

educational opportunity, and this issue began to affect college admissions

standards. In the last quarter of the 20th century, the government

increased its role in questions relating to how all Americans could best

secure equal access to education.

Schools had problems providing equal opportunities for all because

quality, costs, and admissions criteria varied greatly. To deal with these

problems, the federal government introduced the policy of affirmative

action in education in the early 1970s. Affirmative action required that

colleges and universities take race, ethnicity, and gender into account in

admissions to provide extra consideration to those who have historically

faced discrimination. It was intended to assure that Americans of all

backgrounds have an opportunity to train for professions in fields such as

medicine, law, education, and business administration.

Affirmative action became a general social commitment during the last

quarter of the 20th century. In education, it meant that universities and

colleges gave extra advantages and opportunities to blacks, Native

Americans, women, and other groups that were generally underrepresented at

the highest levels of business and in other professions. Affirmative

action also included financial assistance to members of minorities who

could not otherwise afford to attend colleges and universities.

Affirmative action has allowed many minority members to achieve new

prominence and success.

At the end of the 20th century, the policy of affirmative action was

criticized as unfair to those who were denied admission in order to admit

those in designated group categories. Some considered affirmative action

policies a form of reverse discrimination, some believed that special

policies were no longer necessary, and others believed that only some

groups should qualify (such as African Americans because of the nation’s

long history of slavery and segregation). The issue became a matter of

serious discussion and is one of the most highly charged topics in

education today. In the 1990s three states—Texas, California, and

Washington—eliminated affirmative action in their state university

admissions policies.

Several other issues have become troubling to higher education. Because

tuition costs have risen to very high levels, many smaller private

colleges and universities are struggling to attract students. Many

students and their parents choose state universities where costs are much

lower. The decline in federal research funds has also caused financial

difficulties for many universities. Many well-educated students, including

those with doctoral degrees, have found it difficult to find and keep

permanent academic jobs, as schools seek to lower costs by hiring

part-time and temporary faculty. As a result, despite its great strengths

and its history of great variety, American higher education may face

serious changes in the future because of its rising expense.

Education is fundamental to American culture in more ways than providing

literacy and job skills. Educational institutions are the setting where

scholars interpret and pass on the meaning of the American experience.

They analyze what America is as a society by interpreting the nation’s

past and defining objectives for the future. That information eventually

forms the basis for what children learn from teachers, textbooks, and

curricula. Thus, the work of educational institutions extends far beyond

job training, although job training is usually foremost in people’s minds.

ARTS AND LETTERS

The arts, more than other features of culture, provide avenues for the

expression of imagination and personal vision. They offer a range of

emotional and intellectual pleasures to consumers of art and are an

important way in which a culture represents itself. There has long been a

Western tradition distinguishing those arts that appeal to the multitude,

such as popular music, from those—such as classical orchestral

music—normally available to the elite of learning and taste. Popular art

forms are usually seen as more representative American products. In the

United States in the recent past, there has been a blending of popular and

elite art forms, as all the arts experienced a period of remarkable cross-

fertilization. Because popular art forms are so widely distributed, arts

of all kinds have prospered.

The arts in the United States express the many faces and the enormous

creative range of the American people. Especially since World War II,

American innovations and the immense energy displayed in literature,

dance, and music have made American cultural works world famous. Arts in

the United States have become internationally prominent in ways that are

unparalleled in history. American art forms during the second half of the

20th century often defined the styles and qualities that the rest of the

world emulated. At the end of the 20th century, American art was

considered equal in quality and vitality to art produced in the rest of

the world.

Throughout the 20th century, American arts have grown to incorporate new

visions and voices. Much of this new artistic energy came in the wake of

America’s emergence as a superpower after World War II. But it was also

due to the growth of New York City as an important center for publishing

and the arts, and the immigration of artists and intellectuals fleeing

fascism in Europe before and during the war. An outpouring of talent also

followed the civil rights and protest movements of the 1960s, as cultural

discrimination against blacks, women, and other groups diminished.

American arts flourish in many places and receive support from private

foundations, large corporations, local governments, federal agencies,

museums, galleries, and individuals. What is considered worthy of support

often depends on definitions of quality and of what constitutes art. This

is a tricky subject when the popular arts are increasingly incorporated

into the domain of the fine arts and new forms such as performance art and

conceptual art appear. As a result, defining what is art affects what

students are taught about past traditions (for example, Native American

tent paintings, oral traditions, and slave narratives) and what is

produced in the future. While some practitioners, such as studio artists,

are more vulnerable to these definitions because they depend on financial

support to exercise their talents, others, such as poets and

photographers, are less immediately constrained.

Artists operate in a world where those who theorize and critique their

work have taken on an increasingly important role. Audiences are

influenced by a variety of intermediaries—critics, the schools,

foundations that offer grants, the National Endowment for the Arts,

gallery owners, publishers, and theater producers. In some areas, such as

the performing arts, popular audiences may ultimately define success. In

other arts, such as painting and sculpture, success is far more dependent

on critics and a few, often wealthy, art collectors. Writers depend on

publishers and on the public for their success.

Unlike their predecessors, who relied on formal criteria and appealed to

aesthetic judgments, critics at the end of the 20th century leaned more

toward popular tastes, taking into account groups previously ignored and

valuing the merger of popular and elite forms. These critics often relied

less on aesthetic judgments than on social measures and were eager to

place artistic productions in the context of the time and social

conditions in which they were created. Whereas earlier critics attempted

to create an American tradition of high art, later critics used art as a

means to give power and approval to nonelite groups who were previously

not considered worthy of inclusion in the nation’s artistic heritage.

Not so long ago, culture and the arts were assumed to be an unalterable

inheritance—the accumulated wisdom and highest forms of achievement that

were established in the past. In the 20th century generally, and certainly

since World War II, artists have been boldly destroying older traditions

in sculpture, painting, dance, music, and literature. The arts have

changed rapidly, with one movement replacing another in quick succession.

Visual Arts
