Dare to Know

Essays on Education, Schooling, Politics, and Life

 

He had failed to acquire any useful education…He must educate himself over again…His course had led him through oceans of ignorance; he had tumbled from one ocean into another till he had learned to swim.
— Henry Adams

 

Educational Malpractice at a Business School in Texas: 

An Essay on the Corruption of Higher Education in America 

By Josh M. Beach

2023 

 

Don’t Believe the Hype

I had been looking at business schools for many years.  I had followed The Economist business school rankings for decades.  I had an idea about what schools I would ideally like to attend, but they were very expensive and very far away.  I was not rich.  I had a family.  And it was the middle of a global pandemic. 

Thus, I looked more practically at schools that I could afford, that were close to home, and that offered online instruction so I wouldn’t die of a deadly virus in a classroom. 

In looking closer to home, I came across the business school at the University of Texas at Dallas.  The university has a good academic reputation, and my stepdaughter was just finishing up her undergraduate degree there. 

According to the Naveen Jindal School of Management’s website, it is ranked somewhere between 7th and 38th among the best MBA programs in America, and between 40th and 79th among the top global MBA programs. 

Because I was going back to school during a pandemic, which meant studying completely online, I also saw that this school was tied for 6th place for Best Online Master’s in Business Programs, according to U.S. News & World Report.

Later in 2021, after I was enrolled as a student, UTD’s Naveen Jindal School of Management reported that it had risen from 81st place to 73rd place in the 2021 Financial Times rankings, largely due to an increase in published research articles by full-time faculty.  The Princeton Review ranked it 40th out of all colleges in the country based on “academic excellence, affordability and career prospects for graduates.”  The school was ranked between 1st and 11th on various measures of published academic research.  And Bloomberg Businessweek, in its Best B-Schools ranking, placed the school 32nd out of 119 MBA programs, a rise of four places, and 11th among public universities.[1]

Importantly, the school was also relatively cheap for a business school, with in-state tuition at only about $25,000 a year (two full semesters plus summer school), while the top two programs in the state charged over $50,000 a year.  U.S. News ranked this business school number 1 in terms of the salary-to-debt ratios of MBA graduates.  U.S. News also ranked it as a top “value school,” although the university claimed to be in “the top 50 nationwide,”[2] when on the actual list UT Dallas ranks #135,[3] so UTD is guilty of some false marketing.

On paper, this looked like a good school.  As the Chronicle of Higher Education recently reported, colleges and universities are “obsessed” with their national rankings and advertise their positions every chance they get so they can attract more students.[4]  Rankings are often the most easily accessible data for students comparing schools before applying.

But as many scholars have demonstrated, never judge a university by its rankings.  They don’t tell you anything about the actual quality of a program.  As Malcolm Gladwell recently discussed on his podcast Revisionist History, higher education rankings are meaningless when it comes to the actual academic quality of institutions.[5]  Colin Diver goes much further.  He was formerly the President of Reed College, a trustee of Amherst College, Dean of the University of Pennsylvania Law School, and a professor of law and economics.  He has argued that rankings are a scam that often “create powerful incentives to manipulate data and distort institutional behavior.”[6]

Columbia math professor Michael Thaddeus recently got Columbia University to withhold its data from U.S. News & World Report after he found that “several of the key figures supporting Columbia’s high ranking are inaccurate, dubious or highly misleading.”[7]

The Deans at UTD’s Naveen Jindal School of Management often call attention to their high rankings as proof of a high-quality university program. 

But the Naveen Jindal School of Management is anything but a quality school.  After earning my MBA there, I would argue that this business school is a fraudulent degree mill that cares about neither educational standards nor the educational and labor-market needs of its students. 

Business school rankings are meaningless.  I was lied to.  Had I known the truth, I would never have enrolled in this school.  I was sold substandard services.  My educational experience was awful, the worst schooling experience of my life.  I didn’t learn anything of value in my classes, although thanks to self-learning habits honed over decades of study, scholarly research, and publication, I was able to read about 250 books and academic articles on my own time, which taught me a great deal about business management.

If UTD were an honest business that was actually accountable to its customers, then I would demand my money back.  But universities don’t care about the educational needs of their students.  So-called “higher education” in the United States has become just another empty ritual, especially in Texas.

So, while I am singling out the Naveen Jindal School of Management at the University of Texas at Dallas in this essay, the malpractices that I describe can be found in many institutions of “higher education” across the state of Texas and the U.S.  I used to work at the University of Texas at San Antonio and found the same fraudulent practices there.  I’ve also seen these malpractices all across California, as well as in South Korea and China.  Empty, useless, ritualized schooling is endemic around the world. 

Some critics, like Kevin Carey, director of the education policy program at New America, a think tank in Washington, D.C., have gone so far as to call many college degree programs, especially master’s degree programs, fraudulent “scams”: the quality of education is so low, the cost is so high, and there is often no direct link to a good job after graduation, so students are stuck with costly debt for the rest of their lives.[8]

The shabby state of the curriculum is one reason why MBA programs do not live up to the advertised hype of higher salaries and better careers, claims that management professors Pfeffer and Fong (2002) have shown to be widely publicized but false.[9]

Is There an Instructor in This Course?

It’s commonplace for administrators in higher education to automate online courses so much that they virtually eliminate the possibility of teaching.  Not only does this deny students any chance for an authentic learning experience, it can also create a lot of confusion.  Written materials, like the syllabus, and course videos are often far from clear, especially when the instructor is a poor writer or speaker.  In face-to-face communication, you can ask questions and get clarifications.  That is often impossible in online schooling.

There is also a darker side to online instruction.  It is easy for instructors to commit fraud by shirking many of the basic requirements of the job, like interacting with students and answering questions.  That’s what I found at the University of Texas at Dallas. 

For example, in one of my classes at UT Dallas there was no syllabus and no course content posted by the first day of class.  I had emailed the professor a couple of weeks before the semester started, asking about a book list and a syllabus, but there was no reply.  On the first day of class, when all the materials were supposed to be posted online, there was nothing, not even the syllabus that state law required to be posted on the first day of class.  Most of the class had joined together on an app, and several students said they had emailed the professor, but did not get a response.  By the end of the first day, I sent an email off to the Associate Dean to complain and hopefully get some action in place so we could start the course.  The Dean just forwarded my email to the professor and another professor who was the Area Coordinator, but the incompetent professor had already proven that she didn’t respond to emails.  By the end of the second day of the first week, there was still no syllabus and no class, so I emailed the Senior Associate Dean, who texted the professor.

On another occasion, there was a syllabus and there were course materials, but very little help from the professor, who gave his students much less than what was legally required for a college class in Texas.  What follows is the text of an email that I sent to the Dean of the School of Business on September 16, 2020, because I was disgusted with the low quality of service that I was getting in a calculus class during my first semester.

“Dear Dean,

There is an important matter that I need to bring to your attention.  I’m not sure you can do anything about this situation now, but hopefully you can intervene to make changes for future students.

I am a first semester MBA student taking OPRE 6303 this fall.

While you can consider this a student complaint about a professor, I think it would be more accurate to think of this as a collegial complaint about an unprofessional colleague.  I was a college and university lecturer for 20 years, most recently at UT San Antonio, and I’ve never seen a colleague behave so unprofessionally, nor have I ever experienced such unprofessional behavior during the 11+ years that I have been a full-time undergraduate and graduate student.

I am taking Calculus with Dr. X, which is a 3-credit course.

I was expecting three hours a week of instruction, plus office hours to ask further questions.

Instead, over the first 6 weeks, our class has received 10 videos.  These videos total about 109 min, which is less than two hours of instruction.

Going by Carnegie credit hours, which should be a basic part of your department’s accreditation process, we should have received 18 hours of instruction by this point.

Plus, it is a standard requirement for professors to offer at least 1 office hour a week per 3-credit class, and kind professors offer more time for those who need extra help.  Our professor held no office hours the first week, and then initiated 30 minutes of office hours once a week. 

We also have a TA who holds one office hour a week, but he is not as knowledgeable as the professor, and therefore, not the best resource.

I find this situation alone to be very unprofessional.  I am paying over $2500 for a 3-credit course, but getting a small fraction of the standard instruction time that is supposed to come with this course.  I was also struggling for the first three weeks, and I was getting almost no help, except some from the TA who was overwhelmed also trying to help many other students.

To make matters worse, Dr. X skips a lot of steps when he hastily solves problems in the videos.  This makes it impossible for an inexperienced learner to actually follow along and learn how to do a problem.  This is also the case with his written “practice” test worksheets, which have solutions to the problems, but the solutions either skip a lot of steps, or use methods and notations that are not standardly used when teaching this material (I found out about that from the tutor I had to hire).

And there is no assigned textbook, so there is no way for a student to understand all the steps to solve any problem without asking either the TA or the professor, but they are not readily available for help, nor is the TA help always accurate.

Furthermore, all professors are required to respond to student emails and make an effort to address student requests for help.  I emailed Dr. X a very detailed message with specific pedagogical terminology that a competent teacher should know.  He ignored all of my issues and requests.  He responded with an extremely condescending reply, essentially telling me that I had a “perception” problem that he was powerless to help.  He then asked me to give him times so we could schedule a meeting, but he never responded to my reply.  It’s been 12 days.  After waiting one day, I hired a private tutor.

Also, his unprofessional, condescending attitude comes through in almost every one of his videos, where he says at least 2-3 times how “easy” and “simple” the material is; to beginning learners like myself, the material is anything but easy or simple.  I found this language offensive.

Finally, I have had to pay a lot of money to hire a private math tutor who holds a PhD and is a college lecturer.  With his help, I was actually able to learn all of the material quite quickly because he is a competent teacher, although it is taking a lot of time to practice and learn how to appropriately recall basic concepts to answer specific problems. 

And in working with my tutor, he pointed out several places in the “practice quizzes” where Dr. X had typos, which caused me a lot of grief as I was trying to work through problems when I first started, and I’m sure it has also been a roadblock to other students. 

Furthermore, my tutor explained how the TA for our class gave us misleading and incorrect information about some problem types, which also caused me some pain due to lost time and confusion. 

My tutor also said that Dr. X throws out a lot of unnecessarily complex and advanced problems on his worksheets, which are not very appropriate for inexperienced learners because they cause a lot of confusion.  This has also been the case with some of the problems he included on his practice tests.

I’m not sure what you can do about these issues at this point, but I am struggling with this class and there is a chance that I will not be able to pass.  While I am definitely inexperienced, incompetent, and out of practice with math, Dr. X has made my situation much worse.  Had I had a competent professor who gave his students the full amount of time he is institutionally required to give, then I would be doing much better in this class. 

While I am getting a great education from my tutor, I am getting almost nothing of value from my actual UTD class.

Please keep my information fully confidential so there can be no retaliation against me while I finish this class.  I have also attached my email correspondence with Dr. X.”

And what was the result of my email? 

Well, the Dean did respond.  He agreed that these incidents were not kosher, but he didn’t take the time to say much else.  He wrote a very short reply stating, “Thank you for reaching out to me.  What you describe is unacceptable.  I apologize.  I will ask for corrective action.  We will keep your email confidential.” 

Was there corrective action?  Yes, well, kind of. 

Some of the issues were addressed, at least for a short while.  The professor was still teaching a full load the next semester, so clearly there were no repercussions.  I find it hard to believe that nobody had complained about this professor before.  Clearly the department is quite permissive with malpracticing professors, probably because teaching doesn’t matter for the performance reviews of most professors, which I wrote about in one of my most recent books on higher education.

After I complained to the Dean, and without saying anything to the class, the instructor all of a sudden started to spend more time at office hours.  He also created more and longer videos explaining concepts in a more step-by-step fashion.  And he stopped saying how easy all of the math concepts were as he was explaining them.  Small changes, but they were changes.

However, the professor’s overall lack of care for the class, and his uncaring attitude toward students, persisted, especially in the careless typos on course materials and, worst of all, the many typos and mistakes on exams.

For example, take exams.  The whole course consisted of three exams.  Fail one, and you fail the course.  With this type of class structure, it is very unprofessional, and also very unkind, to write bad tests that are hard to read because they are full of typos and broken sentences. 

After one of the early exams, I emailed the professor and said that there was one question that was grammatically incorrect, which made it unclear.  I had trouble answering the question.  The first clause of the sentence was completely unclear.  The second clause focused on “unity,” so I assumed it was referring to an identity matrix, and that’s how I answered it, but I simply didn’t know what it was asking. 

I’m pretty sure I got that question wrong.  But I don’t know because we never got any actual feedback on exams.  In fact, there was never any feedback from the professor to students about anything.

I pointed out to the professor that he did cover the topic of the zero matrix in his lecture very briefly, but he didn’t mention anything specific about it being a different type of entity that could be multiplied to any size matrix.  On the exam, he asked a question that seemed very obscure, given there was no practice question on this concept, or any application of the concept in his videotaped lectures.  

When I raised these issues with my professor, he dismissed them.  My emails were not appreciated.  He ignored my concerns with a short response that made it seem like it was my fault for not being able to read his exam. 

Before another big exam, I took the short practice test online and noticed that there was an error.  After taking the test, the computer indicated “right” and “wrong” answers, but one of the “correct” answers for a question was wrong. 

I emailed the professor and explained what the correct answer was and I attached a screen shot of the incorrect answer on the test marked as “correct.”  I even showed it to my math tutor and he agreed that it was incorrect.  I emailed the professor to fix this part of the practice exam, and I also said he needed to “double check the test itself to make sure there are no other errors,” as there had already been typos and badly written sentences on the major exams. 

The professor replied that he didn’t see the problem, but he did note that he had not written a complete sentence for the test question, which he fixed, but he didn’t actually fix how the computer was grading the questions. 

The final exam in this class was a nightmare.  It was perhaps the most traumatizing test-taking experience I have ever had in 25 years as a student. 

I spent over two thousand dollars on tutoring lessons over the course of the semester, and I worked quite hard on learning calculus.  I learned a lot from my tutor.  He was great.  I learned nothing at all from my professor.  In fact, the course would have been better with no instructor at all. 

I was quite confident that I would do well on the final exam, and I needed to do well in order to pass the class with a B or higher.  But there were more typos on the final exam, although I didn’t realize how many typos there were until after I failed the exam.

There were several badly written questions, which I should have expected.  I did my best answering them, and I think I got those questions right.  But then there were two questions in the middle of the test with numbers written incorrectly.  I didn’t know it at the time, but it was impossible to answer those questions.  The professor had written nonsensical questions that could not be answered.

It was a multiple-choice format, so you had to select one of four possible answers.  I did all of the work, many, many times.  I went over each of the two flawed questions again and again, but I ran low on time and had to rush through the rest of the test.  The time ran out before I could finish. 

My heart was racing for the last 30 minutes.  The fear of failure and panic started to overwhelm my cognitive abilities.  I’ve never failed a test since my sophomore year of college.  I thought I had failed the test and the class.  It was a horrible feeling.

But then I noticed that I was able to keep working; somehow time hadn’t actually run out, so I continued to answer the last questions the best I could.  I ended up being able to finish the exam, except for the questions that could not be answered.

Then I went to email the TA pictures of my written work.  That’s when I saw the email from the professor, which he had sent during the final exam, when we didn’t have access to our email or any other program on our computers. 

He admitted in a very dismissive way that he had accidentally put a couple of typos in the exam, which made two questions impossible to answer.  He said he would look over the exams and grade them himself, rather than rely on the automated computer to score them like the other tests. 

I’m not sure how he graded our exams, or even if he graded our exams.  He never released grades for the final exam.  There was no feedback, nor any explanation of how we did on the test.  He just gave us final course grades. 

I got an A- for a final grade. 

Why?  I’ll never know. 

Student learning was not an objective in that course.  Students were merely supposed to follow directions and play with numbers on badly written standardized tests.  I did learn a great deal from my private tutor, for a couple thousand extra dollars, but my experience in the UTD course itself was an expensive and stressful waste of time.  I felt completely disrespected by the institution.

An Education or Quid Pro Quo?

Sadly, that was not my only horrible experience.  In another course on business management fundamentals, we spent most of our time reading a poorly written textbook that was meant for undergraduates.  The professor spent class time lecturing.  She rarely said anything that wasn’t already mentioned in the textbook.  She just droned on and on.  The whole class was a waste of time. 

What made the experience even worse was that I was genuinely interested in the subject matter of the course.  But it worked out.  I spent class time with the volume off so I could read books on business management (which weren’t on the syllabus).  I was able to learn, despite a bad teacher, bad textbooks, and a badly designed course.

At a couple of points during each class, students broke out into “discussion” groups online.  However, most of the students had nothing to say.  Sometimes, no one in a break-out group would respond to questions or comments.  I never knew for sure, but either the students weren’t actually there, they didn’t want to talk, or they didn’t have the technical tools to participate.  I talked a bit, when I actually had conversation partners, but mostly I spent “discussion” time reading books.

Almost all assignments were poorly written multiple-choice tests, just like in my math class.  I complained to the professor about the tests.  They were vague, misleadingly written, and often open-ended with many possible “correct” answers, even though there was only ONE correct answer to select on the test.  

In one case, there was a question about John Dewey, whom I know a lot about.  Dewey was mentioned in one obscure sentence in the textbook.  I remember that because I stopped in anger at how badly the textbook treated this complex philosopher and his ideas.  On the test, the “right” answer about John Dewey was factually wrong; in fact, all the provided multiple-choice answers were wrong. 

I wrote a long email complaining about her tests, and that question in particular.  Surprisingly, the professor gave me the opportunity to negotiate new course projects so I wouldn’t have to take any more of her standardized tests.  Instead, I had to write eight essays, on top of the essay I had already written in the class, which was technically a “team” project, but I did almost all of the work because my partners were clueless and unmotivated.

But just like the math class, even though I wrote all these essays, not a single one received a grade or any feedback.  I got an A in the class, or an A-, but I’m not sure why I got the grade I did.  None of the professors in the department ever gave students any feedback on their projects or tests.

Later that semester, I was surprised to get an email from this professor asking if I would be interested in helping her on her research project.  She was going to write a new article for publication, and she was working with another professor in the department.  She wanted to bring me on as a co-author. 

I didn’t know anything about her project, so I was hesitant.  Plus, publishing an article in a business journal wouldn’t help me personally in any way.  And I didn’t want my name associated with a topic, methods, or conclusions that I didn’t believe in.  I was also finishing up what would become two new books for publication.  So, I wasn’t very interested.

But I thought it would be a way to earn course credit.  I could avoid more standardized tests.  So, I asked her to do an independent study class.  This way, I would get course credit, which would enable me to finish my degree faster.  I also hedged, and said that I would help in any way that I could with comments and research, but I never said that I would write the article with them.  I didn’t want to have my name attached to a sub-standard article. 

We both signed an independent study request form.  It was approved by the Dean.  The form stated that I would be supplying essays and literature reviews to support their paper project, but it did not say that I would actually write their paper. 

Once the form was signed, she sent over some articles.  I got into the research they relied upon for their study, but I found it methodologically flawed, to say the least.  When I read the published work of the other professor from the business school at UT Dallas, I was shocked at how bad it was.  I don’t understand how it could ever have been published.

For example, one of his studies published in the International Journal of Human Resource Management in 2006 relied upon a badly outdated theoretical framework from 1961.  You read that right, the authors of this research paper somehow believed that no relevant research on their theoretical framework had been conducted in over 45 years.  They based their whole study on an ancient relic of a theory, as if it were some type of classic, holy text with bedrock axioms that supposedly would remain relevant forever.  I was embarrassed reading this paper.

So, what were the hypotheses that were being statistically tested?  The first hypothesis was that people from different countries would have different values.  Any student of anthropology, sociology, political science, or history would know this claim doesn’t really need to be tested, at least not with the highly rigid and abstract typology that these authors used.  Of course, people from different cultures have different values.  Anthropologists have known that for over a century, and it is already well documented.

And what countries were sampled?  India, Poland, Russia, and the U.S.  Why these four countries?  The authors don’t admit it, but this selection was based on convenience rather than any theoretical considerations.  There is really no rational way to link these populations into a single study.

And what was the second hypothesis?  That the U.S. had a “capitalistic system,” which created different values from the other three countries, ostensibly because these other countries did not have a capitalist system.  This hypothesis is not only highly simplistic, it is also false.  It is based on a misunderstanding of what the concepts of culture and nationality entail, and also a clear misunderstanding of the specific cultural patterns of these four countries.  It is naive and nonsensical to believe that the U.S. could be clearly defined by one attribute (capitalism) that the other three countries supposedly lacked, which they didn’t.  All four countries were capitalistic in some form or another, and they all had other important characteristics, but this study only acknowledged simplistic, naive, and false stereotypes.

This article stated many claims that were highly generalized, conceptually confused, sometimes tritely true, and sometimes false.  Some claims were presented as facts but were actually debatable, if not clearly false.  But the real weakness of this paper had to do with the flawed statistics being used. 

The authors not only chose an indefensible sample of four countries, but they also collected a non-random and small sample of only 341 to 578 respondents from each country.  These authors believed that they could make meaningful statistical conclusions with only about 500 people from each country.  For a country like India, with a population of over 1 billion people, this is highly problematic, to say the least.  How are 500 people supposed to randomly represent over 1 billion? 

These samples were also highly skewed by socio-economic factors.  Almost all respondents were educated professionals, which makes these samples incompatible with the highly generalized and sweeping cultural hypotheses being studied.  How are 500 highly educated professional Indians or Russians supposed to represent the total population of each country in order to reasonably demonstrate divergent cultural values between different nations? 
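
To make the selection-bias point concrete, here is a minimal simulation sketch in Python, with invented numbers rather than anything from the actual study, showing how a sample of 500 drawn only from educated professionals produces a badly biased estimate of a population-wide value:

    import random

    random.seed(0)

    # Invented numbers for illustration only: a population of 100,000 in
    # which 10% are educated professionals averaging 4.2 on a 1-5 "values"
    # survey scale, while the remaining 90% average 3.1.
    professionals = [random.gauss(4.2, 0.5) for _ in range(10_000)]
    everyone_else = [random.gauss(3.1, 0.5) for _ in range(90_000)]
    population = professionals + everyone_else

    true_mean = sum(population) / len(population)

    # A convenience sample of 500 drawn only from professionals, like the
    # samples criticized above.
    biased_sample = random.sample(professionals, 500)
    biased_mean = sum(biased_sample) / len(biased_sample)

    print(f"true population mean:      {true_mean:.2f}")   # about 3.2
    print(f"professionals-only sample: {biased_mean:.2f}")  # about 4.2

No statistical machinery applied after the fact can recover the values the sample never measured; the skew is baked in before the first calculation.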

I would expect naive undergraduates to make such rookie mistakes, but tenured professors?  And how did a study like this even get published?  Your guess is as good as mine.

Needless to say, drivel like this should never have been published.  I was embarrassed simply to waste my time reading these articles.  Clearly, there are academic backwaters in the field of business, which publish low-quality work that is somewhere between the level of a precocious undergraduate and that of a lazy graduate student.  It was really, really bad.  I often see such bad work as a reviewer for annual social-scientific conferences and academic journals, but I was surprised that research this bad could actually get published.

So, I wrote a highly critical literature review citing some of the most respected, gold-standard research on the topic.  I also pointed out the flaws of my professor’s and her colleagues’ theoretical and methodological foundations.  I wrote that her colleagues “use incomplete, out-of-date, and irrelevant literature reviews, faulty and invalid theoretical foundations based on faulty literature reviews, and faulty methodology that result in questionable studies that produce invalid, statistically meaningless, and/or commonplace results that would not be publishable in most social-scientific fields of study.” 

I also criticized the Dutch scholar Geert Hofstede, who published highly flawed and questionable work, but got quite famous in business schools over the past couple of decades.  Some business professors love his work, particularly because he enables them to make crass cultural stereotypes in order to label and evaluate employees.  I argued that “Hofstede’s theory and his results should not be used or relied upon.  Serious social scientists have largely ignored Hofstede’s work because of its many theoretical and methodological flaws.  Hofstede is mostly cited by scholars in business management and international business, perhaps because these scholars have little training in cultural methodology and so do not fully understand the limitations of Hofstede’s work.”

I also made several other critical suggestions, especially about the survey instruments, and how the statistical data was being analyzed.  I pointed out that survey data are not self-explanatory and that survey data collection tools never produce neutral measurements.  As business management professor Henry Mintzberg said in his study, The Rise and Fall of Strategic Planning, “Answers come cheap on seven-point scales” (p. 93).  Mintzberg quoted another management professor who explained how “self-selected respondents” have “little reason to respond accurately” and sometimes don’t even know “what they were talking about” (p. 93).

My professor and her colleagues were not only operating with a flawed theoretical foundation, they were also using flawed survey instruments and flawed data samples.  There was no way that I could be associated with such sloppy research.  I was embarrassed simply to read this mess, and I was highly surprised that any academic journal would have published such meaningless scholarship.  This is an example of how there are murky academic backwaters in many disciplines that publish highly questionable “research,” one step above the predatory journals that simply publish anything for a fee.

These scholars seemed oblivious to a point made by management professor Jeffrey Pfeffer forty years ago in his book Organizations and Organization Theory (1982).  Pfeffer criticized the naive practice of using invalid questionnaires to conduct social scientific research: “By imposing the researcher’s concepts and language on the subject through questionnaires, one largely determines that the result will be consistent with those concepts and measurements” (p. 76).  Pfeffer went on to sardonically add, “It is not clear, simply put, whether the results of much of the research in this tradition tell us anything about the world of organizations or those who populate that world, but they certainly tell us a lot about those who study organizations and how they view the world” (p. 76).  So much shoddy scholarship in the social sciences, especially in the field of business, is just pretentious navel-gazing re-packaged with a scientistic veneer. 

I delivered this 40-page critique, with a list of recommendations.  I knew it wasn’t the response my professor expected or wanted, and I knew she would take it badly on some level. 

But I was hopeful she would accept the offering, say I wasn’t the right fit to go forward, and then give me course credit for my work because I delivered exactly what I said I would, which was in writing on the official school form signed by all parties.  I provided essays and literature reviews, just like I said I would.  I was also genuinely willing to help them correct their mistakes.

I had hoped that maybe the professor might say, hey, I never looked at it that way, so let’s discuss a better approach so my research would be stronger.  With such a response, I would have been willing to do some more work to help the project along.  But I strongly doubted she would have this kind of response.

Guess what happened?

My professor was offended that I criticized her methodology.  She said she was surprised by my recommendations, and she weakly defended her work.  But more importantly, she basically said that I was not the right fit for the project because all she wanted was someone to write her paper for her.  Then she said that obviously she could not work with me on an independent study. 

Clearly, she was too filled with emotion to remember that she had already negotiated with me and that she had already agreed to the independent study.  We had both signed the form.  It was approved by the Dean.  The independent study was already official.

Now, apparently, that contract was null and void because my professor was offended by my critical conclusions.  When a professor criticizes another professor as part of the peer review process, it is perfectly acceptable.  When a student criticizes a professor, even though the student WAS a former professor, it is unacceptable.  It’s a double standard I’ve run into before, always with negative consequences for me.

This is just one of many examples I could cite to explain how the notions of academic freedom and critical debate in higher education are empty ideals, especially here in Texas.  They are supposedly bedrock values of higher education, but only if one tenured professor criticizes another tenured professor’s work, and even then, many professors wilt like jilted flowers and emotionally lash out when they are criticized by their colleagues.  What happens when a powerless graduate student criticizes a professor?  Nothing good, I can tell you that.  It’s the main reason I wasn’t able to earn my PhD, which is an interesting story I’ve written about.

Of course, I could have tried to fight my professor by pointing to the form that had been signed.  I had the signed form and I had her emails.  But why bother?  I didn’t really want to work with that professor anyway due to the abysmally low research standards she had, and her obvious ignorance about the topic she was researching.  I just let it go. I dropped the independent study course.

For me, it was just further proof that the Naveen Jindal School of Management at The University of Texas at Dallas was not an educational institution at all.  It was all just a sham.  Students were being used for the benefit of professors.  We paid their salaries, we were supposed to do whatever they told us to do, and we were supposed to sit there and take it with smiles on our faces.

Clearly, this professor did not care about me or my learning as a student.  I was nothing to her.  My independent study was not about my education.  It was simply to provide her with free labor so she could publish another article and get a promotion. 

The only thing that mattered to my professor was her own objectives: her research agenda, getting her paper written the fastest and easiest way possible, so she could reap the professional rewards of a better yearly evaluation and possibly a higher salary.  My needs as a student and as a learner were obviously never considered.  I was nothing but an object to be used and then thrown away.

Since when is an independent study class in graduate school about the needs and interests of the student as a learner?  I should have known better.  Clearly that is a silly and antiquated notion.

The Senseless Cruelty of Small-Minded Math Professors

I took a couple of math courses from the same professor, not the calculus professor I discussed above but a different one.  One was a statistics class.  The other was a class on supply management, but according to the professor and the textbook she chose, everything about supply management could be boiled down to complex calculus equations.  This particular math teacher never even attempted to teach.  She simply offered video lectures that added nothing beyond the textbook.  In fact, her lectures were inferior to the textbook, which offered much more explanation and detail.  However, the textbook she chose was filled with typos and errors in the answers to the questions at the end of every chapter. 

Because this professor didn’t teach at all, students had to teach themselves by reading the textbook and working through the sample problems with the help of the answer key.  But this was sometimes impossible because the answer key was wrong due to typos, omissions, or just false information.  I emailed the professor several times about this and her response was: “students are smart enough to recognize a typo.”  One way to interpret this statement was that she was just too lazy to do anything to alert students to typos or to compensate for them by offering correct solutions.  Another way to interpret this statement was that she was calling me stupid because I, for one, could not properly learn the material when I didn’t have an accurate model to use as a guide.  Clearly, I was too stupid to recognize a typo and know the correct information that was supposed to be there.

This arrogant professor never engaged with students or offered any helpful information – or any help at all.  At the end of the above email she said, “I will ask the course TA to address your concern,” which was her standard answer to every question.  The TA was helpful, but as a student, she was not an expert, she was very bad at communicating clearly with students, and she was more than a little overwhelmed, because she had to do all the actual teaching work for the course that the professor was too lazy to do.

This professor was so lazy that she didn’t even bother to update her syllabus with the correct chapter numbers and titles from the textbook, which had changed editions.  Not only was it hard to read and understand the textbook, it was difficult just to read her syllabus.  She also had several errors on her syllabus for the statistics course.

The irony of this professor’s response, and her overall lack of care in the course, was that the class covered the Toyota manufacturing system and the topic of quality control, where front-line workers have the responsibility to stop the line and point out defects so they can be immediately fixed.  This is a great example of professors who “teach” what they do not know or cannot practice. 

This professor didn’t care about the quality of her course, catching and fixing errors, teaching students, or even the mental health of students struggling through a difficult math course, one made more difficult by confusing homework that was full of errors.  She simply dismissed student concerns, when not passive-aggressively mocking them, by telling us to figure things out for ourselves.  What happens to businesses that disdain their customers like this?  They go bankrupt.  But not a university MBA program.  This type of malpractice is the status quo because there is no accountability to students or their future employers.

This professor also committed malpractice intellectually and academically by taking complex subject matter and reducing it to tricky multiple-choice questions with supposedly clear and discrete “right” answers.  But many of her exam questions were far from clear because they were badly worded and ambiguous, especially in the statistics class.  One question was about “discrete” variables, which I got wrong because, as the professor pointed out, I was “over thinking the question” by “inserting a socially acceptable distinction” into a “non-social question.”  She clearly has an impoverished, and false, epistemological understanding of the world, in which there are “non-social” meanings attached to words and concepts.  She lives in a small world demarcated by the clean logic of calculus, but she is clueless about how calculus and statistics get used by real human beings in the social world to try to solve real problems. 

The textbook in this class did offer real-world problems with which to apply statistical tools.  And the textbook was very clear about the messy judgments that practitioners have to make when interpreting the mathematical results.  However, the professor didn’t seem to understand how statistics are actually used in the world, including the strengths and weaknesses of statistical methods and conclusions.  On our exams, we were offered nothing more than tricky word problems.  On the final exam, about 8% of the grade boiled down to two questions defining the same big, useless concept.  And because her exam was open-book, these two questions merely tested the dexterity of our fingers in looking up the definition of this word in the book. 

I like statistics and I was enthusiastic about the class, but her lack of any teaching and her utterly ridiculous exams made me dread her class.  It was utterly demoralizing.  I knew how to conduct the statistical tests using Excel, and how to interpret the results, but those important skills were largely useless because she wanted us to just explain abstract concepts in tricky, standardized questions, or to perform the complex, tedious math by hand just to show that we could do it. 
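
For contrast, here is the kind of small applied exercise the course never asked for, a minimal sketch in Python with made-up numbers (and assuming the scipy library is available): run a two-sample t-test, then face the judgment calls that the computation alone cannot settle.

    import statistics
    from scipy import stats  # assumed available; any stats package would do

    # Made-up data: weekly defect counts at two hypothetical plants.
    plant_a = [12, 15, 11, 14, 13, 16, 12, 14]
    plant_b = [10, 11, 9, 12, 10, 11, 13, 10]

    t_stat, p_value = stats.ttest_ind(plant_a, plant_b)

    print(f"mean A = {statistics.mean(plant_a):.1f}")
    print(f"mean B = {statistics.mean(plant_b):.1f}")
    print(f"t = {t_stat:.2f}, p = {p_value:.4f}")

    # The computation is the easy part.  The practitioner still has to
    # judge: Is the difference big enough to matter?  Were the weeks
    # comparable?  Is eight weeks of data enough to act on?  No
    # multiple-choice question tests any of that.

The point of the sketch is only that running a test and interpreting it are inseparable in practice, which no machine-graded question can capture.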

In a previous statistics class for another master’s program (in the field of Education), we actually combined our knowledge of concepts and our ability to use statistical software to collect real-world data and design our own research study.  I learned a great deal in that previous course.  In this course at UTD I learned much more math, but nothing about how to actually use statistics to do useful, meaningful research.

The economists John Kay and Mervyn King wrote an insightful book called Radical Uncertainty: Decision-Making Beyond the Numbers in which they argue that we live in a “radically uncertain world” that cannot be known or tamed by mathematical equations, which give us a false sense of security.  I actually tried to talk about this book in one of my emails with the professor, and I recommended it to her, but she ignored my comments because, clearly, in her mind she already knew it all and she had nothing to learn from anyone, least of all her ignorant students. 

It is precisely this arrogance, which almost every professor at The Naveen Jindal School of Management had, that creates so many problems in the world, like the great financial crisis of 2007-2010.  Arrogant professors try to turn the messy, uncertain world into a simple puzzle with “well-defined rules and a single solution,” which can be clearly stated on a multiple-choice test (p. 20).  But the most important and pressing problems in the world are “ill defined” and there is no simple solution that everyone would agree on (p. 97).  While mathematical data and models are very helpful analytical tools, they are “never descriptive of the world as it really is” and so are of limited value in solving real-world problems (pp. 96, 247).  Mathematics and deductive reasoning will never replace the messy practice of decision making. 

Henry Mintzberg talked about this predicament in his book Managers Not MBAs: A Hard Look at the Soft Practice of Managing and Management Development.  He explained that there was a deep-seated “assumption” in business schools that “one cannot be a proper manager without mathematical ability,” which he noted “would come as a great surprise to the legions of managers who have succeeded without that ability.”  Mintzberg decried the myopic focus on teaching only quantitative analysis, noting the many “dismal managers who have risen far in the business world with great mathematical skills” (p. 40).

In Good Strategy, Bad Strategy: The Difference and Why It Matters, management professor Richard P. Rumelt explained how the mathematician Kurt Gödel proved that mathematical systems cannot definitively solve many important problems, even within the closed domain of mathematical reasoning.  Even sophisticated logical systems, like mathematics, are by their nature “incomplete,” and thereby contain “statements and propositions that cannot be judged true or false within the logic of the system” (p. 285).  Thus, even with mathematics, one has to look beyond the logic of math for answers to complex problems, and especially for how to apply one’s conclusions in a practical and useful way.

The math classes at UT Dallas were utterly useless to anyone seeking to one day understand and solve real world problems.  Those students who did well in these classes demonstrated only the useless ability to play stupid, small-world math games with clearly defined solutions.  This is an utterly useless skill in the real world, especially to professionals who hope to manage organizations in order to produce productive results.

Playing School: Dumbed-Down Curriculum, Unresponsive Instructors, & Passive Students

A couple of years ago, I was looking at online courses in order to research ideas for the non-profit organization that I was running.  I came across a new breed of online MBA programs, which offered certification in core subjects for low prices.  Some programs were even free because they sold graduates to corporate head-hunters for high fees. 

One of these programs was called Smartly, now called Quantic, which is run by the business Pedago.  It was a very well-designed web course that was organized around core-subject modules.  I did several of the modules and liked the user experience a lot.  It gave me great ideas about how to create something similar for my non-profit, if I was ever able to raise the funds for an IT expansion of the organization’s core services.

But taking those modules also got me to thinking about the meaning of “higher education,” and the idea of educational quality for MBA programs, and other graduate degrees.  While the Smartly website was certainly cheap (it was free), well designed, up to date, and easy to use, it also was a hollow academic shell that did not adequately reproduce the fullness of what “higher education” is supposed to be about.  It was just a ritualized academic exercise that led to a meaningless certification. 

I thought to myself that if I ever went back to school for an MBA, I would choose a traditional program at a university because I wanted to get an actual education.

But high-quality, traditional MBA programs are very, very expensive.  Unless you are willing to pay over $100,000 a year to earn a degree at a private, Ivy League school or at a flagship state university, there is really no way to get anything resembling quality higher education.  I really wanted to enroll at Rice University, but the program would have cost over $120,000, plus living expenses and travel to Houston for two years.  Few people can afford that.  I certainly couldn’t.

Plus, I was enrolling during a pandemic, so I knew that I had to take a web program, for obvious health reasons.  I didn’t want to waste money on an expensive degree from a good school because I thought all programs would be of roughly the same low quality, given the online experience.

I knew the sordid truth about undergraduate education at state schools and community colleges across the U.S., having been an undergraduate instructor in higher education for twenty years and a researcher in higher education.  But I was absolutely shocked at how bad graduate schools had become when I enrolled at the University of Texas at Dallas. 

I wasn’t getting “higher education.”  I wasn’t getting any education.  I might as well have enrolled in Trump University.  Seriously, it was that bad.

John Cassidy at The New Yorker famously warned his readers that Trump University was “worse than you think.”  Well, the same could be said about The Naveen Jindal School of Management at the University of Texas at Dallas.  It has been a horrible educational experience, as I’ve already discussed.  But it’s worse than you think.

Let’s start with the basics.

All business schools use the “case study” as the primary curricular and pedagogical method to teach students about business practices.  James G. March was a professor of management, education, sociology, and political science at Stanford University.  In his book The Ambiguities of Experience, March explained the theoretical foundation of management education, which for decades has been based on learning from experience through case studies.  The stories in case studies are supposed to give students a large repertoire of “changing experiences” and “good practices.”  These stories supposedly serve as models to help better inform the future decisions of managers once they are on the job. 

Professors of management Charles A. O’Reilly III and Jeffrey Pfeffer argued in their book Hidden Value: How Great Companies Achieve Extraordinary Results with Ordinary People that the “most effective executive education” happens “not through lectures by professors or other experts but through engaged discussions of examples, typically in the form of a case, in which the interactions among the participants generates a variety of possibilities and perspectives” (p. ix).  In Good Strategy, Bad Strategy: The Difference and Why It Matters, management professor Richard P. Rumelt illustrates the proper practice of the case study method as a tool for discussion and critical examination of complex problems that have no clear solutions.

However, as March explained, this type of pedagogical tool often produces only “low intellect” learning, and it can easily devolve into an empty ritual.  By reading case study stories, students gain little knowledge, and they get no deep understanding of the issue or problem discussed in the article.  Most case studies come in the form of very short articles that have very few details.

Plus, case studies usually offer only the easy to see, surface details that can be easily recollected by biased participants.  These details are organized into a superficial, and often subjective, story about what happened at a business and why.  Most of the time, it’s essentially fancy gossip. 

Henry Mintzberg did a fine job critiquing the case study method in his book Managers Not MBAs: A Hard Look at the Soft Practice of Managing and Management Development (pp. 48-64).  Unfortunately, “the practice of managing cannot be replicated in a classroom the way chemical reactions are replicated in a laboratory,” especially with short and superficial case studies (pp. 53, 59).  Thus, according to management professor Sterling Livingston, by relying on the case study method, “managers are not taught in formal education programs what they most need to know to build successful careers in management” (quoted in Mintzberg, p. 56).

This is why management researchers have warned for at least sixty years that case studies are a poor source of knowledge for training managers.  This was the conclusion of a report published in 1959 by the Carnegie Corporation and the Ford Foundation, entitled The Education of American Businessmen.  This basic conclusion is still true today.

O’Reilly and Pfeffer’s book was based on extensive research, and each chapter, focusing on a different company, was relatively in-depth, but at The Naveen Jindal School of Management we never read a case study of more than 10 pages.  The best case studies are book length, or at least a full chapter in a book.  But such books were never assigned or discussed by UT Dallas professors.  In fact, real research was almost never assigned in any class, only poorly written, highly general textbooks meant for undergraduates.

But I read quite a few books during my MBA, hundreds in fact, often while ignoring the banal trivialities that my professors were talking about during class.  I read books such as The Art of Innovation, about IDEO, and The Toyota Way, which both offered very interesting and useful lessons on the specific ideas, culture, and actions of specific companies.  I learned nothing about management from any professor at The Naveen Jindal School of Management, but I gained amazing knowledge and practical skills from professors like Jeffrey Pfeffer.  I read every book he ever wrote, and I am indebted and grateful for his knowledge and wisdom.

When we were assigned case studies at The Naveen Jindal School of Management, they were little more than short, ambiguous stories, like fiction, which we were supposed to subjectively interpret.  And that is what students did, if they even participated in the exercise, which many did not.  Students said whatever they wanted to say, often in trivial generalizations, with no right or wrong answers.  It was all relative.  And the professor’s grading was completely subjective, often awarding more points for brief bullet-point formatting than for accuracy or substance.

Students’ subjective opinions usually went undiscussed and unchallenged by other students or the professor.  They just floated in the air, or on the web, as so many useless words that were uttered simply to pass the time in a useless ritual.  This meaningless exercise is the very best that The Naveen Jindal School of Management had to offer. 

Little did I know, but I would be looking back on case study readings as the closest I was going to get to authentic learning at UT Dallas. All of my classes were such a waste of time and money.

Most professors didn’t use the case study method.  Most professors didn’t even bother to try to engage students in the classroom.  Most professors at The Naveen Jindal School of Management used boring, pedantic, and pompous lectures as their only classroom tool. 

At UT Dallas, the classroom was simply a place to disseminate out-of-date and useless information in a boring ritual called schooling.  There was no learning of any kind.  There was no activity required of students beyond sitting, watching, listening, and recording.  There was no thinking.

The Naveen Jindal School of Management at UT Dallas is literally stuck in the 19th century.  Things really are that bad.

Horace Mann, a 19th-century Boston school reformer, criticized traditional learning as a useless ritual.  He argued, “hearing recitations from a book is not teaching,” but that is exactly what is still happening at UT Dallas, almost 200 years later.[10] 

In the Naveen Jindal School of Management, professors simply summarize textbooks and present dumbed-down factoids on ugly, cluttered PowerPoints, and students are expected to unquestioningly memorize this information for a standardized test.  Just like in the 19th century, professors at UT Dallas “drone on through dull hours and dreary routine” lecturing “commonplace” information from textbooks and requiring students to answer “in the exact language of the book.”  Classes were a “dull, uninteresting, tiresome place.”[11]  These quotes are from 19th-century reformers critiquing outdated, traditional Bostonian schooling, but they also perfectly describe what is still happening at UT Dallas, and at many other schools, colleges, and universities all over America.

What is the purpose of schooling?  Student learning and development?  Don’t be so foolish.  The purpose of schools was classically captured by a satirical magazine in England during the middle of the 19th century:

“Ram it in, cram it in, -

Children’s heads are hollow!

Slam it in, jam it in, -

Still there’s more to follow…”[12]

Most classes focused exclusively on the professor’s lectures rather than course textbooks.  In many classes, there was either no textbook, or it was not required to read it.  Thus, the name of the game was recitation and memorization.  Every utterance out of the professor’s mouth was supposed to be carefully transcribed so that it could be faithfully recalled on a standardized test.

When textbooks were used, most of the time they were giant, overpriced, badly written, conceptually bland encyclopedias designed for ignorant undergraduates who knew nothing about the topic.  Almost none were graduate-level books.  I had only one class where we used a book by a first-rate scholar in the field.  Another class used an interesting, but faddish book by a management consultant.  The rest of the books were written by hacks and published by predatory textbook companies.

These bloated informational monstrosities were designed for two central purposes.  First, and most obvious, these books are used to price-gouge students.  Monopolistic textbook companies set whatever prices they want, and students have to pay them.  It’s robbery.

Sometimes, professors get in on the racket, like one of my UT Dallas professors who wrote his own textbook.  He told the class we had to buy the most recent, and most expensive, edition of his textbook, otherwise we would miss out on important, “new” information.  That was a bald-faced lie.  He just wanted to pocket more profits.  There is often little that’s new in a “new” edition.  I should know.  I spent many years as a textbook reviewer for all of the major academic publishers.

Large textbooks were also created to make useless, ritualized schooling possible.  Encyclopedic textbooks are the repository of generalized facts, which students have to memorize in order to regurgitate those facts on standardized tests, especially the mindless multiple-choice tests that were used by most professors at UT Dallas.  Standard pedagogy in most American colleges has been reduced to the same mindless rituals that plagued K-12 schooling over a hundred years ago, when classrooms were “characterized by a lifeless and perfunctory study and recitation of assigned textbook materials,” as an education professor explained in 1931.[13]

The Naveen Jindal School of Management did not offer its students any learning or education.  This department offered only useless, ritualized schooling.  I was subjected to an endless series of mindless multiple-choice tests that rewarded senseless memorization of meaningless factoids and were graded by machines.

How is that supposed to prepare future business leaders?  Obviously, it’s not. 

As I already pointed out, this fraudulent institution of “higher education” does not serve the best interests of students or society.  This institution serves itself.

But wait.  It’s worse.

There is an even more brazen and embarrassing smoking gun, which proves that schools like the University of Texas are utterly ignorant of basic educational standards, and unabashedly uncaring about the actual learning of its students.  There was no actual feedback on any school assignment ever.  None.  Not once. 

Yes, there were a couple of short, simplistic comments about grades that were sometimes thrown back at students, but these were nothing more than bureaucratic rationalizations, like “good overview” or “nicely done.” 

One semester I worked with a competent group of students on a long marketing document, and we earned a 95% as a grade.  I didn’t care about grades because I knew they were a subjective sham, but two of my group members were disappointed.  They wanted to know why we lost 5 points.  When the marketing professor was asked, he replied that there was one minor issue with the executive summary, but the issue he addressed was not in his rubric for required parts of the assignment.

Furthermore, taking 5 points off for this very minor issue amounted to taking 50% off for that section of the assignment, which was worth a total of 10 points.  Such grading was logically unjustifiable, but that professor simply didn’t care.  He obviously didn’t want to give us a perfect grade of 100 percent, so he just took 5 points off, probably at random, and he didn’t even think about how or why he graded the way he did until my group sent the email asking for feedback.
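To put the disproportion in plain arithmetic (this is just a restatement of the numbers above, assuming the assignment was graded out of 100 points):

\[
\frac{5}{10} = 50\% \ \text{of the executive summary section}, \qquad \frac{5}{100} = 5\% \ \text{of the overall grade}
\]

A penalty that looks trivial at the level of the whole assignment was, at the section level, a failing mark, with no rubric criterion to justify it.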

But we got lucky with that email.  Many of the professors did not respond to emails, and most did not address, let alone fix, problems.  One Marketing professor responded to only one of five emails that I sent one semester, in which I made requests or asked for clarifications. 

For real learning to take place, not only does there need to be a teacher and a curriculum, both of which were completely absent at The Naveen Jindal School of Management, but the teacher needs to evaluate student learning and give students detailed feedback so that students can learn from their mistakes, try again, and get more feedback to see if the learning has improved.

This never happened, ever, at the state-sponsored diploma mill that I was enrolled in.  There was no feedback because the professors are not teachers.  They don’t care about teaching.  And they don’t care at all about student learning.  It was all just a formulaic, ritualistic charade.

But wait, it gets even worse. 

Surely, as The Naveen Jindal School of Management likes to brag about on its website and in mass emails to the department, all these high-quality professors who publish all this high-quality research must be giving amazingly high-quality lectures of cutting-edge importance, right?  Wrong.

Most professors simply rambled off the cuff about the topic of the day.  There was rarely an actual organized lecture, like you would hear at a scholarly academic conference.  Most professors never actually lectured about course textbooks, and when they did, the information they shared was merely summative.  If you actually read the book, which most students did not, lectures were a complete waste of time.  Often, professors would go off on long tangents, spending 30-45 minutes off topic, discussing famous Super Bowl advertisements or asking students to talk about trips abroad that they had taken.

Several professors, mostly in the Marketing department, used badly designed and cluttered PowerPoint slides with outdated images and old examples.  Why?  Mindless tradition.

The management consultant Martin Lindstrom argues in his new book, The Ministry of Common Sense, that PowerPoint presentations, or “decks,” as they are known in corporate circles, have long been a conventional way to waste time at corporate meetings.  Ignorant managers carry on this largely pointless tradition because they don’t actually know how to lead meetings.

Lindstrom explains, “In workplaces worldwide, employees seem to be in an intense competition to construct the biggest, longest, most graph-filled, diagram-heavy PowerPoint deck possible” (p. 129).  In You’re About To Make a Terrible Mistake, business management professor Olivier Sibony argues that PowerPoint presentations have their “own special way of smothering discussion” because they are used to “hide the weakness of an argument, to distract the audience with visual tricks, and to hold the floor and limit the time available for debate” (p. 215).  In his book What Were They Thinking?, Stanford University Management professor Jeffrey Pfeffer explained that PowerPoints “elevate format over content” and he argued that they should be banned from meetings (p. 167).

These employees clearly learned this relatively useless skill from old-fashioned business schools, like The Naveen Jindal School of Management.  Often, especially in the Marketing department, students were subjected to old and irrelevant factoids, images, and videos from the 1970s and 1980s.  Take, for example, my Advertising and Promotional Strategy class.  It was taught by the “best” teacher in the department, according to student evaluation survey results, and the fawning praise of other professors in the department. 

This supposedly “great” professor used PowerPoint slides that had been created around 2001, just as the worldwide web was taking off.  It was glaringly obvious to anyone with a brain that the professor was two decades behind the times.  Almost everything he talked about was out of date.  Take, for example, the topic of media.  He devoted weeks to TV, radio, magazines, and newspapers, but he didn’t devote any lectures to the web or social media, even though the textbook covered these topics.

These outdated materials were not used logically and historically to bring an extra layer of academic depth to the discussion.  The marketing professors at UTD had no understanding of, or appreciation for, historical analysis.  There was never any discussion of history, or politics, or the social sciences. 

No, these outdated lecture materials were used simply because the professors were too damned lazy to update their presentations.  They had clearly been using the same PowerPoint decks for decades – and many of the corresponding multiple-choice exam questions were also decades old and completely irrelevant.

These marketing professors were simply lazy and incompetent. They knew that no one cared, not the students, and not the administration. 

My economics professor, Peter Lewin, was even worse.  His only interaction with the class was a couple of emails, each sent a few days before one of the three online exams.  I have included two of his emails below. 

Clearly, he did not care enough to even communicate clearly or professionally with the class, but the level of carelessness in his second email was astonishing (appearing first below).  A child could compose a more professional email.  This is one of the best pieces of direct evidence that I can give that displays the utter incompetence and uncaring attitude of most of the professors in the Business School at UT Dallas.  I also included his first email to the class below it, which displays the same carelessness.

Lewin was an extremely arrogant man who was a bad communicator.  He offered students a taped lecture of PowerPoint slides and printed comments below, which he read.  Many sentences on his PowerPoint slides were grammatically incorrect, like his email.  Many of his statements were ambiguous and hard to understand.  Likewise, many questions on his standardized tests were ambiguous and unclear. 

After one test, on which I did poorly, I pointed out five questions where his language and his logic were unclear at best, which led to two or more defensible correct answers, rather than the single supposedly “correct” answer needed to earn the point for the question.

I met with him to discuss these questions.  He was visibly and audibly angry with me.  He made it very clear that nothing was wrong with any of his slides or quiz questions.  The only problem was me.  I was supposedly completely ignorant and he was completely blameless.  In the end, he grudgingly gave me two extra points, admitting, albeit with qualifications, that his language was ambiguous in at least two test questions.

One of the questions he did not give me credit for had four options: three possible answers plus “all of the above.”  Two of the three answers matched the definition of the concept verbatim from his lecture slides, so there were clearly two right answers, which he acknowledged.  Logically, then, choosing only one answer could not be correct, so within the structure of his own multiple-choice question the right answer had to be “all of the above.”  However, he denied that the third answer could be correct, even though it was worded so vaguely that it could be considered correct depending on how you interpreted the statement.  He wouldn’t give me the extra point.  He kept asking why I didn’t just select the two right answers, and I kept telling him, over and over, that the test allowed only ONE selection.  The logical structure of the question gave the student only two real options: choose one answer, or choose all three.  So, if two answers are correct, how are you supposed to answer?  He angrily blamed me for being ignorant and stupid, almost shouting at me.

Clearly this Economics professor was epistemically and psychologically naive, having never heard of basic foundational insights in social psychology established about a half-century ago.  But this dimwitted economics professor was not alone, because almost every professor I dealt with at The Naveen Jindal School of Management suffered from the same ignorance, especially my math and statistics professors.

By the 1970s and 80s, social psychologists had experimentally demonstrated that, “Rather than being retrieved as static units from memory to represent categories, concepts originate in a highly flexible process that retrieves generic and episodic information in working memory” and that there is “a nontrivial degree of instability even in such familiar, often-used categories,” to say nothing of more esoteric and argumentative concepts (Ross & Nisbett, 2011, p. 68).[14]  There is always “substantial variability from one person to another in the meaning even of rather fundamental concepts.  Hence, any two people are likely to interpret the same situation in somewhat different ways” (p. 65). 

Furthermore, most people “do not recognize the inherent variability in our own construal of events; hence we predict our own behavior with too great confidence.  We similarly fail to recognize both the random (or at least unpredictable) differences between our own and others’ construals of events and the systematic, stable differences.  Consequently, we predict other people’s behavior too confidently and, when confronted with surprising behavior on the part of another person, attribute it to extreme personality traits or to motivational differences between ourselves and the other person, rather than recognizing that the other person may simply have been construing the situation differently” (p. 69).

To put it simply: “The same stimulus often can be interpreted in different ways by different people or by the same person in different contexts” (p. 69).  Almost all people “fail to recognize the degree to which their interpretations of the situation are just that – constructions and inferences rather than faithful reflections of some objective and invariant reality” (p. 85).  This fundamental insight makes all multiple-choice tests of questionable validity in principle.  But then again, every true educator knows that multiple-choice tests were never designed based on learning principles or care for the education of students.  These high stakes tests are political instruments designed to sort and rank students, not educate them.

And what about logical and conceptual consistency between classes?  Nope.  Many of the professors at The Naveen Jindal School of Management can’t get their “facts” straight.  In one marketing class, “store brands” and “private labels” were completely different concepts, according to a lecture and a multiple-choice midterm exam.  But in another class, these concepts were considered two ways of stating the same thing.

I missed several points on two exams because of this inconsistency.  At first, I considered these concepts similar, which they are, but got the question wrong on an exam.  The “right” answer was that they were different.  On another exam in a different class, I remembered my punishment on the first exam.  So, I said they were different, only to get that question wrong too.  Now these concepts were considered the same.

What is the true lesson that students take away from the UT Dallas MBA program? Don’t be rational.  Don’t think.  Just blindly memorize whatever you are told to memorize, no matter how badly stated or inconsistent it may be, and then regurgitate that arbitrary information on a ritualized multiple-choice test. 

At The Naveen Jindal School of Management, students are evaluated mostly for their dog-like ability to blindly follow the dictates of authority and memorize whatever they are told to memorize.

Ironically, one of the marketing professors raved to students about a colleague in the department who was supposed to be an outstanding teacher.  What made this professor such a great teacher?  Student evaluations, which are obviously a valid measure of teaching ability.

Since I had done a lot of research on student evaluations as a failed accountability metric, I signed up for several courses with this instructor thinking he was probably just an easy teacher who gave high grades.  Sure enough, he was a lousy teacher, one of the worst teachers I’ve ever had to suffer through.  He rewarded mindless memorization.  That’s all he wanted from students.

Students liked him because all his classes used the same boring formula: pre-packaged lectures, multiple-choice tests largely tailored to the lectures and sample quizzes, and simple group assignments.

However, I suspect that this professor gets the highest student evaluation ratings for another reason.  Students are star struck.  This professor brags about how many times he has been on television as an “expert,” and he shares some of his TV interviews with the class.  Of course, none of the students seemed to know that there is an inverse correlation between media exposure and scholarship: The more a professor appears on television as a media consultant, the lower the quality of his or her scholarship, and the less academic influence he or she has on the disciplinary field.  Most students seemed to equate TV time with brilliance.

I was deeply disturbed by the sloppy, “I don’t care” attitude of most of my professors, especially that sham marketing professor with high student evaluation ratings.  To take another example from my marketing classes, which were embarrassingly low in quality, several multiple-choice questions gave the same answer twice (out of four choices), which goes to show that the professor simply didn’t care.  He couldn’t even be bothered to proofread his exams to find a glaring mistake.  And this problem happened in more than one marketing class with different professors.  In one class, many of the exam questions were repeated.  How does that happen?

Most professors at The Naveen Jindal School of Management used sloppy, unprofessional teaching materials and tests with lots of typos and errors, but the marketing department was the worst.  One marketing professor had many multiple-choice questions on his midterm and final exams with major typos.  Other questions used highly generalized and unclear language, which made them impossible to rationally answer.  Of course, rationally answering questions was not the point of these exams.  You were just supposed to recognize the same language as the lecture slides and retrieve the “correct” answer from memory.

There were also many typos due to poor grammar, which made both questions and answers hard to understand.  One marketing professor had cluttered, poorly written slides with many typos, like “word-of-mouse” instead of “word-of-mouth,” a mistake repeated in the same lecture on different slides.  That same professor had slides discussing market demographics that listed the baby boomers’ age range as 55-75.  But later, on the midterm exam, a test question wanted you to choose 45-64 as the “correct” age range for baby boomers.  The other key words in the exam sentence matched the professor’s PowerPoint slide, so clearly the professor had updated his “facts” in his lecture, but not on his test.

In many classes at The Naveen Jindal School of Management, all we got were outdated questions on exams corresponding to outdated lectures.  One question on a marketing exam asked which company had the top brand rating in 2005 (and again in 2007), which was useless trivia on an exam in 2021.  A bunch of questions on a marketing exam used silly acronyms, like POP, instead of the actual names of concepts, presumably because the professor was too lazy to type out the full names.

Professors expected students to memorize lists of highly general and arbitrary information, sometimes in order, like arbitrary words for the perceptual process.  These words were not standard conceptual labels in the field of psychology, just a professor’s pet labels, which he wanted students to memorize.  Lectures were nothing more than cluttered PowerPoint slides bloated with generalized concepts and formulaic definitions listed in bullet form.  Listen, read, memorize, regurgitate.  Repeat process.  Good dog!

In one of my marketing classes, the one with the supposedly best teacher in the department, the professor didn’t even bother to write exam questions.  With every group assignment, he asked students to submit five multiple-choice questions.  Thus, he would collect about a hundred student questions in every class.  Why write your own exams when you can get your students to do it for you?

His exams consisted of these badly worded, and sometimes nonsensical, student-written questions.  Because he was so lazy, and cared so little about education, he would re-use many of the exact same questions from the midterm test on the final exam.  And whether it was laziness or simple carelessness, many of the final exam questions were duplicated within the exam itself.  You read that right: he asked students the exact same question twice on the same exam, and not just once but six or seven times.

Often, the concepts that we memorized were so highly general as to be meaningless, not to mention inaccurate.  For example, one marketing professor stated, “consumers experience satisfaction or dissatisfaction.”  Not only is this useless, common-sense trivia, of course consumers are either happy or not happy, but it is also false in its oversimplification.  These two reactions are not the only emotional states that consumers can have toward a product or service.  There is actually a complex range of responses that consumers can have.  Here is another example from an exam:

Which of the following is not one of the sociological variables used to explain how families function?

a. Cohesion
b. Adaptability
c. Communication
d. Structure
e. All of the above are sociological variables that can be used to explain how families function.

Let’s just move past the grammatical monstrosity of a question with the spelling error.  That’s just par for the course.  Instead, let’s look at the substance of the question. 

Now I have studied sociology extensively, and all four of these concepts are used by sociologists to study all types of social phenomena, including families.  But “all of the above” was the wrong answer.  The “correct” answer was “structure,” which is false and nonsensical, as social structure is one of the most important and central concepts in sociology.  All sociologists study the structure of families to understand the function of families.  But when you look at the professor’s PowerPoint slide, that word is NOT on the slide, so in his logic it must be a “wrong” answer.

Oh, by the way, did you know that a morpheme is the smallest linguistic unit that has meaning?  Few do.  Relevance?  None.  Usefulness?  None.  What’s the point?  Don’t ask such silly questions.  Just memorize whatever factoids you are told to memorize. 

And just to remind you, this was all from the HIGHEST RATED professor in the department based on student evaluation surveys.  This just goes to show you how utterly flawed and useless student surveys really are, which I discuss at length in my new book, The Myths of Measurement and Meritocracy: Why Accountability Metrics in Higher Education are Unfair and Increase Inequality.

As you can clearly see, knowledge, reason, and critical thinking are the enemies of success at The Naveen Jindal School of Management and UT Dallas.  Don’t know.  Don’t think.  Don’t question.  Just listen, memorize, and regurgitate on a standardized test.

In this school, students are just supposed to memorize whatever meaningless generalizations the professor puts on a PowerPoint.  What are the core skills being taught to students, other than memorization?  University of Texas at Dallas students have been carefully trained to select the “correct” answer on a multiple-choice exam.

And I use the word “teach” very loosely because only one or two professors (out of over 20) actually “taught” students in any way, shape, or form. 

Most professors did only two activities.  First, they talked at students, but none of these professors had anything to say that was not already said in the textbook. 

Second, some professors quickly assigned numerical grades to assignments.  They did not really read or assess assignments, they just quickly attached a subjective number.  However, I said “some” professors on purpose, because most professors had a graduate student Teaching Assistant who did all the work for the class.

Finally, I want to briefly mention the curious phenomenon of “teamwork” at the University of Texas at Dallas.  Almost every assignment in almost every class, besides exams, was a team assignment.  There were almost no individual assignments.

Working in teams can be very valuable, and under the right circumstances, a team can be more productive, and produce higher quality work, than an individual working alone.  But, under the wrong conditions, a team produces little and the quality of the work can be very low.  Even worse, one or two people can end up doing all the work, placing unnecessary stress on higher performing students.  Furthermore, productive, high-performing groups do not simply “happen”; they must be created and nurtured over time with the proper resources and environmental conditions.

Psychologist Benjamin Schneider has argued that “people make the place,” by which he meant that the abilities, personalities, and motivation of people largely determine the success of an organization or group (qtd. in Pfeffer & Sutton, 2006, p. 86).  Thus, to achieve capable working groups who will successfully reach goals, you need to have the best people with quality resources. 

There is some empirical research that backs up this notion (p. 87).  Michael Schrage, a research fellow at MIT, expressed this notion slightly differently: “A collaboration of incompetents, no matter how diligent or well-meaning, cannot be successful” (qtd. in Pfeffer & Sutton, 2006, p. 90).  Tom Kelley, one of the founders and the general manager of IDEO, argued in his book The Art of Innovation that team members have to be “selected for ability” in order to maximize the contribution of each member to the group.  Kelley also argued that group members need to have the freedom and autonomy to choose their groups, projects, and roles (p. 75).

However, Jeffrey Pfeffer and Robert I. Sutton, management professors at Stanford, argued that the natural talent argument for success is only half-true because talent is not a fixed quality. They explained, “Exceptional performance depends heavily on experience and effort.  No matter how gifted (or ordinary) team members are to start out, the more experience they have working together, the better their teams do…Experienced teams perform better, because over time members come to trust each other more, communicate more effectively, and learn to blend each other’s diverse skills, strengths, and weaknesses” (p. 94). 

Pfeffer and Sutton also point out that “people’s performance depends on the resources they have to work with, including the help they get from colleagues, and the infrastructure that supports their work” (p. 96).  In short, as Tom Kelley explained, “there’s an art to putting teams together” (p. 83).  Successful teams don’t simply happen.  They must be created and nurtured. 

That is why a lot of teams are not only ineffective, as political scientist Cass R. Sunstein and psychologist Reid Hastie (2015) have pointed out, but also much worse at making decisions than individuals.  Groups often fail to correct and thereby “amplify” the mistakes of each member, creating a worse outcome than if people worked alone (p. 13).

At UT Dallas, every group that I was assigned to was both inefficient and ineffective.  I would have accomplished a lot more, with higher quality and in less time, if I had been allowed to work on my own.

None of the professors at The Naveen Jindal School of Management had ever heard of the art and craft of forming teams and leading effective teamwork.  Never did any professor discuss the principles and practices that make teams effective, or the pitfalls to avoid.  The business school professors at UT Dallas simply took random people and threw them together into groups.

And not once did any professor ever monitor or evaluate the performance of any team.  They didn’t care.  Professors simply graded the final assignment.  Sometimes teams were asked to evaluate each other with a grade, but this exercise simply rewarded popularity and conformity. 

In most of my classes, there were many group members who were grossly incompetent and lacking in even the basic motivation to contribute anything at all.  In several classes when students were put into working groups, many students said nothing and contributed nothing.  Almost all of the work was done hastily days before an assignment was due. 

In every class, the stronger and more motivated students did most of the work, including all of the scheduling, task assignment, communication, and monitoring of group work.  This made group work much easier than individual work for lower performing students, and it made group work much, much harder for higher performing students.

And why did we do so much group work?  Almost every professor will pay lip-service to the intellectual importance of teams and their practical use in the workplace.  But that is not why groups are used.  That is simply a rationalization.

If professors cared about the concept and practice of teamwork then they would actually devote time and effort to teach the subject.  They would monitor its practice, and evaluate students as they learned to work in groups.  But that NEVER happened.  Not even once.

No, there is only ONE reason why groups are used.  It saves professors time and effort.  Professors are lazy.  Why grade 30 or 60 or 90 assignments when you can quickly grade 5 or 10?  It’s simple, self-serving math.  Again, at The Naveen Jindal School of Management, the institution serves itself, not students or society.

In closing, here is the most important lesson I learned at business school: I could have gotten the same information by reading books at a fraction of the cost, about $1000 instead of over $40,000 – and I wouldn’t have been subjected to meaningless, subjective numbers that supposedly judged my “learning” or “knowledge.”  Plus, I would have been able to learn useful information that I wanted to learn.

As a customer, I was very “dissatisfied” by this dumbed-down curriculum delivered by know-nothing, lazy instructors.

In fact, I was insulted.  This wasn’t an education.  It was a meaningless institutional ritual. 

Or worse, it was fraud. 

I was spending over $40,000 to be force-fed bullshit, questionable generalizations, and falsehoods by professors who didn’t give a damn about my learning or about providing students with a useful education.

I know I shouldn’t have, but I took all of this personally. 

I love learning, and I care a great deal about teaching and the process of education.  Thus, my consumer experience at The Naveen Jindal School of Management and UT Dallas was a mix of rage, disgust, and disappointment.

I also experienced a great deal of apathy.  I hated the fraudulent classroom experience.  I was utterly demoralized because there was nothing that I could do to change it, other than document my experience by writing this essay.

But it wasn’t all bad.  As a lifelong learner, I can honestly say that I learned a great deal during my tragic experience at UT Dallas.  I learned all that I have put into this essay, which hopefully will help others make better decisions about graduate school one day.  But I also learned a great deal about business management, marketing, and economics – from the several hundred interesting books I read while ignoring my ignorant professors.  I gave myself the education I wanted and deserved, as I have always done for decades. 

The diploma I earned from UT Dallas signified useless ritualized bullshit.  My real learning was never acknowledged by any institution or solemnized by any piece of paper.  Such is the tragedy of playing school, as I have written about in my new books.

Tradition or Transformation?

Business management consultants B. Joseph Pine II and James H. Gilmore published a groundbreaking book in 1999 with Harvard Business Review Press, which was called The Experience Economy.  They argued that advanced economies in the 21st century would be defined by “experiences,” rather than commodities, goods, or services. 

In the 21st century, consumers want a new level of value: rich experiences that transform them into new and better human beings (p. 268).  Pine and Gilmore explained, “The experiences we have affect who we are, what we can accomplish, and where we are going, and we increasingly ask companies to stage experiences that change us” (p. 242), although this business model has been the modus operandi of educational institutions for thousands of years.

Good ideas take a while to break into the C-suite, or the business school classroom.

When businesses focus on experiences and transformations, the customer “is the product,” as they are telling a business, “Change me” (p. 255).  The “transformation economy,” that Pine and Gilmore describe, as any educator would recognize, is the learning economy or the education economy, which was first systematically described by the philosopher John Dewey over a century ago.  Good ideas take a while to break into the consulting business as well.

For thousands of years, educators have been in the business of educating and transforming their students into capable adults who had the knowledge and the skills to survive and thrive.  But you don’t often find education in most schools, as I’ve written about extensively, especially in my two most recent books.

While Pine and Gilmore focused on heralding the new experience economy, they also warned about complacent businesses that do not care about consumers or their needs.  These traditional, old-school businesses operate with “an attitude of ‘They won’t mind,’” which leads “inevitably to operational practices replete with customer sacrifice” (p. 123). 

The Naveen Jindal School of Management at the University of Texas at Dallas is one of those thoughtless, traditional businesses that deliver sub-standard, flawed products because they simply don’t care about their customers.  The professors at this school also know that the largely ignorant consumers who buy their shoddy product “won’t mind” because they can’t recognize the low quality of the services being offered, or at least are too powerless and afraid to complain because they know their professors will retaliate with lower grades.

It's interesting that Pine and Gilmore should quote at length an article by John Quelch, a former Dean at the London Business School, who wrote,

“We’re not in the education business.  We’re in the transformation business.  We expect everyone who participates in a program at the London School of Business – whether it’s for three days or for two years – to be transformed by the experience.  We want people to look back on their time here as something that significantly influenced their career and possibly their entire life…everyone here – from the custodians to deputy deans – has become much more motivated.  People are eager to take part in having an impact on the students who come here” (qtd. in Pine & Gilmore, 1999, p. 248).

Pine and Gilmore argue that businesses and organizations that focus on the transformation of customers, not just on selling a product or service, require leaders who are willing to “sacrifice their own needs in favor of the employees,” and also employees who are willing to “sacrifice their needs in order to eliminate the sacrifice of the customers” (p. 269).  A transformation-focused organization seeks to offer customized and “truly engaging” experiences for customers, who get a “one-to-one relationship” with the organization (p. 285).

Now I agree that this all sounds quite idealistic.  However, an ideal, as the educational philosopher John Dewey once pointed out, is more of an orientation, not a destination.  An ideal is a direction one walks toward, like north, but never reaches.

Thus, the truly engaging, experiential organization that offers life-changing transformation is an idealization that is more fiction than fact.  But it is a worthy goal to strive for, especially for educational organizations who should be in the business of establishing personal relationships with students and seeking to transform their lives.

It’s sad that there are so many schools like The Naveen Jindal School of Management at the University of Texas at Dallas, which offer an out-of-date, depersonalized, and largely useless curriculum.  At UT Dallas, factoids are pushed into the passive minds of sheep who simply follow commands and do what they are told by arrogant professors who couldn’t care less about teaching, let alone transforming, students.

One of the best business management books ever written was published in 2006 by Stanford professors Jeffrey Pfeffer and Robert I. Sutton.  It was called Hard Facts, Dangerous Half-Truths, and Total Nonsense: Profiting from Evidence-Based Management.  It speaks volumes that at The Naveen Jindal School of Management at the University of Texas at Dallas this book was never even mentioned in any class, let alone assigned as a textbook.

That is because schools like The Naveen Jindal School of Management and UT Dallas, and other schools I’ve taught at in Texas, don’t care about evidence, best practices, or even basic levels of educational professionalism.  These schools are all mindless, rigid, soulless, bureaucratic machines that offer a single, one-size-fits-all mold that stamps out credit hours, course grades, and diplomas.

No learning required.  No growth.  No transformation. 

In The Knowing-Doing Gap, Stanford management professors Jeffrey Pfeffer and Robert I. Sutton argued that real, useful knowledge is “acquired from learning by doing” rather than from “learning by reading, listening, or even thinking” (p. 6).  Most schools, including most institutions of higher education, don’t know how to produce real learning.  They can only produce the fake, superficial type of knowledge that is easily memorized and regurgitated on a test and then is forgotten not long after.  This is a useless, trivial form of knowledge, if it can even be called knowledge.

Can the Naveen Jindal School of Management at the University of Texas at Dallas, or any of the other schools mentioned above, be reformed into an educational institution that seeks to actually educate and transform students? 

Not likely. 

Not now, anyway.  Probably not ever.

Why not?

Because, as Jeffrey Pfeffer and Robert I. Sutton argued in The Knowing-Doing Gap, almost all of the professors and all of the administrators at the Naveen Jindal School of Management suffer from the delusion that “talking about something” is “equivalent to actually doing something.”  The essence of ineffectual schooling is simply talk, talk, talk and no actual doing of anything useful or important (p. 48).

No one should pay $40,000 or $200,000 for empty, useless talk from a bunch of arrogant hypocrites who do not, and probably cannot, practice what they preach.

Can We Learn from Failure?

Most organizations also suffer from the “smart talk trap,” as Pfeffer and Sutton documented.  Universities, in particular, suffer from this dreaded disease.  Almost all professors simply talk and talk and talk and assume that somehow all the talk seeps into students’ brains.  Professors also assume that all that talk is understood and remembered by students.  And further, somehow all that talk enables students to actually do something useful, especially something useful on a job.

All of these assumptions are false.  Smart talk doesn’t help anyone do anything useful, other than puff up the ego of the person doing all the talking.  Nobody is really listening or knows what to do with all that talk.

But hey, that’s what the research shows.  But who actually cares about research and best practices?  Certainly not most university faculty when it comes to teaching, or university administrators when it comes to fostering high-quality educational practices.

As Pfeffer and Sutton pointed out back in 2006, “many companies and leaders show little interest in subjecting their business practices and decisions to the same scientific rigor they would use for technical or medical issues” (p. 12).  Instead, most organizational leaders, including university presidents, deans, and faculty, simply act on “beliefs rooted in ideology or in cultural values,” which “resist disconfirming evidence and persist in affecting judgments and choice, regardless of whether or not they are true” (p. 12).

I’ve never seen, and rarely heard about, an educational leader at a university, college, or community college who cared about evidence and best practices, and who was committed to institutional change and actually serving students.  Some certainly talked the talk, but none cared or dared to do anything substantial, partly because the personal, institutional, political, and economic costs are simply too great, and the consequences too uncertain.

Jeremy Rifkin is a bestselling author and Lecturer at the Wharton School’s Executive Education Program.  In his 2014 book, The Zero Marginal Cost Society, he argued that technological advances were causing “the pedagogy of learning” to undergo “a radical overhaul.”  Rifkin criticized traditional forms of modern schooling for transforming the school classroom into “a microcosm of the factory” with “authoritarian, top-down models of instruction.” 

He explained, “Students were thought of as analogous to machines.  They were conditioned to follow commands, learn by repetition, and perform efficiently.  The teacher was akin to a factory foreman, handing out standardized assignments that required set answers in a given time frame.  Learning was compartmentalized into isolated silos” (pp. 133-34).  Rifkin naively believed that technology was ushering in an educational utopia, largely through the development of Massive Open Online Courses (MOOCs), which Rifkin thought were going to revolutionize education.

But MOOCs didn’t revolutionize schooling.  In fact, as educational historians have pointed out for over half a century, nothing has ever revolutionized schooling, as I have discussed in a pair of recent books.  In the 21st century, schools operate pretty much like they did in the 19th century, if not the 17th century.

Rifkin’s critical summary of old-fashioned 19th century schooling is still a valid description of what goes on in most schools in most countries in the year 2021, two decades into the 21st century.  And most likely, the factory model of authoritarian schooling will be going strong in the 22nd century too.

The Naveen Jindal School of Management at the University of Texas at Dallas still operates based on this outdated 19th-century authoritarian model, as do most graduate schools around the world.  The Naveen Jindal School of Management delivers a traditional, useless, formulaic, and ritualistic form of schooling that promotes conformity, mindless obedience to authority, and the ranking of students based on jumping through arbitrary academic hoops.

But that doesn’t mean we can’t collectively learn from the failures of The Naveen Jindal School of Management.  That is my hope at least. 

Professor of civil engineering and historian Henry Petroski explained in his book Success through Failure that “The most successful improvements ultimately are those that focus on the limitations - on the failures...Though a focus on failure can lead to success, too great a reliance on successful precedents can lead to failure” (p. 3).  In You’re About To Make a Terrible Mistake, business management professor Olivier Sibony has argued that professionals should study “worst practices” rather than “best practices” so as to better understand why most organizations fail.  When people focus on winners, they neglect the majority of organizations “who took the same risks, adopted the same behaviors” but still failed (p. 46).

Most business school professors would be doing their students a big favor if they focused more on failure, especially the failure of students to learn what they are being asked to learn in school.  Failure is the best teacher, and learning through failure imparts the most lasting lessons.

Petroski argues that truly innovative people “see failures where most of us see only successes.  These are the inventors, the engineers, the designers of the world, who are forever trying to improve it through the things in it…a failure of any kind is not so much a disappointment as an opportunity.  They tweak the things we know to turn them into things we did not even know that we needed…They recognize that a failure not only provides them the opportunity to carry out the process of design and development anew but also enables them to conceive of something new and improved to obviate the triggering failure” (pp. 62-63).

The Naveen Jindal School of Management is a failure because it fails to educate its students in any meaningful and practical way.  It also fails society by depriving the business community of truly educated and empowered entrepreneurs and managers. 

But most other business schools are the same kind of sinking ship, as management professor Henry Mintzberg demonstrated almost twenty years ago in his book Managers Not MBAs: A Hard Look at the Soft Practice of Managing and Management Development.  Most business schools, as Mintzberg pointed out, offer an “IKEA model” of education: “The schools supply the pieces, neatly cut to size; the students do the assembly.  Unfortunately, the schools don’t supply instructions.  Worse still, the pieces don’t fit together” (p. 37).  And in the case of schools like The Naveen Jindal School of Management, it gets even worse: The pieces are cheap, broken, and made from generic, substandard materials.  The University of Texas at Dallas sells counterfeit goods.

I learned a great deal while I was enrolled in this terrible school, just not very much in the classroom.  Groucho Marx once quipped, “I find television very educating.  Every time somebody turns on the set, I go into the other room and read a book.”  When I was enrolled at the Naveen Jindal School of Management, I spent most of my class time ignoring my professors and instead reading the articles and books of more knowledgeable and competent scholars, like W. Edwards Deming, Jeffrey Pfeffer, Robert Sutton, Peter M. Senge, Wayne Baker, and Amy C. Edmondson, to name just a few.

I learned relatively little, often nothing, in most of my classes.  Much of the information that I was forced to memorize short-term for tests was meaningless, useless, and, sometimes, utter nonsense.  Almost nothing of value came from my professors or my classes, although a couple of the textbooks had valid information that was useful.

To be honest, I had a better education in business management reading The Economist magazine for 25 years than earning my MBA at the University of Texas at Dallas. 

Herbert Simon was a Nobel Prize winning economist and professor of sociology and business management.  He wrote in his memoirs, Models of My Life, “Anything that can be learned by a normal American adult on a trip to a foreign country (of less than one year’s duration) can be learned more quickly, cheaply, and easily by visiting the San Diego Public Library.”

I would modify Simon’s assessment for those of you thinking about going to graduate school, especially for an MBA.  You can get a better education more quickly, cheaply, and easily by reading the right books and articles by the leading scholars, and by reading The Economist magazine every week.

The Naveen Jindal School of Management and its faculty do not care about students or student learning because the school does not care about education or teaching.  This organization does not care about the sacred, transformational process of learning, nor does it care about the powerful impact a real educational experience can have on students and the world at large. 

At my core, I am an unrepentant idealist who cares a great deal about teaching, learning, the value of education, and contributing positively to the world.  Thus, I have been utterly demoralized while I’ve been at The Naveen Jindal School of Management.  It is a terrible school.

I hope that some members of this school will read this essay, pause, and think deeply about why they work in the field of education.  I hope that some faculty will reflect on what they hope to accomplish besides earning a paycheck and publishing academic articles that few will ever read, and that fewer people will ever use in any productive endeavor. 

Failure is a powerful teacher, if there is an openness and willingness to learn.  I hope that the professors of this school can learn from their failure and do better.

But I doubt it.  I seriously doubt any change will happen at this institution.  Here’s why.

Entrepreneur and venture capitalist Ben Horowitz has talked about the many differences between good organizations and bad organizations.  One of the key distinctions is the ability of management to listen to front-line workers and customers in order to recognize and learn from organizational failures.  Bad organizations not only fail to recognize or learn from failures, they usually ignore all criticisms and pretend nothing is wrong.

In The Hard Thing about Hard Things: Building a Business When There Are No Easy Answers, Ben Horowitz explains, “To make it all much worse and rub salt in the wound, when they finally work up the courage to tell management how fucked-up their situation is, management denies there is a problem, then defends the status quo, then ignores the problem” (p. 101). 

I expect my report on the failures of The Naveen Jindal School of Management to meet a similar fate.  Most likely, my concerns will be completely ignored, and new generations of students will have to suffer like I have suffered, passing through more useless rituals of schooling without any real benefits.

Sara and Jack Gorman wrote a great book, Denying to the Grave: Why We Ignore the Facts that Will Save Us, which helps to explain why both individuals and organizations, like the Naveen Jindal School of Management, will never change, no matter what the facts are. 

For one, understanding the facts about what is really going on rarely changes anyone’s mind.  “Irrational behavior occurs even when we know and understand all the facts,” Gorman and Gorman explained (p. 6).  While I know that most professors in the department have a “knowledge deficit” about teaching and learning, the root of the problem is that faculty just don’t care to “put in the time and effort” (p. 13) to educate themselves about the proper way to educate their students, let alone spend the countless hours needed to engage in educational practices both inside and outside of the classroom.

More importantly, many faculty either don’t believe that teaching and student learning are important, or they believe in false and outdated traditional myths about schooling, or some combination of the two.  Overcoming divergent values, apathy, and engrained myths is very hard, and often impossible.  As Gorman and Gorman wrote, “at present we are not certain how best to convince people that there are scientific facts involved…and that ignoring them has terrible consequences” (p. 26).

Management professor Richard P. Rumelt wrote a wonderful book called Good Strategy, Bad Strategy: The Difference and Why It Matters, which was one of the best management books that I’ve ever read (and never once mentioned by any professor in The Naveen Jindal School of Management).  He explained how most organizations have no focus and no strategy, which is why so many organizations fail to be productive and achieve objectives.  Instead of creating a clear, coordinated, and logical strategy, most organizations operate based on “ritualized formalism,” which achieves little (p. 43).

From what I experienced, The Naveen Jindal School of Management is clearly a failed organization that is wholly based on ritualized formalism, which is utterly useless to students.  It is the exact opposite of a successful organization. 

Ironically, I can remember an accounting class that I had at UT Dallas, which actually turned out to require a bit more independent and critical thought than almost any other class I took.  We read about a failed company called MiniScribe, which was led by a blustering idiot who forced his staff to memorize his silly management philosophy and ruled by naked authority and fear, which caused his staff to frequently inflate sales numbers and cook the books – which eventually led to the company’s downfall.

These are exactly the same failed management practices demonstrated by almost every professor and dean at The Naveen Jindal School of Management, except that at UT Dallas there are no independent auditors and there is no market accountability, so nothing will ever change or get better at this failed organization, just as at so many other educational institutions and non-profits with no customer responsiveness and little external oversight.

Richard P. Rumelt sardonically explained that “business schools teach strategy but rarely apply the concept to themselves” (p. 112).  You could extend this insightful analysis and say that business schools, and many other professional degree programs in every university, teach important concepts and practices, but they rarely actually practice what they preach, and they never actually check to see if their verbiage ever translates into actual results that make the world a better place.

I believe that is called hypocrisy, or fraud. 

Both are endemic to higher education in America, as well as to many non-profit organizations and for-profit firms.  As many educational and business management professors have documented, there is so much useless bullshit at the core of both academia and business.

I contacted several journalists to see about publishing this story.  None of them were interested, in part because none of them saw what I described as a problem.  It was just reality, or rather a blizzard of subjective reality.  One reporter at the Washington Post suggested that there was no way to objectively gauge the “quality of learning” in college and that “different people” would find different experiences acceptable.  This reporter also suggested that most students “don’t have the time or will to investigate” university programs; thus, nobody will care about my warnings, and they probably will not deter any students from enrolling in The Naveen Jindal School of Management.

This reporter’s simple and naive solution: transfer to another school.

I looked into transferring to another MBA program, but it is much harder to transfer as a graduate student, and often it is impossible.  Many programs do not accept transfers, especially the best programs.  And when you do change from one graduate program to another, few of your credits will actually transfer, so you are basically starting from scratch, a very costly proposition in terms of time and money, which is not a good option for most people.

While my MBA experience was bad, I wasn’t all that surprised.

Unfortunately, poor quality and the lack of authentic educational value extend throughout higher education.  What I found in the business school of the University of Texas at Dallas can be seen across higher education at large.  I’ve taught at top-level research universities, non-selective state universities, small liberal arts colleges, and community colleges.  I saw the same dumbed-down, teach-to-the-test curriculum and the cynical practice of playing school at every school I worked at, except for one, a very expensive liberal arts college.  There is also a large body of research on the lack of “higher education” in institutions of higher education, including Harvard business professor Rakesh Khurana’s award-winning book From Higher Aims to Hired Hands: The Social Transformation of American Business Schools and the Unfulfilled Promise of Management as a Profession.

Donald Levine, former dean of the College at the University of Chicago, once lamented, “The scandal of higher education in our time is that so little attention gets paid, in institutions that claim to provide an education, to what it is that college educators claim to be providing.” 

I’ve spent 11 years as a college student, and over 20 years as a lecturer in higher education, and I can tell you that most colleges don’t provide much of value to students, especially in terms of learning practical knowledge and useful skills.  Most colleges are good at giving students fun social experiences, like football games and parties.  But I’ve rarely seen an institution of higher education that actually cared about students’ learning or goals, let alone giving them a “higher education” that will improve their lives.  This is why some critics of higher education, like Kevin Carey at New America, have described college as a “scam,” especially expensive master’s degree programs, which are rarely connected to actual jobs.[15]

So, if you’re interested in real learning, instead of looking at rankings, do your due diligence. We live in a twisted world where students have to “search for evidence” on institutions of higher education in order to keep from “getting ripped off” by fraudulent scams,[16] like the MBA program at The Naveen Jindal School of Management. 

So here is some advice.  Visit the school.  Talk to professors and current students.  Ask them about the quality of learning in the classrooms.  And dig into the research of professors to find ones who write about what you want to learn.  You can tell a lot about the quality of a teacher by reading their work and listening to them talk. 

You might also read about the pitfalls of grad school from other sources, especially Henry Mintzberg’s book Managers Not MBAs: A Hard Look at the Soft Practice of Managing and Management Development.  He not only criticizes MBA programs, but also argues that they are counterproductive for businesses because they produce ignorant and ineffective “managers” who can’t actually manage well.  In The Knowing-Doing Gap, Stanford management professors Jeffrey Pfeffer and Robert I. Sutton argued over 20 years ago that “There is little evidence that being staffed with people who have an advanced education in business is consistently related to outstanding organizational performance” (p. 3).

And above all else, understand the true purpose of higher education in the 21st century, which I explained in my recent book, The Myths of Measurement and Meritocracy: Why Accountability Metrics in Higher Education Are Unfair and Increase Inequality.  You go to college to buy a credential, not to earn an education.  In fact, Mintzberg discussed this issue twenty years before I did in his Managers Not MBAs, and the sociologist Randall Collins documented this phenomenon over forty years ago in his book The Credential Society.

An MBA, as Mintzberg pointed out, “is not a process of educating so much as a method of screening.  The MBA is a convenient credential to justify hiring choices” (p. 83).  Business schools are just “expensive employment agencies,” according to economist Samuelson (quoted in Mintzberg, p. 83).  Business schools are like “bottling plants,” business professor Richard West explained, where the “product is about 90% done before we ever get it.  We put it in a bottle and we label it” (quoted in Mintzberg, p. 83).  This is why MBA graduates get “little value” from their degree, which is mostly a “selection mechanism” for the employment market rather than a badge of human capital development.[17]

And even worse, as business professors Jeffrey Pfeffer and Christina Fong demonstrated in their 2002 journal article “The End of Business Schools? Less Success Than Meets the Eye,” most MBA graduates don’t get rewarded in the labor market when they graduate with their fancy certified label.  There are almost no economic gains for most students.  Only the few students who manage to graduate from elite, top-ranked programs see improved labor market returns. 

Most students graduate with no real knowledge, no real skills, and a lot of debt.

I paid about $45,000 for my credential.  I now have my MBA.  It’s official.

But as for my real business management education?  It cost only about $2,500 for the books I bought on Amazon, and the time I spent reading them. 

 

 NOTES:

[1] https://news.utdallas.edu/campus-community/jsom-bloomberg-rankings-2021/; https://www.bloomberg.com/business-schools/

 

[2] https://news.utdallas.edu/campus-community/best-value-princeton-review-2021/

 

[3] https://www.usnews.com/best-colleges/rankings/national-universities/best-value?schoolName=University+of+Texas+at+Dallas&_mode=table

 

[4] Diep, F., & Gluckman, N.  (2021, Sept 13).  Colleges still obsess over national rankings.  For proof, look at their strategic plans.  The Chronicle of Higher Education.  Retrieved from www.chronicle.com

 

[5] Chandler, D. L.  (2021, July 16).  “Malcolm Gladwell Examines Why HBCUs Score So Low in U.S. News & World Report College Rankings.”  The Grio.  Retrieved from www.thegrio.com.  To listen to the podcast see: Gladwell, Malcolm.  (2021).  “Lord of the Rankings.”  Revisionist History.  Retrieved from https://www.pushkin.fm/episode/lord-of-the-rankings/

 

[6] Diver, C.  (2022).  Breaking ranks: How the ranking industry rules higher education and what to do about it.  Baltimore: Johns Hopkins University Press.

 

[7] Svrluga, S.  (2022, July 1).  Columbia to skip U.S. News rankings after professor questioned data.  The Washington Post.  Retrieved from www.washingtonpost.com

 

[8] Carey, K.  (2021, Nov 5).  The college degree is in shambles.  The Chronicle of Higher Education.  Retrieved from www.chronicle.com; Weissmann, J.  (2021, July 16).  Master’s degrees are the second biggest scam in higher education.  Slate.  Retrieved from www.slate.com/business.  See also Bannon, L., & Fuller, A.  (2021, Nov 9).  USC pushed a $115,000 online degree.  Graduates got low salaries, huge debts.  The Wall Street Journal.  Retrieved from www.wsj.com

 

[9] Pfeffer, J., & Fong, C.  (2002).  The end of business schools? Less success than meets the eye.  Academy of Management Learning and Education, 1, 78-95.  See also: Pfeffer, J., & Fong, C.  (2004).  The business school “business”: Some lessons from the U.S. experience.  Journal of Management Studies, 41(8), 1501-1520.

 

[10] Quoted in Reese, W. J.  (2013).  Testing wars in the public schools: A forgotten history.  Cambridge, MA: Harvard University Press, 136, 202.

 

[11] Ibid., 202.

 

[12] Ibid., 202.

 

[13] Kridel, C., & Bullough, Jr., R. V.  (2007).  Stories of the Eight-Year Study: Re-examining Secondary Education in America.  Albany, NY: State University of New York Press, 145.

 

[14] Ross, L., & Nisbett, R. E.  (2011).  The person and the situation: Perspectives of social psychology.  Revised Ed.  London: Pinter & Martin.

 

[15] Carey, K.  (2021, Nov 5).  The college degree is in shambles.  The Chronicle of Higher Education.  Retrieved from www.chronicle.com; Weissmann, J.  (2021, July 16).  Master’s degrees are the second biggest scam in higher education.  Slate.  Retrieved from www.slate.com/business.  See also Bannon, L., & Fuller, A.  (2021, Nov 9).  USC pushed a $115,000 online degree.  Graduates got low salaries, huge debts.  The Wall Street Journal.  Retrieved from www.wsj.com.  See also Beach, J. M.  (2021).  The myths of measurement and meritocracy: Why accountability metrics in higher education are unfair and increase inequality.  Lanham, MD: Rowman & Littlefield.

 

[16] Carey, K.  (2021, Nov 5).  The college degree is in shambles.  The Chronicle of Higher Education.  Retrieved from www.chronicle.com

 

https://www.aacsb.edu/accredited/t/the-university-of-texas-at-dallas

 

[17] Moldoveanu, M. C., & Martin, R. L.  (2008). The future of the MBA: Designing the thinker of the future. Oxford: Oxford University Press, 4.

What is 21st Century Literacy?

Addressing the Knowledge Gap

 

This is the Introduction to my book How Do You Know? The Epistemological Foundations of 21st Century Literacy, which was published in 2017.

 
Schools around the world are failing to prepare students for the social, political, and economic challenges they will face in the 21st century.
— J. M. Beach
 

K-12 schools and colleges around the world are failing to teach basic literacy and critical thinking.  While many people have at least some valid factual knowledge, most lack knowledge about knowledge, specifically metacognitive knowledge, emotional intelligence, and mindware.  Most people also do not understand how knowledge is constructed, evaluated, debated, disseminated, and used, especially by scientists.  In the 21st century, schools need an empirically validated curriculum that teaches students not only how to communicate, but also teaches them practical knowledge and the ability to think rationally, which together would enable students to actively construct, evaluate, communicate, debate, and use knowledge for personal and professional ends. 

What’s the Problem?

Schools around the world are failing to prepare students for the social, political, and economic challenges they will face in the 21st century.  For example, in the United States of America, most high school seniors can’t read, write, or think proficiently.  According to the National Center for Educational Statistics in 2011, only 27 percent of American 12th graders had “advanced” (3%) or “proficient” (24%) writing skills.[1]  In 2015, only 37 percent of American 12th graders were “proficient” or better in reading,[2] and only 22 percent were “proficient” or better in science.[3]  Currently, less than a third of American high school seniors have the foundational skills of reading, writing, critical thinking, and a basic understanding of science, which are prerequisite for success in college and the global labor market.

But K-12 schools in the United States should not be singled out for blame.  Many American students are not learning much in college either.  The majority of students in college are not completing degrees.  Nationally, only 30 percent of American community college students earn a degree, vocational certificate, or transfer to a university, and only half of college and university students earn a bachelor’s degree.[4]  While the majority of students fail to graduate, more than 80 percent of all grades are in the A to B range, signaling massive grade inflation.[5]  But wait, it gets worse.  One study found that around 45 percent of university students had no statistically significant gains in core learning areas over the first two years in college.[6]  Even successful students earning high grades are not learning much because, as educational scholar Ken Bain explains, they just “plug and chug” facts for exams, leading to a “bulimic education” of binging and purging information, gaining little actual knowledge, skills, or personal growth.[7]

Around the world, colleges are distributing credentials for labor markets.  Most do not deliver a real education focused on learning useful knowledge and skills.  As professor of Education David F. Labaree argues, colleges and universities have become businesses “selling” the “commodity” of “credentials to consumers,” an economic mission which “undercuts learning.”[8]  Thus, as Ken Bain explains, many college students play a “strategic grade game”[9] to get high marks and graduate with a degree, rather than earn an education by learning real knowledge and practical skills.  These cynical students “memorize formulae, stick numbers in the right equation or the right vocabulary into a paper, but understand little.”[10]  Bain goes on to add, “When the class is over, they quickly forget much of what they have ‘learned.’”[11] 

One student Bain interviewed lamented the lack of real learning in college: “To this day, I don’t understand that material, but I made A’s…I learned to study in the right way and pass the examinations with flying colors, but I never really learned anything.”[12]  Part of the problem is that many students merely want to “look smart” rather than learn real knowledge because they want to obtain a credential with the least amount of effort.[13]  For these and several other reasons, as political scientist Tom Nichols pointed out in his trenchant critique of higher education, colleges are “failing to provide their students the basic knowledge and skills that form expertise.”[14]

In addition to this lack of knowledge and practical skill, there is also evidence that institutions of higher education are exacerbating the social and economic inequalities that affect students’ educational and labor market success.  Research reveals that students with high socio-economic status demonstrate much more learning in college than low SES students.[15]  Advantaged students go on to earn more after college and they are more likely to end up in both the top 20 percent and top one percent of the income distribution.[16]  And these financially advantaged students are more prepared and motivated to seek out additional education in the future.[17]

Why Aren’t Students Learning?

Why are K-12 schools and colleges not preparing all students for success?  Why aren’t schools giving students an education focused on real knowledge and useful skills?  And what can be done about it?  Part of the problem lies in the inequitable distribution of educational resources, which is tied to historical traditions of socio-political inequality.[18]  The majority of students in every country come from the bottom half of the income distribution.  They do not have access to the social and economic capital they need to be successful in school or the labor market.  These same disadvantaged students also lack educational capital because they do not have access to the best schools with the best teachers, the best support staff, and the best facilities.  Environmental and cultural influences have the greatest effect on developing children, a process starting in the womb.[19]  Lack of early education can create a negative feedback loop, as educationally disadvantaged children grow up to raise more disadvantaged children, in what some scientists have called the “biology of disadvantage.”[20]

As political scientist Robert D. Putnam explains the situation in the United States of America, “Rich Americans and Poor Americans are living, learning, and raising children in increasingly separate and unequal worlds, removing the stepping-stones to upward mobility.”[21]  The same can be said about most countries.  Because of the happenstance of birth, the majority of students on this planet will experience not only fewer educational opportunities, but lower quality educational programs, which contributes to lower levels of educational success, whether measured as knowledge, skills, grades, persistence, or earning educational credentials. 

The problem, however, is not just how students are taught, by whom, and in what facilities; it is also what students are taught in school, and what learning experiences they have while being taught.  This is the educational domain of the curriculum.[22]  Many schools, even some of the best, have deficient curriculums because they are not coherently focused on useful real-world knowledge and critical thinking, especially metacognitive skills, emotional intelligence, and what psychologists call “mindware.”[23]  Many schools offer curriculums based on little more than tradition and common sense, devoid of any theoretical foundations or data to validate their effectiveness.[24] 

 

We Need a Real Education

In the 21st century, schools need an empirically validated curriculum that teaches students not only how to communicate, but also teaches them practical knowledge and the ability to think rationally, which together would enable students to actively construct, evaluate, communicate, debate, and use knowledge for personal and professional ends.[25]  Literacy education and science education need to be blended together into a seamless educational program that teaches students how to know, how to communicate, and how to argue about their knowledge.[26]

A real education should challenge what students believe they know.  Students should question their beliefs in order to gain accurate, rational knowledge through “deep learning”[27] about the world and themselves, including an investigation of their own thinking processes.  Only deep learning enables both reflective thinking and deliberate action.[28]  A real education is different from mere schooling;[29] Ken Bain explains how it enables students to discard faulty beliefs and “build new mental models of reality” that are more accurate and useful.[30] 

But before you can discard faulty beliefs, you first have to recognize and understand your own ignorance and irrationality, which is a difficult and uncomfortable experience.[31]  Most people are blissfully unaware of their ignorance and irrationality, which leads to overconfidence and incompetence.[32]  Cognitive scientists Steven Sloman and Philip Fernbach have shown how “people are more ignorant than they think they are.  We all suffer, to a greater or lesser extent, from an illusion of understanding…when in fact our understanding is meager.”[33]  A real education, Nobel Prize winning psychologist Daniel Kahneman once said, builds rational mindware that enables us to recognize our own ignorance so as to think and build useful knowledge: It’s “knowing what to do when you don’t know.”[34] 

While learning facts about the world is important, Ken Bain argues, students need to be able to critically evaluate and use facts “to make decisions about what they understand or what they should do.”[35]  A real education should produce what psychologists and philosophers call “wisdom,” the ability to make rational judgments in order to take strategic action.[36]  And finally, a real education empowers students with habits of mind and practical skills that can be developed into expertise and professional excellence over decades of sustained practice.[37]

For centuries, philosophers have conceptualized epistemology too narrowly, often ignoring empirical research and scientific methodology.[38]  But over the last couple of decades, psychologists and philosophers have revised our understanding of knowledge.  Scientists now know that acquiring an epistemology entails four practical skills.  First, knowledge means knowing facts about the objective world.  Psychologists call this “declarative knowledge,” and it is the simplest and most basic way of knowing. 

The other three kinds of knowledge fall under a category that psychologists call “procedural” knowledge because these ways of knowing entail skilled behaviors, not just awareness of facts.  The second kind of knowledge is knowing how to critically think, which philosophers have been calling “rationality” for centuries, and what psychologists more recently labeled “epistemic rationality” or “epistemic cognition.”[39]  A third kind of knowledge involves knowing how to think about thinking, or “metacognitive” knowledge, which is a relatively recent scientific discovery, although at root it is an ancient philosophical practice.  Psychologist John Flavell coined the term “metacognition” in the 1970s, a term which is now used to describe how people manage and guide their thinking processes, including their emotions and mental biases.  Psychologist Robert J. Sternberg explains the concept as “mental self-management.”[40]  Finally, knowledge entails the practical ability to make wise judgments and skillfully act, which is called “instrumental rationality.”[41]  This is the most advanced and difficult form of knowledge.

 

Meeting the Demands of the 21st Century

In America, as in most other countries around the world, most people do not have these four kinds of core knowledge, especially rational decision-making skills and metacognitive awareness – not even most college graduates.[42]  According to many philosophers and scientists, most of us are “epistemologically naïve,”[43] which means the average person cannot effectively think or make rational decisions.  This is doubly concerning.  First, many people playing vital roles in the community, the nation, and the global economy are incompetent, which is concerning in and of itself.  But more importantly, as philosopher Robert Nozick pointed out,[44] the most significant 21st century problems are very complex and more technical than ever.  Thus, everyone – professionals, policy makers, and the general public – needs to have more education and sophisticated knowledge in order to find solutions to the collective problems we face, particularly in democracies. 

The incompetence of the general public negatively affects not only the health, wellbeing, and professional and economic achievement of individuals, but also the social foundation of political democracies and the global economy.  Tragically, too many people believe that “my ignorance is just as good as your knowledge,” to quote the late science fiction writer Isaac Asimov.[45]  These opinionated people walk around oblivious to their own incompetence, posing a danger to themselves and everyone else, including the very survival of the human species.[46]

For many centuries, earning a college degree was a symbol of success.  A student did not need to graduate with real knowledge or skills to land a good job and become successful.  But the proliferation and massification of higher education in the 21st century have caused “credential inflation.”[47]  A college degree no longer automatically bestows economic success and stability, although it helps.[48]  Upon graduation, college students now must demonstrate real knowledge and skill in order to be successful.  In the 21st century, rapid technological and economic change will continue to place unprecedented demands on workers, primarily in developed economies.  In order to succeed, people will need not only practical skills and real knowledge, but also the ability to make rational judgments, to learn new knowledge and skills, and to adapt quickly to changing social and economic conditions.[49] 

Lack of knowledge and critical thinking also has serious social, economic, and political consequences.  In the 21st century, we live in a frightening age of “post-truth” politics.[50]  Majorities of people in every country are uninformed about basic facts and easily manipulated by unscrupulous politicians, marketers, and partisans.[51]  Professor of law Ilya Somin recently confirmed, “The low level of political knowledge in the American electorate is still one of the best-established findings in social science.”[52]  Uneducated voters are vulnerable to deceptive business practices and authoritarian politics and are, therefore, highly likely to make counterproductive choices, which threaten not only national wealth and global security, but also their own self-interests.[53]  Many people are trapped by their own self-destructive habits in unsafe environments because of a lack of basic facts and poor decisions.[54] 

With the rise of smart algorithms like Google, automation, robots, and AI, the problem of ignorance is potentially becoming more important, and more dangerous.  In the developed world, people are becoming more dependent upon technology to complete a range of essential tasks, like banking, shopping, communicating, flying planes, driving cars, and performing surgery.  Our reliance on machines has caused an “automation paradox”:[55] While machines make us substantially smarter, they also can make us a lot dumber.  Many people become incapacitated when their machines fail or break down because people don’t have any real knowledge or skills anymore: Their cell phones and computers do everything for them.[56]

While many people have at least some valid factual knowledge about the world, most lack knowledge about knowledge, in particular metacognitive knowledge, emotional intelligence, and mindware.  Most people also do not understand how knowledge is constructed, evaluated, debated, disseminated, and used, especially by scientists.  A small minority in every country has such knowledge and these people are disproportionately successful.  This minority also passes their educational, social, and economic capital on to their children.  This in turn generates and exacerbates a “knowledge gap,”[57] which has become a serious social and political problem, although the knowledge gap is less often discussed than more tangible problems, such as the achievement gap and economic inequality.  Lack of knowledge, rational thinking, and metacognitive skill affect not only the opportunities available to individuals and their families, but also the profitability of businesses, the growth of economies, and the political stability of nations, most decisively in democracies.  Not only are knowledge and rationality the foundation of human freedom,[58] but knowledge is also the foundation of the 21st century economy.  The Economist recently argued that data had become the world’s “most valuable resource,” and that data analysis would be the most important job of the 21st century knowledge economy.[59]

We need to address the “knowledge gap,” or what others have called the “mindware gap,”[60] because it directly contributes to the educational achievement gap, but also because it is contributing to larger adverse trends, like socio-economic inequality, post-truth politics, environmental destruction, and climate change.  In most countries around the world, only a small, privileged elite has the prerequisite knowledge and skills needed, not only for educational and labor market success, but also for a healthy and flourishing life.[61]  This elite possesses social and educational capital, which enables them to gain more knowledge than other socio-economic classes.  This elite can also use their knowledge more effectively so as to achieve their strategic objectives, such as passing classes, earning educational credentials, obtaining good jobs, solving personal and professional problems, and gaining social and political power.[62] 

Some scholars have termed this inequitable predicament the “Matthew effect.”  The privileged few who have gained proficient knowledge, language skills, and thinking skills are able to use their proficiency to gain significantly more knowledge and skill than those without proficiency.[63]  The Matthew effect also works in reverse.  People who don’t have much knowledge fall victim to false beliefs and get trapped in a cycle of irrationality, whereby they fall further and further behind.[64]

 

What Will This Book Do?

How do we address the knowledge gap?  To begin with, there needs to be a new 21st century curriculum for the way we teach literacy to all students, in particular advanced literacy in high school and college.  In the 21st century, literacy means much more than reading and writing.  It entails knowledge and the complex ability to think and make rational judgments. 

The human species has reached a turning point in its history: extraordinary advances in science and technology, a globalized economy, increasing levels of global migration, and unprecedented cultural change.  These changes have led to destabilized labor markets, global economic crises, contentious intranational culture wars, violent international conflicts, and terrorism.  Unlike the past, 21st century students need more core knowledge and more advanced cognitive and emotional skills in order to be successful readers, writers, and thinkers, and not only for school, but also in the labor market, in politics, and in life.[65] 

This book seeks to redefine what literacy should mean in the 21st century by offering a framework for the core knowledge, rational thinking, and metacognitive skill that should be the foundation of all 21st century literacy education, particularly for higher education.  Educators need to look far beyond the basic literacy skills of the past.  In this century, competent human beings need to be able to do more than read, write, and speak.  21st century communication and critical thinking need to be built on a complex foundation of cultural knowledge and scientific knowledge, particularly knowledge about knowledge, which includes metacognition, emotional intelligence, and mindware.  And as psychologist Deanna Kuhn has empirically demonstrated, these advanced critical thinking skills are not “universal human attributes” – they are learned “cognitive achievements” that need to be skillfully taught,[66] primarily through formal schooling, mainly in higher education.

Knowledge about culture and science enables not only more robust metacognitive skill, but also the ability to productively use knowledge to better oneself and society.[67]  21st century citizens of the world need to be able to think critically and to self-monitor their own thinking so they can actively construct, evaluate, debate, and use their knowledge in diverse multi-cultural settings.  This book seeks to sketch out the parameters of a 21st century literacy curriculum, which would enable human beings to not only know information, but also to know better and to know how to use their knowledge more effectively.[68] 


Endnotes

[1] NCES, The Nation’s Report Card: Writing 2011 (2012), Retrieved from http://nces.ed.gov/nationsreportcard/pubs/main2011/2012470.asp

[2] NCES, The Nation’s Report Card: Mathematics & Reading (2015), Retrieved from http://www.nationsreportcard.gov/reading_math_g12_2015/#reading

[3] NCES, The Nation’s Report Card: Science (2015), Retrieved from http://nationsreportcard.gov/science_2015/#?grade=4

[4] On community college success rates see J. M. Beach, Gateway to Opportunity? A History of the Community College in the United States (Sterling, VA: Stylus, 2011); James E. Rosenbaum, Regina Deil-Amen, and Ann E. Person, After Admission: From College Access to College Success (New York: Russell Sage Foundation, 2006).  On college and university success rates see James Rosenbaum, Beyond College for All: Career Paths for the Forgotten Half (New York: Russell Sage Foundation, 2001) 57.

[5] Tom Nichols, The Death of Expertise: The Campaign against Established Knowledge and Why It Matters (Oxford: Oxford University Press, 2017) 95.

[6] Richard Arum and Josipa Roksa, Academically Adrift: Limited Learning on College Campuses (Chicago: University of Chicago Press, 2011), 36.  On the debate over the study’s validity see John Aubrey Douglass, Gregg Thomson and Chun-Mei Zhao, “The Holy Grail of Learning Outcomes,” University World News Global Edition 211 (4 March 2012), 3 Dec. 2012  <www.universityworldnews.com>; Ou Lydia Liu, Brent Bridgeman, and Rachel M. Adler, “Measuring Learning Outcomes in Higher Education: Motivation Matters,” Educational Researcher 41.9 (2012): 352-362.  For another study showing college graduates’ lack of skills see: Educational Testing Service, America’s Skills Challenge: Millennials and the Future (Princeton: Educational Testing Service, 2015).  For a study showing some measurable gains in college students’ critical thinking see Patricia M. King and Karen Strohm Kitchener, Developing Reflective Judgment: Understanding and Promoting Growth and Critical Thinking in Adolescents and Adults (San Francisco: Jossey-Bass, 1994), see especially chart on 161, and also Christopher R. Huber and Nathan R. Kuncel, “Does College Teach Critical Thinking? A Meta-Analysis,” Review of Educational Research 86.2 (2016): 431-468.

[7] Ken Bain, What the Best College Teachers Do (Cambridge, MA: Harvard University Press, 2004) 24, 41.  The term “bulimic education” comes from Robert de Beaugrande, “Knowledge and Discourse in Geometry: Intuition, Experience, Logic,” Journal of the International Institution for Terminology Research 3/2 (1992): 29-125.

[8] David F. Labaree, How to Succeed in School Without Really Learning: The Credentials Race in American Education (New Haven: Yale University Press, 1997) 258-59.  See also Denise Clark Pope, “Doing School:” How We Are Creating a Generation of Stressed Out, Materialistic, and Miseducated Students (New Haven: Yale University Press, 2001); Alison Wolf, Does Education Matter?  Myths about Education and Economic Growth (London: Penguin Books, 2002); W. Norton Grubb and Marvin Lazerson, The Education Gospel: The Economic Power of Schooling (Cambridge, MA: Harvard University Press, 2004); Tom Nichols, The Death of Expertise: The Campaign against Established Knowledge and Why It Matters (Oxford: Oxford University Press, 2017) ch 3.

[9] Ken Bain, What the Best College Students Do (Cambridge, MA: Harvard University Press, 2012), 9.

[10] Bain, What the Best College Teachers Do, Ibid., 24.

[11] Ibid.

[12] Qtd. in Bain, What the Best College Students Do, Ibid., 9.

[13] Carol S. Dweck, “Beliefs that Make Smart People Dumb,” Why Smart People Can Be So Stupid.  Ed. Robert J. Sternberg (New Haven: Yale University Press, 2002) 24.

[14] Nichols, The Death of Expertise, Ibid., 72.

[15] Arum and Roksa, Academically Adrift, Ibid., 38-57.

[16] Brad Hershbein, “A College Degree Is Worth Less If You Are Raised Poor,” Brookings (Feb 19, 2016) Retrieved from www.brookings.edu; “Skipping Class,” The Economist (Jan 28, 2017) 27.  See also Daniel Golden, The Price of Admission: How America’s Ruling Class Buys Its Way into Elite Colleges (New York: Crown, 2006).  The disparities in post-college earnings and likelihood of ending up rich are largely erased if a poor student is lucky enough to attend an elite university, especially an Ivy League school.  But to take the case of Princeton, students from the bottom 20 percent represent only 2 percent of the student body.  A student from the top 0.1 percent is 315 times more likely to make it into Princeton than a student from the bottom 20 percent.

[17] Learning and Earning: Special Report on Lifelong Education, The Economist (Jan 14, 2017) 1-16.

[18] Jeanne M. Powers, Gustavo E. Fischman, and David C. Berliner, “Making the Visible Invisible: Willful Ignorance of Poverty and Social Inequalities in the Research-Policy Nexus,” Review of Research in Education 40 (March 2016) 744-776; Linda Darling-Hammond, The Flat World and Education: How America’s Commitment to Equity Will Determine Our Future (New York: Teacher’s College Press, 2010); Jennifer L. Hochschild, Facing Up to the American Dream: Race, Class, and the Soul of the Nation (Princeton: Princeton University Press, 1995); Robert D. Putnam, Our Kids: The American Dream in Crisis (New York: Simon & Schuster, 2015); Steven Brint and Jerome Karabel, The Diverted Dream: Community Colleges and the Promise of Educational Opportunity in America, 1900-1985 (Oxford: Oxford University Press, 1989).

[19] Putnam, Our Kids, Ibid.; James J. Heckman, Giving Kids a Fair Chance (Cambridge, MA: MIT Press, 2013); Mischel, The Marshmallow Test, Ibid.

[20] Walter Mischel, The Marshmallow Test: Why Self-Control is the Engine of Success (New York: Little, Brown and Company, 2014) 244.

[21] Putnam, Our Kids, Ibid., 41.

[22] Ralph W. Tyler, Basic Principles of Curriculum and Instruction (Chicago: University of Chicago Press, 1949).

[23] David Perkins, Outsmarting IQ: The Emerging Science of Learnable Intelligence (New York: Free Press, 1995); Deanna Kuhn, The Skills of Argument (Cambridge, UK: Cambridge University Press, 1991) 289; Keith E. Stanovich, What Intelligence Tests Miss: The Psychology of Rational Thought (New Haven: Yale University Press, 2009) 67; Antonio Damasio, Descartes’ Error: Emotion, Reason, and the Human Brain (New York: Penguin, 1994); Robert J. Sternberg, Wisdom, Intelligence, and Creativity Synthesized (Cambridge, UK: Cambridge University Press, 2003) 38.  Perkins coined the term “mindware,” like the term “software” for a computer, as learnable programs that help us think: “whatever people can learn that helps them to solve problems, make decisions, understand difficult concepts, and perform other intellectually demanding tasks better” (p. 13, 102).  See also Richard E. Nisbett, Mindware: Tools for Smart Thinking (New York: Farrar, Straus and Giroux, 2015).

[24] Sternberg, Wisdom, Intelligence, and Creativity Synthesized, Ibid., 84.

[25] Benjamin S. Bloom, Ed. Taxonomy of Educational Objectives: The Classification of Educational Goals. Handbook I: Cognitive Domain (New York: Longmans, Green & Co., 1956); E. D. Hirsch, Jr., Cultural Literacy: What Every American Needs to Know (New York: Vintage, 1988) 2-3; E. D. Hirsch, Jr., The Knowledge Deficit: Closing the Shocking Education Gap for All American Children (Boston: Houghton Mifflin, 2006); Bain, What the Best College Students Do, Ibid., especially ch 5; Robert J. Sternberg, The Triarchic Mind: A New Theory of Human Intelligence (New York: Viking, 1988); Daniel Goleman, Emotional Intelligence (New York: Bantam, 1995); Keith E. Stanovich, What Intelligence Tests Miss: The Psychology of Rational Thought (New Haven: Yale University Press, 2009); Perkins, Outsmarting IQ, Ibid.; Nisbett, Mindware, Ibid.; Deanna Kuhn, The Skills of Argument (Cambridge, UK: Cambridge University Press, 1991); Deanna Kuhn, Education for Thinking (Cambridge: Harvard University Press, 2005); Harvey Siegel, Rationality Redeemed? Further Dialogues on an Educational Ideal (New York: Routledge, 1997); Robert Nozick, The Nature of Rationality (Princeton: Princeton University Press, 1993) 64.

[26] Few argue for a blending of English and Science education.  For a review of research on English and Science education see Melanie Sperling and Anne DiPardo, “English Education Research and Classroom Practice: New Directions for New Times,” Review of Research in Education 32 (Feb 2008): 62-108; Richard Duschl, “Science Education in Three-Part Harmony: Balancing Conceptual, Epistemic, and Social Learning Goals,” Review of Research in Education 32 (Feb 2008): 268-291; Marcia C. Linn, Libby Gerard, Camillia Matuk, and Keven W. McElhaney, “Science Education: From Separation to Integration,” Review of Research in Education 40 (March 2016): 529-587.

[27] Bain, What the Best College Teachers Do, Ibid., 27.  See also Siegel, Rationality Redeemed? Further Dialogues on an Educational Ideal, Ibid.

[28] Stanovich, What Intelligence Tests Miss, Ibid; Patricia M. King and Karen Strohm Kitchener, Developing Reflective Judgment: Understanding and Promoting Growth and Critical Thinking in Adolescents and Adults (San Francisco: Jossey-Bass, 1994); Robert J. Sternberg, The Triarchic Mind: A New Theory of Human Intelligence (New York: Viking, 1988).

[29] Labaree, How to Succeed in School Without Really Learning, Ibid.

[30] Bain, What the Best College Teachers Do, Ibid.

[31] Tom Nichols, The Death of Expertise: The Campaign against Established Knowledge and Why It Matters (Oxford: Oxford University Press, 2017) 76.

[32] Michael Lewis, The Undoing Project: A Friendship that Changed Our Minds (New York: W. W. Norton, 2017) 192; Nichols, The Death of Expertise, Ibid.

[33] Steven Sloman and Philip Fernbach, The Knowledge Illusion: Why We Never Think Alone (New York: Riverhead, 2017) 8.

[34] Lewis, The Undoing Project, Ibid., 140.

[35] Bain, What the Best College Teachers Do, Ibid., 29.  See also Sloman and Fernbach, The Knowledge Illusion, Ibid.

[36] Stephen S. Hall, Wisdom: From Philosophy to Neuroscience (New York: Knopf, 2010); Robert J. Sternberg, Wisdom, Intelligence, and Creativity Synthesized (Cambridge, UK: Cambridge University Press, 2003) ch 7; Siegel, Rationality Redeemed? Further Dialogues on an Educational Ideal, Ibid.

[37] K. Anders Ericsson and Jacqui Smith, Eds., Toward a General Theory of Expertise: Prospects and Limits (Cambridge, UK: Cambridge University Press, 1991); K. Anders Ericsson, Ed., The Road to Excellence: The Acquisition of Expert Performance in the Arts and Sciences, Sports and Games (Mahwah: Erlbaum, 1996).

[38] Edward Stein, Without Good Reason: The Rationality Debate in Philosophy and Cognitive Science (Oxford: Oxford University Press, 1996) 2, 38.

[39] William A. Sandoval, Jeffrey A. Greene, and Ivar Bråten, “Understanding and Promoting Thinking about Knowledge: Origins, Issues, and Future Directions of Research on Epistemic Cognition,” Review of Research in Education 40 (March 2016): 457-496.

[40] Robert J. Sternberg, The Triarchic Mind: A New Theory of Human Intelligence (New York: Viking, 1988) 11.

[41] Stanovich, What Intelligence Tests Miss, Ibid., 67, 148; King and Kitchener, Developing Reflective Judgment, Ibid., 1, 69; Perkins, Outsmarting IQ, Ibid., 85, 241, 107-108; Jonathan St. B. T. Evans, Bias in Human Reasoning: Causes and Consequences (Hove: Lawrence Erlbaum, 1989) 66-67.

[42] Perkins, Outsmarting IQ, Ibid., 8, 117.

[43] Deanna Kuhn, The Skills of Argument (Cambridge, UK: Cambridge University Press, 1991) 265, 270; Siegel, Rationality Redeemed? Further Dialogues on an Educational Ideal, Ibid.

[44] Robert Nozick, The Nature of Rationality (Princeton: Princeton University Press, 1993) xiv-xv.

[45] Qtd. in Tom Nichols, The Death of Expertise, Ibid., 1.

[46] Jared Diamond, Collapse: How Societies Choose to Fail or Succeed (New York: Viking, 2005).

[47] Nichols, The Death of Expertise, Ibid., 75.

[48] Barbara Ehrenreich, Bait and Switch: The (Futile) Pursuit of the American Dream (New York: Metropolitan Books, 2005); Louis Uchitelle, The Disposable American: Layoffs and Their Consequences (New York: Vintage, 2006); Learning and Earning: Special Report on Lifelong Education, Ibid.

[49] Learning and Earning, Ibid.

[50] “The Art of the Lie,” The Economist (Sept 10, 2016) 9; “The Post-Truth World,” The Economist (Sept 10, 2016) 17-20; Amy B. Wang, “Post-Truth Named 2016 Word of the Year,” The Washington Post (Nov 16, 2016) Retrieved from www.washingtonpost.com.

[51] Rick Shenkman, Just How Stupid Are We? Facing the Truth about the American Voter (New York: Basic Books, 2008); Susan Jacoby, The Age of American Unreason, Revised Edition  (New York: Vintage, 2009); Richard H. Thaler, Misbehaving: The Making of Behavioral Economics (New York: W. W. Norton, 2015); Martin Lindstrom, Brandwashed: Tricks Companies Use to Manipulate Our Minds and Persuade Us to Buy (New York: Crown, 2011).

[52] Ilya Somin, “Political Ignorance in America,” The State of the American Mind, Eds.  Mark Bauerlein and Adam Bellow (West Conshohocken, PA: Templeton, 2015), 163-64.

[53] George A. Akerlof & Robert J. Shiller, Phishing for Phools: The Economics of Manipulation and Deception (Princeton: Princeton University Press, 2015); Thaler, Misbehaving, Ibid.; Larry M. Bartels, Unequal Democracy: The Political Economy of the New Gilded Age (New York: Russell Sage Foundation & Princeton, NJ: Princeton University Press, 2008); Tom Nichols, The Death of Expertise: The Campaign against Established Knowledge and Why It Matters (Oxford: Oxford University Press, 2017).

[54] Thaler, Misbehaving, Ibid.; Richard Thaler and Cass R. Sunstein, Nudge: Improving Decisions about Health, Wealth, and Happiness (New Haven: Yale University Press, 2008).

[55] Steven Sloman and Philip Fernbach, The Knowledge Illusion: Why We Never Think Alone (New York: Riverhead, 2017) 143.

[56] Nicholas Carr, The Shallows: What the Internet is Doing to Our Brains (New York: W. W. Norton, 2011); Nichols, The Death of Expertise, Ibid., ch 4.

[57] Hirsch, Jr., The Knowledge Deficit, Ibid., xvii, 12, 25; Tichenor, Phillip J., George A. Donohue, and Clarice N. Olien, “Mass Media Flow and Differential Growth in Knowledge,” Public Opinion Quarterly (1970) 159–170.  This article first proposed a theory of differential knowledge based on class, whereby wealthier individuals not only have more knowledge, but also have more educational capital to acquire more knowledge at faster rates than lower-class individuals.  See also Viswanath, Kasisomayajula, Nancy Breen, Helen Meissner, Richard P. Moser, Bradford Hesse, Whitney Randolph Steele, and William Rakowski, “Cancer Knowledge and Disparities in the Information Age,” Journal of Health Communication (2006) 1–17.

[58] Daniel C. Dennett, Freedom Evolves (New York: Viking, 2003).

[59] “The World’s Most Valuable Resource,” The Economist (May 6, 2017) 9; “Fuel of the Future,” The Economist (May 6, 2017) 19-22.  See also Alex Wright, Glut: Mastering Information Through the Ages (Ithaca: Cornell University Press, 2007).

[60] Stanovich, What Intelligence Tests Miss, Ibid., 67.  Stanovich argues, “The tools of rationality – probabilistic thinking, logic, scientific reasoning – represent mindware that is often incompletely learned or not acquired at all” (p. 67).

[61] This new class was first documented by Daniel Bell who argued that knowledge and technology would become the “central” resources of post-industrial society (p. 263).  The Coming of Post-Industrial Society: A Venture in Social Forecasting (New York: Basic Books, 1973), see especially ch 3, ch 6, and part 3 of the CODA.

[62] Ibid.  See also Arum and Roksa, Academically Adrift, Ibid., 38-57.

[63] Herbert J. Walberg and Shiow-Ling Tsai, “Matthew Effects in Education,” American Educational Research Journal 20.3 (Fall 1983): 359-73.  See also Hirsch, Jr., The Knowledge Deficit, Ibid., 25; Daniel Goleman, Emotional Intelligence (New York: Bantam, 2005) xvii; Keith E. Stanovich, “Rationality, Intelligence, and Levels of Analysis in Cognitive Science: Is Dysrationalia Possible?” Why Smart People Can Be So Stupid.  Ed. Robert J. Sternberg (New Haven: Yale University Press, 2002) 148-49.

[64] Stanovich, “Rationality, Intelligence, and Levels of Analysis in Cognitive Science,” Ibid., 148-49.

[65] Hirsch, Jr., The Knowledge Deficit, Ibid; Daniel Goleman, Emotional Intelligence (New York: Bantam, 1995); Learning and Earning, Ibid.

[66] Deanna Kuhn, The Skills of Argument (Cambridge, UK: Cambridge University Press, 1991) 270.  Paul L. Harris explains how “the end point of cognitive development is not objectivity and equilibrium.  It is a mix of the natural and supernatural, of truth and fantasy, of faith and uncertainty” (p. 7).  Trusting What You’re Told: How Children Learn from Others (Cambridge, MA: Harvard University Press, 2012).

[67] Richard E. Nisbett, Mindware: Tools for Smart Thinking (New York: Farrar, Straus and Giroux, 2015); Dana S. Dunn, Bryan K. Saville, Suzanne C. Baker, and Pam Marek, “Evidence-Based Teaching: Tools and Techniques that Promote Learning in the Psychology Classroom,” Australian Journal of Psychology 65 (2013): 5–13; Deanna Kuhn, Education for Thinking (Cambridge: Harvard University Press, 2005) 60, 72, 187; Daniel Goleman, Emotional Intelligence (New York: Bantam, 1995) ch 4; Stanovich, What Intelligence Tests Miss, Ibid.

[68] Stanovich, What Intelligence Tests Miss, Ibid. 67, 148; King and Kitchener, Developing Reflective Judgment, Ibid., 1, 69; Perkins, Outsmarting IQ, Ibid., 241, 107-108.

Why Community Colleges?

The Institutionalization of Community Colleges in the United States

 

This is the Preface to my book, Gateway to Opportunity? A History of the Community College in the United States, which was published in 2011.

 
Can an institution which was ‘born subordinate’ as the lower-level holding pen for the university overcome its own legacy and develop into a truly meritocratic and democratizing institution?
— J. M. Beach
 

At the dawn of the 21st century some 30 percent of American adults have earned a bachelor’s degree or higher.  This is the highest percentage of Americans earning a higher education in this country’s history; however, higher education is still not equally available to all American citizens, and the returns to a college credential still bring differential earnings based on ethnicity, sex, and class.  Access to institutions of higher education, and to the knowledge and economic returns of a college education, is not available to everyone.  It continues to be restricted to a minority of the American population, although this educated minority has grown significantly over the past century.[i]  

It took the United States almost two centuries to grant all citizens full political rights, but at the start of the 21st century not all citizens have equal access to the political process, nor equal claim on their political representatives.  Centuries of social and political struggle have enabled a large minority of American citizens to gain a measure of economic, educational, and political success, but the sacred principles articulated in the Declaration of Independence have yet to become a reality for all citizens, let alone the millions of immigrants and foreigners living in this country: not all Americans are living free and equal in their pursuit of happiness, nor is the government (which was supposedly instituted by and for the people) equally responsive or just in protecting the rights and safety of all citizens.[ii]

It is questionable whether conditions will improve for the majority of Americans in the 21st century, especially given the global economic collapse of 2008-2010.  Will most citizens have increased access to and success in higher education?  Will most citizens have increased access to and participation in the political process?  Will most citizens experience a more just and equitable distribution of income?  Will most citizens be able to rely upon quality social services and safety nets, like public schooling, affordable health care, and retirement benefits?  And how much access will immigrants have to participate in American society, higher education, and the political process? 

Peter Drucker, an influential business scholar, predicted in the 1990s that a new elite class of “knowledge workers” would form.  With higher education credentials and specialized technological skills, these knowledge workers would one day foment a “new class conflict” in America.  According to Drucker, this new socio-political order would not only be inevitable, but also “right and proper” because those citizens privileged enough to become educated deserved to rule.  Drucker’s message was clear: American citizens must either scramble up the competitive ladder of success by earning college degrees and gaining technology oriented skills, or they shall rightfully fall beneath a new class of technocratic elites.[iii]  Drucker’s version of the American dream reformulates what was once a mythic hope, and turns the ideology of Americanism into a dystopic threat: better yourself or else!  For those citizens of the United States who hold sacred the democratic principles outlined in The Declaration of Independence, this dark and foreboding prophecy betrays the very hope this nation supposedly embodies.  But in order to make an accurate assessment of future social, educational, and political possibilities, the past must be revisited and understood in order to contextualize the complexity of the present. 

But complex understandings of history rarely inform public policy in this country.  Even when policy makers are aware of history, rarely does historical knowledge impact the political process through which public policy is fought over, negotiated, and compromised.  As Deborah Stone has argued, the policy making process is about power and “the struggle over ideas,” as politicians rhetorically dance in the many political fires of competing interests.  Policy makers seek to “control interpretations” by framing, or spinning, present problems under the rhetorical guise of what is legitimate, what is feasible, or what is good.  But rarely do policy makers consider the historical complexity of how the past might inform the present.  Under constricted political conditions, history is valued to the extent that it can be fashioned into useful political tools.  In most instances this amounts to the denial of history and the creation of “myth” – policy history almost always becomes a quasi-fictional “political narrative” shaped by the powerful to legitimate their power and, thereby, secure social, political, and economic resources.[iv]  

Higher education policy in the U.S. rests on the foundational myth of meritocracy and equal access to higher education, but neither narrative has much concrete historical validity.  A look at the history of higher education in the U.S. and the changing dynamics of student access confirms this.  History reveals some expansion of access and equity in terms of increasing amounts of post-secondary education for a broader swath of Americans, but inequality remains constant.  Traditionally underserved populations, like the economically disadvantaged and non-white ethnic/racial minorities, still struggle to achieve equality of opportunity in American society and its institutions of higher education.  Financial returns for postsecondary degrees are still lower for women and non-white minorities because regional labor markets continue to perpetuate a long history of institutionalized discrimination.  As the U.S. moves into a post-industrial “knowledge economy” in a highly globalized world, the issue of student access to higher education has become one of the most pressing political problems for those concerned with both socio-political equity and economic development.  The educational and economic success of the student and the economic development of the nation have become intertwined political issues.  Can the U.S. keep its dominant economic position in the highly competitive world economy with only 30% of the American population holding bachelor’s degrees?  If the majority of U.S. citizens lack access to higher education, can the U.S. live up to its democratic principles and preserve its political institutions? 

At the center of these questions is the policy issue of access to higher education.  Who has access to what forms of higher education at what cost?  For most Americans, access is restricted to the open-access, low-cost American community college.  This institution enrolls around half of all first-time freshmen in the U.S.  Most students who enroll in community colleges have the goal of transferring to a four-year college or university in order to earn a bachelors degree, but the vast majority of these students will never earn any degree.  Community colleges have been praised for almost a century as an efficient way to handle the vast surge of Americans looking for access to higher education and as an economical path for social mobility.  However, it is unclear if this institution actually helps students, let alone how it might help.  Scholars have never been able to completely agree on the mission of the community college and, therefore, have never been able to adequately determine what it is the community college is supposed to do, not to mention how it is supposed to do it. 

The junior college, later renamed the community college in the 1960s and 1970s, was originally designed to limit access to higher education in the name of social efficiency.  But students and local communities utilized the democratic rhetoric of Americanism and the promises made by junior college leaders to refashion this institution as a tool for increased social mobility, community organization, and regional economic development.  Thus, this institution, much like the country itself, was born of contradictions and continues to be an enigma.  These contradictions have been sewn into the very fabric of what has become a celebrated, yet beleaguered, institution of higher education.  This institution has promised for over a century to be the foundation of increased access to college and to the middle-class, as well as a host of other lofty goals.  But what has this institution actually done?  Unraveling the institutional complexity and contradiction of the community college will be the central focus of this historical study.  At the heart of this study is the policy issue of access to higher education: Does the community college offer increased access to higher education and social mobility, or is this “semi-higher” institution just a diversion keeping the economically disadvantaged and ethnic minorities from realizing the American Dream? 

But a word of warning:  Even at the end of this study the reader will find no clear answers to these questions.  However, the broad trends of history will reveal some hope that these questions might be more definitively answered in the near future. 

 

What is an Institution?

This study is an analysis of the creation, institutionalization, and politicized debate over the American community college.  It is a study of a unique American institution, which developed over the course of the twentieth century.  But in order to understand the community college as an institution, we need to understand the basic theory of social institutions.  The study of social institutions reaches back to the 19th century, with origins in the disciplines of history and sociology.  Given the concept's long life, its meaning has varied greatly in usage and definition.  However, there does seem to be an analytical core to this word: institutions are the "self-evident" and "taken for granted" social or organizational structures found within a particular human society.  Institutions as social structures can be described as the "organized, established, procedure(s)" that structure a whole society or particular parts of society.  Institutions can also be described as the "constituent rules" of a society or part of society.  The early historical-sociologists, Karl Marx, Emile Durkheim, and Max Weber, each studied different constituting rule systems and social structures of Western society so as to understand the underlying logics that established the social, political, economic, and religious rules, organizations, procedures, rituals, and ideas that ordered the modern Western world.  Such institutions included: capitalism, Christianity, property, individualism, the state, bureaucracy, rationality, and education. 

This study focuses specifically on the institution of the American junior college, which later became the community college.  This study seeks to uncover the historically conditioned rules, procedures, rituals, and ideas that have ordered and defined a particular type of educational structure.  Institutions are human creations, so to study institutions is to study the actions, ideas, and organizations of human beings.  At the core of this study are those individuals, organizations, ideas, and other social phenomena that have contributed to defining the junior/community college's educational missions and have enabled or constrained this institution from enacting those missions: What have been the purposes of the junior/community college?  Who conceptualized these purposes and why?  How has this institution been able to achieve these purposes?  How have these issues changed over time? 

Of course, embedded in these questions are unexamined political assumptions:  Who has the right or power to define this institution's mission?  Who has the responsibility for supporting and enacting its roles?  And whose values are to be used to judge this institution? 

But there are also deeper assumptions embedded within all scholarly research and public policy.  The taken for granted rationality of modernity claims that human individuals have enough knowledge and power to control society, the people in that society, and the institutions that define and structure that society.  But this hypothesis continues to be unproven, and important questions remain unanswered: How rational are human beings?  How much do human beings control their psychological, social, and physical environments?  Do human beings have the power to control and change social institutions?[v]  Academic scholarship on institutions has only begun to ask these questions, unearthing the grounding assumptions of modern rationality. 

We will only briefly discuss these issues of rationality and social institutions here in this preface.  Examining these important assumptions would take a sustained, interdisciplinary study that far exceeds the more limited parameters of this book, narrowly focused as it is on the institutionalization of the community college.  However, it is important to discuss these issues because our understanding of human institutions, historical change, and the future of the community college in particular, rests upon our assumptions of individual rationality, the power of human agency, and the capacity for social change. 

The concept of social institutions straddles a social-scientific dualism that has been unresolved for the past century.  At the center of the social sciences has been a debate over how societies and social institutions are constituted and how they change.  Societies and social institutions can be seen, on the one hand, as the "product of human design" and the "outcome of purposive" human action, but they can also be seen as the "result of human activity," but "not necessarily the product of conscious design."  One of the paradigmatic examples of this dualism is language.  Human beings are born into a particular language with pre-defined words and a pre-designed grammar; however, individual human beings are also able to adopt new languages, create new words, and change the existing definition of words or grammatical structures.  But is any individual or group of individuals in conscious control of any particular language?  The obvious answer is no, but each individual has some measure of effect; however, just how much effect is subject to debate.  For the past quarter century or so, scholars have rejected the idea that societies, institutions, and organizations can be reduced to the rational decisions of individuals.  The new theory of institutions focuses on larger units of analysis, like social groups and organizations "that cannot be reduced to aggregations or direct consequences of individual's attributes or motives."  Individuals do constitute and perpetuate social structures and institutions, but they do so only half aware, and not as completely or as freely as they often imagine.[vi] 

The new institutional theory has focused mainly on how social organizations have been the locus of "institutionalization," which is the formation and perpetuation of social institutions.  While groups of human beings create and sustain social organizations, these organizations develop through time into structures that resist individual human control.  Organizations also take on a life of their own that sometimes defies the intentions of those human beings "in charge" of directing the organization.  While institutions can sometimes begin with the rational planning of individuals, the preservation and stability of institutions through path-dependent processes is often predicated on ritualized routines, social conventions, norms, common sense, and myths.  Once an institution becomes "institutionalized," the social structure perpetuates a "stickiness" that makes the structure "resistant" to change.  Individual human actors, thereby, become enveloped and controlled by the organization's self-reinforcing social norms, rules, and explanatory myths, which are solidified through positive feedback mechanisms that transcend any particular human individual.  These organizational phenomena, thereby, shape individual human perception, constrain individual agency, and constitute individual action.  As one institutional theorist has argued, all human "actors and their interests are institutionally constructed."  To a certain extent humans do create institutions and organizations, but more immediately over the course of history, institutions and organizations create us!  Many millions of individuals have consciously shaped the English language, but as a child I was constituted as an English-speaking person without my knowledge or consent.  It is perhaps more accurate to say that English allowed for the creation of my individuality than it is to say that I have shaped the institution of English.[vii]

But if all human thought and action is constituted by previously existing institutions, do human beings really have any freedom to shape their lives or change society?  This is actually a very hard question to answer and it remains at the center of longstanding debates.  Durkheim and Parsons seemed to solidify a sociology that left no room for individual volition.  Marx stressed human control, but seemed to put agency in the hands of groups, not individuals.  Weber discussed the possibility of individual agency, especially for charismatic leaders, but he emphasized how human volition was always “caged” by institutions and social organizations.  Michel Foucault conceptualized human beings as almost enslaved by the various modern institutions of prisons, schools, and professions.[viii]

Some recent neo-institutional theorists have left open the possibility of individual rationality and freedom.  Human agency is sometimes defined as the mediation, manipulation, and sometimes modification of existing institutions.  Human beings can also refuse institutionalized norms and procedures, thereby, highlighting another type of agency.  Humans can also exploit contradictions between different institutional structures, and use one institution to modify another.  Ronald L. Jepperson argues that there can be "degrees of institutionalization" as well as institutional "contradictions" with environmental conditions.  This means that certain institutions can be "relative[ly] vulnerab[le] to social intervention" at particular historical junctures.  Jepperson is one of the few institutional analysts who conceptualizes a theory of human action and institutional change, which allows for "deinstitutionalization" and "reinstitutionalization."  But Jepperson does not validate rational choice theories of individual agency.  He argues instead that "actors cannot be represented as foundational elements of social structure" because their identity and "interests are highly institutional in their origins."  However, this position does not disavow institutionally mediated individual choice and action.  As Walter W. Powell has argued, "individual preferences and choices cannot be understood apart from the larger cultural setting and historical period in which they are embedded," but individual actors have some freedom within institutional environments to "use institutionalized rules and accounts to further their own ends."  Roger Friedland and Robert R. Alford argue that "the meaning and relevance of symbols may be contested, even as they are shared."  "Constraints," Powell paradoxically argued in one essay, "open up possibilities at the same time as they restrict or deny others."[ix]

The anthropologist Sherry B. Ortner has developed a more comprehensive theory of human agency that allows individuals more power to consciously participate in and, thereby, shape and modify institutions.  She defines the individual agent as being in a "relationship" with social structures.  This relationship can be "transformative" for both parties: each acts on and shapes the other.  While the individual is enveloped by social structures, there is a "politics of agency," where individual actors can become "differentially empowered" within the layered "web of relations" that make up the constraints of culture.  Individuals can act through a process of reflexivity, resistance, and bricolage.  Humans use an awareness of subjectivity and negotiate their acceptance and refusal of the status quo.  Through this process, humans can re-create existing social structures by reforming traditional practices and also by introducing novel practices.  Ortner conceptualized the process of agency as the playing of "serious games," utilizing a metaphor originally deployed by Wittgenstein.  She argued forcefully that existing cultural structures and social reproduction are "never total, always imperfect, and vulnerable," which constantly leaves open the possibility of "social transformation" to those who dare to act out against the status quo.[x]    

Traditionally, social scientists have assumed an inflated notion of rationality, agency, and control for human individuals.  Neo-institutional theory has sought to correct these fallacies.  But traditional social science has also assumed these same qualities for social organizations: societies, political states, and economic corporations.  Structural functionalist sociologists, using classical organizational theories, often conceptualized modern society as a highly structured and rationalized field populated by various bureaucratic organizations, with specific and clearly defined social functions.  It was assumed that social organizations were driven by rationalized processes, efficient technologies, and controlling managers.  Organizations were seen as a totalizing social structure that "use[d] human beings to perform organizational tasks."  Organizations were also seen as insulated structures, which were clearly differentiated and autonomous from the larger society.[xi]

Later organizational theorists, still embracing a structural functionalism, pointed out how organizations were only quasi-rational and largely constrained by other social structures.  These new organizational theorists also pointed out that the functions of an organization were often "loosely coupled" or in "conflict" with its actual operations.  Organizational actors could be "limited in their knowledge and in their capacities," and thus, merely "'subjectively' rational," which meant that individuals, even corporate managers, were not in complete control of themselves, let alone their organizations.[xii]  This led some organizational theorists to describe social organizations as "anarchical," because nobody seemed to be in complete control, and yet the organization did seem to function because it was not falling apart.  Organizations also came to be seen not as isolated entities, but as connected to and influenced by other organizations and larger social structures, like the state or the regional economy.[xiii] 

The new institutionalism took this quasi-rational and constrained line of analysis even further.  New research has shown that social organizations are more often structured by "the myths of their institutional environments" than they are by the functional "demands of their work activities."  In fact, the supposedly functional technology of modern organizations in the post-industrial West has come to be seen as not very functional and not very efficient.  Instead of an objective system of rationality, organizations are now seen to be ordered by "myths" of rationality that "codify" various "institutional rules" based on the "authoritative" normative design and isomorphic power of "rationalized bureaucracies."  Thus, neo-institutional theorists argue that organizations are driven by social "legitimacy" and "survival" within an "institutional environment," instead of rationality and productivity, especially organizations like schools and churches, which operate in highly "institutional environments."  Paul J. DiMaggio and Walter W. Powell have argued, "Organizations compete not just for resources and customers, but for political power and institutional legitimacy, for social as well as economic fitness."  Under such circumstances, managers do not necessarily control production or efficiency, but instead are often "ceremonial" figures who preside over a "loosely coupled" organization driven by the assumption that "everyone is acting in good faith."[xiv]

The notion of rationality and control was also assumed to play a large role in the transmission of social structures and social ideologies.  Many traditional institutional studies have generally explained how "social knowledge" becomes institutionalized, thereby becoming "part of objective reality," and then transmitted "directly" as an object.  But such accounts seem to de-legitimize and overly reify ideas in a Marxian-type base/superstructure duality, whereby, ideas seem to have value only as objective structures.  Instead, Friedland and Alford argue, "Institutions must be reconceptualized as simultaneously material and ideal": institutions are not only material practices or objects that can be reproduced, but they are also "symbolic systems, ways of ordering reality, and thereby rendering experience of time and space meaningful."[xv] 

The new institutionalism has developed this more complex theory of ideology from combining the insights of cultural anthropology and the sociology of knowledge.  An ideology is a cultural system, whereby, people share "common ideological orientations," a notion of "common sense," and a shared, "taken for granted" understanding of "the social reality of everyday life."  As the cultural anthropologist Ruth Benedict argued in the early 20th century, "No man [sic] ever looks at the world with pristine eyes.  He sees it edited by a definite set of customs and institutions and ways of thinking."  But many institutional researchers seem to fall into the totalizing trap of a homogenized cultural anthropology, whereby, power and socio-political conflict are ignored in favor of a monolithic, functional cultural system of common sense.  Ortner is one of the few theorists who acknowledges the power of structural constraints, but still allows for the possibility of a limited form of human agency.[xvi]

Some institutional researchers address issues of power and allow for competing or conflicting institutional structures within a single society, but more could be done to address issues of power, conflict, and political struggle within the conceptual field of “culture.”[xvii]  Neo-institutional theory needs to revise current anthropological conceptions of ideology to incorporate more of the neo-Marxist focus on politics because many institutions are mythologized in order to “justify the exercise of power” and to “sustain relations of domination.”  The neo-Marxist focus on individual agency, praxis, and “liberation” from oppressive social structures should also be incorporated into the study of institutions and institutional change.[xviii] Institutions can embody contradictions or conflicts.  Ideologies can be contested.  Power can reside not only in the actions of individuals, but within the very structure of institutions.  And actors can use the presence of multiple institutions, multiple ideologies, or institutional contradictions and conflict to exploit competing “institutional logics” as a “basis for resistance.”  John L. Campbell offers a compelling “typology of ideas” for understanding not only how ideas embody institutional structure, but also how institutionalized ideas can be used by actors in political debates over public policy.[xix] 

Recent scholarship has also approached organizations and institutions as social structures "embedded" within an "organizational ecology," whereby a series of "mutual interactions" between interdependent social groups shape the evolution and survival of organizations, organizational forms, and institutionalized practices and norms.[xx]  Organizations are connected to and influenced by a host of social sectors, including federal nations, geographical regions, states, sub-states, local governments, organizations, and social groups, like the family.[xxi]  Within each sector there are diverse "clusters of norms" and organizational typologies that institutionally define and constrain individual and organizational actors, and thereby, a host of institutional norms and forms are continually reified and perpetuated across a diversely populated social and organizational landscape, which slowly changes through time.  Because societies are characterized by such diversity of social sectors, each with their own institutions and norms, different institutions can be "potentially contradictory," which can allow for social conflict and social change through time as institutions develop in relation to the institutional and physical environment.[xxii]  However, it is still unclear how institutions "change" and what change actually means.  Theorizing the nature and extent of institutional change is an unresolved issue.  Institutions are seen as stable social structures, outside the control of rational agents, that slowly adapt to internal and environmental conditions through an incremental process, although there is some evidence to suggest that rapid changes can occur in short periods due to environmental shocks.[xxiii]

 

The Institutionalization of Junior Colleges and Community Colleges

Little attention has been paid to the process and degree of institutionalization within complex ecological and temporal social networks.  How are particular institutions socially constructed and reproduced in specific historical and geographical contexts?  Does the power and stability of particular institutions fluctuate in relation to specific historical and geographical contexts?  Powell has argued that “institutionalization is always a matter of degree, in part because it is a history-dependent process.”[xxiv]  It has become clear that institutions do change in relation to their environments and, thus, scholars who study institutions must be aware of the changing dynamics of historical circumstances and how socio-political and economic environments affect the structure, purpose, and power of existing institutions.

There have been only four historical-sociological case studies focused on the institutionalization of community colleges in the United States,[xxv] although most of these studies have not directly used the theory of institutions as an analytical framework.  These studies have mostly investigated the early formative stages of junior/community colleges in the early 20th century.  Conceptually, these studies trace "communities of practice" or "social movements" that have become ritualized, institutionalized, reproduced, and contested through time, but most of these studies do not follow the junior/community college from its origins to the 21st century, nor have many of these studies focused on how this institution has changed in relation to external socio-political environments.[xxvi]  Only Steven Brint and Jerome Karabel have attempted to trace the institutionalization of the community college into the latter half of the 20th century; however, their study went only as far as the 1970s and 1980s, and their focus on vocationalism precluded a larger analysis of the contested institutionalization of the community college in relation to other historical trends in the U.S., like racial segregation and the mid-twentieth century Civil Rights movement, or the late twentieth century movement for educational standards and institutional evaluation.  There is also a deeper methodological issue that has not been fully explored: the extent to which university scholars have exercised (and continue to exercise) a primary power over defining, legitimizing, and reforming both institutional discourse and practice.  Thus, the scholarly literature surrounding the junior/community college (and the actual impact this literature has had on practice) is perhaps the most revealing aspect of the public phenomenon of this institution, and thereby, the most important source for a history of the institutionalization process.

The American junior college turned community college will be used in this book as a case study for understanding the formation and historical evolution of a particular institution of higher education.  This case study draws upon the seminal work of Brint and Karabel, and it seeks to extend their basic thesis, with modifications, into the 21st century.  This study will seek to address the larger issues of increased access to higher education and social mobility, while also addressing the more particular issue of the institutional purposes and social role of the junior/community college.  As already mentioned, this study will focus on the political assumptions about who has the right or power to define this institution’s mission, who is responsible for enacting its roles, and how this institution is supposed to be valued and judged.  And this study will also leave open the much larger and more theoretical issue of whether or not particular human actors or groups have been able to control the contours of this institution. 

The basic narrative of the institutionalization of the junior college qua community college is one of social control and organizational anarchy.  While this institution was formed with clear purposes in mind, it became apparent early on that this institution was not operating as planned, nor was it efficient at achieving its stated missions.  Between 1920 and 1940 junior college leaders went through an intense identity crisis as they debated both the purpose of junior colleges and the placement of these institutions between secondary and postsecondary education systems.  Were junior colleges extensions of secondary schools, or were they separate "junior" colleges?  Were junior colleges primarily supposed to prepare academically talented students for entry into a 4-year university or were they also supposed to train less talented vocationally-oriented students for local labor markets?  And were junior colleges only responsive to universities and labor markets by training and credentialing post-secondary students, or was there also supposed to be responsiveness to local community needs, which might include non-credentialing purposes, like literacy classes, citizenship classes, and general community education classes?  Arguably a measure of consensus over these questions among junior college leaders, federal and state educational authorities, and the general public did not congeal until the publication of the President's Commission on Higher Education report in 1947.

In 1947 Higher Education for American Democracy seemed to not only legitimate junior colleges by arguing that half of the American population could benefit from two years of postsecondary schooling, but the report also seemed to sanction a broad comprehensive mission for these institutions by suggesting a new name, and thereby, a new institutional identity: the Community College.  Up until the publication of this report, junior college leaders had debated whether the primary function of the institution was to keep its traditional mission as a conduit for student transfers to 4-year universities, or whether it should adopt new missions like offering terminal occupational and semiprofessional programs.  Most junior college leaders leaned toward the latter of these two options because it would increase the legitimacy of the institution within already established systems of secondary and postsecondary education.  There were also calls for expanding the institutional mission to incorporate adult education, like literacy and citizenship classes, and also programs that would meet diverse local needs.

However, not everyone at the time saw the community college in such lofty, democratic, and egalitarian terms.  From the start, university officials promoted junior colleges because of their value as a "screening service" to divert many postsecondary students away from the selective and resource-limited universities.  State legislators also promoted junior colleges as a less expensive form of higher education for the masses that would allow for cost-effective means to democratize access to higher education, while also creating an institution that would filter out the unprepared or disadvantaged majority from actually earning a college degree.  The University of California, Berkeley sociologist Burton R. Clark famously called this the "cooling out" process.[xxvii]

Clark's thesis was famously extended into an internationally acclaimed book published in 1989, Steven Brint and Jerome Karabel's The Diverted Dream: Community Colleges and the Promise of Educational Opportunity in America, 1900-1985.  Brint and Karabel argued that the educational system in the U.S. has always been a "hierarchically differentiated" system that has been structurally connected to the labor market and class structure.  But the American educational system has also been relatively "open" and democratic, especially in the 20th century, and most Americans have seen it as a "ladder of opportunity" and "upward mobility."  The institution of community colleges offered an "egalitarian promise," but at the same time it also reflected the "constraints" of the capitalist economic system in which it was embedded.  Part of the reality of that system is an optimistic society that generates more "ambition" than it can structurally satisfy, which creates a need for an elaborate and often "hidden" tracking system to channel students into occupationally appropriate avenues largely based on their socio-economic origins.[xxviii] 

From its beginnings the community college has had the "contradictory" function of opening higher education to larger numbers of students from all socio-economic backgrounds while at the same time operating within a "highly stratified" economic and educational system, which created a need to "select and sort students."  This "cooling-out function" (or "the diversion effect") caused ever-increasing numbers of lower SES students in higher education to be diverted into more "modest positions" at the lower end of the labor market.  As Burton Clark once admitted, "for large numbers failure is inevitable and structured."  Brint and Karabel argued that not only do community colleges help "transmit inequalities" through their sorting function, but they also "contribute to the legitimization of these inequalities" by upholding meritocratic rhetoric that often blames the victim for failing to succeed in a structurally rigged class system: "The very real contribution that the community college has made to the expansion of opportunities for some individuals does not, however, mean that its aggregate effect has been a democratizing one.  On the contrary, the two-year institution has accentuated rather than reduced existing patterns of social inequality."[xxix]

The majority of students who enrolled in junior colleges during the first half of the 20th century were middle-class high school graduates looking to earn their bachelor's degree and enter a white-collar profession.  Working-class high school students either dropped out of high school early to get a job, or they waited until earning their high school diploma to enter the work force.  Very few working-class students entered junior colleges.  However, the point of Brint and Karabel remains substantial: junior college leaders in conjunction with community business leaders actively tried to manipulate junior college student aspirations by engineering more and more occupationally oriented terminal programs.  They also encouraged this route more passively by neglecting a pedagogically appropriate curriculum and adequate student support services geared toward less academically prepared students.  Many junior college students tended to either drop out or settle for a terminal occupational certificate.  By the 1970s, around 75 percent of low-achieving students would drop out during their first year in urban community colleges.  Critics also pointed out that it was not an accident that the lowest-achieving students in both secondary and postsecondary schools have historically been, and continue to be, the economically disadvantaged, ethnic/racialized minorities, immigrants, the disabled, and dislocated low-skilled workers.

Despite the transfer mission remaining a primary emphasis for most community colleges throughout the 20th century, the apparent manipulation of institutional purposes by community college leaders, state governments, and the business community has remained constant, if not intensified.  Recent scholarship on the community college has demonstrated that community college administrators have increasingly adopted an ideological stance of neo-liberal corporatism over the last couple of decades, which has directed them to focus on efficiency, productivity, and marketplace needs.  This has led to a much larger array of occupationally oriented terminal programs.  Some have claimed that these occupational offerings may be crowding out academic transfer-oriented programs, and leading away from an institutional climate focused on higher education.

A look at the history of higher education in the U.S. and the changing dynamics of student access does reveal some expansion of access and equity in terms of increasing amounts of post-secondary education for a broader swath of Americans.  However, traditionally underserved populations like the economically disadvantaged and many ethnic/racial minorities still struggle to achieve equality of opportunity in American society and its systems of higher education.  As the U.S. moves into a post-industrial “knowledge economy” in a highly globalized world, the issue of unequal student access to higher education remains a prominent and pressing political problem, and it has recently become intertwined with the issue of outcomes in terms of the educational and economic success of the student and the economic development of the nation. 

The open-access mission of the community college was forged in an environment of socio-political inequality, educational elitism, and restricted educational and financial resources.  Community colleges were designed to be under-funded and marginalized institutions in hierarchical state systems of education.  While access in community colleges was open to all, no provisions were made to ensure the success of students in community colleges, nor access to the more advanced and economically rewarding levels of the higher education system.  In fact, it was assumed that a great many students enrolled in community colleges would be drawn away from higher education and redirected to terminal, lower-status and lower-paid vocational careers.  Now that more and more students are clamoring for a university education because of economic conditions that heavily reward university credentials, the notion of community colleges as holding pens for the underprivileged has been questioned, and new policies are being promoted in order to make state systems of higher education more equitable and just.  Community colleges hold immense promise if they can overcome their historical legacy and be re-institutionalized with the proper staffing and financial resources.  However, the path-dependent nature of institutions and the limited rationality and power of institutional actors make these social structures incredibly resistant to change.  Can an institution that was "born subordinate" as the lower-level holding pen for the university overcome its own legacy and develop into a truly meritocratic and democratizing institution?  This study will not provide any easy answers.  Instead, this book will try to illuminate the parameters of this question by unfolding the historical trajectory of this institution in all its complexity, and along the way suggest possibilities for future change.[xxx]


Endnotes

[i] Brint & Karabel, The Diverted Dream: Community Colleges and the Promise of Educational Opportunity in America, 1900-1985 (Oxford: Oxford University Press, 1989); Christopher J. Lucas, American Higher Education: A History (New York: St. Martin’s Griffin, 1994); W. Norton Grubb and Martin Lazerson, The Education Gospel: The Economic Power of Schooling (Cambridge, MA: Harvard University Press, 2004), 158; Kent A. Phillippe and Leila Gonzalez Sullivan, National Profile of Community Colleges: Trends & Statistics, 4th ed. (Washington, DC: The American Association of Community Colleges, 2005), 70-73.

[ii] Lawrence R. Jacobs and Theda Skocpol, eds., Inequality and American Democracy: What We Know and What We Need to Learn (New York: Russell Sage Foundation, 2005).

[iii] Peter F. Drucker, “The Age of Social Transformation,” The Atlantic, 274, no. 5 (Nov 1994), 7, 10.

[iv] Robert Reich, “The Lost Art of Democratic Narrative,” The New Republic, 21 March 2005; Deborah Stone, Policy Paradox: The Art of Political Decision Making, revised ed. (New York: W. W. Norton, 2002): 1-54.

[v] Peter L. Berger and Thomas Luckmann, The Social Construction of Reality: A Treatise in the Sociology of Knowledge (New York: Anchor Books, 1966), 23; Paul J. DiMaggio and Walter W. Powell, "The Iron Cage Revisited: Institutional Isomorphism and Collective Rationality in Organizational Fields." In Paul J. DiMaggio and Walter W. Powell, eds., The New Institutionalism in Organizational Analysis (1983; reprint, Chicago: The University of Chicago Press, 1991), 41-62; Anthony Giddens, Capitalism and Modern Social Theory: An Analysis of the Writings of Marx, Durkheim, and Max Weber (Cambridge: Cambridge University Press, 1971); Ronald L. Jepperson, "Institutions, Institutional Effects, and Institutionalism," In Paul J. DiMaggio and Walter W. Powell, eds., The New Institutionalism in Organizational Analysis (Chicago: The University of Chicago Press, 1991), 143; Douglass C. North, Structure and Change in Economic History (New York: W. W. Norton & Co, 1981); John R. Searle, The Construction of Social Reality (New York: Free Press, 1995).  I think the most profound discussion of these problematical questions can be found in the work of Michel Foucault.

[vi] DiMaggio and Powell, “The Iron Cage Revisited,” 8; Roger Friedland and Robert R. Alford, “Bringing Society Back In: Symbols, Practices, and Institutional Contradictions.” In Paul J. DiMaggio and Walter W. Powell, eds., The New Institutionalism in Organizational Analysis (Chicago: The University of Chicago Press, 1991), 232-263; James Miller, The Passion of Michel Foucault (New York: Simon & Schuster, 1993), 150; Searle, The Construction of Social Reality.

[vii] Berger and Luckmann, The Social Construction of Reality, 15; DiMaggio and Powell, “The Iron Cage Revisited,” 20, 23, 26, 28; Andre Lecours, “New Institutionalism: Issues and Questions,” In Andre Lecours, ed., New Institutionalism: Theory and Analysis (Toronto: University of Toronto Press, 2005), 3-25; John W. Meyer and Brian Rowan, “Institutionalized Organizations: Formal Structure as Myth and Ceremony,” In Paul J. DiMaggio and Walter W. Powell, eds., The New Institutionalism in Organizational Analysis (1977; reprint, Chicago: The University of Chicago Press, 1991), 41, 44; Paul Pierson, Politics in Time: History, Institutions, and Social Analysis (Princeton: Princeton University Press, 2004), 20-21, 43, 51; Lynne G. Zucker, “The Role of Institutionalization in Cultural Persistence,” In Paul J. DiMaggio and Walter W. Powell, eds., The New Institutionalism in Organizational Analysis (Chicago: The University of Chicago Press, 1991), 85; Searle, The Construction of Social Reality.

[viii] Anthony Giddens, Capitalism and Modern Social Theory; James Miller, The Passion of Michel Foucault, 15, 150, 336.

[ix] James A. Berlin, “Postmodernism in the Academy,” In Rhetorics, Poetics, and Cultures: Refiguring College English Studies (West Lafayette, IN: Parlor Press, 2003): 60-82; Friedland and Alford, “Bringing Society Back In,” 232, 254; Jepperson, “Institutions, Institutional Effects, and Institutionalism,” 145, 149, 151-52, 158; Walter W. Powell, “Expanding the Scope of Institutional Analysis,” In Paul J. DiMaggio and Walter W. Powell, eds., The New Institutionalism in Organizational Analysis (Chicago: The University of Chicago Press, 1991), 188, 194-195; Steven Brint and Jerome Karabel, “Institutional Origins and Transformations: The Case of American Community Colleges,” In Paul J. DiMaggio and Walter W. Powell, eds., The New Institutionalism in Organizational Analysis (Chicago: The University of Chicago Press, 1991), 337-360.  Paul Pierson argues that the term institutional “change” is misleading because it is almost impossible to change institutions fundamentally.  Instead, Pierson recommends the term “institutional development” as a more accurate description of how institutions change through time.  Pierson, Politics in Time, 133, 137.

[x] Sherry B. Ortner, Anthropology and Social Theory: Culture, Power, and the Acting Subject (Durham, NC: Duke University Press, 2006), 7, 18, 127, 130, 133, 139, 147, 152.  See also: Pierson, Politics in Time, 137.  On the relationship between institutions and agency see: Ludwig Wittgenstein, Philosophical Investigations, trans. G. E. M. Anscombe (1953; reprint, Oxford: Blackwell, 2001), part I, 23.  Michel Foucault offers perhaps the most profound philosophical exploration of the very limits of human agency.  He reformulated Kant's "critique" as the only way human beings can get outside the totalizing power of institutions and refuse to be "subjected."  James Miller, The Passion of Michel Foucault, 302.

[xi] Ronald L. Jepperson and John W. Meyer, “The Public Order and the Construction of Formal Organizations,” In Paul J. DiMaggio and Walter W. Powell, eds., The New Institutionalism in Organizational Analysis (Chicago: The University of Chicago Press, 1991), 204-231; James March and Herbert Simon, Organizations, 2nd ed. (Oxford: Blackwell, 1993), 130; Meyer and Rowan, “Institutionalized Organizations;” Pierson, Politics in Time; W. Richard Scott, Organizations: Rational, Natural, and Open Systems, 3rd ed. (Englewood Cliffs, NJ: Prentice-Hall, 1992).

[xii] Karl E. Weick, "Educational Organizations as Loosely Coupled Systems," Administrative Science Quarterly, 21, no. 1 (March 1976): 1-19; March and Simon, Organizations, 157, 159; Michael D. Cohen and James G. March, "Leadership in an Organized Anarchy," In M. C. Brown II, ed., Organization and Governance in Higher Education, 5th ed. (1974; reprint, Boston: Pearson, 2000), 16-35.

[xiii] Paul J. DiMaggio and Walter W. Powell, "Introduction," In Paul J. DiMaggio and Walter W. Powell, eds., The New Institutionalism in Organizational Analysis (Chicago: The University of Chicago Press, 1991), 1-38; Meyer and Rowan, "Institutionalized Organizations;" North, Structure and Change in Economic History; Scott, Organizations: Rational, Natural, and Open Systems; W. Richard Scott and John W. Meyer, "The Organization of Societal Sectors: Propositions and Early Evidence," In Paul J. DiMaggio and Walter W. Powell, eds., The New Institutionalism in Organizational Analysis (Chicago: The University of Chicago Press, 1991), 108-140; Theda Skocpol, "Bringing the State Back In: Strategies of Analysis in Current Research," In Peter B. Evans, Dietrich Rueschemeyer, and Theda Skocpol, eds., Bringing the State Back In (Cambridge: Cambridge University Press, 1985), 3-37.

[xiv] DiMaggio and Powell, “The Iron Cage Revisited,” 66, 68; Meyer and Rowan, “Institutionalized Organizations,” 41, 44-46, 48-50, 58, 60; Scott, Organizations: Rational, Natural, and Open Systems,176; Scott and Meyer, “The Organization of Societal Sectors,” 123-24; Powell, “Expanding the Scope of Institutional Analysis,” 184.

[xv] Friedland and Alford, “Bringing Society Back In,” 243; Zucker, “The Role of Institutionalization in Cultural Persistence,” 83.

[xvi] Daniel Beland, “Ideas, Interests, and Institutions: Historical Institutionalism Revisited,” In Andre Lecours, ed., New Institutionalism: Theory and Analysis (Toronto: University of Toronto Press, 2005), 29-50; Ruth Benedict, Patterns of Culture: An Analysis of Our Social Structure as Related to Primitive Civilization (New York: Penguin Books, 1934), 2; Berger and Luckmann, The Social Construction of Reality, 23, 33; Clifford Geertz, “Ideology as a Cultural System,” In The Interpretation of Cultures (1964; reprint, New York: Basic Books, 1973), 193-233. Clifford Geertz, “Common Sense as a Cultural System,” In Local Knowledge: Further Essays in Interpretive Anthropology (1975; reprint, New York: Basic Books, 2000), 73-93; Karl Mannheim, Ideology and Utopia: An Introduction to the Sociology of Knowledge, Louis Wirth & Edward A. Shils, trans. (New York: Harvest Book, 1936); North, Structure and Change in Economic History; Ortner, Anthropology and Social Theory.

[xvii] Friedland and Alford, “Bringing Society Back In,” 253; Ortner, Anthropology and Social Theory; Powell, “Expanding the Scope of Institutional Analysis.”

[xviii] Terry Eagleton, Ideology: An Introduction (London: Verso, 1991), 224; John B. Thompson, Studies in the Theory of Ideology (Berkeley: University of California Press, 1984), 11, 131.  Eagleton makes several arguments about how the studies of ideology can be used towards an applied political science.  He argued, "If a theory of ideology has value at all, it is in helping to illuminate the processes by which such liberation from death-dealing beliefs may be practically effected" (p. 224).  Friedland & Alford also seem to incorporate such a possibility by highlighting the importance of subjectivity and actors in understanding institutional change (p. 254).  They also seem to agree with Eagleton that social scientists can knowingly or unwittingly get caught up in the reproduction of the status quo through uncritical study of "dominant institutional logics" (p. 260).  Friedland and Alford, "Bringing Society Back In."

[xix] John L. Campbell, “Institutional Analysis and the Role of Ideas in Political Economy,” In John L. Campbell and Ove K. Pedersen, eds., The Rise of Neoliberalism and Institutional Analysis (Princeton, NJ: Princeton University Press, 2001), 160; Friedland and Alford, “Bringing Society Back In,” 254-55.

[xx] Joel A. C. Baum and Jitendra V. Singh, “Organizational Hierarchies and Evolutionary Processes: Some Reflections on a Theory of Organizational Evolution,” In Joel A. C. Baum and Jitendra V. Singh, eds., Evolutionary Dynamics of Organizations (Oxford: Oxford University Press, 1994), 5; DiMaggio and Powell, “The Iron Cage Revisited;” Friedland and Alford, “Bringing Society Back In;” Scott and Meyer, “The Organization of Societal Sectors,” 137; Powell, “Expanding the Scope of Institutional Analysis.”

[xxi] Jepperson, “Institutions, Institutional Effects, and Institutionalism;” Doug McAdam and W. Richard Scott, “Organizations and Movements,” In Gerald F. Davis, Doug McAdam, W. Richard Scott, and Mayer N. Zald, eds., Social Movements and Organizational Theory (Cambridge: Cambridge University Press, 2005), 4-40; Scott and Meyer, “The Organization of Societal Sectors,” 117; Powell, “Expanding the Scope of Institutional Analysis;” Zucker, “The Role of Institutionalization in Cultural Persistence.”

[xxii] Friedland and Alford, “Bringing Society Back In,” 232; McAdam and Scott, “Organizations and Movements;” Pierson, Politics in Time; Zucker, “The Role of Institutionalization in Cultural Persistence,” 84.

[xxiii] Siobhan Harty, “Theorizing Institutional Change,” In Andre Lecours, ed., New Institutionalism: Theory and Analysis (Toronto: University of Toronto Press, 2005), 51-79.

[xxiv] Jepperson, “Institutions, Institutional Effects, and Institutionalism,” 151-52; Pierson, Politics in Time; Powell, “Expanding the Scope of Institutional Analysis,” 195; Zucker, “The Role of Institutionalization in Cultural Persistence,” 104.

[xxv] Brint & Karabel, The Diverted Dream; Brint and Karabel, "Institutional Origins and Transformations: The Case of American Community Colleges;" John H. Frye, The Vision of the Public Junior College, 1900-1940 (Westport, CT: Greenwood Press, 1992); Kenneth Meier, The Community College Mission: History and Theory, 1930-2000 (Chico, CA: Unpublished manuscript, 2008); David F. Labaree, "From Comprehensive High School to Community College: Politics, Markets, and the Evolution of Educational Opportunity," Research in Sociology of Education and Socialization: A Research Annual, 9 (Greenwich, CT: JAI Press, 1990): 203-240.

[xxvi] On practice theory see: Ortner, Anthropology and Social Theory; Etienne Wenger, Communities of Practice: Learning, Meaning, and Identity (Cambridge: Cambridge University Press, 1998).  On social movement theory see: Gerald F. Davis, Doug McAdam, W. Richard Scott, and Mayer N. Zald, eds., Social Movements and Organizational Theory (Cambridge: Cambridge University Press, 2005).

[xxvii] Burton R. Clark, The Open Door College: A Case Study (New York: McGraw Hill, 1960).

[xxviii] Brint & Karabel, The Diverted Dream, 5-19, 56, 59, 91, 205-32.

[xxix] Ibid.

[xxx] Brint and Karabel, “Institutional Origins and Transformations: The Case of American Community Colleges,” 349.

Educational Malpractice at the University of Texas at San Antonio

This essay was originally part of a series of documents that were shared with my former department chair at the University of Texas at San Antonio over several years. Later, I shared some of this information with the new incoming President in a letter I wrote him in December 2017. After it was clear that the university would do nothing to fix these issues, I compiled these essays and data, as well as other data and essays, and submitted them to university officials in the fall of 2018. Then I quit teaching in Texas.

 
On paper, her school claimed that almost all of its graduates were headed for college.  In fact, the principal said, most of them ‘couldn’t spell college, let alone attend.’
— quoted in Sharon L. Nichols & David C. Berliner, Collateral Damage: How High-Stakes Testing Corrupts America's Schools
 

Introduction: Cheating Cultures in Educational Institutions

UTSA has a serious problem that urgently needs to be acknowledged and addressed by administrators, faculty, and student affairs staff.  Most freshmen entering UTSA are not prepared for academic success in college, which is why UTSA has traditionally had low retention and graduation rates.  In order to support that claim, I will be providing an interdisciplinary synthesis of many bodies of academic literature in conjunction with original observations and research.  Rather than dealing with this serious problem, some faculty at UTSA are engaging in various forms of malpractice and fraud, especially in the Writing Program, which will be the focus of this report. 

Why are students unprepared for success in college?  There are many reasons for this predicament, including inadequate Texas funding of K-12 schools, especially during the Great Recession of 2008 and its aftermath, when hundreds of millions of dollars were cut from Texas schools, thousands of teachers were laid off, class sizes were increased, and the curriculum was cut.  Other reasons include inequitable educational resources in Texas K-12 schools, and low student motivation to succeed academically, which is tied both to students' psychology and their parents' socio-economic status. 

In Texas, as in most states around the U.S., students are graduating high school not only unprepared for college, but also without basic literacy and numeracy skills, including the ability to read and write.  Worst of all, many Texas high schools are fraudulently awarding diplomas to students who should not be graduating because these students lack basic numeracy, reading, and writing skills.  Nichols and Berliner (2007) analyzed a 2003 study, which researched 108 schools in Texas.  In half of these schools, "70 percent or more students were considered at risk of academic failure," and yet these schools graduated nearly all of their students and claimed a dropout rate of only "1 percent or less" (p. 83).  In 2000-2001, Houston, Texas, boasted a 1.5 percent dropout rate, yet one Houston principal admitted to a researcher, "On paper, her school claimed that almost all of its graduates were headed for college.  In fact, the principal said, most of them 'couldn't spell college, let alone attend'" (qtd. in Nichols & Berliner, 2007, p. 83).  Texas is not alone.  This kind of educational fraud is happening all over the U.S., most famously in Atlanta, where 35 educators were charged, and 11 were convicted, on state racketeering charges (Mitchell, 2017).

Just a couple of months ago, a teacher in Bastrop, Texas published a resignation letter that went viral around the country (Mulder, 2018).  She described the demoralizing environment in her school, where budget cuts left her without the materials she needed to teach, so she spent her own money on supplies, only to have her students damage or destroy them.  She also explained that she would be failing almost half of her class because the students lacked the necessary skills and would not complete assignments.  Bastrop administrators, she explained, were doing nothing to address this situation: they sided with non-performing students and their parents, and they blamed teachers for students' problems, all of which made the situation worse.  She exclaimed, "My administrator will demand an explanation of why I let so many fail without giving them support, even though I've done practically everything short of doing the work for them" (para. 7).

These poorly motivated and academically unprepared students not only manage to graduate from Texas high schools, but then they end up enrolling in community colleges and non-selective state universities, like UTSA.  There is enormous social pressure for everyone to go to college, but most unprepared high school students will drop out of college without ever earning a degree (Rosenbaum, 2001), and many of these dropouts will have unprecedented levels of student loan debt (Goldrick-Rab, 2016).  At open-door community colleges across the U.S. the dropout rate exceeds 70 percent (Beach, 2011).  Since its inception, UTSA has been a non-selective university and it admits many unprepared students.  Thus, it has always had low retention rates and low graduation rates, with currently fewer than 40% of freshmen graduating with a degree in six years, which is the highest graduation rate that UTSA has ever had. 

I have taught first-year freshmen in the Writing Program at UTSA for the past eight years.  I have found that most of my students lack basic study skills, not to mention the prerequisite reading, writing, and thinking skills required to pass Writing classes, let alone earn a college degree.  I will be documenting the academic metrics of my students later in this report.  Worst of all, many of these students who lack basic skills have already passed through writing classes at UTSA where they learned almost nothing, even though the majority of them received A and B course grades.

K-12 schools in Texas are failing to prepare all students for success in college, which is the root of the problem that I am addressing, but institutions of higher education in Texas are exacerbating this problem by not properly screening students, by not keeping academic standards high, and by not providing the necessary support that high-risk students need.  In particular, UTSA has inadequate enrollment procedures to ensure that incoming students are emotionally and academically prepared for success in college.  This includes the psychological readiness of students, as I have seen that about 1-2 percent of my students suffer from serious mental health problems, which prevent them from being successful students.  UTSA also has faculty inadequately trained to teach, especially adjunct faculty teaching freshmen.  And most importantly, rather than understand the problems of unprepared students and do the hard work of educating them, many faculty at UTSA are lowering academic standards and passing these unprepared students with inflated grades. 

Lowering academic standards harms unprepared students in several ways.  First, students are allowed to pass without demonstrating real knowledge, which gives them an inflated sense of accomplishment that will hinder future attempts at learning when they encounter more responsible faculty with higher academic standards.  A lack of learning will also eventually catch up with these students.  They are at high risk of failing more difficult classes as they move into their junior and senior year, and this will increase their chances of dropping out.  Worst of all, because they were socially promoted, these students spent thousands of dollars and accumulated higher amounts of debt without gaining any real knowledge or skills, and so they will drop out of college worse off than when they started. 

Plus, pandering to unprepared students with lower academic standards also harms high-achieving students, who will not get the full education they want, which will limit their opportunities to be successful in graduate school or the labor market.  This situation also harms committed scholar-teachers who have high academic standards and use evidence-based teaching practices, because students complain about the hard work and double standards of more competent faculty, and administrators complain about higher D/W/F grades. 

I have spent the last eight years at UTSA wrestling with this situation by spending thousands of unpaid hours poring over the academic literature and tirelessly innovating in my classrooms, including developing my own curricular materials tailored to my students' needs.  As the philosopher Richard McKeon (1953/1990) pointed out over a half century ago, teaching and scholarship should be deeply intertwined (p. 34).  I have always used my passion and commitment as a scholar to better inform my teaching and the development of my curriculum.  I have used my academic skills to study not only the subjects of literacy, epistemology, and communication, which I teach, but also the predicaments of my students at UTSA and the Writing Program that is supposed to be training them.  For the past eight years, I have been formally assessing and documenting student motivation and academic proficiency.  I have also been studying the quality of the faculty and the curriculum of the Writing Program, as well as some of the institutional policies of UTSA.  I have tried to understand the complex causes of student failure so that I could develop innovative ways of increasing student motivation and achievement, and I have also tried to share this knowledge with my colleagues to promote department and institutional reform. 

While I am focusing on the Writing Program at UTSA, I want to reiterate that many schools around the country have endorsed “playing school” and/or adopted a “cheating culture,” not only to deal with unprepared students who cannot or will not learn, but also to deal with unrealistic policy directives that have been pushed by politicians and school administrators (Nichols & Berliner, 2007, p. 33).  I will explain how UTSA is one of those schools, but it is not alone in its predicament.  I have talked to a colleague at UT El Paso, which is facing the same problems as UTSA, and I have personally seen these same problems at Austin Community College and St. Edward’s University, where I have also worked as an instructor.  As educational scholars Nichols and Berliner (2007) have documented, “through the overvaluing of certain indicators, pressure is increased on everyone in education.  Eventually, those pressures tend to corrupt the educational system” (p. 34).  Because of the widespread political pressure to increase student success metrics, especially retention and graduation rates, there is widespread “potential” at all levels of our educational system “for manipulating data” (p. 84) and engaging in various forms of educational malpractice and fraud to cook the books. 

Around the country, educators feel enormous pressure to play school, pass underperforming students through the system, and award the maximum amount of credentials.  This pressure is corrupting not only K-12 schools, but also institutions of higher education.  In 2006 the Spellings Commission released its final report, A Test of Leadership: Charting the Future of U.S. Higher Education.  In this report the committee noted, “There are disturbing signs that many students who do earn degrees have not actually mastered the reading, writing, and thinking skills we expect of college graduates. Over the past decade, literacy among college graduates has actually declined. Unacceptable numbers of college graduates enter the workforce without the skills employers say they need in an economy where, as the truism holds correctly, knowledge matters more than ever” (p. vii). 

Like the cheating teachers in the Atlanta scandal, my colleagues in the Writing Program at UTSA are not bad people with sinister motives.  I believe that most of them sincerely want to help students succeed in college and in life, and most of them feel pressured by our Department Chair and the administration.  Yet despite their good intentions, as documented in studies around the country at all levels of schooling, unprofessional and fraudulent practices, including rampant grade inflation, end up hurting students in many ways, especially the most disadvantaged (Marcus, 2017).  The unprofessional and fraudulent practices in the Writing Program must stop, not only for the benefit of faculty and students, but also to protect UTSA’s academic integrity so that our university can rise to Tier I status, and thereby use its future position to better serve the people of San Antonio and the entire state of Texas.

I believe in transparency, the importance of data and scientific analysis, and the power of knowledge to transform policy and practice.  I want to publicize the problems that UTSA faces so that these problems can be acknowledged and addressed with new policies and better practices.  I sincerely want to see UTSA develop into a Tier I university that can make a real impact in the lives of students, especially disadvantaged students who will benefit the most from a college education.

Malpractice and Fraud Due to Mismanagement in the Writing Program at UTSA         

I have been a faculty member in the Writing Program at UTSA since 2010.  From the beginning, I noticed that this department was dysfunctional and beset by many problems.  I have formally studied higher education and published on the subject, so once I arrived, I immediately started collecting and analyzing data on student motivation and academic achievement, as well as the strengths and flaws of the Writing Program curriculum, the program faculty, and UTSA policies.  Over the past eight years, I have shared my data and preliminary conclusions with my colleagues many times, but I came to realize that most of my colleagues did not want to address the problems our department faced.  Instead of researching and engaging with these problems, the Chair of the Writing Program has not only ignored the problems I raised, but has continued or initiated many counter-productive and unprofessional practices.  I also think that some of these practices constitute forms of educational fraud. 

In her defense, the Chair has told me that both our current Dean and our previous Dean have supported her unprofessional and fraudulent policies.  So while I am laying responsibility primarily on the Chair in this report, other senior administrators at UTSA, and the institutional culture they have fostered, have most likely contributed to the dysfunctional nature of the Writing Program.

First of all, it is important to understand that most faculty in the Writing Program have only a two-year master’s degree in English, most of which were earned at low-tier public universities, like UTSA.  Further, some of our faculty are graduate students in the English department who are unprepared and unqualified to teach.  Only a few of our faculty have delivered conference papers, and just two or three have published an academic paper.  Most of our faculty don’t even read academic research of any kind, let alone write and publish research.  Many of our faculty watch TV, play video games, read novels, or use social media during their office hours, rather than engage in serious scholarship, like research, writing academic papers, conducting peer review, or engaging with public policy. 

Maybe one or two members of our department, besides myself, actively engage in academic research or publish scholarship.  And as far as I know, I am the only person who is an active peer reviewer for academic journals or professional scholarly associations.  I am the only person serving on the editorial board of an academic journal.  I am the only person in the department to have written and published academic books, which have been peer reviewed and widely cited by other scholars around the world.  I am also the only faculty member in the department whose scholarship has been formally endorsed by a President of the MLA, the professional body that governs the discipline of English.

Most of my colleagues have only two years of graduate school, studying fictional literature, a subject that is completely unrelated to the current curriculum of the Writing Program at UTSA, which is focused on academic writing, critical thinking (including quantitative reasoning), and argumentation.  Almost none of my colleagues have any formal training in quantitative reasoning, and many are hostile to the very subject of math.  Further, most of my colleagues have never professionally written anything, which is a deeply troubling thing to say about a Writing Program at a research university aspiring to Tier I status.  Unlike many other traditional academic subjects, which are abstract and theoretical, writing is a practice and a craft, and you can only teach it well if you actually practice it as a professional, especially, I would argue, if you are teaching academic or scientific writing, which includes the important process of critical peer review.

Low-quality faculty is one of the core problems our department faces, but no one in the Writing Program wants to discuss this problem, especially the Chair.  I would argue that this topic is taboo largely because of the widely documented “Dunning-Kruger Effect” (Nichols, 2017, pp.  43-44).  The more ignorant and incompetent people are, the less likely they can see their ignorance and incompetence; thus, the more likely they believe they are competent, and the more likely they will resist new information to fix the problem they cannot see or acknowledge.  Low-quality faculty simply cannot see, let alone understand, their many inadequacies.

Why does the Writing Program have so many low-quality faculty?  Largely it’s due to institutional policies at UTSA.  For adjuncts, UTSA offers low pay and poor working conditions teaching unprepared undergraduates, so it does not attract highly qualified faculty, especially for programs that focus on freshmen (the tenure track is a different story, and I am not addressing tenure-track faculty in this report).  I have heard that many other departments at UTSA suffer from problems similar to the Writing Program’s, and many of my colleagues have claimed that the new Academic Inquiry program is the most dysfunctional program at UTSA, but I have no direct knowledge or data to address that claim. 

When I was hired at UTSA, my education, experience, and scholarship were ignored, and I was placed at the bottom of the pay scale at the poverty-level wage of $24,000 a year, plus benefits (before taxes).  To add insult to injury, this was the full-time salary, and as a new adjunct, I was only given full-time status in the fall semester and then part-time status in the spring, so I was making only $18,000 a year (before taxes and mandatory retirement contributions), with paid benefits for only four months because I could not afford to pay the premiums for the rest of the year.  This was much less than I was paid for the same position in California or Oregon.  Few competent scholars would put up with such dismal compensation.  Like almost all of my adjunct colleagues, I have had to work a second or third job to survive financially.

While these poor working conditions are a problem for all UTSA adjunct faculty, there is another problem unique to the Writing Program because it mostly employs faculty with English degrees.  As I will point out in the next chapter, due to the arbitrary history of English as an academic discipline, most of my colleagues have not been trained in the subjects they have been hired to teach.  With master’s degrees in English, most of our faculty have no formal training in the core fields of rhetoric, composition, communication, or critical thinking (let alone quantitative reasoning) that govern our curriculum.  Further, I don’t think any faculty member in the department, other than myself, has training in the field of education, which covers teaching, student learning, curriculum, and educational assessment.  Given the challenging student population we are trying to teach, knowledge of education is a must if our program seeks to be successful.

A poorly trained adjunct faculty would not necessarily be an insurmountable problem if they were properly supervised and trained with ongoing professional development.  However, there has been no proper supervision or real professional development in the Writing Program.  Traditionally, university faculty engage in faculty development by attending academic conferences and engaging in scholarship.  But adjuncts at UTSA are so impoverished that few of them can afford to attend a conference.  With only a master’s degree, most are not adequately trained to engage in scholarship or research, and those who do have the necessary training and aspire to publish are often too busy to do so because they have to work multiple jobs to survive.

But the Chair has also actively subverted basic university standards of faculty professionalism, largely because she has shirked her responsibility to adequately supervise or train low-skilled adjuncts.  She has disregarded the essential practice of “professional development” that is mandatory in all top-tier research universities.  Instead of engaging with the academic and scientific research that governs our multi-disciplinary curriculum, she uses our twice-yearly meetings to do two things.  First, she takes hours to restate all of the basic policies published in the Faculty Handbook and discuss institutional and departmental news, which is a useless and demeaning ritual; all of this information could easily be sent to faculty via email.  Then she has untrained faculty members hastily put together half-baked presentations (rarely addressing even a single academic source) on topics they are usually unqualified to discuss.  Or she has unqualified textbook representatives give the department a presentation on how to teach to the textbook.  These practices do not constitute “professional” development.

I can remember only three times in eight years that we have had a trained scholar speak during our professional development days, and I was one of those three speakers.  I agree with Brint (2008) and Grubb (1999) that higher education faculty need greater professionalization when it comes to teaching and being responsible for student learning, especially adjunct faculty (hence one of the reasons why I wrote this report).  Brint (2008) has called for the “reconstruction of college teaching as a profession” (p. 5).  I have continually urged the Chair and the main coordinating committee to offer real professional development engaging our faculty with the academic and scientific literature, but no one wants to spend the time, money, or effort that real professional development would demand. 

Our department “norming” sessions are another example of how university standards of faculty professionalism have been subverted.  Once each semester, all faculty gather in small groups to evaluate three student essays from the previous year.  These meetings could be an invaluable time to discuss research on the core concepts of our curriculum and how best to teach these concepts, as well as the subjects of student learning, curriculum, and educational assessment.  But in most meetings there is no critical assessment of teaching or student learning, or any learned discussion of any topic.  The Chair selects mostly inexperienced and unqualified faculty to lead these meetings.  Most don’t know what they’re doing, so they just follow a prescribed and mindless ritual initiated by the Chair.  Rather than discuss best practices in the scholarly literature, these “norming” sessions usually reinforce unprofessional subjective opinions and common sense.  When listening to my colleagues evaluate student writing, it is clear that most of them suffer from what medical researcher Archie Cochrane calls “the God complex,” a common ailment of semi-professionals and non-professionals: They don’t need research or data to back up their claims; “they just know” the truth because their subjective intuition tells them it’s true (qtd. in Tetlock & Gardner, 2015, p. 31).      

While the Writing Program does engage in a yearly assessment process, I found that this process is deeply flawed, if not fraudulent.  For one, if you read the actual assessment reports you will see that some of the numbers are inconsistent, which shows that these reports are not carefully put together.  Further, the department does not have specific, measurable, and valid SLOs or core course topics, nor are there any calibrated assessment tools to measure any specific or objective evaluative criteria.  In fact, the Chair has ignored my continued requests for clear Student Learning Objectives and core course topics so that faculty can better assess which students demonstrate the skills needed to pass a class, especially in terms of the transition from Writing I to Writing II.  I finally got these core concepts into the latest report of the main coordinating committee this year; however, in a recent email announcing these changes to the department, the core concepts appear to have been deleted, so I’m not sure whether they have been made official policy, put on hold, or discarded. 

In order to illustrate the Writing Program’s lack of valid SLOs, take for example the grading standards and learning objectives set forth in our Faculty Handbook for 2017-2018.  Here are some of the vague standards for an A paper: “not commonplace or predictable,” “original,” “polished,” “strong,” “varied,” and “well-chosen” (p. 10).  Or take some of the program goals for WRC 0203: “address the needs of different purposes” and “use appropriate format, structure, voice, tone, and levels of formality” (p. 49).  I don’t know what these words are supposed to mean, and I certainly could not see them or objectively evaluate them in a student paper. 

Philosopher Harry G. Frankfurt (2005) calls this sort of vague verbiage “bullshit,” which is language that cannot be clearly understood or empirically verified.  Frankfurt (2005) argues that bullshit is worse than lying because people are “not even trying” to be accurate, and more importantly, because they are not “committed” to the truth of their statements (pp. 32, 36).  Frankfurt (2005) criticizes bullshit as “empty, without substance or content…No more information is communicated than if the speaker had merely exhaled” (pp. 42-43).  When the Writing Program officially uses and endorses vague, “bullshit” criteria like this, each faculty member will subjectively grade in idiosyncratic ways.  Vague SLOs lead to invalid assessments of student work, and also to an unfair range of scores, not only between teachers, but also between assignments in the same class.

Another example of vague, “bullshit” SLOs can be found in the Writing Program definitions of “critical thinking.”  These official definitions are vapid, widely inconsistent, almost completely false, and disconnected from the academic disciplines of psychology and philosophy, which govern the concept and practice of critical thinking.  For the portfolio assessment in the Faculty Handbook, the “critical thinking” objectives include: “summary, paraphrase, analysis, evaluation, and critique” and “thoughtful selection and meaningful synthesis of supporting evidence” (p. 74).  None of these terms is directly connected to the actual definition of critical thinking.  For the Core Curriculum Appendix I, the critical thinking standards are: “creative thinking, innovation, inquiry, analysis, evaluation, and synthesis of information” (p. 45).  Again, this list of vague words has no connection to the definition of critical thinking.  And for the program goals of WRC 0203, critical thinking supposedly means “use writing and reading as resources of inquiry and communication,” “recognize, understand, summarize, and evaluate the ideas of others,” “understand the power of language and knowledge,” and “understand the interactions among critical thinking, critical reading, and writing” (p. 49).  As far as I can tell, whoever wrote this handbook was just mindlessly stringing together vague words or making stuff up.  None of these definitions of critical thinking is consistent, let alone coherent or accurate. 

How can a program teach the core skill of critical thinking when nobody actually knows what it means, let alone how it is done or how to assess it?  I made this exact point to the main departmental coordinating committee several times, and yet no one understood what I was saying.  My colleagues think the meaning of critical thinking is obvious because it is simply a matter of common sense, and apparently they can’t see, or choose not to see, the inconsistent and incoherent lists of vague words in our Faculty Handbook.  As a scholar who has published on epistemology and cognition, I’m embarrassed to be associated with the meaningless, confused, and false official language of our department on critical thinking, which was obviously written by amateurs who had no understanding of what they were talking about. 

The vague and sometimes meaningless SLOs are just one part of an imprecise and invalid system of “holistic grading,” which the Chair has instituted, with the support of many colleagues.  This subjective grading system leads to inflated grades that are not correlated with actual student knowledge or skills. This system is usually applied inconsistently, and the data collected is often invalid because the rubrics allow a wide range of subjective opinions.  Writing Program faculty have no training in, or knowledge about, educational assessment, and so they don’t understand how and why their holistic rubrics and subjective evaluations are flawed.  In fact, the Chair and many of my colleagues are biased against the very idea of collecting objective data and conducting educational assessments.  No one in the department, except for myself, seems to understand the concept of validity or the value of assessing objective learning standards. 

As I will demonstrate later in this report, I have found evidence of very limited student learning in the Writing Program.  I believe that this lack of student learning is a direct result of the lack of objective SLOs and of invalid, holistic assessment instruments, which, in conjunction with departmental and institutional pressure, lead to low and subjective academic standards and to grade inflation, which can be seen as forms of academic fraud.  Faculty subjectively judge student work, use low standards, and inflate grades that are unconnected with real student knowledge or skills, rather than do the harder work of validly assessing objective criteria.

While I am very concerned about grade inflation and the lack of student learning in the Writing Program, I am more concerned with the fact that the Chair seems to be orchestrating this fraud, both directly through departmental policy and indirectly through her comments and the norms she has set.  She makes it clear to faculty that the majority of our students should pass our classes, regardless of actual effort or academic performance.  She also makes it equally clear that she views failing grades as the fault of incompetent instructors, not the result of low student achievement or low motivation.  Thus, most of my colleagues are not only using holistic and subjective rubrics to award inflated grades that are disconnected from actual student knowledge or skills, but they are also awarding lots of extra credit and empty points for attendance and participation in an effort to artificially boost student grades and maintain the Chair’s mandated grade distribution. 

[Figure: departmental grade distribution vs. my students’ assessed pass rates by semester; the lower dotted line shows “Essay 1 Pass”]

Suspiciously, our department has maintained the same basic grade distribution every semester for many years.  In the Writing Program, 80 percent of students consistently earn passing grades every semester, and over 60 percent consistently earn A or B grades.  Interestingly, this inflated grade distribution has always been much higher than our yearly program assessments, which reveal a passing rate about 11 to 15.5 percent lower.  On these yearly assessments, the Writing I average score went from 73 percent passing in 2015-2016 down to 70 percent passing the following year, while the Writing II average went from 75 percent in 2015-2016 down to 69 percent the next year and then down to 67.9 percent last year.  The yearly assessments show a downward trend similar to the one I have found in my own classes, albeit with a much gentler slope, as can be seen in the previous chart (my students’ performance at the start of the semester is indicated by the lower dotted line, “Essay 1 Pass”).  While I think the yearly reports are a more accurate reflection of student skills than course grades, I still think these yearly assessments are inflated, due to untrained faculty, vague SLOs, and subjective holistic grading techniques.
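
To make the size of this gap concrete, here is a quick arithmetic check, sketched in Python.  This is my own illustration, not the department’s methodology: it simply compares the steady 80 percent course-grade pass rate against the yearly assessment pass rates quoted above, in both absolute percentage points and relative percent (I am assuming “the following year” and “last year” correspond to 2016-17 and 2017-18).

    # Compare the steady ~80% course-grade pass rate with the yearly
    # assessment pass rates quoted in the text.  Illustrative arithmetic only.
    course_pass = 80.0  # percent of students earning passing course grades

    assessment_pass = {
        ("Writing I", "2015-16"): 73.0,
        ("Writing I", "2016-17"): 70.0,
        ("Writing II", "2015-16"): 75.0,
        ("Writing II", "2016-17"): 69.0,
        ("Writing II", "2017-18"): 67.9,
    }

    for (course, year), rate in assessment_pass.items():
        point_gap = course_pass - rate                 # absolute percentage points
        relative_gap = 100 * point_gap / course_pass   # percent lower, relative
        print(f"{course} {year}: {point_gap:.1f} points "
              f"({relative_gap:.1f}% relative) below course grades")

By either measure, the assessment results fall well below the grades actually awarded, with the largest gap (12.1 points, or about 15 percent in relative terms) in the most recent Writing II assessment.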

The official numbers publicized by the Writing Program sound impressive, and that’s exactly what the Chair and the rest of my colleagues want UTSA administrators to think.  However, my research shows that the majority of students in our department cannot read, write, or critically think, and I have hundreds of student essays to prove it.  Worse, students are passing through Writing I at UTSA without mastering the basic prerequisite skills that are supposed to be taught in that class.  As the chart above shows (the lower dotted line “Essay 1 Pass”), last spring only 22 percent of students in my Writing II class could demonstrate core Writing I skills, down from 43 percent in Spring 2016.  I have consistently found that over half of the students who “pass” Writing I cannot demonstrate most core skills, including how to read.  I will discuss this issue with more data later in this report.  For now I want to point out, as the chart above illustrates, that my measurements of student competency have been steadily decreasing since 2016, yet the departmental grade average has held constant, which suggests that my colleagues are not only artificially inflating grades to keep them in line with traditional standards, but are consistently inflating grades in roughly the same proportion every semester, which suggests coordination, if not collusion.

The Chair prints the grade distribution of each faculty member every semester, and she criticizes any faculty member whose distribution does not match departmental averages, which (as it states clearly in our Faculty Handbook) leads to lower performance ratings, which in turn lead to lower rates of promotion and less employment via fewer classes offered the following semester.  I have been a victim of this vicious circle for years.  The majority of my colleagues who consistently inflate their grades also consistently earn high scores on student evaluations, while my dropping grade distribution has matched my dropping student evaluation scores.  It is interesting to note that these scores can vary a lot between different sections of the same class during the same semester, which clearly shows that it is not the instructor’s teaching that is being measured but the subjective feelings and opinions of students.  In effect, the Chair is coordinating department-wide fraud by demanding that faculty meet inflated grade distributions, which produce inflated student evaluation scores, and she punishes faculty who fall below her prescribed distribution levels.  I tried to collect more precise data on this issue last year, but my colleagues verbally assaulted me for even talking about it, and the Chair told me that the Dean doesn’t see any problem with the Writing Program’s inflated grade distribution.

This fraud is further enabled by the Chair’s misuse of student comments and student evaluations, which constitute 30% of our yearly employee evaluations (as stated in the Faculty Handbook), although I suspect that student comments used to carry much more weight in the past.  Faculty who lower standards and inflate grades get higher student evaluations and fewer complaints because students don’t have to work as hard to get high grades and most students pass a class regardless of effort or skill.  In contrast, faculty with high academic standards award objective grades tied to the actual academic performance of students, which means students have to work harder to demonstrate real learning, and often receive lower grades in the process, which causes students to complain and give instructors lower evaluation scores, which in turn leads to lower performance ratings by the Chair, lower or no promotions, and reduced or lost employment.  It is a vicious circle.  So it is easy to see why most faculty in the department participate in the fraud that the Chair is orchestrating.

Because my colleagues want steady employment with the least amount of student complaints, they lower their academic standards and they do not challenge students to demonstrate real learning.  I have personally seen some colleagues give students meaningless and vapid instruction, whereby students simply have to unthinkingly follow a prescribed ritual to receive a passing grade.  My colleagues are passing students (if not giving them As and Bs) who can’t read, write, or critically think.  One of my students last semester confided in an essay, “My last writing course I had taken my first semester had been a breeze” (Student #1).  Another student explained, “This course [my Writing II course] took me by surprise” because “in none of these schools [previously attended] was I introduced to the importance of objectivity over subjectivity and how to correctly identify points and thesis within works” (Student #2).  Students are passing classes at UTSA with an inflated sense of knowledge and skill, which will result in future struggles when they eventually reach a class (or job) where they actually have to meet objective standards to succeed.  I find this situation to be deeply troubling, and I consider it not only unethical, but also a violation of the academic integrity of UTSA and of the rights of our students as citizens and customers who deserve a real education for their tuition dollars.        

While unprofessionally low standards and the fraudulent practice of enforced grade inflation are the most serious issues affecting our department, the Chair has also intentionally subverted some other important UTSA policies, which have lowered the quality of teaching and student learning in our department.

Ironically, our department has been tasked by UTSA to teach quantitative reasoning, which is not actually happening in most of our writing classes because English teachers are notoriously fearful of math.  Instead of quantitative reasoning, many of my colleagues teach students how to subjectively opine or lie with descriptive statistics.  For the most part, the Chair appointed unqualified faculty members to lead the quantitative reasoning and writing initiative (the one qualified faculty member retired soon after the program was initiated).  Most members of the department have no understanding of quantitative reasoning or descriptive statistics, and some members are actively hostile towards the practices of data collection and scientific objectivity. 

In the Writing Program Faculty Handbook (pp. 19-20), someone has badly summarized and pasted block quotes from the UTSA report on Quantitative Scholarship.  I have not read the original documents, so I cannot judge their contents, but I can assess how they are being summarized and presented to our faculty.  There is a bunch of empty language framing a vague purpose, like “understand and evaluate data, assess risks and benefits, and make informed decisions” (p. 20).  What does this mean exactly, and how is it done?  Few in our department could hazard a guess. 

Nowhere in the Handbook is there any definition of quantitative reasoning or how it is concretely done (for example, a discussion of descriptive vs. inferential statistics, probability theory, the concept of validity, or sampling, including sample size and sample bias).  Instead of concrete and knowledgeable directions, there is a decontextualized and vague process (Explore, Visualize, Analyze, Understand, Translate, and Express).  The “Q” program, as it is known in our department, has been largely reduced in most classrooms to a meaningless bureaucratic shuffling of papers, whereby faculty take a couple of days to follow a prescribed worksheet.  Why?  Faculty have not been properly trained in quantitative thinking, and official documents offer nothing but vague and useless verbiage.  Worse, most of our faculty have strong biases against math and quantitative thinking.  Thus, the department’s assessments of quantitative reasoning have been deeply flawed and of questionable validity. 

Likewise, there is an important system-wide UT initiative for peer-observation to improve faculty teaching (see UTSA HOP 2.20), which is being subverted.  Once again, the Chair assigned unqualified faculty to work on this program.  And while the basic letter of the law has been followed, for the most part, the spirit of the initiative has been undermined, which I think constitutes another form of fraud.  The leader of this initiative has no teaching experience (he was a reporter before becoming a teacher a couple of years ago), and he has never done any formal study or research on education, teaching, or student learning.  For many years, I have asked the Chair to develop this program into a serious endeavor to teach faculty about the science of teaching, assessment, and student learning, but she has refused to listen or do anything to develop this program.  I was recently peer-reviewed by the leader of this program.  He sat in on a class…and that was it.  I assume he filled out the paperwork, but it doesn’t matter because this paperwork is ignored by the Chair and simply put in employee files.  The exercise is currently a pointless waste of time that is of no value to the observer, the observed, or the program.  Peer-observation is a vital component of faculty development, and the Writing Program has reduced it to another meaningless bureaucratic shuffling of papers.

Not only are many faculty inadequately prepared to teach, but competent faculty are prevented from teaching effectively.  First of all, the Chair has mandated an unprofessional “teach-to-the-textbook” curriculum, which many faculty in the department slavishly follow because they have no original ideas about teaching or learning, let alone about the subject matter we are employed to teach.  Many of our faculty do little more than follow the prescribed readings and exercises in our textbooks without fully understanding what they are teaching or why, let alone designing original curricular materials.  Many faculty cannot recognize erroneous or outdated information in our textbooks, and so students get indoctrinated with useless material, which causes confusion when they are later presented with correct information in future classes.  When I first joined UTSA, I gave a short lecture to the whole department criticizing the theoretical faults and factual inaccuracies in one of our main textbooks.  The Chair and the rest of the department ignored my presentation; we continue to use this deeply flawed book, and most of our students continue to absorb useless and inaccurate information.

Another issue that prevents quality teaching is our department’s over-reliance on student comments and evaluations, which constitute 30 percent of our yearly employment evaluations. These evaluations determine faculty raises, full-time employment, and promotion.  I will demonstrate later in this report how student evaluations are invalid, discriminatory, and inversely connected to actual student academic achievement.  Many faculty in our department are scared of honestly interacting with their students for fear that students will complain and their employment will be jeopardized.  One junior faculty member who had been with the program for only two years confided in me that he wishes he could talk to students the way I do, but he fears student complaints, which would cause retaliation by the Chair and the departmental promotion committee.

Despite earning a PhD, the Chair has not been properly trained in information literacy, and she has shared many emails from “fake” and predatory “scholarly journals” requesting faculty research for publication, scams that usually involve charging money to publish.  Somehow, these spam emails get through UTSA’s email security, which really needs to be addressed by the IT department.  On several occasions I had to explain predatory journals and Beall’s List to the Chair, which apparently she had never heard of.  She was passing these emails on to the rest of the department as legitimate publishing opportunities, which I think is irresponsible and unprofessional. 

Early in my career here at UTSA, the Chair also passed along an email about a summer teaching program in China.  I responded to the email, thinking my Chair had actually screened the emails she sent to the department and would only forward legitimate program opportunities for our faculty.  I went to China to teach for that program, and it turned out to be fraudulent in many ways (including the altering of final grades to pass every student).  I wrote a book about that fraudulent program to warn both faculty and students.  I also alerted an editor at the Chronicle of Higher Education and worked with a reporter to research these fraudulent programs in China, and I was the main source of an article on the subject in that publication.  A competent department Chair would take personal responsibility for screening any information shared with the whole department, because sharing such information can be seen as an endorsement.

The unprofessional and fraudulent practices that I have described in the Writing Program at UTSA have become commonplace in higher education.  Several decades ago, Dennis McGrath and Martin B. Spear (1991) published The Academic Crisis of the Community College, which documented a disturbing trend in community colleges that has since spread to research universities, especially non-selective institutions like UTSA.  McGrath and Spear (1991) argued that college classes are “conventional and mostly ineffective” (p. 48) because they “expect little commitment or effort from students and provide only meager models of intellectual activity” (p. 19).  The curriculum is often “mere compilations of facts, strung together by discrete concepts within a transparent theory” (p. 30).  Both faculty and students have lowered “expectation[s] about what counts as rigorous academic work” because “intellectual activity [has] bec[ome] debased and trivialized, reduced to skills, information, or personal expression” (p. 54).  Faculty focused on teaching freshmen and sophomores are “disengage[d] from disciplines” and this “spawns a progressive, if silent, academic drift – away from rigor, toward negotiated anemic practices” (p. 142).  Many students in America at open-door or non-selective institutions of higher education are getting a “scaled-down” (p. 93) “weak version” of college (p. 12) and a “significant leveling down of the ‘norms of literacy’” (p. 15), which limits their possibilities and sets them up for failure.

I have tried to understand this trend with my scholarship, especially my book on the history of community colleges in the U.S. (Beach, 2011), and I have tried to battle against this trend with my teaching.  Unlike my colleagues, I hold high and largely objective academic standards because I am very familiar with the research on good teaching and student learning, which I will discuss at more length later in this report.  Good college-level teaching should connect students to the objective world via scientific scholarship and critical thinking so that students can learn how the world works, and so students can learn real skills to successfully operate in the world (Gopnik, 2016, p. 180).  As developmental psychologist Alison Gopnik (2016) documents, authentic learning takes place through concrete activities, whereby students become “informal apprentices” (p. 181) who “practice, practice, practice” (p. 182) the specific skills the teacher introduces.  Students move from ignorance and incompetence to competence, and then finally to “masterly learning,” whereby they take what they have “already learned and make it second nature” (pp. 182, 204). 

The teacher’s job is to explain, demonstrate, critically analyze, and evaluate students’ practice so that they can learn from their mistakes.  Gopnik (2016) explains, “With each round of imitation, practice, and critique, the learner becomes more and more skilled, and tackles more and more demanding parts of the process” (p. 185).  The learning process requires a lot of work and effort, and it can be “grueling” and painful (p. 185), which causes many students to complain about the effort they have to expend.  True learning is much more like the apprenticeship model found in sports and music than like traditional academic schooling (p. 186); such apprenticeships are more demanding than just memorizing information and filling in answers on standardized tests.  I seek to give my students a real education along the lines of the apprenticeship model that Gopnik (2016) discusses, and I hold my students to high academic standards that push them beyond their subjective preferences towards knowledge of the objective world. 

But faculty in institutions of higher education should not just be good teachers.  We should also be scholars, which political scientist Keith E. Whittington (2018) defines as those who “produce and disseminate knowledge in accordance with professional disciplinary standards” (p. 148).  As scholar-teachers, faculty should not be circumscribed by administrators and told “what to teach or how to teach it” (p. 142), as long as we can demonstrate professional research that legitimizes our practices.  This is the foundation of free speech in the academy (p. 7), which enables universities to be marketplaces of ideas.  As Whittington (2018) argues, “The faculty members and staff of a university have an obligation to socialize and train students to engage in civil but passionate debate about important, controversial, and sometimes offensive subjects, and to be able to critically examine arguments and ideas that they find attractive as well as those they find repulsive.  Colleges and universities will have failed in their educational mission if they produce graduates who are incapable of facing up to and judiciously engaging difficult ideas” (p. 93).  As an instructor of writing, critical thinking, and argumentation, I wholeheartedly agree with Whittington (2018), and I take my mission very seriously because I know the social and political ramifications if I do not.

Few faculty in the Writing Program follow my example of good teaching, let alone my commitment to scholarship, disciplinary standards, and academic integrity.  But I understand why.  It’s easier to teach to the textbook, have low standards, and inflate grades, especially when you are working two or three jobs.  My colleagues also fear the vicious circle, which is understandable: Students will complain, which leads to lower student evaluations, which leads to punitive measures, especially reduced employment.  Plus, there is a psychological cost to being a committed scholar-teacher.  When a faculty member holds high standards, both for themselves and for students, these professional standards can cause a lot of frustration and demoralization.  Professor of education Doris A. Santoro (2018) has documented K-12 teachers’ “high level of dissatisfaction” with their jobs due to widespread “demoralization,” which she defines as the “inability to enact the values that motivate and sustain their work” as teachers (pp. 3, 43).  This same kind of demoralization happens in higher education.  I have suffered demoralization for years at UTSA. 

Many teachers, like myself, passionately care about “the integrity of the profession,” but they “cannot do what they believe a good teacher should do” because there is “dissonance between educators’ moral centers and the conditions in which they teach” (Santoro, 2018, pp. 88, 43).  Santoro (2018) told the story of a teacher named Reggie who had to resign after 10 years because, as he put it, “You play ball or leave with your ethics” (p. 1).  I know some colleagues who have had to quit UTSA (and other institutions) because they were demoralized by dysfunctional policies and unprofessional practices.  For years, I have felt the pressure to lower my standards or to just quit, but instead I have worked harder to research the problems UTSA faces, I have continually tried to discuss these problems with colleagues, and now I have written this report.  I have always tried to stay true to my principles and best practices, and to stand up for what is right, trying to change UTSA for the better.

But I have paid a great cost over the years.  I have suffered a lot of stress and worry every semester, which has caused me health problems.  I have also suffered psychologically from being criticized by my peers and my Chair.  I have been passed over for promotions and raises, and so I earn a lower salary than most of my colleagues, even though I am the most productive and acclaimed scholar in the department.  Santoro (2018) has analyzed the “isolation” that many educators feel as “conscientious objectors” when they stand up “in the name of professional ethics” in order to demarcate the line between the “good work” of teaching and those actions that violate professional standards (pp. 8, 4).  Many teachers have a “craft conscience,” so when dictated rules or norms violate professional standards, these teachers feel that “they are degrading their profession” by being forced to acquiesce to what Santoro calls “moral violence” (pp. 91, 138).  I have acutely felt the “moral violence” perpetrated by the policies and practices of the Chair, and it has worn me down.

One issue that many teachers around the country complain about again and again is administrative pressure to pass students who have not met professional standards by demonstrating adequate learning.  I would argue that this is the biggest problem in the Writing Program, although this pressure has come from top administrators at UTSA.  The Chair has repeatedly criticized me for not falling in line with the rest of the department.  As Santoro (2018) documented, some teachers lament that they “damaged the integrity of my work when I passed that student” (p. 32).  Other teachers have explained how they are sometimes pressured by administrators with what Santoro (2018) calls “moral blackmail,” which entails shaming teachers so they will change student grades or else face admonition or official reprimand, which could include dismissal (pp. 136, 97).  I have been a victim of both moral violence and moral blackmail because the Chair has often criticized my teaching and me personally, mostly because students complain about my high academic standards, and because I fail too many students.  The Chair has also used “moral blackmail” by threatening my promotions and employment over this issue.  But I have stayed true to my principles and best practices, and I have suffered for it.

Over the past eight years, I have raised all of these issues in one form or another, and many of them I have raised repeatedly, year after year.  I have tried to appeal to the Chair of the department in private conversations and emails, to the main coordinating committee, and through emails to the whole department, but my ideas have been ignored, and I have often been criticized and ostracized, especially by the Chair.  I have also been stuck in a Lecturer II position for years, while almost all of my colleagues have been promoted to Lecturer III, some with lots of merit pay and awards, even though they all have less experience, less education, and fewer professional accomplishments than I do.  And worst of all, every semester I see more and more unprepared students who can’t read and write get passed through the Writing Program without the skills they need to be successful in life. 

 

Unprepared for Success:  UTSA Students Lack the Motivation, Student Skills, and Academic Skills to Succeed in College

For the past eight years, I have been formally assessing student motivation and academic proficiency.  I have also been studying the quality of the faculty and the curriculum of the Writing Program, as well as some of the institutional policies of UTSA.  I wanted both to understand the complex causes of student failure and to develop innovative ways of increasing student motivation and achievement.  It is important to note that most of the students I see in Writing II have already passed through Writing I at UTSA having learned almost nothing, because my colleagues inflate grades with low standards.  As a result, I am put in the nearly impossible position of trying to teach unprepared and largely unmotivated students both Writing I and Writing II material in a 16-week semester, a difficult situation that is hard on both students and me.

1. Attendance and Persistence, Week 1-12

When I first started teaching at UTSA, I had a mandatory attendance policy, but I did not specifically reward or penalize students for coming or not coming to class, other than granting students who missed no more than 2-3 classes the right to revise one of their major essays.  I quickly noticed that absence rates were very high, comparable to what I have seen in community colleges.  From 2010 to 2012, I began to document absenteeism.  I found that 22% of my students missed almost two weeks of class by the 12th week of the semester (about 14% of class time), and 17% of my students missed more than two weeks of class (about 19.5% of class time).  Almost all of these students would end up dropping or failing because they were not in class to learn, hear directions, or stay on top of assignments.
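
For readers who want to check these class-time figures, the conversion from absence counts to percent of class time missed is simple arithmetic, sketched below in Python.  The number of class meetings per week is my illustrative assumption (it is not stated in this report); a different meeting schedule changes the constants but not the method.

    # Convert a student's absence count into percent of class time missed.
    MEETINGS_PER_WEEK = 2  # illustrative assumption, not stated in the report
    WEEKS_ELAPSED = 12     # the 12-week checkpoint used above

    def percent_time_missed(absences: int) -> float:
        total_meetings = MEETINGS_PER_WEEK * WEEKS_ELAPSED
        return 100 * absences / total_meetings

    print(percent_time_missed(3))  # 12.5% of class time
    print(percent_time_missed(5))  # ~20.8% of class time

Under this assumption, “almost two weeks” of absences corresponds to roughly 12-17 percent of class time, in line with the 14 percent figure above, while five or more absences approaches the 19.5 percent figure.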

[Figure: absenteeism rates documented in my classes, 2010-2012]

By 2014, attendance was not only mandatory, but I also had to start grading it to keep more students coming to class.  Almost every day there were some points to be earned by being in class and participating.  Even with graded attendance, many students still missed a lot of classes, or never showed up at all.  In the fall of 2016, two students never came to class during the first four weeks of the semester.  After a month, I emailed these students and their advisors in order to tell the students to withdraw.  Two other students withdrew during the first two weeks: One of these students said there was a family emergency, and the other did not explain her reason for leaving.  These four students represented about 5% of the total students I had that semester.

During the first nine weeks of that same fall semester 2016, many students were coming to class late and/or not attending class regularly.  Several of these students effectively stopped coming to class, although only a few students officially “withdrew” from class.   I cannot teach or help students who do not demonstrate the basic student skills of coming to class prepared to learn.

 

Students with the most absences and late arrivals, Weeks 1-9, listed by class (absences/lates)

1)     6 Students:  5/0, 6/4, 4/6, 4/2, 2/3, 2/6 [2 of these students stopped attending]

2)     5 Students:  9/1, 3, 13/3, 9/2, 13/1 [5 of these students stopped attending; 1 was pregnant and got married]

3)     5 Students: 6/0, 8/0, 5/2, 9/0, 14/0 [3 students stopped attending; 1 with mental health issue]

4)     2 Students: 5/1, 8/0

5)     1 Student: 3/0 [1 student stopped attending: she was married, working 40 hours, taking 6 classes]

 

Students withdrawing from class during weeks 2-4 or week 8, by class, with week-8 point totals (out of 365) for the week-8 withdrawals

1)     1 (wk 2-4); 2 (wk 8):  [256 points out of 365], [262/365]

2)     4 (wk 2-4)

3)     2 (wk 8): [178/365], [214/365]

4)     1 (wk 8): [197/365]

5)     2 (wk 2-4)

 

Analysis of Attendance and Persistence

There were two periods where students formally withdrew from the class with a “W” grade.  The first period was between weeks 2-4 and the second period was around week 8.  During the first four weeks, seven students had dropped.  During this time and into week 9, around 20% to 30% of students in each class (except class #5) stopped coming to class regularly, were excessively late to class, or both. While attendance naturally fluctuates, some students had a pattern of coming to class late, or only attending 1 or 2 classes a week.  Many students would stop attending regularly before and after major assignments.

By week 9, twelve students had stopped coming to class regularly.  Of those twelve, five students formally withdrew from class by week 9, one got married and was on her honeymoon, and another student had a psychological breakdown due to personal issues.  The other five students did not communicate with me and, as of week 9, had not withdrawn.  One of these students was a serious, hard-working student who had come to my office about 5 times during the first month of the semester.  She asked for advice on her major, transferring to another university, and internships, and also for help on class assignments.  But this student was married, working full-time, and taking 6 classes.  We had talked about her workload, and I’m assuming that was the cause of her absences, but she never explained why she stopped coming to class (she later formally withdrew from class, and I learned that she felt the class was too hard). 

How can students learn and pass a class if they are not in class or actively engaged with their coursework?  The simple answer is they cannot.  Many UTSA students are unwilling or unable to attend all of their college classes, which is one of the significant causes of student failure. 

 

Why Do Students Miss Class?  Possible Causes:

There are many possible causes for excessive absences: not having prerequisite skills; succumbing to the added stress of having to learn two classes at once; poor student skills; not liking the responsibility of active learning; not liking my high standards.  But I think the main cause of absences and lateness was time management: Students have too many commitments, which is the focus of the next section of this report.  The majority of my students are working and/or taking 5-6 college classes.  However, there were some other unique circumstances as well:  One student had a mental health breakdown, which took her away from class for almost two weeks, and another student was pregnant, getting married, and going on a honeymoon during the semester.  Every semester I have at least one or two students who discuss serious mental health issues with me (representing about 2-3% of all my students), and many students who discuss conflicts between school and work or school and family.

2. Assessing “Soft” Student Skills, Week 1-4

For the first four weeks at UTSA, my students were graded on “soft” student skills and attendance, so I was assessing specific “soft” skills.  Having taught at the community college level for over a decade and having published many books and articles on community colleges, I have found that a lack of student skills is one of the most significant problems causing low rates of success for community college students in terms of passing classes, persisting, graduating with degrees, or transferring to universities. 

Many community colleges now require mandatory “student success” classes, which teach students the basic skills of being a successful college student, like how to read, how to learn, how to take tests, how to prioritize goals, how to manage money, and how to navigate the institution of college.  In these classes, students are graded mostly on participation.  When I have taught these classes, students could fail the major assignments but still pass the class if they came almost every day, participated in class, and completed almost every assignment.  And yet, many students still fail the class, mostly because of absenteeism and non-completion of assignments.

[Figure: success rates in my community college student success classes]

In the above chart, you will see that 20% to 40% of community college students failed my student success classes due to these reasons.  I would also like to explain why one class had almost 20% more successful students.  The class with a 79% success rate met at 7:30am, while the other class, with a 62% success rate, met at 3pm.  I have found that early morning classes have significantly different student characteristics than classes held during the day or evening.  Generally, I have found that students who take early morning classes are more motivated and prepared. 

In my UTSA classes, on top of teaching soft skills, there are also two graded academic assignments testing prerequisite knowledge from the previous course, which here at UTSA would be WRC 1013 (Freshman Composition I), though many students take this prerequisite class in high school.  To be successful on these academic assignments, students needed to follow the assignment directions in the syllabus (there were also models of sample work in the syllabus and on Blackboard to demonstrate the assignments).  If a student demonstrated all “soft” skills but failed the two academic assignments with F grades, they still would have had a C+ to B- grade for the class (see the sketch following the list below).  In order to earn a failing grade of D+ or lower, a student needed to fail both the “soft” skills assessments and the two academic assignments.  For the first month at UTSA, my assessment was focused on:        

            1)  Attendance

            2) Punctuality

            3) Effort/Motivation: Bringing textbooks/assignments to class

            4) Effort/Motivation: Taking lecture/discussion notes

            5) Effort/Motivation:  Following directions in syllabus and asking for help

            6) Effort/Learning:  2 academic assignments focused on prerequisite skills
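
To make this grading logic concrete, here is a minimal sketch of how these components could combine.  The point weights and letter-grade cutoffs are my illustrative assumptions, not the official syllabus values; they are chosen only so that the outcomes match the rules stated above (full soft-skill credit with failing academic work still yields roughly a C+ to B-, while failing both yields D+ or lower).

    # Hypothetical weighting of the first-month grade components (assumed).
    SOFT_WEIGHT = 0.60      # attendance, punctuality, effort items
    ACADEMIC_WEIGHT = 0.40  # the two prerequisite-skill assignments

    def month_grade(soft_pct: float, academic_pct: float) -> float:
        """Overall percentage for the first four weeks."""
        return SOFT_WEIGHT * soft_pct + ACADEMIC_WEIGHT * academic_pct

    def letter(pct: float) -> str:
        """Map a percentage onto a letter grade (assumed standard cutoffs)."""
        for cutoff, grade in [(90, "A"), (87, "B+"), (83, "B"), (80, "B-"),
                              (77, "C+"), (73, "C"), (70, "C-"),
                              (67, "D+"), (63, "D"), (60, "D-")]:
            if pct >= cutoff:
                return grade
        return "F"

    # All soft skills demonstrated, both academic assignments failed
    # (an F scored between 45% and 55%):
    print(letter(month_grade(100, 45)))  # 78.0 -> "C+"
    print(letter(month_grade(100, 55)))  # 82.0 -> "B-"
    # Soft skills and academic assignments both failed:
    print(letter(month_grade(55, 50)))   # 53.0 -> "F"

Under these assumed weights, a student who fails the academic assignments but does everything else lands in the C+ to B- range, exactly as described above, while failing both components produces a grade of D+ or lower.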

 

Percentage and raw number of students failing the class with D+ or lower (does not include the 4 withdrawals; see the enrollment check after this list)

      Class 1)  31.5% (6 students)

      Class 2)  45.5%  (10)

      Class 3)  48% (12)

      Class 4)  33% (6)

      Class 5)  5.5% (1)

 

Analysis of Student Skills: Around 30% to 45% of students could not demonstrate core “soft” student skills, which included simply coming to every class on time with required materials and taking notes during class.  This is comparable to what I see at community colleges, and in many ways, UTSA students are nearly identical to “non-traditional” community college students in terms of academic risk factors.  Without foundational student skills and prerequisite knowledge, there is no way a student can pass a college class, let alone earn a degree.

Possible Causes: Poor student skills instruction in high school; high school teachers not teaching soft skills; high school teachers teaching soft skills but not holding students accountable; first-generation college students not aware of some soft skills (especially learning skills); working 30-50 hours a week plus taking a full load of classes; some students taking 5-6 classes; family responsibilities; illness

3. Assessing Prerequisite Core Skills through Essay Writing

The first major essay assignment started in week three and culminated at the beginning of week five.  This essay assignment was designed specifically to target and assess prerequisite skills from previous writing classes that students needed in order to learn and master new WRC 1023 skills.  These prerequisite skills have traditionally been taught in middle school and high school, and then covered at a higher level in the Freshman Composition I course. 

I took three class periods to discuss these prerequisite skills, which include basic sentence structure, basic paragraph structure, basic essay structure (topic, thesis, supporting points, evidence, and transitions), citations, plagiarism, and the “borrowing” skills of summary, paraphrase, and quotation.  Outside of these foundational skills, I took another day and a half to test for reading comprehension skills, which entail finding the basic parts of writing (topic, thesis, supporting points, and evidence) in order to understand an author’s argument. 

During class, students were required to take notes and ask questions about any concepts or skills they did not understand (few asked questions).  This information was also assigned in two textbooks and some extra readings.  I also told students to ask questions in office hours if there was any “review” information that they did not understand (few came to office hours).  I repeated all of this core prerequisite information three times, once on each of these three days, and students were also supposed to read their textbooks: all concepts and skills were covered four times or more.

The first step of Essay 1 was to read one source, roughly four to five pages in length.  Students were to annotate this reading in order to find the topic, thesis, supporting points, and evidence.  Students then used this information to write a summary and analysis essay, which was to be 3-4 pages in length.  As mentioned, one and a half class periods were devoted to reading comprehension in order to assess students’ reading skills and help them understand the assigned text.  Students worked in groups to annotate the reading, making sure they could identify the parts mentioned above.  Before class, many students did not follow directions and/or simply did not do the reading/annotation assignment.  Of those who did, most found the topic, but few found the thesis.  Few students could distinguish supporting points from evidence/details, and most annotations consisted of random words and sentences underlined, most of which were not part of the central argument of the text.  Because no student could effectively read the text, I had to tell students its thesis and main ideas, writing the basic points on the board, and then requiring them to go back to the text after class to fully state and explain these main ideas in their own words.

The second step of Essay 1 entailed students working in groups to create a typed outline for their summary and analysis essay.  I took one day of class to evaluate and grade each group outline, giving students oral and written feedback.  Students then had 1.5 weeks to re-read the text, revise their outline, write a draft, revise their drafts, and hand in the final version of the essay.  I specifically DID NOT take class time to discuss drafts and editing because this was a diagnostic essay assignment designed to test prerequisite skills, so I needed to see what students could do with only some guided help and review.  Even after I told the students the thesis and main ideas (as described in the last paragraph) and had them work together in groups on the outline, most groups still had no comprehension of the thesis and main ideas of the text.  In office hours, I literally had to go paragraph by paragraph, reading the text out loud to some students in order to explain the thesis, main points, and essay organization.

This assignment was a diagnostic test to see if students knew A) how to effectively read and annotate the core parts of an essay, B) how to create an outline, C) how to write a college-level summary essay demonstrating proper paraphrase, summary, quotation, and citation, and D) how to edit drafts.  More specifically, I was testing for knowledge of these core concepts: topic, thesis, detailed evidence, transitions, summary, paraphrase, quotation, and citation, as well as writing a basic sentence and paragraph.  I was also testing to see if students could follow directions and complete work by deadlines (a re-test of the soft student skills mentioned above). 

Before entering my class, students should have had basic knowledge of ALL prerequisite skills from previous classes.  However, I was not giving a true diagnostic because I explained, multiple times, ALL of the information they needed to be successful on this assignment.  First, as I stated above, I presented students with all the core concepts four times (three times in class plus textbook readings).  Second, I provided students with models of sample student outlines and essays on Blackboard.  Third, I provided students with a detailed essay structure for the assignment in the syllabus.  Fourth, I gave them a detailed “peer-review” assignment, which reinforced many of the core concepts.  Finally, I told them the topic, thesis, supporting points, and major details of the text so they were not fully responsible for reading on their own.  If students had a basic understanding of core skills, they should have been able to demonstrate all of these prerequisite skills on Essay #1, especially with ALL of the copious help that I provided.

 

Percentage of students passing (C- or higher) or failing (D+ or lower) Essay 1

1)     47% Pass  /  53% Fail

2)     41% Pass  / 59% Fail

3)     40% Pass  /  60% Fail

4)     33% Pass  /  67% Fail

5)     71% Pass  /  29% Fail

 

The majority of students did not have prerequisite skills.  Where did students take their Writing I class?

1)     5 UTSA WRC 1013; 3 other university; 1 community college; 8 high school

2)     4 UTSA WRC 1013; 3 other university; 1 community college; 9 high school

3)     1 UTSA WRC 1013; 2 other university; 3 community college; 14 high school

4)     6 UTSA WRC 1013; 3 community college; 10 high school

5)     2 UTSA WRC 1013; 2 other university; 12 high school

 

In a follow-up survey the next year, I specifically asked students about their high school English classes.  Almost all claimed to have earned As and Bs in high school English in both their junior and senior years.  Also, 68% claimed they took AP English and 32% took dual-credit college-level English.  This leads me to believe that high schools are not teaching basic skills, not even in dual-credit classes, and that high school and dual-credit teachers are inflating grades.  I have taught dual-credit college classes in Austin-area high schools, and I have found that many students, and in some cases most students, do not have the basic literacy skills of reading or writing.  In one lower-SES Austin-area high school, I taught a sophomore-level class, which meant that students had already passed two prerequisite college-level writing classes, yet half of the class could not effectively read or write a sentence, let alone demonstrate higher-order reading comprehension or essay writing skills.

[Figure: Picture5.png]

Analysis of Prerequisite Skills:

The majority of students in four classes (53% to 67%) could not demonstrate the basic prerequisite skills needed for WRC 1023.  Most importantly, the majority of my students are functionally illiterate.  They cannot understand the core topic, thesis, and main ideas of a text: A) they have limited vocabularies, cannot understand all the words in a college-level text, and do not look up unknown words; B) they cannot see, let alone understand, the topic, thesis, or main points, so they “read” a text as just a jumbled list of details; they cannot see the main parts of an argument or how those parts are logically connected; C) they confuse details with points; D) they cannot link points and details to the person or group responsible for that information; they do not see the conversation/debate within the text, and even when this conversation is pointed out to them and explained, most quickly forget this concept.

Lack of prerequisite skills makes it very difficult, if not impossible, for many students to successfully pass my course.  Having to learn WRC 1013 concepts AND WRC 1023 concepts at the same time puts a lot of stress on students.  On top of this, circling back to the first issue of poor “soft” student skills, many students do not put much effort into coming to class, doing homework, following directions, understanding lectures/textbooks, asking questions, or fully completing assignments. 

Given these circumstances, there is not much that I can do as a teacher.  First, I have to lower standards and inflate grades; otherwise, I would have to fail most of my students for not being able to demonstrate basic prerequisite skills.  I reward students with points simply for coming to class on time, bringing their books, taking notes, and doing open-book reading quizzes.  Even then, around 5-30% of my students cannot meet these basic requirements.  Around 10-15% of my students (often more) can’t even make it to class consistently.

But I still hold students to a relatively high academic standard, and I force students to be responsible for information by using Socratic teaching methods.  Students simply want to be “told what to do” and then parrot the correct answer, which is not an effective way to learn.  Instead, I often answer student questions with a question, trying to get them to first identify the topic of their question and then generate their own answer by using knowledge from class lectures, notes, and textbooks.  Many students find the Socratic method frustrating, too hard, and, for some, rude.  Why?  Because I do not directly answer their questions, and I make them think for themselves.  Many students do not know how to think, and they don’t want to put much effort into learning thinking skills.  But what is college if it is not teaching students how to think for themselves so they can become self-directed learners?

Possible Causes: The majority of my students are coming from an AP class in high school or a community college.  It is clear that some students were simply not taught all the basic prerequisite skills.  I talked to one adult student who took Writing I at a community college, and she said she wrote four personal essays and one research paper; most of the information that we “reviewed” in class was new to her.  It is also likely that some students were taught core skills, but the teacher used ineffective methods, so the skills only made it into short-term memory and were quickly lost.  Some students do not want to invest much time and energy in the learning process: they simply want a teacher to tell them what to do.  Some students are overconfident in their skills, or just unaware of how college works, and expect to pass all assignments without much effort.

For those who took WRC 1013 here at UTSA, all should have been introduced to core skills (although one of the essays in the norming session this semester showed that not all UTSA instructors are teaching core skills, i.e., the student with flawed punctuation and limited higher-order skills who got straight As in WRC 1013).  But while most WRC 1013 teachers do cover core concepts and skills, some are most likely using ineffective teaching methods, so knowledge and skills gained in class are quickly lost and not retained for future courses.

 

4. Student Class Load and Employment

Competing responsibilities and poor time management are high risk factors for low student achievement and non-completion of degrees.  This is amply documented in the literature on higher education students.  In 2016, I conducted a survey asking students how much they worked and how many classes they were taking.  My working hypothesis was that many students could not manage competing responsibilities, and therefore they simply did not have the time to be successful in all of their college classes, especially a writing- and thinking-intensive course like WRC 1023.

 

Student Employment Hours: 20 hours or less / 21-35 / 36 hours or more

1)     10 Students Working:  7 / 3 / 0

2)     11 Students Working:  9 / 1 / 1

3)     8 Students Working:  7 / 1 / 0

4)     5 Students Working:  3 / 2 / 0

5)     5 Students Working:  5 / 0 / 0

Student Class Load: 4 classes / 5 classes / 6 classes

1)     3 / 11 / 0

2)     3 / 11 / 1

3)     9 / 8 / 2

4)     3 / 7 / 1

5)     0 / 14 / 1

Analysis of Competing Responsibilities

In some classes, over 50% of my students were working 20 hours or more a week, on top of taking 5-6 university classes.  In a follow-up survey in 2017, I found that 39% of my students were taking 5-6 classes.  Few students can be academically successful with such a burdensome workload.

[Figure: Picture6.png]

Possible Causes:

In conversations with several students, financial reasons motivated them not only to work in order to help pay for school, but also to take more classes so as to finish quickly, thereby paying less tuition.  I counseled many students on the virtues of working less and taking out loans so they would have more time and energy to devote to academic success.

5. Does Lack of Prerequisite Skills Predict Success in WRC 1023?

I did not have the time to collect or organize all the data needed to run a statistical analysis to answer the above question.  But from studying the scholarly literature on the subject and from my own experience, I suspect I know at least some of the main causes of absences, withdrawals, and failing grades in WRC 1023: lack of prerequisite skills and poor time management due to working and taking too many classes.  There are other causes, but I think these are the two most important.  In order to test the connection between prerequisite skills and success with new skills, I have put the grades of Essay #1 and Essay #2 next to each other in the table below:

 

            Does Essay #1 Grade Predict Essay #2 Grade and Withdrawal? 

            1)         9 passed Essay 1;  3 (33%) of these students failed Essay 2 or withdrew

                        10 failed Essay 1;  9 (90%) of these students failed Essay 2 or withdrew

 

            2)         9 passed Essay 1; 3 (33%) of these students failed Essay 2 or withdrew

                        13 failed Essay 1; 6 (46%) of these students failed Essay 2 or withdrew

 

            3)         10 passed Essay 1;  4 (40%) of these students failed Essay 2 or withdrew

                        15 failed Essay 1;  12 (80%) of these students failed Essay 2 or withdrew

 

            4)         6 passed Essay 1;  3 (50%) of these students failed Essay 2 or withdrew

                        12 failed Essay 1;  8 (67%) of these students failed Essay 2 or withdrew

 

            5)         12 passed Essay 1;  2 (17%) of these students failed Essay 2 or withdrew

                        5 failed Essay 1;  2 (40%) of these students failed Essay 2 or withdrew

 

Analysis

The data indicate a clear trend.  Students without prerequisite skills (as measured by the diagnostic Essay 1) were more than twice as likely to fail Essay 2 or withdraw from class as students with prerequisite skills.  Also, while prerequisite skills are necessary, they are not sufficient for success with higher-order skills, as some students who passed Essay 1 still failed the harder Essay 2.  Finally, the data also show the limited but measurable effect a competent teacher can have on unprepared students: around 10-60% of unprepared students who failed Essay 1 went on to pass Essay 2.  In terms of measuring teaching effectiveness, I think these data are the most significant measure.
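
As a quick check on the “more than twice as likely” claim, the five classes’ counts listed above can be pooled.  The per-class counts come straight from the lists; the pooling across classes is my own, a minimal sketch rather than anything computed in the original report:

    # Pooling the per-class counts listed above:
    # (passed Essay 1, of whom failed Essay 2/withdrew,
    #  failed Essay 1, of whom failed Essay 2/withdrew)
    classes = [
        (9, 3, 10, 9),
        (9, 3, 13, 6),
        (10, 4, 15, 12),
        (6, 3, 12, 8),
        (12, 2, 5, 2),
    ]

    passed_e1 = sum(c[0] for c in classes)          # 46 students passed Essay 1
    failed_after_pass = sum(c[1] for c in classes)  # 15 of them failed E2/withdrew
    failed_e1 = sum(c[2] for c in classes)          # 55 students failed Essay 1
    failed_after_fail = sum(c[3] for c in classes)  # 37 of them failed E2/withdrew

    print(round(failed_after_pass / passed_e1, 2))  # 0.33
    print(round(failed_after_fail / failed_e1, 2))  # 0.67

Pooled this way, roughly 33% of Essay 1 passers versus 67% of Essay 1 failers went on to fail Essay 2 or withdraw, consistent with the twofold difference described above.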

Possible Causes:

The probable cause is intuitive: students need prerequisite skills in order to learn and master higher-order skills.

[Figure: Picture7.png]
[Figure: Picture8.png]

6. Why Do Students Fail Open-Book Exams?

Most students could not recall basic information from textbook readings during class discussions.  After class lectures and discussions, which reiterated the textbook readings, many, if not most, students still could not recall or explain basic concepts.  Students also could not understand basic concepts after getting detailed feedback on Essay #1.  Core concepts were explicitly noted on graded student essays.  After getting these essays back, students were responsible for reviewing the information, fixing each part of their essay, and then coming to see me in office hours to discuss their mistakes and to see if they were able to fix them.  I offered most students extra credit for this process.  Many students came to my office with essays that had hardly been changed at all.  I would put my finger on a concept in my feedback, then put my finger on where that concept was demonstrated poorly in draft #1, and then put my finger on the new essay where that part was supposedly “fixed.”  In many cases, there was not even a single altered word, let alone substantial revision.  I also rarely got any response, let alone accurate answers, to questions about basic concepts, like “what is a thesis?” or “how do you quote?”  Even when offered extra credit, students were unable or unwilling to do the work required to learn core concepts. 

We went over the basic skills material multiple times in class and in assigned readings, students wrote Essay 1 as a diagnostic test of these skills, and many students started revising Essay 1 with additional feedback from me during office hours.  Later in the semester, I gave an open-book exam on these basic skills to see if students were gaining knowledge.  An open-book exam is really a high school tactic; I am not aware of professors giving such exams in higher education, especially at universities.  At this point in the class (week 9), all students should have been able to score 100% on an open-book exam, but many did not.  Why?

 

Open-Book Exam:  Students Failing Exam / Students Not Turning In Exam

1)     3 / 2  (29% of total class)

2)     1 / 3 (17% of total class)

3)     0 / 7 (32% of total class)

4)     1 / 5 (29% of total class)

5)     0 / 0

Analysis

A significant number of students in most classes could not pass a take-home exam, or could not complete the assignment and turn it in.  There is no way to effectively teach a student who lacks the cognitive skills to complete an open-book exam, or who lacks the basic motivation to do their homework.  The assignment was worth 24 points, or almost 2% of their total grade, so it was not a minor assignment.

Possible Causes:

I think there are two probable causes.  Some students are overcommitted with competing responsibilities, and these students do not have the time to complete assignments or come to class regularly.  Other students have low motivation, and these students do not care enough to complete assignments or work hard enough to master new skills.  But there is also a more basic potential cause: many students are illiterate or have underdeveloped literacy skills.  Last semester, Spring 2018, I conducted a basic reading and discussion activity in class.  Students read a two-page newspaper article, and I asked a lot of basic reading comprehension questions, but few students could demonstrate high-school-level literacy, let alone college-level literacy.  Most of the students in several classes couldn’t even find and accurately count the number of human researchers who were named in the article, a very simple task that a 7th or 8th grader could do. 

7. The Signal and the Noise: Are There Any Patterns?

Analyzing this data, I see a general pattern.  Around 30-48% of students were failing by week 4.  The first three classes had 5-6 students who were chronically late to and absent from class.  Between 53-67% of students failed the first essay.  And in four of five classes, 17-32% of students could not pass or complete a take-home exam.  The data indicate that around one-third of my students (approximately 30-40%) were overburdened with competing responsibilities, could not motivate themselves enough to attend class or learn, or could not put enough effort into the class to be successful.  Further, around two-thirds of the class (53-67%) did not have the basic prerequisite skills to be successful, so they had to learn both prerequisite skills and higher-order skills at the same time, which could have also contributed to demotivating some of these students.  Given these data, a competent teacher who is not unduly lowering standards and/or inflating grades would expect to pass only around 50% of these students.  The data showed that I had a measurable impact on the 10-60% of unprepared students who failed Essay 1 but went on to pass Essay 2.  These data are the most significant measure of my teaching effectiveness.

The clear outlier is my last class, at 12pm, which was far more successful than my other four classes.  Part of the reason was that this class was much smaller than the other classes (only 18 students registered by week 2).  Although my first class was also smaller, that class met at 8am, and many students were consistently late and absent due to the early time.  But the main reason my 12pm class was so successful was the random fluctuation of student enrollment.  Only 1 student (5.5%) was not fully participating and passing by week 4, and 71% of students passed Essay 1.  I was lucky with this class because students were more prepared with core prerequisite concepts and skills, and they were more motivated to work hard, come to class, and participate.  I did not include these data above, but 87% of this class passed Essay 2, and the two failing students both got D grades (one of them a D+, or 69%), so they were not far from passing marks.  Clearly this class demonstrates how teaching and learning effectiveness are related.  A competent teacher can only do so much with low-skilled and demotivated students, but a competent teacher can be more successful when students have prerequisite skills and are motivated to learn.

The 12pm class proves that my teaching style and curriculum are not the cause of low student persistence and achievement in my classes.  If I were an incompetent teacher, or if I had unreasonable expectations, then there would be high rates of failure distributed equally across all my classes, and one would expect little or no achievement gains among the unprepared students (between Essays 1 and 2).     

I work very hard each and every semester, and yet I am rarely very successful here at UTSA in terms of getting more students to persist in my classes and successfully pass.  Nor am I very successful in getting students to appreciate criticism and hard work as tools that promote academic and personal growth.  A lack of preparedness would not be an insurmountable problem if students were motivated and willing to work hard and learn.  But many are not.  I collected data from my students during Spring 2018, having them fill out a survey after each major essay assignment.  The most important question I asked was, “Did you complete the full assignment and follow all of the teacher’s directions?”  Between 45 and 61.5 percent of students said “No.”  Approximately 13.5 to 24 percent did not complete all the required reading for the project.  Around 33 to 54 percent admitted they did not spend enough time to complete the assignment and do well.  About 5 percent admitted that they didn’t even bother to read the directions for the assignment in the syllabus.  When students refuse to fully participate, it is impossible to get them to learn and grow. 

But I do have some successes, albeit limited, as the data above show.  Ironically, while students complain a lot on student evaluations and give me very low scores, according to the research I collected in Spring 2018, between 83 and 94 percent agreed that my criticism was helpful.  One student from that group wrote, “Professor Beach has made it his goal to help his students become better prepared for future college classes as well as our future careers.  Through his high standards for each student, Professor Beach has helped me learn from my failures, and grow strong as a writer and college student.  Although it has been a daily struggle, I have overall done well in the course and am aware of my weaknesses to work and develop them…Professor Beach has forced me to work hard and think more critically that I have ever had to before…If not for Professor Beach setting such high standards, I would not have known my own capability as a writer, student, and overall person” (Student #3). 

The data I have presented show a basic pattern, which illustrates the difficulty of teaching unprepared freshmen at UTSA and the limited success that even a highly trained and competent teacher can have.  Although I continue to innovate in my classes, I cannot be much more successful because of A) the academic quality of students admitted to this university, B) the competing outside responsibilities of these students, which take up their time and effort, and C) the lack of effective student support programs at this university.  When I taught down the street at Trinity University, around 90-100% of my students would persist through to the end of the semester and pass my courses, the majority with A and B grades.  Such rates of success are simply not possible at UTSA, but they could be if there were better management and professional development of faculty at UTSA, and more coordinated reforms at every level of this institution to make sure that students are prepared and fully supported to succeed.

Educational Malpractice in China

This essay is a case study of ONPS International Summer School, a private university-level program hosted at Jinan University in Guangzhou, China, where I worked in 2012.  It is an excerpt from my book Academic Capitalism in China: Higher Education or Fraud?, originally published in 2013.  In the book, I changed the name of ONPS to the “China X” program for legal purposes.  I gave my data to The Chronicle of Higher Education, which sent reporters to China to independently verify my account and conduct its own original research on the topic, published by Beth McMurtrie and Lara Farrar as “Chinese Summer Schools Sell Quick Credits,” Chronicle of Higher Education (Jan. 14, 2013).

 
This program is a business to make profit.
— Staff member at ONPS
 

Higher Education in China

In East Asia, state-sponsored education and a cultural emphasis on credentialed knowledge workers have both been venerated traditions for thousands of years.  In what is often called Chinese “Confucian” culture, education has been revered as a time-honored process of transmitting the collected wisdom of Chinese civilization – one of the oldest civilizations on Earth.[i]  Academic degrees have been the primary markers of social distinction and economic mobility for over two thousand years.  The hereditary locus of aristocratic power became blended with a meritocratic educated bureaucracy, which together created a “mixed aristocratic/bureaucratic ruling class.”[ii] 

For much of the past two millennia of human history, China was “the most literate and numerate society in the world.”[iii]  Educational institutions stressed rote memorization of the Chinese language, classical Chinese texts, ritualized socialization, writing, and the arts.[iv]  And while Confucian and neo-Confucian educational principles did stress individual development as “self-cultivation,” the emphasis of formal schooling, especially in later neo-Confucian institutions, focused more on situating the individual within the hierarchical “structure” of society.  Thus, much of a student’s instruction was geared toward a socialization process, whereby the individual student learned proper social values, such as formal social discourse, deference to superiors, and traditional rituals.[v] 

Instruction culminated in a high stakes final “examination” that served as the gateway to a social title and a position in the state bureaucracy.[vi]  This East Asian educational system produced a small population of literate and cultured elites, trained in a traditional and largely unchanging body of ethical and technical knowledge.  The literate elite served as the administrative center of the Chinese empire.  This elite “enjoyed unrivalled authority and numerous privileges”[vii] because they effectively ran the empire by implementing the demands of the emperors. This caste of educated elites was higher in status than all other social classes, including military leaders, merchants, and priests.

The Communist revolution of 1949 did not displace the standing of the educated elite in China, nor did it diminish the cultural importance of learning.  However, the revolution did temporarily replace the venerated texts of Confucius with those written by communist leaders, such as Marx and Mao.  In many ways, the communist revolution was co-opted by the previous imperial bureaucracy.  The state remained the paternalistic center of empire, but there was a political shift away from hereditary monarchs toward the somewhat more open structure of the communist party, which supplanted the monarchs as the ruling authority.[viii]

Chinese communism was a very “pragmatic” blend of imperial bureaucratic tradition, communist ideology, and market activity.[ix]  Chinese leaders began to move further away from communist ideology towards capitalist economic development in 1978, albeit a form of state directed capitalism, starting with a few “special economic zones,” which eventually served as a model for the rest of the country.[x]  Due to these economic reforms, the economic growth rate accelerated considerably, moving from 4-5 percent during Mao’s administration to a yearly rate of 9.5 percent from 1978 to 1992.[xi]

The economic turn toward capitalism ushered in a cultural transformation as well.  The Chinese people began to “worship wealth” and celebrate entrepreneurs, just like their counterparts in the capitalist western world.[xii]  As the political scientist Martin Jacques has explained, “Money-making, meanwhile, has replaced politics as the most valued and respected form of social activity, including within the [communist] Party itself.”[xiii] 

Communist Party leaders have set a new example for the rest of the nation.  They are highly educated, many with western university degrees, and they participate in market activities.  These leaders also often engage in corruption, exploiting state power to privately enrich themselves and their families.  Over 92 percent of central committee Party members have earned a college degree, many in technical subjects.  Most have used their political standing and connections to engage in entrepreneurial and investment activities, much of which would be considered corruption.  The former Prime Minister, Wen Jiabao, reportedly enabled his extended family to amass a fortune of over $2.7 billion.[xiv]  In 2011 alone, close to 143,000 Party officials were accused of illegal activity, which led to “the recovery of 8.4 billion Yuan ($1.35 billion) in assets.”[xv]

The traditional veneration of education and credentials has only intensified in the 21st century.  China produces more college graduates than any other country, around 4.5 million in 2007 alone, up from approximately 950,000 college graduates in 2000: more than a four-fold increase![xvi]  And the numbers keep going up.  Now, there are close to 8 million college graduates a year, counting both community colleges and universities.  A growing fraction of these college students attended and/or graduated from western universities.  By 2020, China anticipates having 195 million college graduates, compared to the United States, which expects to have only 120 million.[xvii]

The Chinese government has been investing around $250 billion a year in its educational system, encouraging more and more students to attend college and earn degrees.  Over the last decade, the number of colleges and universities in China has doubled, now numbering 2,409.[xviii]  

The demand for college credentials in China has increased exponentially, but the quality of Chinese institutions of higher education has been low and their management “dysfunctional.”[xix]  However, due to increased state investment and regulations, Chinese universities are becoming stronger.  Hu Jintao, the President of China, has admitted that “While people receive a good education, there are significant gaps compared with the advanced international level.”[xx]

Part of the problem with Chinese higher education is the lack of professors trained in research, leadership, and academic ethics.[xxi]  A generation ago, there were not many college graduates, especially researchers with postgraduate degrees.  With the exponential increase in Chinese colleges and universities, there have not been enough highly qualified college graduates to serve as professors.  And the pay is not great.  The average professor earns only the equivalent of $300 a month, which is less than many skilled laborers.  Many professors become entrepreneurs out of necessity, turning to the labor market for second jobs or to start a company.[xxii]

In 2010, no mainland Chinese universities were ranked in the top 30 internationally, but six mainland Chinese universities were ranked in the top 200, up from only five in 2004.  The United States, by contrast, has the most developed and highest-ranked universities in the world.  Seven of the top ten universities in the world are in the U.S.; the other three are in the U.K.  The allure of a degree from a top-ranked university has caused more and more Chinese students to study abroad in the U.S. and U.K.  During the 2003-04 school year, there were approximately 128,000 Chinese students studying in the U.S., and another 75,000 studying in the U.K.  These numbers have been steadily increasing over the past decade, albeit with some fluctuation during the Great Recession of 2008-10.[xxiii] 

Chinese students studying abroad make up 17 percent of the total number of international students globally.  In 2010, there were approximately 562,889 Chinese international students.  The top destinations were the U.S., Australia, Japan, the U.K., and Korea.  The U.S. is the most popular destination globally for international students, hosting approximately 19 percent of all such students.[xxiv]

But there is a dark side to the educational boom in China.  For one, there is widespread corruption and fraud by both students and professors.  Philip Altbach tentatively noted that “such corruption seems embedded in [Chinese] academe.”[xxv]  One recent study conducted by researchers from Beijing University found that Chinese students and professors had “little or no idea” about “academic ethics and misconduct.”  Approximately 40 percent of students admitted that current policies did not deter widespread cheating and fraud.[xxvi]  Unethical behavior in higher education mirrors widespread unethical behavior in the larger society, especially in politics and business, perhaps signaling a breakdown in traditional ethical principles due to the momentous social transformation from a socialist to a capitalist society.[xxvii] 

In order to quickly graduate and get low-skilled government jobs, many students don’t care about learning or the quality of their academic work.  University students plagiarize established information from published sources or simply fabricate research results.  Graduate students steal research from their colleagues, publishing the data before the authors can write up their reports.  Some hire ghostwriters to research and write graduate theses and dissertations.  A master’s thesis in English costs around 20,000 Yuan, cheaper if it is written in Chinese.  You can even pay some academic journals to have your work published.  Some ghostwriting businesses offer to both write your paper and get it published![xxviii]  One Chinese student explained, “No one likes writing papers.  It is meaningless and just a technicality before graduation. Most teachers are acquiescent."[xxix]  Some graduate students just buy their degrees from corrupt higher education officials or from fake schools, often referred to as diploma mills.[xxx]  Sometimes, students have to bribe university officials just to get accepted.  One student with adequate test scores was asked to pay a $12,000 bribe in order to be admitted to a university.[xxxi] 

Professors are also engaging in academic fraud, perhaps setting bad examples, which their students eagerly follow.  More than a few professors have lied about their qualifications, falsely claiming non-existent degrees, published papers, or books.  A couple of professors have falsely claimed to be the authors of research papers published in the west.  At least a few unscrupulous professors have simply copied previously published papers and re-submitted the work to another journal, falsely claiming original authorship of someone else’s paper.[xxxii]  At least one professor, Lu Jun, who was hired by Beijing University of Chemical Technology, admitted to completely falsifying his entire resume, lying not only about his degrees, but also about his work experience and published work.  He simply copied information from the resumes of western professors and claimed it all as his own.[xxxiii]    

Western universities have been experimenting with collaborative ventures, offering a western style university education taught by visiting professors and sanctioned by the prestige of western university standards.  Universities such as Yale, Columbia, and Arizona State University offer higher education programs in China, but students earn a western degree.  However, widespread academic fraud and corruption have strained these endeavors.  Students lie about academic credentials and research, and they routinely plagiarize and cheat.  One Yale professor explained, “When a student I am teaching steals words and ideas from an author without acknowledgment, I feel cheated…I ask myself, why should I teach people who knowingly deceive me?”[xxxiv]

Chinese academic fraud is also affecting international students and their host countries.  Western institutions of higher education want to attract international students for a number of reasons: these students enhance a school’s diversity, build brand recognition and loyalty in developing countries, and pay full tuition, often at higher rates than domestic students.[xxxv]  Such calculations can often devolve into a type of fraudulent academic capitalism, whereby western universities sell their brand, and the lure of a prestigious degree, to unprepared students who do not have the foundational knowledge or skills to successfully pass western university classes.

But not all international students are victims.  Many students lie, cheat, and buy their way into western universities.  Approximately 80 percent of Chinese international students hire an agent to prepare their application materials for western universities. These agents are paid up to $10,000 for their services.  Many of these agents not only fraudulently fill out the application, lying about educational credentials, skills, and references, but also write the students’ application essays, lying about the students’ experience and misrepresenting their foreign language proficiency.  One consultancy group researching such agencies estimates that most of the information on Chinese student applications is fraudulent: 90 percent of recommendation letters, 70 percent of application essays, and 50 percent of high school transcripts are fake.[xxxvi] 

An educational researcher from the U.S. warned, "The problem is massive.  There's no oversight in China, no control over who can set up an agency, over what the agency can and can't do…[These agencies] help in creating fraudulent documents."[xxxvii]  One Australian research group explained, “unscrupulous education agents on impossibly high commissions” are “funneling students with fraudulent documents into any course irrespective of the quality of the course or the student.”[xxxviii]  The Chinese government has finally recognized this problem and is starting to take steps to regulate these college application agencies.

But unscrupulous Chinese students and entrepreneurs are not the only people engaged in academic capitalism.  As already noted, American institutions of higher education are also exploiting students for brand expansion and economic gain.  And a new class of fraudulent for-profit colleges, often referred to as “diploma mills,” has sprung up in the U.S. to take advantage of gullible Chinese exchange students.  Some of these fraudulent organizations have been set up by former Chinese nationals who have used their knowledge of Chinese education to better exploit eager international students.  Dickinson State University admitted unqualified international students, 95 percent of whom came from China, and awarded them fraudulent degrees.[xxxix]  Herguan University and Tri-Valley University, both located in the San Francisco Bay Area, preyed upon Chinese exchange students and generated millions in illicit profits, until U.S. officials began to investigate these fraudulent organizations.[xl]

A newer type of academic capitalism has recently emerged in China, which is a hybrid form of Chinese entrepreneurialism and western higher education.[xli]  The Chinese government designates these ventures as “duli” or “independent institutions.”  Luxi Zhang and Bob Adamson, professors at The Hong Kong Institute of Education, explain, “The Ministry of Education stated that an independent institution should be run by entrepreneurs, following the principle of ‘seven independences’: independent campus and basic facilities, relatively independent teaching and administrative staffing, independent student enrolment, independent certification, independent finance budgeting, independent legal entity and independent civil responsibilities.”[xlii]

One variant of this new phenomenon is the international university summer school.  Chinese capitalists have created undergraduate “summer school” programs hosted at Chinese universities, but usually not officially connected to, or sponsored by, the university.  These programs target mostly international students who return home to China during the summer, although some also target western undergraduates looking to study abroad.  But unlike other forms of academic capitalism in China, these organizations hire western university professors and lecturers who teach western style classes.[xliii] 

These programs claim that students can take credits from these summer schools back to the U.S. and earn transfer credit from U.S. universities.  Most of these programs operate on the campus of various Chinese universities, and some actually use the name of host universities; however, these summer schools are actually just private businesses renting classrooms, ostensibly using the university location to provide a veneer of academic legitimacy.[xliv]

These programs are mostly run by young Chinese businessmen who have been educated in western universities.  Some of these entrepreneurs are still registered undergraduate students at U.S. universities, taking time off from school to develop their own business.  These young entrepreneurs secure funding from Chinese capitalists and run their summer school businesses like franchises, spinning off affiliated programs in new cities, most likely earning a percentage of profits for new programs.  As The Chronicle of Higher Education recently explained, “These entrepreneurs have taken an American product—the Western college course—and created a shorter, cheaper version to sell to their peers. In doing so, they have tapped into the seemingly insatiable demand for Western education by China's growing middle class.”[xlv]    

Besides the convenience of taking western university courses back home in China, these programs also offer western credit hours at substantially lower prices than exchange students would be paying at U.S. universities.  As one Chinese student explained, “If summer school provides me the credits and it's cheaper, why not choose that?"  According to another international student, these programs seem to attract two different types of students: “Those who want to finish college as soon as possible, they work very hard. Another group, they can't finish the courses in their own school, and they think summer school will be easy.” 

There is evidence to suggest that some of these schools engage in deceptive practices, similar in type to broader forms of fraud and unethical behavior documented in the larger Chinese marketplace.[xlvi]  Profit-hungry administrators at these for-profit schools do not seem to be screening applicants to differentiate serious students from others who just want to buy cheap credits.[xlvii]  Some of these schools, as I’ll explain in the next chapter, give students financial incentives to take as many classes as possible, which sets up most students to fail, or puts pressure on faculty to just pass all students.  One U.S. professor criticized these summer schools for undermining the integrity of western institutions of higher education: "Essentially what Summer China did was create a cheap, Chinese program.  I was providing an inexpensive product students could buy in lieu of better developed courses back home [in the U.S.]."[xlviii]

With so much scheming and fraud in Chinese higher education, by faculty, students, and businessmen, it seems worth asking: what is Chinese higher education for?  If these institutions were actually imparting real skills and knowledge to be usefully employed in Chinese society and the economy, then cracking down on academic fraud would be a pertinent policy issue.  But if higher education is simply a status marker of prestige, a mindless social ritual that serves as a gateway into the Chinese state bureaucracy, then why not just buy a credential, or steal it? 

For thousands of years in China, education has been reduced to a commodity, mere social capital, and it is prized not for its utility, but because of its exclusivity, like a luxury good.  As such, it should come as no surprise that educational credentials are bought and sold like any other commodity.  Further, like most other luxury goods in China, educational credentials are easy to fake.  Selling fake credentials is simply one more black-market activity, a mundane expansion of China’s seemingly limitless sea of counterfeit goods.

 

CHINA X: Educational Opportunity or Fraud?

CHINA X[xlix] International Summer School was created in 2011 on the campus of Qingdao University in Qingdao, Shandong province, on the central-east coast of China.  The purpose of the program was to invite the "world’s top professors" to China in order to teach western university classes in English.  The program is mostly geared toward Chinese foreign exchange students studying abroad in the United States.  While home visiting their families, these students could earn transfer credits toward an American bachelor’s degree.  The program claims to offer the equivalent of American college courses taught in English by American professors, at a much lower price. 

CHINA X is a private school.  It is not an official part of any university, nor is it run by local faculty.  Instead, it is organized and administered by local businessmen and hosted at prominent Chinese universities.  Some of these program administrators are college students in their mid-twenties.  CHINA X is a franchise business.  Each locale is independently organized and operated.  There seems to be no centralized coordination or oversight, although all campuses share a single website.  In 2012 the CHINA X program expanded into two more cities in China: Southwest Jiaotong University in Chengdu and Jinan University in Guangzhou.  In 2013 the program will reportedly spread to Beijing and Taiwan.

In its first year, around 200 students were accepted into the program.  By 2012, around 400 to 500 students were enrolled across the three campuses.  Most were Chinese exchange students who had already been accepted for undergraduate study at American universities.  Many of these students were freshmen or sophomores already studying abroad, some at prominent U.S. universities, like the University of Wisconsin, Syracuse University, and the University of California.  Most of these students had come home to China for summer vacation in order to see family.  Some of the students were recent graduates of local Chinese high schools.  Students who pass CHINA X courses earn credits that supposedly can transfer to "over 200 American colleges," although only 35 universities are listed on the website.  The program is not just academic.  It also offers non-credit classes on dance, rock-n-roll, and yachting, as well as on-campus dormitories and social activities, like dance parties and field trips.

The CHINA X program has a highly ambitious, and potentially contradictory, vision. The mission statement of the program is published on its website, both in English and Chinese.  It claims,

“Cooperating with more groundbreaking Chinese and American universities, CHINA X International Summer School is devoted to constructing the best summer program in Greater China, building an international high-end platform for the elite students community, well-known professors and Fortune 500 companies.”

There seem to be several different, and possibly conflicting, goals here.  One part of the CHINA X mission seems to be to foster international cooperation between Chinese and American universities through cultural exchange.  Another goal seems to be a competitive educational and/or business vision to create "the best" university summer school in China for "elite students" at the "lowest cost among peer programs across the world."  And finally, there is another goal, only half-articulated, which seems to be an aspiration to be a business school.  It is not clear if this program wants to attract funding or guest speakers from "Fortune 500 companies," or if CHINA X aspires to be a global corporation, like most Fortune 500 companies.

I want to look at each of these goals, one by one.  Did this program foster international cooperation between Chinese and American universities?  Did this program hire the "world's top professors" in order to create a superior university summer program for "elite" students?  Was this program seeking to become an international business school, or did it aspire to be a global corporation? 

I evaluated the claims made on the CHINA X website with three sources of data: my observation of this program in Guangzhou during the summer of 2012, discussions with other American faculty members who taught in this program, and interviews with support staff.  I concluded that this program does not foster much international cooperation, and what little cooperation did take place was marred by economic exploitation.  It does not recruit "top" professors, nor does it recruit "elite" students.  And the program does not offer superior university courses.  The program did focus on business and economics, but was not a coherent business school.  And finally, CHINA X was a for-profit enterprise that seemed to focus on maximizing profit, not maximizing education or student learning, and towards this end, the program may have committed academic fraud.

First, did this program foster international cooperation?  Yes, there were some forms of international cooperation; however, it was mostly between faculty and support staff.  One Chinese professor from the business school gave a speech on the first day, but was never seen again.  Another Chinese professor from the business school attended a few of the public speaking classes.  No other Chinese professors participated in the program.  Students attended classes, but rarely, if ever, talked to professors outside of class.  Most did not do much speaking in class.  Some students also attended field trips and social events, but there was rarely any mixing with professors, outside of occasional small talk.  The only real exchange was between faculty and the sixteen support staff, all of whom were local students at Jinan University, and many of whom were graduate students.  Faculty members were dependent upon these students for help, both with classes and with navigating the culture.  Supposedly there were "research" opportunities to collaborate with Chinese colleagues, but nothing was ever said of this opportunity once we arrived, and no American faculty had any contact with the local professors. 

While there was collaboration and exchange with the support staff, it was not collaboration between equals.  Sadly, these staff members were being economically exploited by the program, as are many workers in China.  There were two types of support staff: teaching assistants and living assistants.  Teaching assistants, like their counterparts in American universities, were mostly graduate students who attended classes, led recitation sections, and helped professors proctor exams and grade assignments.  Living assistants were both graduate and undergraduate students who helped professors interact with the local culture, which included help with shopping, dining, banking, sightseeing, and issues with living quarters. 

These students served an important role, but they were not compensated fairly for their work.  Over the five-week program, the TAs worked between 15 and 30 hours a week, while the living assistants worked between 5 and 10 hours a week.  Both groups were required to be on call day and night to help professors when needed.  And they were required to put in extra hours as service workers during program events and parties.  For all this effort, TAs earned 500 to 700 Yuan, the equivalent of roughly $80 to $122, for the five weeks.  Thus, for 75 to 150 hours of work over five weeks, TAs earned the equivalent of $0.53 to $1.63 an hour.  Worse, the living assistants were paid nothing at all. 
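
The hourly figures follow directly from the pay and hours just given; here is a minimal check, using the dollar conversions stated above:

    # Implied hourly wage for TAs: 500-700 Yuan (about $80-$122 by the
    # conversion used above) for 75-150 total hours over five weeks.
    low_pay_usd, high_pay_usd = 80, 122
    low_hours, high_hours = 75, 150   # 15-30 hours/week for 5 weeks

    worst_case = low_pay_usd / high_hours  # lowest pay spread over the most hours
    best_case = high_pay_usd / low_hours   # highest pay over the fewest hours

    print(f"${worst_case:.2f} to ${best_case:.2f} per hour")  # $0.53 to $1.63 per hour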

Almost all of the assistants I interviewed said they were not treated fairly by the program.  One TA said the working conditions were "terrible" and that "I did not feel like I was valued."  Another TA said "the payment is abnormal in the market," which meant that the CHINA X wages were low, even by the extremely low standards of the Chinese labor market.  But this student didn't complain.  She was the only respondent to consider her treatment fair because she was able to take free classes by American professors, which she valued more than a decent salary.  She said, "I don’t really care about the salary. I join the program because there are relevant courses that I want to learn. So, I tend to participate it even there’s no payment." 

Clearly, these students joined the program for non-monetary rewards, but the CHINA X administrators seemed to exploit these motivations.  All Chinese undergraduates need to complete an internship for school; thus, working for CHINA X fulfilled this requirement.  Some students also saw this as an opportunity to make connections with American faculty who might later help them study abroad in the U.S.  But rather than treating the support staff as volunteers and students, the program treated them as menial workers who were expected to be on call at all hours of the day.

While CHINA X didn't foster much by way of international cooperation, how about its second claim: Did it hire the "world's top professors" in order to create a superior university summer program for "elite" students?  On all three parts of this claim the answer is unequivocally negative.  CHINA X did not hire "top professors" by any standard way of measuring such a claim.  The program was inferior in every way to an American college course, although it was potentially much cheaper.  And the program certainly did not admit "elite" students. 

First, who were the professors?  The website claims that CHINA X has "the best line-up of professors in Asia."  It claims that professors come from highly acclaimed tier-1 research universities in America and England, like Harvard University, the University of California at Berkeley, and the University of Cambridge.  The program also claims that professors are focused on "improving the quality of learning and teaching," "curriculum design," and "pedagogical innovations."  Some of the visiting professors did in fact work at internationally recognized, top-tier American universities, like the University of California at Berkeley.  But the vast majority did not.  Most of the professors came from mid- to low-ranked American state universities, like the University of Texas at San Antonio or at Arlington, the University of Wisconsin at Platteville, or the University of Minnesota at Crookston.

Few, if any, of the professors were tenured full professors, and none were leaders in any academic field.  Only a small minority of the visiting professors had done any original research or published academic work.  Some of the "professors" were not even professors at all.  Around half of the faculty were adjunct lecturers, some only loosely affiliated with universities, as they taught primarily at community colleges in the United States.  Many of these adjunct faculty had only master’s degrees and little experience teaching at the university level.  In several instances, the program website lied about the credentials of these instructors, claiming they had earned PhDs (when they had not) and that they worked at more prestigious universities than they did. 

Few of the professors knew anything about curriculum or instruction, and there was little, if any, "curriculum design" or "pedagogical innovation."  Most professors simply lectured to students, assigned readings from the course textbook, and used high-stakes exams; a few assigned academic papers.  While the quality of "learning and teaching" in any university naturally varies from class to class, depending on both the professor and the students, at this summer program there was no evidence of exceptional teaching or innovative pedagogical techniques.  In fact, just the opposite: most offered very traditional classes.  Thus, the claims of "the best line-up of professors in Asia" and “pedagogical innovations” were clearly false.  And the claim that all professors came from prominent tier-1 American research universities was grossly overstated and misleading. 

What about the program?  Did CHINA X offer a superior university summer program?  A good university program would offer innovative and demanding classes that reinforce core learning goals, it would be coherently integrated and well organized, and it would provide adequate student support services to ensure quality learning.  CHINA X displayed none of these characteristics.  The classes were standard, lecture- and exam-oriented college classes taught by, at best, adequate instructors.  Most classes did not demand much time and effort from students, outside of preparing for exams.  There were no core learning goals or outcomes for the program.  The classes were not integrated in any way.  The program was poorly organized.  Decision making was reactive rather than proactive, with many modifications made on the fly as problems arose. 

And there were almost no student support services: computers in classrooms were slow and infected with viruses, there was only one printer in the faculty lounge, the library did not have access to English language academic databases, there were few English language books, and there was no writing and learning facility to tutor students.  Several faculty noted the absence of a writing and learning lab because most students struggled with their reading and writing skills.  A hastily organized "writing center" opened halfway through the five-week program.  It was staffed by one novice English instructor for a couple of hours a day, and it could not accommodate even a fraction of the students who needed such services.

But the program was relatively cheap in comparison with non-resident tuition at American universities.  Including fees and free books, one CHINA X class cost $2,450 (15,680 Yuan), which is at the low end of typical out-of-state tuition at an American public university, where a three-credit class costs around $1,500 to $7,000, depending on whether the university is a lower-tier or a tier-one institution.  Essentially, students were paying for a lower-tier American university education, and that is exactly what they were getting, with the exception, of course, of the condensed five-week structure.  Such short classes severely constricted the amount of instruction and assessment students received; thus, students were sold a false bill of goods and left classes with little “higher education,” in terms of either knowledge or skills.

CHINA X also gave students financial incentives to take as many classes as possible.  If a student registered for two or more classes, each additional class cost only $400 to $500.  And this price included free books, albeit pirated photocopies of American textbooks.  Many students registered for three or four classes (at least a couple registered for five!).  There was no way these students did any more than memorize short-term information to pass standardized, high-stakes exams.  Several students eventually had to drop classes (and lose their money) because there was no way for them to succeed with such an unrealistic course load.  
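To see how steep that discount was, consider a rough cost sketch (the $450 figure is simply the midpoint of the stated $400 to $500 range, my assumption):

\[
\$2{,}450 + 3 \times \$450 = \$3{,}800 \ \text{for four classes, or about } \$950 \ \text{per class}
\]

At that marginal price, overloading on courses looks almost rational, even for students with no realistic chance of passing them all.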

And finally, what about the students?  Did CHINA X admit "elite" students?  The answer is mixed: yes and no.  Any exchange student who enrolls in a foreign-language university to earn a degree should be considered an "elite" student, given the difficulty of mastering a second language on top of the knowledge requirements of a university degree program.  However, there have been many studies about sub-standard educational practices in Chinese schools and the struggles of foreign-exchange students in American universities.  These reports raise doubts about how prepared these students are for western university degree programs.  Further, there have been recent investigative reports about Chinese students hiring agents to apply to western universities.  These agents not only fill out the college application, but have also been known to lie about students’ qualifications and to write the application essay for the student.  While some of the students enrolled in CHINA X were absolutely "elite" students, many were not. 

Most CHINA X students were not fluent in spoken or written English, and they struggled to pass intense five-week college courses.  Under ideal circumstances, with a low course load, trained teachers, and adequate student support services, most of these students could have developed their English skills and mastered course material.  But CHINA X did not provide ideal circumstances.  Most professors had no knowledge of pedagogy, they used class time only to lecture, and few met with students outside of class to help them learn.  Some professors used class time to go on “field trips,” which were no more than tours of local sites that had, at best, a moderate connection to the course curriculum.  As already noted, there were almost no student support services.  And students had low expectations of easy and cheap college courses, so many enrolled in three, four, even five courses at once.  Under such circumstances, there was little student learning. 

Students struggled to meet the workload requirements and usually studied only before exams.  Many had difficulty understanding spoken English, so they sat quietly in class, taking fractured notes, staring at the walls, or playing on their computers.  Many students also had difficulty reading in English, which limited their ability to understand their textbooks, especially in the reading-heavy courses of literature and philosophy.  Many students also routinely plagiarized ideas and wording from their textbooks. 

The American professors seemed to have low expectations.  Most seemed to treat their stay in China as a vacation rather than a serious academic endeavor.  Some dealt with poor student performance by grading on curves, setting the academic bar fairly low.  Most professors passed every student, even though few of these students possessed the English skills to pass a real university-level course in the United States.  The few professors who pushed students to learn, and who eventually failed some students, were pressured to lower their standards, change grades, and pass all students.  There was even some evidence that grades were tampered with.  Two of the CHINA X support staff said that administrators may have changed professors' final grades so that all students in the program would pass their classes. 

Looking past the false rhetoric of the mission statement, the CHINA X program seemed to have only one goal: it wanted to attract a lot of students to take many classes so that program administrators could make a large profit.  CHINA X did focus on business and economics, around 40% of the total classes offered, but it did not create a coherent business school model.  Instead, the program offered a range of core freshman and sophomore classes across many disciplines, which was meant to attract a wide variety of students.  It also offered a price plan that was meant to encourage students to take multiple classes.  All of the support staff that I interviewed agreed that the primary goal of CHINA X was to make a profit.

CHINA X is a for-profit enterprise that is clearly focused on maximizing profit, not education.  The enterprise not only forfeited educational values in the pursuit of profit, but broke the law as well.  Most of the professors were surprised when they were told to enter the country on a tourist visa rather than a work visa.  The program administrators explained that it was just easier that way, as there was a lot of red tape involved in hiring foreign workers.  While plausible, it turns out that most Chinese educational institutions do in fact apply for work visas for foreign staff, and they are not all that hard to get approved.  Professors found it even more shocking to be paid in cash.  They were given large stacks of American dollars in incremental stages.  This method of payment gave the whole operation a gangster-like feel.

CHINA X had a clear, for-profit mission, which was at odds with the mission statement published on its website.  When asked, the support staff agreed: this was a business, not a school.  One staff member stated, "It was clear that the directors didn't care much about the quality of education."  Another explained why: "This program is a business to make profit." 

Towards its profit-driven end, CHINA X exploits support staff, students, and visiting professors.  Most participants were manipulated with false or misleading information.  Students were sold a false bill of goods.  They did not receive a top-notch American university education from highly regarded American professors.  They were provided no support services to help them learn.  And they were encouraged to take more classes than they could successfully pass. They were also not told that many American universities would not accept CHINA X courses for transfer credit.

Perhaps more worrisome, the CHINA X program seems to have engaged in deliberate academic fraud by altering the final grades of professors so that all students could pass classes.  Students may have also been complicit in the fraud if they were promised easy credits with the guarantee of passing. 

The academic community in the U.S. and in the rest of the world needs to be aware of profit-driven programs, such as CHINA X, so as to guard against a breach in the academic integrity of the western university system.  Programs such as CHINA X seem to be selling college credits, rather than offering quality higher education.  Such programs also tarnish the integrity of visiting faculty and foreign exchange students who travel abroad. 

  

Conclusion: What Is the Value of Higher Education in China?

For thousands of years, the value of higher education in China has not been intrinsic.  The value and utility of a college credential have rested upon one distinguishing characteristic: it is an unobtainable good that most cannot afford.  It has been used solely as a status symbol, a credential signaling exclusivity.  As historian of education David F. Labaree argued, schooling is often reduced to a commodity: it is “a kind of ‘cultural currency’ that can be exchanged for social position and worldly success.”[l]  Schools offer, according to Thomas Frank, the “golden ticket” to success; thus, universities offer the “capital-C Credential.”[li]  In such a cultural environment, real learning is not important.  Instead, “surrogate learning” is all that’s needed.  As Michael W. Sedlak explained, “As long as the tests are passed, credits are accumulated, and credentials are awarded, what occurs in most classrooms is allowed to pass for education.”[lii] 

And often, as philosopher Matthew B. Crawford points out, where such social rituals displace real learning, an educational credential “serves only to obscure a more real stupidification.”[liii]  Rather than make a person smart by imparting real knowledge and skills, schools often make people stupid, incapacitating them through mindless ritual and deference to authority.  Higher education in China is more like virtual education than the acquisition of higher-order skills and knowledge through real learning.  For centuries, higher education in China has been a social marker of legitimation, a mere gatekeeping function: it has served the imperial bureaucracy by certifying an administrative class of deferential servants.  It has the same basic function today.

But the enduring problem of all luxury goods, especially in vibrant, unregulated marketplaces like China, is the ability of entrepreneurs to cheaply replicate fakes, flooding the marketplace with worthless replicas and deflating the value of luxury goods through a crisis of identity.  China has long been known for its industrious ability to produce cheap knock-offs of designer goods.  There is evidence to suggest that deceptive practices, including the selling of fraudulent merchandise, are perfectly acceptable in the Chinese business world.[liv]  The Economist sardonically notes, “You could almost say that counterfeits remain Silk Street’s trademark, despite the market’s efforts to stamp them out.”[lv] 

The marketplace for educational credentials has been no different.[lvi]  If all have access to a luxury good, then it can’t be a luxury anymore.  If more and more people have the capital-C Credential of higher education, then how can elites visibly signal superiority?  The fake good eventually becomes exposed and devalued, and elites move on to the next luxury marker of higher social status, perhaps to goods that are not so easily knocked off, like cars, foreign travel, and real estate.

Higher education was traditionally reserved for an elite upper class.  It was meant to be exclusive and to serve as a social signal legitimating elite status because it was guarded by elite institutions and conferred only through elaborate social rituals.  But the democratization of western society in the 19th and 20th centuries corroded the exclusivity of traditional elite institutions, such as political governance, schooling, and the marketplace.  These democratizing currents were at first forced on eastern nations, such as Japan and China, due to the western world’s insatiable appetite for new markets in which to buy raw materials and sell manufactured goods.  But eventually, the public at large in southeast Asia and Japan began to demand more and more democratization, albeit blending western ideas and institutions with traditional eastern ways of life.  

In the early 20th century, the economist Joseph Schumpeter foresaw how democratization would produce a credential arms race and result in the devaluation of higher education.  He was writing at a time when only a small minority of people went to college, but policy makers were heatedly discussing the opening of higher education to larger swaths of the middle class.  Schumpeter warned that the supply of credentialed workers would outpace labor market demand.  Flooding the market with credentialed workers would devalue the signaling function of degrees, thereby reducing the social capital of all degree holders.  This devaluation of credentials would condemn the previously elite class of college graduates to a netherworld of over-education and “substandard work.”[lvii]

It would be instructive to step back and ask, why are Chinese students so focused on earning college degrees?  What will they do with this luxury of exclusive social capital?  In 2012 approximately 7 million students graduated with a college degree in China, but there were no jobs for many in this credentialed class.  Due to the constrained possibilities and fierce competition of the private market, around 1.4 million of these students applied for the government civil service exams, a massive increase from the previous decade, but there were only 20,800 positions to fill.[lviii]  Some turned to state-run corporations, and a lucky few found work abroad.  But many college graduates were forced into low-paid work in factories, the emerging service sector, or in small, local, mostly family-owned businesses.

And for the lucky college graduates who find a government job?  Do they get a life of privilege and ease?  The Economist paints a different portrait: “Mr Zhang, who is 27, is beginning his climb up the bureaucracy in the capital of a province, Shanxi, south-west of Beijing, which is reputed to be among the most corrupt and least competently governed. The jobs are hard to get, says Mr Zhang, but they are not the cushy sinecures that many assume. He works from 8am until midnight on most days, he says, compiling dry reports on topics like coal production and sales for higher-level officials. He commands a modest salary by urban standards—about 2,800 Yuan ($450) a month, in a city where a decent flat near his office rents for two-thirds that much. This way of life does not impress the ladies, he says; he has been on two blind dates in four years, both of them failures. This picture of dedication and loneliness stands in sharp contrast to the popular image.”[lix]

But isn’t a position like Mr Zhang’s just a starting point, an entry-level job with which one could work their way up the ladder to success?  Sadly, no.  As The Economist goes on to explain, “The chance of advancement is small indeed. Of China’s 6.9m civil servants, about 900,000 are, like Mr Zhang, at the lowest official rung of government above entry-level. Roughly 40,000 civil servants serve at the city or ‘bureau’ level. Many promotions are handed out on the basis of relationships, gifts and the outright sale of offices. Even when they compete for promotions on merit, some officials will pad their CVs with fake graduate degrees.”[lx]

And Mr Zhang is not alone.  A reporter for The New York Times interviewed a young community college graduate, Wang Zengsong.  Mr Wang is 25 years old.  He grew up in the country on a rice farm, but he managed to attend community college and earn a three-year associate's degree.  Yet in the more than three years since graduating, he has been mostly unemployed.  He has had only a couple of short-term, low-paying jobs, such as a security guard at a shopping mall and a waiter in a restaurant.  There are factory jobs, but Mr Wang won’t apply for those.  Why?  As the Times reporter explains, “He will not consider applying for a full-time factory job because Mr. Wang, as a college graduate, thinks that is beneath him. Instead, he searches every day for an office job, which would initially pay as little as a third of factory wages. ‘I have never and will never consider a factory job — what’s the point of sitting there hour after hour, doing repetitive work?’ he asked.  Millions of recent college graduates in China like Mr. Wang are asking the same question.”[lxi]

There is now widespread “over-education” in China because the labor market does not have enough high-skill positions for all the graduates leaving college each year.[lxii]  In 2012, Prime Minister Wen Jiabao noted that only 78 percent of graduates from the year before had found a job.  There is a persistent “structural mismatch,” as the deputy secretary general of China’s Education Ministry has acknowledged: too many college graduates and not enough good jobs.  The situation is no better for students with postgraduate degrees.[lxiii]  And not only are many college graduates unemployed, under-employed, and desperately looking for work, but those who do have jobs are seeing their wages erode as a flood of skilled laborers devalues the market.  This leaves many college graduates with a difficult choice: work in a factory or go back home to live with their parents.[lxiv]

The problem of credentialism and over-education is not only affecting China.  It is happening in the U.S. too.  It is a global problem.  One has to ask, what good is an education if there is no way to use such an education to live a better life?  If higher education has been reduced to a credential that signals elite status, then why not just buy one, legitimate or fake?  But what happens when the labor market is flooded with bought degrees that signal no real learning or skills?  What happens when technological development and the globalized economy creatively destroy old industries and create new ones? 

Those who see higher education as nothing more than a credential leave themselves exposed to the mercies of the global labor market.  There is a lot that can be said about the intrinsic value of knowledge, skills, and personal development.  But leaving all that aside and focusing simply on the labor market value of a college degree, which is what most people around the world seem to do, there is a frightful consequence of credentialism. 

If individuals do not actually purchase real knowledge and skills that can be creatively and purposefully used in the marketplace, then they offer employers nothing other than a piece of paper signaling exclusivity.  But if a growing minority, or even a majority, of people possess that same piece of paper, then its sole signaling purpose ceases to function and it becomes devalued, if not completely devoid of value.  At that point, the individual is left helpless as an unskilled laborer, and potentially much worse off, because the college student has spent tens of thousands of dollars, at least, to purchase a now worthless credential.  What would be the national and global consequence of such a dismal situation? 

We will most likely find out over the coming decades.


Endnotes

[i] Martin Jacques, When China Rules the World: The End of the Western World and the Birth of a New Global Order (New York, 2012), 248.

[ii] James B. Palais, “Confucianism and the Aristocratic/ Bureaucratic Balance in Korea,” Harvard Journal of Asiatic Studies 44, no. 2 (Dec 1984): 427-68.

[iii] Jacques, When China Rules the World, 15.

[iv] Fredrick W. Mote, Intellectual Foundations of China (New York: Knopf, 1971); Jacques, When China Rules the World, 96.

[v] Michael Charles Kalton, The Neo-Confucian World View and Value System of Yi Dynasty Korea (Diss., Harvard University, Sept 1977), 6, 7, 9, 82; Jacques, When China Rules the World, 96.

[vi] Jacques, When China Rules the World, 96; Philip G. Altbach, “The Giants Awake: Higher Education Systems in China and India,” Economic and Political Weekly, 44, No. 23 (Jun. 6 - 12, 2009), 39-51.

[vii] Ibid.

[viii] Ibid.

[ix] Ibid., 176.

[x] Ibid.

[xi] Ibid., 177.

[xii] Ibid., 179.

[xiii] Ibid., 282.

[xiv] “China’s Ruling Families: Riches Exposed,” The Economist (Nov 3, 2012), Retrieved from www.economist.com

[xv] “The Fight Against Corruption,” The Economist (Dec 8, 2012), Retrieved from www.economist.com

[xvi] Jacques, When China Rules the World, 217.

[xvii] Keith Bradsher, “Next Made-in-China Boom: College Graduates,” The New York Times (Jan 16, 2013), Retrieved from www.nytimes.com

[xviii] Ibid.

[xix] Altbach, “The Giants Awake: Higher Education Systems in China and India,” 42.

[xx] Bradsher, “Next Made-in-China Boom.”

[xxi] Altbach, “The Giants Awake: Higher Education Systems in China and India,” 46.

[xxii] Bradsher, “Next Made-in-China Boom.”

[xxiii] Jacques, When China Rules the World, 547-48.

[xxiv] “Global Flow of Tertiary-Level Students,” UNESCO Institute for Statistics, UNESCO.org (2012), Retrieved from http://www.uis.unesco.org/EDUCATION/Pages/international-student-flow-viz.aspx

[xxv] Altbach, “The Giants Awake: Higher Education Systems in China and India,” 47.

[xxvi] “Chinese Students Admit to Little or No Idea about Ethics,” The Times Higher Education Supplement (Aug 5, 2010), 11.

[xxvii] “China’s Ruling Families: Riches Exposed,” The Economist; “The Fight Against Corruption,” The Economist; Nick Lee, Amanda Beatson, Tony C. Garrett, Ian Lings and Xi Zhang, “A Study of the Attitudes towards Unethical Selling Amongst Chinese Salespeople,” Journal of Business Ethics, 88, Supplement 3 (2009), 497-515.

[xxviii] Yojana Sharma, “New Academic Misconduct Laws May Not Be Adequate to Curb Cheating,” University World News Global Edition, 234 (Aug 12, 2012), Retrieved from www.universityworldnews.com; Yojana Sharma, “Regulation on Academic Fraud Hopes to Reduce Plagiarism,” University World News Global Edition, 253 (Jan 6, 2013), Retrieved from www.universityworldnews.com; “Fake Papers Are Rife at Universities,” China Daily/Asia News Network (March 8, 2010), Retrieved from www.news.asiaone.com

[xxix] As cited in “Fake Papers Are Rife at Universities,” China Daily/Asia News Network (March 8, 2010), Retrieved from www.news.asiaone.com

[xxx] Yojana Sharma, “New Academic Misconduct Laws May Not Be Adequate to Curb Cheating,” University World News Global Edition, 234 (Aug 12, 2012), Retrieved from www.universityworldnews.com; Yojana Sharma, “Regulation on Academic Fraud Hopes to Reduce Plagiarism,” University World News Global Edition, 253 (Jan 6, 2013), Retrieved from www.universityworldnews.com; “Fake Papers Are Rife at Universities,” China Daily/Asia News Network (March 8, 2010), Retrieved from www.news.asiaone.com

[xxxi] Philip Altbach, “Stench of Rotten Fruit Fills Groves of Academe,” The Times Higher Education Supplement (Jan 21, 2005), 12.

[xxxii] Ibid.

[xxxiii] “University Sacks Prof Who Was 3 Times A Fake,” People's Daily Online (July 30, 2012), Retrieved from www.english.peopledaily.com.cn

[xxxiv] “Campus Collaboration: Foreign Universities Find Working in China Harder than They Expected,” The Economist (Jan 5, 2013), Retrieved from www.economist.com

[xxxv] Alexis Lai, “Chinese Flock to Elite U.S. Schools,” CNN (November 26, 2012), Retrieved from www.cnn.com

[xxxvi] Justin Bergman, “Forged Transcripts and Fake Essays: How Unscrupulous Agents Get Chinese Students into U.S. Schools,” Time (July 26, 2012), Retrieved from www.time.com

[xxxvii] As cited in Justin Bergman, “A U.S. Degree At Any Cost,” Time (Aug 20, 2012), Retrieved from www.time.com

[xxxviii] As cited in Yojana Sharma, “Ministry Mulls Powers to Ban Student Recruitment Agents,” University World News Global Edition, 246 (November 1, 2012), Retrieved from www.universityworldnews.com

[xxxix] Ibid.

[xl] Lisa M. Krieger and Molly Vorwerck, “Sunnyvale University CEO Indicted on Visa Fraud Charges,” San Jose Mercury News (May 8, 2012), Retrieved from www.mercurynews.com

[xli] Luxi Zhang & Bob Adamson, “The New Independent Higher Education Institutions in China: Dilemmas and Challenges,” Higher Education Quarterly, 65, No. 3 (July 2011), 251–266.

[xlii] Ibid., 253.

[xliii] Beth McMurtrie and Lara Farrar, “Chinese Summer Schools Sell Quick Credits,” The Chronicle of Higher Education (January 14, 2013), Retrieved from www.chronicle.com.  I was a main source of information for this article.  Information from this source draws from both the published article and my own research in China.

[xliv] Ibid.

[xlv] Ibid.

[xlvi] Nick Lee, Amanda Beatson, Tony C. Garrett, Ian Lings and Xi Zhang, “A Study of the Attitudes towards Unethical Selling Amongst Chinese Salespeople,” Journal of Business Ethics, 88, Supplement 3 (2009), 497-515.

[xlvii] Ibid.

[xlviii] As cited in Ibid.

[xlix] China X is a pseudonym for a real organization that continues to operate an international summer school in southern China.  This chapter is based on information gleaned from the organization’s web site, organizational documents, first-hand observation of the program, and interviews with members of the organization.

[l] David F. Labaree, How to Succeed in School without Really Learning: The Credentials Race in American Education (New Haven, 1997), 43.

[li] Thomas Frank, “A Matter of Degrees,” Harpers (Aug 2012), 4.

[lii] As cited in Labaree, How to Succeed in School without Really Learning, 44.

[liii] Matthew B. Crawford, Shop Class as Soulcraft: An Inquiry in the Value of Work (New York, 2009), 144.

[liv] Nick Lee, Amanda Beatson, Tony C. Garrett, Ian Lings and Xi Zhang, “A Study of the Attitudes towards Unethical Selling Amongst Chinese Salespeople,” Journal of Business Ethics, 88, Supplement 3 (2009), 497-515.

[lv] “Fakes and Status in China,” The Economist (June 23, 2012), Retrieved from www.economist.com

[lvi] Frank, “A Matter of Degrees.”

[lvii] Joseph Schumpeter, Capitalism, Socialism and Democracy (New York, 1942), 152.

[lviii] “The Golden Rice-Bowl,” The Economist (Nov 24, 2012), Retrieved from www.economist.com

[lix] Ibid.

[lx] Ibid.

[lxi] Keith Bradsher, “Chinese Graduates Say No Thanks to Factory Jobs,” The New York Times (Jan 24, 2013), Retrieved from www.nytimes.com

[lxii] Dan Wang, Dian Liu, Chun Lai, “Expansion of Higher Education and the Employment Crisis: Policy Innovations in China,” On The Horizon, 20, no. 4 (2012), 336-344.

[lxiii] Yojana Sharma, “Concern Over Too Many Postgraduates as Fewer Find Jobs,” University World News Global Edition, 235 (Oct 28, 2012), Retrieved from www.universityworldnews.com

[lxiv] Bradsher, “Chinese Graduates Say No Thanks to Factory Jobs.”

Educational Malpractice in South Korea

This article is a case study of Chung Dahm Learning, a private academy in Seoul, South Korea, where I worked from 2009 to 2010.  This essay is an excerpt from my book Children Dying Inside, which was originally published in 2011.  In the book, I changed the name of Chung Dahm Learning to Korean English Preparatory Academy for legal purposes.

 
Children die inside (test kill children)
— Elementary student at Chung Dahm Learning
 

Private Education in South Korea

The post-war construction of public schooling was centered only on the elementary level.  Middle school through university was left to private institutions with private sources of funding, mostly tuition paid by parents.  Private schools constituted around 40 to 50 percent of all secondary schools in South Korea and over 65 percent of institutions of higher education.  In the two major cities, Seoul and Pusan, around 75 percent of all high schools were private academic high schools and 90 percent of university students attended a private school.  Michael J. Seth explained, "In general, the higher and more prestigious the level of schooling, the greater the share of enrollments in private institutions."[i]

Because of the frantic push for academic success, different forms of private schooling have dramatically increased over the last two decades in order to profit from “education fever.”  There are four types of private education in South Korea: private K-12 schools, private colleges and universities, private tutoring, and hagwons.  Private primary schools represent a small portion of schools overall, as most students enroll in state-funded institutions.  Private primary schools were actually illegal until 1962, when this ban was dropped because the state did not have the teachers or facilities to accommodate the flood of students enrolling in school.[ii]  Because public middle schools and high schools are non-compulsory and tuition-based, private schools occupy a large part of the 7th- to 12th-grade educational sector.  Private schools present themselves as a quality alternative to public schooling.  In the early 1990s, around 30 percent of middle school students and over 50 percent of high school students attended a private school.[iii]  Seoul National University is the only prestigious public university; the rest are private schools.  Thus, the vast majority of university students are enrolled in a private institution, around 90 percent overall.[iv]  Outside of formal schooling there is also a robust business of private tutoring, which is legally regulated but, due to its size and highly idiosyncratic nature, practically free of oversight and hard to generalize about.[v] 

Finally, the most popular form of private schooling is the hagwon.  A hagwon is a private, for-profit educational institution that delivers instruction seven days a week.  The legal hours of operation are 5am to 10pm, although many hagwons open after regular school hours (3-4pm) and stay open until late at night, some past 1am.[vi]  In 2008 there was a move to eliminate all restrictions on hours of operation so that hagwons could stay open all night, but the measure was defeated, and the existing restrictions were narrowly upheld by the Constitutional Court in 2009.[vii]  Hagwons enroll students from pre-school age through high school, and they come in a wide variety of forms.  Many of them focus on single subject areas, like math, English, piano, or golf.  There are even military-style boot camps run by retired soldiers, focusing on physical drills to test the endurance and pain threshold of students.[viii]  But some of the largest hagwons present themselves as comprehensive preparatory academies, like KEPA, the focus of this study.  These comprehensive academies offer a multi-leveled array of academic classes, including English, Chinese, TOEFL exam prep, literature, history, philosophy, and debate. 

The primary purpose of most private education is to prepare students for the College Scholastic Ability Test (CSAT) and the Test of English as a Foreign Language (TOEFL), which are the formal placement exams for college.  The entire country adjusts its schedule on CSAT day: the government orders businesses to modify the work day to clear the roads for students heading to the test; all nonessential workers, both government and private, are told to report late to work; construction work near schools is halted; motorists are told not to honk their horns; thousands of police are mobilized to handle traffic; the Korean stock market opens late and closes early; flights at all of the nation's airports are restricted; the U.S. military suspends aviation and live-fire training; and adults flock to churches to pray for their child's success.  The results of the CSAT are considered the "crowning life achievement" of a student.  Good scores place students in Korea's top universities, which is the primary factor in finding a good job after college.[ix]

In 1970 there were about 1,421 hagwons in South Korea, but most of these closed during the 1980s.  The autocratic President Chun Doo-hwan decreed that private education was illegal so as to promote an equal educational playing field, but this ban was later ruled unconstitutional.  Hagwons were legalized in a regulated market in 1991, and by 1996 private tutoring was also legal.[x]  In 1980, before the ban took place, about one-fifth of Korean students received some form of private education: 13 percent of elementary school students, 15 percent of middle school students, and 26 percent of high school students.  In 1997 over half of Korean students were being privately educated: 70 percent of elementary students and 50 percent of middle and high school students.  By 2003 Koreans were spending around $12.4 billion on private education, which was more than half the national budget for public schooling.[xi]  That year, about 72.6 percent of Korean students were privately educated, and parents were spending between 10 and 30 percent of family income on private schooling.[xii]  By 2008 there were around 70,213 hagwons, and Koreans spent almost 21 trillion won (around $17 billion) on private education.[xiii]  Because the state has never funded much of the educational system, parents bear most of the burden of educating their children in the private educational market.  As a result, South Korean families pay a larger share of education costs than families in most other countries, around 69 percent of the total price, making South Korea home to "possibly the world's costliest educational system."[xiv] 

The Korean hagwon sector in particular is one of the major factors driving up the costs of education.  Hagwons have begun to sell their services on the internet, thus expanding an already growing market.[xv]  By the 1990s it was one of the "fastest growing of South Korea's many booming industries."[xvi]  The sector has become so profitable that it is now attracting Western private equity firms.  The Carlyle Group invested around $20 million in Topia Academy, Inc., one of the largest hagwons in South Korea.[xvii]  KEPA has also attracted over $2 million in foreign private equity investment.[xviii]  Hagwons are also becoming a global phenomenon, following Korean immigrants abroad and attracting non-Korean students.  In 2009 there were 183 academic hagwons and 73 art and music hagwons in Orange County, California alone.[xix]  In 2007 KEPA spun off a new company, KEPA America, Inc., as an independent entity with its own CEO.  The mission statement of KEPA America, Inc. was to "extend the KEPA network's market to new territories like the US, Canada, Mexico, and South America."[xx]  

While many Koreans consider private education superior to K-12 public education, the private sector is not without its flaws.  For one, the ability to utilize private sector schooling is highly correlated with family income, which contributes to rising inequality through unequal access to quality education and unequal preparation for elite universities.  Private schools, tutoring, and hagwons serve only those who can pay, so they largely benefit the wealthy.[xxi]  Hagwons also take their profit motive too far.  Business practices routinely determine educational practices.  These institutions inflate grades, teach to standardized tests, and place more emphasis on marketing than teaching.[xxii]  It also seems that these institutions have been systematically overcharging parents for services, which prompted a rebuke by the President in 2009.  The Ministry of Education, Science and Technology reported that 67 percent of hagwons overcharged (74 percent among foreign language institutes), with more than 40 percent charging twice the standardized tuition level set by the government.  But enforcement is almost impossible, not least because of the lack of government officials.  In southern Seoul there are about 5,000 hagwons but only three civil servants monitoring the district. 

Hagwons also employ teachers who have limited knowledge of subject matter and no training or experience as educators.  The only qualification needed to teach English at a hagwon is a bachelor's degree from a Western university, no matter the subject.  Few instructors have any previous teaching experience and most know nothing of curriculum or student learning.  One critic sarcastically claimed, "Business owners with suspect educational credentials seem content to hire foreign staff with equally suspect educational credentials to pretend to teach (more like entertain) children in some kind of a babysitting service designed more to generate fast profit rather than quality education."[xxiv]  There have also been widespread complaints by foreign teachers that hagwons do not live up to the terms of employment contracts.[xxv]

The most serious flaw with private education, and with “education fever” more broadly in Korea, is the damage done to children.  Korean culture places enormous emphasis on exams and college placements, which creates a "pressure-cooker atmosphere."[xxvi]  Thus, most hagwons use a "teach-for-the-test" curriculum that focuses on the memorization of information, standardized multiple-choice tests, and test-taking techniques.  Diane Ravitch has insightfully critiqued such high-stakes testing, where the curriculum is reduced to "test-taking skills": students "master the art of filling in the bubbles on multiple-choice tests, but [cannot] express themselves, particularly when a question requires them to think about and explain what they had just read."[xxvii]  Linda Darling-Hammond has also noted the limitations of standardized testing: "Researchers consistently find that instruction focused on memorizing unconnected facts and drilling skills out of context produces inert rather than active knowledge that does not transfer to real-world activities or problem-solving situations.  Most of the material learned in this way is soon forgotten and cannot be retrieved or applied when it would be useful later."[xxviii] 

With such a curriculum students are "trained, not educated,"[xxix] and this training rewards students for endurance and trickery, not learning.  Korean students rarely understand the information being taught to them, they are not taught to critically analyze information, and they cannot apply information to other contexts.  Students simply become "expert memorizers" of "de-contextualized" facts that can only be used to take standardized tests.[xxx]  This teach-for-the-test curriculum "stifle[s] creativity, hinder[s] the development of analytical reasoning, ma[kes] schooling a process of rote memorization of meaningless facts, and drain[s] all the joy out of learning."[xxxi]  High-stakes exams also lead to widespread cheating, grade inflation, and outright bribery.[xxxii] 

But there is a much more serious problem for students.  Hagwons take up a lot of extra time for classes and homework, add additional pressure for academic performance, and induce more stress on already overburdened students.  Students already spend a lot of time studying for regular school exams, but the addition of hagwons and private tutors takes up much of the week, leaving most students with little to no free time.  Students routinely are in school, studying, or engaged in private education for up to 18 hours a day, seven days a week.  One student explained, "I have to get up at 7 in the morning.  I have to be at school by 8 and lessons finish at 4.  Then you go to a hagwon and when you arrive home, it's around 1 o'clock in the morning."[xxxiii]  The Korean Teachers and Education Workers' Union claims that high school students sleep on average 5.4 hours a day, although a recent academic study found that the average sleep time was slightly higher, around 6.5 hours a day.[xxxiv]  The Ministry of Health, Welfare and Family Affairs has issued warnings about students' irregular meals and lack of sleep.  About 40 percent of elementary and middle school students skip meals because they lack a break in their busy daily schedule.[xxxv]  There is a popular student proverb: "If you sleep for four hours a night, you'll get into the college of your choice; if you sleep for five hours, you fail." 

This pressure to perform leads to serious physical harm and psychological distress.  Parents and teachers routinely beat students who do not perform well academically.  A study published in 1996 found that "97 percent of all children reported being beaten by parents and/or teachers, many of them frequently."[xxxvi]  Many students turn to suicide as the only escape from this relentless pressure to perform.  Statistics are not routinely kept on this issue, but the limited data are frightening.  Around 50 high school students committed suicide after failing the college entrance exam in 1987.  An academic study published in 1990 revealed that "20 percent of all secondary students contemplated suicide and 5 percent attempted it."[xxxvii]  And the problem is only getting worse.  Two recent surveys found that between 43 and 48 percent of Korean students have contemplated suicide.  From 2000 to 2003 over 1,000 students between the ages of 10 and 19 committed suicide.  Families also suffer.  In 2005 a father was so distressed over his son's bad grades that, in shame, he torched himself, his wife, and their daughter outside his son's school.[xxxviii] 

 

The Business Model of KEPA: Organizational Structure and Mission

Korean English Preparatory Academy was founded in 1999 by a private English language tutor.  It began as a small private school with only a few instructors.  Now it is a publicly traded corporation in the “education industry,” and one of the most prestigious hagwons in South Korea.  KEPA has over 250 instructors and hundreds of staff on 65 campuses spread across Seoul and every major city in Korea.  Citing the success of Coca-Cola and McDonald's, KEPA has also initiated the “globalization of our business” to capture a share of the international ESL market.  Towards this end KEPA has initiated a joint venture with a group from Zhing-Hwa University in China.  KEPA has also spun off a separate corporate entity, KEPA America, Inc., which was designed to export the hagwon model to the American continent.  And KEPA created an English language immersion school in British Columbia, Canada.[xxxix] 

KEPA has an integrated ESL program broken down into multiple levels, beginning with a very basic introduction to the English language for pre-school age children, all the way to college-prep history, literature, writing, and debate classes for high school students.  Placement in every level is determined by a standardized test with incremental scores correlated to the different course levels, ranging from a score of 0-31 for the introductory level to a score of 110 or higher for the college prep classes.  Outside of the academic “fundamentals,” there is also a structured program designed solely to train students for TOSEL based standardized tests, including grammar, reading, multiple choice question types, essay writing, and interview questions.

KEPA uses a range of textbooks from Cambridge University Press, Pearson/Longman, Scholastic, Cengage, and a series of specially designed KEPA workbooks created by its in-house research and development center.  KEPA also has a national corporate website that centralizes teaching materials, on-line student homework, attendance, and grading.

In corporate advertising and outreach materials, KEPA presents itself as a college preparatory academy with professionally trained teachers and a 21st century curriculum.  Corporate advertising routinely pictures the same image: an ordered classroom setting with uniformed students actively engaged with energetic teachers wearing suits and ties.  Outreach materials are professionally and fashionably designed in full color on expensive paper.  These materials break down the curricular aims of the academy through trendy catch-phrases, like “critical capability” and “communicative capability.”  The “critical” component is broken down into “critical reading/listening” and “critical speaking/writing,” with each part further packaged into three broad student outcome “deliverables”: “English fluency,” “knowledge,” and “critical thinking.”  The “communicative” component focuses on the interactive process of classroom instruction, which includes class discussion, debates, group work, research, group presentations, peer evaluations, “skill” training, online instruction, and webzine postings.[xl]   

The CEO of KEPA has positioned his company in response to his perception of the global economy.  He points to three highly abstract macro-economic developments, "the globalization trend," the "information revolution," and an "economic crisis that arose in the last 50 years."  He states that these macro-economic changes have produced a "paradigm shift" in global and national markets, which in turn has created demand for a new set of skills.  Thus, the CEO created KEPA to capitalize on these developments, selling the "skills" students will need to compete in a globalized world and to protect themselves from "economic crisis."[xli] 

What are those new skills?  The CEO identified only two: "English expression" and "critical thinking."  To impart these two skills, the CEO created a new "methodology" that would focus on both skills from "the beginning" of language training, thus, creating a "blended learning system" that would "amplify learning efficiency."  The CEO vaguely explained, "The Critical Learning system is a new attempt to accomplish the learning objective through the merger and improvement of system and contents."  This learning system also blends classroom instruction with "on-line learning," which includes grammar exercises, writing, and a national blog to post projects and comment on classroom assignments.[xlii]        

While language acquisition is fairly straightforward, what is critical thinking?  The CEO defines this practice as "disregarding intuition and emotion" in order to use logic to solve problems via a "topical approach."  At a basic level, logic is the ability to understand main ideas "while avoiding comprehension of minor details" in order to "execute an oral or written summary."  At a higher level, logic is an "attempt" at "in depth comprehension" by analyzing "purpose" and "tone" and by "identifying logical fallacies."  The topical approach simply means learning language through the study of specific informational topics, such as endangered species, cloning, or cyber bullying.[xliii]

What is the "learning objective" for KEPA?  This is somewhat unclear.  The CEO has described the KEPA mission in very vague and abstract language: "Cultivating communicative capability by escaping from self-rationalization, which can be a blind spot of critical thinking, and reinforcing resolution through compromise."  Another corporate document uses equally abstract but more humanistic terms, "Our mission is to help people realize their potential and thereby discover new meaning in their lives."[xliv]  In the more concrete terms of classroom methodology, students learn vocabulary and grammar through reading or listening to a specific topic, while they practice speaking skills.  The culmination of each classroom activity is a "critical thinking project," which is a group project that is supposed to demonstrate "solving problems" through "discussion, evaluation, and the logical presentation of an organized conclusion."[xlv]

In a widely disseminated image used in teacher training meetings, KEPA explains its organizational mission in terms of a bowl of rice.  The rice is critical thinking, and just like rice, critical thinking is "necessary for survival."  The bowl is the "delivery system," which is a combination of internet technology and faculty. The role of teachers is to "deliver" the product of "critical thinking."  However, the rice is also presented in a different slide as a trio of knowledge goals: critical thinking, cognitive language, and relevant content.  This is the official trio of the Korean Association for Teachers of English (KATE).[xlvi]  This image reveals KEPA's basic content-centered pedagogical framework: the "banking concept of education."  Knowledge is an object that the teacher holds and "deposits" into the passive "receptacle" of a student.[xlvii]   

While KEPA markets itself as an educational institution, internal documents and the CEO's own language paint the organization as a profit-seeking business.  In an internal corporate magazine, KEPA Culture, the CEO of the company made it clear that the most important part of KEPA’s success was the “self-confidence and invincible attitude to maintain market leadership,” including the ability to diversify the company to reach multiple markets in the private education sector.  Corporate leadership does not discuss teaching or curriculum in educational terms.  Instead they refer to these parts of the business as “products” and “contents.”  The company is not focused on any academic or learning principles.  Instead administrative leadership discusses the corporate mission in terms of an “ESL lifestyle business.”  As such, this organization is focused on launching “new products” to generate revenue, creating “strategic marketing campaigns” in order to “create value,” becoming a “content leader” in its niche, and muscling out other ESL “competitors” to capture greater market share.[xlviii]  In internal documents, the CEO primarily refers to KEPA as a “publically listed company.”  He calls KEPA a “corporate organization comprised of business divisions, R&D centers, and performance-driven IT and management infrastructures.”  Different campuses are referred to as “franchises” and “subsidiary companies.”[xlix]  There is rarely any mention of teaching, learning, or curriculum, and never any attempt to characterize KEPA as an educational institution.  As far as corporate leadership and administrative staff are concerned, KEPA is a profit-seeking business. 

After sorting through corporate memos, teacher training presentations, administrative staff comments, and teacher comments, it is clear that KEPA sends mixed messages about the multiple and often highly abstract objectives of this organization.

Most administrators and teachers have no clear idea about what the company stands for or what to prioritize in the classroom.  It is clear that KEPA has lofty business and instructional goals, but the corporate vision does not fully connect with the more concrete methodology employed in the classroom.  Further, due to the vast confusion over organizational goals, most staff go with whatever corporate directive has been most recently issued, while adhering to the monolithically prescribed instructional routine for classroom management.  Thus, despite the lofty rhetorical goals KEPA espouses in outreach documents, corporate memos, and teacher training seminars, the real organizational emphasis of this company seems to fall on two interlocking objectives: making a profit while rigidly adhering to the KEPA "delivery system."[l]  The latter is a teacher-proof curriculum and classroom management structure known internally to teachers and staff as the "KEPA method."

 

Teaching without Teachers: Authority, Structure and Surveillance

Externally, KEPA advertises itself as a state of the art English language academy with professionally trained teachers, a 21st century curriculum, and engaged students.  Internally, corporate executives claim to have created a new ESL curriculum that is supposed to train students to become proficient in the English language (listening, speaking, reading, and writing), as well as in critical thinking and argumentative debate.  However, behind the corporate rhetoric lies a different, darker reality.  Only vaguely understood by most organizational actors, there is an institutionalized "hidden curriculum"[li] created by KEPA's CEO and organizational structure that undermines KEPA's corporate rhetoric and frustrates student learning.  

The CEO wants to make KEPA a “united” organization with a “central focus.”[lii]  As one middle manager explained, "It is crucial that we all 'row this big ship together for smooth sailing.'"[liii]  But to maintain order and discipline, the CEO admits that he has to use an “authoritarian” management style and be “strict on the staff and faculty.”[liv]  Why?  Because KEPA employs an inexperienced, untrained, and transient workforce. 

Most middle managers, administrative staff, and instructors leave the company within a year.  Some middle managers leave the company after only three to six months.  At my particular branch, four different people occupied the upper-middle manager position and three different people occupied the lower-middle manager position within twelve months.  Few entry-level administrative staff have any knowledge of English or education, and most cannot even speak English.  These low-paid, primarily young office workers rarely stay for more than six to nine months.  All of the English instructors are recruited from overseas on one-year contracts, and the majority stay for only a single year.  If an instructor persists for more than a year, they are automatically considered an "expert instructor."[lv]  Most of these instructors have only a bachelor's degree in fields other than English and no previous teaching experience or knowledge of student learning.  Some have extremely limited reading, speaking, and writing skills and are not fit to teach.  Most if not all instructors are employed at KEPA because they could not find employment in their home country.  Some come overseas primarily to "party" while waiting for a better opportunity back home.[lvi]  For the majority of instructors, working at KEPA is all about the money, and for the most part, KEPA pays a higher wage and offers better working conditions than many other Korean hagwons.  Given instructors' lack of skills, inexperience at teaching, and mercenary motives, combined with the traditional hierarchical culture of Korean corporations, the CEO's decision to maintain an "authoritarian" organization seems reasonable.  To deal with an unskilled and transient workforce, the organization is built on a foundation of authoritarian managers who enforce a rigid classroom management method.  Instructors and administrative staff are but the interchangeable and temporary "bowls" delivering the standardized product that makes KEPA a hefty profit.

The KEPA instructional method is a carefully guarded "confidential" trade secret that was created by the CEO and developed by the Research and Development staff.  Only top corporate managers, R&D staff, and Training Center instructors have full access to the rationales behind the KEPA method.  All middle management and instructors are given a brief, standardized version of the KEPA method, which is a "class structure" that must be rigidly followed.  Instructors are not told how or why the method works.  They are simply told to follow the method.  Every three-hour class has the same standard format and is planned down to the minute.  Instructors are told to follow the "class structure" without question and without modification.  The main task of middle management is surveillance.  Managers monitor instructors via CCTV to make sure every part of the "class structure" is accomplished according to a standardized "observation report," a checklist based on the "class structure" with three additional factors: enthusiasm, professionalism, and student management.  But the main rubric that all middle managers cling to and incessantly enforce is whether or not an instructor "follows KEPA methodology for class structure and instruction," which means whether the instructor does each prescribed activity on the checklist for the exact length of time allotted to each task.[lvii]

The KEPA method is the centerpiece of the organization.  The CEO claims that the KEPA method is a "new product" that has enabled KEPA to become a "content leader" in the ESL market.[lviii]  The KEPA method is not only a "new" and effective way to teach ESL for the 21st century,[lix] it is also "the most effective ESL methodology in East Asia."[lx]  On what does the CEO base his claims?  What knowledge or training does the CEO possess to invent such a revolutionary educational model?  The answer to both questions, sadly, is nothing.  The CEO earned a bachelor's degree in philosophy.  He mostly focused on G. W. F. Hegel, the German idealist who believed the world was infused with transcendental spirit, and the CEO is prone to sending Hegelian-inspired, abstract emails to faculty and staff.  After working for a number of years as a private tutor, the CEO was able to start his own hagwon business.  He seems to think of himself primarily as an entrepreneur, not as an educator, and he refers to KEPA primarily as a business.  After starting the company, he hired a number of program marketers and researchers who helped him invent and market a "new" ESL product.  But these program developers had only bachelor's degrees, mostly in fields other than English or Education.  According to one former R&D staff member, these people had almost no knowledge of the disciplines of English, English as a Second Language, or Education, yet they were designing the curriculum.[lxi] 

So, if the method was not created by knowledgeable ESL or educational experts, then what is the KEPA method based on?  I was able to get a copy of the "confidential" General Trainer's Manual through an informant.  This official document is the company Bible because it contains the complete curricular rationale and framework for the KEPA method.  This Manual was developed by the CEO, R&D, and program marketers, and it is only given to the elite, veteran KEPA instructors who are company certified to train incoming recruits. 

The 45-page Manual contains only 12 pages of conceptual framework.  Many pages present information that has been plagiarized, and only 6 pages contain 18 partially documented secondary sources.  Of these sources, 17 are cited either by an author's last name alone or in parenthetical notations with an author's last name and year of publication.  There is no bibliography, and only one title is presented.  The one fully sourced reference is improperly placed in the middle of a page between two summary paragraphs.  Despite some citations, there is no indication that these references are used with any professional or academic reasoning.  No author's academic credentials, discipline, or expertise is mentioned.  There is no discussion of research methodology, and there is no critical analysis of research findings.  All references are cited at the end of brief summaries (most are one sentence long), which present a list of generalized knowledge claims.  All of these generalizations are superficial and display no substantial understanding of the subject matter (such as student self-efficacy, student behavior, or student learning).  Some of the generalized claims are simply nonsense: "A cognitive phenomenon strongly supported by psychological research, has broad applicability within education."  There is obviously no knowledge of professional academic standards on plagiarism, summary, critical analysis, or referencing, let alone any expertise in the content of ESL education or student learning.  The few authors that are named are referred to generically as "professors," which seems to lend a general aura of credibility and authority to the claims being presented.  But these claims are presented randomly in a list with no overarching thesis, integration, or coherence.[lxii] 

The single most repeated and authoritative source cited in the Manual is the CEO himself.  In Asian corporate culture, leadership is revered.  The CEO of KEPA is treated like a demigod.  When he makes his yearly appearance at each branch, the entire staff lines the entrance to greet him.  Every corporate email or memo is treated as revealed truth.  But a close inspection of his unfounded and illogical claims in the Manual shows that he is no expert on education, ESL, or anything else.  The CEO claims that East Asian ESL speakers are very different from "other ESL regions" because only East Asians use English for "business and academic communication."  Thus, he claims there is a specific need for a "distinctive" East Asian ESL method for these purposes.  Furthermore, he claims to have invented "the most effective ESL methodology in East Asia."  On this foundation, the CEO defines several key concepts and makes several knowledge claims that form the foundation of the KEPA curriculum.  This information is presented as self-evident truth, and there is no attempt to reasonably explain any concept or claim, let alone to prove them true. 

The Manual violates every elementary principle of expository writing, logical analysis, and critical thinking.  Superficial and abstract knowledge claims are randomly strung together in lists with no thesis or coherence, and the whole document is grounded on a fallacious appeal to the authority of the CEO and the KEPA corporation.  In this regard, the section on critical thinking is highly ironic.  It tells the reader that "'Critical Thinking' is most essential to KEPA ESL Methodology."  It warns against "dogmatic thinking," which is defined as "accepting one perspective blindly," and just "reiterating" a single perspective as truth.  Yet in defining and explaining critical thinking, this document merely quotes the CEO from a marketing memo and then ends with a long quote from late 19th century proto-sociologist William Graham Sumner, who is referred to authoritatively in the present tense as an "American academic and professor at Yale."  The General Trainer's Manual is transparently an exercise in uncritical, dogmatic thinking.  It presents a highly selective, superficial, disorganized, and unfounded list of incoherent information as "the most effective ESL methodology in East Asia," and new recruits are sternly told to follow it to the letter.  

Perhaps the most telling aspect of the General Trainer's Manual is the fact that only 12 out of 45 pages are devoted to any type of content-based information on ESL, student learning, or instruction.  The rest of the Manual is a prescriptive checklist.  It follows the same rigid logic as the standard classroom KEPA method, and it is consistent with the overall authoritarian ethos of the organization.  The Manual tells the trainer exactly what to do every day of training, down to the minute.  There is a regimented "Training Check List" to follow for every component.  It even tells the trainer how many new recruits should fail the program (10 percent).  But of course, the final outcome of training is not controlled by the actual trainers.  All pass/fail decisions are made by the Director of the Training Center, who receives trainer recommendations and reviews all training sessions via CCTV.  This training structure is similar to the general management structure of KEPA, where a large group of middle managers watch CCTV to monitor and evaluate staff.  But when it comes to finally evaluating, disciplining, rewarding, or firing staff, only corporate managers have the real power to make decisions.   

Many instructors put up with this rigid structure because of the pay, and because of the "upward advancement opportunities" to "climb the corporate ladder."  Like other Korean corporate organizations, KEPA prizes loyalty above competence.  Almost all senior administrators and middle managers worked their way up from being an instructor, which is seen as the entry position from which to earn one's place within the "business culture" of KEPA.[lxiii]  Others stay on because it is an easy job with good hours (4-10pm), relatively good pay, and a four-day work week for most instructors.  These instructors have plenty of time and money to pursue a range of personal activities and to explore a fascinating foreign culture.  Some find working with ESL students very rewarding, and a few say they want to be teachers upon returning home.  Finally, some instructors buy into KEPA's corporate rhetoric and find this organization a worthwhile and satisfying experience.  As one instructor explained, "I picked up valuable skills...diversifying my experience at KEPA.  I was selling a product that I actually believed in... teaching."[lxiv]

 

"Children Dying Inside": Instructional Ritual and Student Resistance 

As noted above, KEPA deals with its unskilled and transient workforce through authoritarian managers who enforce a rigid classroom management routine called the "KEPA method."  Almost every three-hour class follows the same basic structure, and each activity is rigidly planned down to the minute.  This class structure is repeated for 9 weeks; in the 10th week a standardized achievement test is administered (speaking, reading, listening, and writing); and then weeks 11 through 13 return to the normal routine.  Every three-month term follows the exact same structure, and there are never any breaks between terms.

Except for the college prep courses, every class follows the same basic routine.  The first five minutes are attendance and homework review.  The homework is a combination of vocabulary exercises, filling in blanks, and writing a paragraph summary.  Grading homework consists of a quick glance at a workbook to make sure all blanks are filled in (there is no inspection for understanding or accuracy).  Students earn an A+ if all homework is completed and an F if nothing is done.  If at least some blanks are filled in, then they earn a B.  These are the only three grades an instructor is allowed to give.  Next is a "review" test on vocabulary.  Students are assigned 45 vocabulary words, 45 synonyms, and 10 phrase-length "chunks" to memorize each week.  The average score is 50 percent (10/20 questions), which earns a B grade.  A score of 10 percent (2/20 questions) earns a C- grade.  These grades are set by R&D.  Then there is a 10-minute whole-class "student counseling" discussion, in which instructors explain homework, "motivate" students by publicly recognizing high performers and scolding low performers, and, if there is time, conduct "student rapport" activities, such as language games like 20 questions, telephone, or riddles.  The next two hours are devoted to a brief skill lecture and then reading or listening exercises, leading up to a reading or listening comprehension quiz.  The final activity of each class is a group "critical thinking project" based on the day's content theme.  Students are given prompts and asked to prepare a group oral presentation, which they deliver in front of the class.  The class is supposed to evaluate each group, and a winning group is chosen by the instructor.[lxv]

On the surface, this basic structure seems to pack a range of language-based activities into a well-organized three-hour block.  Time is given to vocabulary, skill acquisition, skill practice, skill testing, writing, group work, and oral presentations.  And in fact, high-performing students are able to use this structure to practice and polish their English skills.  However, there is almost no time for individual feedback or correction, so there is very little opportunity for students to engage the material and learn new skills.  Furthermore, KEPA's curricular materials are inappropriately advanced for most students, who struggle to understand each lesson's conceptual topic and advanced vocabulary.  Elementary students in the basic reading and listening programs are taught about beneficial bacteria, hyperinflation, competing scientific theories of species extinction, or cryogenics.  In the more advanced classes, elementary and middle school students use American college textbooks with sophisticated essays, and they are introduced to logic, argumentation, fallacies, and expository writing.  Most students are completely overwhelmed, not only by the advanced conceptual topics, but also by the extremely advanced vocabulary.  The majority of students in every class routinely fail the reading or listening comprehension quiz.  The average score hovers around 50 percent or lower.  Students struggle to comprehend the material thrown at them each week, let alone develop their language skills.

The KEPA pedagogical structure itself is to blame.  Due to the rigid time and activity structure, there is no opportunity for instructors to explain each week's topic, nor is there any time for the class to engage in discussion.  The whole focus of the class is preparing students to take the standardized multiple-choice test during the second hour, which is meant to prepare them for the standardized final exam in week 10.  In fact, the whole KEPA curriculum is built around the College Scholastic Ability Test (CSAT), the Test of English as a Foreign Language (TOEFL), and a host of other standardized tests, which are the formal placement exams for academic high schools and colleges.  Despite KEPA's rhetoric about language acquisition, blended learning, and critical thinking, this hagwon is only concerned with one goal: preparing students to take standardized tests in the English language.  Thus, the primary instructional activity that KEPA management places at the center of the KEPA method is "test-taking skills."  In training sessions and in management comments, the primary instructional activity is to help students "refine fundamental test-taking skills" so that they can "obtain the best iBT score possible."  This is the central mission of KEPA.  Classroom activities focus not on discussion or on understanding written or oral texts; instead they focus on standardized test question types, strategic approaches to test taking, note taking, and summary writing.  This also explains the difficult nature of the textbooks, because the TOEFL and other standardized tests use "excerpts from college-level textbooks."  Thus, students read or listen to college-level texts, not because it is developmentally or educationally appropriate, but because it is necessary to acclimatize them to standardized test taking.[lxvi]

There is no room in the KEPA method to make weekly topics interesting, relevant, or even understandable to most students.  This alienates and frustrates even willing students.  But most classrooms are not filled with willing students, especially when they reach middle school.  There is an underlying reality behind Korean private schooling: it is culturally mandatory.  Because of the general "education mania" in Korea, parents enroll students in private education all week long.  Some students go to hagwons and private tutors seven days a week for up to six to eight hours a day.  After an informal class discussion on how students are overworked in Korea, one of my students approached me after class.  He informed me that he had to go to 13 hagwons a week, each assigning homework, on top of his regular school and its homework.  He said he had no choice.  His parents make him go.  Many students report that they are always going to hagwons or doing homework, that they have no free time, and that they sleep only four to six hours a night. 

Thus, many students in KEPA classrooms are completely unresponsive and do the very least just to get by, because they know schooling is simply a test of endurance, and they know how to work the KEPA system.  As long as students fill out their books, stay quiet during class, and do at least some homework, they will earn passing grades.  Many students will just stare at the walls during class.  The week 10 standardized achievement tests are also rigged to accommodate these unresponsive students.  The same tests are used over and over again, grading is curved, and students will advance to higher classes if their parents complain.  KEPA offers a highly ritualized environment that demands very little from students other than displaying the proper behavior.  There is a subtle truce between instructors and students.  Many students play the hagwon game to keep up the appearance of schooling; however, a close look into their blank eyes reveals a silent, enduring resistance.  Sometimes this resistance turns into open hostility.  As one student explained, there is "conflict between teachers and the students which leads to an uncomfortable learning environment."[lxvii]

I engaged students often about their educational circumstances in order to understand how they perceive schooling, hagwons, and the pressure to perform.  Of the students I surveyed (n = 59), 92 percent said they went to school 6 days a week, while on average spending just over 5 days a week in private education (either a hagwon or a tutor).  Nineteen students (32%) spent 7 days a week in private education.  Students averaged about 4.2 hours a day in private education, with 6 students (10%) spending an average of 7+ hours a day.  On average, students went to 4 different hagwons or private tutors, with 7 students going to 7 or more.  On average, students spent 4.2 hours a day on homework, although over 20 percent said they spent 7+ hours a day on homework.  When asked how much "free time" they had during the day, the average response was 2.5 hours, with 39 percent reporting only 1 hour.  On average, students got 7.6 hours of sleep a night, with 15 percent saying they got only 5-6 hours of sleep.

I also asked students to write about what they liked or disliked about the hagwon.  Most students repeated the same basic evaluations: too much homework, too many tests, too much stress to perform, and not enough break time to eat and go to the bathroom.  One student wrote, "They spend lots of time in doing KEPA homework, no time to do school homework, and no time to study other subjects."  Most disturbing were the repeated comments about how much "stress" all the class time, homework, and tests put on students.  One college prep student wrote, "The Korean school system puts too much pressure on students.  The stress that the students have to carry on their backs is very heavy that some students fall down, never reaching their goals.  Do we have to do it this seriously?  I absolutely DON'T think so."  Another college prep student wrote something similar: "Everyday I have to go academies... every day I have to finish homework...I get tired, stressed usually, when I am busy.  I am hated of doing this uninterested thing...Usually I feel negative of this busy life.  But I'm continuing this life because I'm being forced."  Two elementary students verbalized in quite shocking language how this stress makes them feel: "Children dying inside" and "Children die inside (test kill children)."  A couple of students said they "hate" KEPA and want to "destroy it."[lxviii]

 

Conclusions

Taking the ethical vantage point of Amartya Sen's "impartial spectator,"[lxix] I want to make a few observations about the South Korean pursuit of "education fever" and the social role of hagwons, like KEPA, in order to ask a basic question: Is the South Korean educational model just or fair?  Specifically, I want to use Sen's "capability approach" to look at the means and ends of "satisfactory human living" and the extent to which an individual not only "ends up doing," but also what that individual is "in fact able to do" and whether or not that individual is able to freely choose any particular course of action.[lxx]  As Sen explains, "A theory of justice - or more generally an adequate theory of normative social choice - has to be alive to both the fairness of the processes involved and to the equity and efficiency of the substantive opportunities that people can enjoy...Neither justice, nor political or moral evaluation, can be concerned only with the overall opportunities and advantages of individuals in a society."[lxxi]

The ends of South Korean education look very attractive.  Today, South Korea has one of the highest percentages of school-age population enrolled in both K-12 and higher education, around 99 percent enrollment in middle school, over 96 percent in high school, and close to 70 percent in some form of higher education.[lxxii]  South Korea has also been the site of a "miracle" socio-economic transformation from an underdeveloped, autocratic third-world backwater into a developed, free-market, high-skilled economy and democratizing society.  South Korea deserves credit for its highly educated population, soaring industrial productivity, and innovative technology, but at what cost and who pays the cost?

In 2008 Korean families spent almost 21 trillion won (around $17 billion) on private education.[lxxiii]  South Korean families pay a greater share of the cost of education than families in most other countries, around 69 percent of the total price, making South Korea "possibly the world's costliest educational system."[lxxiv]  And students are pushed from as early as kindergarten or the 1st grade not only to perform well in regular schooling, but also to go to private tutors and hagwons so that they can prepare for the high-stakes testing in middle school, high school, and the college entrance exam.  Most students study all day, seven days a week, and get less than eight hours of sleep a night.  These students are pushed to study and succeed on standardized tests, they are pushed to become fluent in English, and they are pushed to get into the most prestigious high schools and universities.  Students are slaves to their parents' ambitions, whether or not some students actually internalize "education fever."  Students are under so much pressure that a large percentage of them, somewhere between 20 and 48 percent, actively contemplate suicide each year, and a significant minority actually kill themselves because they cannot take the pressure to succeed or the burden of failure.

And what are South Korean children actually learning in this "pressure-cooker atmosphere"?[lxxv]  Public and private schools use a "teach-for-the-test" curriculum that focuses on the memorization of information, standardized multiple-choice tests, and test-taking techniques.  Korean students rarely understand the information being taught to them, they are not taught to critically analyze information, and they cannot apply information to other contexts.  Students simply become "expert memorizers" of "de-contextualized" facts that can only be used to take standardized tests.[lxxvi]  This teach-for-the-test curriculum "stifle[s] creativity, hinder[s] the development of analytical reasoning, ma[kes] schooling a process of rote memorization of meaningless facts, and drain[s] all the joy out of learning."[lxxvii]

And what are the ends of this education system?  Students are ultimately competing for a limited number of high-paying jobs with top corporations or government agencies.  But economic and social inequality has intensified over the last several decades, and there is “a growing disparity” between rich and poor, measured by consumption patterns, residential segregation, and access to quality education, especially quality higher education.[lxxviii]  Not only are the numbers of the impoverished and underemployed still a problem, there has also been increasing unemployment and growing job insecurity for white-collar workers.  Women still find it hard to compete in the labor market.  Over the past decade, Koreans have suffered setbacks from less protective labor laws, increased competition in the skilled labor market for fewer full-time jobs, and the introduction of neoliberal business models, like the increased use of a flexible, contingent, and low-paid labor force that can be easily hired and fired in reaction to business cycles.[lxxix]  Plus, the educationally driven culture of South Korea turns out many more college graduates than can be adequately employed in the economy.[lxxx]

But schooling in South Korea has traditionally been about social status and class, not employment in the labor market.[lxxxi]  Koreans have had a "faith in education," seeing it as the only avenue to social advancement, if not economic opportunity.[lxxxii]  A successful student not only raises his or her own status, but also brings social benefits to the entire family.  Thus, Denise Potrzeba Lett has argued that economic goals are not "the primary motivation" behind Koreans' pursuit of education.  Instead, Koreans' "pursuit of education was more than anything else a pursuit of status."[lxxxiii]  Lett calls the modern manifestation of the process the "yangbanization" of Korean society: "as South Korea's middle class has become more affluent, it has come to exhibit characteristics more typically associated with an upper rather than a middle class."[lxxxiv]  The pursuit of formal education, especially higher education, becomes the primary marker of class distinction, which helps position an individual within the highly regimented labor market.[lxxxv]

The ends of the South Korean education system seem perversely clear: a successful student spends 16 years of intense intellectual labor, earns a degree from a prestigious university, and gains entry to one of the top 50 corporations, only to raise a family and push his or her children onto the same path.  But only a minority of South Korean students actually fulfill this career trajectory.  In a society defined by social status and the attainment of success markers, what is the quality of life for the majority who fail to reach the cultural pinnacles of success?  And is educational, social, and economic success truly just if it is not freely chosen?

And even if one of the lucky few achieves all of these markers of success, what then?  Are they happy, fulfilled, content, complete? 

I am reminded of the words of the French philosopher Pascal: "The present is never our end.  Past and present are our means, only the future is our end.  And so we never actually live, though we hope to, and in constantly striving for happiness it is inevitable that we will never achieve it."[lxxxvi]  South Korean society is obsessed with status, and education seems to be the primary vehicle to attain this future end.  But if the process of obtaining a desired end causes only misery, then what happiness can come when the end is reached?  As John Dewey noted, most people see education as simply "the control of means for achieving ends."[lxxxvii]  However, Dewey explained that education is connected to the development of human beings, and as such, it is a process of discovery and should have "no end beyond itself."[lxxxviii]  If education is treated simply as a means to an end, then personal development and learning will not happen - education will be reduced to a perverse ritual that tortures the young to conform to competitive social pressures. 

Sadly, education in South Korea seems to be a demonstration of Dewey's point: "Education fever" is not about education at all.  Schooling is but the means for the relentless pursuit of social status and prestige.  Thus, the recently debated phenomenon of "tiger mothers" in the United States should give us pause to think about the means and ends of education.[lxxxix]  The education system in South Korea helps us understand how education can be used and abused in the pursuit of social goals, and how children can suffer from their parents' pursuit of an ideal end.  South Korea should not be seen as a global educational exemplar.  Instead, the South Korean model should serve as a warning to the world.  Beware the reduction of education to economic mobility and social status.  Beware the grip of "education fever."

Endnotes

[i] Seth, Education Fever, 82-83, 135.

[ii] Ibid., 88.

[iii] Sorensen, “Success and Education in South Korea,” 18.

[iv] Seth, Education Fever, 82-83, 135.

[v] Lee Soo-yeon, "Hagwon Close, but Late-Night Education Goes On," Joong Ang Daily (Aug 17 2009).

[vi] Bae Ji-sook, "Should Hagwon Run Round-the-Clock?" Korea Times (March 13 2008).

[vii] Kim Tae-jong, "Seoul City Council Cancels All-Night Hagwon Plan," Korea Times (March 18 2008); Park Yu-mi and Kim Mi-ju, "Despite Protests, Court Says Hagwon Ban Is Constitutional," Joong Ang Daily (Oct 31 2009).

[viii] John M. Glionna, “South Korean Kids Get a Taste of Boot Camp,” Los Angeles Times.Com (Aug 21 2009).

[ix] James Card, "Life and Death Exams in South Korea," Asia Times Online (Nov 30 2005); Seth, Education Fever, 1.

[x] Casey Lartigue, "You'll Never Guess What South Korea Frowns Upon," Washington Post (May 28 2000); Card, "Life and Death Exams in South Korea;" Seth, Education Fever, 185.

[xi] Koo, “The Changing Faces of Inequality in South Korea,” 12, 14.

[xii] Joseph E. Yi, "Academic Success at Any Cost?" KoreAm: The Korean American Experience (Oct 1 2009); Lartigue, "You'll Never Guess What South Korea Frowns Upon."

[xiii] Moon Gwang-lip, "Statistics Paint Korean Picture," Joong Ang Daily (Dec 15 2009); "Lee Seeks to Cut Educational Costs," Korea Herald (Aug 14 2009).

[xiv] Seth, Education Fever, 172, 187.

[xv] Choe Sang-hun, "Tech Company Helps South Korean Students Ace Entrance Tests," The New York Times (June 1 2009).

[xvi] Seth, Education Fever, 185-86.

[xvii] Hwang Young-jin, "Equity Fund Bets on Cram Schools," Korea Times (n.d.), KEPA papers.

[xviii] KEPA, "The KEPA America Mission," corporate email (Nov 13 2007), KEPA papers.

[xix] Yi, "Academic Success at Any Cost?"

[xx] KEPA, "The KEPA America Mission."

[xxi] Koo, “The Changing Faces of Inequality in South Korea,” 31.

[xxii] KEPA papers.

[xxiii] Kim Tae-jong, "Hagwon Easily Dodge Crackdown," Korea Times (Oct 26 2008); Kang Shin-who, "67 Percent of Private Cram Schools Overcharge Parents," Korea Times (April 14 2009).

[xxiv] "Unforeseen Dangers of Korea's Hagwon Culture," Asian Pacific Post (Jan 10 2006).

[xxv] Ibid.; Limb Jae-un, "English Teachers Complain about Certain Hagwon," Joong Ang Daily (Dec 8 2008).

[xxvi] Seth, Education Fever, 192.

[xxvii] Diane Ravitch, The Death and Life of the Great American School System: How Testing and Choice are Undermining Education (New York: Basic, 2010), 107-108, 159.

[xxviii] Darling-Hammond, The Flat World and Education, 70.

[xxix] Ibid., 109.

[xxx] Rose Senior, "Korean Students Silenced by Exams," The Guardian Weekly (Jan 15 2009); Card, "Life and Death Exams in South Korea."

[xxxi] Seth, Education Fever, 170.

[xxxii] Card, "Life and Death Exams in South Korea."

[xxxiii] Hyun-Sung Khang, "Education-Obsessed South Korea," Radio Nederland Wereldomroep (Aug 6 2001).

[xxxiv] Bae Ji-sook, "Should Hagwon Run Round-the-Clock?"; Soonjae Joo, Chol Shin, Jinkwan Kim, Hyeryeon Yi, Yongkyu Ahn, Minkyu Park, Jehyeong Kim, and Sangduck Lee, “Prevalence and Correlates of Excessive Daytime Sleepiness in High School Students in Korea,” Psychiatry and Clinical Neurosciences 59 (2005): 433-440.

[xxxv] Bae Ji-sook, "Should Hagwon Run Round-the-Clock?"

[xxxvi] Seth, Education Fever, 168.

[xxxvii] Seth, Education Fever, 166.

[xxxviii] Card, "Life and Death Exams in South Korea."

[xxxix] “Blog commentary by administrative staff in response to CEO interview,” KEPA papers; CEO, “The Road Not Taken,” Corporate email, KEPA papers; S. T., "My KEPA Story," (Dec 12 2007), KEPA papers.

[xl] Promotional handouts and advertising documents, KEPA papers.

[xli] CEO, "From Blended Learning to Critical Learning" (May 15 2009), KEPA Papers.

[xlii] Ibid.

[xliii] Ibid.

[xliv] KEPA, "Critical Learning," ESL Learning Center Business Division (July 2 2009), KEPA papers.

[xlv] CEO, "From Blended Learning to Critical Learning" (May 15 2009), KEPA Papers.

[xlvi] Ibid.; KEPA, "Critical Learning." See also Korean Association for Teachers of English, <www.kate.or.kr/>

[xlvii] Paulo Freire, Pedagogy of the Oppressed (New York: Continuum, 1993), ch 2.

[xlviii] “Interview with KEPA CEO,” KEPA CULTURE MAGAZINE (June 2006), KEPA papers; KEPA, "Critical Learning," ESL Learning Center Business Division (July 2 2009), KEPA papers.

[xlix] CEO, “The Road Not Taken,” Corporate email, KEPA papers.

[l] KEPA, "Critical Learning," ESL Learning Center Business Division (July 2 2009), KEPA papers.

[li] James E. Rosenbaum, Making Inequality: The Hidden Curriculum of High School Tracking (New York: Wiley, 1976); Henry Giroux and David Purpel, Eds., The Hidden Curriculum and Moral Education (Berkeley: McCutchan, 1983).

[lii] CEO, “The Road Not Taken,” Corporate email, KEPA papers.

[liii] Faculty Manager, "Email to Branch Staff," (Jan 7 2010), KEPA papers.

[liv] “Interview with KEPA CEO,” KEPA CULTURE MAGAZINE (June 2006), KEPA papers.

[lv] KEPA, "KEPA Branch(ISE) Head Instructor Guidelines and Expectations," (Aug 18 2009), KEPA papers.

[lvi] C. S., "My KEPA Story," (March 10 2008), KEPA papers.

[lvii] KEPA, "General Trainer's Manual" (Oct 12 2009); KEPA, "Reading and Writing: Track A Program Guide" (Aug 19 2009); KEPA, "CCTV Observation Report" (Feb 2009); KEPA, "KEPA Branch(ISE) Head Instructor Guidelines and Expectations," (Aug 18 2009).  All KEPA Papers.

[lviii] “Interview with KEPA CEO,” KEPA CULTURE MAGAZINE (June 2006), KEPA papers; KEPA, "Critical Learning," ESL Learning Center Business Division (July 2 2009), KEPA papers.

[lix] CEO, "From Blended Learning to Critical Learning" (May 15 2009), KEPA Papers.

[lx] KEPA, General Trainer's Manual (Oct 12 2009), KEPA Papers.

[lxi] CEO, “The Road Not Taken,” Corporate email, KEPA papers; Interview with Informant #1 (Nov 1 2009); Interview with Informant #2 (March 6 2010).

[lxii] KEPA, General Trainer's Manual (Oct 12 2009), KEPA Papers.

[lxiii] S. T., "My KEPA Story," (Dec 12 2007), KEPA papers; C. S., "My KEPA Story," (March 10 2008), KEPA papers; C.B., "Success Story," (Nov 14 2007), KEPA papers.

[lxiv] KEPA, KEPA Culture, (May 2009), KEPA papers.

[lxv] KEPA, "Reading and Writing: Track A Program Guide," (Aug 19 2009); KEPA, "Student Counseling Guidelines," (June 25 2009).  KEPA papers.

[lxvi] KEPA, "Global Track Overview: Standardized Tests, What Are They and Why Do Students Take Them?" (n.d.); KEPA, "Effective Questioning," Faculty Handout (n.d.); KEPA, "IBT Reading Question Types," Faculty Handout (n.d.); KEPA, "TOEFL iBT Reading," Faculty Handout (n.d.).  KEPA papers.

[lxvii] "Student Writing," KEPA papers.  On the antagonistic power struggle between students and teachers see Willard Waller, The Sociology of Teaching (New York: Russell & Russell, 1961).

[lxviii] "Student Writing," KEPA papers.

[lxix] Amartya Sen, The Idea of Justice (Cambridge, MA: Harvard University Press, 2009), 124.

[lxx] Ibid., 234-35, 238.

[lxxi] Ibid., 296-97.

[lxxii] UNESCO, South Korea; Hye-Jung Lee, “Higher Education in Korea,” Center for Teaching and Learning, Seoul National University (Feb 2009).

[lxxiii] Moon Gwang-lip, "Statistics Paint Korean Picture," Joong Ang Daily (Dec 15 2009); "Lee Seeks to Cut Educational Costs," Korea Herald (Aug 14 2009).

[lxxiv] Seth, Education Fever, 172, 187.

[lxxv] Seth, Education Fever, 192.

[lxxvi] Rose Senior, "Korean Students Silenced by Exams," The Guardian Weekly (Jan 15 2009); Card, "Life and Death Exams in South Korea."

[lxxvii] Seth, Education Fever, 170; Darling-Hammond, The Flat World and Education, 70.

[lxxviii] Hagen Koo, “The Changing Faces of Inequality in South Korea in the Age of Globalization,” Korean Studies 31 (2007): 1-18.

[lxxix] Koo, “The Changing Faces of Inequality in South Korea in the Age of Globalization”; Andrew Eungi Kim and Innwon Park, “Changing Trends of Work in South Korea: The Rapid Growth of Underemployment and Job Insecurity,” Asian Survey 46, no. 3 (May/June 2006): 437-56; Nelson, Measured Excess; Denise Potrzeba Lett, In Pursuit of Status: The Making of South Korea’s “New” Urban Middle Class (Cambridge: Harvard University Press, 1998).

[lxxx] United Nations Educational, Scientific and Cultural Organization (UNESCO), South Korea, revised version, World Data on Education, 6th ed. (Paris: UNESCO, Oct 2006), 30; Cho Jae-eun, "Too Many Grads Fight for Too Few Jobs," Joong Ang Daily (Oct 18 2010).

[lxxxi] Seth, Education Fever, 100.

[lxxxii] Seth, Education Fever, 102.

[lxxxiii] Lett, In Pursuit of Status, 159, 164; Cho Jae-eun, "Too Many Grads Fight for Too Few Jobs," Joong Ang Daily (Oct 18 2010).

[lxxxiv] Ibid., 212, 215.

[lxxxv] Ibid., 218-19.

[lxxxvi] Pascal, Pensées (Oxford: Oxford University Press, 2008), 21.  For a similar conclusion by a modern academic who studies the "science of happiness," see Daniel Gilbert, Stumbling on Happiness (New York: Vintage, 2007).

[lxxxvii] John Dewey, Democracy and Education (New York: Feather Trail Press, 2009), 28.

[lxxxviii] Ibid., 29.

[lxxxix] Amy Chua, Battle Hymn of the Tiger Mother (New York, 2011); Sandra Tsing Loh, "My Chinese American Problem - and Ours," The Atlantic (April 2011): 83-91.

My Mis-Education, part 1

originally written 2014

How does one begin to explain the first experience of learning?  Those first conscious moments when the individual human being begins not only to see the world, but to know the world and to give it meaning. 

When are these moments?  What is it that we really learn as children? 

For most of us our formative education comes as coaxing instruction from the immediate circles of our family, often a mother or father, teaching first words, how to dress, table manners, and the simple difference between wrong and right.  No doubt this socialization process can be benign, at times even pleasant. 

But often parental instruction is delivered as a half-articulate, hands-shaking rebuke rather than as a moment of teaching. 

I am reminded of a Toni Morrison novel where the child narrator exclaims, “Adults do not talk to us – they give us directions.  They issue orders without providing information...We do not, cannot, know the meaning of all their words…So we watch their faces, their hands, their feet, and listen for truth in timbre.” (1)

I think my first lesson learned was negation, the negative – thou shalt not!  For me the word “no” and its derivatives were perhaps the greatest early lesson, often accompanied by raised voices, stern looks, threats, and sometimes, physical violence. 

Nothing teaches a child what is right or wrong so effectively as the swat of a spoon, the slap of a hand, the strike of a belt, or the snap of a cord.  This type of education is pure Pavlov: do right and be praised, or do wrong and be punished.  A kid learns the “right” path soon enough just to avoid being hit. 

The almighty NO is a powerful incentive for learning to be sure. The knowledge gained a hard-earned prize. But ultimately, these negative lessons that we learn as children are for the benefit of society and parents, not for us as individuals. We are socialized, not for our own good, but for the good defined by the power of authority and tradition.

The threat of violence is perhaps the basis of all human morality and civilized law, as a cursory glance at most major world religions and judicial penal codes will demonstrate.  I grew up in a strict Evangelical Protestant home.  The most important lesson that my father taught to me was found in the immortal words of Solomon, the wise king of the ancient Israelites: “The fear of the Lord is the beginning of knowledge: but fools despise wisdom and instruction.” (2)

And as any good Christian child would know, the Biblical Jehovah was a sadistic and jealous god who gave life sentences of pain and suffering to Adam and Eve for disobedience, demanded death for disrespecting one’s father, gloried in bloodshed and war, and gave the ancient Israelites a blessed promised land only if they would first massacre every man, woman, child, and beast who happened to already live there. 

Then there are the later additions of the New Testament, largely influenced by the legalist mentality of St. Paul who told children to obey their parents, wives to obey their husbands, and slaves to obey their masters.  And finally, there is the bloody vision of St. John, who dreamed up a holocaust at the end of history when every non-Christian would be subjected to various horrors during the last days and then tortured in the fires of hell for eternity. 

If that is not enough to give a child nightmares, I’m not sure what would. 

I learned early on that God’s divine and eternal punishment was something to be feared at a visceral level.  I was terrified of God, always scared that I would cross some unknown line and risk an eternity of torture.  I would wake up some nights petrified. 

I knew the anguish of an indeterminate and capricious salvation by my Lord’s grace.  My fear of God instilled something akin to what Sigmund Freud once called das Über-Ich, often translated as the “Over-I” or “super-ego.”  I had a nagging voice of right and wrong mysteriously placed in my subconscious mind.

I was driven by the terror of divine retribution and social approbation.  We do “right” not because we want to.  We do “right” because we fear the consequences of doing “wrong.”

The flip-side to this subliminal injunction is a type of pleasure.  I think most children find it inherently pleasing to acquiesce to authority, groveling before those same dominating figures who dispense punishment. 

Most of us are taught to embrace the ingratiating self-effacement of bowing low to those with the power to crush us: parents, priests, police, school principals, and popular peers.  This strong social tendency is perhaps more noticeable in Asian cultures, where the bow and deferential forms of address still sanction a strict social hierarchy.

But every society has hierarchical structures of power and the accompanying relationships of respect and deference.  There is a socialized satisfaction that comes through the self-denial needed to appease the higher authority. 

The naturalness of authority and privilege is instilled in us at an early age.  We see this demonstrated in the differential power relations between parent and child, which are based on the traditional power dynamics of God and man, king and subject, ruler and ruled.  One does “right” to earn a pat on the head, a smile, or the praise of the powerful. 

This is the tyrant’s strength.  An Italian Marxist called it hegemony.  It is the soft power exercised through the willing cooperation of the lowly who want to please their master, subconsciously fearing to do otherwise. 

I believe this unspoken and often unnoticed power dynamic is a central part of human relationships.  It has been represented and explained in various theoretical concepts over the past century by Sigmund Freud, Antonio Gramsci, and Michel Foucault.  We do what is right because we know it is right and because it is policed by the powerful.  We cannot do otherwise. 

We dare not do otherwise. 

We consciously and subconsciously know the structures of authority that envelope and restrain us.  Thus, for thousands of years, the basis of human knowledge and right action was fear of the various lords who ruled over subject populations: God, King, tribal Chief, and Father.

My parents were good people, acting on what they thought was right.  They raised their children the best they could with the knowledge and experience they had.  They were God’s agents, acting on behalf of the distant liege they worshiped.  They were to be respected and they were to be feared in their capacity to dispense justice, punishment, and love. 

I wanted to please them, I really did. 

But there was also something perverse at the very core of my being.  I had an insatiable curiosity.  I also had a penchant for experimentation.  I was deeply interested in life and different forms of experience. 

There was a deep injunction in my subconscious that told me certain words, deeds, even thoughts were prohibited.  But I could not help fantasizing about their possibility, and sometimes indulging in the taboo.  I was often told that these thoughts and inclinations were the work of the devil, who often tried to tempt and snare the unwary.  But I had a hard time understanding how the devil could seem so connected to my innermost being. 

The devil seemed to be such a natural part of my body that I was never sure who was in control.  St. Paul’s injunction to hate the flesh and the distractions of the sinful world was constantly uttered in my household.  But the devilment of my inclinations was hard to deny, so the reprimands and stern warnings of God’s agents seemed tyrannical, and early on I developed a split personality. 

Because of this split personality, I learned to wear a mask.  I needed to present an external demeanor that would deliver what was expected of me – my public self.  This public self was also subconsciously tutored by an inner voice of right and wrong that was constantly reinforced by parents, pastors, and the words of God. 

As I grew older, I was instructed more fully in Protestant theology, grounded on a literal and thorough reading of the Bible.  My inner voice of right and wrong became melded with my conception of God’s righteous presence.  My parents and pastor fostered this association, as they would often justify their judgments by explicit reference to the Bible as the final and ultimate source of authority. 

I grew to respect and strive for the “right,” while fearing the consequences of the “wrong.”  Morality also came to seem quite natural, albeit not a perfect fit.  My public self was my ideal self, and I often strove to be righteous. 

I lived in a clear moral universe composed of black and white truths.  I was lovingly and firmly led by the Holy Book.  I was constantly watched by the knowing disciples of God.  And I felt always under the discerning gaze of my inherited Lord.

 

References  

(1) Toni Morrison, The Bluest Eye (1970; reprint, New York, 2007), 10, 15.

(2) Proverbs 1:7, King James Bible.

My Mis-Education, part 2

originally written 2014

I’ve always felt a sense of at least two selves.  Growing up, I was instructed to be the good boy.  I self-consciously conformed to what others expected me to be.  But there was also another self.  A deeper, not fully conscious, more comfortable, yet somewhat dangerous self – my inner self. 

I’ve always had a perverse inclination to seek out the unknown, and to experiment.  One of my earliest childhood memories, reinforced by the anecdotes of my parents, was of a child of three or four reaching for the door to the outside, opening it, and breaking free of the confines of home to explore the great unknown.  I did not get far, but I did get out. 

I’ve always had the need to reach, to get out.  But not all doors are so easily opened, and the social gatekeepers are always close behind.  Those moments of getting out, of exploring, were the most educative and exciting times of my youth – they were also taboo, and strictly prohibited.  

I had many friends growing up.  My close friends were not church going folk.  If they were, they were not outwardly pious or domineering about their faith.  I spent as much time as I could outside my own home.  Visiting the houses of my friends, I encountered the unknown, the explicitly taboo, the dangerous – the devil. 

I reveled in everything that I was forbidden at home.  I listened to popular music, songs about sex, drugs, and violence.  I read risky literature: Mad Magazine, fantasy novels, and science fiction.  I gaped at pornography.  I watched television, including the newly invented evils of MTV and HBO.  I hung out with girls, flirted, and played not so innocent games of physical exploration.  I talked about sex and other naughty things.  I used profanities and scatological humor.  I snuck into unlocked liquor cabinets, wondered at the strangeness of condoms, and choked on the bittersweet smoke of a stolen cigarette. 

Life seemed pregnant with possibility, yet always with the hint of danger.  My friends and I consciously stalked the perimeters of morality, taboo, and legality.  We tried to break our way into adulthood, which was prohibited to us, but we tried all the same.  I learned to sneak, to hide, to skulk.

But for most of my childhood, these moments of transgression were few.  I was usually stuck at home.  I could not often leave the house, and when I could, it was only with permission.  My parents were quite explicit that I always ask before leaving my immediate neighborhood.  There were always restrictions and curfews when I was allowed to roam abroad. 

Most of the time I was confined to my house or my immediate neighborhood.  Yet I still found ways to escape.  I would activate my imagination and lock myself in fantasy worlds, either in my toy-filled room or in the open fields behind our house.  Alone, I created the imaginary conditions of an impossible freedom, which were inspired by those fictional stories that animated my life.  I would be the knight, the soldier, the explorer, the king, the builder, or I would just roam the fields and forests with fantastic visions of other worlds, other times, other ways of being. 

But my ability to experiment and indulge this inner self was limited.  Not only did my parents police my activities, but, born into a lower-middle-class family, I learned that many types of experience were beyond my grasp.  Outside of fearing God, I learned to fear the lack of money and the pain, suffering, and denial that it could bring.  But I never had to go without basic necessities.  My parents always provided their children with more than enough to survive, and extra besides. 

I was fortunate to live such a comfortable life.  However, there were many luxuries that seemed like necessities to a child, as the power of peer groups and social status became more and more important at school.  Yet unlike many of my friends, I was almost always denied these luxuries, and I deeply felt the want of them.

The experience of economic deprivation was both humbling and frustrating.  I was of course frustrated at not being able to join my peers at movies, soccer camps, skiing in the mountains, concerts, or holiday trips.  I was also often embarrassed when I could not do small things, like go to the movies or go out to dinner, or when I lacked the fashionable attire on the sports field or at school. 

I especially felt the want during the early years of high school, when I did not have a car and had no prospects of getting one.  Eventually my grandmother gave me her old 1974 Chevy Malibu, which had a good engine, but it was a rusty old boat.  I was teased to no end, not only for the car itself, but also for the modest amenities I added to hide the damaged interior, like blue carpet (in a brown car) to cover up the cracked dashboard and rusty floors.  But at least I had a car.  Some of my friends did not.

Outside of the indignities of want, I was humbled by the fact that industrious and frugal people, like my parents and a great many others, worked very hard and long hours just to scrape by.  I learned to work hard, save my money, and to appreciate those times when small luxuries could be purchased and enjoyed. 

As a kid, I always had to do chores around the house, several hours a week, to earn basic privileges, like watching TV or spending time with my friends.  I also had to work part-time during high school, at jobs I hated, just to pay for the small luxuries of my life, like gas and maintenance for my car, going to the movies, going out to dinner, or a case of cheap beer. 

I was a manual laborer, and had I not fought my way out of my destiny, I would have always been a manual laborer.  From my first job mowing lawns at the age of twelve until I was a sophomore in college, I worked with my hands doing low-paid odd jobs that were physically demanding, and often dirty work. 

Even in graduate school, I sometimes did manual labor during the summers to earn extra cash.  I have mowed lawns and fields, dug ditches, cleaned toilets, mopped floors, vacuumed carpets, cut and nailed timber, hauled trash and debris, landscaped yards, built houses, painted barns, and harvested Christmas trees.  I've worked through pouring rain, freezing snow, and scorching heat.  Most of my bosses were skilled but poorly paid contractors who worked every day from dawn to dusk with very little to show for it. 

I hated this type of work.  I hated the sweat and the dirt and the pain of sore hands and sore muscles at the end of the day.  I vowed early on that I would do something more with my life.  Somehow, if I worked very hard, I might become economically free one day.   

So, unlike most of my friends, I tried to save the little money I earned in hopes of a better life.  While I cherished the freedom and happiness that money could buy, I also feared the constraints and embarrassments of its absence.  I worked hard and never spent idly.  This regimen only intensified during college.

My childhood was constantly policed: by parents, by priests, and by my own powerlessness.  The perimeters of my being were guarded and the inclinations of my inner self confined.  But I still struggled against this confinement. 

I yearned to be free.  I yearned to do as I wished.  And, on various occasions, I did break free, if only for a few moments. 

But once my parents became aware of my dalliances with the dark side of my nature, they became ever more firm, watchful, and restrictive.  Certain friends were prohibited.  Social activities were closely monitored.  I was forbidden “secular” music and movies.  The few tapes and magazines that I had smuggled into the house were confiscated and made into examples of God’s power. 

Once, I sat with my father watching my cherished contraband burn in the fire.  We both half-expected demons to rise screaming from the ashes because my father had told me that such occurrences really did happen. 

I was forbidden most television shows.  Books brought home from the library were censored.  Those deemed un-Godly were confiscated and returned.  Visiting the homes of school friends was strictly regulated.  Without much power to rebel, I was often reduced to a smoldering, feeble rage, carefully waiting for a chance to be free - someday. 

As a boy, I was very practical.  Openly rebelling was not an option.  Such insolence would have been beaten out of me, and what little freedom I had would have been reduced to nothing.  I learned that sanity meant giving in to the might of the powerful. 

Most of the time, I let my unobtainable dreams of freedom float away.  I spent my childhood living up to my public self, scrutinized by the gaze of protective parents and a wrathful God.  I tried to become a good Christian boy, which basically meant loving Jesus and doing what I was told. 

Gradually I embraced the religious life of my parents.  There was no other option.  Just as Kafka’s ape embraced the imposition of human nature, I embraced Christianity as my only way out.

Up until high school I faithfully went to church two days a week, sometimes more.  The congregation met every Wednesday and Sunday, supplemented with extra services, meetings, community service, and social activities.  I was an active and highly respected member of the congregation – largely, I might add, because my parents held leadership positions in the church. 

My parents were extremely active in the myriad activities of the church, which meant I was extremely active as well.  As a dutiful son, I did my best to live up to the esteemed stature of my parents.  I tried to make them proud. 

I was a founding member of the church youth group.  I helped start a Christian hip-hop band.  I was actively involved in community service, especially projects that benefited the needy members of our congregation.  I taught Sunday school to toddlers (my first real teaching experience).  I was a camp counselor for Bible camps.  I was called upon every now and again to help serve Holy Communion or take up the collection during a Sunday service. 

And on many occasions, Bible in hand, I went on evangelical missions near and far to help save souls for Christ.  All in all, I seemed to be a model Christian.  Much of the time I actually believed that I was a faithful child of God. 

But throughout these times of professed piety, I also indulged in my perverse, private self.  I found ways to sneak contraband into the house (books, movies, music, and magazines).  I had many impure thoughts, obsessively thinking about sex, as all young boys do.  I privately criticized certain aspects of Christianity that seemed unrealistic or overly harsh.  And at times I even questioned the reality of God. 

I relished those times when I was away from my parents’ confining gaze, especially in the company of secular friends, or even with my more liberal Christian friends.  Ultimately, I had troubling questions.  I also had an insatiable curiosity, as well as a growing rebelliousness that bubbled up from the darkness within. 

Yet I retained my pious mask.  I tried to find comfort in the rigid confines of Christianity.  But always my imagination held hope for a different way of life.  I dreamt of the possibility of broader freedom.  

My parents no doubt suspected the darker side of my nature.  At around the age of thirteen they locked me in an existential cell.  My parents decided to pull me away from the snares of the secular world and cloister me in the confines of a strictly Christian education.

I was to be home schooled. 

Without much time to react, I was withdrawn from the public school system halfway through 7th grade.  My parents didn’t even wait for winter break.  I was quite upset by this decision and forcefully tried to block its implementation.  I even threatened to run away.  But eventually I acquiesced because in reality, as in every other area of my life, I was quite powerless. 

I had no choice.  I had nowhere else to go and no other way to survive.  My parents held all the cards and I was smart enough to know that I was beat.  All my secular school friends quickly disappeared.  We moved to a new house.  Although still living in the same city, I felt worlds away from my former life.

Academically speaking, home schooling was a waste of my time and talents.  I languished.  I was also socially isolated.  My weeks were filled with monotonous routine.  

I was indoctrinated every day with a Bible-based curriculum (Christian English, Christian math, Christian art, and Christian history), supplemented by reading the Bible (in case I didn’t get enough from the rest of the curriculum).  My extracurricular activities included going to church, volunteering for church-related activities, or sometimes visiting other home-schooled families from our church.  Had I completely conformed to this educational regime, I most certainly would have turned into a monk.  I’m sure that would have pleased my parents.

Most of the curriculum was designed by an Evangelical Protestant publishing company.  It consisted of a series of workbooks and fill-in-the-blank tests, which ritualized a very superficial, fact-oriented knowledge.  And given its clear Christian bias, much of the “factual” content was merely pre-packaged, bite-sized dogma.

The history book was somewhat different.  I read a large textbook, which used Biblical literalism to retell five thousand years of Western history from a Christian point of view.  Yes, in case you’re wondering, the dawn of history begins with God creating the heavens and the Earth in six days.  I wonder how it ended…

The remaining part of my curriculum was more laborious and boring.  I had to read the Bible cover to cover once a year – on top of the verse-by-verse reading of the Bible in church twice a week.  In case you haven’t managed to read this entire book, it’s overrated as literature and quite vague as a spiritual handbook for modern life.  Perhaps that’s why my father’s bookshelves were filled with Biblical commentaries on every facet of this baffling book.  Even the faithful get confused by its incoherence and contradictions. 

God’s Word penetrated my daily being.  It pervaded my consciousness.  It seeped into my skin.  Even as a middle-aged man, I still sometimes sweat the Bible out of my pores.

I was home schooled for about three years.  During that time, I was intellectually starved, socially isolated (except for church-related activities), and generally bored.  I hated it.  I had friends at church whose company I enjoyed, especially when I could escape my own house and sleep over, but the rest of my life was largely a nightmare. 

And as all prisoners do, I acquiesced to the confinement.  I settled into an acceptable routine.  I was smart (and devious) enough to realize that the home school curriculum was a joke.  So, by the second year of my prison sentence, I secretly rebelled against the absurdity of my existence.  Every morning I took both my workbooks and the answer key.  I didn’t bother to read the textbooks.  I just used the answer key to fill in the blanks and circle the correct answers.  The history book was a bit more fun because I loved to read and I liked history.  I managed to read this textbook and write several essays.  I would have liked to skip the Bible reading, but my father always asked about the scriptures, so every day I breezed through a couple chapters of the Bible without much thought. 

This was my educational routine for the next two years.  Cheat, read the Bible, go through the motions.  I became so efficient that I was finished with “school” by mid-morning.   

Thankfully, I finished the bullshit quickly, and I had most of the day to myself.  I usually sat in my room, pretending to study, and let my imagination run free.  I listened to music, daydreamed, and sometimes thought about the future. 

Mostly I used the time to read, often contraband books from the public library that I snuck into the house.  As a young boy I loved fantasy novels with sword play, strange creatures, and heroic journeys.  The Lord of the Rings trilogy was my favorite; I read it and The Hobbit at least once a year.  I also checked out other books from the library that were more traditionally educational, like biographies and history books.  I’ve always had a fascination with the past. 

Books became a window into another world, exposing me to various forms of life that I was denied.  I read to feel alive.  I read to escape.

My First Real Education, part 3

Discovering Books and Reading My Way Out

originally written 2014

There was a distinct irony to home schooling, which only I could enjoy.  While my parents planned my indoctrination with a Christian curriculum, I used most of my “school” days to read secular books and engage with the devil. 

I loved to read.  It was the only way to develop my sense of self freely, feed my intellect, and escape the confines of my prison.  I taught myself during these years.  For much of my youth, I was an autodidact. 

My education was gleaned through reading whatever books I could get my hands on.  I subverted the educational intentions of my parents, and explored the world through books, often secretly checked out from the library and snuck into the house.  In the process, I learned how to learn. 

I learned despite a boring and oppressive curriculum.  I learned without teachers.  I learned without a school.  I discovered an impressive gift during these years, which would permanently define my ethos.  As the philosopher Robert Nozick once said, "We are not identical with the books we read, but neither would we be the same without them." (1)

Love of reading was a family inheritance, learned through the example of my father.  I remember my father did two things every Sunday: he went to church and he bought a newspaper.  In every house we lived in, my father’s books were always on prominent display on large bookshelves.  Some of my earliest memories are of shelves perfectly lined with books. 

Subconsciously, books will always feel like home to me.  My father spent hours every night studying the Bible or reading newspapers.  In our house, books were a constant presence and reading was a holy activity.  I grew up believing that books were, like manuscripts in Medieval times, a "precious object." (2)

So, reading became second nature.  It was a way to conform.  But more importantly, it was also a means of escape. 

During my home school years, I began to use most of my free time to explore my father’s books, spending countless hours sliding my fingers over each volume.  Some were historical books filled with pictures from the Civil War, the 1960s, or maps of ancient Rome.  There was an abridged Oxford English Dictionary and countless other reference books. 

Some books were old classics, like Dickens, Thackeray, and Kipling.  There was even a complete works of Shakespeare, each volume separately bound in blue leather.  I would dabble in reading these, working more on Dickens and Shakespeare than the others.

Most of the books were Christian, in one way or another.  My father owned several Bibles.  There was the King James, the Latin Vulgate, the New Revised Standard edition, Bibles with commentary, Bibles with maps, study Bibles, abridged Bibles, and devotional Bibles.  He also had hundreds of commentaries on the Bible, devotional studies, or idealized testimonials about being a Christian. 

There was even a multi-volume series that claimed to decode the prophetic books of the Bible.  These books were somewhat frightening because they prophesied the “end times” of the apocalypse when Jesus would come back on a white horse to judge mankind and send the damned to hell.  For decades, I was programmed by my parents to believe that the world was ending soon.

Some books used Evangelical Protestant readings of the Bible to debunk other religions, explaining how Muslims, Jews, Mormons, and everyone else were destined for hell because they worshiped false gods.  Some books described America as the province of Satan, filled with atheists, feminists, gays – the whole host of the damned. 

Some books described America as a Christian nation, pointing out how every notable American believed in Jesus, from George Washington to George Bush.  There were also Christian novels. Some were classics, like Bunyan’s Pilgrim’s Progress or C. S. Lewis’s The Lion, The Witch, and the Wardrobe.  Many were modern Christian novels. 

The most interesting Christian novels focused on “spiritual wars” between demons and angels.  These books told lurid tales of unscrupulous liberals, drug-addicted celebrities, and vicious abortion doctors.  The most frighteningly vivid novels explained the torturous violence of the apocalypse, which my father told us, again and again, was looming in the near future.

Once I became acquainted with the religious nature of my father’s books, they largely lost their luster.  Even at the time, as a young boy, I sensed how silly many of these books were.  But there were some other books that caught my attention. 

Deep within the bowels of this mountain of Christian literature lay an unnoticed and surprising corner of dusty books.  For many years I passed them over because I did not recognize the authors and could not classify their content based upon the titles.  Then one day I happened to open one.  I’m not sure why my father had these books.  I never bothered to ask.  They must have been from elective courses in college.  Yet, unlike most of his other college books, which he gave away or destroyed, he saved these for some reason.  It was a fateful decision. 

At first, I did not completely grasp the meaning of these books.  Upon my father’s shelf, I found Friedrich Nietzsche’s The Birth of Tragedy and The Genealogy of Morals, Thomas Hobbes’ Leviathan, Henry David Thoreau’s Walden, Ralph Ellison’s Invisible Man, J. D. Salinger’s The Catcher in the Rye, Herman Melville’s Moby Dick, and Nathaniel Hawthorne’s The Scarlet Letter. 

After several readings, these books slowly opened up not only other worlds but also a sense of self, and they began to inspire a purpose.  My true education began at last. 

As a philosopher once recalled, "The substance of my being has been informed by the books I learned to care for.  They accompany me every minute of every day of my life, making me see much more and be much more than I could have seen or been." (3) I too felt this way.  I will always feel this way.  These books, and many others, would awaken me from a deep existential slumber.  They would help clarify my own existence and open up a world of possibilities.  I carry the words of these childhood books still written in my mind, burning in my heart. 

These books would eventually change my life, but it took a while.  Most were quite difficult to understand.  I had to read them all more than once.  I remember trying to read Ellison’s Invisible Man for the first time at the age of thirteen.  I was disappointed when it was not about an invisible man, or any other superhero with special powers.  I lost interest and dropped the book on my floor. 

But after a few months I got bored and picked it up again.  It was difficult and strange.  The cruelty of the battle royal was very alarming.  But I was drawn in by the narrator, a young black boy trying to find his place in a hostile white world.  I will forever remember those words written on the supposed letter of recommendation as he traveled north to escape the racist prison of the South: “keep this boy running.”  Here was a character searching for wholeness, searching for identity, searching for freedom, and nobody could help him – nobody would help him.  By the end of the book he literally finds that he has no place in the corrupt, crazy world, living underground and relishing the alienated freedom of his invisibility. 

That message struck me deeply and helped me understand my own situation. 

I read The Catcher in the Rye shortly after.  Again, a young protagonist searching for a way out of the false and confining world of school, adults, and meaningless existence.  This book was much easier to read.  And like Ellison's book, it hit me on a visceral level, speaking to my condition. 

After reading these two books I was significantly changed, although I couldn’t put my transformation into words.  It was more of an unconscious feeling.  I spent many afternoons pondering the meaning of these books and the meaning of my life. 

I knew that I was not alone.  Others had felt my alienation, my disappointment, my confinement.  Others had searched for freedom.  Through these fictionalized stories I found my self, waiting.  Unexpectedly, I found my self in the prose of another.

I was caught off guard.  I did not expect literature to communicate so clearly and to penetrate my self so powerfully.  Up until this point, most of the secular books I had snuck into the house were a diversion, an escape from reality – they were fun. 

Ellison and Salinger were different.  These books were more difficult to read.  The plots were not always exciting.  The messages were more obscure.  It was almost like reading the Bible.  Actually, I came to realize that it was like reading the Bible.  Just like the Holy Book, these works of literature were also holy books because they held meaning, deeply hidden, but waiting for the patient and active reader. 

In these holy books, there were deep and cryptic lessons on the human condition.  These books described a reality and an experience that was very different from my own, yet there were profound similarities.  As M. H. Abrams once explained, "In a receptive reading of the text as literature...we participate from the inside with subjectivities very different from our own - the subjectivities both of the author and of the characters that the author has bodied forth - and so are enabled to see ourselves as others see us, to see others as they see themselves, and to acknowledge in others some part of ourselves." (4)

I read about myself in these works of fiction.  In these books, I found my own experience, my own longings, my own fears, my own dreams.  I was the invisible man.  I was Holden Caulfield. 

For some time, I had been searching for an identity and a place in the world.  These books communicated truths that helped me to understand and articulate my inner self.  Books also provided a link through the past with other people who wrestled with similar experiences and problems.  I did not just read books; as the philosopher Montaigne once explained, I discussed and dialogued with the dead as contemporaries, as friends.

It was a year later that I opened the pages of Thoreau’s Walden.  Several times I almost stopped reading because the book was so difficult, but for some reason I kept coming back.  I sensed that there was something – something that I needed to find in those pages.  I felt urgency and impatience in Thoreau’s tone.  I intuited a deep wisdom underlying his conversational prose. 

It was as if this man was reaching through history, shaking me from a slumber, and trying to teach some hidden truth.  And then I found it.  I found a truth I will always carry with me. 

Thoreau said to me, “I went to the woods because I wished to live deliberately, to front only the essential facts of life, and see if I could not learn what it had to teach, to live deep, to suck out the marrow of life, to put to rout all that was not life, and not, when I came to die, discover that I had not lived.” 

As a boy, these words hit me like a hammer.  I was anguished in my prison, realizing how I had not yet lived.  I was stillborn.  I had been wasting time playing empty games dictated by adults.  My life was planned according to a pattern that chafed and cut my inner self. 

I felt transformed.  For the first time my inner self felt legitimated and strangely powerful.  I resolved to breathe after my own fashion.  Let others be damned, I would see who was the strongest!  I began to see life as a gift. 

Despite external dictates and expectations, I realized life was mine to live as I would make it.  My life was clay.  I could form it according to my own design.  I realized that Thoreau too had been trapped by the confines of tyrants.  He too had been kept from living freely.  But he rebelled against his fate.  He made an exodus to the woods, sought out his true self, and managed to live deliberately at great personal cost. 

“What concerns me now,” wrote Thoreau's friend Margaret Fuller, “is that my life be a beautiful, powerful, in a word, a complete life.” (5)  I would not read Fuller until much later in my life, but her words express the profound aspiration I felt upon reading Thoreau for the first time.   

Thoreau, Ellison, and Salinger gave voice to the rebel inside.  I was inspired with a vision of another type of existence.  I vowed to dig more deeply into life’s mystery and find my own way. 

But who was I? 

How should I live? 

I knew nothing.  I was totally ignorant about that which really mattered.  So, until I could find my way, I became more actively rebellious, breaking the chains that bound my ability to be me. 

I was at the beginning of a quest to live deliberately. 

I wanted to discover my self and test the boundaries of being.  I had found a way out, but I was not yet free.  I realized that I could not act until I had broken away from the confinement of home schooling.  I waited patiently as the first flower of freedom budded within.  I longed for high school because it would be my escape.


References  

(1) Robert Nozick, The Examined Life (New York, 1989), 15.

(2) Lucien Febvre and Henri-Jean Martin, The Coming of the Book: The Impact of Printing, 1450-1800 (London, 2010), 104.

(3) Allan Bloom, The Closing of the American Mind (Touchstone, 1988), 245.  I agree with Nussbaum that books can become false authorities and that, quoting Seneca, there should be "a space between you and the book" (35).  However, I disagree when Nussbaum says that "books are not 'alive'" and that they display an "inflexible sameness, addressing very different people, always in the same way" (34).  Instead, I agree with Ralph Waldo Emerson, who argued that books can communicate a person's ethos and experience, and that we can interact with a book as living thought.  See Martha C. Nussbaum, Cultivating Humanity: A Classical Defense of Reform in Liberal Education (Cambridge, MA, 1997), 34-35.

(4) M. H. Abrams, "The Transformation of English Studies: 1930-1995," American Academic Culture in Transformation, Thomas Bender and Carl E. Schorske, eds. (Princeton, 1997), 147.

(5) Margaret Fuller, Woman in the Nineteenth Century, ed. Larry J. Reynolds (New York, 1997).

My Mis-Education in High School, part 4

The Hidden Curriculum of High School

originally written 2014

Out of a paternal sense of love and duty, my education was planned and my life dictated.  Certainly, the prison of home schooling was built for my protection, but like a son of fallen Adam, I vowed to test the laws of God.  When divine punishment was not forthcoming, my transgressive spirit stretched further. 

By the time I was sixteen, I had been born again.  Despite being baptized by my pastor in the river as an outward sign of my forced devotion, I came up through the cold water alienated and confused.  I was dancing to the warden’s tune, while plotting my escape.  But I had nowhere to run. 

I was trying to find a way out.  Freedom would come slowly.  But even when I found some semblance of freedom, deliberate living would still be far away.  The freedom found during high school and the early years of college turned out to be a mixed blessing. 

During these years, I found a freedom from the restrictive boundaries of my youth.  But I was only rebelling against the dictates of others, while exercising a blind volition.  I had no freedom to live my life as I wanted. 

To make matters worse, I didn’t even know what I wanted.  Although less restricted, I was directionless.  Instead of living deliberately, I drifted mindlessly. 

Setting out on a journey for truth, I became intoxicated by freedom and succumbed to wanderlust.  Eagerly breaking away from youthful prisons, my exodus stalled in the deserts of experimentation and transgression.  

When I was sixteen, my parents allowed me to re-enter secular society.  I had forcefully demanded from the start of home schooling that I would return to a normal life one day.  Half-tamed by my captivity, I had matured and embraced my public self. 

I was a good Christian boy.  I had been publicly confirmed and congratulated by the whole congregation.  I had been baptized in the river and my sins had been washed away.  I had proved my merits and acted the part of a good Christian boy so completely that a great trust was extended. 

My parents felt secure in the external signs of my religiosity.  They were no doubt sure that I had been fully converted.  I had become a faithful follower of Christ, so a return to public schooling seemed sensible and safe.  In the fall of 1991, I was enrolled in senior high school as a sophomore.

As far as anyone knew, I was a model Christian – baptized with water and “on fire” with the blood of the lamb.  I half-believed it myself at times.  I certainly played the part.  This had been my only way out of the tyranny of my home.  After sending me to Biblical boot camp for the past few years and gearing me up in the armor of the Lord, my parents believed I had internalized a spiritual discipline.  They believed that I had become a soldier for Christ.  

They assumed that my faith would withstand the snares of the secular world.  But unbeknownst to them, I had already undergone a secular transformation.  Before high school began, I was already losing my religion. 

I was in a paradoxical position.  As Werner Heisenberg once acknowledged, "If someone were to say that I had not been a Christian, he would be wrong.  But if someone were to say that I had been a Christian, he would be saying too much." (1)

I had kept up appearances, diligently polishing the Christian façade, but internally this public face had cracked.  During high school this façade would crumble.  I would come to sacrifice Christ for the secular sacraments of sex, alcohol, and rock & roll.  My new secular environment unleashed a devilish spirit, and over the next few years I would let this beast run wild. 

Being an adolescent in a public high school was tough.  Actually, it was excruciatingly painful.  I was caught between two senses of self: the good Christian boy, an established public identity that I had worn over the last three years, and another identity that grew from my inner self, which was emerging in this new secular environment. 

I didn’t know who I was and I didn’t understand the new game I had entered.  I was unsure how seriously I needed to take my schooling because high school seemed like a joke.  Most of my peers treated the routines of schooling like a suburban crosswalk: you only follow the rules and walk between the lines on those rare occasions when police are watching; the rest of the time you do as you please and break the law at will. 

Despite the outward rituals of schooling, I quickly sensed my peers were playing a game that had nothing to do with education - a game that adults didn't understand.  Living at the borders of a complex social ecology, I was lost and baffled.  I didn’t fit and I didn’t know the rules.  I didn’t know how to dress, how to speak, how to act.  I had no friends. 

At first, high school was a Darwinian jungle, all predators and prey.  I was fresh meat, trying not to be eaten alive.  I was isolated in a hostile environment.  Playing school under the prying eyes of teachers and parents was the easy part.  Classes were simple, teachers were accommodating, and exams were surprisingly easy.  The harder test was learning how to survive and thrive in the savage social world of the teenage animal kingdom. 

The routines of schooling mask a hidden curriculum.  Teachers rarely acknowledge its existence, but students intuitively understand and comply.  Beneath the academic façade lie subtle rituals and a fight for social status.  Socialization is the primary imperative of the institution we call “school,” not education in its broader scope, nor training in the narrower sense. 

Yet schooling in America is not socialization into the adult world.  Instead, students are coddled in a rosy, idyllic garden during elementary school only to be later thrown into a cruel teenage pantomime of American society: a status-driven, class-based, red-in-tooth-and-claw struggle for power and fame.

Growing up in the northwestern United States, in Oregon, I found that elementary schools provided a homogenizing social experience.  Ethnic, class, religious, and personal differences were boxed out by an ideology of friendship, fairness, and follow the leader.  More than literacy and mathematics, elementary schools drilled a deeper lesson: following directions and living peacefully with your peers. 

Grade school teachers socialized kids into thinking that we all live in one big happy family led harmoniously by the adults in control of our lives.  As long as we do as we’re told, play nice, and treat everyone fairly, then all is well in the world.  Of course, this is all a lie.  We live in an ethnically diverse, morally fractured, competitive, caste-based, racist, existentially seething, alienating society where not everyone has an equal chance to succeed, and never will. 

The idyllic ideology of Eden taught in grade school begins to crack by junior high, and the lies begin to fall before a harsh reality.  Kids begin to realize the real world is unfair and brutal, so they begin to treat each other accordingly.  The social fissures and corruption of the adult world collide with the natural rebelliousness and experimentation of adolescence. 

By the time teenagers have begun to scratch out their identity, navigate the expectations of maturity, and advance towards the inner mystery of adulthood, the world reveals itself in all its contradiction and cruelty.  For most, the veil is lifted by the age of sixteen, if not earlier.  The horror of horrors lies exposed. 

Kids realize the part they must play in the savage circus we call society.  As the fictional Vernon Little exclaims about the "lie-world" children live within, "The truth is a corrosive thing...The Human Condition...Watch out for that fucker." (2) Reality bites! 

High school is the institutionalized fall from grace.  It is a slaughter house.  Teenagers find their childhood gouged out, blindly falling over the next few years into the razored webs of adulthood.  Easily corrupted by the dim hope of freedom, most teenagers eagerly rage to embrace the dying of their light, but not all. 

Some grasp hard at their naivety.  These lambs blissfully stumble past much of the violence by hiding in their ignorance, pitching a tent in daddy’s protective checkbook, clutching at a crucifix, or diving into some other mechanism of escape.  Yet these lambs too will be led to the block.  No one escapes. 

The hidden curriculum of high school contains several important truths that every student struggles to learn.  While the state-mandated curriculum was optional, and easily subverted, these deeper lessons were unavoidable, arising from the necessity of survival. 

The first lesson was paradoxical, but strangely familiar.  All adults are enemies, until you become one.  Most teenagers want to become adults as quickly as possible.  Teenagers desperately yearn to knife away their innocence and the restrictions of childhood in order to gain the bittersweet fruits of adult freedom, completely unaware of the crippling responsibilities and consequences that follow. 

As children we live in a state of debt peonage to our adult overlords to whom we owe our life and love.  We become more conscious of the circumstances of our slavery during adolescence.  Teenagers live uneasily in the humiliation of bondage, even though most still love their masters. 

All teenagers yearn for those two magic days, marked by the astral signs of birth, when freedom is bestowed and life truly begins.  These secular sacraments are the eighteenth and twenty-first birthdays.  In America, these dates are more anticipated than the second coming of Christ. 

By adolescence, teenagers are keenly aware of their bondage.  Children are human clay being molded by parents, priests, and teachers for distant and alien ends.  Adolescence begins to breed a seething rebellion against the institutional structures that control the contours of youth. 

Teachers bear the worst of it.   These poorly paid professionals are the tyrannical gatekeepers of the adult world.  Public school teachers are pitiless mercenaries employed by the state to torture all teenagers with boredom, rules, exams, and homework.  Middle schools and high schools only appear to educate. 

Beneath the surface of ordered classrooms, hall monitors, and report cards there lies an incensed rabble often bubbling into anarchy.  Teenagers only play the role of student so they can resist any actual learning.  Every classroom seethes with silent hatred, refusal, and subtle revolt. Teachers are in charge only to the extent that they do not push their prisoners too hard. 

Thus, the primary duty of all teenagers is to resist schooling.  This art form is taken one step further by the most ambitious students: Resist schooling while earning good grades.

The second lesson of the hidden curriculum was more subtle.  Pagan shamans once believed that having knowledge of true names was a form of power.  To name objects was to know.  To know was to master.  Real friendships during high school, although possible, were rare.  Most teenagers formed temporary alliances, treating each other as rivals in a game of conquest. 

For protection, you learned to hide your identity, to wear masks, and to practice subterfuge.  For aggression, you stabbed at others with labels, wounded with pranks, and tore down the unsuspecting physically and emotionally.  The powerful proudly conquered the weak. 

Popularity was the prize.  Society was a skirmish and high school halls were filled with marauding teens armed with verbal bludgeons, savagely striking with a stinging sense of sarcasm and entitlement.  The unpopular students largely cowered in fear, avoiding public spaces at all costs.  Of course, not all the young gladiators were aware of the arena, understood its cruelty, or willingly participated in the gory contest of battle.  Some scurried away from the predators and hid beneath the rocks. 

Popularity is power.  Like most other forms of authority, it is largely accumulated through savage exploitation.  Most popular kids calculate the loss of lesser lives like warriors conquering for prestige.  Many scalp hard, and wear the bloody pulp as a badge of virility and honor.  I saw countless innocents fall under the knife, myself included.  The wounds cut deep. 

During my sophomore year, I was socially bloodied, but not often publicly humiliated.  I licked my wounds, avoided the most powerful potentates, and quickly learned to strike first blood.  By my junior year I fought my way up the social hierarchy to earn a place in the shadow of kings.  By my senior year, I was royalty.

As the battle for popularity was brutally learned, the next lesson of the hidden curriculum became strikingly obvious.  The high school is a structured social hierarchy, tiered like the ancient chain of being.  There existed two separate spheres of influence.  In the adult fiefdom, the school principal was enthroned on high, supreme in power and aloof, aided in trinity by the vice principal and dean.  This administrative godhead was attended to by gradations of lesser deities who engineered the institution of socialization: teachers, counselors, and coaches. 

The actual breaking of the teenage beasts was done by an army of under-educated lecturers and disseminators of standardized tests.  These so-called "teachers" were not really paid to teach.  They tried to tame the savages through rituals of sadistic boredom and mindless conformity.  Students are locked in caged classrooms, daily broken by homework, exams, report cards, and class rankings.  We play this game because we're supposed to.

But within the halls and outside the walls, there exists another sphere of influence, more important and more immediate.  Graded down to the lower depths of hell, the petty fiefdoms of the teenage rabble keep local control: athletic princes, fashion princesses, preppy courtiers, class clowns, drug dealers, thugs, and an endless motley assortment of ill-defined cliques. 

Despite the academic pretension of the American high school, the singular purpose of this institution seems to be sorting individuals into social cliques, loosely related to the hierarchical social structure of the adult world.  Students must find a petty fiefdom and pledge allegiance to secure identity and friendship.  It can be vicious.  Not all teenagers make it through this ring of fire. 

Teenage identity is socially negotiated by the silent consent of the majority.  Some use humor, some intelligence, some physical prowess, some beauty, and some use violence.  From the first day of fall term, labels are bandied about by the powerful.  One either accepts the hand of fate or struggles valiantly against it.  The trenches of the social world are daily assaulted and the flags of fidelity switch back and forth over the scorched terrain.  More than a few lose their lives in the struggle for identity and acceptance. 

The institution of the American high school is not just a physical and social reality.  It is also ideological.  This was the final lesson of the hidden curriculum – and the hardest to learn. 

Beyond the ordered classrooms and tiered identity groups, there was an ideological message inscribed on the walls and emblazoned into every textbook, although it has devolved into mush over the last quarter century.  If the practical purpose of high school was socialization, then the ideological purpose was nationalization.  The ideology of Americanism is a confused rhetoric of equality, meritocracy, and aristocracy. 

For most teens these three principles morph into a mocking myth called the "American Dream."  It goes something like this:  Teenagers are second-class citizens equal only in powerlessness.  While forced to follow the dictates of adult society, youthful slavery is eventually exchanged for citizenship, granted in stages at eighteen and twenty-one.  Thereby, all young adults gain unequal measures of quasi-freedom, while slowly selling themselves into another form of slavery, a career.  They barter away their newly granted liberty in a labor market, exchanging hope for a wage. 

Students mercilessly compete for academic distinction and degrees, often financially indebting themselves, just to fetch a higher price on the road to serfdom.   The rewards of prosperity, however, are limited and not equally available to all, especially those marked by the disadvantages of poverty, gender, race, or disability. 

Because America is one of the most inequitable countries in the developed world, the greatest spoils are bestowed on those born into privilege.  The rest claw up the broken social ladder, striving for success.  Few make it.

 


References

(1) David Lindley, Uncertainty: Einstein, Heisenberg, Bohr, and the Struggle for the Soul of Science (New York, 2008), 77.

(2) DBC Pierre, Vernon God Little (New York, 2003), 125, 28, 129.

My Mis-Education in College, part 5

Life’s a Party, Until Reality Bites

My Journey from High School to College

originally written 2014

Navigating the hidden curriculum of high school became the focal point of my adolescent education, although the last lesson took almost a decade to understand (I was blinkered by the myth of the American Dream well into graduate school). 

Each semester of high school was like making my way through one of the nine circles of hell.  After successfully performing this sacred rite of passage, I was among the blessed (just barely), those who would venture on to the Purgatorio of college.  So-called "higher education" was a magical institution where immature souls waited for ascension to the mythic bliss of professional purpose and economic independence (although these heavenly rewards prove perpetually elusive to most). 

The secular comedy of schooling in America marks a profound human transformation, second only to death, albeit for many, much more protracted and painful.  I survived, of course, but it wasn't easy.

Entering high school as a sophomore in the lowest ranks of hell, I thought the torture would last forever.  I endured a year of humiliation and ostracism at the margins of the savage circus, making few friends, but keeping up my grades.  Although home school was a joke, I soon realized that the academic lie of high school was an even worse masquerade. 

Classes weren’t challenging.  Homework was simple.  Cheating was rampant.  Maybe half of the students actually did their own homework, but that wasn't saying much because a trained monkey could also accomplish most of our assigned academic tasks. 

There were a hundred ways to cheat on an assignment or a test.  The easiest and most brazen way was to get the teacher’s answer book and write down the answers in advance.  In some classes, that wasn’t very hard.  Most students cheated in some way each week. 

I managed to pull a B+/A- average without much effort.  I cheated some of the time but not often.  Hell, most assignments and tests were so easy I didn't need to cheat!  Most of my friends earned respectable C averages by doing nothing at all.  Literally, they did nothing at all and still managed to pass their classes!  

High school was an academic joke and it seemed our teachers were oblivious, or they just pretended to teach, probably both.  In some classes, we simply sat around, socialized, played cards, and gambled away our lunch money. 

My friends and I eventually graduated with a diploma.  This piece of paper certified not an education earned, not even a social milestone, but simply our ability to successfully play a silly institutionalized game called "high school." 

Any effort or enthusiasm in the classroom was against the unstated teenage code of honor.  We were proud of coming through this institution no smarter than when we came in.  For most of us, what was really significant about these years were the positive social experiences of popularity and partying.  We refused all notion of adult responsibility in our drunken denial, as we spun around in circles having fun.

Conformity was the key: dress a certain way, engage in subtle rebellion, poke fun at teachers, and slowly earn the respect and trust of peers.  While I didn’t make many friends as a sophomore, I did manage to impress the right people, acquiring scores of powerful acquaintances and the scorn of several teachers.  This earned me a ticket to the popular crowd by my junior year. 

At the age of seventeen, I was born again – no, a different kind of “born again.”  Like a religious conversion, I was baptized into a new community.  I was now a “cool” kid. 

I had ingratiated myself with the varsity soccer team as a member of the summer tournament squad.  I was a dedicated player and I loved the game, but I hadn’t had the luxury of premier leagues, private lessons, and expensive summer camps.  Despite trying very hard to officially qualify for the regular varsity team, my skills were not advanced enough, and I was too old for junior varsity.  However, I still played whenever I could and I became fast friends with the entire soccer team. 

These friendships opened the doors of my social existence and defined the course of my life for the next two years.  I took a new identity and lived a vibrant life, rising from the lower depths of hell to become popular.  I had been invited to dine at the table of teenage kings - well, princes at least.  As in most high schools, true royalty was reserved for football players and the cheerleading squad.  My life shortly transformed from a state of awkward exclusion to become one big party.

As a newly ranking member of the popular class, I pledged allegiance to a pantheon of teenage pagan gods: athletics, intoxication, fornication, rebellion, and general mischief. 

We were determined to unshackle ourselves from the slavery of childhood by subversively, often illegally, engaging in certain rituals of adulthood.  The primary technology of teenage rebellion was alcohol, the golden nectar of liquid courage and affability.  Alcohol has long been used by mystics, divines, and pleasure-seeking fools.  It breaks down inhibitions, brings on feelings of general wellbeing, opens the doors of perception, and allows for the spontaneous release of emotion and energy. 

For underage drinkers, the most difficult task was acquiring this magic elixir, but it wasn’t too hard.  We sometimes used older siblings or even permissive parents.  More often we befriended newly minted adults who still liked to party with minors.  Less frequently, friends would purchase fake IDs or inherit a sibling’s old license.   

Another option existed for some, like me, who looked old for our age.  On more than one occasion I simply walked up to the counter and purchased small quantities of alcohol without any questions asked.  Some daring acquaintances actually stole booze, often through the backdoors of supermarkets, but at times they just picked up a six-pack and walked straight out the front door.  Getting alcohol was not too difficult, neither were drugs.  I had several friends who were petty drug dealers. 

The most challenging aspect of underage partying was finding a suitable venue for our Dionysian revelries.  Drinking or smoking pot with a small group in a backyard, in a car, or in a parking lot was the easiest way, but one had to be careful not to draw too much attention.  Agents of the adult world were everywhere.  Police were a constant fear.  Our premature push into the adult world was criminalized, and otherwise mildly mischievous behavior was enough to earn a rap-sheet. 

Secluded public spaces were the best option for drinking.  It was easy to hide.  Parking lots or parks worked well, but these locations were often easily discovered.  Sometimes we would go into the woods outside of town, telling our parents that these excursions were “camping” trips.  But most teenage drinking and pot smoking is done at house parties, usually when parents are away on business or vacation. 

I wasn't a drug person, although many of my friends were.  I didn't try marijuana or other drugs until college, but even then, I always preferred alcohol because pot made me sleepy, and other drugs made me a bit paranoid.  I'm sure all the alcohol I consumed as a minor destroyed more than a few brain cells because I don't have many memories of all the endless parties.  This time of my life blends together into one big blur. 

I have several hazy memories, mostly of times when something outrageously funny or serious occurred: stupid pranks, fights, bloody body parts, girls, close encounters with the law, and more than one near-death experience.  There was also experimenting with sex, though teenage sex is mostly unremarkable in its splendid awkwardness; I remember the feelings of fear and anticipation more than the sex itself.  There was dancing, laughing, broken bottles, beer bongs, drugs, and much else I won’t speak of, including retching my guts out and many nights passed out in random places.  I know I had many wonderful experiences during these years.  I just can’t remember most of them. 

Looking back at our youthful stupidity, I am amazed that more of us did not get arrested for underage drinking, public disturbances, drunk driving, or more serious offenses.  At several points many of us came close to dying, some in automobile accidents, some from alcohol poisoning, and some from ridiculously stupid behavior. 

Leaving early from a party one night, I got lost in unfamiliar hills and stopped at the side of the road because I saw a shoe.  Flashlight in hand, I realized that a friend’s truck had driven off the road and rolled over a hundred-foot embankment.  I managed to climb down the hill and stabilize the two boys, one of whom had broken his arm, while the other had punctured a lung and was coughing up blood.  I put flares on the road and flagged down a vehicle of strangers that happened to drive by.  The ambulance arrived over an hour later and I had to help the paramedics haul the gurney up the embankment.  Luckily, they both survived. 

Another time my friends and I were the lucky ones.  We were in a truck drinking while driving around the mountains not far from town.  For some reason my friend lost control of his truck, we slid over the embankment, and down the side of the hill.  We happened to crash into a tree, which kept us from falling over a cliff to our certain deaths. 

There were countless other times when, in a drunken haze, we drove our cars, played with guns, jumped off bridges, got in fights, or rode bicycles down a flight of steps in the house to fly out the front door.  On occasions such as these, and many others, my friends and I dumbly managed to escape – from detection by the police, from destruction of property, from dismemberment, or from death.  It was all a game to us, although quite serious in its consequences. 

We took our lives lightly.  Believing ourselves to be invincible, we acted as if we would live unscathed forever.  Not all of us did.  We had a dangerous attitude and it caught up to many of us, myself included.  One by one, we began to fall victim to our karma. 

Some damaged themselves, some died, some were arrested, some became alcoholics, some became addicted to drugs, and some dropped out of school.  The funeral of a high school friend is perhaps the worst of it.  I remember going to three.  My fall didn't come until my sophomore year at university. 

Unlike many of my friends, I had the grades to go to a four-year university right out of high school.  However, I couldn’t afford it.  Instinctively I knew that college was a necessity, but I didn’t have a clear focus on what to do with my life, and I didn't want to take out a bunch of student loans only to waste my time in a directionless haze.

Besides, I was still having too much fun with friends in my home town, many of whom were still in high school.  Luckily there was a community college at the northern end of town.  Close to a quarter of my senior class enrolled after graduation, adding to the scores of other high school graduates from previous years who also attended.  We joked about the community college being an extension of the high school.  For the most part it was.  I had actually already enrolled as a high school student because I took “advanced placement” classes.  These were dual enrollment courses earning both high school and college credit.  When I graduated from high school, I was almost done with my first year of college. 

That fall I enrolled as a full-time student at the community college.  But even going to this second-class institution was not a done deal.  My parents made it clear that they could not help pay for college.  I had only a thousand dollars saved up from my manual labor jobs.  This paltry sum wasn’t even enough for a full year of tuition, books, and living expenses at cut-rate community college prices. 

But a stroke of luck came my way.  During high school we had moved into a mobile home in a trailer park at the edge of town.  It made perfect economic sense for my cash-strapped family, but I was embarrassed as hell because of the obvious connotations between trailer parks and the stereotypical “poor white trash” that tended to live there.  However, this source of embarrassment was fortuitous because the landlord offered a thousand-dollar scholarship to the best qualified graduating senior who lived in the park.  That scholarship paid for most of my freshman year at the community college.  The rest came from a federal work-study grant, which enabled me to get a job on campus. 

So, outside of classes, I worked part-time at the chemistry lab, cleaning, washing dishes, and mixing solutions.  It wasn’t a dream job.  I was also working about thirty hours a week as a manual laborer, doing construction, landscaping, painting, and janitorial work.  I was able to pay for tuition, books, and modest living expenses (I was still living at home), while saving money for university.  But even working over forty hours a week at two jobs was not enough to secure my future.

The atmosphere of community college was liberating, but not really challenging.  It was like high school, but without nannies supervising your every move.  No one cared if you came to class or failed.  Academics were a bit harder, although still relatively easy.  More concentration was required, but one could earn Bs without much studying at all.  Many professors obviously didn't care about students or professional standards, and some were outright corrupt, like one anatomy professor who gave my friend copies of the test in advance because he happened to like professional automotive racing (my friend was a professional racer). 

Skipping class, being late, or not turning in assignments was no big deal.  Instructors exuded apathy.  Nobody made any effort to teach.  Nobody really cared, especially the students.  But the freedom of this environment actually masked a looming danger.  Many of my friends were failing at least one class.  Some never left the community college; many eventually dropped out. 

I came close to failure myself after a couple of bad tests; however, I managed to increase my effort and earn Bs and Cs for the first three semesters.  I was working full-time, weightlifting, coaching soccer, partying like a rock star at least two or three nights a week, going to school, and trying unsuccessfully to focus on my future.  This frenzied pace would continue to define my life for the first two years of college, but eventually it became unsustainable.  Some aspect of this equation had to be sacrificed. 

In order to be in college, I always had to work.  This was a simple fact of my lower-middle class life.  I worked around thirty to forty hours a week for all four years of my undergraduate studies, combining work-study on campus with part-time jobs off campus.   During my eight years of graduate school, I would often combine teaching assistantships with part-time or even full-time jobs, often working more than forty hours a week on top of a full load of classes. 

But even with working, the little I earned did not come close to covering all the costs of college and living expenses.  I had to take out federal and institutional student loans almost every year I was in college.  By the time I graduated in 2007 with my third graduate degree, I had over $75,000 in student loans.  I mortgaged my future to educate myself, hoping that someday I would be able to use my skills and knowledge to find a stable, good-paying job and eventually become debt free – or at least to have a positive net-worth. 

Working during college took its toll on my academics, but social activities proved to be the most corrosive element of my life.  After a year in community college, I was accepted at a state university in Oregon as a sophomore with a major in Athletic Training.  My father was an occupational therapist.  I loved sports, especially soccer.  Health care was a booming industry with long-term job prospects and high salaries.  This major seemed like a natural fit, yet I struggled with chemistry, I didn’t really like anatomy, and I hated math. 

I was your typical American undergraduate: high ambition, no practical understanding of professional standards, academically ambivalent, and expecting an easy road to future success.  Of course, this is not a recipe for achievement.  I started to slip into failure.

At the university, I was living far from home, sharing an apartment with friends from high school.  For the first time in my life, I was free to live as I wanted, but within limits.  I was constrained by the demands of school and the confines of my poverty.  I had been demoted from my parents’ lower-middle class status to the ranks of the working poor.  It took the first year at university to find a balance between freedom and responsibility, first tipping heavily to the former before edging more towards the latter. 

For most teenagers, college is more of a social experience than an academic one.  For my group of friends, it was a hedonistic paradise of unrestrained pleasure and experimentation.  I spent the first six months at university drinking as much as possible, smoking massive amounts of weed, and going to parties four or five times a week.  I was more interested in girls than grades. 

Often my friends and I would declare holidays in the middle of the week, cut class, and stay inebriated for days at a time.  This drunken debauchery seems to have become the primary purpose that many young Americans associate with college.  I have few memories of my sophomore year (is this a recurring theme?), outside of a surprise visit by my prudish parents the morning after a raging party and their stern scolding – I’ll never forget that!  It was an exciting time of irresponsible youthful exuberance, but such a lifestyle cannot be sustained.  A rude awakening was looming. 

By the end of my second term at university, I had failed several classes, I was on academic probation, I had been cut from the competitive athletic training program, and I was broke (drugs and alcohol can be quite expensive).  I was at a crossroads and I didn’t know what to do. 

The consequences of freedom had fallen like a hammer, smashing immaturity and youthful delusions into jagged shards.  I was paralyzed for a time, not sure what to do with my life.  At the same time, some of my friends were dropping out of the university, either downgrading to the local community college, going back home to live with their parents, enlisting in the military, or entering the labor market full-time.  I didn't like any of those options.

Since high school I had found freedom from restrictions, which enabled me to transgress youthful taboos and experiment with adulthood.  But this freedom came barbed with consequences that I could no longer avoid. 

For the first time I was failing at life, falling helpless into a nameless abyss.  I remember reading On the Road by Jack Kerouac at this time and feeling the same desperation: "The raggedy madness and riot of our actual lives, our actual night, the hell of it, the senseless nightmare road.  All of it inside endless and beginningless emptiness.  Pitiful forms of ignorance...This can't go on all the time - all this franticness and jumping around.  We've got to go someplace, find something." (1)

At the end of my sophomore year at university, I vowed to make changes.  I would take control of my life.  I would party less, get another job, and take academics more seriously.  I would choose a goal and use my existence for some worthy cause.  I would "go someplace" and "find something." 

But go where and do what?  I didn't know. 

I was personally and academically stuck in a directionless drift.  As I took a couple months to reexamine my life, I enrolled in literature and history courses to repair my grade point average.  This proved to be a fateful turning point. 

In forsaking an aimless freedom from restriction, I found the freedom to be.  But what did I want to be? 

Taking my life in my hands, I moved forward, stepping for the first time towards deliberate living.  I worked hard at constructing my self, searching for direction, and then building a way forward into life.



(1) Jack Kerouac, On the Road (New York, 1999), 241, 108.

My Mis-Education in Graduate School

Serious Games

Graduate School and the Perils of Independent Thought

Higher education has always been about advancing social status and breeding elites, turning the educated few into a ruling caste of Brahmins.  As Henry Adams noted in the 19th century, "college offered chiefly advantages vulgarly called social, rather than mental."[1]  Both the older ecclesiastical university and the modern research university have been hierarchical and authoritarian institutions, molding young minds by socially conditioning them to carry on a prescribed intellectual tradition. 

In 1876 the college student G. Stanley Hall famously fumed about "the erroneous belief that it should be the aim of the professors of this department to indoctrinate rather [than] to instruct - to tell what to think, than to teach how to think" [author's emphasis].[2]  Professorial indoctrination of ignorant youth was standard university practice in the 19th century, and it remains standard practice in the 21st century. 

While the basis of elite power has changed from the dogmas of culture and religion to the dogmas of business and science, the phenomenon of social distinction based on academic degrees has been around for thousands of years and will never disappear.  Because universities are primarily institutions of socialization, learning is often subsumed to ritualized performance, deference to power, and rites of passage. 

Professors form a priesthood.  These sacred officers preserve canonical knowledge and officiate traditional practices.  Education, if it is offered, is often reduced to memorizing information and replicating ritual.  Disciplinary theories and methodologies "degenerate into rigidity,"[3] and they are often transformed into "unchallengeable dogmas" that students must accept to pass exams.[4] 

In college, students are taught "the one and only right way"[5] to do things, and they jump through intellectual and behavioral hoops in order to become initiated into a sacred professional guild, which is now referred to as a mere “major” field of study.  Students strive to earn public distinction and academic degrees.  Learning is optional. 

Students use these markers of social status to enter the labor market or to climb further up the holy academic ladder to graduate school and maybe the pinnacle of a PhD.  Those who correctly internalize the institutional norms of the university gain a sense of accomplishment and superiority, tending, as Hermann Hesse noted, “somewhat toward smugness and self-praise.”[6]  Thorstein Veblen was so critical of the modern university system that he wanted to subtitle his treatise on the subject "A Study in Total Depravity."[7]  He wasn't far off the mark.

One of the greatest disappointments of my life was discovering that the citadel of the modern American university was cracked, corrupted, and crumbling from within.  From an early age we are all socialized to respect teachers, worshiping them as an almost mystical class.  University professors are often revered as high priests holding the keys to the intellectual kingdom.  But deserving of reverence, most are not.   

Even if many professors are sometimes brilliant, these custodians of higher education are self-absorbed, narrow-minded, vindictive tyrants, most of whom cannot teach, and would not stoop to do so if they could.  Apocryphal stories of the absurdity and cruelty of higher education have abounded for ages. 

One graduate student recalled a typical class, notable only by the fact that it was led by one of the luminaries of the American academy, "He read from his text for an hour or more, every so often losing his place...Such silly stories did not interest me, and [his] summary of them remained remote from anything I knew or cared about...Altogether, a puzzling performance from a man reputed great...Why did he teach so badly?  It seemed unpardonable."[8]

The modern university is focused on one primary goal: the creation of new knowledge through scientific research.  The traditional goal of transmitting knowledge has been eclipsed, though it remains a necessary function; it is clear that most professors grudgingly dole out limited amounts of time and energy to deal with students.  Established forms of knowledge transmission have always been based on tradition, authority, and the ritual socialization of students. 

Teaching is a relatively novel invention, especially within institutions of higher education.  Students are mostly a burdensome bother to professors who are obsessively concerned about cornering academic niches of power and prestige through publications, conferences, and committees.  Professional academics endure a "living hell"[9] of intense scrutiny and competition, trying to reach a pinnacle offered by no other occupation: a well-paid, self-directed career with full benefits for life.

Professors are trained to do research, not to transmit information, and certainly not to teach.  For this reason, most professors merely propagate canonical dogma in the classroom and initiate students into a ritualized academicism, as their autocratic professors had done to them for generations.  Having become thoroughly institutionalized themselves, professors as the agents of the institution we call "higher learning" merely replicate the socialization process they were once put through.  This is called "schooling," after the Latin term schola, which meant a sect with a distinct set of practices.[10]  Official knowledge is therefore by definition "what you learn when you are taught at school."[11] 

Within their classrooms, professors are often autocratic dictators who merely throw a barrage of information at a class full of bleary-eyed and confused students.  Many professors do not bother to acknowledge (let alone get to know) the ill-at-ease and tongue-tied young people populating their classes.  These ignorant beings awkwardly seeking social mobility are merely powerless pawns to be pushed around the "serious game"[12] called the university.  These naive lambs are led by the nose through intricate rituals, duped into thinking themselves knowledgeable, and eventually dumped unprepared into the slaughterhouse of the real world. 

Most university courses are cruel and boring jokes with limited application to students' lives or career aspirations.  All students, except the most eager and stupid, intuitively know this.  Most of the time professors simply lecture to a crowd for an hour.  Learning has been "bureaucratized,"[13] as content is pared down to a meaningless fiction of formulas, graphs, and factoids.  If you're lucky you might get some face time for twenty minutes during office hours, but the most prestigious professors can't be bothered with even these few moments of human interaction, delegating them instead to teaching assistants. 

Few professors try to understand a particular student's learning needs or educational goals.  Even in graduate school, in an expensive doctoral program no less, I had my graduate advisor tell me that he had no time to hear about my academic goals or personal life.  He was perturbed at the suggestion that he should even care about such trifling matters. 

As one relatively frank professor noted, "Your advisor may be crucial to your life, but you are not at all crucial to your advisor's."[14]  This of course can be extended to every facet of the university.  Students are simply transient, expendable, cogs in the academic machine.  Most of the time students are merely tolerated and treated with "benign neglect."[15]

After the initial glow of earning an undergraduate degree, many students decide to move into the academic holy of holies, clamoring to become rich or join the academic priesthood.  Unlike undergraduate studies, a graduate program initiates students into a specific professional practice by socializing them into ritualized disciplinary norms. 

The assumption is that students enrolling in an anthropology or economics program want to be anthropologists or economists.  Thus, graduate school is actually glorified vocational training.  A student is trained to become a professional knowledge worker in a specific academic market.  As far as professors are concerned, there is no other possible aim or objective for graduate studies - certainly a student would never enroll just to learn and gain knowledge.  That would be inconceivable! 

But unbeknownst to most students, these programs operate more like medieval guilds than modern trade schools.  The young apprentice is sold into virtual slavery for a number of years as the price of initiation into the secrets of the restricted trade.  What is the most important characteristic of a graduate student?  Brilliance?  Hard work?  Team player?  Talent?  No! 

According to one professor who's written a book on the subject, the most important single characteristic is "resiliency." It is the "power to persevere" in the face of all the "countless hoops and hurdles" thrown at the graduate student in a veritable gauntlet of painful bullshit.[16]  Students are taught the supreme value of "conformity" and walking "the straight and narrow path."[17]  Success in graduate school is not about knowledge or skills, it is about endurance and compliance.

One former grad student explained his low position within the academy as "masochistically overworked and under-appreciated."  He viewed himself as an "idiot" for thinking that graduate school would advance his future career.[18]  Authoritarian professors treat graduate students like dumb pack mules.  They're loaded down to the breaking point and then led around by the bit, tracing some prescribed and monotonous course that tradition dictates is appropriate. 

Most professors don't care about students' educational or professional goals.  Students exist to be molded by the institution while serving their masters' interests.  John Dewey once quipped about a fellow academic, "[He] is incapable of either permitting men near him to work freely along their own lines of interest, or to keep from appropriating to himself credit for work which belongs to others."[19] 

Graduate school is a not-so-disguised form of exploitation.  While professors would no doubt be offended by such a remark, most graduate students clearly realize and suffer from their subjugation.  Stanley Aronowitz is one professor who has acknowledged that graduate school often "destroy[s] the spirit of the aspiring intellectual."[20]  Louis Menand also acknowledged that "lives are warped."[21]

Lucky graduate students actually get paid to debase themselves, but of course most of these student workers are no more than glorified indentured servants, lacking "health insurance, benefits, parking, unionization, or a living wage."  Many grad students spend their time turning a tenured professor's grant money into more grant money, which primarily benefits the established professor's academic prestige and economic security, but does little to help the graduate student.

Thus, some students have described themselves as little more than "slave labor" and "disposable academics."[22] 

As a graduate student, the pinnacle of academic success is to "discover something extremely trivial about the world."[23]  Then you take this information and "share your observations with a small room of socially awkward people paying minimal attention."  Or, if you're extremely lucky, you get "to publish your ideas in a small, unpopular journal."  Of course, if your research does get published, your major professor is more than happy to take credit for your success, often claiming primary authorship, even though this person didn't do anything except criticize and berate you every step of the way.[24]

But even if you're a model student, suffer through the shit, and work your way through to a PhD, there is no guarantee that you'll ever be able to fully capitalize on your degree.  Graduate schools have been overproducing PhDs for years, while the number of full-time academic positions has steadily declined.  Currently only about 50 percent of the academic jobs in universities are staffed by full-time professors, while the other half are staffed by part-time adjuncts.  This is an exploited and vulnerable group of workers one critic called "academic lettuce-pickers."[25]  

The ratio between full-time and part-time instructors jumps to about 30/70 in the community college.  Between 1990 and 2004 only 34 percent of history PhDs were working in a higher education history department.  This problem has only been exacerbated by the Global Recession of 2008-9, as many universities and community colleges have cut budgets and slashed academic jobs.  In California, one of the hardest hit states, the California State University system cut 10 percent of its full-time professors, around 1,230 jobs - not to mention the thousands of lost jobs at the University of California and the community college system. 

After surviving the gauntlet, one recent PhD graduate emerged into a wasteland without any employment options.  She now makes a living playing on-line poker.[26]  This has led some to criticize doctorate degrees as "a waste of time" and even a "Ponzi scheme."[27]  Louis Menand is more gracious.  He simply calls it "inefficient": "There is a huge social inefficiency in taking people of high intelligence and devoting resources to training them in programs that half will never complete and for jobs that most will not get."[28]  William Deresiewicz calls this situation a "human tragedy."[29]

Most students, like myself, entered graduate school with their own educational aspirations and vocational goals, many not even planning on an academic career because there are few full-time jobs available.  Almost all graduate students are eager, smart, ambitious, and idealistic young people looking to make a mark on the world.  Some, like myself, had very specific academic objectives to accomplish. 

Given the democratic and liberal rhetoric of most western institutions of higher education, you would expect that professors would try to understand the personal interests of their students so as to individualize courses of study and help the student on his or her path to success.  Worst case, you would expect professors to be open to negotiation on the subject of course projects and supplementary reading. 

The nightmare reality is that most professors are narrow-minded petty tyrants who nail graduate students to the syllabus as if it were canonized holy writ.  Some of the more boorish even dictate the exact subject, style, and method of the assignments, leaving the student in the position of a mere scribe transferring doctrine from textbook to term paper.  Many of my professors were like this. 

To put it nicely, most professors are guilty of "professional malpractice" when it comes to teaching and student learning, which is exceptionally ironic if you are studying in a Department of Education!  However, I would never put it so nicely.  These intellectual cops often brutally abuse their status and authority because there is no one to keep watch over the knowledge police.  While free inquiry and academic freedom are hallmark values of the modern university, these mores are meaningless to graduate students and many junior faculty. 

Most professors are "ideological bullies" and they indoctrinate students after their own disciplinarian and methodical molds.[30]  Every academic discipline has a set of "canonical hypotheses" that are the specialized province of a "religious imperium," which rules over a small corner of the intellectual world like royals controlling a fiefdom.[31]  This kind of "academic dogmatism" is not only a threat to students' academic freedom, but it also violates students' intellectual development and maturation, turning students into mere clones of their professors.[32] It also stultifies knowledge and prevents the progress of new ideas.

I was acutely aware of this situation when I was in graduate school, and I did my best to hold my ground, demand respect, and define the contours of my education.  At first, I tried to negotiate with my professors.  I had thought, wrongly it turns out, that these people were reasonable and good-natured individuals who could be persuaded by the light of logical arguments.  Some were, but most were impervious. 

I tried to explain my own educational objectives and interests, and how I wanted to design the parameters of my research papers and course of study.  But to most professors, merely making such a monstrous request was proof of my general impertinence and disrespect. 

How dare I presume such an insolent posture towards my intellectual betters?  I was told to just make it easy on myself and do the assignment as the professor had dictated.  When I pressed forward with my impassioned plea to do my own research to accomplish my own objectives, the glare of disapproval and impatience lashed out. 

How dare I disrespect my superiors with such trifling sophomoric arrogance?  Just do the assignment or leave the class.  Some professors made a more damning and vitriolic judgment: Just do the assignment or leave the program!  Why are you even here if you are not going to do what you're told?  Unbelievable! 

I had always wrestled with professors over course essays because I always wanted to research well beyond the narrow syllabus and engage in interdisciplinary and historical research. Although I always did superior work and was without exception at the top of every class I took, I was often punished for going beyond the narrowness of the syllabus and course readings.  However, I suffered no more than lower grades (the lowest being a B) and bruised pride. 

My first real experience with the seriousness of the game of academia was in a formal defense of my second Master’s degree.  I had completed a traditional Master’s degree in English and was now working on a very ambitious interdisciplinary theoretical work on religion and ideology for a second Master’s degree in the field of Interdisciplinary Studies.  I wrote a two-hundred-page critical tome on the historical Jesus and the birth of Christianity, which fell flat in front of a committee comprised of some of the finest minds in the fields of English, History, and Philosophy at my university. 

They acknowledged my interdisciplinary ambition, but thought it was merely a guise for an undisciplined mind that could not do serious academic work.  They questioned whether I had the ability to write a bounded disciplinary monograph and voted 3-0 to deny my degree, albeit with the provision that I could resubmit a new thesis for their consideration by the end of spring semester. 

I finished a new manuscript (over 100 pages) in six months, brought a fourth member on the committee, and passed 4-0, earning a second Master’s degree.  I had tried to tell them all along that my interdisciplinary mind was not "undisciplined," as they had claimed, but it took sacrificing my own principled interests to produce a more traditional monograph to prove my point. 

Of course, it also meant producing a monograph that was not really interdisciplinary, which was ironic.  I had to betray the very purpose of my educational endeavor to earn an academic degree.  At best, I earned a multi-disciplinary degree and wrote a disciplinary monograph with multi-disciplinary relevance.  That wasn't what I had wanted to do.   

The apogee of my quest for interdisciplinary excellence, and its unfortunate consequences, came during a meeting for my PhD dissertation.  I had one young assistant professor hijack the meeting so that he could complain about my lack of respect for disciplinary boundaries and deference to the traditional authority of professors.  I had had deep methodological disagreements with this young professor who had only just recently completed his dissertation. 

I had taken a class with him during a previous semester and tried to engage him in discussion about his methodological assumptions, course readings, and course assignments.  He viewed my criticisms and collegial debate as impertinent and disrespectful.  He ranted to my PhD committee about how I did not belong in academia.  Who was I to think that my educational needs and objectives mattered?  Who was I to ask that courses be modified to satisfy my research aims?  Who was I to criticize my professors' judgment?  The committee agreed and I was censured.  I was told to toe the line or drop out. 

But really, what kind of arrogance is this?  As a graduate student I've spent a lot of time, effort, and money, not to mention all the personal sacrifice and stress, to join a department in order to reach my goals.  I did not come to a university to invest this much of myself just to mindlessly do someone else's work.  Am I really paying tens of thousands of dollars and enduring hell just to be institutionally socialized by self-obsessed and arrogant assholes? 

As it turns out, yes, that is exactly what graduate school is all about. 

After a while I stopped negotiating because I knew it would be perceived as lack of respect.  Instead, I took a different approach.  I ignored the specific intent of course assignments imprinted on the syllabus and exploited its vague wording to justify a different, yet related project that met my own research agenda. 

I wanted to do more than slavishly follow one specific disciplinary procedure and the narrow confines of the course book list.  Instead, I wanted to incorporate interdisciplinary methods, read an expanded bibliography, and study more complex research questions - all against the grain of standard graduate programs. 

So, I began to do course projects my way and just handed the modified assignment in at the end of class.  Sometimes I would preface my papers with a logical page-long "defense." I explained why I did not follow the exact prescriptions for the course project, and the intellectual merits of my own research subject and methods.  Needless to say, I was always penalized for such impertinence, written defense of my position or not. 

Despite being the most vocal and knowledgeable student in every class I took, demonstrating advanced mastery of all the material and more, and producing professional "A" level work, I would almost always receive lower grades than I deserved - although I was able to keep an A- average throughout eight years of graduate school.  Some of the more vindictive professors would give me a B grade, which in graduate school is considered just a hair above failure. 

One professor even gave me an F on my final project, although it had been hastily crossed out and officially recorded as an "incomplete."  My financial aid was held up.  I also got a written warning from the graduate advisor questioning my intellectual abilities and commitment to my studies.  She said it might be better to throw in the towel and leave the program because it seemed like I was unable to do the work.  This comment made me both furious and embarrassed.

What was my heinous sin?  Instead of writing a traditional, positivistic, theory-laden literature review, I dared to historicize the subject, criticize a lot of vapid scholarship, and explain how the scholarly literature was affected by temporal and political processes inside and outside the academy.  I tried to explain, as Isaiah Berlin, Stephen Toulmin, and others have argued, that it is "irrational" to force positivistic methods used in natural science on every possible social scientific inquiry.[33]  Of course, this line of argument was unacceptable. 

Merely opening my mouth to talk back and argue with my professor (the chair of my dissertation committee) was deemed impertinence.  Stanley Aronowitz has perceptively captured the unstated graduate school status quo, "In no case ought the neophyte attempt to forge a new paradigm, or even suggest a novel interpretation that might offend the intellectual powers-that-be."[34]  My professor was pissed! 

The chair of my dissertation committee, the man who held my degree in his hands, gave me the option of dropping out of the program or retaking the class during the summer.  I seriously wanted to do the former, but instead I did the latter.  I rewrote the term paper following the exact letter of the syllabus, which meant producing a boring, formulaic, and meaningless literature review that was of no real use to me.  I got an A.

Backed into a corner, I acquiesced.  I did what he wanted me to do.  I was a good dog and rolled over.  Of course, I could do the assignment; I just didn't want to, on cogent intellectual grounds.  The paper was a waste of my time.  It actually kept me from doing the important research that I wanted to do.  Bullying and intimidation were a constant threat.  In order to survive, I had to bow before the voice of authority and toe the line.  And this always kept me from spending my time on my own research agenda, which I had come to graduate school to accomplish.

It was a nightmare, straight out of Henry Adams' critique of the 19th century American college, which was itself a holdover from the middle ages:

He found only the lecture system in its deadliest form as it flourished in the thirteenth century.  The professor mumbled his comments; the students made, or seemed to make, notes; they could have learned from books or discussion in a day more than they could learn from him in a month, but they must pay his fees, follow his course, and be his scholars, if they wanted a degree.[35]

I could appreciate the irony of this archaic drama, as I pulled my hair out and my stomach turned in knots.  Here I was in a 21st century Department of Education at the University of California, one of the premiere institutions of higher education in the world, and I was receiving a 13th century course of study, delivered with all of the pompous and prejudiced authority of a pack of medieval Catholic priests.

Having survived this bullshit in two Masters programs, I had hoped to be treated better once I reached the PhD level, but actually things got worse.  I had graduated from an Oregon university with a rare level of accomplishment: I had earned two graduate degrees in three disciplines, I had an academic book published (and another almost finished), I was an internationally published poet, I had organized several cultural festivals and edited two volumes of local poetry, and I had been invited to teach as a Lecturer at the university. 

I thought this would help me advance to the next stage.  I had proved myself competent as both student and faculty.  When I entered a PhD program at the University of California, I assumed that I had earned a level of distinction that would enable my professors to treat me (for the first time in my career), if not as an equal, then certainly as an outstanding junior colleague.  I was wrong. 

None of my professors knew anything about me or my accomplishments, nor did they care to know.  When I tried to explain my situation, I was quickly silenced.  My degrees were labeled an inferior product because "humanities" methods had no place in the field of Education, which was supposedly an austere academic discipline of "social science."  Unbeknownst to me, all my previous knowledge and publications had instantly become a liability. 

When I would ask philosophical, historical, critical, or hermeneutical questions in class, they were dismissed as "out of place" and "inappropriate."  We do not ask those types of questions in this department, I was told.  It is just not done in our discipline, professors would sneer.  And I was duly told that it was not my place, as a lowly graduate student, to challenge disciplinary norms or institutional conventions.[36]

Of course, what none of my professors would ever acknowledge is that the university is a fractured political body of diverse units fighting over scarce resources, jockeying for legitimacy, authority, and social prestige.  All academic disciplines, especially ad hoc ill-defined disciplines like Education,[37] were rife with methodological diversity and factional dispute, as well as interpenetrated with various stripes of interdisciplinary niches.  As Henry Kissinger quipped, "academic politics are vicious precisely because the stakes are so small."[38] 

There is no such thing as a unified professoriate and the notion of scholarly "consensus" on any subject is largely a myth.  Robert Maynard Hutchins once joked, "the modern university [is] a series of separate schools and departments held together by a central heating system.  In an area where heating is less important and the automobile more, I have sometimes thought of it as a series of individual faculty entrepreneurs held together by a common grievance over parking."[39]

I specifically entered the field of Education not only because I wanted to be a better teacher and impact educational reform, but also because the field of Education was known for many outstanding interdisciplinary works of scholarship.  In 2000 I had started my first Master’s thesis with these words, "Creativity and independent thought seem to have been sucked out of the learning process in all stages of education. I find that a travesty and something that needs to be addressed and remedied."[40] 

I thought getting a PhD in Education would not only make me a better teacher, but would give me a platform to help effect real educational change in the United States.  I realize now that I was a naive fool.  At the specific university where I was enrolled there were only two acceptable forms of research: quantitative statistics or qualitative ethnography.  It turned out to be a dichotomous multidisciplinary department, and not at all interdisciplinary.  You learned to be a sociologist or anthropologist - and that is all.  No other options.  Case closed. 

Of course, there was a deeper irony.  In university Education departments across America, most professors had no professional background in the practice of education, like teaching, curriculum construction, or student learning.[41]  It was like joining an Engineering department filled with sociologists and economists and no experienced engineers, or like a department of Medicine with anthropologists and philosophers, but no experienced doctors.  It borders on the absurd!  But university education departments in the United States, and before them 19th century normal schools, have always been staffed with pedantic academic lecturers training unimaginative bureaucratic task-masters masquerading as teachers. 

The progressive movement to reform education in America during the early 20th century was never very effective and left most of its reforms unfinished.  Those reforms were slowly reversed by the "back to basics" movement of the 1980s and 1990s, and then completely undone by the "accountability" movement at the turn of the 21st century, which now cripples all facets of our educational system.

As if inhabiting such a stale academic environment was not bad enough, what was worse was the fact that I could find no niche to nurse my interests.  My background in history, philosophy, textual criticism, and the sociology of knowledge had no place in this Education department.  Although "education history-cum-philosophy" once had a central place in the discipline of education, this form of inquiry was gradually abolished from most university schools of education over the course of the 20th century. 

This precipitous loss of prestige followed the more general "devaluation of the humanities" in the western university - a field of study I squarely work within and have tried to defend in my own academic work.  Literally everything about me and my intellectual objectives was considered invalid, inappropriate, and unacceptable.  How could this be?  How did I ever get accepted into this department in the first place?

Now I bear part of the blame for enrolling in such an inhospitable department.  Most graduate students don't know how to select a good program that fits their aims.  However, there was really no way for me to know how bad it was before I got there.  In fact, while I was enrolled, there was an external audit of our department because there had been so many complaints from graduate students and adjunct lecturers. 

Based on the departmental website, it seemed like a nice enough place.  But once I arrived, I quickly realized my descent into hell.  I was told time and again by my professors that I did not belong in this department and that I should have gone somewhere else.  Yet, how fair is this claim?  Graduate students are extremely limited by geography, lack of money, lack of personal connections, and lack of direct knowledge of organizational cultures. 

Even if a perfect department is found, one cannot necessarily get accepted into that university for various reasons.  But the whole idea of fitting into a perfect department is a myth.  Most academic departments in this country are dysfunctional, though perhaps not as bad as the one I was enrolled in.  And "fitting in" is often random, as it depends upon finding a small group of professors who like you and your work, which can't really happen until after you've been with a department for a while. 

I had applied to Berkeley, Stanford, and the University of Chicago, places where quality interdisciplinary work is done, but I was not accepted.  I wasn't told why, but obviously I didn't have outstanding test scores or a straight A average, which probably disqualified me from the start.  Even if I had been accepted, I would never have been able to afford the tuition and living expenses to attend such universities.  So, while students do have some power over "choosing" a specific school that could fit their interests, there is no way to fully grasp the disciplinary culture of any given department, nor is there any real control over which institutions will accept you and how much the program will cost.  Rational choice models ignore not only the powerlessness and ignorance of graduate students, but also the mystified and dysfunctional nature of most academic departments.

Is it so outlandish to think that academic departments should adjust in some way to the educational objectives of the student?  But they don't.  Graduate students are expected to assimilate completely into their new institutional environment.  No melting pot, just a uniform mold slammed down imperiously on each graduate student's head!  How ironic, if not flat out hypocritical, given the outspoken condemnation of many liberal professors when it comes to ethnic assimilation in nation states.  However, when it comes to their own department or class, most professors act like rabid supremacists: shut up and assimilate, or get out!  This is how I was treated, as were all the graduate students in my department and many others in graduate programs across the country. 

However, unlike the rest of my peers, I didn't put up with being pushed around, nor did I accept being mistreated.  I earned a reputation for leadership and independent thought, and most of my peers looked up to me - although at times their veneration morphed into catharsis as I was crucified time and again.  I advised fellow students on how to survive the tortures of institutional assimilation and helped however I could.  I never gave in to the petty dictates of my professors, and I would speak up in class if I found something to be unreasonable or unjust.  I have a strong set of values, especially when it comes to education, and I would not compromise these values just to make things easier on myself.  I consider this attitude to be a virtue, one that I will never forsake, but it eventually led me into trouble.  In fact, it led me to drop out of graduate school and leave my PhD unfinished.  

Of course, the department should have known better.  My former history professor had warned in my letter of recommendation, "He will debate with anyone, challenge anything, ask deeply searching questions, spread doubt and confusion about certainties, in short, play the role of a gadfly, albeit one with deep convictions of his own... He will yield, but only after considerable butting of heads, argumentation, and resistance."  And so, I did. 

But isn't this a description of the vocation of scholar, critic, and scientist?  How do we ever reach the truth if no one questions certainties and asks searching questions?[42]  The physicist Werner Heisenberg said that the scientist "should always be prepared to have the foundations of his knowledge changed by new experience."[43]  Looking back on the history of western thought, Karl Popper argued that "the tradition of critical discussion" is the "only practical way of expanding our knowledge."[44] 

How will scholarship or science advance if everyone simply gives in to the voice of established authority and rolls over like a dog when the master speaks?  Many scholars grow complacent in their tenured security, preserving the antiquated custom of a gentleman's game (dandy, prim, and proper with status and authority) rather than doing the dirty, hard work of scholarship and science. 

Few professors were gracious when I asked difficult questions.  Some professors were outright unkind.  Some were out of touch, and should have been fired long ago for malpractice, or worse.

One tenured professor I had would spend most classes ranting and raving about the "filth" and moral decay of American culture, sometimes jumping on desks or screaming in students' faces.  He had been raised by Jesuits, attending a masochistic parochial school, which he later wrote about.  He carried his ideological rigidity and moral fervor into his classrooms, decrying communists, fascists, capitalists, and moral relativists all in one breath.  He was culturally conservative, intellectually traditional, and socially combative. 

Often, I was public enemy number one.  Once, discussing the validity of a source I had used, and by extension the place of ethnic studies programs in the American academy, we spent over an hour deadlocked in an unceasing debate, just the two of us, as the rest of the class sat silently watching.  My arguments were sound and I wouldn't give in to his conservative traditionalism and intellectual bullying.  I looked at my classmates from time to time in exasperation, but the professor would not stop attacking me.  Eventually I said, look, all of these other people are paying for an education and you're wasting their time, so let’s get back to the agenda.  It wasn't the last time that we would butt heads.

But he wasn't the only one I had trouble with.  Some professors can be kind, yet still equally rigid in their authoritarian traditionalism.  There was a young associate professor of educational history.  She was soft spoken, engaging, and kind.  But she had rigidly prescribed assumptions about "proper" scholarship and disciplinary standards, and she expected assignments to be completed in a very specific way.  In one class I earned a final grade of B because of a B- on the term paper (remember, B grades are one step away from failure in graduate school, so the term paper was technically a failing paper). 

She did not directly address the merits of the paper itself, instead pointing out how I did not follow the assignment stated on the syllabus.  That same paper, with only slight revisions, was peer reviewed and accepted for publication several months later by a scholarly journal.  Granted, the paper had its faults, but if my paper was a "failure," then how did it pass a professional peer review and get published?

When it comes to the judgments of professors in their classrooms, they are local gods and their evaluations are sacrosanct, above dispute.  But when it comes to the actual activity of professional scholarship, it is a messy game of reasoned debate and power politics.[45]  Henry Adams once complained that while both congressmen and professors suffered from the same "maelstrom" of political bickering, "he preferred Congressmen," perhaps because they were more honest in the naked exercise of their power. 

Adams dryly noted, "Education, like politics, is a rough affair."[46]  A couple of professors have frankly noted in a book on academic culture, "Most academic fields are dominated by...powerful people."[47]  These powerful academic barons battle each other for intellectual supremacy, prestige, and research grants, and they autocratically reign over their own local fiefdoms like kings. 

While some academics will admit that the practice of science is filled with "disputes," "controversies," "violence," and "political methods,"[48] I've found that most professors hide this aspect of their profession from the public (and students), concealing the messy nature of knowledge creation behind the myth of "consensus."  Academics also frequently ignore or deny the very real "exercise of authority or other power" in scholarly debates. 

As Charles E. Lindblom has noted, "Aside from flights into the most fanciful utopias, one cannot even conceive of a solution or outcome reached wholly by examining its merits.  For all participants in problem solving live in a network of existing impositions and coercions."[49] 

The notion of scholarly consensus became more important as disciplines became professionalized during the early 20th century because "internecine intellectual warfare carried on in public compromised the image [academics] were trying to cultivate as professionals with insights that deserved to be taken seriously."[50]  But this myth of consensus now makes it much harder for scholars to criticize academic practice or the university system from within. 

As the philosopher and academic maverick Stephen Toulmin once confided, "Academics who criticize the Academy, of course, put themselves at risk."[51]  I find it disingenuous, if not flat out hypocritical, that academics consider it their right and duty to criticize every aspect of social and physical reality, except themselves and their own practices.  And when the public finds out about the dirty little secrets of academia, as it did in the recent debate over global warming, it does much more harm to the reputation of science than if practitioners simply admitted the existence of politicized debates within the academy. 

The myth of the "ivory tower" must be overcome and replaced by the more sordid but palatable truth: professors play at power and politics just like everyone else.  Knowledge, like other disputed goods, is shaped by subjectivity and power, and it is constructed through messy political processes.  And while like laws and sausages, most people would prefer not to see the gritty truth, there is no excuse for practitioners to deny the dirty nature of their work - especially to graduate students who are being initiated into the trade. 

In such an environment, open and reasoned debate should be the highest virtue, but sadly, "intellectual orthodoxy" and "ideological conformity" define the rules of the game.[52]  As one scholar noted, an academic discipline is "a group of scholars who ha[ve] agreed not to ask certain embarrassing questions about key assumptions."[53]  These "canonical assumptions"[54] cannot be questioned because doing so would reveal the arbitrary and overly simplified analytical boundaries demarcating one field of study from another.  And unfortunately, since I was young, I have always pushed boundaries and questioned dogma.  In this I shared a sentiment with John Kenneth Galbraith, who once said of himself, "For me, at least, there has always been a certain pleasure in questioning the sacred tenets."[55]

Once I went all the way to the Dean of the university Graduate School to make this argument about the politics of disciplinary boundaries and the unstated dogmas of academic discourse.  I was trying to dispute the unreasonable and invisible rubrics that professors were using to subjectively grade and unfairly judge students - the same invisible and subjective rubrics that most professors use to evaluate their peers' work.[56]  I was told that the university operated on the assumption that all professors were experts in their fields, which meant their knowledge was infallible and their judgments beyond reproach, especially by students. 

The voice of tradition and authority was unassailable.  It was a frank admission that the university and scientific practice are founded not on reason and consensus, but on the ancient feudal tradition of power and authority, as Michael Polanyi had argued.[57]  More recently, Jonathan Cole pointed out that faculty "tend not to be tolerant of those in their midst who are courageous enough to challenge prevailing systems of thought"; instead, most faculty "define and enforce dominant orthodoxies."[58]

And as I know all too well, when you question the dictates of established authority, the hammer of tradition falls on your head.  As I had done with other professors, I ended up getting into an argument with my PhD program chair, except this wasn't just any term paper, it was my dissertation.  We had been butting heads for some time over the scope, methods, and arguments of my dissertation.  Quite frankly, he told me that historical research was not done in his field and that I was addressing too many large questions.  I knew that he was uncomfortable with the project because most of it was beyond his expertise (in terms of methods and scope, if not also the breadth of issues I wanted to address).  He didn't even have a PhD; he held only an EdD, which does not signify advanced competence in research or disciplinary knowledge.  Yet he was the only scholar in the department who was an "expert" in the general subject matter that I wanted to study. 

I had several other professors sign on to the project, both inside our department and outside in the history department.  But most of the committee had to come from one specific area of the department because this is where my subject was arbitrarily located within the intellectual bureaucracy.  Needless to say, none of these scholars really fit my research agenda, nor did they really want to work with me because I had a bad reputation for independent thinking.  I kept pushing for a historical dissertation that addressed several key philosophical issues and I was getting nowhere.  I was instructed to make things easy on myself and just do as I was told.  I couldn't do it.  This was my project and I would do it my way.  Why else did I come to this university?

My advisor, the chair of the committee, was very upset with my persistent attempt to have a say in my own education.  He was the one who gave me the failing grade in his class and forced me to re-write the term paper over the summer.  There is no doubt in my mind that the failing grade in that class was a deliberate move to coerce me into accepting his agenda for my dissertation.  He wanted to make it clear how much power he held.  But it wasn't the worst thing he did to me. 

I had been working for a year and a half as his research assistant.  I was outperforming all of his other graduate students, and he often praised my work.  I had authored a research paper (which he took primary credit for), which was later published in a peer reviewed journal.  I also authored several conference proposals (all of which he took primary credit for), which were accepted at two important national conferences.  No other graduate research assistant came close to my intellectual output.  As a bonus for all researchers, he had always generously paid for us (out of his grant money) to go to conferences.  We did all of the work writing and presenting the proposals, he took primary authorship and increased his prestige, and he paid our expenses.  It wasn't a bad deal. 

Well, just after a nasty dissertation committee meeting, he failed my term paper (as already described), and then he withdrew funding for a conference where I was to present my original research and join with the other research assistants to present our group projects.  I had already bought my plane ticket and was registered for the conference.  But because he was no longer reimbursing my expenses, I couldn't afford to go.  He ended up presenting my paper, which he had taken primary credit for, and I was stuck at home with a non-refundable plane ticket.

Less than two weeks later, I was fired - no cause given.  He sent me an email saying that I would no longer be needed.  My contract was going to be canceled at the end of the month.  Had I been a regular graduate student employee, I could have sued him for unjust termination.  But there is a dark side to graduate research positions paid by grant money: you are a private employee with no rights, completely unprotected by university labor laws or contracts. 

After talking with the graduate union, I found there was nothing I could do.  Thus, not only did I lose a good paying job (half of my monthly income), I lost my tuition grants (around $3,000 a term), I got stuck with a $600 non-refundable plane ticket, I received an incomplete on my transcript, and I had to retake a class over the summer.  This professor made his message very clear: either I play the game his way and do his dissertation, or he would push me out of the university.  Even if I had wanted to stay and to do the dissertation his way, I could not afford to pay the tuition, so really, I had no other option but to leave.  Later that summer, after receiving an A on my revised term paper, I dropped out of the program.

Leaving that PhD program was the greatest failure of my life.  I still bear the psychological scars.  I tried to question some intellectual orthodoxies and blaze my own academic trails, but I was hammered down because of it.  Jonathan Cole is one of the few to have exposed the dangers of independent thinking in the American academy: "In truth, there is both intellectual and personal risk involved in challenging the presumptions of the group...rather than viewing unconventional thinking as an appropriate challenge to received wisdom and ideology, those being challenged often become defensive, and these questions, even if posed in the most neutral of forms, get people into trouble."[59] 

But looking back, knowing the danger of my intellectual positions, I would not have done anything different.  Personally, professionally, and morally I was compelled to resist the narrow, egotistical authoritarianism of my professors.  As Michael Polanyi once remarked, there are professors who are "uninspired, pedantic, and oppressive," "misguided by their personal bias," "who try to impose their personal fads" on students.  These members of the academy must be "firmly opposed" because education "would be impossible and science would soon become extinct" if they are not firmly challenged.[60]

I've never been one to give in or give up.  After this harrowing experience, I pulled myself together, and over the next year and a half I orchestrated a rebirth.  I had a great dissertation planned and I had already done much of the research.  So, without any funding or support, and living precariously on a slashed monthly income during what would become the Great Recession of 2007-09, I decided to finish the project myself and write a book.  When I had finished, several scholarly presses were interested, although almost all of them eventually declined to publish it because I did not have a PhD.  I did manage to find a publisher who actually judged the quality of my scholarship, rather than just my lack of appropriate credentials. 

I also sent the manuscript to major scholars in the field of Education at UC Berkeley, Stanford, Columbia, and UCLA.  And unlike the patronizing and demeaning criticisms of my former dissertation committee, I received a lot of positive feedback.  One renowned scholar at UC Berkeley was especially kind, and he agreed to write the foreword.  In a personal note he said, "I think this is a thought-provoking book, in the sense that it asks us to think hard about why we construct educational institutions that are so contradictory, and checkered in their outcomes. It’s a 'big' book — it asks us to think expansively about what a particular educational institution accomplishes — and we have too few of these."[61] 

Another scholar at UCLA called me a "courageous visionary" and praised my "boldness" for researching questions that few had dared to ask.[62]  One scholar went so far as to say, "Educators, researchers, administrators, and government officials concerned about the future of community colleges, and U.S. higher education in general, cannot afford to ignore J. M. Beach's findings and conclusions."[63]

Less than two years later, with the book soon to be published, I presented a conference paper on the subject of community colleges, sharing the session with my former dissertation advisor, who had treated me so badly and pushed me out of the PhD program.  He barely acknowledged my presence and wouldn't look me in the eye.  He scarcely said one word in greeting, a "hello" half mumbled. 

We sat uneasily together in the front of the room with one of his graduate students between us.  I presented first.  After finishing my lecture, I let the audience know it was part of my new book, soon to be published.[64]  It was the very same project my advisor had severely criticized and rejected, saying it couldn't be done in our field.  It was the very same project for which I had endured the scorn of my committee and sacrificed my PhD. 

The book will never bring me fame, fortune, or a stable teaching job, but it was important to finish the project my way.  It asks some significant questions, and it has some very important things to say about education in the United States, as early reviewers have pointed out.  Like few academic books published these days, this book challenges accepted notions about education, asking the reader to think deeply about difficult and unresolvable issues - the very issues that often go unaddressed in universities because they fall between the cracks of the intellectual bureaucracy. 

Some might say that my experiences were atypical.   Perhaps.  At some institutions and in some disciplines, the academic ideals of collegial critical analysis and rational discourse are the norm.  David Deutsch has recounted his experience with the ideal of scientific debate:

“The majority of the scientific community is not always quite as open to criticism as it ideally should be.  Nevertheless, the extent to which it adheres to 'proper scientific practice' in the conduct of scientific research is nothing short of remarkable.  You need only attend a research seminar in any fundamental field in the 'hard' sciences to see how strongly people's behavior as researchers differs from human behavior in general...In this situation, appeals to authority (at least, overt ones) are simply not acceptable, even when the most senior person in the entire field is addressing the most junior...The professor tries hard to show no sign of being irritated by criticism from so lowly a source [i.e. a graduate student].  Most of the questions from the floor will have the form of criticisms which, if valid, would diminish or destroy the value of the professor's life's work.  But bringing vigorous and diverse criticism to bear on accepted truths is one of the very purposes of the seminar.  Everyone takes it for granted that the truth is not obvious, and that the obvious need not be true; that ideas are to be accepted or rejected according to their content and not their origin.” [65]

While I too believe in this ideal, I have never seen it as perfectly practiced as Deutsch portrays.  I was that graduate student and junior colleague Deutsch describes, and I was often savagely beaten down for my impertinence.  Perhaps Deutsch's experience was closer to the ideal because he was educated at, and is still employed by, two of the most prestigious research universities in the world, Cambridge and Oxford.  At these privileged institutions of higher learning, I would imagine that things work very differently than at your average public research university in the United States. 

Perhaps Deutsch's experience is also a product of the "hard sciences," where key theoretical assumptions and quantitative methodology are less contentious than in the social sciences and the humanities.  But even acknowledging these legitimate factors, I think that Deutsch is still overly idealistic, albeit in service of an ideal that I also firmly believe in. 

I would agree more with a statement Deutsch made leading up to the passage quoted above: "The academic hierarchy is an intricate power structure in which people's careers, influence and reputation are continuously at stake, as much as in any cabinet room or boardroom - or more so."[66]  In short, the academic hierarchy, and the research university as an institution, are fundamentally political, as everything that humans say or do is filtered through various political processes based on power, prestige, and struggles over limited resources and contested values.    

In the wake of 9-11 and the political repression of dissent and unconventional viewpoints, Lisa Anderson, professor of political science and former dean of Columbia's School of International and Public Affairs, reminded the nation of the importance of free speech.  She warned, "We must be constantly, restlessly open to new ideas, searching for new evidence, critical of received wisdom, old orthodoxies, and ancient bigotries, always creating and criticizing ourselves, each other and our world.  This is the life of scholarship and we must embrace it for what it is and do it well."[67] 

Over a half century before, in 1945, just after World War II, Michael Polanyi had forcefully argued that scientific enquiry must be based on the freedom of scientific research and discussion.[68]  Stephen Toulmin called this openness "intellectual democracy."[69] 

Unfortunately, the whole notion of free inquiry and intellectual democracy has begun to corrode and rot away in the very place it was supposed to be preserved and supported.  Perhaps this ideal is still strong at the more prestigious (and well-funded) research universities, and perhaps more in the physical sciences than in the social sciences and humanities. 

But I contend that this ideal is beleaguered not only from outside the university, but most disturbingly, from within.  Unless more academics stand against the authoritarianism, orthodoxy, and conformity of higher education, especially in their own departments and with their own students, we risk the corruption of the whole scientific enterprise, and the death of the last great hope of humanity.

 


[1] Adams, The Education of Henry Adams, 91.

[2] G. Stanley Hall, "College Instruction in Philosophy," The Nation 23 (Sept 1876), 180.  Passage was quoted in Ellen Condliffe Lagemann, An Elusive Science: The Troubling History of Education Research (Chicago, 2000), 28.

[3] Stephen Toulmin, Return to Reason (Cambridge, MA, 2001), 41.

[4] Andrew Collier, "Critical Realism," in The Politics of Method in the Human Sciences, George Steinmetz, ed. (Durham, NC, 2005), 327.

[5] Toulmin, Return to Reason, 42.

[6] Hermann Hesse, The Glass Bead Game, 348-49.

[7] Robert L. Heilbroner, The Worldly Philosophers (New York, 1961), 251; Michael Spindler, Veblen & Modern America: Revolutionary Iconoclast (Sterling, VA, 2002), 51-56.

[8] The professor was famed historian Carl Becker who was teaching at Cornell.  McNeill, Mythistory and Other Essays, 149, 152.

[9] One anonymous faculty member described the tenure process as a "living hell."  Jack Stripling, "Burning Out, and Fading Away," Inside Higher Ed (June 10 2010).

[10] Walter J. Ong, Ramus, Method, and the Decay of Dialogue (Chicago, 2004), 149-161.

[11] Ibid., 161.

[12] Sherry B. Ortner, Anthropology and Social Theory: Culture, Power, and the Acting Subject (Durham, 2006).

[13] Toulmin, Return to Reason, 45.

[14] Steven M. Cahn, From Student to Scholar: A Candid Guide to Becoming a Professor (New York, 2008), 12.

[15] Menand, The Marketplace of Ideas, 142.

[16] Ibid., 5.

[17] Stephen Toulmin, Return to Reason (Cambridge, MA, 2001), 140.

[18] Adam Ruben, Surviving Your Stupid, Stupid Decision to Go to Grad School (New York, 2010), ix, xvii.

[19] Dewey's remarks were directed against G. Stanley Hall.  Qtd. in Lagemann, An Elusive Science, 30.

[20] Stanley Aronowitz, The Knowledge Factory, 148.

[21] Menand, The Marketplace of Ideas, 152.

[22] "The Disposable Academic: Why Doing a PhD is Often a Waste of Time," The Economist (Dec 18 2010), 156, 158.

[23] This quote and the following two quotes come from Ruben, Surviving Your Stupid, Stupid Decision to Go to Grad School, ix, xvii, 49, 61, 69, 81.

[24] Generalizations in these paragraphs are also based on my own experiences as a graduate student.  For a brief line on the "exploitation" of graduate students see Paul Gray and David E. Drew, What They Didn't Teach You in Graduate School (Sterling, VA, 2008), 100.

[25] William Deresiewicz, "Faulty Towers," The Nation (May 23 2011), 30.

[26] Alana Semuels, "Universities are Offering Doctorates but Few Jobs," The Los Angeles Times (June 3 2010); Jenna Johnson and Daniel De Vise, "Students Protest Cuts to Higher Education Funds," The Washington Post (March 4 2010); Lexi Lord, Beyond Academe <www.beyondacademe.com>.

[27] "The Disposable Academic," 156.

[28] Menand, The Marketplace of Ideas, 152.

[29] Deresiewicz, "Faulty Towers," 30.  He went on to explain, "It's also a social tragedy, and not just because it represents a colossal waste of human capital.  If we don't make things better for the people entering academia, no one's going to want to do it anymore."

[30] Cole, The Great American University, 60-63, 379.

[31] David M. Kreps, "Economics - The Current Position," American Academic Culture in Transformation, Thomas Bender and Carl E. Schorske, eds. (Princeton, 1997), 77-78.

[32] While I agree with Cole's emphasis on the "core values" of the university and the importance of these values, I don't think Cole realizes the social gulf between students and professors, and between junior professors and senior professors.  I think Cole drastically underplays the importance of "academic dogmatism," especially between professors and graduate students.  Cole, The Great American University, 60-63, 379.  I agree with Stanley Aronowitz, "I believe that advice that stifles the voice of the student who really has something to say, the intellectual means to say it, and the stamina to tolerate perpetual wagging heads is cockeyed and indefensible." The Knowledge Factory, 147.

[33] Isaiah Berlin, qtd. in Stephen Toulmin, Return to Reason (Cambridge, MA, 2001), viii.

[34] Stanley Aronowitz, The Knowledge Factory, 147.

[35] Adams, The Education of Henry Adams, 75.

[36] On this point see Stanley Aronowitz, The Knowledge Factory, 147.

[37] See Jurgen Herbst, “Nineteenth-Century Normal Schools in the United States: A Fresh Look,” History of Education, 9, no. 3 (1980): 219-27; David F. Labaree, The Trouble with Ed Schools (New Haven, 2004).

[38] The quote is a paraphrase of Kissinger's remark.  "A Post-Crisis Case Study," The Economist (July 31 2010), 55.

[39] Robert Maynard Hutchins, quoted in Cole, The Great American University, 141.

[40] Beach, Expression and Identity, 8.

[41] Ellen Condliffe Lagemann, An Elusive Science: The Troubling History of Education Research (Chicago, 2000).

[42] Imre Lakatos and Alan Musgrave, eds., Criticism and the Growth of Knowledge (Cambridge, UK, 1970).

[43] Heisenberg, Physics and Philosophy, 114.

[44] Karl Popper, Conjectures and Refutations (London, 1963), 148-52.

[45] Lamont, How Professors Think: Inside the Curious World of Academic Judgment.

[46] Adams, The Education of Henry Adams, 306-7.

[47] Gray and Drew, What They Didn't Teach You in Graduate School, 7.

[48] Heisenberg, Physics and Philosophy, 141.  See also Thomas S. Kuhn, The Structure of Scientific Revolutions (Chicago, 1962); Peter Novick, That Noble Dream: The ‘Objectivity’ Question and the American Historical Profession (Cambridge, UK, 1988); Steven Shapin, Never Pure: Historical Studies of Science as if It Was Produced by People with Bodies, Situated in Time, Space, Culture, and Society, and Struggling for Credibility and Authority (Baltimore, 2010).

[49] Charles E. Lindblom, Inquiry and Change: The Troubled Attempt to Understand and Shape Society, 46.

[50] William J. Barber, "Reconfigurations in American Academic Economics: A General Practitioner's Perspective," American Academic Culture in Transformation, Thomas Bender and Carl E. Schorske, eds. (Princeton, 1997), 117.

[51] Stephen Toulmin, Return to Reason (Cambridge, MA, 2001), ix.

[52] Cole, The Great American University, 494.

[53] Mark Nathan Cohen, Health and the Rise of Civilization (New Haven, 1989), viii.

[54] David M. Kreps, "Economics - The Current Position," American Academic Culture in Transformation, Thomas Bender and Carl E. Schorske, eds. (Princeton, 1997), 97.

[55] John Kenneth Galbraith, American Capitalism: The Concept of Countervailing Power (New Brunswick, NJ, 1997), xi.

[56] Lamont, How Professors Think: Inside the Curious World of Academic Judgment.

[57] Michael Polanyi, Science, Faith and Society: A Searching Examination of the Meaning and Nature of Scientific Enquiry (Chicago, 1964).

[58] Cole, The Great American University, 494.

[59] Ibid., 494-495.  Stanley Aronowitz is another.

[60] Michael Polanyi, Science, Faith and Society: A Searching Examination of the Meaning and Nature of Scientific Inquiry (Chicago, 1964), 46.

[61] W. Norton Grubb, David Gardner Chair in Higher Education, University of California, Berkeley, personal e-mail (Nov 2009).

[62] Robert Rhoads, Professor of Higher Education and Organizational Change, University of California, Los Angeles, endorsement on the back cover of my book, Gateway to Opportunity.

[63] V. P. Franklin, University of California Presidential Chair, Distinguished Professor of History and Education, University of California, Riverside, endorsement on the back cover of my book, Gateway to Opportunity.

[64] J. M. Beach, A Gateway to Opportunity? A History of the Community College in the United States (Sterling, VA, 2010).

[65] David Deutsch, The Fabric of Reality: The Science of Parallel Universes - and Its Implications (New York, 1997), 325-26.

[66] Ibid., 325.

[67] Lisa Anderson, quoted in Cole, The Great American University, 446.

[68] Polanyi, Science, Faith and Society, 62.

[69] Stephen Toulmin, Return to Reason (Cambridge, MA, 2001), 99-100.

What is Enlightenment?

originally written 2014

Human beings have defined education very broadly for thousands of years.  One of the oldest concepts of education, found in diverse human societies around the globe, has been the idea of "enlightenment."   

From our earliest records, human beings have explained education in terms of illuminating the darkness of the human condition, overcoming fear of the unknown through the light of human understanding, which paves the way for liberation and purposive action (often against the oppressive power of the gods, kings or social institutions).  In the tradition in which I was raised, the western concept of enlightenment can be traced to the very origins of philosophy in Ancient Greece. 

Later in the 17th and 18th centuries, European philosophers referred to their time as an "age of enlightenment" because they were expanding the horizons of human knowledge through both a rediscovery of ancient knowledge and through new forms of intellectual inquiry that would be called science.  Then in the late 19th and early 20th centuries, European and American philosophers re-examined the ancient western conceptions of enlightenment in conjunction with the early developments of modern science, while at the same time discovering other ancient traditions of human enlightenment found in eastern and middle-eastern philosophy from China, India, Persia, Palestine, and Japan. 

During the 20th century, the western world had more sustained and reciprocal cultural contact with eastern cultures and their traditions of knowledge.  This cultural exchange enabled eastern philosophers, poets, and religious teachers to have a larger impact on western philosophy through translations of eastern texts and also through direct contact, as many European scientists visited the Asian sub-continent and some eastern mystics and philosophers began to visit Europe and America. 

The 20th century also saw scientists from around the globe asking classically philosophical questions about the nature of reality and the human condition, thus, redrawing the boundaries of human knowledge by using a wide array of empirical data and modern scientific theories. 

At the dawn of the 21st century we have access to an unprecedented amount of knowledge, three thousand years of global philosophy and several hundred years of empirical scientific study.  The challenge for 21st century education is synthesizing the diverse traditions of global knowledge about the human condition into a coherent intellectual framework.  One way of doing this is by focusing back on the ancient idea of enlightenment and critically analyzing it in relation to the past three thousand years of human history.

At the center of the ancient concept of enlightenment are three interrelated ideas: (1) clarifying human perception, (2) knowing the "real" world as accurately as possible, and (3) using this knowledge to freely act, thereby living more successfully and meaningfully. 

At root the concept of enlightenment equates knowledge of reality with human freedom.  The idea is that more complete knowledge of reality enables individuals to disentangle from restrictive determinants in order to more freely and completely act.  Furthermore, the enlightenment of the individual is often assumed to advance the goals and harmony of human society, which would ultimately lead to a "perfect" utopian culture. 

John Gray calls this latter goal "the Enlightenment project," which historically has been a deep-set religious impulse of Europe and America, whereby people have placed faith in the unlimited capacity for human perfection and material progress.1 

From the earliest historical records, people have been asking a recurring set of perennial questions: What is reality?  How do we know?  How shall we live together?  What does it mean to be human? 

I want to attempt to give a short, critical history of the concept of enlightenment, which I think provides a broad, coherent, multicultural, and meaningful key to understanding the promise of education for the 21st century.  An educational philosophy based on the principle of enlightenment would encourage not only the pursuit of knowledge, but also the possibility of increased freedom.  It would also warn people about the various constraints of being human that limit our freedom and our knowledge, the understanding of which would help enable a more pragmatic ideal of human possibility. 

For most of human history the cosmos of the known universe was populated by deities and devils in a simplistically ordered whole, which we now refer to as myths.  As Henry Adams poetically noted, "Chaos was the law of nature; order was the dream of man."2  Across the globe, humans acted in broad ignorance of the objective world, mixing local intelligence and tradition with fanciful belief in the power of magic, prayer, and fate. 

Human beings seemed to have some small measure of power over their lives, but they were often at the mercy of the larger physical environment and chaotic natural processes, which were beyond comprehension, let alone human control.  Mythological stories of gods, spirits, and heroes enabled humans to make some sense of a world they couldn't really know, and these stories gave human beings meaning and hope, which enabled them to survive and eventually thrive.

Early philosophers in ancient Greece, and later in the Roman imperium, began to challenge some of these myths through dialogue and critical reasoning, but these ancient intellectuals had little evidence to actually convince people.3  Socrates (c 469-399 BCE) was perhaps the most famous example.  He argued that humans should admit when they are ignorant,4 they should debate the nature of reality, and they should "test the truth" of all ideas.5 

Socrates was eventually condemned to death because he upset his society by questioning the gods and the traditional explanations of the cosmos, but even at his trial he was without remorse: "I say that it is the greatest good for a man to discuss virtue every day...conversing and testing myself and others, for the unexamined life is not worth living."6  Socrates' student Plato (428-347 BCE) famously extended this central insight into a parable about a cave.  In this story, the enlightened philosopher realizes individuals are bound to cultural truths, which turn out to be nothing but manipulations, shadows on a wall.  So, the philosopher escapes the cave and becomes enlightened by the sunny real world, which symbolizes the actual truth of existence.7

Several ancient eastern philosophers also criticized the myths of society and encouraged people to critically evaluate their perceptions to understand the deeper truths of reality.  In ancient India, Gautama Buddha (c 563-483 BCE) taught that "this world has become blinded" because the average person's perception was "polluted."  Only a few could "see insightfully."  He taught his followers to "make a lamp for yourself," to purify their vision, and to "become a wise one" by "knowing full well, the mind well freed."8  A later follower of the Buddha described the central teaching of Buddhism as self-knowledge, which incidentally was also the core teaching of Socrates: "When the cloud of ignorance disappears, the infinity of the heavens is manifested, where we see for the first time into the nature of our own being."9 

A contemporary of the Buddha in ancient China also developed a tradition of enlightenment.  The philosopher K'ung Fu-tzu (c 551-479 BCE), whom we know in the west as Confucius, tried to teach his students "insight" by "understand[ing] through dark silence."  He encouraged his students to "study as if you'll never know enough, as if you're afraid of losing it all."10  A later follower, Meng Tzu (c 372-289 BCE) or Mencius, told a story about the revered master and his dedication to knowledge:

Adept Kung asked Confucius, "And are you a great sage, Master?"

"I couldn't make such a claim," replied Confucius.  "I learn relentlessly and teach relentlessly, that's all.

At this, Adept Kung said, "To learn relentlessly is wisdom, and to teach relentlessly is Humanity.  To master wisdom and Humanity - isn't that to be a sage?"11

Confucius always pleaded a profound ignorance about life, as did Socrates, who famously claimed, "All I know is that I know nothing." 

For Socrates, Buddha, Confucius, and Mencius, the root of human wisdom was to "relentlessly" learn and teach, perfecting the art of human living through a constant dedication to the practice of education, all the while professing a profound humility about the knowledge and wisdom they actually possessed.12  As philosopher Owen Flanagan has summarized:

“In separate places between the fourth and sixth centuries BCE, Plato and the Buddha describe the human predicament as involving living in darkness or in dreamland - amid shadows and illusions.  The aim is to gain wisdom, to see things truthfully, and this is depicted by both as involving finding the place where things are properly illuminated and thus seen as they really are.”13

These ancient teachers lived their own philosophies and tried to embody wisdom.  They also preserved a canon of traditional texts that taught about the enlightened actions of gods, heroes, and sages.  They wanted to teach not only about knowledge, but also a better way of life.

While the old epistemology of myth and tradition was questioned by a long line of western and eastern philosophers,14 it was not completely challenged until the birth of western science.  However, early modern scientists displayed a continuity with ancient wisdom in the continued focus on enlightenment, albeit a new form of enlightenment based on new epistemological methods.  Johannes Kepler (1571-1630) and Galileo Galilei (1564-1642) ushered in what has been called the “Copernican” revolution with the “new art of experimental science.”  This new science allowed a new type of truth to be fashioned based on empirical observation, innovative technology (like the telescope), experimentation, and mathematical theory.15

At the same time philosophers of science developed new scientific methods to logically explore and confirm a new type of truth based on logic and empirical evidence.16  Francis Bacon (1561-1626) popularized inductive logic and Rene Descartes (1596-1650) developed physical reductionism and analytical geometry.17  Isaac Newton (1643-1727) unified these developments, creating modern science by synthesizing experimental empiricism with more effective logical methods.  He not only developed the calculus, independently of the German philosopher Gottfried Leibniz (1646-1716), who arrived at it in parallel, but Newton also further developed the application of mathematical logic to the study of the physical world.  Newton unequivocally replaced the divinely governed geocentric chain of being with a new empirical conception of a heliocentric universe governed by natural laws.  Newton wanted "to define the motions" of all physical objects "by exact laws allowing of convenient calculation.”18

As the Newtonian revolution reached across Europe it inspired many philosophers to conduct wide-ranging empirical studies of the natural world and human society.  This period of European history was called the "age of enlightenment" by various philosophers of that era.19  By the end of the 18th century, the term had become popular enough that Immanuel Kant (1724-1804) subjected it to critical scrutiny.  In an essay written in 1784, Kant asked "What is Enlightenment?"  He explained that many human beings had finally reached a state of intellectual maturity, whereby they were able to understand the world and themselves through direct evidence and without recourse to myths or traditions. 

While the majority of humans were still mired down by "laziness and cowardice," there were a brave few who were "think[ing] for [them]selves," "cultivating their spirit," developing "rational" thought, and spreading the "spirit of freedom."  Kant argued that Europeans were not living "in an enlightened age," but they were living in "an age of enlightenment," by which he meant an age where humans were gradually becoming more and more enlightened.  Kant believed that if the ethos of this age were allowed to run its course, it would slowly spread to the public at large, although he was skeptical about whether the "great unthinking masses" could ever get beyond their subjective "prejudices" and achieve "a true reform in one's way of thinking."20

By the 19th century, both Newton’s theoretical methods and his belief in rationally ordered natural laws spread across Europe and America, as more and more philosophers and early scientists empirically studied the physical and social world.  Most of these enlightened Europeans thought they could collect and assemble a vast array of empirical data, which would thereby transparently reveal the secret workings of Newton's rational laws. 

These early scientists believed that a small number of hidden natural laws ordered and determined not only the physical universe, but also human society and individual action.  The invention of the Encyclopedia in the 18th century embodied this simple assumption, as it was based on medieval compilations which purported to contain all the knowledge a person needed to know, like the Bible and the Speculum Mundi.21  For over a millennium, the learned men of Europe thought one book could hold everything a society needed to know.  The new scientific literature merely replaced the Bible as the single source of intellectual and moral authority, but it did not challenge the simplistic monism of a transparent natural law. 

This assumption of transparent, rational, transcendent, natural laws would stand unchallenged until the late 19th and early 20th century.  During this time two later scientific revolutions corrected Newton's simplified view of the world and ushered in a new, more fruitful and frightful age of enlightenment.  The great revolution in science during the 19th century was Charles Darwin’s (1809-1882) theory of natural selection, which has been called “evolution.” 

Darwin’s theory delivered a crippling blow to both anthropocentric beliefs about the uniqueness of human beings as a species and to deep-set assumptions of the "progressive" advancement of human civilization.  As the philosopher Friedrich Nietzsche explained, Darwin was able to "translate man back into nature."22  Darwin helped historicize and naturalize human knowledge at a time when most people still believed timeless supernatural forces to be the source of all change on Earth.       

Basically, Darwin’s theory of natural selection described all life metaphorically as a dense interconnected tree.  The trunk represented a primordial common ancestor of all life on Earth, which over billions of years branched out into millions of new forms through a process of blind, natural selection.  New organisms arise due to random genetic modification over long periods of time as they struggle to adapt to changing environments.  Certain modifications are “naturally selected” by the external environment because these random changes allow the organism to better survive, mate, and reproduce.23 

Darwin’s theory enabled not only a “new science,” giving rise to fields like ecology and evolutionary biology, but it also created a “new way of thinking,” which focused on the dense interconnection, interdependence, and historically conditioned form of all life on Earth.24  Ernst Mayr claimed that Darwin “caused a greater upheaval in man’s thinking than any other scientific advance since the rebirth of science in the Renaissance.”25  I. Bernard Cohen, a pioneering historian of science, argued, “The Darwinian revolution was probably the most significant revolution that has ever occurred in the sciences.”

Its effects and influences were significant in many different areas of thought and belief.  The consequence of this revolution was a systematic rethinking of the nature of the world, of man, and of human institutions.  The Darwinian revolution entailed new views of the world as a dynamic and evolving, rather than static, system, and of human society as developing in an evolutionary pattern…The new Darwinian outlook denied any cosmic teleology and held that evolution is not a process leading to a “better” or “more perfect” type but rather a series of stages in which reproductive success occurs in individuals with characters best suited to the particular conditions of their environment – and so also for societies.26

Darwin’s “dangerous idea” completely naturalized and secularized the known world, completing the dream of Socrates and the Buddha, as the long assault on metaphysical belief finally rendered obsolete all notions of an "enchanted cosmos."27

However, modern advances in science have not completely unraveled the mystery of the physical universe.  In fact, the most recent scientific revolution has unveiled a strange, unpredictable cosmos filled with wondrous and frightening possibilities.  The theories of relativity and quantum mechanics ushered in the unsettling notions of observer-dependence and probability.  The Newtonian framework of physical science was based on the idea of a limited number of absolute laws of the universe that determined all matter and motion in discrete, observable, and rational patterns. 

But Albert Einstein’s (1879-1955) theory of relativity and Werner Heisenberg's (1901-1976) theory of quantum mechanics (and more recent developments in string theory) would revise the Newtonian assumption of simple laws and predictable patterns of motion, although many of these patterns seem to hold for the largest bodies of observable phenomena, like planets and solar systems. 

Instead of a discrete, well-ordered universe of rational laws, Heisenberg discovered that physical matter at its most basic level is dynamic, chaotic, random, and wholly uncertain.  Physicists are still debating the very nature, and thereby the names, of the basic building blocks of matter.  Because the physical universe is in constant chaotic motion and the exact position or course of any sub-atomic particle is uncertain, the vantage-point of any subjective observer plays a role in trying to objectively observe, record, and understand data. 

The implications of this revolution disordered basic scientific assumptions, such as objectivity, physical laws, predictability, and positive forms of knowledge.  Even Einstein was concerned about his own discoveries, later in his life turning toward a unifying theory of everything because he believed that “God does not play dice with the universe.”28

But Einstein’s later reaction represented the last throes of an old Western belief in a fundamental order to reality that was both unchanging and discretely knowable.  As David Lindley explained, "Heisenberg didn't introduce uncertainty into science.  What he changed, and profoundly so, was its very nature and meaning.  It had always seemed a vanquishable foe...The bottom line, at any rate, seems to be that facts are not the simple, hard things they were supposed to be."29 

There is a new, profound truth that physical and social scientists are finally coming to terms with.  The objective world that we inhabit is not singular, nor is it governed by simplistic and unchanging laws that determine everything according to a singular metaphysical clockwork.  Further, in the wake of Gödel's incompleteness theorems and the Church-Turing thesis, scientists have had to admit that not every facet of reality is amenable to human intelligence.30

Even more so than Darwin's theory of natural selection, the idea of uncertainty has been perhaps the most profound and disquieting discovery in human history.  The irrational and ingrained assumption of a singular, simplistic, and immutable order of the universe was not only the “great myth” of the ancient world, but it was also the “central dogma” at the foundation of the Western Enlightenment and the birth of modern science. 

From the ancient Greek, Chinese, and Indian sages to the modern world of science, philosophers and scientists have believed in a "fundamental" principle of rational "order."31  Friedrich Nietzsche was one of the first philosophers to call this monistic belief in a rational order "a metaphysical faith," which Nietzsche connected to "a faith millennia old, the Christian faith, which was also Plato's, that God is truth, that truth is divine."32 

According to philosopher and historian of ideas Isaiah Berlin (1909-1997), “One of the deepest assumptions of Western political thought is the doctrine, scarcely questioned during its long ascendancy, that there exists some single principle which not only regulates the course of the sun and the stars, but prescribes their proper behavior to all animate creatures…This doctrine, in one version or another, has dominated European thought since Plato…This unifying monistic pattern is at the very heart of the traditional rationalism, religious and atheistic, metaphysical and scientific, transcendental and naturalistic, that has been characteristic of Western civilization.”33 

The breakdown of traditional monistic sources of authority began with the political philosophy of Niccolo dei Machiavelli (1469-1527) and was later questioned by counter-Enlightenment romantic philosophers.  But it was not until the late 19th century, and more fully during the 20th century, that the assumed singular “Truth” of human rationality was fatally assaulted and finally pronounced dead by a tradition of radical European philosophers, from Friedrich Nietzsche (1844-1900) and Martin Heidegger (1889-1976) to Michel Foucault (1926-1984) and Jacques Derrida (1930-2004).  This European strain of philosophy also impacted and co-existed with an American school of thought called Pragmatism, which argued against singular truths in favor of epistemological and ontological plurality, from William James (1842-1910) and George Herbert Mead (1863-1931) to John Dewey (1859-1952) and Richard Rorty (1931-2007). 

Upon closer examination, the myth of a singular, immutable physical Law proved to be no less quaint than the geocentric conception of the universe or the belief in a singular omniscient deity that sat at the edge of reality manipulating the strings of myriad human marionettes.  It was only in the last century, and really in the past twenty-five years, that this myth of a singular order of the universe has been fully exposed as false - although many scientists still cling to this assumption.  Discussing the aim of science in 1957, Karl Popper admitted, "the conditions obtaining almost everywhere in the universe make the discovery of structural laws of the kind we are seeking - and thus the attainment of 'scientific knowledge' - almost impossible."34 

Many physical scientists (although not all)35 now admit that reality is not a singular, static system governed by universal laws.  Instead, the objective world is understood as a complex, interdependent ecology composed of multiple dynamic levels, with many complex systems continually producing emergent qualities that are difficult to understand because they are more than the sum of their parts.36  The physical world is a complex "open system" composed of "pluralistic" domains, which are interrelated and in constant flux.37  Some have ceased calling the total expanse of reality the universe and instead refer to it as a "multiverse."38 

This notion of a complex web of interdependent life was acknowledged by Darwin in the 1860s and later caused a paradigm shift in the physical and social sciences beginning in the 1890s.39  Henry Adams noted the breakdown of the old order and the beginning of a new, more chaotic universe at the turn of the 20th century.40  However, the older monistic paradigm continues to be a powerful assumption, as it has only been partially eliminated from current scientific theory and practice.41 

The old notion of "universal laws," assumed by classical scientists, has largely been proven a fiction that does not accurately describe the physical world.42  For some scientists, the over-turning of this old dogma is tantamount to a "crisis of faith."43  Over the past century there has been a widespread acknowledgement of the "delusion of the universal," which has led to a "re-conceptualization of science."44 

Based on a broad reading of scientific discoveries over the full range of physical, biological, and human sciences, it is clear that the objective world consists of multiple, interrelated, interdependent, fluctuating, and evolving levels of reality: from the smallest elements at the sub-atomic level to the atomic level, the molecular level, the chemical level, leading up to the organism level, ranging from very simple to very complex organisms, to various biological social-groups which compose larger societies, to the many ecosystems across the planet, to the global level, to our solar system, and to the larger galactic systems all the way to the edge of the expanding universe.45  The Nobel Prize winning chemist Ilya Prigogine is one of many 20th century scientists who recognized the inherent plurality of the objective world: "Nature speaks with a thousand voices, and we have only begun to listen."46 

Various possible languages and points of view about the system may be complementary.  They all deal with the same reality, but it is impossible to reduce them to one single description.  The irreducible plurality of perspectives on the same reality expresses the impossibility of a divine point of view from which the whole of reality is visible...emphasizing the wealth of reality, which overflows any single language, any single logical structure.  Each language can express only part of reality.47

In order to be studied, the objective world has to be broken down into these many conceptual levels of analysis (and even many dimensions of space and time, as string theorists have been arguing), each with their own special properties and emergent functions, co-existing in a dense ecological web of life.  Each level needs its own "regional theory,"48 with accompanying analytical concepts and methods, which are all analytical fictions that scientists create to practically label and know an isolated part of dense interconnected reality that is not entirely knowable, let alone predictable or controllable.49 

All levels and dimensions of this dense web of reality are connected together and dependent upon each other, deriving their very substance out of the complex interplay of invisibly interwoven ecological relationships.  Most of the levels and dimensions of the objective world cannot be directly perceived by the human mind and we rely on complex technologies to catch a glimpse.  Some of these levels and dimensions cannot even be fathomed, let alone conceptualized into a coherent theory, as string theorists are dealing with at the sub-atomic level.  As Alan Lightman pointed out, "We are living in a universe incalculable by science."50 

As a species, we do not have the innate capacities to fully conceive, let alone know the complex reality of the objective world in its totality.  It is completely beyond us, and perhaps always will be unless we can develop some kind of super technology that will enhance our cognitive abilities.  This is a profound truth that most scientists have not yet fully acknowledged, as many practicing scientists seem to believe that humans have an unlimited capacity for rationality, knowledge, and technological progress.51 

We need to constantly remind ourselves, in the words of Paul Ricoeur, that humanity is both an "exalted subject" and a "humiliated subject."52  Human beings are not gods with infinite powers of reason and will.  They are in fact quite limited in their abilities to know and freely act.  And when humans do act, they are often animated by various motivations and ideals that conflict with each other, forcing hard decisions over which good should prevail. 

We are also continually plagued by the unforeseen consequences of our short-sighted decisions.  This old notion of human limits was the bedrock of many conservative philosophers and the literary genre of tragedy: from Sophocles (497-406 BCE) and Euripides (480-406 BCE) to Marcus Aurelius (121-180 CE) and St. Augustine (354-430 CE) to Blaise Pascal (1623-1662) and Edmund Burke (1729-1797) up to contemporary thinkers Isaiah Berlin (1909-1997) and John Gray (1948-). 

All of these insightful philosophers, in the most expansive sense of this term, warned against transgressing traditional boundaries out of a blind pride in human knowledge and will.53  And this notion of human limits is not confined to conservative thought.  There has also been a tragic strain of liberalism that acknowledges the complexity of reality, the conflicting diversity of human goods, and the constraints bounding human rationality.54  For most of human history there has been widespread doubt about the ability of human beings to understand and control the social and physical environment, which would be the foundation of what we call "freedom."  The ancient Greeks even had a special word for those who dared to defy the traditional order of the universe and try to freely act: hubris. 

But with the development of critical philosophy, and later, the methods of logic and the empirical sciences, humans began to believe that they possessed special powers that could overcome inherent limitations of the species and the rest of the biological world.  And to some extent, with the new tools of science and technology, they did.  Thus, using a revised concept of enlightenment, 18th and 19th century philosophers believed that increased knowledge would allow humans to control the natural world and their own destiny.  This was the dream of early modern science and the European age of enlightenment.  John Gray notes that this philosophy of human progress was a "secular version" of the Christian faith it sought to replace.55 

The great apotheosis of this belief in human rationality and progress came in the philosophy of Georg W. F. Hegel (1770-1831).  He believed that there was a fundamental "rational process" to the natural world.  This rationality slowly revealed itself through history via the unique creation of human beings who gradually perfected their ability to know and freely act:  "The History of the world is none other than the progress of the consciousness of Freedom...we know that all men absolutely (man as man) are free."56  This led Hegel to famously claim that "What is rational is actual and what is actual is rational," by which he meant whatever exists is rational and right and must be accepted as the inherent unfolding of universal laws that unequivocally govern all life - laws which just so happen to have expressed themselves through the progressive perfection of human beings and human society.57

But as already discussed, the universe turned out to be much more complex and stranger than humans had always imagined it, and human beings turned out to be much more flawed than enlightenment philosophers assumed.  Thus, in the 20th century, modern intellectuals had to revise the older optimistic vision of enlightenment humanism to deal with the very real biological, social, and physical constraints of the human condition, not the least of which was admitting the savage, self-destructive capabilities of human beings.  Far from a progressive unfolding of rationality and freedom, as Hegel suggested, the 20th century seemed to ominously hint that humans might utterly destroy themselves and their planet. 

Early in the century the Irish poet W. B. Yeats warned that the "ceremony of innocence" of enlightenment humanism would be "drowned" because the "blood-dimmed tide" of human irrationalism had been "loosed" upon the world.  Human beings were revealing their true nature as "rough beast[s]" whose self-inflicted apocalypse had "come round at last."58  Later, in the midst of the second World War, the English poet W. H. Auden noted that "Evil is unspectacular and always human."59  The 20th century would remind humans of their animal nature ("human, all too human," as Nietzsche sighed60), which gave rise to another form of frightful uncertainty.  If humanity did not have the capacity for enlightenment and freedom, then what hope for a better world?

Michel Foucault (1926-1984) was one of the most profound 20th century philosophers who dealt with the paradoxical predicament of eclipsed enlightenment principles in the "post-modern" world.  In an unpublished manuscript Foucault revisited the classic essay of Immanuel Kant and asked again, "What is enlightenment?"  Foucault started by noting that Kant had used the German word Aufklärung, which was a way of saying an "exit" or a "way out," by which Kant had meant a way out of the limitations of a traditional humanity ruled by irrational myths and the cruelty of our animal instincts.  Kant wanted humans to "escape" from tradition and human imperfection, to "dare to know," and thereby to create the conditions for enlightenment and human freedom, which were yet still only ideals. 

The implications of this directive were at once personal and also social.  Human freedom would take not only an individual act of will, but also a political revolution to free humans from the "despotism" of traditional sources of authority.61  This led philosopher Allan Bloom to declare that the European enlightenment was the "first philosophically inspired 'movement'" that was both "a theoretical school" and a "political force at the same time."62 

Yet Kant was not a revolutionary and he was very distrustful of the notion of democracy.  At the same time that Kant had written his essay, enlightenment notions of human freedom inspired an unprecedented wave of democratic revolutions that would sweep the globe.63  The principles of enlightenment meant "political activism" and "the transformation of society," the basic tenets of progressivism, or the political Left.64  Radicals like Thomas Paine believed that we as human beings "have it in our power to begin the world over again."65 

The first enlightenment-inspired revolution happened in the North American English colonies.  The founding fathers of the United States of America saw themselves as an enlightened vanguard who were "spreading light and knowledge" to the rest of the world.  They wanted to defeat the tyranny of monarchical despotism and free themselves to fulfill the promise of enlightenment principles.66  John Adams (1735-1826) saw America as "the opening of a grand scene and design in Providence for the illumination of the ignorant, and the emancipation of the slavish part of mankind all over the earth."67  Thomas Jefferson (1743-1826) declared that "all men are created equal; that they are endowed by their Creator with certain inalienable Rights; that among these are life, liberty & the pursuit of happiness."68 

These ideas spread across the globe and inspired oppressed people yearning to be free.  One notable group of young European intellectuals in the early 19th century were especially inspired by these enlightenment ideals.  They were called the Young Hegelians, after their famous teacher Georg W.F. Hegel.  Believing in Hegel's theory of a progressive rational order in human history, this group of radical philosophers wanted to put enlightenment principles into practice in order to revolutionize the whole world, giving freedom and knowledge to all people.69  The most famous and influential intellectual among this group was Karl Marx (1818-1883).  He argued, "The philosophers have only interpreted the world, in various ways; the point is to change it."70  In order to change the world, Marx introduced what he saw as the intellectual and political culmination of Hegel's world-spirit (Weltgeist) of enlightenment humanism. 

This would be a philosophy Marx called "communism," which he saw as "the solution of the riddle of history."71  Harkening back to both Hegel and to older notions of eastern enlightenment, Marx wanted a "reform of consciousness."72  He wanted human beings to "give up their illusions about their condition" and to "give up a condition that requires illusions."73  Marx and his later followers, taking a page from the American and French revolutions, wanted to use science and technology to free the exploited peoples of the world, both intellectually and politically, so as to create a global utopia of free and enlightened human beings.

But the 20th century saw the corruption and destruction of this radical social and political hope.74  The rapid increase in scientific knowledge and technology, as John Gray has pointed out, left the human species "as they have always been - weak, savage and in thrall to every kind of fantasy and delusion."75  The revolutionary spirit of American and French democracy (among others) was blocked and slowly aborted by entrenched conservative elites and the manipulation and coercion of the uneducated masses (and in the French case, the use of terror), all of which led enlightenment humanists to latch their progressive idealism onto authoritarian autocrats or illiberal bureaucratic states.76  Marx's revolutionary socialism and the revolt of the proletariat smashed naively and recklessly across the world, devolving into fascism, authoritarian states,77 and various ethnic holocausts.78 

Also, the very notions of western enlightenment and human freedom were desecrated by the vicious global imperialism of Europe and America.  These global empires plundered the riches of the world and subjugated the majority of the people on Earth, leaving most humans as little more than colonial slaves, and then these empires stood back as former colonies devolved into perpetual war and genocide.79  Finally, by mid-century, the two dominant empires on the earth were locked in a Cold War with nuclear weapons on hair triggers, threatening to destroy all life on Earth in assured mutual destruction.80 

During the long 20th century, the ancient dream of enlightenment seemed to have been shredded at the cruel hands of a barbarous humanity tearing itself to pieces.81  One Jewish survivor of the Nazi death camp Auschwitz argued that humans would have to admit that they live in "a monstrous world, that monsters do exist" and that "alongside of [enlightenment] Cartesian logic there existed the logic of the [Nazi] SS": "It happened [the holocaust], therefore it can happen again."82  Lawrence L. Langer declared, "the idea of human dignity could never be the same again."83

In revisiting the simplicity of Kant's text on enlightenment, Michel Foucault asked whether or not a great existential "rupture" had taken place in the 20th century.  Foucault questioned Kant's idea of intellectual maturity and doubted that humans could ever reach this state.  While humans had more knowledge about themselves and their world, it was a "limited" and "always partial" knowledge.  Foucault argued that we needed to give up the notion of "complete and definitive knowledge," and instead we needed to focus on how we have been shaped by historical circumstance and how we have acted upon those circumstances within certain "limits."  Foucault wanted to more fully understand the constraints and possibilities of the human condition.  But with this notion of limited knowledge and action, Foucault cautioned that we must not lose our "faith in Enlightenment" because as an ideal it pushes us to know more, to better ourselves, and to try to better our society.84

Isaiah Berlin (1909-1997) agreed with this revised version of enlightenment.  Berlin argued that our knowledge of the world and of ourselves in the late 20th century brought about two basic truths: reality is much more complex than we ever imagined, and we as human beings are much more flawed than we ever before admitted.  Berlin explained that there was "too much that we do not know" and that "our wills and the means at our disposal may not be efficacious enough to overcome these unknown factors."85 

As Friedrich Nietzsche once pondered, "When you look long into an abyss, the abyss also looks into you."86  The developing practice of science had brought about great gains in knowledge and technology, but it had also further enabled the brutal tendencies of human beings to destroy themselves and their environment.  Our great increase in knowledge and technology has not enabled greater human freedom for most people, nor has it solved the intractable problems of our paradoxical nature and society.87  The history of human folly has not ended, as some idealistic philosophers have assumed.88

In the early 20th century, one of the most educated and technologically advanced human societies caused a global war and engineered the efficient murder of millions.  Writing just before the Nazis democratically rose to power, Sigmund Freud was very pessimistic concerning human beings’ ability to use knowledge and technology toward noble ends in an effort to rise above our biological constraints: "Men have gained control over the forces of nature to such an extent that with their help they would have no difficulty in exterminating one another to the last man."89 

About a century later, John Gray came to the same conclusion: "If anything about the present century is certain, it is that the power conferred on 'humanity' by new technologies will be used to commit atrocious crimes against it."90  After World War II and the horrors of the holocaust, Raymond Aron caustically pointed out that humans would "like to escape from their history, a 'great' history written in letters of blood.  But others, by the hundreds of millions, are taking it up for the first time, or coming back to it."91  In his moral history of the 20th century, Jonathan Glover forcefully emphasized that "we need to look hard and clearly at some monsters inside us."92

At the start of the 21st century humans face not only continuing warfare, poverty, disease, and outbreaks of genocide, but also a new looming catastrophe, the environmental destruction of the planet, which could actually bring about Freud's fateful apocalypse, the extinction of the entire human species.93  John Gray went so far as to call humans a "plague animal" and all but warned that our demise as a species was imminent.94  Amartya Sen, on the other hand, has acknowledged the great strides humanity has taken since the atrocities of the first and second World Wars, while noting that we still "live in a world with remarkable deprivation, destitution and oppression."95 

Surveying previous human societies that have destroyed themselves, geographer Jared Diamond said he remains a "cautious optimist" about the capacity of human beings in the 21st century to learn from the past in order to solve the looming environmental crisis and the socio-political upheaval it will cause.  He explained, "we have the opportunity to learn from the mistakes of distant peoples and past peoples. That's an opportunity that no past society enjoyed to such a degree."96 

Given the tragedy of the 20th century, I would agree with Foucault that the concept of enlightenment needs to be revised.  As John Gray has perceptively pointed out, "We live today amid the dim ruins of the Enlightenment project, which was the ruling project of the modern period."97  We now realize that enlightenment is not blind faith in knowledge combined with the false hope of unlimited progress.  Instead, we must focus on the constrained possibilities and limitations of being human, which includes our limited capacity to know, act, and shape our world.  Some philosophers call this position of limited freedom a "soft determinism."98 

The philosopher and psychologist Erich Fromm (1900-1980) explained the new promise of this revised concept of enlightenment: "We are determined by forces outside of our conscious selves, and by passions and interests which direct us behind our backs.  Inasmuch as this is the case, we are not free.  But we can emerge from this bondage and enlarge the realm of freedom by becoming fully aware of reality, and hence of necessity, by giving up illusions, and by transforming ourselves from somnambulistic, unfree, determined, dependent, passive persons into awakened, aware, active, independent ones."99  It is important to note that Fromm said "enlarge the realm of freedom" instead of perpetuating the naive myth of being completely free to determine our destiny in any way we please. 

Recent scientific discoveries in cognitive psychology, sociology, evolutionary psychology, and socio-biology lend credence to this new concept of enlightenment as limited rationality and constrained freedom, which was developed philosophically in the 20th century by Foucault, Berlin, and Fromm, among others.  The philosopher of science Daniel C. Dennett argues that modern science has proven a biological, social, and environmental determinism that shapes and constrains both individual action and human society.  He also argues that life is fundamentally chaotic and "random," which means that humans have limited predictive powers to understand the present and plan for the future.

John Gray has taken these facts to an extreme and argued that humans are just "deluded animals" because they "think they are free, conscious beings."100  But determinism and chance do not entirely rule out freedom and volition, they just circumscribe it.  "Free will is real," Dennett claims, "but it is not a preexisting feature of our existence, like the law of gravity.  It is also not what tradition declares it to be: a God-like power to exempt oneself from the causal fabric of the physical world.  It is an evolved creation of human activity and beliefs, and it is just as real as such other human creations as music and money." 

Free will exists because we as humans believe it to exist and we act according to this belief, just like we believe that certain kinds of colored paper allow us to purchase goods and services - not because of any inherent property in ourselves (or in the colored paper), but because we say it exists, believe it exists, and institutionally structure our society around this belief.  We do have the power to alter our reality through ideas.101 

But our capacity for free will is also grounded in our biology, from which we will never entirely escape.102  We know that much of who we are as humans, including our behavior, is largely determined by our genes.  But as Matt Ridley pointed out, "Genes are often thought of as constraints on the adaptability of human behavior.  The reverse is true.  They do not constrain; they enable."103  As a species we are uniquely endowed with the ability to learn and enhance our lives with knowledge and technology collectively stored in our culture.

We have evolved from an animal-like state into what Dennett calls "informavores," "epistemically hungry seekers of information."104  We use this information to create knowledge, which in turn is used to better understand our world and improve our condition.  While we are biologically, socially, and environmentally determined, our human nature is not fixed.  We can change.  Not only can we use information to change our environment, but we can also build tools, like eyeglasses, penicillin or computers, that compensate for biological deficiencies or maladies and give us greater power over our human nature.105  This is the realm of human culture, which separates us from all other species of life on Earth.106

Sigmund Freud famously called humans a "kind of prosthetic God," due to our technological advances over our environment and our biological bodies.107  Karl Popper believed that human evolution in the 20th century was being driven not by biology anymore but by technology.108  Some scientists studying our human genes argue that we will be able to soon manipulate the very building blocks of our biology, allowing for a new "directed evolution."109  Other scientists in the field of cybernetics and artificial intelligence are even trying to create a new kind of human, blending technology into our biology to produce the "cyborg" and the "human-machine civilization," which they optimistically say promises not only greater knowledge and freedom, but also immortality.110 

E. O. Wilson has claimed that mastering our genetic code could bring a new form of "volitional evolution," which would allow humans to be "godlike" and "take control" of our "ultimate fate."111  Mark Lynas has gone so far as to call us "the God species."112  While their visions might vary, all of these Cornucopian idealists believe in a form of "technofideism," i.e. a "blind faith" that innovative technology will solve all of the world's problems.113  Francis Fukuyama bluntly declared, "Technology makes possible the limitless accumulation of wealth, and thus the satisfaction of an ever-expanding set of human desires."114 

The economist Julian Simon believed "the material conditions of life will continue to get better for most people, in most countries, most of the time, indefinitely."115  The marvel of current technological wonders aside, the jury is still out on just how far technology can take us as a species.  But one thing is sure: technology is no panacea for all of the human-created problems on planet Earth, let alone naturally occurring problems, like earthquakes, hurricanes, volcanoes, and the odd asteroid smashing into our planet.

But these idealistic visions of human grandeur are not entirely misconceived.  Even though we are determined as a species, like all other biological organisms on this planet, our unique evolutionary adaptations and cultural development have enabled "a degree of freedom," which we can exploit for our own improvement.  Language, abstract thought, and culture in particular are tools unique to human beings.116  How far we can change is dependent on many factors: our personal knowledge and experience, our vast store of cultural knowledge, the advancement of our technology, and our environment, which includes both physical resources and constraints, and also social resources and constraints, like access to money and political power. 

Rene Dubos pointed out, "Man's ability to transform his life by social evolution and especially by social revolutions thus stands in sharp contrast to the conservatism of the social insects, which can change their ways only through the extremely slow processes of biological evolution."117  Thus, we can use our language, our critical thinking, and a vast store of cultural and physical resources to partially create ourselves and our social reality, albeit within certain fixed biological, physical, and social limits.118 

Perhaps science and technology might push those limits farther than we can now imagine, but there will always be limits.  Naively believing that science and technology will erase those limits is a "modern fantasy."119  Knowledge is freedom, that much is true.  However, due to the "imperfect rationality" and constrained abilities of our species, caused by various determining factors in our biology and environment, ours will forever be a bounded, limited and imperfect freedom.120

Thus, we need to re-conceive the concept of enlightenment and human nature through a more biological and ecological understanding of physical life in terms of "nature via nurture."121  The physicist Fritjof Capra has pointed out, "cognition is the very process of life...Cognition...is not a representation of an independently existing world, but rather a continual bringing forth of a world through the process of living...'To live is to know.'"122 

This insight has led some scientists, like the entomologist Edward O. Wilson, the philosopher Daniel C. Dennett, and the cognitive psychologist Steven Pinker, to argue that it is time to revisit the ancient notion of "human nature" in order to formulate "a realistic, biologically informed humanism," which, combined with our knowledge of human culture, would be the newest phase of human enlightenment.123  At the center of our biological nature is the process of cognition, how we know: "reasoning, intelligence, imagination, and creativity are forms of information processing."  But we as humans do not know in isolation; we utilize other people and our culture, which is "a pool of technological and social innovations that people accumulate to help them live their lives...culture is a tool for living."124

We need to understand how our brain "evolved fallible yet intelligent mechanisms" to perceive and understand reality, as discussed in the first chapter of this book.  We also need to understand how the "world is a heterogeneous place," and how we as humans are "equipped with different kinds of intuitions and logics" to apprehend and understand different aspects of reality.  There is no one way to know, just like there is no one thing to know. 

One of the most important ways of knowing is through the tool we call an "idea," which is a complex network of meaningful information put into language that we use to better know our world and to more wisely act.  Human beings are a species "that literally lives by the power of ideas."125  Philosophers merely extend this "personal commitment to ideas" a bit further than the average human, but the practice of philosophy is an inherent trait that all human beings share (whether they consciously realize this or not).126

We need to also realize that we are not always in complete control of our ideas or the technologies they enable.  Ideas have a tangible, objective reality that makes them a constitutive part of human culture and subjectivity.127  Many ideas grow beyond the mind of their originator, spread to other minds, and evolve through particular cultures through historical processes to become what social scientists call "institutions."128  Institutions are the self-evident and often taken-for-granted social structures, both ideological and organizational, found within particular human societies.  The social structure of an institution can be described as the organized ideas and procedures that pattern particular social practices.129  These institutional procedures can also be described as the "working rules" governing a particular social practice,130 which can be thought of as a "rule-structured situation."131

While we do create our own ideas and institutions, at some point they begin to take on a life of their own and create us.  The early historical sociologists, Karl Marx, Emile Durkheim, and Max Weber, each studied different constitutive rule systems in Western society.  They wanted to understand the underlying logics that established and maintained the modern world: the social, political, economic, and religious rules, organizations, procedures, rituals, and ideas that ordered societies.  Recent institutional theorists have complicated older notions of institutions, which were often overly simplistic and monistic, because we now know that institutions are “rich structures” of diverse, overlapping, and often conflicting patterns of social practice.132

The concept of social institutions bridges a social-scientific dualism that has been unresolved for the past century.  At the center of the social sciences has been a debate over how societies and social institutions are constituted and how they change.  Societies and social institutions can be seen, on the one hand, as the “product of human design” and the outcome of “purposive” human action.  However, they can also be seen as the “result of human activity,” but “not necessarily the product of conscious design.”  One of the paradigmatic examples of this dualism is language.  Human beings are born into a particular language with pre-defined words and a pre-designed grammar; however, individual human beings are also able to adopt new languages, create new words, and change the existing definitions of words or grammatical structures.

But is any individual or group of individuals in conscious control of any particular language?  The obvious answer is no, but each individual has some measure of effect, yet just how much effect is subject to debate.  For the past quarter century or so, scholars have rejected the idea that societies, institutions, and organizations can be reduced to the rational decisions of individuals, although purposive individuals do play a role.  The new theory of institutions focuses on larger units of analysis, like social groups and organizations “that cannot be reduced to aggregations or direct consequences of individuals’ attributes or motives.”  Individuals do constitute and perpetuate social structures and institutions, but they do so not as completely or as freely as they believe.133

The new institutional theory has focused mainly on how social organizations have been the locus of “institutionalization,” which is the formation and perpetuation of social institutions.  While groups of human beings create and sustain social organizations, these organizations develop through time into structures that resist individual human control.  Organizations also take on a life of their own that sometimes defies the intentions of those human beings directing the organization.  While institutions can sometimes begin with the rational planning of individuals, the preservation and stability of institutions through “path dependent” processes (what we generally call “history”) is often predicated on ritualized routines, social conventions, norms, and myths. 

Once an institution becomes “institutionalized,” the social structure perpetuates a “stickiness” that makes the structure “resistant” to change.  Individual human actors, thereby, become enveloped and controlled by the organization’s self-reinforcing social norms, rules, and explanatory myths, which are solidified through positive feedback mechanisms that transcend any particular human individual.  These organizational phenomena, thereby, shape individual human perception, constrain individual agency, and constitute individual action. 

As one institutional theorist has argued, all human “actors and their interests are institutionally constructed.”   To a certain extent humans do create institutions and organizations, but more immediately over the course of history, institutions and organizations create us.  Many millions of individuals have consciously shaped the English language, but as a child I was constituted as an English-speaking person without my knowledge or consent.  It is perhaps more accurate to say that English allowed for the creation of my individuality than it is to say that my individual subjectivity shaped the institution of English.134 

But if all human thought and action is constituted by previously existing institutions, do human beings really have any freedom to shape their lives or change society?  This is actually a very hard question to answer, and it has been at the center of many scientific debates over the past century.  Durkheim and Parsons seemed to solidify a sociology that left no room for individual volition.  Marx stressed human control, but seemed to put agency in the hands of groups, not individuals.  Weber discussed the possibility of individual agency, especially for charismatic leaders, but he emphasized how human volition was always “caged” by institutions and social organizations.  Michel Foucault conceptualized human beings as almost enslaved by the various modern institutions of prisons, schools, and professions.135

The novelist Henry Miller brilliantly expressed the predicament of human agency, whereby his knowledge of himself and society failed to enable any real freedom: "I see that I am no better, that I am even a little worse, because I saw more clearly than they ever did and yet remained powerless to alter my life."136  Taking stock of all of the possible arguments for human freedom, the philosopher Thomas Nagel explained, "The area of genuine agency...seems to shrink under this scrutiny to an extensionless point.  Everything seems to result from the combined influence of factors, antecedent and posterior to action, that are not within the agent's control."137

The enlightenment notion of unencumbered individual freedom was deconstructed during the 20th century and revealed to be nothing but a myth.  However, some neo-institutional theorists have recently left open the possibility of individual rationality and freedom, albeit in a limited and constrained form.  Human agency is sometimes defined as the mediation, manipulation, and sometimes modification of existing institutions.  Pierre Bourdieu argued that there was a "dialectical relationship" between institutional structures and individuals.138  Human beings can act in concert with institutions or against them, and individuals can also refuse institutionalized norms and procedures, thereby highlighting another type of agency.

Humans can also exploit contradictions between different institutional structures, and use one institution to modify another.139  Ronald L. Jepperson argues that there can be “degrees of institutionalization” as well as institutional “contradictions” with environmental conditions.  This means that certain institutions can be “relative[ly] vulnerab[le] to social intervention” at particular historical junctures.  Jepperson is one of the few institutional analysts who conceptualize a theory of human action and institutional change, which allows for “deinstitutionalization” and “reinstitutionalization.”  But Jepperson does not validate rational choice theories of individual agency.  He argues instead that “actors cannot be represented as foundational elements of social structure” because their identity and “interests are highly institutional in their origins.” 

However, this position does not disavow institutionally mediated individual choice and action.  As Walter W. Powell has argued, “individual preferences and choices cannot be understood apart from the larger cultural setting and historical period in which they are embedded,” but individual actors have some freedom within institutional environments to “use institutionalized rules and accounts to further their own ends.”   Roger Friedland and Robert R. Alford argue that “the meaning and relevance of symbols may be contested, even as they are shared.”  “Constraints,” Powell paradoxically argued in one essay, “open up possibilities at the same time as they restrict or deny others.”140

The anthropologist Sherry B. Ortner has developed a comprehensive theory of human agency that allows individuals more power to consciously participate in, and thereby, shape and modify institutions.  She describes the individual agent in a “relationship” with social structures.  This relationship can be “transformative” on both parties: each acts and shapes the other.  While the individual is enveloped by social structures, there is a “politics of agency,” where individual actors can become “differentially empowered” within the layered “web of relations” that make up the constraints of culture.  Individuals can act through a process of reflexivity, resistance, and bricolage. 

Humans use an awareness of subjectivity and negotiate their acceptance and refusal of the status quo.  Through this process, humans can re-create existing social structures by reforming traditional practices and also by introducing novel practices.  Ortner conceptualized the process of agency as the playing of “serious games,” utilizing a metaphor originally deployed by the analytical philosopher Ludwig Wittgenstein.  She argued forcefully that existing cultural structures and social reproduction are “never total, always imperfect, and vulnerable,” which constantly leaves open the possibility of “social transformation” to those who dare to act out against the status quo.141

However, researchers have pointed out that those who are moderately alienated or marginalized from existing institutions seem to have a greater chance of imagining new institutional forms and acting against existing institutional power: "Those at the peripheries of the system, where institutions are less consolidated, are more likely to discover opportunities for effective opposition and innovation...Change is more likely to be generated by the marginally marginalized, the most advantaged of the disadvantaged."142           

Recent scholarship has also emphasized how organizations and institutions are structured within "particular ecological and cultural environments."143  Looking at the wider sphere of organizational ecology allows researchers to understand how individuals and social organizations are interconnected within a dense social web.  Interdependent social groups interact with each other to mutually shape the physical and social environment, which in turn impacts the evolution of organizations, organizational forms, and institutionalized practices and norms.144  Organizations are mutually influenced by a host of social sectors, including nation-states, geographical regions, local governments, other organizations, and micro social groups, like the family and peer networks.145 

Within each sector there are diverse “clusters of norms” and organizational typologies that institutionally define and constrain individual and organizational actors.  A host of institutional norms and forms are thereby continually reified and perpetuated across a diversely populated social and organizational landscape, which slowly changes through time.  Because societies are characterized by such diversity of social sectors, each with their own institutions and norms, different institutions can be “potentially contradictory,” which can allow for social conflict and social change through time as institutions develop in relation with the institutional and physical environment.146

However, it is still unclear how institutions “change” and what change actually means.  Theorizing the nature and extent of institutional change is an unresolved issue.  Institutions are seen as stable social structures, outside the control of rational agents, which seem to adapt slowly to internal and environmental conditions through an incremental process, although there is some evidence to suggest that rapid changes can occur in short periods due to environmental shocks.147  Due to the contingent and evolving complexity of human institutions, one economist acknowledged, "We may have to be satisfied with an understanding of the complexity of structures and a capacity to expect a broad pattern of outcomes from a structure rather than a precise point prediction."148

Because human institutions are embedded within dense webs of social and physical ecologies, it is hard to study their complexity utilizing the simplistic theories of traditional academic disciplines.  As the Nobel Prize-winning economist Elinor Ostrom pointed out, "Every social science discipline or subdiscipline uses a different language for key terms and focuses on different levels of explanation as the 'proper' way to understand behavior and outcomes," thus, "one can understand why discourse may resemble a Tower of Babel rather than a cumulative body of knowledge."149

The only way forward is to break down artificial disciplinary boundaries in order to "integrate the natural sciences with the social sciences and humanities" into a unified body of knowledge,150 which is something that Ostrom has tried to do with her Institutional Analysis and Development (IAD) framework.151  This new interdisciplinary way of knowing would focus on physical, biological, and social structures that enable and constrain human beings in specific historical contexts.152  Our current notion of human enlightenment is still based on the foundational idea that "knowledge is the ultimate emancipator," as Edward O. Wilson recently put it.153  In fact, Wilson consciously linked the goals of 21st century science with the older notions of enlightenment humanism, as he invoked its underlying ethos, "We must know, we will know."154 

But our new age of enlightenment needs to be grounded by an understanding of both the complexity of knowledge and the epistemological limitations of human beings.  Wilson acknowledges that the "immensurable dynamic relationships" between different types of organisms and different levels of reality, combined with the "exponential increase in complexity" between lower and higher levels of reality, pose the "greatest obstacle" to a unified human knowledge.

It is extremely hard (if not ultimately impossible) to get "accurate and complete description of complex systems."155  On top of this, human knowledge needs to be grounded in the new sciences of the human mind (cognitive science and epistemology) because "everything that we know and can ever know about existence is created there."156  At the root of these new sciences is our "humbling" realization that "reality was not constructed to be easily grasped by the human mind," thus, we need to know both what can and cannot be known and also how to live better without perfect knowledge.157

I completely agree with Wilson's socio-biological explanation of human nature and cognition, including his arguments on how genes condition knowledge and the structure of society;158 however, I disagree with Wilson's assertion that the old enlightenment myth of "reductionism" and physical "laws" will be central to 21st century science and a unified field of knowledge.  Wilson and a great many other scientists still believe that "nature is organized by simple universal laws of physics to which all other laws and principles can eventually be reduced."  It must be noted, however, that Wilson is exceptional in his humility, admitting that this fiction of universal laws is at least "an oversimplification" of the complexity of reality, and further, "it could be wrong."159

While I am willing to admit that there are some physical laws, I think it has already been readily established that even physical laws are constant only in specific spatial-temporal environments, leading to what Pierre Bourdieu called "regional theories," or what Bernard Williams called "local perspectives."160  Physicists now understand that different levels of reality have their own rules,161 and the validity of these rules is explained by "effective theories," which can only effectively explain a specific level of reality and nothing more.162

Einstein theoretically proved that time, space, and gravity can bend, which alters the constant properties of each; therefore, even these "constants" are not all that constant throughout the universe.  Nobel Laureate Richard Feynman later discovered that some other assumed theoretical constants seem "to vary from place to place within the universe."163  At the smallest, most fundamental level of reality, the sub-atomic level, there do not appear to be any governing laws at all.  It appears to be complete chaos, which is paradoxical given the ordered complexity of the chemical level up through the planetary level.164

Thus, the laws of physics may be multitudinous across a vast universe that we have only begun to understand;165 however, this does not mean that there is "a total absence of reliable rules" governing reality.166  We just can't assume that the rules controlling one level of reality will be valid at a higher or lower level of complexity.167  While I would agree with biologists that evolution is a "law" that governs the process of change in all organisms and even human societies here on the planet Earth, I think it would be foolish to say that sub-atomic particles or solar systems are bound by this same principle of natural selection, although this theory might still have some explanatory power at the level of solar systems.168

Over the course of the 20th century there has been a clear move away from "rigidly deterministic and mono-causal models of explanation" toward a more complex understanding of the diverse nature of reality.169  We must get away from universal "laws" and instead look for the overlapping or layered "regional theories" that approximate the various levels and processes determining physical reality.  Elinor Ostrom has argued that human culture and institutions have "several layers of universal components," which means that "multiple levels of analysis" are needed to fully explain the complex ecology of any social practice.170  There are also many layers to the physical world, ranging from the infinitely small sub-atomic particles through the many layers of the biological world to the large realm of planets and galaxies.  Each level of reality that has been identified by scientists is an area of specialization with its own name, terminology, and methods of analysis.171

As the physicist and philosopher David Deutsch pointed out, multiple disciplinary frameworks reflect the underlying complexity of the physical world, and they cannot be collapsed into a single reductionist framework: "None of these areas of knowledge can possibly subsume all the others.  Each of them has logical implications for the others, but not all the implications can be stated, for they are emergent properties of the other theories' domains."172  It is especially important to understand that higher orders of more complex life cannot be reduced to the simpler base components of lower levels because there is an "emergence" of higher order properties that have their own unique dynamics.  As biologist Ernst Mayr pointed out, "In each higher system, characteristics emerge that could not have been predicted from a knowledge of the components."173

We must guard against scientists with valid regional theories in one domain of reality who encroach on other domains, especially scientists who seek to reduce higher order systems to the simpler function of lower order parts.  David M. Kreps has explained this phenomenon as a form of epistemological "imperialism."174  A theory that might be valid on one level of reality can easily become invalid by distorting another level of reality it cannot comprehend. 

For example, while I agree with biologists and socio-biologists that both humans and human societies evolve, I completely disagree with many physical scientists over the mechanism of evolution in human culture.  The zoologist and socio-biologist Richard Dawkins famously reduced human beings to genes and reduced human culture to "memes" as biological "structures" of cultural transmission.175

Quite literally, Dawkins theorized that our bodies and our ideas replicate independently of human intention: "It is not success that makes good genes.  It is good genes that make success, and nothing an individual does during its lifetime has any effect whatever upon its genes."176  Likewise, Dawkins argued that human ideas can be reduced to memes, which are a form of social gene, where "replicators will tend to take over, and start a new kind of evolution of their own."177  Thus, human beings are merely passive incubators for genes and memes.  Nothing more, nothing less. 

This notion has been widely accepted in both scholarly and popular circles,178 but it is naively absurd,179 and it merely proves my point about extending valid scientific theories beyond their corresponding level of reality.  While the mechanism of natural selection on genes does explain most of human biological evolution, this theory is not valid or insightful when extended to social phenomena, where intentional human actors purposively affect their own lives and cultures (to a certain extent).

John Gray has pointed out, "Biology is an appropriate model for the recurring cycles of animal species, but not for the self-transforming generations of human beings."180  Likewise, the evolutionary biologist and geographer Jared Diamond conclusively demonstrated that "our rise to humanity was not directly proportional to the changes in our genes."181  Further, Diamond warned, "While sociobiology is thus useful for understanding the evolutionary context of human social behavior, this approach still shouldn't be pushed too far.  The goal of all human activity can't be reduced to the leaving of descendants."182

To illustrate this obvious point, one needs to look no further than Dawkins' own mother.  She once explained to Dawkins when he was a boy that "our nerve cells are the telephone wires of the body."183  Now, if human communication is nothing but a blind and random biological transmission of memes, then how does Dawkins purposefully remember this meaningful story after so many years?  And if human communication can be reduced to electrical impulses and memes, then where did the meaning of this story come from?

Further, how am I able to take this story out of context to change the meaning in order to criticize Dawkins for being an absurd reductionist?  While the reality and importance of genes cannot be discounted, many scientists arrogantly believe that "hard" quantitative sciences, like chemistry or socio-biology, are the best (and often the only) way to understand human beings and our culture.  This arrogant reductionism is often called "scientism."184  The "question of meaning" cannot be answered by physical science, which is not to say that meaning and the human mind are metaphysical entities.185  Both meaning and the mind are physical and social realities grounded in and facilitated by the empirical world, but that empirical world is very complex and one cannot simply reduce higher order psychological and social phenomena to chemical or genetic processes. 

Because they discount the reality of social and psychological ecologies, such reductionists also never bother to think about the individual and social consequences of reductionist theories, such as memes.186  The notion that "nothing an individual does during its lifetime has any effect whatever upon its genes" can be easily translated into a pernicious form of nihilism, leading to individual paralysis and social anarchy.187  Such reductionism ignores the reality of the "culture gap" between us and all other species on Earth.188

While we are physical beings programmed and constrained by natural laws, like all other organic organisms on this planet, Homo sapiens is a unique creature that has also created a subjective world through consciousness and society, which has its own emergent properties and governing structure.  Culture exists and is every bit as important as genes in terms of shaping our behavior as a species, yet socio-biologists like Wilson and Dawkins miss this important level of reality because of their reductionist scope.

Many physical scientists, especially in the relatively new fields of socio-biology and evolutionary psychology, have been drunk on their particular methodology and the scientific success it has enabled.  Seeking to imperially expand their explanatory power, these physical scientists inevitably ride roughshod with their simplistic reductionism over the complexities of human life and culture.  The analytical concept of genes is important, but ultimately only one small part of the complex puzzle of life.  Richard Dawkins wants us to believe that "the gene's perspective" is the most important, valid and insightful uber-perspective of all.189

Why?  Why not reduce human beings and culture to sub-atomic strings, atoms, chemicals, or organs?  If every living thing can be reduced to DNA and genes then why study any individual organism at all?  There are a host of important physical components that we could use in a reductionist fashion to "explain" the significance of any species, but all of these factors miss the forest for the trees.  As the physicist Lisa Randall points out, "Understanding the most basic components is rarely the most efficient way to understand the interactions at larger scales."190  And yet for the past century the myopia of physical scientists has distorted the most important and unique aspects of human beings: individual intentionality and culture.

In 1968 the microbiologist and philosopher of science Rene Dubos caustically pointed out, "The most damning statement that can be made about the sciences of life as presently practiced is that they deliberately ignore the most important phenomena of human life," namely individual intentionality and culture.191  Thirty-five years later, the notion of studying human beings "as if they were human beings" is considered a "radical innovation" in the social sciences,192 although some physical scientists like Jared Diamond are notable exceptions.193  The archaeologist Timothy Taylor recently made the same point.  He argued, "There has been an extraordinary - and often extraordinarily arrogant - underestimation of the complexity of the humanities by some hard scientists who extend themselves across the arts/sciences divide...when [they] stray into more 'humanistic' domains, [they] make an unwitting ass of [themselves]."  In particular, Taylor cites Richard Dawkins' notion of memes as a primary example of this misguided arrogance.194 

Ironically, Daniel C. Dennett has also admitted this same sad fact.  He wrote, "When scientists decide to 'settle' the hard questions of ethics and meaning, for instance, they usually manage to make fools of themselves, for a simple reason: They are smart but ignorant."195  This is ironic because Dennett is one of those aforementioned "smart but ignorant" scientists who have tried to reduce the complexities of human culture to the absurdity of memes.196  While admitting the anthropologist's established fact that "human beings spin webs of significance," Dennett proceeded to dismiss all humanistic and most social scientific literature in a recent book on culture, in order to reduce the phenomenon of religion to mere biological and psychological processes, which of course boils down in reductionist fashion to genes and memes.197 

Rene Dubos accurately acknowledged that "cultural evolution has long been of much greater importance than biological (genetic) evolution."  Thus, to fully understand human beings, one has to focus primarily on the "sociocultural environment," using theories and methodologies appropriate to that level of reality.198  When trying to understand the human, as Karl Popper once pointed out, "obviously what we want is to understand how such non-physical things as purposes, deliberations, plans, decisions, theories, intentions, and values, can play a part in bringing about physical changes in the physical world [author's emphasis]."199

Humans must move beyond the silly notion that a handful of simplistic, reductionist laws can explain the complexity of all life in the universe, especially the ever-changing reality of our socio-political world.200  In our attempt to unify human knowledge in the 21st century, we must recognize that each level of reality demands its own theories and methods that are appropriate to the phenomenon being studied, which necessarily implies the acceptance of a wide range of epistemological theories and methods.  And we also must accept, in the wake of Gödel's incompleteness theorems and the Church-Turing thesis, that not every facet of reality is amenable to human intelligence.201

When it comes to human culture, and issues like religion, politics, education, law, and economics, it is absurd to think that the theories and methods of biologists or physicists can completely explain the dense complexity of human society, or that economists can explain the root function and value of all cultural activity.  This is not to say that biologists, physicists, and economists cannot lend some fundamental insights into cultural phenomena and social processes.  What I am saying is that when only one disciplinary framework is used to explain any complex phenomenon, it will obscure more than it reveals.  As Stephen Toulmin pointed out, any narrow academic science delivers "at best" an "oversimplification of human life and experience" and at worst it distorts reality beyond recognition.202

Edward O. Wilson is one example of a physical scientist who has been able to admit that human evolution is determined not only by biological evolution but also by the "unique" processes of cultural evolution.  He explained that "the most distinctive qualities of the human species are extremely high intelligence, language, culture, and reliance on long-term social contracts."203  But he appears to be utterly ignorant of the voluminous humanistic and social scientific literature on human culture, which he never references in any of his books.  Instead, he has imperiously assumed time and again that biology and physics can explain everything.  What physical scientists do not understand, due to their extreme empirical focus on the objective world, is that human culture is a fundamentally different type of phenomenon because it is both objective and subjective at the same time.  What is more, it is diversely subjective, because multiple individuals and groups constantly contest the constitution of subjective reality through divisive and sometimes violent political processes.

Thus, to study human society one must not only study the macro-level of objective-sociology, but also what anthropologist Larry Hirschfeld calls "naive sociology," which is the way that humans make sense of their own social world subjectively and give it meaning.204  As philosopher Helen A. Fielding explained, "Reality can only be given to us through the multiple moving and moved perceptions of embodied being and the potentiality of our being with others."205 

This is the domain of "culture."  The idea of culture became an important social-scientific concept over the 20th century, enabling new knowledge about human society and individual action.206  Many physical scientists fail to recognize that culture "is a rather practical thing," not a bunch of abstract concepts and theories applied to discrete objects.  Thus, cultural transmission is not the movement of some impersonal, objective meme, but rather it is a "relevance-driven" process that relies on subjective necessity and values, which emerge from the individuals who selectively transmit information based on many possible intentions conditioned by many possible environments.207  Human beings are able to "generate an infinity of practices adapted to endlessly changing situations," and rarely are they fully cognizant of what they are doing and why.208 

This requires utilizing diverse methods to know diverse individuals embedded in complex cultures.  The philosopher and sociologist Raymond Aron called such a method "interpretive pluralism."  As Brian C. Anderson explained, "Given the complexity of historical and social reality and the intrinsic limits of human cognition, a methodological approach that takes into account several dimensions of analysis or several interpretive frameworks will be more objective than a reductive, monistic approach."209  Clifford Geertz utilized such interpretive pluralism in his own practice as a cultural anthropologist.

He sought to understand "directly and fully the diversities of human culture."  Geertz was especially fascinated with the "ethos" of individual cultures, which he defined as the "historically transmitted pattern of meanings embodie[d] in symbols, a system of inherited conceptions expressed in symbolic forms by means of which men communicate, perpetuate, and develop their knowledge about and attitudes toward life...their world view."210

Physical scientists also fail to understand that studying human society is not just an objective scientific endeavor.  It is at the same time a moral and political project too.211  Even when the social scientist tries to be objective by accepting and recording the status quo, this is a political act that will help legitimate a historically contingent status quo that undergirds a particular balance of power.212  When it comes to human society, science is not a neutral activity made by a neutral observer.  Scientists are active participants that can influence political discourse and/or action, or be influenced in turn.

The average human being does not need, nor can they often use, scientifically produced objective data, which to most of us is just meaningless information.  As Edward O. Wilson poetically explained, "We are drowning in information, while starving for wisdom."213  People need knowledge to help them wisely make difficult practical and moral judgments, which will affect the quality of their individual and socio-political life.214  Scientists are extremely focused on understanding "what is" reality, but this information cannot help ordinary humans make moral judgments about "what could be" reality or "what should be" reality.215

Humans constantly negotiate the boundaries of "what is" with "what could be" and "what should be" in their daily lives.  Individuals have some measure of control over their lives and the shape of their society, "contributing," as C. Wright Mills once explained, "however minutely, to the shaping of this society and to the course of its history, even as [they are] made by society and by its historical push and shove."  Thus, when scientists merely analyze existing social reality they miss an important point: "it is not a stable phenomenon with a definable essence" because we change the constitution of society every day in innumerable ways.216

The totality of physical life seems to be completely determined by a mindless law of cause and effect,217 but not human beings or human society.  This very special realm of the natural world is triply determined: it is affected not only by physical patterns of causation, but also by institutional environments and by the conscious freedom of human actors, as they negotiate, accept, attack, reject, revolt, or revise "what is" with what they think the world could be and should be.  As Mills explained in his classic treatise on sociology, "Whatever else he may be, man is a social and an historical actor who must be understood, if at all, in close and intricate interplay with social and historical structures."218

“Surely we ought occasionally to remember that in truth we do not know much about man, and that all the knowledge we do have does not entirely remove the element of mystery that surrounds his variety as it is revealed in history and biography.  Sometimes we do want to wallow in that mystery, to feel that we are, after all, a part of it, and perhaps we should; but...we will inevitably also study the human variety, which for us means removing the mystery from our view of it.  In doing so, let us not forget what it is we are studying and how little we know of man, of history, of biography, and of the societies of which we are at once creatures and creators.”219

In the science-saturated society of mid-20th century America, C. Wright Mills warned that it might be possible for human beings to be increasing their "rationality" while also living "without reason."  Just because scientifically produced information and technology make our lives materially better, it "does not mean that men live reasonably and without myth, fraud, and superstition."  And paradoxically, the exponential increase in human rationality over the past century has not come with equal gains in human freedom.220  Mills was not hopeful that science would square the circle of the human condition.  The only hope he could point to was the liberating effects of education, largely in agreement with John Dewey, whereby average human beings could transform themselves into "free and rational individuals" and come together democratically to build a better future.221  Raymond Aron too believed that education held the only real hope for humanity, although he pessimistically admitted that "rational humanists" like himself "bet on the education of humanity, even if he is not sure he will win his wager."222

The continued and enlarged practice of science will surely have an important part to play in the future of our species.  And no doubt the unification of knowledge will help the human species better address the complexities of physical and social reality.  In particular, scientists must find a way to "create some kind of synthesis of human evolution with our new understanding of cultural diversity."223 

But scientific progress is not a sufficient condition of knowledge or freedom for the average human being.  Individual men and women must be given the intellectual tools that they need to live at a practical level: conceptual tools that will allow them to understand themselves, their subjective and cultural ethos, and the physical world that surrounds them.  Knowledge helps us order the world, "like a string in a maze," so that we don't "lose [our] way."224  Human beings do not need objective knowledge so much as humanistic education that will allow us to learn and live better.  We need knowledge to be wise in making difficult decisions so as to become more rational actors, debating the parameters of the good, participating in the co-construction of our lives, and hopefully working towards a better world.

I have offered in this book an inquiry into knowledge, education, and the human condition, gleaned from some of the finest minds in human history.  But I offer few answers to the riddle of the human condition, largely because I firmly believe that there are few answers to be found.  I agree with John Gray who wrote, "Technical progress leaves only one problem unsolved: the frailty of human nature.  Unfortunately, that problem is insoluble."225  Recognizing this perennial problem, the philosopher Ann V. Murphy argues that we need to re-conceive our humanity in terms of "corporeal vulnerability," allowing for the "contingency and imperfection" of all that we do, all that we as humans are and will ever be.226

However, we do not have to be victims of our frailty and vulnerability.  I fundamentally believe that proper education is the key to enabling our humanity, despite our limitations as a species.  We must live to learn, and in learning, to live a life more meaningful and free. 

But we must also recognize that we exist within determining boundaries, physical, biological, social, and institutional.  We must continually negotiate our humanity within the bounds of these environments and our own biology.  We need to find sustainability as a species, and this means "living within the physical limits of the ecosphere" as well as our own biological bodies.227  As Ernst Mayr pointed out, "To 'know thyself,' as the ancient Greeks commanded us, entails first and foremost knowing our biological origins."228

Knowing the basis and boundaries of human existence is a never-ending process of seeking enlightenment and formulating contingent judgments based on incomplete information, often in stressful circumstances.  Even when we discover great insight about ourselves or the world we live in, we must never forget the enduring power of human ignorance and our imperfect ability to translate knowledge into purposive action.  John Gray sardonically noted, "those who imagine that great errors of policy are not repeated in history have not learnt its chief lesson - that nothing is ever learnt for long."229  Or, even more pessimistically, Arthur Wichmann despaired of humanity's flawed nature and caustically warned, "Nothing learned, and everything forgotten."230 

Thus, we need to not only seek better knowledge and more sophisticated forms of action, but we also need better ways of institutionalizing the search for enlightenment and the transmission of knowledge in the face of our enduring weaknesses as a species, so that we don't ever forget the hard-earned lessons we've learned through history.  As the public intellectual Christopher Hitchens insightfully and succinctly pointed out, "human stupidity" is the "enemy" that we will eternally face.231

This was a central insight of many great 20th century philosophers, like Isaiah Berlin, Raymond Aron, Michel Foucault, and Ludwig Wittgenstein.232  Summarizing Wittgenstein's opus, Hilary Putnam argued, "[He] wants not to clarify just our concepts, but to clarify us; and, paradoxically, to clarify us by teaching us to live, as we must live, with what is unclear."233  The ancient concept of enlightenment promises us a way out of the bondage of existential darkness, but it does not guarantee a clear path or purpose, nor the freedom to be whatever we might want to be.  Berlin, Aron, Foucault, and Wittgenstein all affirmed what one scholar has called a "self-critical, chastened Enlightenment, aware of the imperfections of man, the importance of history, the constancy of the tragic, and the limits of rationality."234

These critics of enlightenment believed in the power of rational inquiry and human ingenuity, but not overmuch; they believed in humanity "as inherently unfinished and incomplete, as essentially self-transforming and only partly determinate, of man as at least partly the author of himself and not subject comprehensively to any natural order."235  It is a humble vision of skeptical optimism that promises nothing but a chance at better thinking and living.

In the early 18th century, the English poet Alexander Pope (1688-1744) started an ambitious philosophical treatise in verse called An Essay on Man, which he was never able to finish.  Yet despite its fragmented state, it spread across Europe and made a contribution to the "age of enlightenment."  Pope was more conservative than most enlightenment philosophers, and the essay reinforced Pope's belief in a great chain of being and an omnipotent creator God.

But his poem can also be read more metaphorically as a description of a human condition that we will never escape, caught betwixt the heaven and the earth with only a fragment of knowledge to guide our way.  In Epistle 2 of the Essay, Pope makes a fairly radical claim for his day and age, "Know then thyself, presume not God to scan; / The proper study of mankind is Man."  Pope described humanity as a "middle state" between "too much knowledge" and "too much weakness,"

“In doubt to deem himself a god, or beast...
Born but to die, and reasoning but to err;
Alike in ignorance, his reason such,
Whether he thinks too little, or too much:
Chaos of thought and passion, all confused;
Still by himself abused, or disabused;
Created half to rise, and half to fall;
Great lord of all things, yet a prey to all;
Sole judge of truth, in endless error hurled:
The glory, jest, and riddle of the world!”236

The radical departure of this poem for the 18th century was the eclipse of divinity with a new focus on the human condition as our "proper study."  Pope praises the uniqueness of humanity, but notes its inherent contradictions and weaknesses that hobble any clear hope for the future.  This poem at once celebrates the potential of human beings, while simultaneously warning against too much pride in our deeply flawed nature.  It is a paean to human possibility within limits that are beyond our control.

Writing several decades earlier, the French philosopher Blaise Pascal came to a similar conclusion.  Having declared that "man is beyond man," Pascal still argued that humans must try to better apprehend their nature and their condition with the limited knowledge and skill that they possessed.  Pascal claimed that humans "burn with desire to find a firm foundation, an unchanging, solid base on which to build a tower rising to infinity," but humans must realize that this type of knowledge is impossible, "so let us not look for certainty and stability."  Instead, Pascal, like Pope, wanted human beings to turn inward and outward to better understand the human condition:

“Let us, having returned to ourselves, consider what we are, compared to what is in existence, let us see ourselves as lost within this forgotten outpost of nature and let us, from within this little prison cell where we find ourselves, by which I mean the universe, learn to put a correct value on the earth, its kingdoms, its cities, and ourselves...For in the end, what is humanity in nature? The end of things and their beginning are insuperably hidden for him in an impenetrable secret.”237

And like Pope, Pascal placed human beings within a complex ecology so as to contextualize human nature within the larger natural and social world.  Human beings have a measure of reason and freedom to understand their nature, but they are still trapped in the "prison cell" of both their physical reality and their own mind.  Pascal claimed that humans will never know the origins or endings of life; thus he asked humans to narrow their gaze to the past, present, and future that can be known, and to be satisfied with the limited knowledge we have at our disposal.

While the "impenetrable secret" of absolute reality has been a constant thorn in the side of humanity's quest for ultimate knowledge, it was Friedrich Nietzsche who first philosophized a way to know and live without the "firm foundation" of some supreme Truth.  Nietzsche criticized the irrational myths that continued to keep humanity "sunk deep in untruth" through both the official lies of the powerful and what the poet William Blake called the "mind-forg'd manacles" of our own subjectivity.238 

Nietzsche criticized the tradition of western philosophy and science for holding onto the very notion of absolute truth: "A lack of historical sense is the congenital defect of all philosophers...everything has evolved; there are no eternal facts, nor are there any absolute truths.  Thus, historical philosophizing is necessary."  He argued that historically minded philosophers would realize that the human "faculty of knowledge," as well as the quality of our knowledge, have both "evolved" as the species has evolved in relation to an ever-changing environment.239 

Nietzsche was the first philosopher to extend the implications of Darwin's scientific revolution to the human condition and the limited possibilities of the human future.  It was Nietzsche who first argued that humans had to historicize and contextualize their knowledge in relation to the natural world and to our continued evolution.  Humans had to look hard at the past in order to weigh the meaning and possibility of the future.  Only then could we derive clear knowledge about the human condition.

“Stroll backwards, treading in the footprints in which humanity made its great and sorrowful passage through the desert of the past; then you have been instructed most surely about the places where all later humanity cannot or may not go again...When your sight has become good enough to see the bottom in the dark well of your being and knowing, you may also see in its mirror the distant constellations of future cultures.”240

Nietzsche argued that through the study of the human past, we could gauge the limited potential of the human future.  But unlike other philosophers, Nietzsche argued against a fixed human nature, instead postulating an evolving human nature in dynamic relation to its ever-changing environment.241 

He also argued that stepping outside of our own cultures and taking a look at the diversity of human life across the world would enlighten us about our possibilities: "There are great advantages in for once removing ourselves distinctly from our time and letting ourselves be driven from its shore back into the ocean of former world views.  Looking at the coast from that perspective, we survey for the first time its entire shape, and when we near it again, we have the advantage of understanding it better on the whole than do those who have never left it."242

The American-born English poet T. S. Eliot (1888-1965) had seen in his lifetime both the rising promise of science and the horrors of the early 20th century.  He felt the same mixed admiration and fear for humanity as did Pope and Pascal, but Eliot was also darkly hopeful that humans could find their way to a better future if only, as Nietzsche implored, they could understand the past: to know what humans have been will greatly determine what humans can be.  Eliot was able to articulate with unmatched poetic beauty the depth of these important perspectives on the paradoxical human condition and our limited ability to fully know ourselves within the larger ecology of space and time.  In an effort to bring this book to a close, I want first to quote T. S. Eliot at length:

“...both a new world
And the old made explicit, understood
In the completion of its partial ecstasy,
The resolution of its partial horror...
Only through time time is conquered...
Or say that the end precedes the beginning,
And the end and the beginning were always there
Before the beginning and after the end.
And all is always now.  Words strain,
Crack and sometimes break, under the burden,
Under the tension, slip, slide, perish,
Decay with imprecision, will not stay in place,
Will not stay still...
Quick now, here, now, always...
Do not let me hear
Of the wisdom of old men, but rather of their folly...
The only wisdom we can hope to acquire
Is the wisdom of humility...
Knowing myself yet being someone other -
And he a face still forming...
All touched by a common genius,
United in the strife which divided them...
What we call the beginning is often the end
And to make an end is to make a beginning.
The end is where we start from...
Every phrase and every sentence is an end and a beginning,
Every poem an epitaph.  And any action
Is a step to the block, to the fire, down the sea's throat
Or to an illegible stone: and that is where we start.
We die with the dying:
See, they depart, and we go with them.
We are born with the dead:
See, they return, and bring us with them...
A people without history
Is not redeemed from time, for history is a pattern
Of timeless moments...
We shall not cease from exploration
And the end of all our exploring
Will be to arrive where we started
And know the place for the first time.
Through the unknown, remembered gate
When the last of earth left to discover
Is that which was the beginning.”243

In this long poem Eliot offered human beings only two humble hopes: to know one's place in time through careful study of human evolution, and to learn from the mistakes of the past so as not to repeat its fatal errors.  Above all, Eliot encouraged a measured caution and humility; we must recognize our frail mortality as we ceaselessly explore the human "face still forming" and act in "timeless moments" to make a "new world."

In our quest to push the boundaries of the human condition ever further in the 21st century, we must continually resist the false lure of utopian plans for perfection, the easy bliss of hucksters, the patronizing promises of politicians, and the myths of transcendent absolutes.244  From the origins of our culture-producing species, philosophers have oversold the promise of human rationality and innovation as adequate tools to produce a perfect utopian future.245  For as Thomas More (1478-1535) ironically warned in his treatise on utopia, "things will never be perfect, until human beings are perfect," which, as the title and story suggested, meant never (More coined "utopia" from Greek roots meaning "no place").246   

During the 20th century one of the most enlightened and technologically advanced cultures in the world caused two World Wars, while engineering the technological marvel of efficiently locating, incarcerating, and murdering about eight million people in under eight years.  Amid this unfolding holocaust, the German playwright Bertolt Brecht (1898-1956) fled the Nazi regime and began writing a play on Galileo in which he would revisit the notion of enlightenment and human rationality in the midst of the horrors of fanaticism, war, and mass slaughter.247  In the play Brecht's Galileo is asked if "the truth will prevail" in the end, to which he responds, "No, no, no.  Truth prevails only when we make it prevail.  The triumph of reason can only be the triumph of reasoning men."248  Galileo would go on to warn in a long monologue at the end of the play,

“If mankind goes on stumbling in a pearly haze of superstition and outworn words and remains too ignorant to make full use of its own strength, it will never be able to use the forces of nature which science has discovered.  What end are you scientists working for?  To my mind, the only purpose of science is to lighten the toil of human existence.  If scientists, browbeaten by selfish rulers, confine themselves to the accumulation of knowledge for the sake of knowledge, science will be crippled and your new machines will only mean new hardships.  Given time, you may well discover everything there is to discover, but your progress will be a progression away from humanity.  The gulf between you and humanity may one day be so wide that the response to your exultation about some new achievement will be a universal outcry of horror.”249

Brecht has his Galileo criticize the utopian faith in human progress, once projected onto religion, now turned toward the practice of science.  Brecht's Galileo warns, "The aim of science is not to open the door to everlasting wisdom, but to set a limit to everlasting error."250

At the dawn of the 21st century we as human beings have not yet escaped from our irrational myths, the cruelty of our animal instincts, and our insatiable propensity for violence and war.  We have not achieved the ancient enlightenment dream because few humans "dare to know" and build a better world.  We have not created the conditions for enlightenment and human freedom, which are still only ideals that have been imperfectly practiced and institutionalized. 

We have as a species an unprecedented storehouse of knowledge about the human condition: what we've done in the past, what we're doing now in the present, and what we're capable of doing in the future.  We have mastered new forms of technology and engineered unimaginable marvels.  But we still suffer from war, disease, genocide, systemic socio-economic inequality, and injustice.  Our environment is on the brink of catastrophic change due to pollution and global warming, threatening the sustainability of our ecosystem, not to mention the thousands of nuclear and biological weapons armed and waiting that could annihilate all life on earth.

Despite many advances in science over the 20th century, we are all still profoundly constrained by our subjectivity and culture, and we always will be.  John Gray has called these perennial limitations "the enduring contradictions of human needs."251  While objective knowledge created through the practice of science is important, it is not a silver bullet, and it is out of reach of the majority of human beings.  Every individual human being needs practical tools for personal "enlightenment" and for dealing with daily life, not sophisticated scientific tools for social "engineering."252  We need to be aware of how our bodies and minds work and how they can be managed, within biological, cultural, and environmental constraints. 

Thus, as Albert Camus pointed out last century, as Blaise Pascal, Ralph Waldo Emerson, and Friedrich Nietzsche had done before him, the foundation of human knowledge must be a combination of "optimism and doubt" based on a keen understanding of "human possibilities and limits."253  In his Pulitzer Prize-winning 1968 book So Human an Animal, the microbiologist Rene Dubos also repeated this old theme.  Dubos wrote,

“Knowledge of the past is essential for the understanding of life in the present and in the future, not because history repeats itself - which it never does exactly - but because the past is incorporated in all manifestations of the present and will thereby condition the future...The constitution of a particular person includes the potentialities that his experiences have made functional; its limits are determined by his genetic endowment...[and] environmental stimuli...In addition to the determinants that survive from man's evolutionary past and are common to all mankind, there are those acquired by each person in the course of his own individual life...differ[ing] from culture to culture and from person to person...Even though all manifestations of life are known to be conditioned by heredity, past experiences, and environmental factors, we also know that free will enables human beings to transcend the constraints of biological determinism...[Thus] each one of us can consciously create his personality and contribute to the future...by providing a rational basis for option and action.  Man makes himself through enlightened choices that enhance his humanness.”254

Toward this end of enhancing our humanness through "enlightened choices", we must devise better ways to teach human beings how to become philosophically aware of their own lives - how to understand their personal and cultural ethos so that they can more fully and freely act and contribute to a better human future.  Ralph Waldo Emerson once said that the most important question is "a practical question of the conduct of life.  How shall I live?"255 

I have tried to address this vexing question.  Philosophy is a tool to better understand ourselves as human beings and the world that we live in.  It is also a tool to help us form judgments so that we can wisely choose the course of our lives and create a sustainable future.  Rhetoric is the tool to help us organize and communicate our thoughts, and to participate in our culture. 

While philosophy based on personal experience is not without flaws and limitations, it is the best tool that most humans will ever have.  We use philosophy to understand, judge, set priorities, and often to make trade-offs when our priorities conflict – all in an attempt to create a better life.  A philosophy of experience and a rhetoric of human motives are tools to help us make and communicate the difficult decisions that we all must face.  We learn as we live, we communicate our lives, and we learn to live better.  As the French social scientist and philosopher Raymond Aron pointed out, "the power of man" comes from "assessing his place in the world and in making choices," to "take possession of the history that he carries within him and that becomes his own."256

The old philosophically oriented course of study called the "Humanities," which was used to instill a "liberal education," has been in decline over the past century.257  The whole notion of a liberal education is suspect because the scientifically and vocationally oriented system of schooling in the west has ostracized this kind of learning as impractical, elitist, and obsolete.258  Accordingly, there has been a general and widespread "devaluation of the humanities" in both higher education and K-12 public schooling.259  This is unfortunate because the core curriculum of liberal education is needed more than ever: subjective awareness, creativity, cultural analysis, understanding human values, critical thinking, and effective communication, all working toward greater human freedom and a meaningful life.260 

The liberal arts are a type of "wisdom literature."261  They teach us about the historical record of human agency and the consequences of human action so that we may make more enlightened decisions.  As Louis Menand has explained, a liberal arts education gives a student the "historical and theoretical knowledge" that helps one "learn how to learn," about one's individuality, one's culture, and the larger world.262 

The end of the liberal arts is "to enable students...to make more enlightened contributions to the common good."263  And as Martha Nussbaum writes, "If we cannot teach our students everything they will need to know...we may at least teach them what they do not know and how they may inquire."264  Students can then use this enlarged awareness of the world and an improved capacity for critical thinking to address "questions that are of prime human importance, for which no answers are uncontestably certain."265  That is the promise of a liberal education and the only true hope for humanity.

The priority of all humanistic and scientific disciplines in the 21st century should be to come together in common cause to equip human beings with the epistemological, ontological, and axiological tools needed to live in a complex, dangerous, and data-saturated world.  People need to be taught how to develop their own capabilities, which will enable them to be free and to make better decisions. 

This is especially important in democratic countries where citizens have significant responsibilities, including debating public policy and making informed voting decisions.266  Amartya Sen argued, "As competent human beings, we cannot shirk the task of judging how things are and what needs to be done...It is not so much a matter of having exact rules about how precisely we ought to behave, as of recognizing the relevance of our shared humanity in making the choices we face."267 

More so than the sciences, the humanities can teach an existential and conceptual cartography, which humans can use to map their worlds, know their worlds, and act freely and responsibly.  This should be the heart of every educational enterprise.  Humans must learn to become aware of how biology, subjectivity, and culture influence perception and behavior. 

Humans must also learn how subjectivity and culture can be influenced and modified in turn, thus creating the conditions of freedom and moral responsibility.  We have the capacity to be "self-transforming beings" and this allows us to "perpetually reinvent" our ethos within the constraints of the human condition.268  But we must never forget that individual, social, and political "responsibility requires freedom," and thus, even if we will never be completely free, we still must strive to increase the boundaries of human freedom as much as we can.269 

The purpose of education in the 21st century should not be overly focused on information, skill acquisition, vocational training, or national economic development.  Education is an individual endeavor of exploration and development that is also simultaneously a public good because society is at root a congregation of diverse individuals living together in common need and mutual dependence.  Society is enriched and strengthened when subjectively aware, self-assured, and knowledgeable individuals pursue excellence in an environment of free exchange and mutual benefit. 

At its core, education should be focused on what Owen Flanagan calls "eudaimonistic scientia," knowledge of the human condition which enables human flourishing.270  This type of education would help foster the creation of individual character (ethos), cultural identity (ethos), and ecological sustainability.  This type of education would nurture the human self through social discovery, individual creativity, and critical analysis, whereby the individual would create their own self and a meaningful life within the social and physical constraints outlined above. 

This type of education is thousands of years old, at least, but it is not anachronistic.  It was and is and will ever be the foundation of all human wisdom. 

As the stoic philosopher Seneca once said, "while we live, while we are among human beings, let us cultivate our humanity."271

  


1 This claim is central to many of Gray's works.  See Enlightenment's Wake (New York, 2009), ix; Black Mass: Apocalyptic Religion and the Death of Utopia (New York, 2008); Straw Dogs: Thoughts on Humans and Other Animals (New York, 2003).

2 Adams, The Education of Henry Adams, 451.

3 A. A. Long, ed., The Cambridge Companion to Early Greek Philosophy (Cambridge, UK, 1999); Anthony Gottlieb, The Dream of Reason: A History of Philosophy from the Greeks to the Renaissance (New York, 2000); Gregory Vlastos, Socrates: Ironist and Moral Philosopher (Ithaca, NY, 1991).

4 Plato, "Apology," in Plato: Complete Works, John M. Cooper, ed. (Indianapolis, IN, 1997), 21.

5 Plato, "Protagoras," in Plato: Complete Works, John M. Cooper, ed. (Indianapolis, IN, 1997), 778.

6 My emphasis.  Plato, "Apology," Ibid., 33.

7 Plato, "Republic," in Plato: Complete Works, John M. Cooper, ed. (Indianapolis, IN, 1997).

8 Buddha, The Dhammapada, trans. John Ross Carter and Mahinda Palihawadana (Oxford, 2000), 3, 5, 33, 43, 49.  See also Karen Armstrong, Buddha (New York, 2001); Andrew Skilton, A Concise History of Buddhism (Birmingham, UK, 1997).

9 D. T. Suzuki, Zen Buddhism: Selected Writings of D. T. Suzuki (New York, 1996), 3.  Suzuki described Zen Buddhism as the purist form of Buddha's teachings: "Zen takes hold of the enlivening spirit of the Buddha, stripped of all its historical and doctrinal garments" (41).

10 Confucius, The Analects, trans. David Hinton (Washington, DC, 1998), 67, 85.

11 Mencius, Mencius, trans. David Hinton (Washington DC, 1998), 49-50.

12 I highlight here a common preoccupation with the notion of enlightenment, but I do not discuss the different methods used by these teachers that reflect differing conceptions of knowledge and the good.  Socrates and Buddha were much more critical of tradition and society than were Confucius and Mencius; however, it must be noted that all four criticized and revitalized their respective cultures.  I think this commonality is much more important than their differences.

13 Owen Flanagan, The Bodhisattva's Brain: Buddhism Naturalized (Cambridge, MA, 2011), 119.

14 Although the practice of critique is more of a western tradition.  See Gottlieb, The Dream of Reason.

15 I. Bernard Cohen, Revolution in Science, 135, 142.  See also Thomas S. Kuhn, The Structure of Scientific Revolutions.

16 Walter J. Ong, Ramus, Method, and the Decay of Dialogue (Chicago, 2004), 225.  Ong demonstrated that the middle-ages and early modern period were an "age when there was no word in ordinary usage which clearly expressed what we mean today by 'method,' a series of ordered steps gone through to produce with certain efficacy a desired effect - a routine efficiency.  This notion is not entirely missing in early sixteenth-century consciousness, but it has yet no independent existence" (p. 225).

17 Cohen, Revolution in Science, 148, 153, 156.

18 Cohen, Revolution in Science, 161-62, 168.  Quote from Newton’s De Motu in Cohen, 168.

19 Peter Gay, The Enlightenment: The Rise of Modern Paganism (New York, 1995).

20 Immanuel Kant, "An Answer to the Question: What is Enlightenment?", in Practical Philosophy: The Cambridge Edition of the Works of Immanuel Kant, trans. Mary J. Gregor (Cambridge, UK, 1996).

21 Lucien Febvre and Henri-Jean Martin, The Coming of the Book: The Impact of Printing, 1450-1800 (London, 2010), 258; Walter J. Ong, Ramus, Method, and the Decay of Dialogue (Chicago, 2004), 123.

22 Friedrich Nietzsche, Beyond Good and Evil, trans. Walter Kaufman (New York, 1989), 161.

23 Dennett, Darwin’s Dangerous Idea: Evolution and the Meaning of Life.

24 Cohen, Revolution in Science, ch 19, quote on 289; Dennett, Darwin’s Dangerous Idea: Evolution and the Meaning of Life.

25 Ernst Mayr, “The Nature of the Darwinian Revolution,” Science, 176 (1972), 987.

26 Cohen, Revolution in Science, 299.

27 Dennett, Darwin’s Dangerous Idea; Charles Taylor, A Secular Age (Cambridge, MA, 2007), 77.

28 Cohen, Revolution in Science, ch 27; Isaacson, Einstein; Heisenberg, Physics and Philosophy; Lindley, Uncertainty: Einstein, Heisenberg, Bohr, and the Struggle for the Soul of Science; Greene, The Elegant Universe.

29 David Lindley, Uncertainty: Einstein, Heisenberg, Bohr, and the Struggle for the Soul of Science (New York, 2008), 2, 4; Karl R. Popper, Objective Knowledge: An Evolutionary Approach, revised edition (Oxford, 1979), 214.

30 Ray Kurzweil, The Singularity is Near: When Humans Transcend Biology (New York, 2005), 453; David Deutsch, The Fabric of Reality: The Science of Parallel Universes - and Its Implications (New York, 1997), 234-37.

31 Werner Heisenberg, Physics and Philosophy: The Revolution in Modern Science (New York, 2007), 36-37, 48-49; Alfred North Whitehead, Science and the Modern World (Cambridge, UK, 1926), 17; Walter J. Ong, Ramus, Method, and the Decay of Dialogue (Chicago, 2004), 165.

32 Friedrich Nietzsche, The Gay Science, qtd. in Tyler T. Roberts, Contesting Spirit: Nietzsche, Affirmation, Religion (Princeton, 1998), 39, footnote 11.

33 Isaiah Berlin discusses this point at length in several essays in The Proper Study of Mankind.  See “The Apotheosis of the Romantic Will,” 555-559; “Herder and the Enlightenment,” 426; “The Divorce Between the Sciences and the Humanities,” 326-28; “The Originality of Machiavelli,” 312-313; “The Counter-Enlightenment,” 245-46; “The Pursuit of the Ideal,” 5.  The long quote comes from “The Originality of Machiavelli,” 312-313.

34 Karl R. Popper, Objective Knowledge: An Evolutionary Approach, revised edition (Oxford, 1979), 204.

35 It must be acknowledged that not all scientists have given up Newton's assumption of singular Laws.  The Harvard sociobiologist Edward O. Wilson is perhaps the most important holdout.  He admits that science has now uncovered "impenetrably complex systems" (54), but he argues that "reductionism" will eventually allow scientists "to fold the laws and principles of each level of organization into those at more general, hence more fundamental levels," which would lead to "simple universal laws of physics to which all other laws and principles can eventually be reduced" (55).  However, he is one of the few such traditionalists to admit that this assumption is based on a "transcendental world view" that "could be wrong" (55).  Consilience: The Unity of Knowledge (New York, 1998).

36 Fritjof Capra, The Web of Life: A New Scientific Understanding of Living Systems (New York, 1996); Steven Johnson, Emergence: The Connected Lives of Ants, Brains, Cities, and Software (New York, 2001).

37 Ilya Prigogine and Isabelle Stengers, Order out of Chaos: Man's New Dialogue with Nature (New York, 1984), xv, xxvii, 9, 73.

38 David Deutsch, The Fabric of Reality: The Science of Parallel Universes - and Its Implications (New York, 1997), 46; Alan Lightman, "The Accidental Universe: Science's Crisis of Faith," Harper's (Dec 2011), 35.

39 Haskell, The Emergence of Professional Social Science, 3-23.

40 Adams, The Education of Henry Adams, see especially chapters 15, 22, 25, 31.

41 On epistemological diversity see George Steinmetz, ed., The Politics of Method in the Human Sciences: Positivism and Its Epistemological Others.  For a work that acknowledges the diversity of knowledge, yet keeps the monistic assumption of a singular and unified ontology see Wilson, Consilience: The Unity of Knowledge.

42 Prigogine and Stengers, Order out of Chaos: Man's New Dialogue with Nature, 7, xiv.

43 Lightman, "The Accidental Universe," 35.

44 Ibid., 25, xxviii.

45 Ibid., xiv-xv, 9, 73; Wilson, Consilience: The Unity of Knowledge, 83-84; Capra, The Web of Life.

46 Prigogine and Stengers, Order out of Chaos: Man's New Dialogue with Nature, 77.

47 Ibid., 225.

48 Pierre Bourdieu, Outline of a Theory of Practice (Cambridge, UK, 2010), 31.

49 Prigogine and Stengers, Order out of Chaos: Man's New Dialogue with Nature, 225.

50 Lightman, "The Accidental Universe," 36.  Lightman goes on to explain the paradox of 21st-century physics, especially the strange and unsettling implications of string theory: "Not only must we accept that basic properties of our universe are accidental and incalculable.  In addition, we must believe in the existence of many other universes.  But we have no conceivable way of observing these other universes and cannot prove their existence.  Thus, to explain what we see in the world and in our mental deductions, we must believe in what we cannot prove" (p. 40).

51 See for example Ray Kurzweil, The Singularity is Near: When Humans Transcend Biology (New York, 2005).

52 Paul Ricoeur, Oneself as Another, trans. Kathleen Blamey (Chicago, 1992), 16.

53 John Gray, Isaiah Berlin (Princeton, 1996), 39-40.

54 Tony Judt, "Introduction," in Raymond Aron, The Dawn of Universal History: Selected Essays from a Witness of the Twentieth Century, trans. Barbara Bray (New York, 2002), xxiii.  See for example the philosophy of Raymond Aron and Isaiah Berlin.

55 John Gray, Straw Dogs: Thoughts on Humans and Other Animals (New York, 2003), 4.

56 Georg W. F. Hegel, The Philosophy of History (Amherst, NY, 1991), 9, 19.

57 Georg W. F. Hegel, qtd. in Shlomo Avineri, Hegel's Theory of the Modern State (Cambridge, UK, 1989), 123.

58 W. B. Yeats, "The Second Coming," The Collected Poems of W. B. Yeats (New York, 1996), 187.

59 W. H. Auden, "Herman Melville," Collected Poems (New York, 1991), 251.

60 Friedrich Nietzsche, Human, All too Human: A Book for Free Spirits, trans. Marian Faber and Stephen Lehmann (Lincoln, NE, 1996).

61 Michel Foucault, "What Is Enlightenment?" In The Foucault Reader (New York, 1984), 32, 35.

62 Allan Bloom, The Closing of the American Mind (Touchstone, 1988), 262.

63 Eric Hobsbawm, The Age of Revolution, 1789-1848 (New York, 1996); Adam Zamoyski, Holy Madness: Romantics, Patriots, and Revolutionaries, 1776-1871 (New York, 1999).

64 Bloom, The Closing of the American Mind, 289.

65 Qtd. in John Gray, Black Mass: Apocalyptic Religion and the Death of Utopia (New York, 2007), 107.

66 Gordon S. Wood, The Radicalism of the American Revolution (New York, 1991), 191.

67 John Adams, qtd. in Wood, The Radicalism of the American Revolution, 191.

68 Thomas Jefferson, qtd. in Ellis, American Sphinx, 54.

69 Lawrence S. Stepelevich, The Young Hegelians: An Anthology (Atlantic Highlands, NJ, 1997); Warren Breckman, Marx, The Young Hegelians, and the Origins of Radical Social Theory (Cambridge, UK, 1999); Harold Mah, The End of Philosophy, The Origin of "Ideology": Karl Marx and the Crisis of the Young Hegelians (Berkeley, 1987).

70 Karl Marx, "Theses on Feuerbach," Karl Marx: Early Writings (London, 1992), 423.

71 Karl Marx, "Economic and Philosophical Manuscripts," Karl Marx: Early Writings (London, 1992), 348.

72 Karl Marx, "Letters from the Franco-German Yearbooks," Karl Marx: Early Writings (London, 1992), 209.

73 Karl Marx, "Critique of Hegel's Philosophy of Right," Karl Marx: Early Writings (London, 1992), 244.

74 John Gray, Black Mass: Apocalyptic Religion and the Death of Utopia. (New York, 2007).

75 John Gray, Enlightenment's Wake (New York, 2009), xv.

76 Alan Dawley, Struggles for Justice: Social Responsibility and the Liberal State (Cambridge, MA, 1991); Michael McGerr, A Fierce Discontent: The Rise and Fall of the Progressive Movement in America (Oxford, 2003); Nell Irvin Painter, Standing at Armageddon: The United States, 1877-1919 (New York, 1987); Desmond King, In the Name of Liberalism: Illiberal Social Policy in the United States and Britain (Oxford, 1999).

77 Gray, Black Mass; F. A. Hayek, The Road to Serfdom (Chicago, 1994); Eugen Weber, Varieties of Fascism: Doctrines of Revolution in the Twentieth Century (Malabar, FL, 1982); Timothy Snyder, Bloodlands: Europe between Hitler and Stalin (New York, 2010).

78 John K. Roth and Michael Berenbaum, eds., Holocaust: Religious and Philosophical Implications (St. Paul, MN, 1989); Nora Levin, The Holocaust Years: The Nazi Destruction of European Jewry, 1933-1945 (Malabar, FL, 1990); Primo Levi, Survival in Auschwitz (New York, 1966); Primo Levi, The Reawakening (New York, 1995); Primo Levi, The Drowned and the Saved (New York, 1988); Fergal Keane, Season of Blood: A Rwandan Journey (London, 1996); Gray, Black Mass: Apocalyptic Religion and the Death of Utopia.

79 Eric Hobsbawm, The Age of Empire, 1875-1914 (New York, 1989); Adam Hochschild, Bury the Chains: Prophets and Rebels in the Fight to Free an Empire's Slaves (New York, 2005); Bill Ashcroft, Gareth Griffiths, and Helen Tiffin, eds., The Post-Colonial Studies Reader (London, 1995); V. G. Kiernan, America: The New Imperialism (London, 2005); Michael Hardt and Antonio Negri, Empire (Cambridge, MA, 2000).

80 Jonathan Schell, The Fate of the Earth and Abolition (Stanford, 2000).

81 Jonathan Glover, Humanity: A Moral History of the Twentieth Century (New Haven, 2000); Gray, Black Mass.

82 Primo Levi, The Drowned and the Saved (New York, 1988), 144, 199.

83 Lawrence L. Langer, "The Dilemma of Choice in the Deathcamps," In Holocaust: Religious and Philosophical Implications, John K. Roth and Michael Berenbaum, eds. (St. Paul, MN, 1989), 223.

84 Michel Foucault, "What Is Enlightenment?" In The Foucault Reader (New York, 1984), 39, 49, 46-47.

85 Isaiah Berlin, "The Sense of Reality," In The Sense of Reality: Studies in Ideas and Their History (New York, 1998), 38.

86 Friedrich Nietzsche, Beyond Good and Evil, trans. Walter Kaufmann (New York, 1989), 89.

87 Sigmund Freud was one of the first philosophers to criticize modernity with such a point.  See Civilization and its Discontents, trans. James Strachey (New York, 1961), 45.  See also John Gray, Straw Dogs: Thoughts on Humans and Other Animals (New York, 2003); John Gray, Black Mass: Apocalyptic Religion and the Death of Utopia.

88 Francis Fukuyama, The End of History and the Last Man (New York, 1992).

89 Ibid., 112.

90 John Gray, Straw Dogs: Thoughts on Humans and Other Animals, 14.

91 Tony Judt, The Burden of Responsibility: Blum, Camus, Aron, and the French Twentieth Century (Chicago, 1998), 158.

92 Glover, Humanity: A Moral History of the Twentieth Century, 7.  See also Raymond Aron, The Century of Total War (Lanham, MD, 1985).

93 Jared Diamond, The Third Chimpanzee: The Evolution and Future of the Human Animal (New York, 1992), 217; Jared Diamond, Collapse: How Societies Choose to Fail or Succeed (New York, 2005).

94 Gray, Straw Dogs: Thoughts on Humans and Other Animals, 12.

95 Amartya Sen, Development as Freedom (New York, 1999), xi.

96 Diamond, Collapse, 521, 525; Diamond, The Third Chimpanzee, 4.  See also Wilson, Consilience: The Unity of Knowledge, 280-98; "The Anthropocene: A Man-Made World," The Economist (May 28 2011), 81-83.  Diamond's basic thesis was articulated over thirty years earlier by Rene Dubos in A God Within: A Positive Philosophy for a More Complete Fulfillment of Human Potentials (New York, 1972).

97 John Gray, Enlightenment's Wake (New York, 2009), 216.

98 Owen Flanagan, Self Expressions: Mind, Morals, and the Meaning of Life (Oxford, 1996), 57.

99 Erich Fromm, Beyond the Chains of Illusion (New York, 1966), 118.

100 John Gray, Straw Dogs: Thoughts on Humans and Other Animals (New York, 2003), 120.

101 Daniel C. Dennett, Freedom Evolves (New York, 2003), 64, 85, 13.

102 Jared Diamond, The Third Chimpanzee: The Evolution and Future of the Human Animal (New York, 2006); Matt Ridley, Nature Via Nurture: Genes, Experience, and What Makes Us Human (New York, 2003).

103 Ridley, Nature Via Nurture, 64.

104 Jared Diamond, The Third Chimpanzee, 93, 143.

105 Ray Kurzweil, The Singularity is Near: When Humans Transcend Biology (New York, 2005).  For a more political treatment of technology and human progress see Francis Fukuyama, The End of History and the Last Man (New York, 1992).

106 Ridley, Nature Via Nurture, 16-17, 209.

107 Freud, Civilization and its Discontents, 44.

108 Karl R. Popper, Objective Knowledge: An Evolutionary Approach, revised edition (Oxford, 1979), 238.

109 Michael Bess, "Blurring the Boundary between 'Person' and 'Product': Human Genetic Technologies through the Year 2060," The Hedgehog Review 13, no. 2 (Summer 2011), 57; Wilson, On Human Nature, 96-97, 208.

110 Kevin Warwick, I, Cyborg (London, 2002); Kurzweil, The Singularity is Near, 30, 309; Lev Grossman, "Singularity," Time (Feb 21 2011).  For a critique see John Gray, Straw Dogs: Thoughts on Humans and Other Animals.  Gray argues "Technology is not something humankind can control...Technical progress leaves only one problem unsolved: the frailty of human nature.  Unfortunately that problem is insoluble" (14-15).  For a critique of techno-posthumanism see Michael E. Zimmerman, "Last Man or Overman? Transhuman Appropriations of a Nietzschean Theme," The Hedgehog Review 13, no. 2 (Summer 2011): 31-44.

111 E. O. Wilson, qtd. in John Gray, Straw Dogs: Thoughts on Humans and Other Animals (New York, 2003), 5.

112 Mark Lynas, The God Species: Saving the Planet in the Age of Humans (New York, 2011).

113 Naomi Oreskes and Erik M. Conway, Merchants of Doubt: How a Handful of Scientists Obscured the Truth on Issues from Tobacco Smoke to Global Warming (New York, 2010), 256, 261.

114 Francis Fukuyama, The End of History and the Last Man (New York, 1992), xiv.

115 Julian Simon, The Ultimate Resource (Princeton, 1996).

116 Diamond, The Third Chimpanzee, 54-56; Ridley, Nature Via Nurture, 16-17, 209; Roy D'Andrade, "Cultural Darwinism and Language," American Anthropologist 104, no. 1 (2002), 223; Jane H. Hill, "On the Evolutionary Foundations of Language," American Anthropologist 74 (1972), 308-310.

117 Rene Dubos, A God Within: A Positive Philosophy for a More Complete Fulfillment of Human Potentials (New York, 1972), 239.

118 Ridley, Nature Via Nurture; Pinker, The Blank Slate.

119 Gray, Black Mass: Apocalyptic Religion and the Death of Utopia, 41.

120 Dennett, Freedom Evolves, 93, 143, 162, 166, 250-51, 269.

121 Ridley, Nature Via Nurture, 4.

122 Fritjof Capra, The Web of Life: A New Scientific Understanding of Living Systems (New York, 1996), 267.

123 Steven Pinker, The Blank Slate: The Modern Denial of Human Nature (New York, 2002), xi, 34.  See also Edward O. Wilson, On Human Nature (Cambridge, MA, 1978); Dennett, Freedom Evolves; On Human Nature, Daedalus: Journal of the American Academy of Arts & Sciences 133, no. 4 (Fall 2004).

124 Pinker, The Blank Slate, 34, 65-66.

125 Pinker, The Blank Slate, 65-66, 201, 219, 238.

126 Stephen Toulmin, Return to Reason (Cambridge, MA, 2001), 193.

127 Tony Judt, "Introduction," in Raymond Aron, The Dawn of Universal History: Selected Essays from a Witness of the Twentieth Century, trans. Barbara Bray (New York, 2002), xiii-xiv.

128 Using a historical term, I called the study of ideas "ideology" and I put forth a theory for how ideas are born and become social institutions in my book, Studies in Ideology: Essays on Culture and Subjectivity (Lanham, 2005).

129 Philip Selznick, The Moral Commonwealth: Social Theory and the Promise of Community (Berkeley, 1992), 232-34.

130 Elinor Ostrom, Governing the Commons: The Evolution of Institutions for Collective Action (Cambridge, UK, 1990), 51.

131 Elinor Ostrom, Understanding Institutional Diversity (Princeton, 2005), 3.

132 Vincent Ostrom and Elinor Ostrom, “Rethinking Institutional Analysis: An Interview with Vincent and Elinor Ostrom,” Mercatus Center, George Mason University (Nov 7, 2003), 1; Anthony Giddens, Capitalism and Modern Social Theory: An Analysis of the Writing of Marx, Durkheim, and Max Weber (Cambridge, UK, 1971).

133 DiMaggio and Powell, “The Iron Cage Revisited,” 8; Roger Friedland and Robert R. Alford, “Bringing Society Back In: Symbols, Practices, and Institutional Contradictions,” The New Institutionalism in Organizational Analysis, Ibid., 232-263; James Miller, The Passion of Michel Foucault (New York, 1993), 150; Searle, The Construction of Social Reality.  On the debate over “free-will” and structural “determinism” see Isaiah Berlin, “Historical Inevitability,” The Proper Study of Mankind (New York, 2000) and Pierre Bourdieu, Outline of a Theory of Practice (Cambridge, UK, 2010), 73.

134 Berger and Luckmann, The Social Construction of Reality, 15; DiMaggio and Powell, “The Iron Cage Revisited,” 20, 23, 26, 28; Andre Lecours, “New Institutionalism: Issues and Questions,” New Institutionalism: Theory and Analysis, Andre Lecours, ed. (Toronto: University of Toronto Press, 2005), 3-25; John W. Meyer and Brian Rowan, “Institutionalized Organizations: Formal Structure as Myth and Ceremony,” The New Institutionalism in Organizational Analysis, Ibid., 41, 44; Paul Pierson, Politics in Time: History, Institutions, and Social Analysis (Princeton, 2004), 20-21, 43, 51; Lynne G. Zucker, “The Role of Institutionalization in Cultural Persistence,” The New Institutionalism in Organizational Analysis, Ibid., 85; Searle, The Construction of Social Reality.

135 Isaiah Berlin, “Historical Inevitability;” Giddens, Capitalism and Modern Social Theory; Miller, The Passion of Michel Foucault, 15, 150, 336; Karl R. Popper, Objective Knowledge: An Evolutionary Approach, revised edition (Oxford, 1979), 217.

136 Henry Miller, Tropic of Capricorn (New York, 1961), 11.

137 Thomas Nagel, "Moral Luck," In Mortal Questions (Cambridge, UK, 1979), 35.

138 Bourdieu, Outline of a Theory of Practice, 83.

139 Elisabeth S. Clemens, The People's Lobby: Organizational Innovation and the Rise of Interest Group Politics in the United States, 1890-1925 (Chicago, 1997), 44-45.

140 James A. Berlin, “Postmodernism in the Academy,” Rhetorics, Poetics, and Cultures (West Lafayette, 2003): 60-82; Friedland and Alford, “Bringing Society Back In,” 232, 254; Jepperson, “Institutions, Institutional Effects, and Institutionalism,” 145, 149, 151-52, 158; Walter W. Powell, “Expanding the Scope of Institutional Analysis,” The New Institutionalism in Organizational Analysis, Ibid., 188, 194-195; Steven Brint and Jerome Karabel, “Institutional Origins and Transformations: The Case of American Community Colleges,” The New Institutionalism in Organizational Analysis, Ibid., 337-360; Ronald Jepperson and John W. Meyer, "Multiple Levels of Analysis and the Limits of Methodological Individualisms," Sociological Theory 29, no. 1 (March 2011): 54-73.

141 Sherry B. Ortner, Anthropology and Social Theory: Culture, Power, and the Acting Subject (Durham, 2006), 7, 18, 127, 130, 133, 139, 147, 152.  On the relationship between institutions and agency see: Ludwig Wittgenstein, Philosophical Investigations, Trans. G. E. M. Anscombe (Oxford, 2001), part I, 23.  Paul Pierson argues that the term institutional “change” is misleading because it is almost impossible to change institutions fundamentally.  Instead, Pierson recommends the term “institutional development” as a more accurate description of how institutions change through time. See: Politics in Time, 133, 137.

142 Elisabeth S. Clemens, The People's Lobby: Organizational Innovation and the Rise of Interest Group Politics in the United States, 1890-1925 (Chicago, 1997), 92.

143 Ostrom, Understanding Institutional Diversity, 3.

144 Joel A. C. Baum and Jitendra V. Singh, “Organizational Hierarchies and Evolutionary Processes,” Evolutionary Dynamics of Organizations, Joel A. C. Baum and Jitendra V. Singh, eds., (Oxford, 1994), 5; DiMaggio and Powell, “The Iron Cage Revisited;” Friedland and Alford, “Bringing Society Back In;” Scott and Meyer, “The Organization of Societal Sectors,” 137; Powell, “Expanding the Scope of Institutional Analysis.”

145 Jepperson, “Institutions, Institutional Effects, and Institutionalism;” Doug McAdam and W. Richard Scott, “Organizations and Movements,” Social Movements and Organizational Theory, Gerald F. Davis, Doug McAdam, W. Richard Scott, and Mayer N. Zald, eds., (Cambridge, UK, 2005), 4-40; Scott and Meyer, “The Organization of Societal Sectors,” 117; Powell, “Expanding the Scope of Institutional Analysis;” Zucker, “The Role of Institutionalization in Cultural Persistence.”

146 Friedland and Alford, “Bringing Society Back In,” 232; McAdam and Scott, “Organizations and Movements;” Pierson, Politics in Time; Zucker, “The Role of Institutionalization in Cultural Persistence,” 84.

147 Siobhan Harty, “Theorizing Institutional Change,” New Institutionalism: Theory and Analysis, Ibid., 51-79.

148 Ostrom, Understanding Institutional Diversity, 10.

149 Ostrom, Understanding Institutional Diversity, 11.

150 Wilson, On Human Nature, 6.

151 Ostrom, Understanding Institutional Diversity, 8.

152 Wilson, On Human Nature; Ostrom, Understanding Institutional Diversity.

153 Ibid., 96.

154 Wilson, Consilience: The Unity of Knowledge, 44.

155 Ibid., 83-85.

156 Ibid., 96.

157 Ibid., 31.

158 See for example: Sociobiology (Cambridge, MA, 2000); The Social Conquest of Earth (New York, 2012); Howard W. French, "E. O. Wilson's Theory of Everything," The Atlantic (Nov 2011), 70-82.

159 Ibid., 55.  See also 30-38.

160 Pierre Bourdieu, Outline of a Theory of Practice (Cambridge, UK, 2010), 31; Hilary Putnam, "A Half Century of Philosophy, Viewed from Within," American Academic Culture in Transformation, Thomas Bender and Carl E. Schorske, eds. (Princeton, 1997), 201.

161 Lisa Randall, Knocking on Heaven's Door: How Physics and Scientific Thinking Illuminate the Universe and the Modern World (New York, 2011), 5.

162 Ibid., 16.

163 "The Nature of the Universe: Ye Cannae Change the Laws of Physics," The Economist (Sept 4 2010), 85-86.

164 Randall, Knocking on Heaven's Door; Brian Greene, The Elegant Universe: Superstrings, Hidden Dimensions, and the Quest for the Ultimate Theory (New York, 2003).

165 Ibid.

166 Randall, Knocking on Heaven's Door, 6.

167 Ibid., 8, 15.

168 David Deutsch has also made this point.  He explains that when physicists use the word "evolution" they mean "development" or "motion," and "not variation and selection" (182).  The Fabric of Reality: The Science of Parallel Universes - and Its Implications (New York, 1997).

169 Bess, "Blurring the Boundary between 'Person' and 'Product'", 58.

170 Elinor Ostrom, Understanding Institutional Diversity (Princeton, 2005), 6.

171 Ernst Mayr, This Is Biology: The Science of the Living World (Cambridge, MA, 1997), 107, 112, 122.

172 Deutsch, The Fabric of Reality, 27-28.  This diversity of analytical frameworks presents an epistemological problem, as Ostrom pointed out: "One can understand why discourse may resemble a Tower of Babel rather than a cumulative body of knowledge."  See Ostrom, Understanding Institutional Diversity, 11.

173 Ernst Mayr, This Is Biology: The Science of the Living World (Cambridge, MA, 1997), xii, 19-20.

174 David M. Kreps, "Economics - The Current Position," American Academic Culture in Transformation, Thomas Bender and Carl E. Schorske, eds. (Princeton, 1997), 78, 83.

175 Richard Dawkins, The Selfish Gene (Oxford, 1989), 192-93.  The notion of memes was more fully developed by Susan Blackmore in The Meme Machine (Oxford, 1999).

176 Richard Dawkins, River Out of Eden: A Darwinian View of Life (New York, 1995), 3.

177 Dawkins, The Selfish Gene, 192-93.

178 See for example Susan Blackmore, The Meme Machine; James Gleick, The Information: A History, A Theory, A Flood (New York, 2011).

179 I borrow this strong criticism of reductionism from Ernst Mayr, This Is Biology: The Science of the Living World (Cambridge, MA, 1997), xii, 8.  Even the technology enthusiast Ray Kurzweil argues that human knowledge and information "will survive only if we want it to," meaning that it is not independent of our consciousness and intentionality (330).  The Singularity is Near: When Humans Transcend Biology (New York, 2005).

180 John Gray, Isaiah Berlin (Princeton, 1996), 81.

181 Diamond, The Third Chimpanzee, 37.

182 Ibid., 98.  Diamond admits, as do I, that human "cultural traits" are based on "genetic foundations," but he details how we as an inventive species have modified ourselves through other mechanisms, like sexual selection and social evolution (137).

183 Dawkins, River Out of Eden: A Darwinian View of Life, 15.

184 Owen Flanagan, The Really Hard Problem: Meaning in a Material World (Cambridge, MA, 2007), 22.

185 Ibid., xii.

186 See for example Daniel C. Dennett, Breaking the Spell: Religion as a Natural Phenomenon (New York, 2006); Richard Dawkins, The God Delusion (New York, 2006).

187 Dawkins, River Out of Eden: A Darwinian View of Life, 3.

188 Ridley, Nature Via Nurture, 209.

189 Ibid., 28.

190 Randall, Knocking on Heaven's Door, 15.

191 Rene Dubos, So Human an Animal (New York, 1968), 128; Dubos, A God Within, 21.

192 "Wisdom about Crowds," The Economist (April 23 2011), 85.

193 Jared Diamond, The Third Chimpanzee: The Evolution and Future of the Human Animal (New York, 1992), 1.

194 Timothy Taylor, "Letter to Editor," In The New Humanists: Science at the Edge, John Brockman, ed. (New York, 2003), 372-73.

195 Daniel C. Dennett, "Letter to Editor," In The New Humanists: Science at the Edge, John Brockman, ed. (New York, 2003), 395.

196 Dennett, Breaking the Spell.

197 Ibid., 261-62.

198 Rene Dubos, So Human an Animal (New York, 1968), 57.

199 Karl R. Popper, Objective Knowledge: An Evolutionary Approach, revised edition (Oxford, 1979), 229; see also Diamond, The Third Chimpanzee.

200 Prigogine and Stengers, Order out of Chaos, 225; Ridley, Nature Via Nurture, 64; Deutsch, The Fabric of Reality, 27-28.

201 Ray Kurzweil, The Singularity is Near: When Humans Transcend Biology (New York, 2005), 453.

202 Stephen Toulmin, Return to Reason (Cambridge, MA, 2001), 153.

203 Wilson, Consilience: The Unity of Knowledge, 166, 224.

204 Larry Hirschfeld, qtd. in Pascal Boyer, Religion Explained: The Evolutionary Origins of Religious Thought (New York, 2001), 251-52.

205 Helen A. Fielding, "Multiple Moving Perceptions of the Real: Arendt, Merleau-Ponty, and Truitt," Hypatia 26, no. 3 (Summer 2011), 532.

206 Thomas Bender, "Politics, Intellect, and the American University, 1945-1995," American Academic Culture in Transformation, Thomas Bender and Carl E. Schorske, eds. (Princeton, 1997), 41.  In 1953 Merle Curti called culture "one of the most important and emancipating of all twentieth-century contributions to knowledge in the social field."  Curti, "The Setting and the Problem," American Scholarship in the Twentieth Century, Merle Curti, ed. (Cambridge, MA, 1953), 5.

207 Boyer, Religion Explained, 138, 164; Pierre Bourdieu, Outline of a Theory of Practice (Cambridge, UK, 2010).

208 Bourdieu, Outline of a Theory of Practice, 16; Brian C. Anderson, Raymond Aron: The Recovery of the Political (Lanham, MD, 1997), 39.

209 Anderson, Raymond Aron: The Recovery of the Political, 46.

210 Clifford Geertz, The Interpretation of Cultures (New York, 1973), 41, 89.

211 C. Wright Mills, The Sociological Imagination (Oxford, 2000), 79.

212 Ibid., 78.

213 Wilson, Consilience: The Unity of Knowledge, 269.

214 Philip K. Howard, The Death of Common Sense: How Law is Suffocating America (New York, 1994), 12, 60-61, 87, 186-187; Hugo Mercier and Dan Sperber, "Why Do Humans Reason? Arguments for an Argumentative Theory," Behavioral and Brain Sciences 34 (2011): 57-111.

215 Mills, The Sociological Imagination, 77; Richard Rorty, "Philosopher-envy," Daedalus 133, no. 4 (Fall 2004): 18-24.

216 Mills, The Sociological Imagination, 6.

217 Wilson, Consilience: The Unity of Knowledge, 266.

218 Mills, The Sociological Imagination, 158.

219 Ibid., 164.

220 Ibid., 167-69.

221 Ibid., 187-88.

222 Raymond Aron, In Defense of Political Reason, qtd. in Brian C. Anderson, Raymond Aron: The Recovery of the Political (Lanham, MD, 1997), 170.

223 Peter Wood, "Session II: The Actual Preoccupations of the Social Sciences," Conference on the State of the Social Sciences, Critical Review 16, nos. 2-3 (2004), 187.

224 Cormac McCarthy, Blood Meridian (New York, 1985), 256.

225 John Gray, Straw Dogs: Thoughts on Humans and Other Animals (New York, 2003), 15.

226 Ann V. Murphy, "Corporeal Vulnerability and the New Humanism," Hypatia 26, no. 3 (Summer 2011), 589.

227 J. Donald Hughes, "Sustainability and Empire," The Hedgehog Review 14, no. 2 (Summer 2012), 35.  See also: J. Donald Hughes, An Environmental History of the World (New York, 2009).

228 Ernst Mayr, This Is Biology: The Science of the Living World (Cambridge, MA, 1997), xv. 

229 John Gray, False Dawn: The Delusions of Global Capitalism (New York, 1998), 16.

230 Qtd. in Jared Diamond, The Third Chimpanzee: The Evolution and Future of the Human Animal (New York, 1992), 366.

231 Christopher Hitchens, "Political Animals," Arguably: Essays by Christopher Hitchens (New York, 2011), 111.

232 Brian C. Anderson, Raymond Aron: The Recovery of the Political (Lanham, MD, 1997); Michel Foucault, The Foucault Reader (New York, 1984); Isaiah Berlin, The Proper Study of Mankind (New York, 2000); John Gray, Isaiah Berlin (Princeton, 1996); Ludwig Wittgenstein, Philosophical Investigations (Oxford, 1953).

233 Hilary Putnam, "A Half Century of Philosophy, Viewed from Within," American Academic Culture in Transformation, Thomas Bender and Carl E. Schorske, eds. (Princeton, 1997), 213.

234 Anderson, Raymond Aron: The Recovery of the Political, 170.

235 John Gray, Isaiah Berlin (Princeton, 1996), 9.

236 Alexander Pope, An Essay on Man, In The Norton Anthology of English Literature, 6th ed, vol. 1, M. H. Abrams, ed. (New York, 1993), 2270.

237 Blaise Pascal, Pensees and Other Writings, trans. Honor Levi (Oxford, 1995), 42, 70, 66-67.

238 Nietzsche, Human, All too Human, 37; William Blake, "London," In The Complete Poetry and Prose of William Blake, David V. Erdman, ed. (New York, 1988), 27.

239 Author's emphasis.  Nietzsche, Human, All too Human, 78-79, 14-15.

240 Nietzsche, Human, All too Human, 174-75.

241 Nietzsche, Human, All too Human, 193.

242 Nietzsche, Human, All too Human, 256.

243 T. S. Eliot, Four Quartets, In Collected Poems, 1909-1962 (New York, 1963), 178, 180-81, 185, 203, 208.

244 John Gray, Black Mass: Apocalyptic Religion and the Death of Utopia (New York, 2007).

245 Frank E. Manuel, ed., Utopias and Utopian Thought (Boston, 1966); Imagining the Future, The Hedgehog Review 10, no. 1 (Spring 2008); Gray, Black Mass.

246 Thomas More, Utopia (New York, 1965), 64.

247 Ironically, Brecht ended up as a refugee in the U.S. and revised his play for an American audience, but he was later interrogated by the House Un-American Activities Committee because of his radical politics, and he was forced to leave the U.S. and return to what was then Soviet-occupied East Germany.

248 Bertolt Brecht, "Life of Galileo," In Brecht: Collected Plays, vol. 5 (New York, 1972), 58.

249 Brecht, "Life of Galileo," 94.

250 Brecht, "Life of Galileo," 64.

251 John Gray, Black Mass: Apocalyptic Religion and the Death of Utopia (New York, 2007), 20.

252 Charles E. Lindblom and David K. Cohen, Usable Knowledge: Social Science and Social Problem Solving (New Haven, 1979), 73.

253 Tony Judt, The Burden of Responsibility: Blum, Camus, Aron, and the French Twentieth Century (Chicago, 1998), 123, 135.

254 Rene Dubos, So Human an Animal (New York, 1968), 61, 101, 128, xii.

255 Ralph Waldo Emerson, "Fate," Selections from Ralph Waldo Emerson (Boston, 1957), 330.

256 Ibid., 142-43.  See also John Gray, Isaiah Berlin (Princeton, 1996), 23-24; Philip K. Howard, The Death of Common Sense, 12, 60-61, 87, 186-187.

257 Michael Berube and Cary Nelson, eds., Higher Education Under Fire: Politics, Economics, and the Crisis of the Humanities (New York, 1995); Martha Nussbaum, Cultivating Humanity: A Classical Defense of Reform in Liberal Education (Cambridge, MA, 1997); Sander Gilman, The Fortunes of the Humanities: Thoughts for After the Year 2000 (Stanford, 2000); Richard Wolin, "Reflections on the Crisis in the Humanities," The Hedgehog Review 13, no. 2 (Summer 2011): 8-20.

258 On higher education see Jonathan R. Cole, The Great American University: Its Rise to Preeminence, Its Indispensable National Role, and Why It Must Be Protected (New York, 2009), 100, 151, 155; Louis Menand, The Marketplace of Ideas: Reform and Resistance in the American University (New York, 2010).  On K-12 education see Diane Ravitch, The Death and Life of the Great American School System: How Testing and Choice are Undermining Education (New York, 2010), 226.

259 Cole, The Great American University, 155; Ravitch, The Death and Life of the Great American School System.

260 Martha C. Nussbaum, Cultivating Humanity: A Classical Defense of Reform in Liberal Education (Cambridge, MA, 1997), 8-9; Richard Wolin, "Reflections on the Crisis in the Humanities," 10.

261 Owen Flanagan, The Bodhisattva's Brain: Buddhism Naturalized (Cambridge, MA, 2011), 109.

262 Menand, The Marketplace of Ideas, 56, 51.

263 Ibid., 56.

264 Nussbaum, Cultivating Humanity, 295.

265 M. H. Abrams, "The Transformation of English Studies: 1930-1995," American Academic Culture in Transformation, Thomas Bender and Carl E. Schorske, eds. (Princeton, 1997), 144.  See also Sarah Bakewell, How to Live: A Life of Montaigne in One Question and Twenty Attempts at an Answer (New York, 2010).

266 John Dewey, Democracy and Education (New York, 1916); Paulo Freire, Pedagogy of Freedom: Ethics, Democracy, and Civic Courage (Lanham, 2001); Amy Gutmann, Democratic Education (Princeton, 1987).

267 Amartya Sen, Development as Freedom (New York, 1999), 283.

268 John Gray, Isaiah Berlin (Princeton, 1996), 23-24.

269 Ibid., 284.  See also Bakewell, How to Live: A Life of Montaigne in One Question and Twenty Attempts at an Answer, 320.

270 Owen Flanagan, The Really Hard Problem: Meaning in a Material World (Cambridge, MA, 2007), 1.

271 Seneca, On Anger, qtd. in Martha C. Nussbaum, Cultivating Humanity: A Classical Defense of Reform in Liberal Education (Cambridge, MA, 1997), xiii.

Religion & Secularism

A Short History

originally written 2012

For most of human history the cosmos of the known universe was populated by deities and devils in a simplistically ordered whole guided by metaphysical principles and orally transmitted through myths.  Humans acted in broad ignorance of the actual mechanics of the objective world, mixing local intelligence and tradition with fanciful belief in the power of magic, prayer, and fate.  Peter Gay once explained the "essence of the mythopoeic mind": "where the category of verification is absent, there are no lies."[i]  Early philosophers in ancient Greece and India began to use logos to challenge the antiquated logic of mythos, but skeptics had little evidence to actually convince people to reject old worldviews.  Often critique was met with ridicule from the establishment, or sometimes, as was the case with Socrates, a death sentence to preserve public morality.[ii]  The traditional epistemology of mythos was not fully challenged until a new critical human endeavor called "science" was invented in the 16th century, which allowed a new type of truth to be fashioned based on empirical observation, innovative technology, experimentation, and mathematical theory.[iii] 

But even as Western science developed over the next several hundred years, a foundational mythos of ancient humans was retained as a central assumption.  This great assumption of a rationally ordered universe guided by immutable laws went largely unchallenged until the end of the 19th century.[iv]  Take, for example, William Graham Sumner, who began his professional life as an Episcopal pastor but left the ministry to become a social scientist at Yale in 1872.  In his last sermon Sumner noted the “philosophical skepticism” of modernity and the brewing “conflict” between “traditional dogmas” and science.  Sumner explained his turn toward science in terms of looking for “an historical revelation of spiritual and universal truths which has authority for man.”[v]  Drawing from the cosmology of ancient mythos, scientists assumed that there was a singular, “rational” and “intelligible” order to the universe that could be discovered by the human mind. 

Up until the early 20th century, many scientists conceptualized this singular order in association with traditional Judeo-Christian beliefs about “God.”[vi]  Thomas Hill, President of Harvard College in 1865, claimed, “The ultimate ends of common sense, of philosophy, and of science are the same.  They may be summed up in one – it is the reading of God’s thought.  The order of the universe is rational, intelligible…No mind capable of scientific labor ever doubts that all phenomena are subject to law, that is, that all phenomena succeed each other in an order which can be understood and expressed in the formulae of human speech.”[vii]

But the unification of mythos (traditional, sacred reasoning) and logos (logical, critical reasoning) in a singular conception of the cosmos has not always been widely accepted because of the inherent ontological and epistemological tension between these two worldviews.  These two divergent ideologies have often been in conflict over the past 500 years, especially in the Western world.  During this time, the ancient logic of mythos was rebranded as “religion,” and the analytical empiricism of logos was labeled “secularism.”  But both concepts of “secularism” and “religion” sprang from the Judeo-Christian tradition as it developed in Europe.[viii]  Secular came from the Latin word saeculum, which literally referred to a century or age. 

But this concept was used by Western theologians to designate ordinary, natural time (as opposed to the sacred time of God’s metaphysical order).[ix]  Religion came from the Latin word religio, which referred to the rituals or behaviors linked to supernatural powers or metaphysical reality.  But as the Christian church began to consolidate an orthodoxy, this concept was used by Western theologians, like Saint Augustine, to designate the body of “true” knowledge or doctrine contained in the Bible or propagated by church fathers.[x]  Beginning in the 18th century, European scientists and humanists used these Judeo-Christian distinctions to gradually wall off a new ontological space, conceptualizing the “secular” world of strictly human affairs and natural processes.  As William T. Cavanaugh has argued, “’Religion’ as a discrete category of human activity separable from ‘culture,’ ‘politics,’ and other areas of life is an invention of the modern West.”[xi]

Secularism became a powerful intellectual force as part of the ideological worldview of Deism, which was created by scientists and humanists during the 18th century.  Charles Taylor has called Deism “a half-way house on the road to contemporary atheism.”[xii]  As a quasi-religion, Deism did not completely reject the existence of “God” or metaphysics.  Deists assumed that there was some larger metaphysical reality outside the boundaries of the physical universe.  However, using the principle of logical parsimony, often called “Occam’s razor,” Deists eliminated the notion of an all-powerful transcendent being (“God”) operating and intervening in human history or the natural world because it was redundant in the face of new scientific facts. 

Instead, modern European Deists focused on empirical observation, experimental science, and the monistic belief in natural “laws” of the universe, which they used to re-order the ontological and epistemological space of physical reality.  Deists did not try to logically disprove the existence of “God,” yet any religious notion would have to accord with these new natural laws and prove itself valid within this new empirical worldview.[xiii]  As the French philosophe Voltaire once explained, "Almost everything that goes beyond the worship of a supreme Being and the submission of one's heart to its eternal commands, is superstition...We are all steeped in weaknesses and errors; let's forgive each other our follies; that is the first law of nature."[xiv] 

While Deists were largely agnostic on religious issues, their belief in science effectively made metaphysical notions redundant.  This led to a general orientation of naturalism, which can be (and was) seen as a form of atheism.[xv]  However, even Voltaire held on to a deep religious conviction.  After the Lisbon earthquake in 1755 Voltaire wrote, "All the subtleties of metaphysics will not make me doubt for a moment the immortality of the soul or a beneficent Providence.  I feel it, I believe it, I want it, I hope for it, and I shall defend it to my last breath."[xvi]  As Peter Gay argued, "Deism was in fact a last compromise with religion.  But it was not a compromise with mythopoeic thinking...The disenchanted universe of the Enlightenment is a natural universe."[xvii]

Scientists and philosophers of the 18th and 19th centuries used an emerging naturalist mentality to create a “new framework” based on a “new rationalism,” which slowly secularized the modern Western world into the 21st century.[xviii]  Bit by bit, the physical universe, the natural world, and human society were separated from metaphysical doctrines and spiritual deities, although the underlying belief in a singular, intelligible cosmos was retained.  As the historian Peter Gay explained, "The old questions that Christianity had answered so fully for so many men and so many centuries, had to be asked anew: What - as Kant put it - what can I know? What ought I to do?"[xix]  Perhaps the fullest expression of this new scientific rationality took shape in 1843 when John Stuart Mill published his six-book treatise, A System of Logic, Ratiocinative and Inductive, Being a Connected View of the Principles of Evidence and the Methods of Scientific Investigation.  Mill argued that there were no "a priori" or "self evident" truths, which were simply the reflection of "deep-seated prejudices."  Instead, he argued that truth needed to be constructed through the newly developed scientific method.[xx]

Newly formulated concepts such as “religion” and “secularism” were used to signify the modern ontological divide that many Westerners now take for granted.  Thomas Jefferson famously went so far as to proclaim a “wall” between the two spheres, although his rhetorical wall has always been more symbolism than reality.[xxi]  The existence of a secular sphere of knowledge and action slowly became accepted due to political and scientific developments, but it was not a reality until the 20th century.[xxii]  Western societies became more and more diverse, ethnically and religiously, and a secular public sphere was a tool to enable official tolerance of all faiths.  Also, advances in science and technology validated the naturalist conception of the physical world and made many older notions, like the traditional idea of a disembodied "soul," seem like quaint fictions. 

But there is a big problem with the whole notion of a secular/religious divide that has been legally enshrined over the last century – it is largely a fiction.  While many people consider the 21st century to be part of a “secular age,”[xxiii] the vast majority of human beings still live in a pre-modern, mythological frame of mind, which has created general skepticism about the meaning and predominance of secularism in the modern world.[xxiv]  The rise of secularism over the last five centuries has not only coincided with the stability of pre-modern religious beliefs and practices, but it has also seen the development of huge global religions, such as Christianity, Islam, and Buddhism, as well as the continual emergence of new religions.[xxv]  There has also been the emergence of a new form of religion.  Raymond Aron and John Gray have called ideologies like Marxism and Humanism "secular religions," both of which are "Christian heresies" because of their faith in progress and apocalyptic millenarianism.[xxvi]  John Gray has convincingly argued that the 21st century is not only in the grip of traditional religions but also "modern political religions" animated by various utopian projects.[xxvii]

In the United States, religion is arguably as strong and as culturally significant as it has always been.  At the end of the 19th century, 37 out of the 42 states in America legally "recognized the authority of God in the preambles or texts of their constitutions."[xxviii]  By the 21st century, around 90 percent of the citizens of the United States still consider themselves “religious,” and even most of the non-religious still have some spiritual beliefs.[xxix]  Perhaps most surprising, around 39 percent of all scientists in the U.S. still consider themselves to be religious, only slightly less than the 42 percent of believing U.S. scientists polled in 1914.[xxx]  As Charles Taylor ironically explained, “The secular age is schizophrenic.”[xxxi]  Secularism and religion co-exist awkwardly, and often in conflict, but not always.[xxxii]

With the growing acceptability of secularism over the last century, especially in Western Europe and the United States, there has also been a growing tension between the rival ontological and epistemological systems of “religion” and “science,” periodically erupting into conflict.  While science has been legitimated in the public eye, largely due to the staggering advances in technology and the quality of human life, the scientific worldview is still an embattled ideology in a religiously dominated world, even in Western democracies.  Some scholars have questioned whether or not we need a “new model” for describing the world, one that is “neither exclusively secular nor exclusively religious.”  Martin E. Marty explained it succinctly: The world “is religious.  And secular.  At the same time.”[xxxiii]  Modernity, as Charles Taylor reminded us, is a “struggle between belief and unbelief.”[xxxiv]  But this duality predates the modern period because belief and unbelief have always existed in longstanding conflict.[xxxv]  Will it always be this way?

Skirmishes between secularism and religion began at the very origins of Western science.[xxxvi]  In fact, if one takes a longer, global view, “religious” beliefs have always been subject to the assaults of doubt, and doubters have often been disdained by the moral majority.[xxxvii]  But the battle began in full force with the European philosophers and scientists of the 18th century, who promoted a spirit of “enlightenment” from the shackles of antiquated myths, oppressive institutions (like the Roman Catholic Church), and authoritarian monarchs.  These philosophers and scientists promoted rational thought, free exchange, objectivity, universal natural laws, political democracy, and the ability “to provide permanent solutions to all genuine problems of life or thought.”[xxxviii]  But from the start, a rival tradition of “counter-enlightenment” philosophers, poets, and theologians saw the new secular worldview of science as a reductionist “murder” and “distortion” of the multifarious and “ineffable” nature of reality.  These thinkers bristled against the “total claim of the new scientific method to dominate the entire field of human knowledge.”[xxxix]  They also questioned the metaphysical assumption of “the general progressive improvement of the world,” which of course was based on making the world conform to the cultural institutions of Western Europe.[xl] 

This battle expanded and became more acute in the 19th century.  Several major scientific, philosophical, and political developments shook the very foundations of religious belief.  This in turn caused an aggressive reaction from traditionalists, which led to various "culture wars" up into the 21st century.[xli]  The four secular developments included the new Biblical criticism of Friedrich Schleiermacher (1768-1834) and David Strauss (1808-1874), the publication of Darwin’s On the Origin of Species (1859), the proto-sociologist Karl Marx’s (1818-1883) new political philosophy that saw religion as the “opiate of the masses,” and the iconoclastic atheism of Friedrich Nietzsche (1844-1900), who declared that God was dead.  By the end of the 19th century, the conceptual war between science and religion had become a popular cottage industry, the most notable publications being John W. Draper’s History of the Conflict between Religion and Science (1874) and Andrew Dickson White’s A History of the Warfare of Science with Theology in Christendom (1896).[xlii]  And from then on, culture wars have ripped through Western societies every couple of decades.[xliii] 

Of course, not all scientists in the 19th and 20th centuries were opposed to religion, nor did all scientists think that “science” and “religion” were ontologically opposed.  Remember, at the turn of the 20th century around 40 percent of American scientists still considered themselves religious.  Three of the most notable scientists of the 20th century who defended the notion of religion were Alfred North Whitehead in Science and the Modern World (1925), Albert Einstein in Ideas and Opinions (1954), and more recently, Stephen Jay Gould in Leonardo’s Mountain of Clams and the Diet of Worms (1998) and Rocks of Ages: Science and Religion in the Fullness of Life (1999).  Both Whitehead and Einstein believed in a metaphysical reality that was one with the physical workings of the universe.  Einstein argued for a pantheist “God” (“Spinoza’s God”) who designed and worked through the laws of nature.[xliv]  Gould, who referred to himself as a “Jewish agnostic,” argued that religion and science were “nonoverlapping magisteria.”  In an effort to defuse the war between these worldviews, Gould claimed that these two ontological categories had their own “respective domains of professional expertise: science in the empirical constitution of the universe, and religion in the search for proper ethical values and the spiritual meaning of our lives.”[xlv]

The rise of scientific rationality and the growing legitimacy of secularism also gave birth in the 19th century to Comparative Religion and, later, Religious Studies.  These academic disciplines sought to study the phenomenon of “religion.”  While the early proponents of these disciplines were religious men who sought to bolster the importance of religion in the modern world, the effects of these new disciplines would only serve to widen the conflict between science and religion.  The main actor in this early intellectual enterprise was the European philologist Max Mueller.  He was greatly influenced by the universal philosophy and theology of Georg Wilhelm Friedrich Hegel (1770-1831), who postulated a universal metaphysical geist (spirit) behind the workings of human history and at the center of all human religions. 

Mueller’s most important work was Introduction to the Science of Religion (1873).  In it he claimed that humans have a “faculty of faith” which they used to apprehend the divine and develop morality.  Mueller was also influenced by the liberal Protestant (and also Western imperialist) agenda of the early ecumenical movement, which aimed to spread a largely Western European Protestant version of Christianity across the world.  Mueller saw the study of comparative religion as an important resource for Christian missionaries spreading the gospel and converting the non-Western “savage.”[xlvi] 

Most of the scholars in the emergent discipline of Religious Studies were theologians and philosophers whose explicit purpose was to reconceptualize and validate traditional theological principles within the newly christened “secular” world.  These scholars included the Dutch theologian P. D. Chantepie de la Saussaye, who coined the term “phenomenology of religion” and wanted to study the essence of religion through its “empirical manifestations.”  They also included the Jewish Christian theologian Joachim Wach (1898-1955), a professor in the Divinity School at the University of Chicago, who wanted to create a “science” of religion.[xlvii] 

Wach later brought the European philosopher Mircea Eliade (1907-1986) to the University of Chicago in the 1950s.  Eliade would become “one of the most influential theorists in religious studies in modern times.”[xlviii]  Eliade’s most famous book, The Sacred and the Profane: The Nature of Religion (1957), legitimated and consecrated the secular/religious divide, but he did so in order to point out how humans move from secular, profane existence into the “sacred space” of the divine.  He sought to explain the “religious experience” of “religious man,” a being he termed “homo religiosus.”  His central concept was “the sacred,” which he defined as “pre-eminently the real, at once power, efficacity, the source of life and fecundity…objective reality…a cosmos.”  Eliade claimed that religious man would always be dissatisfied with “profane experience,” and so he “opens” himself to the “divine” and “makes himself, by approaching the divine models.”[xlix]

But these early intellectual movements were only quasi-scientific at best.  Much of the work in the study of religion during the first half of the 20th century was either theology or philosophy, and this work was almost always biased in favor of religious assumptions.[l]  The actual scientific study of religion did not begin until the 20th century.  The fruits of this research would prove to be highly corrosive to traditional religious claims and theological belief.  The three most important scientists to study religion scientifically were the German sociologist Max Weber (1864-1920), the French sociologist Emile Durkheim (1858-1917), and the social anthropologist E. E. Evans-Pritchard (1902-1973). 

Weber was working on his magnum opus Wirtschaft und Gesellschaft (Economy and Society) when he died in 1920.  In the Religionssoziologie (Sociology of Religion) section of this manuscript, he explained that the “essence of religion is not even our concern, as we make it our task to study the conditions and effects of a particular type of social behavior.”  Weber argued that anything claiming to be a religious phenomenon should “not be set apart from the range of everyday purposive conduct” because such phenomena are primarily “oriented to this world,” by which he meant the secular world.[li] 

In The Elementary Forms of the Religious Life (1912), Durkheim theorized that religion was primarily a social phenomenon geared to the survival of human groups, and that it was also a strategy used by political elites to maintain social order.[lii]  When Durkheim used the concept of the “sacred,” Loyal Rue argues, what he really meant was any “vital interest of the group.”  Thus, “The gods are to be understood as mere symbols personifying the transcendent reality of the group.  The central activity of religion is found in its ritual life, which creates social solidarity and preserves the social order by reinforcing group consciousness.”[liii] 

Evans-Pritchard’s Witchcraft, Oracles and Magic Among the Azande (1937) was a groundbreaking book in the scientific study of religion.  As Pascal Boyer explains, “His book became a model for all anthropologists because it did not stop at cataloguing strange beliefs.  It showed you, with the help of innumerable details, how sensible those beliefs were, once you understood the particular standpoint of the people who expressed them and the particular questions those beliefs were supposed to answer.”[liv]

Although the field of Religious Studies remains plagued by the bias and vacuity of the central concept of “religion,”[lv] the study of ritual, belief, institutions, ethics, and human consciousness in sociology, anthropology, cognitive psychology, and biology has led to new developments in our understanding of religion as a social and psychological phenomenon.  Some of the more notable works include philosopher Daniel C. Dennett’s Darwin’s Dangerous Idea: Evolution and the Meanings of Life (1995) and Breaking the Spell: Religion as a Natural Phenomenon (2006); philosopher Loyal Rue’s Religion Is Not About God: How Spiritual Traditions Nurture Our Biological Nature (2005); and anthropologist Pascal Boyer’s Religion Explained: The Evolutionary Origins of Religious Thought (2001).  There have also been new discoveries of evidence in history and archaeology, which have led to a more accurate understanding of existing religious traditions and the meanings of sacred documents.  In the study of Christianity, some of the more notable works include John Dominic Crossan’s The Historical Jesus (1993) and The Birth of Christianity (1999); Donald Harmon Akenson’s Saint Saul: A Skeleton Key to the Historical Jesus (2002); and Bart D. Ehrman’s Lost Christianities (2005), The New Testament: A Historical Introduction to the Early Christian Writings (2007), and Misquoting Jesus: The Story Behind Who Changed the Bible and Why (2007).

For those who follow these recent developments, there is little space left for any theist claims, as traditionally defined by the major world religions.  Religious prophets have been demystified and humanized.  Religious traditions have been historicized within specific cultural, temporal, and geographical domains.  Religious texts have been deconstructed, both by historicizing their composition and critically analyzing their meaning, and by reconstructing the heterodox collections of texts that were actively suppressed by more powerful orthodox traditions.  As the philosopher Loyal Rue explains, “The question of God’s existence simply doesn’t come into the business of understanding religious phenomena…belief is the thing, not the reality of any objects of belief.”[lvi] 

And there have been recent scientific findings that have falsified theist-favoring claims of “nonoverlapping magisteria” and the special religious province of morality.  Anthropologists, cognitive scientists, and evolutionary psychologists have shown that morality has a biological and social basis that predates the development of religion.  Pascal Boyer summarizes what we now know of the origins of morality, “Having concepts of gods and spirits does not really make moral rules more compelling but it sometimes makes them more intelligible.  So we do not have gods because that makes society function.  We have gods in part because we have the mental equipment that makes society possible but we cannot always understand how society functions…We can explain religion by describing how these various [human] capacities get recruited, how they contribute to the features of religion that we find in so many different cultures.  We do not need to assume that there is a special way of functioning that occurs only when processing religious thoughts…this notion of religion as a special domain is not just unfounded but in fact rather ethnocentric.”[lvii]

These recent scientific developments have led many people to proclaim that belief in God is “obsolete.”  Christopher Hitchens sees the concept of God as passé, an old-fashioned notion based on primitive superstition: “The original problem with religion is that it is our first, and worst, attempt at explanation.  It is how we came up with answers before we had any evidence.  It belongs to the terrified childhood of our species…reason and logic reject god.”[lviii]  Evolutionary psychologist Steven Pinker argues, “The deeper we probe these questions, and the more we learn about the world in which we live, the less reason there is to believe in God.”[lix]  Based on the available scientific evidence, physicist Victor J. Stenger argues that “the only creator that seems possible is the one Einstein abhorred – the God who plays dice with the universe…Yet there is no evidence that God pokes his finger in anyplace.”[lx]  This has led many scientists to proclaim that “God” is a failed and obsolete “hypothesis.”  The evolutionary biologist Richard Dawkins goes so far as to claim that those who continue to believe in God, despite the evidence against such a being, are deceived by a “pernicious delusion.”[lxi] 

The large tide of scientific evidence over the 20th century has also emboldened a group of atheists, who have tried to rehabilitate this old epithet and make it a respectable intellectual and moral position.[lxii]  However, it should be acknowledged that there has been a bold tradition of atheism and agnosticism in the West for a very long time.  Perhaps two of the most noted atheists in the 18th and 19th centuries were the philosopher David Hume (1711 - 1776) and the poet Percy Bysshe Shelley (1792 - 1822).  Shelley went so far as to publish a tract called "The Necessity of Atheism" (1811) in which he declared, "There is no God...Every reflecting mind must acknowledge that there is no proof of the existence of a Deity.  God is an hypothesis."[lxiii]  Robert Green Ingersoll (1833-99) famously argued, "While I am opposed to all orthodox creeds, I have a creed myself; and my creed is this.  Happiness is the only good."[lxiv]  The philosopher Bertrand Russell (1872 - 1970) was an outspoken atheist who called "all the great religions of the world" both "untrue and harmful."  He called religion a "delusion born of terror," and he argued, "The knowledge exists by which universal happiness can be secured; the chief obstacle to its utilization for that purpose is the teaching of religion."[lxv]

In the United States the number of people claiming “no religion” has grown from 2.7 percent of the population in 1957 to about 15 percent in recent polls.  But only about 10 percent say they are “neither spiritual nor religious,”[lxvi] and even fewer, about 5 percent of the population, claim to “not believe in God.”  Of this 5 percent, only about a quarter (roughly 1.6 percent of the total population) identify themselves as “atheists,” although some researchers put the number of atheists and agnostics around a minuscule 0.2 percent.[lxvii]  In this very religious country, there remain deep prejudices against professed atheists, although these prejudices are not as violently expressed as they were a century ago.  In 1876 the New York Sun called Robert Green Ingersoll an "outrage" and a "moral pestilence" that should be "exterminate[d]" by "hang[ing] the mortal carrion in chains upon a cross beam."[lxviii]  In 1941 Bertrand Russell was prevented from assuming a teaching post at the College of the City of New York because he was called a "propagandist against both religion and morality," a "philosophical anarchist and moral nihilist," and a "professor of immorality and irreligion."  George V. Harvey went so far as to say that "colleges would either be godly colleges, American colleges, or they would be closed."[lxix]

While this violent type of moral indignation and hatred against atheists is rarely published any more, atheists are still a despised minority.  In a poll conducted in 2003, an average of 41 percent of Americans (and 63 percent of white evangelical Protestants) said that they would not vote for an atheist running for president, even if that person had received their political party’s presidential nomination.[lxx]  An American serviceman had to be sent home early from Iraq because of the threats he received from other soldiers, and even from his commanding officer, after he tried to hold a meeting for “atheists and freethinkers” while serving in a war zone.[lxxi]  Atheists remain a very small minority; however, a number of new atheists have sketched out a platform to promote non-belief and “atheist pride.”[lxxii]  These new atheists, raising an ideology of "militant modern atheism,"[lxxiii] have delivered some bold attacks against religion and fanaticism, while defending the intellectual and moral merits of atheism.  Remarkably, many of these new atheists have also acknowledged the legitimacy of certain conceptions of “God” and spiritual practice, leaving open the possibility of a new form of religious humanism. 

Christopher Hitchens claims that because of the continuing “fanaticism and tyranny” of religious people, atheism has “moral superiority” because it allows people to use reason independently of “dogma” and form more logical and moral solutions to the world’s problems.[lxxiv]  The neuroscientist Sam Harris agrees with Hitchens.  He argues that religious people who hold beliefs with “no evidence” are “mad,” “psychotic,” “delusional” and show signs of “mental illness.”  This makes religious people very “dangerous” because they are “ignorant” of reality and prone to violence.[lxxv]  Harris claims that the major enemy of the 21st century is not any specific religious tradition or person; the real enemy is “faith itself” – the “evil of religious faith.”[lxxvi]  Thus, without faith, humans can take “a rational approach to ethics” and logically solve the world’s problems.[lxxvii]  But Harris does not deny that people have “spiritual experience[s],” nor does he find anything wrong with “meditation” or “mysticism,” as long as it is a “rational enterprise” grounded on sound principles and evidence.[lxxviii]

Richard Dawkins explains that an atheist is simply a “philosophical naturalist” who “believes there is nothing beyond the natural, physical world, no supernatural creative intelligence lurking behind the observable universe, no soul that outlasts the body and no miracles – except in the sense of natural phenomena that we don’t yet understand.”[lxxix]  But like Harris, Dawkins is open to some forms of spirituality.  He does not deny the possibility of Spinoza’s “God,” Einstein’s “God,” or the other naturalistic definitions of “God” made by “enlightened scientists.”[lxxx]  The notion of God just becomes a metaphor for the wonders of the natural universe.  Instead of traditional notions of “God,” Dawkins prescribes “a good dose of science” and the “honest and systematic endeavor to find out the truth about the real world.”[lxxxi]

But traditional theists refuse to give up much ground to scientific developments or to the claims of new atheists.  Recent responses include John F. Haught’s God and the New Atheism: A Critical Response to Dawkins, Harris, and Hitchens (2007) and Karen Armstrong’s The Case for God (2009).  Even some self-professed atheists acknowledge that religion will always have the upper hand.  Robert Sapolsky, a neuroscientist at Stanford University, argues that while “science is the best explanatory system that we have,” that doesn’t mean “that it can explain everything, or that it can vanquish the unknowable.”[lxxxii]  There are many perennial questions about the human condition that science cannot completely answer or remediate, like how to find "meaning" in life, how to find a satisfying identity, and how to deal with death.  Questions such as these, Christopher R. Beha has pointed out, "won't simply go away," and the new atheists have not formulated any convincing answers.  Beha laments, "One is mostly left noticing how much scientism takes from us, and how little it offers in return...For most of us, 'Get over it!' doesn't qualify as a satisfactory response to the 'persistent questions.'"[lxxxiii]  Because of this philosophical gap, most still recognize that religious belief offers "something that science does not.”  Sapolsky calls it “ecstasy,”[lxxxiv] but one could also add emotional comfort, meaningful identity, and motivational purpose. 

Thus, in light of the entrenched positions of both secularism and religion, and the growing conflict between these two worldviews, some people have asked if there can be common ground, rather than more culture wars.  Robert D. Putnam and David E. Campbell believe that Americans have "a high degree of tolerance" because the diversity of American religion "produces a jumble of relationships among people," often within families, and people learn to live "peacefully" with religious conflict.[lxxxv]  Rather than just co-exist, many scholars and intellectuals want to find ways to bridge this conflict.  The Georgetown University theologian John F. Haught has proposed a “conversation” between science and religion, as has the philosopher Daniel C. Dennett.[lxxxvi] 

There have been some notable debates between science and theology over the subject of the historical Jesus.[lxxxvii]  But there have been few real conversations or dialogues between people with secular and religious views, and most of these have been academic exchanges with limited appeal to the broader public.  One of the most remarkable academic debates was between William Lane Craig and Quentin Smith in Theism, Atheism, and Big Bang Cosmology (1993).  The two philosophers debated back and forth the significance of the Big Bang from their opposing religious and secular viewpoints.  Another, more comprehensive example is the edited collection Contemporary Debates in the Philosophy of Religion (2004).  In a less formally academic exchange, the atheist Christopher Hitchens has publicly debated several religious believers, including William Lane Craig.[lxxxviii]  Sadly, one does not find more of this kind of exchange, especially geared toward engaging the broader public.[lxxxix]

But not everyone agrees that such conversations are possible, let alone that common ground can be found.  James Davison Hunter argued that culture wars cannot be resolved and conversations cannot take place: "Is it not impossible to speak to someone who does not share the same moral language?  Gesture, maybe; pantomime, possible.  But the kind of communication that builds on mutual understanding of opposing and contradictory claims on the world?  That would seem impossible."[xc]  The philosopher Richard Rorty argues that dialogue won't work because religion is often a "conversation-stopper" for most people, especially for those believers who hold onto absolutes that cannot be compromised.[xci]

If the past 5,000 years of human history tells us anything, it is that the human condition points toward a continued need for religious belief; thus, there will probably never be a fully secular world.  The power of cultural tradition is too great, the legitimacy of existing institutions is too strong, the quality of public education is too impoverished, and the impact of forward-looking personalities is too small.  The majority of human beings will never lose their religion, nor will they adopt a secular, scientific worldview.  Secularism will always be embattled, and religious differences will sometimes still lead to violence.  The world of diverse, conflicting cultures and viewpoints that we have inherited from our ancestors will be passed along to our children. 

As the philosopher Isaiah Berlin once explained, “Surely it is not necessary to dramatize these simple truths, which are by now, if anything, too familiar, in order to remember that the purposes, the ultimate ends of life, pursued by men are many, even within one culture and generation; that some of these come into conflict, and lead to clashes between societies, parties, individuals, and not least within individuals themselves; and furthermore that the ends of one age and country differ widely from those of other times and other outlooks.”  But just because we live in a diverse and conflicting world, it does not “preclude us from sharing common assumptions, sufficient for some communication with [others], for some degree of understanding and being understood.  This common ground is what is correctly called objective…Where there is no choice there is no anxiety; and a happy release from responsibility.  Some human beings have always preferred the peace of imprisonment, a contented security, a sense of having at last found one’s proper place in the cosmos, to the painful conflicts and perplexities of the disordered freedom of the world beyond the walls.”[xcii]

The hope of humanity lies within our ability to practice "old-fashioned toleration."[xciii]  Essentially this means finding ways to communicate our differences, seeking to understand others and be understood in turn, and, wherever possible, finding common ground on which to cohabitate in a shared world.  Even Pope John Paul II recognized this hope: "Our respect for the culture of others is...rooted in our respect for each community's attempt to answer the question of human life...every culture has something to teach us about...that complex truth.  Thus the 'difference' which some find so threatening can, through respectful dialogue, become the source of a deeper understanding of the mystery of human existence."[xciv] 

From a secular perspective, John Gray has also made the same basic point: "Oddly enough, we will find that it is by tolerating our differences that we come to discover how much we have in common."[xcv]  Engaging in respectful dialogue is no doubt a difficult and dangerous endeavor, but it is the responsibility of all those who claim the ethical courage to build a better world.  “We can do no better,” Daniel C. Dennett argues, “than to sit down and reason together, a political process of mutual persuasion and education that we can try to conduct in good faith.”[xcvi]  But in order to do so, argues Charles Taylor, “Both sides need a good dose of humility, that is, realism.  If the encounter between faith and humanism is carried through in this spirit, we find that both sides are fragilized; and the issue is rather reshaped in a new form: not who has the final decisive argument in its armory…Rather, it appears as a matter of who can respond most profoundly and convincingly to what are ultimately commonly felt dilemmas.”[xcvii]  But even when solutions are found to our common problems, “We don’t just decide once and for all…it is only a continuing open exchange.”[xcviii]

There is hope that such public dialogue is possible.  There have been several moments in human history when diverse viewpoints were peacefully exchanged in public debate.  The ancient Greeks provided an open forum for believers and skeptics to openly discuss their ideas, although the death of Socrates marked the limits of permissible speech.[xcix]  In the third century BCE, the Indian Emperor Ashoka instituted “Buddhist councils” that created an open arena for diverse parties to argue over religious principles and practices.[c]  In the late 16th century the Muslim Mughal Emperor Akbar the Great, ruling over Hindus, Muslims, Christians, Parsees, Jains, Jews, and atheists in the Indian sub-continent, instituted state religious neutrality and public dialogue between representatives of the different faiths.[ci] 

Mongke Khan, the ruler of the vast Mongolian empire in the 13th century, not only supported Genghis Khan’s initial policy of religious neutrality and tolerance of all faiths, but he also instituted public debates over religion.  Individuals would try to refute opposing religious doctrines or practices in front of three judges: a Christian, a Muslim, and a Buddhist.  Contestants could only use rhetoric and logic to persuade the judges.  Common ground was rarely reached and “no side seemed to convince the other of anything,” but each debate ended, as most Mongol celebrations did, with an excess of alcohol, merriment, and “everyone simply too drunk to continue.”[cii]

The philosophes of the European Enlightenment used the ancient philosophical method of dialogue to investigate religious dogma and political traditions.  Gottfried Wilhelm Freiherr von Leibniz tried to orchestrate an ecumenical conference to bring together the religious factions of Christendom that had been warring since the Thirty Years' War.  In the 1690s he exchanged many letters with the Roman Catholic Bishop Bossuet toward this end, until it became clear that the Bishop was only trying to convert the heretic Leibniz.[ciii]  David Hume used the method of dialogue in his landmark book, Dialogues Concerning Natural Religion (1779), published after his death and without attribution because of the radical nature of dialogue during a time of religious and political absolutism.  Gotthold Ephraim Lessing's play Nathan the Wise (1779) focused on the dialogue between a Jew, a Muslim, and a Christian.  In it he tried to preach tolerance for different faiths and viewpoints, ending with the maxim "Little children, love one another."[civ]  Peter Gay explained the power of this method, the philosophes' favorite philosophical tool: "The philosophes, for their part, could exploit the potentialities of dialogue fully, to propound the most outrageous hypotheses for the sake, not of refutation, but of serious consideration, to dramatize the constructive role of criticism, to record their own education, their struggles and uncertainties, and by recording them, educate their readers."[cv] 

Political discourse in the United States of America is another example of the institutionalization of dialogue, public debate, and the peaceful exchange of ideas.  However, it took several hundred years for this rhetorical arena to open up to all peoples in this country.  For most of this country’s history, not everyone was free to debate, and at several key junctures heated debates have erupted into violent conflict, and even war.  But more and more minorities have stood up to be recognized and have articulated their need for equal rights.  The idea of American democracy remains to this day an unsettled and contested ideological terrain – the contours of which remain divisive and ever changing.[cvi]  James A. Banks has argued that a major problem facing modern, multicultural nations is “how to recognize and legitimize difference and yet construct an overarching national identity that incorporates the voices, experiences, and hopes of the diverse groups that compose it.”[cvii]  I think Gerald Graff has offered the only lasting solution.  He argued that educators should show students that “culture itself is a debate” and, thereby, “teach the conflicts” that define our American culture both past and present: “Acknowledging that culture is a debate rather than a monologue does not prevent us from energetically fighting for the truth of our own convictions. On the contrary, when truth is disputed, we can seek it only by entering the debate.”[cviii] 

In many ways the ideal of democracy can be seen as a peaceful yet heated discussion conducted by diverse human beings with different viewpoints, trying to convince each other with words of the best way to organize a society.  President Barack Obama has described American democracy “not as a house to be built, but as a conversation to be had.”  Obama argues that Americans need to join the “conversation” of America, which is “a ‘deliberative democracy’ in which all citizens are required to engage in a process of testing their ideas against an external reality, persuading others of their point of view, and building shifting alliances of consent.”[cix]  The philosopher Amy Gutmann has praised the “virtue” of deliberation, by which important questions facing society are discussed and argued peacefully amongst the equal participants of a democratic nation.  Often the most important questions cannot be completely answered, nor can agreement always be found, but Gutmann stressed, “We can do better to try instead to find the fairest ways for reconciling our disagreements, and for enriching our collective life by democratically debating them.”[cx]

The focus of education in a diverse culture should be on dialogue, the sustained attempt to peacefully discuss ideas while establishing mutual bonds of goodwill, friendship, and a common humanity.  This is a difficult and potentially dangerous task, but it is worth the risk.[cxi]  There should also be a focus on reasoned argument and clear thinking.  It is important to get beyond what Amartya Sen calls "disengaged toleration."[cxii]  Thus, beliefs and claims are questioned so as to investigate the truth, yet the old notion of a singular truth must be discarded since one's idea of truth is most likely embedded in a complex nexus of values and priorities.[cxiii]  The purpose of this book is not necessarily to convince anyone with a “final decisive argument.”  Rather, this book is about sharing reasoned arguments in an open, friendly environment, seeking to be heard, respected, and understood, while also searching for “commonly felt dilemmas,” mutual interests, and shared values.[cxiv]  The focus of education should be on the public dialogue between diverse participants, who have all come together in a mutual project to not only share divergent ideas, but ultimately, to engage our differences, tolerate diversity, and peacefully share an ever unfolding and changing world.[cxv] 


Endnotes

[i] Peter Gay, The Enlightenment: The Rise of Modern Paganism (New York, 1995), 92.

[ii] Jennifer Michael Hecht, Doubt: A History (New York, 2003); Amartya Sen, The Argumentative Indian (New York, 2005), 21-30.

[iii] I. Bernard Cohen, Revolution in Science (Cambridge, MA, 1985), 135, 142.  See also Thomas S. Kuhn, The Structure of Scientific Revolutions (Chicago, 1962).

[iv] Alfred North Whitehead, Science and the Modern World (Cambridge, UK, 1926), 17; Walter J. Ong, Ramus, Method, and the Decay of Dialogue (Chicago, 2004), 165.

[v] Quoted by Harris E. Starr, William Graham Sumner (New York, 1925), 167-168.  See also Richard Hofstadter, Social Darwinism in American Thought (New York, 1955), 55.

[vi] Whitehead, Science and the Modern World, 17; Ong, Ramus, Method, and the Decay of Dialogue, 165.

[vii] American Social Scientific Association, Constitution (27 December 1865), cited by Thomas L. Haskell, The Emergence of Professional Social Science (Urbana, IL, 1977), 111-112.

[viii] Timothy Fitzgerald, The Ideology of Religious Studies (Oxford, 2000); Jacob Pandian, “The Dangerous Quest for Cooperation between Science and Religion,” Science and Religion: Are They Compatible? (Amherst, 2003); Charles Taylor, A Secular Age (Cambridge, MA, 2007).

[ix] Taylor, A Secular Age, 54-55.

[x] Fitzgerald, The Ideology of Religious Studies; Pandian, “The Dangerous Quest for Cooperation between Science and Religion,” 165.

[xi] William T. Cavanaugh, “Sins of Omission: What ‘Religion and Violence’ Arguments Ignore,” Religion and Violence, The Hedgehog Review 6, no. 1 (Spring 2004): 37.  See also, Fitzgerald, The Ideology of Religious Studies, 3-32.  Jack Goody argues that "religion" is not so much a "western" invention as it is a more general product of literate societies.  See The Logic of Writing and the Organization of Society (Cambridge, UK: Cambridge University Press, 1986), 4-5.

[xii] Taylor, A Secular Age, 270.

[xiii] Taylor, A Secular Age, 275.  See also Berlin, “The Apotheosis of the Romantic Will,” 555-559; “Herder and the Enlightenment,” 426; “The Divorce Between the Sciences and the Humanities,” 326-28; “The Originality of Machiavelli,” 312-313; “The Counter-Enlightenment,” 245-46; “The Pursuit of the Ideal,” 5.

[xiv] Qtd. in Peter Gay, The Party of Humanity: Essays in the French Enlightenment (New York, 1971), 26.

[xv] Peter Gay, The Enlightenment: The Rise of Modern Paganism (New York, 1995), 374-377.

[xvi] Voltaire, Letter, August 18, 1756, qtd. in Peter Gay, The Enlightenment, 68.

[xvii] Peter Gay, The Enlightenment, 148-49.

[xviii] Peter Gay, The Enlightenment, 338; Taylor, A Secular Age, 294; Berlin, “The Sciences and the Humanities,” 330.

[xix] Gay, The Party of Humanity: Essays in the French Enlightenment, 124.

[xx] Mill qtd. in Richard Reeves, John Stuart Mill: Victorian Firebrand (London, 2007), 163-66.

[xxi] Thomas Jefferson, “Letter to Danbury Baptist Association,” Jan. 1, 1802.  The Supreme Court did not legitimate this notion until 1878.  For a history of this letter see James Hutson, “A Wall of Separation: FBI Helps Restore Jefferson’s Obliterated Draft,” Information Bulletin, Library of Congress, 57, no. 2 (June 1998) <www.loc.gov/loc/lcib/9806/danbury.html>

[xxii] Thomas Bender, "Politics, Intellect, and the American University, 1945-1995," American Academic Culture in Transformation, Thomas Bender and Carl E. Schorske, eds. (Princeton, 1997), 28.

[xxiii] Taylor, A Secular Age.

[xxiv] Rajeev Bhargava, "Does Religious Pluralism Require Secularism?" Hedgehog Review 12, no. 3 (Fall 2010); Charles Taylor, "The Meaning of Secularism," Hedgehog Review 12, no. 3 (Fall 2010); Craig Calhoun, "Rethinking Secularism," Hedgehog Review 12, no. 3 (Fall 2010).

[xxv] Jose Casanova, “Rethinking Secularization: A Global Comparative Perspective,” After Secularization, The Hedgehog Review 8, no. 1-2 (Spring & Summer 2006); Paul Heelas, “Challenging Secularization Theory: The Growth of ‘New Age’ Spiritualities,” After Secularization, The Hedgehog Review 8, no. 1-2 (Spring & Summer 2006); Daniele Hervieu-Leger, “In Search of Certainties: The Paradoxes of Religiosity in Societies of High Modernity,” After Secularization, The Hedgehog Review 8, no. 1-2 (Spring & Summer 2006).

[xxvi] Raymond Aron, "The Future of Secular Religions" and "From Marxism to Stalinism," in The Dawn of Universal History: Selected Essays from a Witness of the Twentieth Century, trans. Barbara Bray (New York, 2002), 177-201, 203; Raymond Aron, The Opium of the Intellectuals (New York, 1962); John Gray, Straw Dogs: Thoughts on Humans and Other Animals (New York, 2003), xiii, 4; John Gray, Black Mass: Apocalyptic Religion and the Death of Utopia (New York, 2007).

[xxvii] John Gray, Black Mass: Apocalyptic Religion and the Death of Utopia (New York, 2007), 1-3.

[xxviii] Taylor, "The Meaning of Secularism," 26.

[xxix] Kendrick Frazier, “Are Science and Religion Conflicting or Complementary?” Science and Religion: Are They Compatible? (Amherst, 2003), 26-27; Robert D. Putnam and David E. Campbell, American Grace: How Religion Divides and Unites Us (New York, 2010).

[xxx] Neil deGrasse Tyson, “Holy Wars: An Astrophysicist Ponders the God Question,” Science and Religion: Are They Compatible? (Amherst, 2003), 77; Eugenie C. Scott, “The ‘Science and Religion Movement’: An Opportunity for Improved Public Understanding of Science?” Science and Religion: Are They Compatible? (Amherst, 2003), 112.

[xxxi] Taylor, A Secular Age, 727.

[xxxii] Putnam and Campbell, American Grace: How Religion Divides and Unites Us, 33.

[xxxiii] Martin E. Marty, “Our Religio-Secular World,” Daedalus 132, no. 3 (Summer 2003): 42, 47.

[xxxiv] Taylor, A Secular Age, 636.

[xxxv] Actually, the conflict between secularism and religion extends even beyond the existence of these concepts.  For at least the past three thousand years, alongside religious belief in various cultures, doubts about these religious beliefs have co-existed.  See Hecht, Doubt: A History.

[xxxvi] Mario Biagioli, Galileo, Courtier: The Practice of Science in the Culture of Absolutism (Chicago, 1993).

[xxxvii] Hecht, Doubt: A History.

[xxxviii] Berlin, “The Counter-Enlightenment,” 263.  On the enlightenment see Peter Gay, The Enlightenment: The Rise of Modern Paganism (New York, 1966); The Enlightenment: The Science of Freedom (New York, 1969). See also Berlin “The Apotheosis of the Romantic Will;” “Herder and the Enlightenment;” “The Divorce Between the Sciences and the Humanities;” “The Originality of Machiavelli.”

[xxxix] Berlin, “The Counter-Enlightenment,” 250-51; “The Sciences and the Humanities,” 328.

[xl] Johann Gottfried Herder, Auch eine Philosophie der Geschichte zur Bildung der Menschheit (Another Philosophy of History on the Development of Mankind), in Berlin, “Herder and the Enlightenment,” 408.  Herder also questioned the Euro-centric bias of enlightenment philosophers (416).

[xli] James Davison Hunter, Culture Wars: The Struggle to Define America (New York, 1991); Peter L. Berger, ed., The Limits of Social Cohesion: Conflict and Mediation in Pluralist Societies (Boulder, CO, 1998).

[xlii] On this debate in England see Frank M. Turner, Between Science and Religion: The Reaction to Scientific Naturalism in Late Victorian England (New Haven, 1974); Peter J. Bowler, Reconciling Science and Religion: The Debate in Early Twentieth Century Britain (Chicago, 2001); Susan Budd, Varieties of Unbelief: Atheists and Agnostics in English Society, 1850-1960 (London, 1977).

[xliii] Although these "culture wars" have not always been focused on religious tenets.  In the U.S., for example, during the 1910s-20s there was a culture war based on competing versions of "Americanism" in response to rising tides of immigration, colonial wars, and World War I.  In the 1940s-1950s there was another culture war based on political ideologies (communism vs. Americanism) and economic orders (state socialism vs. free-market capitalism).

[xliv] Ronald W. Clark, Einstein: The Life and Times (New York, 1971), 413.  But Einstein did not believe in the “conventional” definition of religion in terms of a personal God with supernatural powers who acts in history.  Einstein said, “The idea of a personal God is quite alien to me and seems even naïve.”  See Dawkins, The God Delusion, 15.

[xlv] Stephen Jay Gould, “Nonoverlapping Magisteria,” Science and Religion: Are They Compatible? (Amherst, 2003), 193.

[xlvi] Fitzgerald, The Ideology of Religious Studies, 34-36.

[xlvii] Fitzgerald, The Ideology of Religious Studies, 36-37.

[xlviii] Fitzgerald, The Ideology of Religious Studies, 41.

[xlix] Mircea Eliade, The Sacred and the Profane: The Nature of Religion (New York, 1987), 16, 28, 100, 166, 203.

[l] Fitzgerald, The Ideology of Religious Studies.

[li] Max Weber, The Sociology of Religion, trans. Ephraim Fischoff (Boston, 1963), 1.

[lii] Emile Durkheim, The Elementary Forms of the Religious Life (New York, 1965).

[liii] Loyal Rue, Religion Is Not About God: How Spiritual Traditions Nurture Our Biological Nature (New Brunswick, 2005), 147-148.

[liv] Pascal Boyer, Religion Explained: The Evolutionary Origins of Religious Thought (New York, 2001), 12.

[lv] Fitzgerald, The Ideology of Religious Studies.

[lvi] Rue, Religion Is Not About God, 3.

[lvii] Boyer, Religion Explained, 28, 311.  See also Daniel C. Dennett’s Darwin’s Dangerous Idea: Evolution and the Meanings of Life (New York, 1995) and Breaking the Spell: Religion as a Natural Phenomenon (New York, 2006).

[lviii] Christopher Hitchens, “No, But It Should,” Does Science Make Belief in God Obsolete? John Templeton Foundation (West Conshohocken, PA, n.d.), 25.  See also Hitchens, God Is Not Great: How Religion Poisons Everything (New York, 2007).

[lix] Steven Pinker, “Yes, If By,” Does Science Make Belief in God Obsolete? John Templeton Foundation (West Conshohocken, PA, n.d.), 4.

[lx] Victor J. Stenger, “Yes,” Does Science Make Belief in God Obsolete? John Templeton Foundation (West Conshohocken, PA, n.d.), 32.  See also Stenger, God: The Failed Hypothesis – How Science Shows That God Does Not Exist (Amherst, NY, 2007).

[lxi] Richard Dawkins, The God Delusion (New York, 2006), 31.

[lxii] Victor J. Stenger, The New Atheism: Taking a Stand for Science and Reason (Amherst, NY, 2009).  For a video discussion between some of the leading “new atheists” see: Richard Dawkins, Daniel C. Dennett, Sam Harris, and Christopher Hitchens, The Four Horsemen: Episode 1 (2008).

[lxiii] Percy Bysshe Shelley, The Necessity of Atheism and Other Essays (New York, 1993), 31, 35.

[lxiv] Qtd. in Susan Jacoby, Freethinkers: A History of American Secularism (New York, 2004), 169.

[lxv] Bertrand Russell, Why I am Not a Christian and Other Essays on Religion and Related Subjects (New York, 1957), v, 193, 47.

[lxvi] Christopher McKnight Nichols, “The ‘New’ No Religionists: A Historical Approach To Why Their Numbers Are on the Rise,” Culture, Institute for Advanced Studies in Culture (Fall 2009), 13-14.

[lxvii] “Not All Unbelievers Call Themselves Atheists,” U.S. Religious Landscape Survey, The Pew Forum on Religion and Public Life (April 2 2009) <http://pewforum.org/Not-All-Nonbelievers-Call-Themselves-Atheists.aspx>; Robert D. Putnam and David E. Campbell, American Grace: How Religion Divides and Unites Us (New York, 2010), 104.

[lxviii] Qtd. in Susan Jacoby, Freethinkers: A History of American Secularism (New York, 2004), 167.

[lxix] Paul Edwards, "How Bertrand Russell was Prevented from Teaching at the College of the City of New York," in Bertrand Russell, Why I am Not a Christian and Other Essays on Religion and Related Subjects (New York, 1957), 209-13.

[lxx] “Religion and Politics: Contention and Consensus (Part II),” The Pew Forum on Religion and Public Life (July 24, 2003) <http://pewforum.org/PublicationPage.aspx?id=621#1>

[lxxi] “Soldier Sues Army, Saying His Atheism Led to Threats,” The Pew Forum on Religion and Public Life (April 26, 2008) <http://pewforum.org/Religion-News/Soldier-Sues-Army-Saying-His-Atheism-Led-to-Threats.aspx>

[lxxii] Dawkins, The God Delusion, 3.

[lxxiii] Philip Kitcher, "Militant Modern Atheism," Journal of Applied Philosophy 28, no. 1 (2011): 1-13.

[lxxiv] Christopher Hitchens, “The Future of an Illusion,” Daedalus 132, no. 3 (Summer 2003): 83-87.

[lxxv] Sam Harris, The End of Faith: Religion, Terror, and the Future of Reason (New York, 2004), 72-77, 83.

[lxxvi] Harris, The End of Faith, 130-131.

[lxxvii] Harris, The End of Faith, 170.

[lxxviii] Harris, The End of Faith, 217, 221.

[lxxix] Dawkins, The God Delusion, 14.

[lxxx] Dawkins, The God Delusion, 20.

[lxxxi] Dawkins, The God Delusion, 361.

[lxxxii] Robert Sapolsky, “No,” Does Science Make Belief in God Obsolete? John Templeton Foundation (West Conshohocken, PA, n.d.), 20-22.

[lxxxiii] Christopher R. Beha, "Reason for Living: The Good Life without God," Harper's Magazine (July 2012), 73, 75.

[lxxxiv] Sapolsky, “No,” Does Science Make Belief in God Obsolete?, 20-22.

[lxxxv] Robert D. Putnam and David E. Campbell, American Grace: How Religion Divides and Unites Us (New York, 2010), 4-5.

[lxxxvi] John F. Haught, Science and Religion: From Conflict to Conversation (Mahwah, NJ, 1995); Dennett, Breaking the Spell, 14.

[lxxxvii] Marcus J. Borg and N. T. Wright, The Meaning of Jesus: Two Visions (New York, 1999); Dale C. Allison, Marcus J. Borg, John Dominic Crossan, and Stephen J. Patterson, The Apocalyptic Jesus: A Debate, ed. Robert J. Miller (Santa Rosa, CA, 2001).

[lxxxviii] Christopher Hitchens and William Lane Craig, Does God Exist? (La Mirada Films, 2009); Christopher Hitchens and Douglas Wilson, Collision: Is Christianity Good for the World? (Level 4, 2009); Christopher Hitchens and Dinesh D’Souza, God On Trial: A Debate on the Existence of God (Fixed Point Foundation, 2008). 

[lxxxix] See for example: Deepak Chopra and Leonard Mlodinow, War of the Worldviews: Science vs. Spirituality (New York, 2011); William Lane Craig and Quentin Smith, Theism, Atheism, and Big Bang Cosmology (Oxford, UK, 1993); Michael L. Peterson and Raymond J. Vanarragon, Contemporary Debates in Philosophy of Religion (Oxford, UK, 2004).

[xc] James Davison Hunter, Before the Shooting Begins: Searching for Democracy in America's Culture War (New York, 1994), 8.

[xci] Richard Rorty, "Religion as Conversation-Stopper," Common Knowledge 3 (Spring 1994): 1-6.

[xcii] Berlin, “Does Political Theory Still Exist?”, 88; “Historical Inevitability,” 176-77, 185.

[xciii] John Gray, Enlightenment's Wake (New York, 2009), 27, 44-45.  Gray defines toleration as "an acceptance of the imperfectability of human beings...Since we cannot be perfect, and since virtue cannot be forced on people but is rather a habit of life they must themselves strive to acquire, we [are] enjoined to tolerate the shortcomings of others, even as we struggle with our own" (27).

[xciv] Pope John Paul II, Address to the United Nations General Assembly (Oct 5, 1995), qtd. in Martha C. Nussbaum, Cultivating Humanity: A Classical Defense of Reform in Liberal Education (Cambridge, MA, 1997), 259.

[xcv] Gray, Enlightenment's Wake, 45.

[xcvi] Dennett, Breaking the Spell, 14.

[xcvii] Taylor, A Secular Age, 675.

[xcviii] Taylor, A Secular Age, 428.

[xcix] Anthony Gottlieb, The Dream of Reason: A History of Philosophy from the Greeks to the Renaissance (New York, 2001).

[c] Amartya Sen, The Argumentative Indian (New York, 2005), 15.

[ci] Sen, The Argumentative Indian, 18.

[cii] Jack Weatherford, Genghis Khan and the Making of the Modern World (New York, 2004), 172-73.

[ciii] Stephen Toulmin, Return to Reason (Cambridge, MA, 2001), 71.

[civ] Lessing, Nathan the Wise, qtd. in Peter Gay, The Enlightenment, 334.

[cv] Peter Gay, The Enlightenment, 172, 176-77.

[cvi] Specifically, I am referring to the debates over the “culture war” of the last three decades, which reflect a heated disagreement over the very notions of American national and cultural identity. A very short list of this debate might include the following: Allan Bloom, The Closing of the American Mind; Arthur M. Schlesinger, Jr., The Disuniting of America: Reflections on a Multicultural Society; James Davison Hunter, Culture Wars: The Struggle to Define America; Todd Gitlin, The Twilight of Common Dreams: Why America Is Wracked by Culture Wars; Michael Lind, The Next American Nation: The New Nationalism and the Fourth American Revolution; Lawrence W. Levine, The Opening of the American Mind: Canons, Culture, and History; Michael Kazin and Joseph A. McCartin, eds., Americanism: New Perspectives on the History of an Ideal.

[cvii] James A. Banks, “Diversity, Group Identity, and Citizenship Education in a Global Age.”

[cviii] Gerald Graff, Beyond the Culture Wars: How Teaching the Conflicts Can Revitalize American Education, 8, 12, 15.

[cix] Barack Obama, The Audacity of Hope: Thoughts on Reclaiming the American Dream, 92. For two excellent and very short books on democracy, see: Robert A. Dahl, On Democracy; Robert A. Dahl, On Political Equality.

[cx] Amy Gutmann, Democratic Education, 11-12.

[cxi] Linda Liska Belgrave, Adrienne Celaya, Seyda Aylin Gurses, Angelica Boutwell, Alexandra Fernandez, "Meaning of Political Controversy in the Classroom: A Dialogue Across The Podium," Symbolic Interaction 35, no. 1 (2012), 68–87.

[cxii] Amartya Sen, The Idea of Justice (Cambridge, MA, 2009), x.

[cxiii] Berlin, “Does Political Theory Still Exist?”, 88; “Historical Inevitability,” 176-77, 185; Amartya Sen, The Idea of Justice, x, xix.

[cxiv] Taylor, A Secular Age, 675.

[cxv] Walter Feinberg, “The Idea of a Public Education,” Review of Research in Education 36 (March 2012), 17.

A Nation Divided

Understanding Culture Wars: A Historiography of American Nationalism

originally written 2011

 
At stake is how we as Americans will order our lives together.
— James Davison Hunter
 

America is in the midst of a culture war.  Many see America deeply divided and polarized by ethnicity and race, by moral values, by political parties, by class, by gender, and by a host of other variables.  Public discourse in America ranges from vitriolic partisan denunciations to diplomatic relativism to scholarly argumentation to ignorance and apathy.  Is there a path that will lead beyond this culture war?  In order to address that question, Americans first need to understand the root of the conflict.  Fundamentally, the disagreement is over national identity: What is America, and who is an American?  To understand this fundamental conflict, one must listen to a heated debate and outline the diverse array of answers it has produced.  But in order to outline a schematic of American nationalism, one must understand the origins of the American nation and its complex trajectory through history.  In looking to American history one must ask: Are there antecedents to our current culture war?  Have there been older disagreements over American national identity?

If one examines the historical record, especially outside the boundaries of traditionally defined political authority, dissent and discord pervade American identity.  According to the founding document announcing the birth of the American nation, “The Declaration of Independence,” “all men” were “created equal” and had certain “inalienable rights” given to them by their “creator.”  Among the most important of these rights were “life,” “liberty,” “the pursuit of happiness,” and the right to a responsive representative government that would protect the people’s rights, as well as their “safety and happiness.”  But even before this hallowed political document would be approved by the Continental Congress and announced to the world, the wife of one Congressman, Abigail Adams, wrote to her husband on March 31, 1776, and scolded him and his fellow American congressmen for being hypocritical.  How could these men proclaim “liberty,” inalienable political rights, and the “emancipation of nations” while they were depriving women of their liberty and rights?  She pointed out to her husband that American men did not truly know what liberty or equality meant because their ideas of liberty and equality were only for a privileged, male few.  Abigail warned that women would not take the “tyranny” of men for long; they would rebel, free themselves, “subdue” their masters, and then “without violence throw both your natural and legal authority at our feet.”[i]

And yet the assertive Abigail Adams was only willing to extend her critique so far.  Just a year earlier she had written to her husband about the fearful “conspiracy of the Negroes,” by which she meant those slaves who had the audacity to petition for freedom in return for fighting alongside the English against the insurrectionary colonists.  Abigail apparently could not understand why black slaves wanted their freedom just as much as she did, nor could she understand that these blacks would do whatever they could to attain their liberty – including fighting against the hypocritical Americans (as Abigail herself threatened) whose “liberty” and “equality” were mainly for propertied, white men.[ii]  The black American David Walker would later address the American republic in 1829, “Do you understand your own language?  Hear your language proclaimed to the world on July 4th 1776 – ‘We hold these truths to be self-evident – that ALL men are created EQUAL!!’”  In 1852 Frederick Douglass asked, “What, to the Slave, Is the Fourth of July?” – “This Fourth of July is yours, not mine.”[iii]

At the same time that diverse participants of the American nation were contesting the very meaning of America, there was also a solid tradition of self-assured Americans (ironically, many of them immigrants) trying to consolidate a single, unified vision of America.  Not long after the revolution, propagandists like J. Hector St. John De Crevecoeur praised the “modern” American nation as everything backward Europe was not.  Crevecoeur claimed the original English settlers were “enlightened” as they “discovered,” “settled,” “embellished,” and laid the foundation for what would become America.  He also claimed that this new modern nation was being developed by and for white, Northwest Europeans who were busy creating “a new race of men” – “the American, this new man.”  But to become American these Northwest Europeans (“English, Scotch, Irish, French, Dutch, Germans and Swedes”) had not only to leave behind their old culture, language, and customs, but also to “embrace” the new American government and culture, which just so happened to be a highly Anglicized culture infused with Protestant and capitalist values.[iv]

By 1811 the Anglo-Protestant John Quincy Adams could confidently write to his father, “The whole continent of North America appears to be destined by Divine Providence to be peopled by one nation, speaking one language, professing one general system of religious and political principles.”  Of course the “one nation” that Adams foresaw was a white man’s nation, a Protestant Christian nation, a capitalist nation, and these convictions would lead many white men to proclaim a new self-evident truth.  The Democratic Review in July 1850 announced, “The fact that the dark races are utterly incapable of attaining to that intellectual superiority which marks the white race is too evident to be disputed.”  It was a simple extension of deductive logic to thereby conclude, as did James De Bow in De Bow’s Review in 1854: “The Negro till the end of time will still be a Negro, and the Indian still an Indian.  Cultivation and association with the superior race produce only injury to the inferior one.  Their part in this mysterious world-drama has been played, and, like the Individual, the race must cease to exist.”[v]  But of course this drive for cultural unity, racial purity, and national solidarity as a white man’s nation was contested all the way.  Elizabeth Cady Stanton addressed the New York State Legislature in 1860 and let them know that the “white Saxon man[’s]” ridiculous “prejudice” against “color” and “sex” was not congruent with “The Declaration of Independence.”  She declared sarcastically that “negroes” and women were not “monsters” and thus they too deserved liberty and political rights.  She wanted the nation to remove all the prejudicial legislation against women and blacks and then to “strike the words ‘white male’ from all your constitutions.”[vi]

This essay is an attempt to outline a historical schematic leading up to our late 20th/early 21st century culture war in order to contextualize our current debate within a much larger and older debate over American national identity.  The central focus of this essay is the debate: a longstanding and contested deliberation over national identity and purpose.  This essay will not and cannot bring any resolution to this debate; however, it will try to clarify the basic structure of the debate and place it in historical context.  The basic thesis that this essay will argue and demonstrate is that the democratic nation of America was founded on an irresolvable debate.  It was and is a debate, to quote the historian Joseph J. Ellis, which “was not resolved so much as built into the fabric of our national identity.  If that means the United States is founded on a contradiction, then so be it.”[vii]  The United States of America was consecrated on debate, and its foundational documents, the Constitution, the Bill of Rights, and the Declaration of Independence, were all designed to protect and project that debate into the future – the American nation can be seen as the institutionalization of a heated, contradictory, often ugly, sometimes democratic, yet always deadly serious debate.  Our 21st century culture wars are an important testament to the longstanding tradition that defines and unites the American people: the constitutional imperative to freely speak, debate, and at times fight over[viii] the identity and direction of the American nation.  Issues, parties, perils, crises, and credos come and go, but the debate over our American identity continues to define who we all really are, have been, and will be.  It is our inheritance – both a promise and a curse.  America is dying!  Long live America!

 

Conservative Reaction to 20th Century Liberal Reforms

The conservative reaction to the liberal state, the rights movements of the 1960s, and the general unrest caused by counter-cultural uprisings was varied in temper, scope, and accuracy.  Many right wing polemics expressed only anger, condemnation, and righteous rage.  Some mixed nostalgic fantasies with biased readings of the recent changes initiated in the 1960s.  A few articles and books articulated reasonable claims backed with evidence in an attempt to put forth a scholarly argument for conservative policies.  All conservative reactions were defensive as they implicitly or explicitly tried to uphold a particular conception of a unified and monocultural Anglo-Protestant-based America, which they saw being damaged or destroyed by the legacies of the 1960s.  Despite claims made by many dismissive liberal nationalists,[ix] conservative defenders of a distinctly WASP America abound, and they have become arguably more vocal, more impassioned, and, under the reign of George W. Bush, more empowered.  However, it must be added that conservative arguments or rants for a monocultural America have been extremely repetitious in their uniform allegiance to a mythic golden age of WASP American glory, civic virtue, and harmony.

Allan Bloom’s The Closing of the American Mind (1987) was perhaps one of the most important early salvos of reactionary conservative critics.  Bloom’s “meditation on the state of our souls” was an angry conservative manifesto disguised as a metaphysical treatise on human nature, truth, and the classical virtues of a “liberal” education.  The problem, as Bloom saw it, was a shift in cultural priorities and values, which had infected the classical “liberal” curriculum and was “indoctrinating” students to see “wars, persecutions, slavery, xenophobia, racism, and chauvinism” around every hallowed corner of Western history.  The new “open” curriculum lacked refinement and cultured discrimination because it accepted “all kinds of men, all kinds of life-styles, all ideologies.”  It was nothing like the “old” American curriculum, built on an established liberal arts tradition, which taught refined students to “recognize and accept man’s natural rights” and the “fundamental basis of unity and sameness” that had been recently discarded by divisive liberal cant like “class, race, religion, national origin, or culture.”  Bloom was very concerned that American students, and the country in general, were losing sight of the “natural human good” and the refined ability to “admire it when found” (like the traditional “heroes” of American history).  Bloom thought that a revolution had taken place whereby “minorities” had “assaulted” and “weakened” “the sense of superiority of the dominant majority” (WASPs) in order to destroy the old order and set up a relativistic “nation of minorities and groups each following its own belief and inclinations” instead of following the traditional and objective “common good,” which was disappearing in a wave of relativistic “conformism:” it was the closing of the American mind.[x]

Bloom wanted to remind Americans that “culture is a cave” and every human being is raised within a particular traditionally defined “cave” in order to be inculcated into the “standards” that make us a “culture-being;” however, culture is limiting and keeps humans from the light of “nature” and “truth.”  Western “science,” derived from the ancient Greek search for truth, is the only way to escape the Platonic cave of culture into the wider, permanent truth that is the “rational quest for the good life according to nature.”  The current dogma of cultural relativism teaches “openness” to the “closedness” of cultural caves, which locks students into the ethnocentric bias of cultural fallacies.  According to Bloom, the traditions of Western science and the liberal arts (embedded and preserved in American culture) contain the superior and universal human truths that all “men” need to escape their limited cultural caves in order to gain the eternal and universal truth of the human condition: “The active presence of a tradition in a man’s soul gives him a resource against the ephemeral.”[xi]

The Civil Rights movements of the 1960s had “dismantled” the “structure of rational inquiry” and “ideologized” the student population with “whatever intense passion moved the masses.”  But Bloom warned his audience, “The nation was not ready for great changes.”  The rush for social change only “radicalized” and “politicized” education, and the new heretical cry of “racist” was shouted irrationally from every campus at decent bastions of the old order.  Bloom was quite dismayed and claimed, “so far as universities are concerned, I know of nothing positive coming from [the 1960s]; it was an unmitigated disaster;” it was a “crime.”  The “old core curriculum” was dismantled and replaced by a vapid “egalitarian self-satisfaction” that amounted to “nothing.”  The 1960s was the “source of the collapse of the entire American educational structure” because “the knowledge of philosophy, history and literature” was “trashed,” and replaced with “dogmatic answers and trivial tracts.”

The new dogma was derived from a “new moralism” (actually an older “antimorality”), which put forth the quasi-goods of “modern democratic thought:” “equality, freedom, peace, cosmopolitanism.”  Lost in this democratic vulgarity were traditional social goods, like the “natural differences” of human beings, the “restraints” of liberty, the glories of war, and patriotic “devotion to family or country.”  In fact, this new democratic dogma concealed a “covert elitism” that actively “suppressed” the “superiority” of certain peoples, especially rulers, in order to patronize the “ambition” of average commoners.  It also ignored what Bloom took to be the plain fact that certain races are superior to others.[xii]  Bloom was quite clear in his assertion of American exceptionalism: “America tells one story: the unbroken, ineluctable progress of freedom and equality;” and now “is the American moment in world history…the fate of freedom in the world has devolved upon our regime.” However, based on the cultural and political changes of the 1960s, America’s ability to seize its privileged destiny was in “doubt.”[xiii]

Another important conservative reaction was Cultural Literacy (1988) by E. D. Hirsch, Jr.  Hirsch’s tone was much more subdued and scholarly than Bloom’s, and Hirsch restricted his reaction to the subject of literacy and its central importance to a democracy.  Hirsch admitted that “flux” permeates culture, but that “stability not change” should be the educator’s primary obligation to the young and, thus, “cultural literacy” should be the primary object of education: “the persistent, stable elements belong at the educational core.”  The primary purpose of schooling is to “acculturate” children into “our national life,” which Hirsch assumed to be a “shared culture.”  But later in the book Hirsch asked a telling question, “Shall we aim for the gradual assimilation of all into one national culture, or shall we honor and preserve the diverse cultures implicit in our hyphenations?”  Hirsch was able to admit the legitimacy of the “vocabulary of a pluralistic nation” and say, “American national culture is neither coherent nor monolithic, and no convincing attempt fully to define its character has ever appeared,” and so he argued that the U.S. educational endeavor should be guided by a “value-neutral” “vocabulary.”  Of course this raises the question of whether a “value-neutral” vocabulary or educational project is even possible.  But Hirsch’s call for “neutrality” was disingenuous because he actually intended to promote a “conservative” “means of communication” so as to acculturate students into a “traditional culture.”  He tried to defend his policy by stating, “Traditional information by no means indoctrinates [students] in a conservative point of view,” and that “teaching children national mainstream culture doesn’t mean forcing them to accept its values uncritically.”  However, it is hard to see how the whole educative endeavor under the “primary and fundamental” direction of the “acculturative responsibility” to “teach the ways of one’s own community” can avoid using the soft power of cultural hegemony.  What safeguards do children have within the public schools when bureaucratic or professional functionaries fall back on a rigidly defined national curriculum and simply indoctrinate children so as to satisfy the predominant public good, which Hirsch believed at the time to be meeting “the needs of the wider economy”?[xiv]

While Hirsch is certainly more reasonable and reasoned than Bloom in his conservative arguments for a common culture and nationalist education, his position still boils down to conventional wisdom and traditionalist assumptions, bottoming out on the bedrock of preferring (without explaining or systematically arguing for) one set of values over another:

“Although nationalism may be regrettable in some of its world-wide political effects, a mastery of national culture is essential to mastery of the standard language in every modern nation.  This point is important for educational policy, because educators often stress the virtues of multicultural education.  Such study is indeed valuable in itself; it inculcates tolerance and provides a perspective on our own traditions and values.  But however laudable it is, it should not be the primary focus of national education.  It should not be allowed to supplant or interfere with our schools’ responsibility to ensure our children’s mastery of American literate culture.  The acculturative responsibility of the schools is primary and fundamental.  To teach the ways of one’s own community has always been and still remains the essence of the education of our children.”[xv]

If “American national culture is neither coherent nor monolithic,” as Hirsch noted earlier, then how could he uphold abstract platitudes like “our own traditions and values,” while also admitting that communities have their own ways, which should be taught?  Whose community or interests should be taught, and who in the community should decide?  What part of the diverse community decides the issue?  Hirsch’s arguments seem to advocate a curriculum based on a unified and singular “national” U.S. culture, but the question then arises: what exactly is a national culture, and is it ever a unified collection of clearly defined interests based on the desires of all parties involved?  Whose culture, whose nation, whose values, whose world-view will dominate and declare “our own traditions and values” as the uniform standard?  Hirsch did not address these questions and seemingly took it for granted, as did Bloom, that his national culture is the “common culture,” and thus the only national culture that should be taught.

In “Americanization and the Schools” (1999), E. D. Hirsch argued that Americanization should be a common function of the public schools for all children, immigrant and native alike.[xvi]  He also said that “ethnic identity” does not necessarily have “to be sacrificed in the course of Americanization,” but he did not explain how this can be avoided.  He emphasized that “failure to master the nuanced use of English in speech and writing places a severe limit in the United States on one’s opportunity, and freedom, and the amount of money in one’s purse…Those Americans who lack effective mastery of English, including mastery of the shared background knowledge that enables its nuanced use, are destined to stay poor and alienated from mainstream social and political life.”  Hirsch dismissed charges of “cultural and linguistic imperialism” because he viewed a shared language and culture as a “universal” practice and a “social necessity.”  Everyone needed to be Americanized according to Hirsch: “New citizens and citizens-to-be deserve the same Americanization as other American children.  All American children need to be Americanized in a deeper sense…This system of common knowledge and root attitudes needs to be imparted in school not just to achieve a citizenry competent to rule itself, but also to achieve community, social peace, and, not least, economic justice.”  Hirsch invoked Horace Mann and argued for Americanization through a “common curriculum” that would not be a “narrow, nationalist indoctrination” but a “special universalist sentiment appropriate to a nation of nations:” Patriotism, claimed Hirsch, “implies love of country without implying hostility to the other…American patriotism is built of shared knowledge, attitudes, loyalties, and values – including values of non-exclusion and toleration.”  The “need for a common language is the key to a trans-ethnic future.”  Hirsch attacked the “bilingual movement” and the “multicultural movement” as “education sisters” that articulate a program of “romantic particularism,” which he decried as the “mortal enemy” of “Enlightenment cosmopolitanism.”  He also claimed that these movements have “deepen[ed] the disadvantage” of “unassimilated” children and thus helped “preserve the economic status quo and even widen the gap between rich and poor.”  Hirsch argued that “militant bilingualism and multiculturalism” have made the schools “even more confused and rudderless places than they had already been.”

Another important conservative barrage, and perhaps the most important and significant conservative argument of the 1990s, came from noted liberal historian Arthur M. Schlesinger, Jr. and his bestselling political tract, The Disuniting of America: Reflections on a Multicultural Society (1991, 1998).  Schlesinger made a concession to the liberal camp and argued that “cultural pluralism is a necessity in an ethnically diversified society” such as the U.S.; however, his book was an extended argument for a conservative common culture based on WASP values.  One line of argument invoked Hirsch and explained that a “common language” is an “essential bond of cohesion in so heterogeneous a nation as America.”  The other, more important line of argument focused on the “democratic principles” of America, which he enveloped in a teleological grand narrative: American political history was the “persistent movement” from “exclusion” to “inclusion,” “openness,” and “tolerance.”  However, he did make a nod to critics on the left by admitting that American principles have “too often” been “transgressed in practice” due to Anglo-American “domination” of “culture and politics” and WASP “convictions of racial superiority.”[xvii]

Schlesinger admitted that traditional U.S. history has been “invoked to justify the ruling class” composed of “white Anglo-Saxon Protestant males” who conceptualized American history to serve their own distinct “interests.”  However, Schlesinger did not linger on this point or find it necessary to condemn it.  Instead, he argued that Americans must embrace their past, “for better or worse,” and come to terms with the WASP tradition as the cultural foundation of America.

“The smelting pot thus had, unmistakably and inescapably, an Anglocentric flavor.  For better or worse, the white Anglo-Saxon Protestant tradition was for two centuries – and in crucial respects still is – the dominant influence on American culture and society.  This tradition provided the standard to which other immigrant nationalities were expected to conform, the matrix into which they would be assimilated.”

Schlesinger used this conception of a foundational common culture to explain how it has become more inclusive because of the grand narrative of progress unfolding in U.S. history.  He quoted the conservative historian of education Diane Ravitch, who said, “Paradoxical though it may seem, the United States has a common culture that is multicultural.”  The point he developed was that even though the WASP culture was a dominative and self-seeking culture that forced other peoples to conform to its standard, it was a self-critical culture, a culture defined by democratic principles, and above all else, it was a culture that was willing and able to “forge a single nation from people of remarkably diverse racial, religious, and ethnic origins.”  American culture may be based on the foundation of an older WASP culture, but that WASP culture was able to facilitate “progress” towards a “new national identity:”

“E pluribus unum: one out of many.  The United States had a brilliant solution for the inherent fragility, the inherent combustibility, of a multiethnic society: the creation of a brand-new national identity by individuals who, in forsaking old loyalties and joining to make new lives, melted away ethnic differences – a national identity that absorbs and transcends the diverse ethnicities.”

The “brilliant solution” of the melting pot, which leads to a “new American culture,” was never fully documented or explained by Schlesinger, and his argument became complicated and confused as he admitted that many ethnic groups were skeptical of the melting pot solution, especially considering the centuries of xenophobia, white supremacism, and racism that have only recently been “acknowledge[d] and confront[ed].”  Schlesinger noted how many minority groups throughout American history had to “demand” their political rights through “declarations of ethnic identity,” which gave rise in the 20th century to “ethnic politics” and culminated in the denunciation of melting pot theory as nothing but “a conspiracy to homogenize America.”  Instead of documenting and reconciling this melting-pot debate, however, Schlesinger rushed to a hasty and simple conclusion: a “new American culture” had been produced through the unfolding of historical progress, culminating in the civil rights legislation of the 1960s, but petty “cults” of ethnicity have mushroomed from the 20th century Civil Rights struggle.  These ethnic cults “threaten to become a counter-revolution” and could destroy the hard-earned new national identity.  These ethnic cults must conform to the “common American nationality” because America was, is, and must continue to be “‘one people,’ a common culture, a single nation.”[xviii]

By the presidential election race of 1992, the rhetoric of the culture war was being used by the radical and religious right.  In order to scare up political support, conservative reactionaries took the debate to new levels of aggressiveness.  Patrick J. Buchanan led a campaign for the presidency on the extreme political and religious right, but he eventually came back into the mainstream Republican fold to support George Bush.  At the 1992 Republican National Convention he railed against liberals and the “failed liberalism of the 1960s and 70s” as the arbiters “of doom.”  It was the noble Republican, Ronald Reagan, who returned America to the “Judeo-Christian values and beliefs upon which this nation was built” and who “made us proud to be Americans again.”  Energized by eight years of a powerful Republican administration, religious and social conservatives loudly proclaimed that the socio-political changes of the 1960s and 70s were “not the kind of change America wants.  It is not the kind of change America needs.  And it is not the kind of change we can tolerate in a nation that we still call God’s country.”  Buchanan argued that a “religious war” was being waged in America over “the soul of America:” “It is about who we are.  It is about what we believe.  It is about what we stand for as Americans…It is a cultural war.”  Buchanan called for a new conservative movement that would use “force, rooted in justice, backed by courage” in order to “take back our culture, and take back our country.”[xix]  Two months later, Buchanan expanded on this same theme and delivered another speech, “The Cultural War for the Soul of America” (1992).  Buchanan was indignant over charges of his “divisive,” “hateful,” and “racist” speeches, and he thundered, “As polarized as we have ever been, we Americans are locked in a cultural war for the soul of our country.”  He quoted a newspaper columnist and explained, “It is about power; it is about who determines ‘the norms by which we live, and by which we define and govern ourselves.’ Who decides what is right and wrong…Whose beliefs shall form the basis of law?”  Buchanan argued that “our beliefs” grounded in “the Old and New Testament” and “natural law and tradition” were at war with a “destructive, degenerate, ugly, pornographic, Marxist, anti-American ideology.”  The battle was over “family, faith, friends, and country.  For the ashes of their fathers and the temples of their Gods.”  And the battle was now “raging in our public schools” and in the teaching of history.  Buchanan claimed, “If a country forgets where it came from, how will its people know who they are?...The battle over our schools is part of the war to separate…all Americans from their heritage.”[xx]

Perhaps the most comprehensive articulation of conservative criticisms and concerns over American identity is Samuel P. Huntington’s Who Are We? The Challenges to America’s National Identity (2004).[xxi]  Huntington’s fundamental premise, on which the whole book rests, is that America (“We”) is “different” and “distinct” from other nations (“thems”),[xxii] which leads to a tenuous inductive argument: America’s cultural difference makes it notably superior because it has produced the most powerful nation on the planet, and that difference is due to a “distinct” Anglo-Protestant culture and its “religiosity.”  This inductive argument is repeated in numerous forms throughout the book, but it is never proved through scholarly argument and substantial evidence; it is rather assumed to be true via faulty claims, historical inaccuracies, and topically referenced, highly selective, and superficially engaged reviews of the scholarly and polemical literature.[xxiii]  Huntington’s core claim is that America’s Anglo-Protestant culture is alone responsible for all things distinctly American: the English language, Christianity, religious commitment, republican concepts (the rule of law, the responsibility of rulers, the rights of individuals, individualism, and the work ethic), and the American creed of equality and freedom.  Thus, the Anglo-Protestant culture must be preserved against the rising tides of multiculturalism (“ethnic separatism” and “reverse racism”) and Hispanic immigration (“Hispanization”), or America will dissipate and “transform” into “a country of two languages, two cultures, and two peoples.”  And even though the Anglo-Protestant culture has traditionally been a homogeneous, “overwhelmingly white,” and white supremacist culture (a fact that Huntington does admit in subdued tones in several places), Huntington argues that “the importance of Anglo-Protestant culture” as the foundational cultural identity of Americans does not mean that America is only open to “Anglo-Protestant people.”  But he is quite clear that ethnic minorities must become Americans on Anglo-Protestant terms (Americanization) or else they are a corrosive threat to a unified national identity: “There is no Americano dream.  There is only the American dream created by an Anglo-Protestant society.  Mexican-Americans will share in that dream and in that society only if they dream in English” [my emphasis].[xxiv]

Huntington explained quite clearly in his Foreword that he was motivated by his “own identities as a patriot and a scholar” (it is very significant which identity he named first), and he acknowledged that “the motives of patriotism and of scholarship” could very easily “conflict” with each other.  He claimed that his scholarship is “detached and thorough” and that it is based upon “an analysis of the evidence,” and yet he did admit, “My selection and presentation of that evidence may well be influenced by my patriotic desire to find meaning and virtue in America’s past and in its possible future.”  Throughout the book Huntington engages in the rhetorical fallacy of reifying a nationally unified, distinct, and unambiguously clear “We,” which is the voice of the “majority,” the American “public.”  He unproblematically speaks for the American people (“We Americans”) and claims the America “most Americans love and want” is the exact same as the America “I know and love,” which in turn is the exact opposite of the divisive “cults of multiculturalism” with their Anti-American (“left-wing, socialist, working-class”) vision.[xxv]  But Huntington reveals evidence that his position may not be representative of all Americans. 

In fact, the views and arguments put forth in Huntington’s book (and the views and arguments of all of the conservative critics surveyed in this essay) very closely resemble Huntington’s own characterization of “white nativism.”  One could make a strong argument that Huntington and the other conservative critics are in fact a type of white nativist, a group that is arguably a small but powerful and highly vocal minority in America.  It is instructive to quote Huntington at length and then compare his words to his central arguments discussed above:

“One very plausible reaction [to multiculturalism fomented in the 1960s] would be the emergence of exclusivist sociopolitical movements composed largely but not only of white males, primarily working-class and middle-class, protesting and attempting to stop or reverse these changes and what they believe, accurately or not, to be the diminution of their social and economic status, their loss of jobs to immigrants and foreign countries, the perversion of their culture, the displacement of their language, and the erosion or even evaporation of the historical identity of their country…the preservation or restoration of what they see as ‘white America’ is a central goal…to defend one’s ‘native’ culture and identity and to maintain their purity against foreign influences.”

Huntington and the other cultural critics surveyed here seem to represent the “new white nationalists” who are “cultured, intelligent, and often possessing impressive degrees” and who fear that Hispanics and other ethnic and racial minority groups are a “threat to their language, culture, and power.”  Huntington made it very clear that culture is a human invention and that cultures change; thus, based on his reasoning, Anglo-Protestant America must be preserved not because of some transcendent value, but because it is his culture and he loves it and he will fight “others,” like Hispanics, to keep his culture pure and powerful.  It is not a noble sentiment, but it is certainly heartfelt.[xxvi]

Liberal Defense of 20th Century Reform

By the mid-1990s, left-leaning academics began to more fully address the arguments and historiography of conservative critics.  Liberal responses were drafted for a number of reasons.  Most spent time analyzing conservative falsehoods and exaggerations.  Many acknowledged and legitimated several conservative fears, albeit in less extreme and apocalyptic terms.  Most put forth counterarguments to justify the essential cultural changes initiated in the 1960s; however, many incorporated conservative critiques in order to reframe and defuse the culture war in terms of a liberal or multicultural nationalism.  There have also been many voices from the political left who pushed for more radical changes.  Radicals have often sought to extend the debate of the culture war beyond a narrow preoccupation with American identity, and have focused instead on larger issues of American imperialism, universal human rights, and ecological sustainability.  Some radical voices have even suggested that the bounded community of the nation is itself an impediment to social justice, as it is based on an exclusivity that can be used to deny human dignity and justice to those, like “illegal” immigrants, who lie beyond the protection of the nation.

Michael Walzer wrote an important essay, “Pluralism: A Political Perspective” (1980), in which he argued that “national and ethnic pluralism has been the rule, not the exception” in American history.  Revolutionary leaders (and many political ideologues and activists since) tried to argue that democracy was only possible if it was accompanied by “cultural unity;” however, as Walzer pointed out, history has shown that democracy and claims for political and social equality have “proven to be the great solvents” of cultural unity rather than its champions.  The cultural unification of many peoples under a single nation-state, Walzer argued, “is possible only under tyrannical regimes…except in the United States.”  The United States is an exception in human history because it has been built on the foundation of a “multiracial society,” albeit one where most “minority races were politically impotent and socially invisible” for a great part of its history.  But Walzer argued that the “repression” of these minority groups did not negatively affect the system of American pluralism constructed through immigration (although he did admit that “racism is the great barrier to a fully developed pluralism”).  America has been an “immigrant society” bound by patriotism to political ideals, according to Walzer, not a nation defined by ethnicity or territory.  The rise of political pluralism in the 20th century was a reaction to the coercive power of the expanded modern state, which often demanded cultural Americanization on top of political patriotism from immigrants.  Pluralists like Horace Kallen defended and celebrated diversity, and argued that America was a “nation of nationalities” and, therefore, in no need of hegemonic unity.  This inspired the “ethnic self-assertion” of cultural and racial groups in America during the 20th century, which Walzer claimed was the “functional equivalent of national liberation in other parts of the world.”  But cultural diversity does not threaten American political identity, argued Walzer, because civil society and the state, “though they constantly interact, are formally distinct.”  Thus, while individuals find solace in cultural group identity in civil society, those same individuals identify with the state of America in politics: “Politics forces [ethnic groups] into alliances and coalitions, and democratic politics, because it recognizes each citizen as the equal of every other, without regard to ethnicity, fosters a unity of individuals alongside the diversity of groups.”  Besides, the power of the individualism produced by American nationalism has a destabilizing effect on group identity, argued Walzer, and thus pluralism is an “experiment” that “will prove to be a temporary phenomenon, a way-station on the road to American nationalism.”[xxvii]

Walzer also wrote another influential essay in 1990, “What Does It Mean to Be an ‘American?’”  In this essay he claimed that “anybody can live [in America], and just about everybody does.”  American identity is not based on an ethnic or territorial nationalism, but on the “virtue” of immigrants and natives coming together into a single yet diverse people: “the manyness of America is cultural, its oneness is political.”  America is composed of many ethnic groups, but American is not itself an ethnic group.  Immigrants retain their former identity but add onto it a hyphenated American identity, which is a political affiliation to a political nation.  The hyphen is a “plus sign,” not a disavowal of ethnicity.  Americans can live “on either side of the hyphen” and still be Americans.  National unity comes from citizenship in the nation and “pledging allegiance to the ‘one and indivisible’ republic,” not from cultural conformity.  Walzer argued that American nationalism is uniquely “complex” because it is based on the ideas of tolerance and inclusion, “incorporating oneness and manyness in a ‘new order.’”  This creates a sort of national “incoherence,” but Walzer argued that this is part of the distinctive American nation, which is still “radically unfinished” in its nature.  Americans are free to “choose” their own cultural location on either side of the hyphen, and this freedom keeps America vibrant and unfettered from a “singular national destiny.”[xxviii]

Liah Greenfeld took a few pages at the end of her historical study, Nationalism: Five Roads to Modernity (1992), to say a few words about American nationalism in her contemporary context.[xxix]  Greenfeld argued that American nationalism was based on civic nationalist principles of freedom, democracy, and equality enshrined in The Declaration of Independence.  Not everyone agreed with these principles, and these principles were not always practiced, but these were the ideals that defined the nation.  The “people” of America were not defined by any ethnic unity because “America has been a nation of immigrants from the beginning.”  Instead, American nationalism was defined by an association of individuals who gave allegiance to a set of principles and, thus, “pluralism was built into the system” because culture and ethnicity mattered less than the affirmation of nationalist principles.  The combination of a pluralist people and a civic nationalism has tended to create a tumultuous yet somehow united republic: Our “national commitment” to the ideals of freedom, equality, and democracy “remains the main source of social cohesion and the main stimulant of unrest in it.”  Greenfeld argued, “To be an American means to persevere in one’s loyalty to the ideals, in spite of the inescapable contradictions between them and reality, and to accept reality without reconciling oneself to it.”

Jennifer L. Hochschild defined and contextually explored the American Dream, the most pervasive and powerful nationalist ideology in the U.S.[xxx]  Like all ideologies, the American Dream resists a formulaic definition or prescriptive power in terms of predicting the behaviors of Americans who subscribe to its ideological tenets; however, Hochschild attempted to get past the vagueness of the general idea of working hard for material success in order to tease out a more analytical conception of the American Dream.  She broke the ideology into four tenets corresponding to four descriptive questions: (1) Who may pursue success? – Everyone can pursue success; (2) What does one pursue? – One pursues “success;” (3) How does one pursue success? – Success is the result of an individual’s hard work and self-sacrifice; (4) Why is success worth pursuing? – Success is a virtue, which both constitutes and demonstrates an individual’s worthiness.  The pursuit of success is complicated by the ambiguity of “success,” which can be defined in absolute, relative, and competitive terms.  An absolute definition rests on the basic achievement of an individual; a relative definition depends upon the contextual evaluation of success relative to an external marker like another person’s level of success; and a competitive definition corresponds to a capitalist marketplace where only the best will win success.  The American Dream promotes a “radical individualism” which completely overlooks social and structural mediators like “economic processes, environmental constraints, or political structures.”  This factor is especially dangerous because American capitalist society is structurally set up to “ensure that some fail, at least relatively, and the dream does nothing to help Americans cope with or even recognize that fact.”

Hochschild especially looked at the structure of racism in American society, and she demonstrated how it has constrained and continues to constrain African American success in relation to white Americans and white European immigrants.  Many African Americans remain deeply entrenched in poverty and well beyond the reach of achieving any measure of the American Dream, and this has caused many African Americans, especially middle-class blacks, to reject the American Dream in order to promote separatist black nationalisms or self-defeating nihilisms.  Hochschild argued that the ideology of the American Dream is “flawed at the core” because it obscures the structural factors like racism and class that create and sustain inequality.  Thus the ideology “under the cloak of individual agency” both gives people “unjustified hopes” and also ensures “unwarranted feelings of failure.”  The American Dream has the capacity to both “deceive” and “liberate” by encouraging “everyone to win” while structurally setting up many to lose.  Despite the “inherent flaws” of the ideology of the American Dream and Hochschild’s “ambivalence” toward the concept, she argued: “For better or for worse, it is our ideology, and we are stuck with it.  We had better make the best of our situation, and strive to use the strictures of the American dream to enable more Americans to achieve the fantasies lurking within it.”

Hochschild also delivered a serious warning about what will happen if American society is not structurally transformed: “If it can be construed as an ideal, a broad, generous, inclusive vision that encourages people to be the best they can be however they define that best, then transformative pluralism and open channels of mobility are direct and plausible extensions of Americans’ core tradition.  But if it is only an ideology in the narrow sense, a self-righteous club that winners use to justify their own actions and to push away, blame, or brainwash losers, then white separatism will continue to flourish, black separatism to grow, and class barriers to harden.”[xxxi]

An important liberal response was made in 1995 by Todd Gitlin, a founding member of SDS in the 1960s who had become a sociologist at the University of California, Berkeley.[xxxii]  Gitlin framed his discussion of recent culture wars by arguing that certain Americans “who have imagined themselves to be real Americans, normal Americans” have repeatedly over the course of U.S. history engaged in “purification crusades” to address and combat those groups or individuals who “threaten the integrity of the nation” [his emphasis].  Gitlin argued that the periodic culture wars in American history have tended to obscure contested realities rather than clarify or settle them.  Thus, he argued that all positions and controversies needed to be re-examined in order to understand not only American identity, but also (from his leftist vantage point) “the contemporary incapacity of American politics,” by which he meant the failure of the American Left to effectively redirect attention away from symbolic battles and onto more important and pressing social and economic issues.

Gitlin discussed at length many cultural controversies over the politics of identity.  He argued that all sides focused exclusively on symbolic representations instead of concrete social and economic realities: Conservatives, liberals, and minority groups spoke from positions of “moral conviction” and argued over competing “emotional meanings” attached to historical symbols instead of focusing on “rock-bottom class inequalities and racial discrimination.”  Gitlin argued that the culture wars boiled down to verbal battles over “real and imagined symbols of insult,” which did nothing to address the concrete realities of power, racial discrimination, and economic injustice.  Both sides simply battled over symbols.  Both sides won and lost “symbolic victories.”  All the while racial discrimination remained, economic injustice increased, and the American public became more divided.  Gitlin was especially hard on liberals and the “so-called Left,” which had seemingly renounced its older mission of changing material inequalities (especially the oppression of certain classes and races).  Gitlin argued that the divisive symbolic battles over “identity politics,” the primary arena of the culture wars, marked the “decline” of the American Left.  The Left once had a historical mission based on the “universal values” of freedom, justice, equality, and the “common good,” but after the 1970s it had been fractured and demoralized by “sectarianism,” “petty” debates over rhetoric and representation, and the impotence of “false solutions proclaimed for real problems.”[xxxiii]

While America has always been divided by classes and races, Gitlin argued, there was still a shared moral vision based on the sacred “ideas” consecrated in The Declaration of Independence, which framed the debated contours of the nation.  For most of American history the debate between radicals and conservatives was over inclusiveness, not nationality.  Conservatives made many attempts to “compress differences” into a “single,” normative American WASP identity, which invariably was complicated not only by internal “contradictions,” but also by “those other Americans” (immigrants, aliens, slaves, radicals, and sects) who had been marginalized, ignored, or eliminated in order to manufacture a selective national unity.  These “other Americans,” the “despised outsiders,” constructed their own “unmelted,” “torn,” and sectarian American identities in opposition to exclusive crusades for a “common culture,” but invariably these outsiders sought political inclusion within the nation.  Gitlin pointed out a long tradition of “democratic Americanism.”  This was a leftist/liberal version of American nationalism, which used the universal moral vision of The Declaration to help extend political rights and equality to these outsiders.[xxxiv]

However, this democratic Americanism, as Gitlin argued, began to unravel in the 1960s because protest movements and the New Left began to reject both “conventional versions of American identity” and American ideals.  This resulted in a reactionary “anti-Americanism” which celebrated diversity, anti-establishmentarianism, and individualism as new ideals.  The New Left relinquished all claim to “the idea of a common America” and, thereby, Americanism “was ceded, by default, to the Right.”  Republicans were able to use revised notions of a common culture and Americanism in order to marshal organized political reaction to the rights revolution of the 1960s.  The American Left was fractured into a “collection of interest groups” with no “vocabulary for the common good,” and the Democratic party could offer no compelling counter-nationalist vision: “no commonality, no alternative crucible, no compelling rhetoric, no political culture – only a heap of demands piled on demands.”  Thus, after securing a solid political bloc based on a nationalist agenda, gaining more and more political power, and eventually claiming victory in the Cold War, the Republican party and social conservatives initiated an all-out attack on the liberal welfare state and declared a wider “war for the soul of America.”  The Left was fragmented into “partisans of identity politics” and, as Gitlin argued, it could not effectively respond to the powerful conservative reaction.  The culture wars were not only initiated by the conservative right, but fought over territory (national identity) that only the right could effectively defend.  Thus Gitlin’s book is an extended critique of the Left by a Leftist in order to marshal a new universal Leftist vision with which to protect and justify the liberal welfare state and the rights revolution against the onslaughts of conservative cultural warriors.  Gitlin argued that the Left needs to find a way “to cultivate the spirit of solidarity across the lines of difference” in order to “build bridges” and find a common, democratic moral vision.  Otherwise, Gitlin warned, the American Left will cease to exist as a political force of any consequence and the conservative counter-revolution will know no bounds.[xxxv]

Michael Lind published a widely read “manifesto” of liberal nationalism called The Next American Nation (1995).[xxxvi]  Lind’s central argument: America was and continues to be a “real nation” – “a concrete historical community, defined primarily by a common language, common folkways, and a common vernacular culture.”  He argued that most Americans identify more with a national identity than they do with any political affiliation, but American nationalism has been poorly defined outside the older chauvinistic boundaries of a “white Christian nation,” which was the foundational ethos of the first two “Republics” of America.  Lind conceptualized American history as three distinct “Republican” regimes, each defined by a specific nationalist ethos and specific nationalist policies.  The “First Republic” of “Anglo-America” (1584 to 1850) built upon the strong ethnocultural Anglo-Saxon national community in place before the revolution and it used Protestant Christianity and federal-republicanism to create a national community – the United States of America.  The “Second Republic” of Euro-Christian America (1850 to 1960), infused by a nationalist religion of democracy, capitalism, and a Pan-Christian ethic, expanded the national community to include most white Europeans (and after World War II, both Catholics and Jews were accepted); however, the expanded second republic was built on the foundation of a white supremacist Herrenvolk (master-race) caste system, which actively excluded and subjugated non-white races.  The “Third Republic” of multicultural America (1960s to the present) was a minority-led reaction to the white supremacy of the first two Republics.  Multicultural America began as a Civil Rights Revolution, which sought to open up the American nation by securing formal legal and political rights for all American citizens; however, in an effort to further extend equality-as-opportunity to equality-as-result, a federal system of racial categorization and an institutionalized “racial preference system” were put into place (affirmative action) to “force racial quotas” on American society.  What Lind called “The Second Radical Reconstruction” was a federally enforced system of minority preference, which was meant to “remedy” racial, class, and gender discrimination in social spheres like schooling and employment.  While Lind was sympathetic to the rationale of affirmative action, he criticized it as an intrusive, unfair, divisive, and dangerous policy.[xxxvii]

Lind specifically took issue with multicultural America’s obsession with race and culture as the foundational source of identity: He claimed it was not in the best interests of the American people.  Lind argued that discussion of culture was often a veiled reduction to race, and thus, cultural authenticity and cultural pride were often calls to adhere to a certain biologically defined and essentialist identity.  He called the priorities of multiculturalism divisive because “identity politics” were eclipsing identification with a larger national community and with larger national issues (like economics and health care).  Like Gitlin, Lind also called multicultural identity politics a dangerous distraction from identifying the real and continued source of inequality in America: the “white power structure.”  Lind argued that the white power structure used multiculturalism and racial preferences fraudulently to “provide the illusion of integration, while imposing minimal costs on the white overclass:” it was a classic imperial case of divide and rule.[xxxviii]  This explains why during the supposedly more equal and fair regime of multicultural America, there was a silent “revolution of the rich,” whereby, income inequality increased dramatically.  Lind argued that multiculturalism socially and politically fragmented the majority of Americans by allowing “culture wars” to displace “class wars,” and thus, unified elites were able to initiate and win “a generation-long class war,” which has led to a “new Feudalism.”  Lind argued that multiculturalism was destroying the national integrity of America, it was fragmenting the American people, and it was solidifying the power structure of a white overclass.[xxxix]

To renew and rebuild America, Lind argued for a revised liberal nationalism, which would be the foundational ethos for a new “fourth republic” of the United States of America.  Liberal nationalism would build on the notion that America is an ethnocultural nation unified by a common language, folkways, memories, and mores, but it would also be an inclusive “mixed-race culture” symbolically defined as a “transracial melting pot,” which Lind called “Trans-America.”  Lind argued that the U.S. should be a “multiracial and multireligious but unicultural American ethnic nation.”  He held up blacks as the quintessential Americans because they not only left behind their older cultures, but mixed and blended into the emerging American identity, which prefigured their actual inclusion into the political state as citizens.  Lind argued that nationalism has often been the tool of the political left in efforts to promote “greater political, social, and economic equality among all members of the national community.”  In an effort to bring Americans together, Lind argued that a shared set of ancestors is unimportant.  The important factor for a renewed America is sharing a contemporary cultural and political union – a national community – in order to produce shared common descendants: Americans.[xl] 

David A. Hollinger wrote a widely influential work called Postethnic America: Beyond Multiculturalism (1995).[xli]  Hollinger argued that the ideology and socio-political movement of multiculturalism has served a useful purpose in attacking a racist Anglo-Protestant based American culture; however, he argued that the blunt race/culture (“ethno-racial”) based framework of multiculturalism is limited in understanding and dealing with “the problem of boundaries” in what is becoming a “postethnic,” “cosmopolitan,” and trans/multinational world, which is developing more acute globalized problems that need global solutions.  The traditional conceptions of ethnicity have posited assumed, often monolithic, and sometimes racialized “identities,” which mask the degree of actual “affiliation” any given individual psychologically and socially invests in a particular ethnic group to which that individual is supposed to belong.  Hollinger used the example of Alex Haley and conceptualized “Haley’s choice,” by which he theorized the choice Haley made in tracing his “roots” back to his black mother’s ancestors in Africa rather than identifying with his white father’s ancestors in Ireland.  Now while Hollinger did admit that a racialized and racist America circumscribed and forced Haley’s choice, Hollinger went on to argue that in a less segregated and increasingly mixed ethno-racial world, individuals are becoming freer to choose ethnic, mixed ethnic, or non-ethnic identities, but the ethno-racially infused multicultural ethos is unprepared to handle these new volitional and mixed identity formations.  Hollinger’s conception of a hybrid, postethnic America recognizes the complexity of identity, whereby, individuals constantly shift between many situationally defined and sometimes conflicting identities.  Hollinger argued that America should shed its “ethnic history” and embrace its “nonethnic ideology of the nation” as a means to embrace and foster a “postethnic future.”  Hollinger asked his readers to take seriously the national motto E Pluribus Unum as a way to conceptualize cultural diversity united by a national commitment to a common creed of liberty and justice.  “Individuals should be allowed to affiliate or disaffiliate with their own communities of descent to an extent that they choose,” argued Hollinger, “while affiliating with whatever nondescent communities are available and appealing to them.”  What unites these highly diverse and hybrid individuals is a democratically organized state “defined by a civic principle of nationality” and enacted in a shared “national culture” where diverse individuals democratically deliberate and work towards a “common future:” “The national community’s fate can be common without its will being uniform, and the nation can constitute a common project without effacing all of the various projects that its citizens pursue through their voluntary affiliations.”[xlii]

Gary Gerstle’s “Liberty, Coercion, and the Making of Americans” (1997) looked at the Crevecoeurian myth of Americanization and how it affected the historical and sociological study of cultural assimilation and Americanization.[xliii]  Gerstle argued that Crevecoeur’s conception of assimilation in Letters from an American Farmer was one of the “most influential meditations on what it means to become an American.”  Not only did the Crevecoeurian myth help define the early 20th century ideal of the “melting pot,” but it also influenced the way 20th century sociologists and historians conceptualized theories of assimilation, which in turn had an influence on public policy and debate.  Invoking radical scholars of the 1960s and the new scholarship of David R. Roediger and others, Gerstle criticized neo-Crevecoeurian scholars for not focusing enough on the complexity and constraints (class, gender, race, nation) of the Americanization process by which “social forces external to the immigrant” play a very significant, if not the most significant, role in the Americanization of immigrants.  Gerstle argued that these “structures of power” limited (and often coerced) the options of immigrants during the assimilation/Americanization process.  Gerstle criticized the overly optimistic accounts made by Fuchs, Sollors, and Hollinger, who seemed to argue for a theory of personal agency and a fluidness to identity that did not take into account the restrictiveness of structural constraints (especially race, as Gerstle argued, “race, even more than class and gender, still limits the options of those who seek to become American”).  Gerstle clearly believed that “historical circumstances and social structures undermined experiments in the fashioning of identity.”

Gerstle looked to newer studies on gender and working class Americanism (including his own), which have created a “synthesis between agency and structure” and, thereby, demonstrated how “Americanization involves both inventiveness and constraint:” America was not “simply a Crevecoeurian land of possibility,” it was also “a land of constraint.” 

Becoming American cannot be understood in “emancipationist” terms alone, for immigrants invariably encountered structures of class, race, gender, and national power that constrained, and sometimes defeated, their efforts to be free.  Coercion, as much as liberty, has been intrinsic to our history and to the process of becoming American.

Gerstle also critiqued liberal American nationalism via David Hollinger’s Postethnic America.  Gerstle agreed with Hollinger that liberal nationalism infused and largely defined Progressivism, the New Deal, the civil rights movement, and the Great Society by “deriv[ing] legitimacy from their claim to speak ‘on behalf of the American nation’ as a whole.”  However, Gerstle also argued that nationalism by definition means “boundaries” and “internal and external opponents,” and thus the “equality” gained from 1930 to 1960 was “made possible by the coercion of the 1910s and 1920s:”

 America had shrunk its circle of the ‘we’ and had substantially narrowed the range of acceptable cultural and political behavior…The success of this liberal nationalist project, I would argue, depended on the earlier deployment of the coercive power of the state against Germans, new immigrants, Asians, and political radicals.  Liberal progress, in this instance, profited from the earlier period of repression and exclusion…Historians have yet to take full measure of the powerful nationalism that settled over America in the 1910s and 1920s, suffocating the hyphenated identities…weaken[ing] the pluralist character of pre-1917 America and accelerat[ing] national integration.

Gerstle is but one example of many leftist critiques of liberal nationalism.  Historians have begun to examine the artificial boundaries of the nation, what Robert Wiebe called “fictive kin composites,” and they have started to historically contextualize nations in relation to other nations and in relation to non-national and transnational paradigms of a global age.[xliv]  Bonnie Honig has looked into multiple versions of immigrant myths in relation to nationalism, and she argued that nationalist discourses that focus on immigrants do so in order to “renationalize” the state by justifying the inclusiveness of a bounded and exclusivist national community that still derives its identity by “pitting ‘us’ against ‘them.’”  She argued instead for a “democratic cosmopolitanism” by which “citizenship is not just a juridical status distributed (or not) by states, but a practice in which denizens, migrants, residents, and their allies hold states accountable for their definitions and distributions of goods, powers, rights, freedoms, privileges, and justice…denationalize the state in order to make room for the generation of alternative sites of affect and identity against which states often guard.”[xlv]  John Exdell has criticized liberal nationalism for “legitimiz[ing] a policy of exclusion,” which leaves open the possibility of further nationalist exclusions based on ethnicity and race.  Liberals claim that solidarity and justice within the bounded community are produced and protected by nationalist identity; however, Exdell demonstrated that national solidarity in America has been and continues to be undermined by the divisive power of race via a long tradition of white supremacist American ethnonationalism.  Exdell questioned the liberal assumption that a reformation of “national self-understanding” is enough to truly “unite” American citizens and overcome a long tradition of American racism.  Exdell instead asked if new infusions of Latino immigrants might “renew” and “revitalize” American identity by developing a “new post-national identity” that might redefine American citizenship as a situated democratic performance conducted by any free, productive, and contributing agent within the national territory.[xlvi]

Culture Wars in Context: Transformations of the American Nation, 1776 to 1990

The modern usage of the terms “nation” and “nationalism” comes from 16th century England.  The political discourse of a “nation” became associated with a “people.”  Reference to a “people” before this time was usually derogatory (“rabble,” “plebs,” or “mass”), but within the context of 16th century England a “people” became glorified as the source of sovereignty and the sole object of political loyalty.  This political definition of a nation stressed a civic conception of individual sovereignty (as opposed to monarchical sovereignty) constituted by a constitutional law (as opposed to divinity or monarchical absolutism).  However, national identity was also an ideological and social construction.  The notion of a sovereign people was an “imagined community” that gave its members identity, affiliation, community, and purpose.  Isaiah Berlin described nationalism as manufacturing a “kind of homogeneity” out of “common ancestry, common language, customs, traditions, memories, continuous occupancy of the same territory” so as to create “solidarity” while marking off “differences” (usually in the form of an “aggressive chauvinism”) between political and social groups.  As a political and social ideology, nationalism reveres and reifies the “unity” and “self-determination” of the sovereign people.  Loyalty and fidelity to the nation qua people, Berlin argued, is assigned a “supreme value.”  This secular reverence for a distinct people often leads to an exclusive “ethnic” chauvinism, whereby, membership in a “unique,” special, or exceptional people is restricted to an inherited and biologically based group.  Nationalism can also be used for more aggressively expansionist political purposes.  Powerful ethnic groups or nations can engage in hegemony through which they aggressively (though not necessarily imperialistically) drive for an expanded territory or nation under the banner of “unification” for economic, political, and/or military purposes.  This type of nationalism can lead to a federation, an empire, and/or “irredentism,” whereby, territory and peoples are coercively agglomerated under the control of a centralized and often authoritarian state.  But rhetorically conceptualizing nations and peoples as distinct and uniform social entities needs to be qualified.  Nations are rarely based on a distinctly singular “people,” a single national ideology, a single state, a single language, or a single territory.  Nor does the existence of nationalism necessarily predict the ideological affiliations or standardized behavior of the people within a nation.  Nations are imagined communities that represent an idealized and normative “people” that can never actually exist.[xlvii]

The political need to establish and legitimate a people – a nation – was a relatively novel and very radical political problem in the 18th century.  Nationhood was influenced by the rise of Enlightenment republican/democratic political philosophy and capitalism, and forged through the republican revolutions in England (17th century), America, and France (late 18th century).[xlviii]  While the study of nationalism is still a young social-scientific field,[xlix] the evidence seems to suggest that prior to the late 18th century, only one nation-qua-nation existed: the British.  Liah Greenfeld explained how an English national consciousness developed in the 16th century through the power politics of aristocracy, which led to a redefinition of nobility as “service to the nation.”  In the 17th century affiliation with the British nation expanded due to several factors: the rising middle class who exuded a strong sense of political ownership and entitlement, the expansion of literacy through Protestantism, counter-Reformation repression by Catholic monarchs, and finally a republican revolution.[l]  American nationalism was in many ways, but not all, derivative of English nationalism because, as Greenfeld and others have argued, “The English settlers came with a national identity;” however, the development of a specific American national identity (as with all national identities) was a highly unique and non-transferable process.  American nationalism took almost a century after the Revolution to develop and diffuse because while the colonists had an emerging “American identity,” it was not linked with “a sense that Americans constituted a unity” and, thus, the highly diverse and localized colonies were always “in perpetual peril of dissolving:” As Liah Greenfeld argued, “The forces that could (and eventually did) bring the United States to the brink of disintegration were at least as strong as those which fostered unity.”[li]

Polemicists from the political left and right have claimed that an American identity has existed from the first settlements in the 16th and 17th centuries;[lii] however, most historians and sociologists have traced the origins of a distinctly American national identity to the mid-19th century, especially after the Civil War, although some historians like Robert Wiebe place the formation of a national American identity closer to the end of the 19th century.[liii]  Before the Revolution, the largely English colonies were divided by diverse ethnic identities, dispersed regional settlements, and highly localized economies connected more with Europe than each other.  American nationalists had to contend with and overcome what became a highly diverse colonial federation.[liv]  A rhetorically imagined national public – “We the people” – was first manufactured by Revolutionary leaders as a means to unify the diverse and fragmented colonies during the Revolutionary War and again during the debates over the Constitution.[lv]  Citizenship was a divisive issue from the very start.  Noah Pickus argued that most early leaders agreed that civic principles and a “shared sense of nationhood” needed to be at the core of the new country and its founding documents, but many “differed deeply as [to] the meaning of that nation and whether it could change.”[lvi]

Federalists wanted a small and homogeneous republic with narrowly defined rights of citizenship limited to self-governing, propertied, “virtuous” men, while Anti-Federalists and Jeffersonian Republicans wanted a more “broadly defined national identity” based on universal civic principles and open to all who embraced and abided by those principles.  Although the early nation was quite diverse and divisive, Federalists, like John Jay, tried to manufacture consent for a more homogeneous nation and a more circumscribed citizenship by rhetorically invoking “one united people – a people descended from the same ancestors, speaking the same language, professing the same religion, attached to the same principles of government, very similar in manners and customs.”

While the Constitution seemed to sidestep this debate (there is no formal definition of citizenship in the Constitution), The Declaration of Independence implied a “volitional and contractual” approach to citizenship, which the Constitution did not limit in any way (for instance, there were no “cultural, religious, or linguistic tests for citizenship”).  George Washington even went so far as to declare America “open to receive not only the opulent and respectable stranger, but the oppressed and persecuted of all nations and religions.”  But he also hinted at certain limitations by stating, “We shall welcome [them] to a participation of all our rights and privileges, if by decency and propriety of conduct they appear to merit the enjoyment.”  This ambivalence infused the parameters of the first Naturalization Law of 1790, which limited citizenship to “free white persons” with two years of residency, a good character, and sworn allegiance to the Constitution.[lvii]  The central limitation of free “white” persons was borrowed from the colonial laws of many states and was deliberately used to exclude blacks[lviii] and Native Americans from the American nation, although overall, Pickus notes, the Naturalization Act was “remarkably inclusive for its time, in bestowing citizenship on all European immigrants.”  Pickus argued that while European immigrants were considered “central” to the “nation-building task of Americanizing the Americans,” he also acknowledged that “belonging to the nation and reverence for its traditions mattered,” which, in combination with the racial classification of citizenship, definitely circumscribed the bounds of citizenship in the U.S. and seemed to lean more towards legitimating the definition of America in terms of the Federalists’ homogeneous white republic.[lix]

For the next one hundred years there was “no single standard for membership” in the U.S. as “modes of citizenship” were “multiple,” “often contradictory” due to their regulation and facilitation by state and local governments, and they were “contested in many ways” by individuals and groups.  Citizenship opened up considerably in 1868 with the passage of the Fourteenth Amendment, which enshrined the notion of birthright citizenship to help enfranchise blacks.  The Expatriation Act of 1868 reinforced the notion that citizenship was the right of Americans by birth and “relinquishing” it depended on the “consent of the individual.”  However, Congress also passed the Chinese Exclusion Act in 1882, which narrowed the boundaries of U.S. citizenship by barring Chinese (and most Asians in general) from citizenship (although the Supreme Court in 1886 and 1898 did allow Chinese the principle of rights while in America and did affirm that Chinese children born in America were by right Americans).  But citizenship was highly dependent upon “local discretion” as some states extended rights like voting to non-citizens, while other states circumscribed and limited the rights of particular groups who were entitled to full citizenship.  Pickus argued that “both the civic and the nationalist dimensions of citizenship had inclusionary and exclusionary consequences.”  The courts used a nationalist framework and affirmed “whiteness” as the core value of American citizenship, but broad definitions of whiteness allowed more and more Europeans to be accepted as potential citizens.  The nationalist principle of birthright citizenship also extended the bounds of citizenship.  It even had the ability to circumvent the “whiteness” clause.  At the same time civic notions of citizenship based on individual virtue and consent were used to limit or exclude blacks and Native Americans from becoming citizens.  And many nativists used a white supremacist version of nationalism to call into question the more open nationalist conception of birthright citizenship.[lx]

Throughout the 19th century, national identity and citizenship were ideologically contested battlefields wherein different factions vied for legitimacy, justice, and power.[lxi]  Perhaps the most important, complex, and emotionally charged 19th century debate over American nationalism was the Civil War and the issue of slavery, which boiled beyond words into violent confrontation.[lxii]  The Civil War enlarged the powers of the nation-state via an expanded bureaucracy, an aggressive executive branch, a conscripted and mobilized federal army, and coordinated communication systems.  After the war a new sense of American qua Republican identity, nationality, and purpose was consecrated (and federally enforced in the South), and it would be refined and reinforced through Indian wars, increased immigration, and westward expansion.  If the American Revolution and the ratification of the Constitution marked the first crisis in American nationalism, then the Civil War marked the second.  Arguably the third major crisis of American nationalism emerged at the end of the 19th century.  The Populist, labor, and Progressive movements initiated in the late 19th century represented a diverse and widespread sense of national emergency, and their combined efforts aimed at nothing less than a redefinition of American national identity and purpose, which sent an ethos of liberal reform reverberating throughout the 20th century.

Nell Irvin Painter’s award-winning treatment of the Progressive Era, Standing at Armageddon: The United States, 1877 – 1919,[lxiii] argued that the central political conflict of the late 19th century was a “struggle over the distribution of wealth and power” – a constant struggle in American history.  In 1890 the super rich (1% of the population, or about 125,000 families) earned an income of about $264,000 and owned 50.8% of the national wealth.  The upper-middle class (11% of the population and about 1.3 million families) made on average about $16,000 per year and owned about 35% of the national wealth.  The remaining 88% of the population (11 million families) earned under $1,500 a year and owned just over 14% of the national wealth, and half of these families (44% of the population) were impoverished, earning less than $150 a year (the poverty level was estimated at around $544 a year for a family).  Painter stressed that while income does provide the “single clearest indicator of class standing,” the notion of class needs to be seen as a complex, “fluid” and ever-changing classification.  There was no single “middle class,” but rather several “middle classes” and also “many ethnicities and races” within each class.  The elite classes at the time had the most at stake in the structure of society because they benefited from the distribution of political and economic resources.  To protect their interests, the socially and politically powerful and their agents liked to put forth ideological arguments for the “identity of interest.”  This belief conceptualized society as a smoothly functioning and united organism, wherein, the interests of the great capitalists and property owners were supposedly the best interests of all in society, and, further, that society operated in harmony with “laws of God or Science.”

Reformers of various social and political stripes put forth a counter-conception of society in order to justify what they saw as needed reform.  Perceiving their own middle-class or working-class interests at odds with those of capitalists and industrialists, democratizers saw society torn by a “conflict of interests.”  Reformers often, but not always, tried to point out the interests of the “disadvantaged” within the social system and, thereby, argue for “the ideal of equity” and democracy, in order to confront the dangerous extremes of wealth and privilege.  But lurking at the periphery of all calls for reform was the specter of working-class unrest, which from time to time had boiled into a froth and caused conflicts of interest to turn into real (and often violent) social and political struggles for power.  The so-called “Progressive Era” was marked by a widespread call for reform and social change; however, as Painter pointed out, “the broadening consensus that change was necessary did not include agreement on the direction or extent of these changes.”

In the 1960s another generation of reformers pointed out not only the inequality between the rich and the poor, but also the differential wealth between racial groups, especially between whites and blacks.  In the early 1960s the top 20% of Americans possessed 77% of the nation’s wealth, while those in the bottom 20% owned only 0.05%.  In 1959, 22.4% of the population lived in poverty.  However, the situation was worse for African Americans.  In 1965, 43% of all black families earned less than $3,000 a year and were living in poverty (the national rate was 15%).  In 1967, 39.3% of all black persons in America lived in poverty compared to only 11% of whites.  In 1962 the average black income was about 55% of the average income of whites and black unemployment was double the rate of white unemployment.  The Civil Rights movement of the 1960s addressed the larger issues of wealth and poverty in America, but the main part of the early movement focused mostly on the legal and social segregation of African Americans and the unjust social and economic treatment they received as second-class citizens.  One of the early sparks of the Civil Rights movement was indicative of blacks’ oppressed social and political position in U.S. society: in 1955 a fourteen-year-old African American boy named Emmett Till was abducted, bound with barbed wire, mercilessly beaten until his face began to fall off, and thrown into a river to die.  His crime was whistling at a white woman.[lxiv]

But the Civil Rights movement of the 1960s did not confine itself to just the platforms of economic inequality or the oppression of African Americans.  One Civil Rights organization at the time, Students for a Democratic Society, published a widely printed and influential manifesto called The Port Huron Statement (1962), which discussed both economic inequality and racial discrimination, but it also outlined issues for reform in both educational and foreign policy as well as larger values along with a political vision of American society.  This manifesto even reached beyond American politics and professed support for reform and revolutionary movements around the globe, particularly anti-colonial uprisings in Africa and Asia.[lxv]  The reformist and revolutionary rhetoric of the 1960s inspired many minority groups in America who felt their voices and socio-political issues were being excluded by the mainstream Civil Rights platform.  Women played an important role in both the African American Civil Rights programs and in Students for a Democratic Society; however, many women eventually branched off into their own “women’s liberation movement” in order to address “the woman question.”[lxvi]  Mexican Americans drew on a history of organizational efforts in America and several Chicano Civil Rights organizations were formed, including the League of United Latin American Citizens (LULAC), which worked for the “economic, political, and social rights for all Mexican Americans.”  The Chicano “movimiento” specifically addressed the second-class citizenship of Mexican Americans who were often portrayed as “dehumanized” “commodities” of the American economy.[lxvii]  Many other minority social groups in America also became inspired by the large Civil Rights reform movements, including Native Americans, homosexuals, various European ethnic groups, political radicals, many stripes of cultural radicals, and what some called the “youth” culture.[lxviii]  The diversity of movements, reform issues, protests, and alternative cultural practices propagated during the 1960s led to “radical cultural disjuncture[s],” creating what many at the time called a “counter culture,” which mainstream America believed to be “a barbaric intrusion” and an “invasion of centaurs.”  Besides a “common enemy” in mainstream WASP American culture and corporate capitalism, there was also a common personalization of political objectives, whereby, to paraphrase the feminist Carol Hanisch, the personal became political.  What was heretofore assumed to be a “common” American culture had now fractured along the lines of many distinct, disgruntled, and dissenting counter cultures, each with its own vision and agenda, and each assuming the liberal state would be responsive by expanding the parameters of Civil Rights legislation.[lxix]

Both the Progressive era and the 1960s Civil Rights reform movements were able to influence and use the federal government as a way to initiate and preserve social changes through the law, enforcement of the law, and federal funding of policy initiatives.  However, historians like Alan Dawley have demonstrated that the democratic veneer of the liberal state has also allowed elites to “maintain their rule against popular discontent” by mediating the seemingly democratic processes of a representative and responsive government.  Beneath the surface of elite-mediated democratic politics lay “deep structures” of corporate capitalism, racism, sexism, and economic inequality, which have rarely been touched by reformist federal policies.  Dawley argued that these deep structures, which have selectively “apportioned” liberty according to one’s “class, gender, and race,” have never been seriously altered by any 20th century reform movement.

Dawley went on to argue that American liberal elites had devised three “governing strategies” to deal with social change in terms of containing social and economic conflict, and in terms of negotiating the new relationships between “society” and the “state.”  The older strategy of liberalism (free markets, laissez faire, white supremacy, private property, government by elites) was a staple of the 19th century, but it was not a sufficient governing strategy for modern times (although it would be refurbished in the late 20th century as neo-liberalism).  The first new strategy of the early 20th century was progressivism (in the broad sense of “government regulation of society in the public interest”).  The second was managerial liberalism, which sought to “avoid state bureaucracies by coordinating corporations and other large-scale institutions.”[lxx]  The third strategy was New Deal liberalism, which created the welfare state and followed Keynesian economic policies in order to both regulate society and allow corporate control over the economy.[lxxi]  The defining “unity” of this historical period (roughly from the 1890s to the 1930s) was the “persistent efforts of elites to remake the liberal state in the context of the new social forces.”[lxxii]

The coherence behind these unifying conceptions of liberal government was “the most potent ideology of all:” nationalism.  It was described by many during the early 20th century as a “new nationalism” and its broad-based goal was a directed expansion of Americanism through a welfare state and more explicit Americanization initiatives to unite the citizenry and keep them loyal to the state.  World War I helped legitimize and spread nationalism and patriotic fervor in order to manufacture the consent of the American people.  Nationalism was the most powerful ideological force to create both unity and loyalty in a diverse society, mobilizing the masses, industry, and modern technology for state-sponsored projects.  Liberal elites used nationalism, reformism, and state interventionism to hold society “together against its own inner contradictions.”[lxxiii]

Both Dawley and Alan Brinkley have documented the liberal accommodationist and nationalist strategy at work in the New Deal period as well.[lxxiv]  In “The New Deal and the Idea of the State,” Brinkley explored how liberals did not seek to transform the economic structures that created economic injustice; they sought instead to regulate the market and control it through the state, which created the appearance of reform without actually changing the structure of society.  However, controlling the market proved a difficult, “unrealistic,” and “perhaps even dangerous” intrusion into the economic realm, and besides, many American liberals assumed that progressive reforms and New Deal policies had “eliminated the most dangerous features of the capitalist system.”  The economic boom and triumphant nationalism caused by World War II reinvigorated a return to laissez-faire free market policies by which many elites thought that unregulated economic growth would create the conditions for social and economic progress, which would then reduce the role of the state to “compensat[ing] for capitalism’s inevitable flaws and omissions without interfering with its internal workings.”[lxxv]  Ira Katznelson argued that this legacy defined the parameters of Johnson’s Great Society legislation as well, whereby, the government was used “in unprecedented ways for social ends,” but within a compensating framework that did not alter the larger structure of society and the economy.  Katznelson also argued that the political climate of the 1960s became more focused on race and cultural pluralism due to the diverse and fractured political movements of various identity groups, and thus, the Great Society programs were seen by many elites to be temporary capitulations to particular groups because of “emergency” situations, not permanent political reforms.[lxxvi]

Thus, when the liberal coalition ran out of political capital in the 1970s due to Civil Rights reforms, Great Society reforms, Vietnam, and an unruly counter-cultural movement, it “exploded” and “burst into its constituent shards.”  A revitalized and powerful conservative reaction co-opted the liberal rhetoric of nationalism and progressive reform in order to orchestrate a conservative rollback of 20th century liberal policies, especially the enlarged and empowered federal government.  As Jonathan Rieder noted, no policy was resented more than the effort to “dismantle” the racial “caste system” in America with court orders, federal troops, and enforced integration.  Eric Foner noted that many conservatives saw “racial reform [as] being promoted against the will of the democratic majority,” who had the right to protect their own interests and to discriminate against those who posed a threat.  Anti-Communism, anti-radicalism, and fundamentalist Christianity were also used to refine an older form of patriotism that “demanded a simple, unreflective loyalty.”  Rieder characterized the conservative backlash as a multifaceted, racist, “proto-fascist revolt of the little man, animated by fearful resentment:” “populism with a vengeance, literally.”  Widespread discontentment and resentment due to grievances “too varied to be captured in a single category” mobilized large numbers of white working- and middle-class Americans who were longing for a nostalgic return to a simpler, fairer, whiter, less restrictive, more patriotic, more Christian, and more homogeneous American society.  The Republican Party was able to mobilize and unite these fearful Americans, and turn disgruntlement into valuable political capital that was used by Richard Nixon, Ronald Reagan, George H. W. Bush, the Newt Gingrich-led Congress of 1994, and later George W. Bush.  The Republicans’ overarching policy was to dismantle the New Deal welfare state and initiate reactionary neo-liberal “reforms” (a return to free markets, laissez faire, white/Western supremacy, small/limited government) and national unity/defense (patriotism, WASP culture and values, expanded military-industrial complex).

Conservative reactions and calls for unity often precipitated militant minority reactions and calls for racial and cultural separatism.  These rhetorical battles often ignited violent confrontations between radicals and conservatives, between whites and racial minorities, and between racial minorities and law enforcement (the most noticed being the riots of 1965, 1968, and 1992).  Liberals who had initiated social change in the 20th century were often associated with the various minority groups who battled against conservatives and law enforcement, and this association “transformed the folk imagery of liberalism” into the poisoned source of conservative angst.  Jonathan Rieder argued that America became a “culture of incivility” as “tension” between conservatives and liberals and between conservatives and minority groups turned from impassioned argument to “outright feuding” and “unabashed denunciation.”[lxxvii]  This angry debate would come to be called a “culture war” and the reactionary conservative rhetoric seemed to define the parameters of this war of words.  Eric Foner noted that by the 1990s “virtually no politician would admit to being a liberal,” while “conservative assumptions” about the benefits of the free market, the evils of “big government,” and the unquestioned good of conservative values (like the family, national unity, and patriotism) were taken for granted in public discourse as gospel truths.  Conservatives began using their political capital and rhetorical appeal to attack not only the liberal welfare state, but more visibly the symbols of liberal decadence and national decline: funding for the arts and humanities, the national public school curriculum and curricular standards, and what they portrayed as the decline of higher education under multicultural policies.[lxxviii]


A Rhetoric of Debate: Towards a Sociology of American Culture Wars

In the midst of the culture wars, some academics (with mostly liberal sympathies, to be sure) were more interested in understanding the nature of the conflict and how both sides might be brought into a more socially productive exchange.  In 1991 James Davison Hunter published Culture Wars: The Struggle to Define America.[lxxix]  At the time this book was the most comprehensive sociological and historical treatment of America’s culture wars.  Hunter’s objective was to sort through the charges and accusations on both sides of the debate in order to come to a sociological understanding of why the culture war was taking place and, further, to draw conclusions about what the culture war meant for American society, institutions, and politics.

According to Hunter, the culture war[lxxx] was at root a moral debate over “what is fundamentally right and wrong about the world we live in – about what is ultimately good and what is finally intolerable in our communities:” “At stake is how we as Americans will order our lives together.”  It was a debate over “national identity,” the very “meaning of America,” and perhaps more importantly “who we, as a nation, will aspire to become.”  Many of the participants in the culture war were sincere and “reasonable” people who felt themselves “thrust into controversy” because their “moral commitments,” their “bases of moral authority,” and their “world views” “compelled” them to defend fundamental truths they held dear.  For most participants and viewers of the debate, all knowledge of the issues, the participants, and the war itself was filtered through the various mass media, which by their very forms are highly limited in their coverage and overly focused on the “personalities and events of the moment.”  Hunter explained that the “personal disagreements that fire the culture war” were deep and perhaps irreconcilable.  But he also suggested that “these differences are often intensified and aggravated by the way they are presented in public.”[lxxxi]

Hunter traced the historical roots of the culture war to the presence of “various minority cultures” (based on religion, sexuality, and race) that have confronted and competed with a “Protestant-based populism” for control over definitions of American “social reality.”  Over the last two centuries of U.S. history there has been a general “expansion of cultural tolerance” that has accompanied the “slow but steady expansion” of “political and ideological tolerance,” “racial tolerance,” and “sexual tolerance.”  One of the most dynamic transformations has been the recent emergence of the “Judeo-Christian consensus” in the 20th century.  However, Hunter argued that this consensus was “collapsing” because of a broader “expansion of pluralism,” which included many communities beyond the ideological boundaries of the Judeo-Christian worldview (secularists, non-Judeo-Christian religions, feminists, and homosexuals).  Hunter explained that “tension” between religious, racial, and ideological groups has always existed in various degrees and will most likely never subside because “cultural conflict”[lxxxii] continues to evolve “along new and in many ways unfamiliar lines,” and because competing ideological and moral visions are rarely “coherent, clearly articulated, sharply differentiated world views.”[lxxxiii]  Hunter simplified the culture war into a broad, polarized debate between “the orthodox” (cultural conservatives) and “the progressive” (liberals or libertarians).[lxxxiv]  The debate was over whose culture would “dominate” and, thereby, who would have the “power to define reality.”  Because the debate focused on competing definitions of reality, it was by its nature a highly symbolic war where competing symbols were used to define and legitimate different practices, ideals, and virtues in the public realm.  This war over symbols has taken place on various battlefields: the family, education, media and the arts, law, and electoral politics.[lxxxv]  At the heart of this symbolic war were competing “moral visions” of American history, American identity, and American freedom – all based on competing sources of “moral authority.”  The orthodox Americans saw America as the embodiment of Judeo-Christian Providence, exceptionalism, and destiny.  To them American liberty was based on righteousness, and all personal and economic freedom was based on the bounty and grace of God as documented in the sacred text of the Bible.  Progressives placed faith in human reason and social responsibility, and they placed moral authority in the rule of law, philosophical principles, and democratic politics – all “living” and malleable human creations that “must evolve as society evolves and matures.”  To progressives, American liberty was the freedom from all constraints (under the conditions of liberal philosophies set by John Stuart Mill and Charles Taylor) based on the political rights of individuals.  Because of the deep ideological and moral divide based on competing moral authorities and expressed in different “moral languages,” Hunter argued, “In the final analysis, each side of the cultural divide can only talk past the other” because “what both sides bring to this public debate is, at least consciously, non-negotiable.”[lxxxvi]

Hunter basically agreed with Pat Buchanan that the culture wars were fundamentally a religious war because “what is ultimately at issue are different conceptions of the sacred.”  But unlike Buchanan (who took up arms to defend his group in the war), Hunter asked a question: Can the American republic survive without a “common agreement as to what constitutes the ‘good’” because, without such an agreement, “all that remains are competing interests, the power to promote those interests, and the ideological constructions to legitimate those interests?”  Hunter argued, no, some common ground must be found.  He put forth the possibility of a “new, common rationality, a new unum wherein public virtue and public civility can be revitalized.”  But to achieve common ground, Hunter argued, Americans must first come to an agreement over “how to publicly disagree,” i.e., formalizing disagreement within the “virtues” of an “authentic” democratic debate.  And from there, he argued, Americans must come to terms with a “principled pluralism” and a “principled toleration” with which to guide future negotiation over the parameters of a deeply divided American culture.[lxxxvii]

A year later Gerald Graff published his award-winning[lxxxviii] Beyond the Culture Wars: How Teaching the Conflicts Can Revitalize American Education (1992).  Graff’s book tried to outline the very techniques and virtues of authentic democratic debate in an effort to encourage people to really engage the debate through listening to their opponents.  He argued that too many Americans were sheltered in their own ideological cocoons, and thus, were shut out from opposing points of view.  Because of a heightened sense of ideological warfare, many Americans adopted a siege mentality by which they withdrew into safe intellectual communities, but this was creating a dangerous “communicative disorder:” “a good deal of American life is organized so as to protect us from having to confront those unpleasant adversaries who may be just the ones we need to listen to.”  Graff attempted to address and understand some heated debates within his own field (literary studies) in order to find a common ground that can only be gained through an honest appraisal of the merits and limitations of both sides of the debate.  His technique was also a pedagogical demonstration of how the culture wars can be taught in classrooms as a way to both understand and defuse the tensions produced by competing points of view.[lxxxix]

Graff chastised many conservative critics for their “apocalyptic posturing” and their refusal to see opponents’ positions as “legitimate” and “worthy of debate.”  Graff also took conservative critics to task for their “degree of exaggeration, patent falsehood, and plain hysteria,” which boiled down to a “simple fear of change.”  For instance, Graff singled out the prominent critic Dinesh D’Souza, whose Illiberal Education: The Politics of Race and Sex on Campus (1991) claimed that universities were “expelling” and “stripping” away all the old liberal arts classics to make way for new multicultural texts.  Through a close examination of the actual state of university reading lists, Graff pointed out that this claim and its various offspring were based on “recycled evidence” that was “wildly inflated,” “grossly exaggerated,” and “provably false.”  Graff concluded, “To put it simply, the critics have not been telling the truth.”  What was actually happening to the literary canon was a process of change by “accretion at the margins,” which had been going on for at least a century or more.  Graff argued that the “caricaturing practice” and “political polemics” of conservative critics “obscured the fact that virtually every major advance in humanistic scholarship over the last three decades is indebted to the movements that are widely accused of subverting scholarly values.”  Graff was not saying that every new theory or academic school of thought delivers an unquestionable truth, but he did argue that new perspectives should be welcomed and honestly evaluated to see if they can expand the boundaries of and add to human knowledge.  Graff used the example of Chinua Achebe’s critical reading of Joseph Conrad’s famous canonical work Heart of Darkness.  While Graff does not completely agree with Achebe’s criticism, he does admit that Achebe has a good point, which stems from Achebe’s different but valid cultural perspective.  In Graff’s classroom, he does not present one reading of Conrad’s novel as the true interpretation, but instead teaches the novel “as part of a critical debate about how to read it, which in turn is part of a larger theoretical debate about how politics and power affect the way we read literature.”[xc]  Graff’s technique acknowledges and investigates some of the debates at the heart of the cultural war in an effort to legitimize the very real conflict that does exist in America and, thereby, teach his students to democratically debate the issues as “a debate, not a monologue” through an examination of multiple perspectives.  Graff argued, “I think frank discussion of these conflicts is more likely to improve our handling of them than pretending they do not exist.”[xci]

Graff argued that America’s system of education was put into a tough position with the culture wars.  Many people, especially conservatives, viewed education as a “conflict-free” and value-neutral tradition.  However, as Graff pointed out, education has always been affected by the conflicts of the wider culture, especially higher education, which in the 20th century has had the “deeply contradictory mission” of preserving honored traditions while also producing new knowledge by questioning those very traditions.  The educational system has reflected changes due not only to the “democratization of culture” produced by the counter-cultures of the 1960s, but also to the advancement of knowledge produced by the structure of the academy.  Graff argued that the boundaries of a culture and the frontiers of knowledge have always been contested and debated.  Many conservative critics talk of a “consensus” or a “common culture” as if “it were already finished and completed, something that people just ‘affirm’ or don’t affirm rather than something people struggle to create through democratic discussion.”  The 1960s did not create “divisiveness and difference” in America.  Multiple cultures have always been a part of the landscape.  Graff argued that the culture war boils down to the very stuff that democracies are made of: a diverse “common discussion” over the public good and public policy.  Thus, Graff framed his solution in terms of understanding, realizing, and practicing an inclusive democracy: “We need to distinguish between a shared body of national beliefs, which democracies can do nicely without, and a common national debate about our many differences, which we now need more than ever…[multiculturalists] are not rejecting the idea of a common culture so much as asking for a greater voice in defining it.”[xcii]

Michael Kazin and Joseph A. McCartin have tried to point scholars in the direction of Americanism.  The nationalist ideology of Americanism is not only “vast” and “protean,” but “famously contested.”[xciii]  In a broad sense Americanism represents both a “distinctive” socio-political identity of U.S. citizens and also a particular brand of “loyalty” to the American nation.  More particularly Americanism is a “bundle of ideals” with “shifting content” that has “always” been fought over; however, the parameters of Americanism seem to roughly cohere due to a civic foundation of “shared political ideas.”  Kazin and McCartin claim that the concept of Americanism dates back to the first European settlements.  John Winthrop’s “city upon a hill,” John Adams’ invocation of “Providence,” and Tom Paine’s notion of America as “an asylum for mankind” all represent a particular redemptive and exceptional conception of America and its socio-political ethos.  Since then Americanism “has been put to a variety of uses, benign and belligerent, democratic and demagogic,” and while Americanism is often most associated with more conservative forms of nationalism and patriotism in the service of protecting the status quo, it also contains a “vital countertradition” of dissent.

Historians like David Hollinger have argued that scholars must understand and deal with Americanism because it has become “the most successful nationalist project in all of modern history.”  Kazin and McCartin argue that Americanism must be studied on “its own terms” so as to understand it as a “well-developed, internally persuasive ideology” and, thereby, “concerned” citizens could shape it towards “more benevolent” ends by “learning how to speak effectively within its idioms.”  Ultimately Kazin and McCartin suggest that “the ideals of Americanism” could be the “foundation of a new kind of progressive politics” – a politics where the left can “speak convincingly to their fellow citizens” and thus “pose convincing alternatives for the nation as a whole.”  While thoughtful scholars like Martha Nussbaum have argued that patriotism and nationalism are “morally dangerous,” Kazin and McCartin argue that nationalism is a fixture of the modern world and thus “instead of raging against their persistence, we should view them empathetically, doing what we can to help realize the best rather than the worst possibilities of faith in a country and its people…we must do more than rail against patriotic ideals and symbols.  For to do so is to wage a losing battle…progressives should claim, without pretense or apology, an honorable place in the long line of those who have demanded that Americanism apply to all and have opposed the efforts of those who have tried to reserve its use for privileged groups and belligerent causes.”

Understanding and subscribing to a shared concept of Americanism implies a sense of national identity, but it does so more in terms of place and procedure than ideology.  Americanism is an ambiguous and conflicting bundle of attitudes and ideological commitments, and it holds within its diversity a common commitment to a shared sacred ground.  America as a social, political, cultural, and economic territory is the ground over which various American parties have physically and ideologically wrestled for centuries.  Americanism is not an identifiable ideology per se, but it is the identification of an individual or group as “American” in order to stake one’s territorial claim to freedom, opportunity, and justice.  Thus, as I mentioned in the introduction, America is in essence an institutionalized debate wherein Americans have verbally and physically fought over what America is and should be.  Given the complex dynamics of the history of human society and the ecological flux of the natural world, I don’t think that there has ever been a stable, unified, or traditional notion of Americanism.  I don’t agree with much of what Crevecoeur wrote, but I do think he was right when he said that America produced a “surprising metamorphosis.”  Crevecoeur invoked the notions of patria and alma mater as a way of saying that America was the sum of its individuals interacting with the land and producing a nation through their work, their conflict, and their claims of “consequence.”[xciv]  In this sense the creation of an American nation is the compound and conflicting interaction of diverse parties staking their claim to a single territory.  Not all parties have been equally powerful, just, successful, or free, but all parties have verbally and physically struggled with the land and its inhabitants to survive, and, in surviving, have laid a claim of consequence in this nation as one of its own. 

Thus, as I mentioned earlier, the disagreement over national identity (What is America, and who is an American?) is the true essence that unites all Americans.  An American is one who stakes a claim of consequence in America and contributes their voice and their demands to the never-ending debate over Americanism.  There will always be diversity and calls for unity.  There will always be culture wars and disagreement.  The hope of Americans, if hope is to be found, lies in what the philosopher, linguist, and literary critic Kenneth Burke once called Ad Bellum Purificandum – “Towards the Purification of War.”  By this phrase Burke meant to direct attention to language as the “critical moment” at which human motives take form.  Burke argued that a purification of the human ability (and need) to articulate identity, ideology, and purpose into language would be a great help in developing personal agency and social cooperation.[xcv]  It seems to me that while the debate over national identity and purpose can never be resolved, there is the possibility that the method of debate – the tools of discussion and deliberation – might themselves be perfected, as Burke maintained, and thereby, if we as Americans cannot erase our disagreement, we may learn to more productively and peacefully disagree. 



Endnotes

[i] Abigail Adams, “Letter to John Adams 31 March 1776” & “Letter to John Adams 7 May 1776,” in The Letters of John and Abigail Adams, Frank Shuffelton, ed. (New York: Penguin Books, 2004): 147-49, 168.  For John Adams reply to Abigail see “Letter to Abigail Adams 14 April 1776” (154).

[ii] Mia Bay, “See Your Declaration Americans!!!  Abolitionism, Americanism, and the Revolutionary Tradition in Free Black Politics.”  In Americanism: New Perspectives on the History of an Ideal, ed. Michael Kazin and Joseph A. McCartin (Chapel Hill, NC: The University of North Carolina Press, 2006): 25-52.

[iii] Mia Bay, “See Your Declaration Americans!!!,” Ibid.

[iv] De Crevecoeur, J. Hector St. John, “Letter III: What is an American,” in Letters from an American Farmer, Susan Manning, ed. (1782; reprint, Oxford: Oxford University Press, 1997): 40-82.

[v] Reginald Horsman, Race and Manifest Destiny: The Origins of American Racial Anglo-Saxonism (Cambridge, MA: Harvard University Press, 1981).

[vi] Elizabeth Cady Stanton, Address to the Legislature of New-York, Adopted by the State Woman’s Rights Convention, Held at Albany, Tuesday and Wednesday, February 14 and 15, 1854 in The Norton Anthology: Literature by Women, The Tradition in English, 2nd ed., Sandra M. Gilbert and Susan Gubar, eds. (1854; reprint, New York: W. W. Norton, 1996):466-68.

[vii] Joseph J. Ellis, Founding Brothers: The Revolutionary Generation (New York: Vintage, 2000): 16.

[viii] In the Declaration of Independence Jefferson clearly declared the right of Americans to “alter” or “abolish” any government that did not represent and protect the people’s natural rights.  While not explicitly advocating violence, any abolishment of an existing institution would arguably necessitate force or violence of some kind, and thus, forcibly defending one’s rights is arguably a right of the people.  This statement seems to thereby justify and institutionally consecrate the right of political violence directed at securing and protecting other natural rights.

[ix] David A. Hollinger claimed that “virtually no one defends monoculturalism.”  David A. Hollinger, Postethnic America: Beyond Multiculturalism (1995; reprinted & expanded, New York: Basic Books, 2005): 80.

[x] Allan Bloom, The Closing of the American Mind (New York: Touchstone, 1987): 19, 26-27, 30-31.

[xi] Bloom, Ibid., 36, 38-39, 247.

[xii] “The fact is that the average black student’s achievements do not equal those of the average white student in the good universities, and everybody knows it.  It is also a fact that the university degree of a black student is also tainted, and employers look on it with suspicion, or become guilty accomplices in the toleration of incompetence” (96).  Bloom stated flatly: blacks are “manifestly unqualified and unprepared” for good universities (94).

[xiii] Bloom, Ibid., 313-15, 318, 320-22, 326, 329, 55, 97, 382.

[xiv] E. D. Hirsch, Jr., Cultural Literacy: What Every American Needs to Know (1987; reprinted, New York: Vintage, 1988): 12, 29, 110, xvii, 95, 102, 23, 24, 18, 73.

[xv] Ibid., 18.

[xvi] E. D. Hirsch Jr., “Americanization and the Schools,” The Clearing House 72:3 (Jan/Feb, 1999): 136-39.


[xvii] Arthur M. Schlesinger, Jr., The Disuniting of America: Reflections on a Multicultural Society, revised ed. (1991; revised, New York: W. W. Norton, 1998): 80, 115, 142, 123, 19, 54.

[xviii] Ibid., 158, 34, 144, 132, 17, 46, 34, 142-47, 49.

[xix] Patrick J. Buchanan, “1992 Republican National Convention Speech,” Republican National Convention, Houston, TX (August 17, 1992) <www.buchanan.org>.

[xx] Patrick J. Buchanan, “The Cultural War for the Soul of America” (Sept 14, 1992) <www.buchanan.org>.

[xxi] Samuel P. Huntington, Who Are We? The Challenges to America’s National Identity (New York: Simon & Schuster, 2004).

[xxii] Huntington’s discussion of identity is very confused because he tries to meld essentialist Cartesian dualism with modern constructivism, and then hold it together with a fascist militarism.  Huntington tells his readers that identities and cultures are “constructed” by people, adapted to environments, and change as environments and peoples change.  However, he also posits an unproblematical “substance” or “qualities” of “self” that are “possessed” by a person and which make that individual “distinct.”  But he further muddies his discussion by saying that identities are not substantial, but contextual, i.e. “to define themselves, people need an other.”  This contextual discussion argues against a “substantial” core of human identity and instead posits identity as an I/we contextually defined against a you/them, which according to Huntington inevitably leads to “competition,” “antagonism,” “demonization,” and finally the transformation of the “other” into an “enemy” that must be fought and killed (21-26).  Without invoking the concept, Huntington is replicating Sartre’s critical master/slave dialectic, by which Sartre pointed out how human identity and society are reproduced through ego-centrism, intolerance, antagonism, violence, and war.  Throughout the book Huntington celebrates war as the primary source for national cohesion, unity, and identity, and he devotes a section to “The Search for An Enemy.”  In this section Huntington claims that “peace” and the absence of an “enemy” produce “internal disunity,” and thus, in order to protect national identity, America needed to find an enemy after the Cold War, which turned out to be “militant Islam,” “America’s first enemy of the twenty-first century” (258-64).  Throughout the book Huntington reminds his readers with nationalist glee that it was only after 9-11 and America’s militaristic response that a heightened sense of patriotism and nationalism produced a sense of national unity not seen since World War II or the early Cold War (3-4, 199, 264). 

[xxiii] I will provide two examples of Huntington’s many flawed arguments and historical inaccuracies.  First, Huntington claims that slavery “and its legacies” have been “the American dilemma” [author’s emphasis], which is demonstrably false.  He then sets up a dichotomy.  On the one hand, laudable and selfless nationalist black leaders like Martin Luther King Jr. sought “equal rights for all” in order to solve this central dilemma.  On the other hand, there were negative black leaders like Bayard Rustin who helped institute “affirmative discrimination”/”reverse discrimination” through narrow-minded and self-interested demands for “material benefits to blacks as a distinct racial group” (146-158).  Huntington admits that the American creed of equal rights for all was “ignored and flouted in practice” for “over two hundred years.”  The Civil Rights legislation made things truly equal for the first time in American history, and yet he has the audacity to argue that it was black people and affirmative action policies that “reintroduced racial discrimination into American practice” (157).  He basically makes the argument that racism disappeared overnight in 1965 only to be reintroduced by greedy blacks who only wanted to profit off the displacement of innocent white Americans.  The second flawed argument is representative of his treatment of much of the scholarly literature in this book.  Huntington cites Milton Gordon’s seminal yet outdated (1964) sociological study of assimilation in America.  Instead of a close reading of Gordon’s central arguments, Huntington just lists off several quotes and makes the claim that while assimilation has “never been complete,” it has worked extremely well and is “a great, possibly the greatest, American success story” (183).  Huntington’s claim completely misrepresents Gordon’s argument, which was that all successful assimilation in America has been superficial “cultural assimilation” (by which immigrants and minorities adopt the culture and language of the dominant culture); Gordon went on to demonstrate that many ethnic minorities, especially dark-skinned racial minorities, still suffer prejudice and discrimination and were kept from the more significant “structural assimilation.”  Huntington makes the demonstrably false claim that all immigrants between 1820 and 1924 were “almost totally assimilated into American society” on equal and welcoming terms (178), and he uses this unfounded assertion to severely criticize newer immigrants, like Mexicans, as threats to American society because they are not assimilating as completely as older generations of immigrants. 

[xxiv] Ibid., 365, 9, xv-xvii, 256.

[xxv] Ibid., xvii, 10-11, 171-73, 144.  Huntington engages in classical populist/progressive rhetoric throughout the book by breaking the culture war into a dichotomous debate between the interests of the “American people” and the special “minority” interests represented by the “elites.”  Throughout the book this debate is rhetorically described in terms of a zero-sum competition, whereby what is good for minorities (multiculturalism) must be detrimental to and detract from the American people (nationalism).  His rhetorical characterization of multiculturalism-as-minority-rights runs from simplistic-and-unfair (“the idea that diversity rather than unity or community should be America’s overriding value”) to unfair half-truths (“reverse discrimination”) to outright distortions and lies (multiculturalism comes only “at the expense of teaching the values and culture that Americans have had in common”) (142, 154, 173). 

[xxvi] Ibid., xvii, 10-11, 171-73, 144, 309-16.

[xxvii] Michael Walzer, “Pluralism: A Political Perspective,” in Harvard Encyclopedia of American Ethnic Groups (Cambridge: Belknap Press of Harvard University, 1980).  Reprinted in Michael Walzer, What It Means to Be an American: Essays on the American Experience (New York: Marsilio, 1996).

[xxviii] Michael Walzer, “What Does it Mean to Be an ‘American?’” Social Research (1990); Reprinted in Michael Walzer, What It Means to Be an American: Essays on the American Experience (New York: Marsilio, 1996).

[xxix] Liah Greenfeld, Nationalism: Five Roads to Modernity (Cambridge: Harvard University Press, 1992): 482-84.

[xxx] Jennifer L. Hochschild, Facing Up to the American Dream: Race, Class, and the Soul of the Nation (1995; reprint, Princeton: Princeton University Press, 1996).

[xxxi] Ibid., xiv, 15-58, 249, 252, 259.

[xxxii] Todd Gitlin, The Twilight of Common Dreams: Why America is Wracked by Culture Wars (New York: Metropolitan Books, 1995): 2-3.

[xxxiii] Ibid., 20, 23, 29-36.

[xxxiv] Ibid., 45, 48-51, 56-59.

[xxxv] Ibid., 68-73, 79, 82, 100-01, 146, 165, 198-99, 207, 217, 236-37.

[xxxvi] Michael Lind, The Next American Nation: The New Nationalism and the Fourth American Revolution (New York: Free Press, 1995).  Lind claimed that his book was “the first manifesto of American liberal nationalism” (15).

[xxxvii] Ibid., 5-9, 20-27, 55, 65-70, 89, 97-115, 119.

[xxxviii] Lind argued, “Racial preference is in reality a conservative policy, a form of elaborate but ultimately superficial tokenism that is much less costly, to affluent whites in general and the business class in particular, than expensive universal programs designed to improve the educations and standard of living of the bottom half of the population, of all races.  Compared to color-blind liberalism, racial preference is cheap” (179).

[xxxix] Ibid., 123, 130-31, 139, 141, 181-85, 188-215, 245.

[xl] Ibid., 259-98.

[xli] David A. Hollinger, Postethnic America: Beyond Multiculturalism (1995; reprinted & expanded, New York: Basic Books, 2005).

[xlii] Ibid., 1, 6-7, 19-28, 82-84, 106, 116, 132-34, 143, 157.

[xliii] Gary Gerstle, “Liberty, Coercion, and the Making of Americans,” The Journal of American History 84:2 (Sept 1997): 524-58.

[xliv] Thomas Bender, ed., Rethinking American History in a Global Age (Berkeley: University of California Press, 2002).

[xlv] Bonnie Honig, Democracy and the Foreigner (Princeton: Princeton University Press, 2001): 74-75, 104-5, 122.

[xlvi] John Exdell, “Liberal Nationalism, Immigration, and Race,” at Reclaiming Democracy: Visions and Practices from the Radical Left, Radical Philosophy Association, 7th Biennial Conference, Creighton University, Omaha, Nebraska, November 4, 2006.

[xlvii] Liah Greenfeld, Nationalism: Five Roads to Modernity (Cambridge: Harvard University Press, 1992): 4-14; Guido Zernatto, “Nation: The History of a Word,” Review of Politics 6 (1944): 351-66; Max Weber, Wirtschaft und Gesellschaft in From Max Weber: Essays in Sociology, H. H. Gerth and C. Wright Mills, eds. (1946; reprint, Oxford: Oxford University Press, 1958), 171-79; Louis Wirth, “Types of Nationalism,” The American Journal of Sociology 41 (May 1936): 723-37; Hans Kohn, “The Nature of Nationalism,” The American Political Science Review 33 (Dec 1939): 1001-21; Chong-Do Hah and Jeffrey Martin, “Toward a Synthesis of Conflict and Integration Theories of Nationalism,” World Politics 27 (April 1975): 361-86; Isaiah Berlin, “Nationalism: Past Neglect and Present Power,” Against the Current: Essays in the History of Ideas, in The Proper Study of Mankind: An Anthology of Essays, Henry Hardy and Roger Hausheer, eds. (1979; reprint, New York: Farrar, Straus and Giroux, 1997): 581-604; Benedict Anderson, Imagined Communities: Reflections on the Origin and Spread of Nationalism (1983; reprint, London: Verso, 1991); Eric Hobsbawm, Nations and Nationalism since 1780: Programme, Myth, Reality (1990; reprint, Cambridge: Cambridge University Press, 2000).

[xlviii] Eric Hobsbawm, The Age of Revolution, 1789 – 1848 (1962; reprint, New York: Vintage Books, 1996); Eric Hobsbawm, The Age of Capital, 1848 – 1875 (1975; reprint, New York: Vintage Books, 1996); Eric Hobsbawm, The Age of Empire, 1875 – 1914 (1987; reprint, New York: Vintage Books, 1989); Eric Hobsbawm, Nations and Nationalism since 1780: Programme, Myth, Reality (1990; reprint, Cambridge: Cambridge University Press, 2000).

[xlix] Anthony Smith, “Nationalism and Classical Social Theory,” The British Journal of Sociology 34 (Mar 1983): 19-38; Eric Hobsbawm, Nations and Nationalism since 1780: Programme, Myth, Reality (1990; reprint, Cambridge: Cambridge University Press, 2000); Louis Wirth, “Types of Nationalism,” The American Journal of Sociology 41 (May 1936): 723-37; Hans Kohn, “The Nature of Nationalism,” The American Political Science Review 33 (Dec 1939): 1001-21; Liah Greenfeld, “The Trouble with Social Science,” Critical Review 17:1-2 (2005): 101-16.

[l] Liah Greenfeld, Nationalism: Five Roads to Modernity, Ibid., 27-87.

[li] Ibid., 400-02, 424, 426, 431, 444.

[lii] Robert A. Carlson, The Quest for Conformity: Americanization through Education (New York: John Wiley and Sons, 1975); Michael Lind, The Next American Nation: The New Nationalism and the Fourth American Revolution (New York: Free Press, 1995); Samuel P. Huntington, Who Are We? The Challenges to America’s National Identity (New York: Simon & Schuster, 2004).

[liii] Alexis de Tocqueville, Democracy in America, Harvey C. Mansfield and Delba Winthrop, trans. & eds. (1835, 1840; reprint, Paris: Editions Gallimard, 1992; reprint, Chicago: University of Chicago Press, 2000); Joyce Appleby, Lynn Hunt, and Margaret Jacob, “History Makes a Nation,” in Telling the Truth About History (New York: W. W. Norton & Company, 1994): 91-125; David M. Potter, “The Historian’s Use of Nationalism and Vice Versa,” The American Historical Review 67 (July 1962): 924-50; Liah Greenfeld, Nationalism: Five Roads to Modernity (Cambridge: Harvard University Press, 1992); Robert H. Wiebe, The Search For Order, 1877 – 1920 (New York: Hill and Wang, 1967); Robert Wiebe, “Framing U.S. History: Democracy, Nationalism, and Socialism,” in Rethinking American History in a Global Age, Thomas Bender, ed. (Berkeley: University of California Press, 2002): 236-49; William Earl Weeks, “American Nationalism, American Imperialism: An Interpretation of United States Political Economy, 1789-1861,” Journal of the Early Republic 14 (Winter 1994): 485-95.

[liv] Thomas J. Archdeacon, Becoming American: An Ethnic History (New York: The Free Press, 1983); Karen Ordahl Kupperman, “International at the Creation: Early Modern American History,” in Rethinking American History in a Global Age, Thomas Bender, ed. (Berkeley: University of California Press, 2002): 103–22; Linda K. Kerber, “The Republican Ideology of the Revolutionary Generation,” American Quarterly 37 (Autumn 1985): 474-495; Daniel T. Rodgers, “Republicanism: The Career of a Concept,” The Journal of American History 79 (June 1992): 11-38; Joyce Appleby, “Republicanism and Ideology,” American Quarterly 37 (Autumn 1985): 461-473; Liah Greenfeld, Nationalism: Five Roads to Modernity (Cambridge: Harvard University Press, 1992); Robert Shalhope, “Anticipating Americanism: An Individual Perspective on Republicanism in the Early Republic,” In Americanism: New Perspectives on the History of an Ideal, Michael Kazin and Joseph A. McCartin, eds. (Chapel Hill: The University of North Carolina Press, 2006): 53-72.

[lv] As John Jay famously wrote: Americans were one people “descended from the same ancestors, speaking the same language, professing the same religion, attached to the same principles of government, very similar in their manners and customs.”  However, Michael Walzer has argued that the American constitution embodied the fragmented cultural diversity of the new republic.  The constitution is not one, but “two texts” (the Constitution and the Bill of Rights).  Walzer argues that the Bill of Rights actually “opposes” the Constitution by securing protection for diversity.  Michael Walzer, “Constitutional Rights and the Shape of Civil Society,” What it Means to Be an American: Essays on the American Experience (New York: Marsilio, 1996): 105-24.  Linda K. Kerber, “The Revolutionary Generation: Ideology, Politics, and Culture in the Early Republic,” in The New American History, Eric Foner, ed., 2nd ed. (Philadelphia: Temple University Press, 1997): 31-59; Joseph J. Ellis, Founding Brothers: The Revolutionary Generation (2000; reprint, New York: Vintage Books, 2002); Alexander Hamilton, James Madison, and John Jay, The Federalist, Benjamin Fletcher Wright, ed. (1787-1788; reprint, Cambridge: Harvard University Press, 1961; reprint, New York: Metrobooks, 2002): see especially #1, #2, #6, and #10; Colin G. Calloway, The American Revolution in Indian Country: Crisis and Diversity in Native American Communities (Cambridge: Cambridge University Press, 1995); Liah Greenfeld, Nationalism: Five Roads to Modernity (Cambridge: Harvard University Press, 1992): 422-23.

[lvi] Noah Pickus, True Faith and Allegiance: Immigration and American Civic Nationalism (Princeton: Princeton University Press, 2005).

[lvii] The Naturalization Act of 1795 extended the residency requirement to five years and added the requirement that the immigrant declare intention to naturalize at least three years in advance.  It also required applicants to renounce all family titles and titles of nobility.  Citizenship was restricted further through the Sedition Act, Alien Enemies Act, Alien Friends Act, and the Naturalization Act of 1798 whereby residency requirements were extended to 14 years and new powers gave the president wide latitude to arrest and deport aliens (34-51). 

[lviii] Pickus explained that many of the Founding generation believed that slavery was immoral and should be discontinued, but this did not mean that they wanted to extend citizenship rights to freed blacks or welcome them into civil society.  Even most abolitionists who believed blacks to be inherently equal to whites did not think that freed black slaves could assimilate into American society.  The “free white” clause in the Naturalization Act was meant to forestall any thought of incorporating blacks into civil society should slavery eventually be abolished (56-58, 61-62).

[lix] Ibid., 15, 17, 19, 22-23, 24-25, 34-51, 53-62.

[lx] Ibid., 64-71.

[lxi] Michael Kazin & Joseph A. McCartin, “Introduction,” In Americanism: New Perspectives on the History of an Ideal, Michael Kazin & Joseph A. McCartin, eds. (Chapel Hill: The University of North Carolina Press, 2006): 1-21; Robert James Branham, “‘Of Thee I Sing’: Contesting ‘America,’” American Quarterly 48:4 (1996): 623-52; Linda K. Kerber, “The Revolutionary Generation: Ideology, Politics, and Culture in the Early Republic,” in The New American History, Eric Foner, ed., 2nd ed. (Philadelphia: Temple University Press, 1997): 31-59; Eric Foner, The Story of American Freedom (New York: W. W. Norton & Co., 1998); Reginald Horsman, Race and Manifest Destiny: The Origins of American Racial Anglo-Saxonism (Cambridge: Harvard University Press, 1981); Mia Bay, “See Your Declaration Americans!!!  Abolitionism, Americanism, and the Revolutionary Tradition in Free Black Politics,” In Americanism: New Perspectives on the History of an Ideal, Michael Kazin and Joseph A. McCartin, eds. (Chapel Hill: The University of North Carolina Press, 2006): 25-52.

[lxii] David M. Potter, “The Historian’s Use of Nationalism and Vice Versa,” The American Historical Review 67 (July 1962): 924-50; Roger L. Ransom, Conflict and Compromise: The Political Economy of Slavery, Emancipation, and the American Civil War (1989; reprint, Cambridge: Cambridge University Press, 1995); Eric Foner, Forever Free: The Story of Emancipation and Reconstruction (New York: Alfred A. Knopf, 2005).

[lxiii] Nell Irvin Painter, Standing at Armageddon: The United States, 1877 – 1919 (New York: W. W. Norton & Company, 1987): xii-xiii, xix, xxiv, xl, xliii, 279-80.

[lxiv] William H. Chafe, The Unfinished Journey: America Since World War II, 2nd ed. (1986; revised, Oxford: Oxford University Press, 1991): 146-76, 236-37.  Stanley Nelson (director & producer), The Murder of Emmett Till, American Experience (produced by WGBH Educational Foundation, distributed by PBS Home Video, 2003).  Lawrence Mishel, Jared Bernstein, and Sylvia Allegretto, The State of Working America 2006/2007, An Economic Policy Institute Book (Ithaca, NY: ILR Press, an imprint of Cornell University Press, 2007): 283-91.  Michael Harrington, The Other America: Poverty in the United States (1962; revised, New York: Penguin Books, 1981): 185-202, Ch 4.  

[lxv] Tom Hayden, The Port Huron Statement: the Visionary Call of the 1960s Revolution (1962; reprinted, New York: Thunder’s Mouth Press, 2005); Todd Gitlin, The Sixties: Years of Hope, Days of Rage (New York: Bantam Books, 1987).

[lxvi] Ruth Rosen, The World Split Open: How the Modern Women’s Movement Changed America (New York: Penguin Books, 2000): Ch 4.

[lxvii] Matt S. Meier and Feliciano Ribera, Mexican Americans, American Mexicans: From Conquistadors to Chicanos (1972; revised, New York: Hill and Wang, 1993): Ch 14-15; Arnoldo De Leon and Richard Griswold del Castillo, North to Aztlan: A History of Mexican Americans in the United States, 2nd ed. (1996; revised, Wheeling, IL: Harlan Davidson, 2006): Ch 8.

[lxviii] Eric Foner, The Story of American Freedom (New York: Norton, 1998): Ch 12; Cal Jillson, Pursuing the American Dream: Opportunity and Exclusion over Four Centuries (Lawrence, KS: University Press of Kansas, 2004); Vine Deloria, Jr., Spirit and Reason: The Vine Deloria, Jr., Reader (Golden, CO: Fulcrum Publishing, 1999): Ch 20; Michael Novak, The Rise of the Unmeltable Ethnics: Politics and Culture in the Seventies (New York: Macmillan, 1973); Maurice Isserman and Michael Kazin, “The Failure and Success of the New Radicalism,” The Rise and Fall of the New Deal Order, 1930-1980, Steve Fraser and Gary Gerstle, eds. (Princeton: Princeton University Press, 1989): 212-42; Tom Wolfe, The Electric Kool-Aid Acid Test (1968; reprint, New York: Bantam, 1999).

[lxix] Theodore Roszak, The Making of a Counter Culture: Reflections on the Technocratic Society and Its Youthful Opposition (New York: Anchor Books, 1969): 42-43, 57.  I am using Roszak’s conception of the “counter culture” in a much broader way than he did in this book (xii, 56, 68).  Ruth Rosen, The World Split Open, Ibid., 196-97.

[lxx] Alan Dawley, Struggles for Justice: Social Responsibility and the Liberal State (Cambridge: Belknap Press of Harvard University Press, 1991).  From the perspective of workers, Dawley described “corporate rationalization” as “something less than an exercise in pure reason.  What the new breed of scientific managers liked to present as a rational system of efficiency and merit nonetheless contained all the irrationalities of class, race, and gender.  The supposedly impartial bureaucratic hierarchy was also an axis of unequal power between managers and workers…Rationalization introduced new forms of male dominance” (77-78).

[lxxi] Dawley argued that the New Deal reforms actually “preserved social hierarchy:” “Even as the New Deal responded to popular demands for social justice, it was careful not to infringe too much upon the privileges of wealth…the Roosevelt administration had crafted a compromise between privileged elites and subordinate groups that restrained liberty in the name of security without upending the social order…While the first New Deal tried to save the capitalist system for big business, the second tried to save it from big business…Although Roosevelt popularized his program with populist rhetoric, the new governing system did not redress the balance of class power or redistribute wealth so much as mediate social antagonisms by creating a new set of bureaucratic institutions.  Building on Hoover’s initiatives, Roosevelt’s New Deal expanded state intervention in the market and launched a welfare state” (385, 394, 395).

[lxxii] Ibid., 31, 62, 64.  Dawley further explained his notion of the new models of elite governance: “National elites had to look elsewhere for models of how to govern.  In fact, they experimented with three models.  The first was old fashioned liberalism – a state of courts and parties, a policy of laissez faire on social issues, the use of troops to police industrial disturbances, and the ruling myths of private property right, separate spheres, and white supremacy.  Still the dominant model, it hardly presented an innovative path to the future.  The other two models – progressive and managerial – were rival attempts to resolve the contradiction between emerging social forces and the existing liberal state, and they would compete with each other through the First World War into the New Era and all the way to the New Deal.  They represented alternative revisions in the American liberal tradition of self-government.  Managerial liberals redefined it to mean self-government in industry, emphasizing the public benevolence of the private corporation.  Progressives redefined it in social terms, emphasizing government as the benevolent influence balancing the claims of selfish private interests” (163).

[lxxiii] Ibid., 1-13, 30-31, 62, 71-73, 105, 114-16, 128-38, 163-65, 175-77, 184-96, 276, 370, 394.

[lxxiv] Dawley, Ibid., 370, 385-86, 394.

[lxxv] Alan Brinkley, “The New Deal and the Idea of the State,” in The Rise and Fall of the New Deal Order, 1930-1980, Steve Fraser and Gary Gerstle, eds. (Princeton: Princeton University Press, 1989): 85-121.

[lxxvi] Ira Katznelson, “Was the Great Society a Lost Opportunity?” in The Rise and Fall of the New Deal Order, 1930-1980, Steve Fraser and Gary Gerstle, eds. (Princeton: Princeton University Press, 1989): 185-211.  Both Katznelson and Alan Dawley trace the origins of identity politics to the early 1940s “ethnic pluralism” exemplified in Roosevelt’s invocation of a “nation of nations” (Struggles for Justice, Ibid., 389).

[lxxvii] Jonathan Rieder, “The Rise of the ‘Silent Majority,’” in The Rise and Fall of the New Deal Order, 1930-1980, Steve Fraser and Gary Gerstle, eds. (Princeton: Princeton University Press, 1989): 243-68; Eric Foner, The Story of American Freedom, Ibid., Ch 13; Cal Jillson, Pursuing the American Dream: Opportunity and Exclusion over Four Centuries, Ibid., Ch 8.

[lxxviii] Richard Jensen, “The Culture Wars, 1965-1995: A Historian’s Map,” Journal of Social History 29 (Oct 1995): 17-37.

[lxxix] James Davison Hunter, Culture Wars: The Struggle To Define America (New York: Basic Books, 1991).  See also, James Davison Hunter, “The Discourse of Negation and the Ironies of Common Culture,” Hedgehog Review 6:3 (Fall 2004): 24-38.

[lxxx] Hunter has also pointed to a much larger issue at stake, one that transcends the American and contemporary context.  He has taken a position at odds with the homogenizing conception of culture made by cultural anthropologists like Clifford Geertz.  Hunter has argued, “Culture is, by its very constitution, contested…always and everywhere, even when it appears most homogeneous…Where there is culture, there is struggle” [author’s emphasis].  Culture is often a battle over who has “the power to project one’s vision of the world as the dominant, if not the only vision of the world.”  The creation of “law” or public “policy” is to “create and sustain a normative universe…it is, in short, to take sides on the matter of the public good.”  James Davison Hunter, “Culture Wars Revisited,” Insight 10, Institute for Advanced Studies in Culture and Center for Religion and Democracy (Spring 2004): 5-6. 

[lxxxi] Ibid., 31-34, 42-43, 49-51.  Hunter argued that due to the nature of broadcast media, the culture war is oversimplified and represented as “more polarized than the American public itself…The polarization of contemporary public discussion is in fact intensified by and institutionalized through the very media by which that discussion takes place…Middling positions and the nuances of moral commitment, then, get played into the grid of opposing rhetorical extremes” (159-61).  See also: Karlyn Kohrs Campbell, “Marketing Public Discourse,” Hedgehog Review 6:3 (Fall 2004): 39-54. 

[lxxxii] Hunter defined “cultural conflict” as the “political and social hostility rooted in different systems of moral understanding.  The end to which these hostilities tend is the domination of one cultural and moral ethos over all others.”  These “systems of moral understanding” are “not merely attitudes that can change on a whim but basic commitments and beliefs that provide a source of identity, purpose, and togetherness for the people who live by them.  It is for precisely this reason that political action rooted in these principles and ideals tends to be so passionate.”  Hunter argued that older forms of cultural conflict have given way to a larger clash between “worldviews:” competing groups are battling over “our most fundamental and cherished assumptions about how to order our lives – our own lives and our lives together in this society.  Our most fundamental ideas about who we are as Americans are now at odds” (42).  At root, Hunter argued, “cultural conflict is ultimately about the struggle for domination…[it] is about power – a struggle to achieve or maintain the power to define reality” (52).  When a dominant group secures and exercises this power over sub-groups it is called “cultural hegemony” (57).

[lxxxiii] Hunter argued that “the significant divisions on public issues are no longer defined by the distinct traditions of creed, religious observance, or ecclesiastical politics” (105).

[lxxxiv] Hunter defined “orthodoxy” as “the commitment on the part of adherents to an external, definable, and transcendent authority,” which clearly defines “a consistent, unchangeable measure of value, purpose, goodness, and identity.”  He defined “progressivism” as a “modern” world view built from “a spirit of rationalism and subjectivism:” “Truth tends to be viewed as a process, as a reality that is ever unfolding,” and thus, progressives adapt and “re-symbolize historic faiths according to the prevailing assumptions of contemporary life” (44-45).

[lxxxv] Ibid., 173-291.  Hunter devoted a chapter to each one of these cultural battlefields.

[lxxxvi] Ibid., 67-106, 39-43, 52-55, 107-32.

[lxxxvii] Ibid., 106, 312-14, 318, 325.

[lxxxviii] Winner of the 1993 American Book Award.

[lxxxix] Gerald Graff, Beyond the Culture Wars: How Teaching the Conflicts Can Revitalize American Education (New York: W. W. Norton, 1992): viii.  Graff characterized his teaching-the-conflicts model as a way to turn “the poisonous divisions of the culture war into educationally valuable discussion” (62).  Graff’s technique is a way to teach “critical literacy,” which, borrowing from Mike Rose’s Lives on the Boundary (1989), he defined as “framing an argument or taking someone else’s argument apart, systematically inspecting a document, an issue, or an event, synthesizing different points of view, applying a theory to disparate phenomena, and so on” (91).

[xc] Graff argued that “literature is a social product, enmeshed in a system of more mundane cultural assumptions, texts, and ‘discourses,’ not an autonomous creation springing full-blown from the brain of an unconditioned genius.  The jargon [of a literary theory] is a way of shifting attention to the ‘cultural work’ done by the text, suggesting that the text does not stand above its culture but acts on and is acted on by it.  It points to the conflicts, contradictions, and struggles in works of literature rather than the unifying elements” (79).

[xci] Ibid., 3-5, 8, 16-36.

[xcii] Ibid., 6-8, 44-46.

[xciii] Michael Kazin & Joseph A. McCartin, “Introduction,” In Americanism: New Perspectives on the History of an Ideal, Michael Kazin & Joseph A. McCartin, eds. (Chapel Hill: The University of North Carolina Press, 2006): 1-21.

[xciv] Crevecoeur, Letters from an American Farmer, Ibid., 43-44.

[xcv] Kenneth Burke, A Grammar of Motives (1945; reprint, Berkeley: University of California Press, 1969): 318-20.

The Paradox of Progressivism

A Historiography of a Concept and a Political Movement

originally written 2011

What Was the Progressive Movement?

John R. Commons used the term “Progressive” in the 1890s as an idea foreshadowing a new social and political orientation that was challenging laissez-faire individualism, but he was not explicit about what the term meant.  By 1897 Albion Small noticed a new reformist impulse in the U.S. and a rising “social movement,” but was not sure if a few initial stirrings of reform would lead toward a programmatic platform that could create widespread social change.[i]  Daniel T. Rodgers has written that the word “progressive” was used by Woodrow Wilson in 1911; Wilson had prefaced its political meaning during the 1910 electoral campaigns by saying it was still a “new term.”  The rhetorical identification of a Progressive “movement” seems to have arisen around 1912, along with its ideological counterpart, “progressivism,” which was used as a political orientation in opposition to the Democratic, Republican, and Socialist parties.  The prominence of these terms was due to the third-way “Progressive” Party in the presidential campaign of 1912, but these terms did not become associated with a widespread reformist identification until later in the decade.[ii] 

Benjamin Parke DeWitt published a polemic called The Progressive Movement: A Non-partisan Comprehensive Discussion of Current Tendencies in American Politics in 1915.  He tried to explain the Progressive ideology and political platform in terms of a struggle between the oppressed “people” and the sinister political and economic “interests.”  By the time the so-called “Progressive movement” had largely come to an end after World War I, there was still no agreement on what exactly “Progressive” meant or what the movement was about.  In 1924 Nation journalist William Hard held a contest to see if his readers could define “Progressivism.”  No consensus emerged.[iii]  During that same year, longtime self-identified Progressive Robert “Fighting Bob” La Follette initiated a “new Progressive Party” (incorporating labor and socialists) and was able to win 16% of the vote (the second largest third-party percentage of the 20th century, next only to the first Progressive Party of Roosevelt, which garnered over 4 million popular votes and 88 electoral votes).  The year 1932 brought an obituary for Progressivism in John Chamberlain’s Farewell to Reform: The Rise, Life and Decay of the Progressive Mind in America. 

During the 1950s and 60s the term “Progressivism” stood as the catch-all concept of historians and political philosophers, which was used to define a broad age of liberal reform following agrarian uprisings (“Populism”) and prefacing the New Deal.  By the 1970s U.S. historians found the early 20th century social movement(s) ambiguous, inconsistent, paradoxical, contradictory, complex, and beyond the limited capacity of the term “Progressive.”[iv]  Some called for the dismissal and burial of the term.  But the idea survived, and by 2003 Oxford University Press published yet another volume on the “Progressive Movement.”  We will look at selected portraits from the last 50 years of historiography on the “Progressive Movement” to see how “Progressivism” has been defined in order to evaluate its usefulness as a concept for understanding U.S. reformist programs during the first decades of the 20th century. 

Richard Hofstadter was one of the first major historians of the “Progressive” period in U.S. history and also an early conceptualizer of “progressivism.”  He won the Pulitzer Prize in history for his treatment of the subject, The Age of Reform (1955).  In this work he sought a “broader” definition of the term “progressive” and located its essence within the “impulse toward criticism and change” which was emblematic of middle-class programs for social and economic reform around the turn of the 20th century.  He was careful to point out that both the larger term “Progressive” and the more specific “Progressive Movement” were “rather vague and not altogether cohesive or consistent” conceptions.  He focused on the “ideas” of this vague and inconsistent movement, which was based on the notion of “self-reformation.”[v]

Hofstadter described the United States’ economic, legal, and political system of the 19th century as “reliably conservative.”  He also noted that reactions against this conservative system of government during the 19th century were “popular,” “democratic,” and “progressive.”  Hofstadter labeled the period from 1890 to 1940 as an “age of reform,” whereby a “surge” of popular, democratic, and progressive reactions were sounded and corresponding social movements set forth.  Hofstadter set the progressive period between two other periods of reform in U.S. history: 1) an earlier period of agrarian uprising, especially the “populist” movement, which had its origins in Jacksonian politics and reached its peak in the 1890s; 2) the progressive period, which properly congealed by 1900; and 3) the later initiative called the New Deal, originating in the 1930s, which was less programmatic, more pragmatic, and more Federally centered than previous reform periods.  Hofstadter suggested that this long string of reformism had stalled by the 1950s (he was writing his book in mid decade), partly due to the social and political institutionalization of reform, which quite literally internalized the progressive-liberal ethos into the U.S. system of government; thereby, argued Hofstadter, the progressive-liberal ethos as a political program became more conservative so as to preserve its central position within the socio-political arena.[vi]

Hofstadter invoked several definitions and conceptions of “progressivism” and “progressives,” but there were many common themes in his work.  The Progressive Ethos was a broad “impulse” of “criticism and change” that became the “whole tone” of socio-political ferment after 1900.  Its essence was an imprecise and nostalgic call for a “latter-day Protestant revival” that preached “self-reformation,” “economic individualism,” “political democracy,” “morality,” and “civic purity.”  It was also a reactionary push against concentrated economic power, inequality, and corruption, while at the same time progressivism was a narrow-minded attempt to counter industrial inefficiency, urban social disorder, and immigration. 

The Progressive actors were largely “genteel,” “proper,” and “respectable” middle-class reformers with an “enthusiasm” for social and economic change.  They had humanitarian “vision” and “courage,” but they were not radicals and they preferred talk of “moral values” instead of initiating material improvement.  Hofstadter also claimed that progressives were a group of “responsible” WASP “elites” who embarked on a “status revolution”[vii] to regain “deference and power,” which had been threatened by corporate capitalism, labor organizations, and ethnic political machines. 

Hofstadter characterized the Progressive Movement as “rather vague,” “not altogether cohesive or consistent,” “mild and judicious,” “moderate,” “safe,” and “constructive.”  This movement sought a “widespread” effort including “the greater part of society” for a “moderate” and “constructive” change in the social and political system. The movement seemed to prefer “exposure,” “information,” and “exhortation” to programmatic action and more equitable restructuring.  Hofstadter noted the “radical” tenor of progressive criticisms, but he pointed out a “disparity between the boldness of their means and the tameness of their ends.”  He criticized the Progressive Movement as a “moral crusade” under the spell of an “evangelistic psychology” that often devolved into a “retrograde,” “delusive,” “comic,” and sometimes “vicious” bit of political parody.  Hofstadter made it clear that there was much about the Progressive Movement that could be considered illiberal and even unprogressive by its own standards.[viii]

Another major historian of the Progressive period is Robert H. Wiebe, whose The Search for Order, 1877 – 1920 (1967) has been widely cited in the literature on the subject.  Wiebe did not use the “Progressive” periodization and he did not refer to Progressives or Progressivism in his book, although it was mentioned in the “Introduction” by David Donald.  The only time Wiebe used the term “Progressive” was in relation to the 3rd party during the 1912 presidential election, the “Progressive Party.”  Wiebe’s book focused instead on what he termed “the new middle class.”  This group of people congealed into what could be called a “class” by the late 19th century, and the “class” that Wiebe examined was conceptually similar to the “Progressives” that Hofstadter described.  This new middle class was composed of educated and cultured professionals and specialists who were “clustered” in urban areas in the United States by the turn of the 20th century.  These educated professionals had an optimistic “faith” in scientific and bureaucratic rationality and they tended to use this discursive method to focus on the country’s “evils” with an “earnest desire to remake the world upon their private models.”  The primary goal of this new middle class was a desire for order, unity, efficiency, and cohesion in society, politics, industry, and urban development, both nationally and internationally; in short, they wanted a national – if not global – “frictionless bureaucracy.”  When order could not be achieved rationally, these professionals often resorted to “traditional techniques” to establish order, like force or exclusion: The new middle class would “draw a line around the good society and dismiss the remainder…separate the legitimate from the illegitimate.”  This new middle class used their scientific rationality to facilitate a new technocratic and managerial framework with which to gain power so as to “reorder” society, industry, and state according to what they considered universal, scientific principles of natural law.[ix]

In 1968 James Weinstein wrote an important and widely cited book on the influence of corporate capitalism on Progressive reform, The Corporate Ideal in the Liberal State: 1900-1918.[x]  Weinstein demonstrated a “conscious and successful effort to guide and control the economic and social policies of federal, state, and municipal governments by various business groupings in their own long-range interest as they perceived it.”  Liberalism changed from its 19th century roots of individualism and laissez faire to an early 20th century “new liberalism” of corporate social responsibility and the rationalized expansion of the regulatory, “liberal” state.  Many business leaders in the late 19th and early 20th century made a conscious decision to use liberal reform “as a means of securing the existing social [and economic] order.”  Liberal reforms were meant to incorporate various socialist and labor initiatives, while delegitimizing socialist and labor movements, and liberal reforms sought to stabilize, rationalize, and expand the apparatus of the state as a business-friendly method of market regulation, which corporate interests could oversee or control.  A member of the National Civic Federation and a utilities magnate, Samuel Insull, argued in 1909 that corporate leaders should “help shape the right kind of regulation” before “the wrong kind [was] forced upon him.”  At the Conference of Republicans of the State of New York in 1913, Elihu Root, also a member of the NCF, argued that the Republicans needed to “meet industrial and social demands of modern civilization, so far as they are reasonably consistent with our institutions.”  Paraphrasing Theodore Roosevelt, Weinstein argued that by the 1920s many corporation leaders began to see that “social reform was truly conservative.”  The rhetoric, legislation, oversight, and enforcement of worker collectives, trust regulation, workers compensation, reduction of the work day and work week, and wage increases could all be managed by corporate interests so as to safeguard the long term profits of corporate and monopoly capitalism from the more radical agitation of socialists and labor unions.[xi]  And as long as corporate leaders were willing to keep up a rhetorical front of corporate responsibility and regulation, then political leaders like Teddy Roosevelt, Wilson, Taft, and even Franklin Roosevelt were willing to conflate (using the rhetoric of “hearty cooperation”) national with corporate and even monopoly interests.  Even when truly concerned reformers like Frank P. Walsh tried to outline progressive industrial reforms, “the proposals were made mostly by men whose conscious purpose was to help the working man, while stabilizing and strengthening the corporate system,” which led to the “rise of a new corporate oligarchy.”[xii] 

By 1970 “Progressivism” was being reexamined by historians.  In “An Obituary for ‘The Progressive Movement,’” Peter Filene called the whole conception of Progressivism and the Progressive Movement into question.  He argued that what had been commonly called “The Progressive Movement” never actually happened.  He said that there was never a monolithic and unified movement working towards a clear, let alone agreed upon, social and political program.  The notion of a unified movement, Filene argued, was a “mirage.”  The concept of a “Progressive Movement” was a “dead end” because the data on reformers during the period from 1890 to 1930 “stubbornly spill[s] over the edges” of the concept of “Progressivism:” “The more historians learn, the farther they move from consensus.”  Filene argued that just because “many Americans in the early 20th century were ‘reformers’” does not mean that “these Americans joined together in a ‘reform movement.’”  Filene argued, “The evidence points away from convenient synthesis and toward multiplicity” – social reform in the U.S. at the turn of the 20th century was “ambiguous, inconsistent, [and] moved by agents and forces more complex than a progressive movement.” 

And further, Filene argued, if there was a “progressive” ideology that united some reformers, it was “at best” “heterogeneous” and “lacked unanimity of purpose either on a programmatic or on a philosophic level.”  Filene even cited Michael Rogin’s 1967 work The Intellectuals and McCarthy: The Radical Specter, whose research questioned whether the Progressive Party could even be considered “progressive” based upon its diverse membership and contradictory platforms.  Filene ended his article by focusing on the “diversity” of reformers during the period and the conflict and consensus between these diverse groups.  He argued for a conception of “shifting coalitions around different issues” by which diverse reformers and reform groups practiced “political factionalism” and “ideological improvisation” in broad and contradictory efforts at reforming U.S. society, culture, and government.[xiii]

In response to Filene’s charge, three respected and widely published scholars in the area of early 20th century U.S. history published Progressivism (1977).  In this book John C. Burnham, John D. Buenker, and Robert M. Crunden each drafted a statement and a rejoinder to discuss the usefulness and accuracy of “Progressivism” as a tool for understanding early 20th century political and social reform in the U.S.[xiv]

In the first essay John C. Burnham argued that Filene’s “obituary” was “premature” because Filene, along with other scholars, had focused too much on particular aspects of the diverse political and local history of the period, which “ended up refining progressivism out of existence.”  Burnham argued that “Progressivism” needed to be re-evaluated and he suggested two new ways to conceptualize the term: 1) the “coalescing” of a number of reformist streams that “reinforced” each other, “cumulating” into “what contemporaries recognized as progressivism;” and 2) specific socio-political “changes” that actually occurred around the turn of the 20th century.[xv] 

Burnham invoked Clyde Griffen’s concept of a “progressive ethos,” which was defined as “an idealism marked by the ‘juxtaposition of a practical piece-meal approach to reform with a religious or quasi-religious vision of democracy.’”[xvi]  Burnham argued that this “progressive ethos,” an optimistic and scientific “moral fervor” to change the world, sparked a “progressive movement” around 1907-08 when journalistic criticism gave way to direct action and, thereby, inspired a “confluence of specific reform streams.”  These reform streams were primarily based within non-governmental voluntary organizations because progressives were “ambivalent” if not “mistrustful” of government action.[xvii]  Burnham argued that while “concrete achievements” outside of formal organizational efforts (membership lists, meetings, organizational literature) are “hard to demonstrate,” the membership numbers and sheer diversity of organizations were testament to the “awesome demonstration of the power of determined private citizens.”  Progressivism was also a “practical evangelism” based on professionalism, efficiency, expertise, and science, which led to an “ideal of unselfish service and efficiency,” which in turn manifested itself in programs providing care, service, and protection.  These aid programs, carried out primarily by voluntary organizations, sought to reform behavior and change people – socially, politically, culturally, morally, hygienically, and linguistically.  Often reform organizations used education and persuasion to bring about this change, but coercion was not out of the question, especially when progressives believed reform to be in the best interests of the recipient.

Robert M. Crunden’s “Essay”[xviii] drew on the work of Erik Erikson[xix] and argued that “progressivism” was a “frame of mind” or “frame of reference” composed of basic “moral and emotional attitudes” that many of the “leaders” of the reform period shared.  Crunden believed that Progressivism was not “specifically political or social, but rather cultural,” to which he added, “progressivism was essentially religious” – a “form of displaced Protestantism:” Progressivism was the “spirit” and the “motivation” that inspired reformers.  Crunden defined a Progressive as “a person of strongly religious upbringing who displaced the moral concerns of his youth onto the very real social, industrial, political and aesthetic problems of his maturity, and who attempted to solve these public and personal problems within a Protestant, moral frame of reference.”  Crunden held up Jane Addams and John Dewey as “psychological paradigms of the progressive experience.”  Crunden also quoted Frederic C. Howe, a self-described reformer, who earlier wrote about the Progressive’s “evangelistic psychology:”

“I was conformed to my generation and made to share its moral standards and ideals…early assumptions as to virtue and vice, goodness and evil remained in my mind long after I had tried to discard them.  This is, I think, the most characteristic influence of my generation.  It explains the nature of our reforms, the regulatory legislation in morals and economics, our belief in men rather than in institutions and our messages to other peoples…all a part of that evangelistic psychology that makes America what she is.”[xx]

While Crunden argued that Progressives were primarily motivated by religious and psychological concerns, he did not discount or deny that other factors, like economic or political motivations, also played a part.  He argued, however, that many historians mistook economic and political motivations for the whole story.  For Crunden, the Progressive Movement could best be understood in relation to the “psychological needs of the reformer.”

Crunden’s essay in Progressivism was expanded several years later into a book, Ministers of Reform: The Progressives’ Achievement in American Civilization, 1889 – 1920 (1982).  In this work Crunden again argued that Progressivism was an “ethos,” a “dominant national mood,” and a “system of values,” which grew out of the individual psychological needs of a culturally transitioning and professionalizing middle class.  He argued that Progressives shared no single political or social platform, nor were they members of a single reform movement.  Progressives shared “moral values” and a commitment to the “spiritual reformation” of American democracy, and while the Progressive ethos often seemed “amorphous, inchoate, and difficult to define,” it was bounded by a Protestant and democratic discourse and infused by a moral fervor to reform all facets of U.S. society.  Crunden denied that there was a “progressive era,” and instead focused on three generations of U.S. reformism: liberal precursors of Progressivism [reformers born before 1854], 1st generation Progressives [reformers born between 1854 and 1874], and 2nd generation Progressives [reformers born between 1874 and 1894].  Crunden’s book offered several historical character sketches of individual Progressives, like Jane Addams, John Dewey, George Herbert Mead, and George Herron, in order to describe how a “progressive ethos” infused these individuals’ specific reformist impulses.[xxi]

John D. Buenker’s “Essay” in Progressivism[xxii] marked a growing divergence on the subject.  He stood in agreement with Filene’s “shifting coalitions” theory and against the “ethos” theory of scholars like Burnham and Crunden.  Buenker argued that since “Progressivism” had been defined in so many ways it had lost clear meaning except in relation to a specific political party; thus, he claimed, “as a description of either an ideology or a political program, I find it worthless and misleading.”  Defining Progressivism as a “common set of values,” Buenker contended, was disingenuous because the term either gets defined too broadly (so that just about every middle-class person at the turn of the century could be described as “Progressive”) or too narrowly (so that it becomes “ambiguous” and “contradictory” in relation to specific individuals). 

Buenker argued that there were many Progressive populations and programs and each had a different set of values.  Thus he believed that Filene’s “shifting coalitions” conception seemed the most appropriate theory with which to describe the various early 20th century reform movement(s).  Buenker argued that the idea of shifting coalitions was a more “comprehensive explanation” because it can take into account diverse reform movements composed of diverse people with diverse motives who may have on certain occasions accommodated or cooperated on specific reform issues: “the politics of compromise, conciliation, and coalition,” Buenker noted, “have been the hallmark of the American system from the beginning.”  A focus on shifting coalitions put primary emphasis on the political arena as the plane where compromise, conciliation, and coalition took place.[xxiii]  But he also noted that individual reformers had complex identities and conflicting social relationships, which in turn further fractured any coherent notion of personal “ethos” that a historian might construct.  Buenker demanded a complex reckoning of the specific social, cultural and political relationships and identities of individual reformers both prior to and during public reform debates and policy coalitions.[xxiv]

Daniel T. Rodgers offered a look at the concept of “progressivism” in 1982.[xxv]  He noted that the term had gone from “one of the central organizing principles of American history” to a “corpse that would not lie down.”  The debate over the meaning of progressivism was “acute and troubling.”  He described the literature on the subject after 1970 as moving away from the ethos of Progressivism and actors in a Progressive movement to its “context” – the “structures of politics, power, and ideas within which the era’s welter of tongues and efforts and ‘reforms’ took place.”  The “fundamental fact” researchers of the 1970s focused on was the “explosion of scores of aggressive, politically active pressure groups” in an era of “shifting, ideologically fluid, issue-focused coalitions, all competing for the reshaping of American society,” of which the Progressives were only one group.  In fact, Rodgers argued, the reformers called the “Progressives” were really many distinct individuals and associations that “shared no common party or organization” and had “deep disagreements,” but who from time to time shared ideas and rhetorical strategies.[xxvi]  Progressive politics, like other forms of politics in the era, were “coalition politics, prone to internal fissures.”  And this was perhaps one of the distinctive features of the era, the “rise of modern, weak-party, issue-focused politics.”  The other distinctive feature was the “revolution” in “social organization”: “the eclipse of the local, informal group” and its “replacement by vastly bigger, bureaucratically structured formal organizations,” most importantly the business corporation and the regulatory state.  Rodgers also reviewed the literature of New Left historians like Gabriel Kolko and James Weinstein, whose research described the “new corporate phase of capitalism,” which allowed the corporation to become the “dominant” economic force of the 20th century.

In 1983 Arthur S. Link and Richard L. McCormick published a short but detailed historiographical summary of the literature on Progressivism up to 1980.  Link and McCormick organized the previous scholarly literature into six schools of analysis:

1) a conflict between “ordinary” and wealthy Americans

2) the continuation of a long tradition of agrarian protest

3) an urban, WASP, professional, middle-class movement trying to organize society, thereby, remedying industrialization, urbanization, and immigration

4) an urban, WASP, professional, middle-class movement on an intolerant moral crusade to remake America inspired mostly by their own personal problems

5) reformers from the “wealthiest” groups of society out to address social ills

6)  diverse reform groups with divergent missions who often formed “shifting coalitions” to address and combat particular issues

For all six schools, Link and McCormick warned, historians have not often separated “purposes, rationale, and results” in their research.  These are three very different yet mutually informative categories of analysis.  The authors pointed out how many historical studies of the period have exaggerated a single category of analysis to the exclusion of others. 

Despite all the diversity on the subject, Link and McCormick did offer their own summation of “Progressivism.”  They noted there was “no unified movement,” but many “diverse” and “convulsive reform movements” with many diverse and contradictory goals that swept through the U.S. between the 1890s and 1917.  These reform movements were typically led by “crusading” middle- and upper-class, native-born, professional Americans who sought in one way or another to address and ameliorate specific social ills, especially those social problems resulting from urbanization and industrialization.  The typical Progressive reform pattern began with investigation of a problem, which led to organizing a response, which in turn led to educating the citizenry, and often ended with the pinnacle of Progressive reform – legislation: Reformers “assumed that passing a law was equivalent to solving a problem and that government officials could be entrusted to enforce the measure in a progressive spirit.”  And while different reform movements and leaders articulated distinctive discourses of social justice, they were usually “simplistic, traditional, moralistic” and programmatically warranted some kind of narrowly defined social control.  Specifically, the authors pointed out an often neglected aspect of Progressivism: “coercive” Progressives.

Coercive Progressive programs sought to impose social control, and they took various forms, like the Jim Crow movement in the white South, Americanization programs, and moral reforms such as temperance and prohibition.  While Link and McCormick made many references to the problematic usage of “Progressive,” they argued in passing, “it might be better to avoid the terms progressive and progressivism altogether, but they are too deeply embedded in the language of contemporaries and historians to be ignored.”[xxvii]

In 1987 Nell Irvin Painter published her award-winning treatment of the Progressive Era, Standing at Armageddon: The United States, 1877 – 1919.[xxviii]  She analyzed the politics of the era via a “hybrid political-labor history” framework.  This schema allowed her to focus on the “conflict between various groups, classes, and competing ideals,” which morphed into a pitched battle between “partisans of democracy” and “protectors of hierarchy” – “the struggle over the distribution of wealth and power.”  This great political conflict and struggle caused enormous amounts of “fear…plain, stark fear” in the hearts and minds of the middle- and upper-class.  Painter argued that this fear “lay at the core” of many “progressive reforms.”  She documented the large gap between the very rich (0.01%), the rich (11%), and the rest of the country (88%).  The great extremes of wealth caused by capitalism and industrialization became a point of concern for the great majority who owned less than 15% of the national wealth. 

Painter stressed that while income does provide the “single clearest indicator of class standing,” the notion of class needed to be seen as a complex, “fluid” and ever changing classification, whereby there was no single “middle class,” but rather “middle classes” (and also “many ethnicities and races”).  Those elite classes with the most at stake, and thereby the most influence, liked to put forth ideological arguments for the “identity of interest.”  This belief conceptualized society as a smoothly functioning organism wherein the interests of the great capitalists and property owners were supposedly the best interests of all in society and in harmony with “laws of God or Science.”  Reformers acting as “democratizers” put forth a counter-conception of society.  Seeing their own middle-class or working-class interests at odds with those of capitalists and industrialists, democratizers saw society torn by a “conflict of interests.”  Reformers often, but not always, tried to point out the interests of the “disadvantaged” within the social system and thereby argue for “the ideal of equity” and democracy, in order to confront the dangerous extremes of wealth and privilege.  But lurking at the periphery of all calls for reform was the specter of working-class unrest, which from time to time would boil into a froth and cause conflicts of interest to turn into real (and often violent) social and political struggles for power.  The so-called “Progressive Era” was marked by a widespread call for reform and social change; however, as Painter pointed out, “the broadening consensus that change was necessary did not include agreement on the direction or extent of these changes.”[xxix] 

Perhaps the most powerful voice of reform came from educated and elite men who wanted a more “clean, efficient government” operated by a rationalized bureaucratic machinery and run by an advanced cadre of elite professionals.  Painter argued that reform initiatives during the period were often very “ambiguous” and rarely a “straightforward story of altruism” because “nativism, racism, and sexism” pervaded both the reformist impulses and the reformist programs of these educated elites.  By the early 20th century many middle-class and agrarian reformers, including the so-called Progressives (like the Progressive Party’s presidential candidate, Teddy Roosevelt), saw the United States as standing on the threshold of “Armageddon” with the evils of plutocratic industrial power on one side and the evils of the violent mob on the other.  Progressives under the banners of “New Nationalism” or “New Freedom” called for the regulation of society and the economy by an empowered and enlightened federal government, which would act as a disinterested arbitrator between conflicting political factions, like labor and capital (of course more radical voices pointed out the impossibility of a disinterested federal government, as federal policy was often in the hands of industrial capitalists and their appointed voices in the Congress).  Teddy Roosevelt succinctly summarized the ideals of these Progressive reformers: “the object of the government is the welfare of the people.  The material progress and prosperity of a nation are desirable chiefly so far as they lead to the moral and material welfare of all good citizens.”[xxx]

The most frightening voice of reform came from the laboring classes and political radicals who often spoke in the name of working-class interests.  Often dispossessed and exploited, lacking any real propertied interest in the social order, workers expressed their frustration through “strikes, boycotts, and cooperative enterprises” in order to pool their collective strength as a means to gain bargaining leverage with their industrial masters.  It was not until the late 19th century that workers and radicals, especially socialists, began turning to the political process and electoral politics as a way of “influencing” the U.S. economy and factory workplaces.  Labor and Populist leaders began to see that “they would have to take a hand in shaping the laws that governed them,” which meant lobbying the state and federal governments “to seize the powers to regulate” the industrial economy on the “behalf” of working-class interests.  Women and ethnic minorities also tried to use the political process in order to highlight their marginal status and seek redress through political rights, but these efforts were largely unsuccessful during the “Progressive Era,” with the exception of white women, who were able to gain suffrage by the end of WWI.[xxxi]

Painter also talked at length about race and racism in the U.S.  She discussed the racialized U.S. foreign policy and imperialist interventionist projects of the period.  In a division formally mapped out at the Berlin Conference of 1885, white Europeans – joined by the turn of the century by Japan – had divided much of the world among themselves.  By 1900 Europeans ruled over 1/5 of the world’s land and 1/10 of the world’s human population.  Each dominant European nation assigned itself “national spheres of interest,” over which each nation, through soft and hard exercises of colonial mastery, exploited favorable terms of trade and natural resource extraction.  The U.S., via a revitalized Monroe Doctrine, asserted control over the Americas and the Caribbean with expansive moves across the Pacific and into China.  Painter argued that, with some envy for the preeminent stature of Great Britain, an “Anglo-American identity of interest” coupled with an “Anglo-Saxon chauvinism” congealed in the later 19th century as the English-speaking countries united under the racialized banner of “the natural superiority of Anglo-Saxons.”  After the conquest of the Philippines, President McKinley wrapped U.S. foreign policy in this doctrine of “the white man’s burden.”  He stated that the Filipinos could not be left to themselves because “they were unfit for self-government” and, thus, the Americans had a duty “to take them all, and to educate the Filipinos, and uplift and civilize and Christianize them, and by God’s grace do the very best we could by them.”  Indiana Senator Albert J. Beveridge believed that the “American Republic” was destined, through the will of God and the dictates of the “highest law” of “race,” to be “the most masterful race in history.”  Painter explained: “Imperialism was elemental, racial, predestined, for God had prepared the English-speaking people, master organizers, for governing what Beveridge called ‘savage and senile people.’”  Even anti-imperialists, who argued against the trappings of empire for many reasons, often framed their critiques of foreign intervention with the same racist assumptions, and focused more on the implications of empire for poor whites in America.  Many Southerners actually felt vindicated by imperial policies, although skeptical about ruling over more non-white people.  Benjamin Tillman argued to his fellows in Congress that “We of the South” had already “borne this white man’s burden of a colored race in our midst.”  In 1883 the Supreme Court had already invalidated the Civil Rights Act of 1875, and by the 1890s there was widespread acceptance of Jim Crow segregation and disenfranchisement laws.  The color line became an increasingly important national preoccupation by the early 20th century as the U.S. became defined more and more as a white man’s nation.  Thus self-proclaimed “progressives” never touched the white supremacy of the South and the de facto “racial hierarchy” of the country as a whole.[xxxii]

In another article, published in 1988, John D. Buenker argued for the existence of “two full-blown political cultures,” which influenced and defined the socio-political and cultural identifications of Americans at the turn of the 20th century.[xxxiii]  Despite the “complexity and diversity of motives, goals, methods, and results” of socio-political and cultural struggle during this period, Buenker argued for two distinctive and primary “competing political cultures.”  These two political cultures were especially important in defining the relationship between the individual and society, and they set up distinctive battle lines within the “arena of structural reform”: 1) the “new politics” of a “modernizing” ideology of “atomistic aggregation of sovereign individuals,” which was associated with the “reformer-individualist-Anglo-Saxon complex,” and 2) the “old politics” of an “ethnic identification” of “organic networks,” which was aligned with the “boss-immigrant-machine complex.”  Buenker argued that these two world views shaped the context out of which individuals defined their socio-political-cultural identities and allegiances, but they should not be seen as some oversimplified dualism: “The choice made by individuals was not a dichotomous one between the sovereign individualist or organic network world views.  Rather, the two views functioned as antipodes on a continuum or as the rows and columns of a matrix on which each person found his or her own identity out of a bewildering variety of permutations that changed over the life cycle.”  The Progressive coalition would have been associated with the “new politics,” and part of their mission, under the terms Buenker introduced in this essay, was to confront and defeat the “old politics” for control over the socio-political-cultural reform that would govern the new century.

In 1991 Alan Dawley published a major work on the broad period of reform infusing the early 20th century: Struggles for Justice: Social Responsibility and the Liberal State.  The central question of his book (and of the broad period of reform under study) was not about Progressivism but about “how could the existing form of the state, designed generations earlier for an agrarian-commercial society, withstand the brawling conflicts and relentless evolution of an urban-industrial way of life?”  Dawley argued that the “crux of American history” around the turn of the 20th century was the “reckoning between a dynamic society and the existing liberal state.”  Progressivism was only a small, but important, part of this much larger and very global issue. 

His book broke down the reckoning of state and society into three stages.  The first stage was “imbalance,” which he located between the 1890s and 1913.  During this period U.S. society was “on a collision course” with its political system based on laissez-faire liberalism and the inequality it bred.  Liberty and political right were “reserved” for wealthy, white men, and as other groups struggled for socio-political inclusion, the “polarities of class and culture intensified” and “struggles broke out” across the nation.  Many reform initiatives reacted to this conflict so as to resolve it, but different reformers often fostered conflicting visions, which only furthered the melee.  And behind it all, Dawley argued, was a “contradiction between the needs of society and the existing political system.”  The next stage, from 1914 to 1924, was a time of “confronting issues” by the state, resulting in an increase in state intervention and regulation, whereby the “governing system” of “state embedded in society” began to change in dialectical relation to social struggles.  The last stage, from 1925 to 1938, marked a “resolution” of state intervention to “restore balance” to the governing system.  The New Deal was the primary institutional impetus of this resolution, but Dawley was quite clear in arguing that this new policy program focused on “neither liberty nor equality, but security.”

At the center of Dawley’s book was the “problem of hegemony”: “how was society held together (consensus) against its own inner contradictions (conflict)?”  One of the central arguments he made towards explaining the successful change within the governing system was the power and strategy of elites “to regain their legitimacy by reforming the system.”  He linked progressivism to “managerial liberalism” and “social liberalism” as viable forms of state interventionism that could accommodate reformist demands for social change while legitimizing elite management of the social and political transformations.  As a solidly liberal and yet quasi-socialist ideology, Progressivism was able to “contain” socialism, and thus middle-class and elite interests were able to steer reformist initiatives in more conservative and capitalist directions that did not significantly challenge the institutional structure of liberal society.  Dawley argued that it was “inevitable” that “state structures and ruling values would change” – “the only questions were how, and in whose interests?”  In terms of early 20th century reform initiatives and state interventionism, Dawley wrote, “Americans were dragged kicking and screaming toward social responsibility.”[xxxiv]  Thus Progressivism, in Dawley’s conception, was a response that challenged the excesses and instability of the elite-managed liberal state while containing the more threatening challenges and disorder of lower-class unrest.

John Whiteclay Chambers II published The Tyranny of Change: America in the Progressive Era, 1890 – 1920 in 1992.  He noted that many historians have written about the “Progressive Era,” but they have not been clear about “the nature of either progressivism or the era.”[xxxv]  Despite this confusion, he argued the concepts of a “progressive impulse or ethos” and a “Progressive Era” continued to be “relevant.”  He noted that while Progressivism was not a united movement, it was still “the most pervasive political reform effort since the pre-Civil War period.”  He called Progressivism a “controversial and complex,” “multifaceted,” “moderate” reform movement that “affected nearly every aspect of American life.”  He also acknowledged the shifting coalitions theory by describing how a “hodgepodge of coalitions” often “contradicted each other” while working for “diverse” social change.  However, while he denied Progressives a “common creed or a system of values,” he also described what he believed to be some common “clusters of ideas” and “social languages,” like democratic ideals, rhetorical appeals to move people, and a “politics of opposition.”  Chambers argued that Progressives were not often original thinkers, but they were powerful “users” of ideas in the effort to initiate social change. 

Chambers devoted a whole chapter to the “Progressive Impulse” in which he defined “Progressivism” as a “nationwide movement” composed of a “number of major efforts to reform society through the power of private groups and public agencies.”  Leaders and participants of some of these reform efforts called themselves “Progressives,” and hence the label often given for the whole period, but there were many radical and conservative reformers as well.  Chambers noted, Progressives “battled conservatives, radicals, other reformers, and often each other.”  Acknowledging the multiplicity of ideological reform groups was a marked change in direction as most historians up to this point had tended to focus mostly on those particular individuals and groups who claimed the “Progressive” mantle.   

Chambers noted that recent scholarship in the 1990s had emphasized the socio-political contexts of reform (“the environment of politics, power, ideas, and values”), and also the role of the state, specifically the relationships between different “political structures” and particular social groups.  Perhaps the most notable new direction in the historiography of the period had been Chambers’ use of the term “the new interventionists” to describe the whole, broad reform movement of the period, which included Progressives, but also included the many other ideological reform groups of the period.  The new interventionists used voluntary associations and sometimes the state to challenge 19th century laissez-faire individualism and free-market capitalism, and this challenge came from many quarters: Progressives, moderates, conservatives, traditionalists, and radical activists like socialists, communists, and anarchists.  The new interventionists, Chambers claimed, left a “divided legacy.”  They seemed to have been more successful “at arousing indignation and protest than at maintaining effective government and substantially ameliorating urban problems.”  They also over-relied on strong leadership and monolithic reform visions that often led to “the tyranny of change,” whereby the general public supported or elected strong leaders but had very little impact on public planning or policy.[xxxvi]

William Deverell argued that by 1994 the concepts of “progressivism” and “progressive” carried “diverse and heavy burdens of meaning,” which made many scholars believe that these terms had “outlived their usefulness as meaningful expressions by which to explain” the past: these concepts had lost, in the words of Martin Sklar, their “interpretive precision.”[xxxvii]  But Deverell argued that scholars must not lose sight of the fact that “individuals, parties, and groups used the terms progressive and progressivism to define themselves, their work, and their outlook as the new century arrived.”  He stressed that there was an “historical context” within which these terms were “borrowed, taken, utilized, even invented,” and scholars and historians would do well to admit that these terms “once meant something” before they are jettisoned for more precise conceptualizations.  Deverell noted that while progressivism had become “an embattled word, an embattled concept,” real derivatives from the “progressive phenomenon” were still visible in the current socio-political climate and discourse: “Progressivism is alive and well four score years after its birth.”

Gary Gerstle’s “The Protean Character of American Liberalism” (1994) discussed the changing ethos of American liberalism from the turn of the twentieth century to the New Deal.[xxxviii]  Gerstle argued, it is “unwise to treat the liberal community as a stable political entity or to presume that the criteria for identifying liberals in one period can be applied to another.  Any effort to define the liberal community must be firmly located in time and space.”  Gerstle noted that the “liberal tradition” had three “foundational principles” (emancipation, rationality, and progress), but overall liberalism had a marked “malleability” that made for variant socio-political programs and ideologies. 

Classical liberalism revolved around free markets, limited statism, and bourgeois morality, which often defended corporate capitalism, segregation, and disenfranchisement.  By the end of the 19th century liberalism displayed a reformist edge, and it organized “rational interventions in society and culture,” often by turning “to the state as an institutional medium capable of reconstructing society and of educating citizens.”  Progressivism was a three-pronged liberal reaction to (a) socialism and labor radicalism, (b) the “extraordinary concentration of power and wealth,” and (c) a diverse influx of immigrating ethnic groups.  Progressives wanted to find ways to promote and protect “freedom of trade and individual liberty” by way of state regulation and welfare, and by way of “guild socialism.”  They also wanted to engage in “cultural reconstruction” because liberals believed in the importance of individual moral character as the foundation of civic virtue.  When dealing with foreigners this “reconstruction” took the form of “Americanization” in order to “culturally and morally transform…aliens into citizens.”  But Progressives were a diverse bunch (“left-leaning Progressives” ranging from socialists to left-leaning pluralists, and “rightward-leaning Progressives” from Americanizers to hard-core nationalists preaching “100 percent Americanism”), and because of these conflicts of purposes and methods they “had difficulty fashioning a cultural politics to which they could all adhere,” which eventually led to a loss of “coherence as a political movement.”

During World War I and the Red Scare, Progressives felt themselves and their ideological convictions to be “impotent in the face of a reactionary nationalism.”  Liberals largely gave up “the fight to create a new culture and new nationalism,” and began to ignore the “irrational” realm of culture to focus instead on the more rational and therefore changeable realm of economics and political economy.  This led to a widespread “exclusion of ethnicity and race” from liberal social scientific analysis, which led to a “more narrowly conceived” liberal program of economic recovery during the New Deal years.  It took the rise of Hitler and the Nazi party to bring back liberal discussions of “racial and ethnic discrimination.”  After World War II liberals once again “reconstituted” their political focus and began to define “issues of ethnicity and race as appropriate targets of rational social action,” while treating “class politics as an expression of irrationality” and therefore beyond the scope of liberal intervention.       

In 1997 Eric Foner edited a volume for the American Historical Association that offered a look at the “new” American history written over the previous 20 years.  Within this volume Richard L. McCormick talked about Progressivism and other reform impulses in “Public Life in Industrial America, 1877 – 1917.”[xxxix]  In this essay McCormick claimed that the central issue of this period was industrialization and modernization, and how individuals and groups addressed the unsettling consequences of these two developments.  There is no “coherent synthesis,” McCormick argued, for describing the “complex” social, political, and cultural reactions to industrialization and modernization.  There were “many organized endeavors” that produced many “unexpected results.”  But McCormick did argue for some common themes:

“Most people confronted variations on a common problem: the defense of their families and communities against outside forces emanating from industrial growth and the increasing heterogeneity of the population.  Americans faced that problem, moreover, within a common environment: a rapidly expanding economy that was causing massive dislocations, frequent depressions, and widespread unemployment.”

In response to this common problem and a common environment, “virtually every segment of society plunged into public life to advance (or defend) their private values.”  But many different segments of society acted in many different ways for many different reasons.[xl]  McCormick focused on some of the major segments of society that have been covered in the recent literature: business and financial interests, industrial workers, farmers, and middle-class women.  He described how they variously responded to the common problem of the era: looking to the government to promote economic growth; organizing and looking to the government to foster unionization and industrial reform; organizing political blocs and cooperative ventures; joining associations and lobbying for reform.  McCormick argued that the most notable phenomenon of the era was the organization of socio-political-cultural associations that addressed a wide range of social problems from a wide range of perspectives, and “increasingly offered not panaceas but full-blown agendas for social and political change.”  In a certain sense these radical, Populist, and Progressive groups failed to achieve much, as decades of historians have shown – not because they were necessarily naive or ineffective, but because “their enemies were more powerful” and because voting and policy change were seen as the only legitimate forms of success.  However, McCormick made clear that reformers of this period were successful in a much larger sense; they were able to create hundreds of organized, “non partisan” associations, which were able to drain “money, manpower, and organizational muscle” from political parties, and in turn “reshaped” the governing system throughout the century along “activist” and “interventionist” lines. 

The “seeds of Progressivism were planted,” McCormick argued, in response to two looming questions: whether social and political institutions were “adequate” enough to address and fix devastating times, and whether “democracy and economic equality were possible in an industrial society?”  The Progressives[xli] were not alone “in trying to use public, political means to solve problems,” but they might have been the most effective and successful group to do so.  The Progressive project consisted of four “distinctive methods”: organizing voluntary associations, investigating pertinent problems, finding the facts, and using social scientific analysis to offer a solution.  Progressives seemed to believe that experts using the scientific method could find the perfect solutions to all social problems and, further, they believed the solutions would benefit everyone as well as society as a whole.  But in reality, Progressives used the rhetoric of science and the common good to mask the imposition of their own values, especially in relation to the “racial and ethnic groups they hated and feared,” in their broad efforts to “improve and control the often frightening conditions of industrial life.”  

Another study of Progressivism was done in 2000 by a political scientist engaged in a longitudinal study of a much broader topic.  Robert D. Putnam’s Bowling Alone: The Collapse and Revival of American Community focused on the change in social capital and civic engagement in the United States over the course of the 20th century.  In the last section of his book, as a way to set up and inform his policy prescriptions, Putnam devoted a chapter to “Lessons of History: The Gilded Age and the Progressive Era.”  This chapter was indebted to many of the books reviewed in this paper.  In this chapter Putnam praised the “Progressive Era” (which he located from 1900 – 1915) as a good example of “practical civic enthusiasm,” but he also said that it was suffused with “exclusion” based on class, ethnicity, and race.  Progressives were a “practical” and “experimental” bunch of reformers who shifted programmatically between professionalism and grassroots democracy in their conviction that social, political, and economic institutions needed to be better adapted to the modern industrial world – although Putnam made it clear that Progressives seemed to prefer “technocratic elitism” and “expert solutions.”  The main engine of reform was the voluntary association (social, political, religious, and cultural), which was the main focus of Putnam’s study.  Putnam argued that the period from 1870 to 1920 displayed a “civic inventiveness” in terms of the founding, range, and durability of associational organizations, which was and still is unparalleled in U.S. history: “to a remarkable extent American civil society at the close of the twentieth century still rested on organizational foundations laid at the beginning of the century.”  Putnam called Progressivism a “broad and variegated” “social movement” that may not have been much of a social movement in the conventional sense; however, it represented a “civic communitarian reaction to the ideological individualism of the Gilded Age,” and the primary form this reaction took was the creation of voluntary associations and socio-political institutions, which greatly increased the aggregate measure of social capital and civic engagement.  It was this creation of social capital and civic engagement that marks the Progressive movement as a seminal event in the history of the U.S., and it had an impact many decades after the Progressives as a “movement” faded from the stage.  But Putnam ended his chapter with a warning: “social capital is inevitably easier to foster within a homogeneous community.”  The Progressives’ broad expansion of social capital was fostered by systematic socio-political exclusion based on class, ethnicity, and race.  Putnam praised the Progressive Era for its inventiveness, enthusiasm, and idealism, but warned that its particular reforms “are no longer appropriate for our time” – “Our challenge now is to reinvent the twenty-first-century equivalent.”[xlii]

The last and most recent study to be examined is Michael McGerr’s A Fierce Discontent: The Rise and Fall of the Progressive Movement in America, 1870 – 1920.[xliii]  This impressively comprehensive book looked at Progressivism in relation to a broad swath of social, political, and cultural responses to industrialization and modernity.  Industrialization “fractured old ideologies,” wrote McGerr, and “created new ones, including progressivism.”[xliv]  Progressives articulated, in the words of one of their figureheads, Theodore Roosevelt, a “fierce discontent,” and they believed both in social progress and in the moral regeneration of their nation.  Progressivism was the “creed of a crusading middle class” that offered the “promise of utopianism” in the wake of industrial inefficiency, urban chaos, political degeneracy, and cultural confusion.  Progressivism, McGerr claimed, was a “radical movement” – what he called “the radical center” – that sought not only to “use the state to regulate the economy,” but also to “transform” “other social classes,” other Americans, into a new socio-cultural body politic.  It was this demand for “social transformation,” McGerr claimed, that “remains at once profoundly impressive and profoundly disturbing a century later.”[xlv]

McGerr also acknowledged that Progressivism contained many “ambiguities and contradictions,” but its various “fault lines” never “split wide open,” partly due to the fact that the Progressive middle class was “overwhelmingly white and Protestant” and, for the most part (despite the fissures of class and gender), culturally homogeneous.  This raises a central question about which the literature on Progressivism and early 20th century reform was largely silent until the 1970s, but which many historians since then, including McGerr, have exposed in detail.

There is a distinct and disturbing relationship between what Nancy MacLean has termed “reactionary populism” and what we have labeled “Progressivism.”  MacLean’s book on the Ku Klux Klan described her subjects not as the backwoods yokels they are often mistaken for, but as an organized movement of white, evangelical Protestant, and mostly petit-bourgeois members (though the movement also included working-class laborers and middle-class professionals) who felt threatened by the developments of modernity, and who thereby fomented a reactionary form of populism.  The rise of divorce, feminism, black radicalism, white racial liberalism, labor unionization and strikes, monopoly capitalism, and increased immigration are just some of the major issues initiating their conservative reaction.[xlvi]  MacLean’s Klan members were going through their own status revolution, whereby the typical Klansman was economically better off than most blacks and many whites and often upwardly mobile, but still felt “vulnerable,” “unstable,” and insecure.[xlvii] 

Klansmen were conservative, populist, Jacksonian democrats with an explicitly racialized and Protestant conception of White Anglo-Saxon citizenship consecrating white supremacy.  They reacted to modernity and industrialization (to the extent that industrialization touched the South) in systematically similar ways to the Progressive programs: both groups formed organized associations; they rhetorically denounced “threats” to their idealized social order; they formulated an ideology to defend an embattled cultural identity; they took action to “reform” or remedy what they considered to be negative socio-political and cultural developments; and they used coercion when rhetorical appeals were not effective.  The two main differences between Progressives and reactionary populists were that the Klansmen had an intense distrust of centralized government and statist regulatory authority, and they had a willingness to use violent force[xlviii] as a standard socio-political tactic.

Another similarity between Progressives and Klansmen was a hierarchical, Social Darwinist belief in the racial and cultural superiority of “white” “civilization,” which was often equated with Americanism.[xlix]  C. Vann Woodward pointed out in 1954 that many Americans, including Progressive reformers (living in all areas of the nation, the North, West and South), shared many of the Klansmen’s beliefs about a “White” America: “a republic is possible only to men of homogenous race;” the United States of America was “a white man’s nation” based on a “white man’s religion:” “to stand as impregnable as a tower against every encroachment upon the white man’s liberty, the white man’s institutions, the white man’s ideals, in the white man’s country, under the white man’s flag.”[l]  It is no accident of historical fortune that the “Progressive Era” was also the “great age of segregation” in the United States.[li]  The Progressives for the most part harbored deep suspicions and prejudices against many groups and social classes that seemed alien to their WASP middle-class way of life.  Progressive reformers set up hierarchically ordered binary oppositions of identity based on class, race, gender, religion, and age.  The “fundamental paradox of progressive politics,” wrote McGerr, was that Progressives spoke the language of democracy, but in thought and deed they were “not very democratic at all”: the “progressives’ condescension toward other groups” created “a narrow definition of ‘the people,’” “dictated antiparticipatory reforms,” “supported disfranchisement,” and projected a version of Americanism that was “for whites only.”[lii]  David R. Roediger argued, “The Progressive project of imperialist expansion and the Progressive nonproject of Jim Crow segregation ensured that race thinking would retain and increase its potency.”[liii]  Eric Foner pointed out that Progressives “bore the marks of their nineteenth-century origins” and thus “the idea of ‘race’ as a permanent, defining characteristic of individuals and social groups retained a powerful hold on their thinking.  Consciously or not, it circumscribed the ‘imagined community’ of Progressive America.”[liv] 

So then what is “Progressivism” and what is the Progressive legacy?  These terms are embedded in an “age of social politics.”[lv]  There were many reformist groups of various political and ideological stripes at the turn of the 20th century, of which Progressivism was but one potent example.[lvi]  There were not only many reformist groups that articulated many different reform initiatives, but Progressives also took “many paths” towards reform.[lvii]  As a culturally homogeneous and economically secure social class (although uneasy in their security), Progressive reformers had the ability, education, and socio-economic resources to create many diverse voluntary organizations, which they used to further various social, economic, political, and cultural causes.  Progressives were animated on the whole by a Republican-Populist-Protestant infused ideological orientation that often blended capitalist, scientific, and professional methods, all under a politicized and racialized banner of WASP “Americanism.” 

Progressives sought many types of social change and aligned themselves with various other ideological groups to achieve reform coalitions on specific issues and initiatives, but they were primarily concerned with devising a clear and efficient order to harness modernity and industrialization under the tri-partite control of 1) a regulatory State integrated with 2) WASP civic associations and business corporations, and directed by 3) a technocratic elite.  “Americanization,” to introduce the broad and complicated term that is the central focus of this larger study, could be described as the essential yet myriad conceptualization for this controlling order: “America” as a nationalistic and cultural identity would be the new order the Progressives sought, and they were very confident, as Gary Gerstle pointed out, “that their use of government and science would turn immigrants into Americans.”[lviii] 

As Robert Wiebe argued in “Framing U.S. History: Democracy, Nationalism, and Socialism,” the challenge of white Americans during the 18th and 19th centuries was not to reform so much as to “create a social order,” and that social order, my larger study will argue, was a program of Americanization, which included the formation of a federated bureaucracy centered within the corporate-capitalist State.  By the early 20th century, this State would come to infuse, unite, and control the parameters of foreign and domestic policy under a neo-liberal rhetoric of welfare capitalism, consumer affluence, and technocratic professionalism.[lix]  However, the large-scale initiative of Americanization would not be uncontested, nor would it be rhetorically or programmatically uniform.  As a consensus identity emerged and was inculcated within the public school system, the margins of American society were infused by minority populations who struggled for their own human dignity and opportunity within the American system.  The Progressive century of Americanization would be the ideological center of heated debate.  Preconceived notions of homogeneous and class-based democratic citizenship would be challenged as many minority populations asked, “Who gets to be an American?” – and further, as a socio-cultural-political ideal, “What ought America to be?”     

Cultural War: Epistemological Authority, Progressive Politics, and the Americanization Movement

The diverse and often contradictory Progressive reform movement has come to characterize an era of “social politics” in U.S. history.[lx]  At the turn of the 20th century there were many reformist groups with various political and ideological programs (Populists, Progressives, Socialists, anarchists, labor unions, reactionary populists, nativists, and more).  Progressivism was the most influential reform ideology of the 20th century because it offered a conservative liberal-capitalist framework for tempering the more radical demands of socialists and labor activists.[lxi]  Not only were there many reformist groups with many different initiatives, but the highly diverse group called the Progressives also took “many paths” of reform.[lxii]  As a relatively culturally homogeneous and economically secure, yet uneasy, social class – what would later be termed the White Anglo-Saxon Protestant (WASP) middle class – Progressive reformers had the ability, education, and socio-economic resources to create many diverse voluntary organizations, which they used to further various social, economic, political, and cultural causes.  Progressives were animated on the whole by a Republican-Populist-Protestant infused ideological orientation that often blended capitalist, scientific, and professional methods, all under a politicized and racialized banner of WASP “Americanism.” 

Progressives sought many types of social change and aligned themselves with various other ideological groups to achieve reform coalitions on specific issues and initiatives, but they were primarily concerned with devising a clear and efficient order to harness modernity and industrialization under the tri-partite control of 1) a regulatory State integrated with 2) WASP civic associations and business corporations, and directed by 3) a technocratic elite.  The idea of “Americanization” could be described as the fundamental yet myriad conceptualization for this controlling order: “America” as a distinct people with a uniform culture and a clear sense of national identity would be the new order the Progressives sought, and they were very confident, as Gary Gerstle pointed out, “that their use of government and science would turn immigrants into Americans” and, thereby, mold newcomers into the newly constructed Progressive American nation.[lxiii] 

But there is also a disturbing relationship between Progressive reformers and “reactionary populism” that should be addressed.  Reactionary populists like Ku Klux Klan members were not the stereotypical backwoods yokels.  The Klan was an organized movement of white, evangelical Protestant, and mostly petit-bourgeois members (though it also included working-class laborers and middle-class professionals) who felt threatened by the developments of modernity.  The rise of divorce, feminism, black radicalism, white racial liberalism, labor unionization and strikes, monopoly capitalism, and increased immigration are just some of the major issues initiating their conservative reaction.[lxiv]  Klan members were going through what Richard Hofstadter once called (in a different context) a “status revolution.”  Klansmen were economically better off than most blacks and many whites and often upwardly mobile, but they still felt “vulnerable,” “unstable,” and insecure in their relatively privileged social position.[lxv] 

Klansmen were conservative, populist, Jacksonian democrats with an explicitly racialized and Protestant conception of White Anglo-Saxon citizenship consecrating white supremacy.  They reacted to modernity and industrialization (to the extent that industrialization touched the South) in systematically similar ways to the Progressive programs: both groups formed organized associations; they rhetorically denounced “threats” to their idealized social order; they formulated an ideology to defend an embattled cultural identity; they took action to “reform” or remedy what they considered to be negative socio-political and cultural developments; and they used coercion when rhetorical appeals were not effective.  The two main differences between Progressives and reactionary populists were that the Klansmen had an intense distrust of centralized government and statist regulatory authority, and they had a willingness to use violent force[lxvi] as a standard socio-political tactic.

Another similarity between Progressives and Klansmen was a hierarchical, Social Darwinist belief in the racial and cultural superiority of “white” “civilization,” which was often equated with Americanism.[lxvii]  C. Vann Woodward pointed out in 1954 that many Americans, including Progressive reformers (living in all areas of the nation, the North, West and South), shared many of the Klansmen’s beliefs about a “White” America: “a republic is possible only to men of homogenous race;” the United States of America was “a white man’s nation” based on a “white man’s religion:” “to stand as impregnable as a tower against every encroachment upon the white man’s liberty, the white man’s institutions, the white man’s ideals, in the white man’s country, under the white man’s flag.”[lxviii]  It is no accident of historical fortune that the “Progressive Era” was also the “great age of segregation” in the United States.[lxix]  The Progressives for the most part harbored deep suspicions and prejudices against many groups and social classes that seemed alien to their WASP middle-class way of life.  Progressive reformers set up hierarchically ordered binary oppositions of identity based on class, race, gender, religion, and age.  The “fundamental paradox of progressive politics,” wrote McGerr, was that Progressives spoke the language of democracy, but in thought and deed they were “not very democratic at all”: the “progressives’ condescension toward other groups” created “a narrow definition of ‘the people,’” “dictated antiparticipatory reforms,” “supported disfranchisement,” and projected a version of Americanism that was “for whites only.”[lxx]  David R. Roediger argued, “The Progressive project of imperialist expansion and the Progressive nonproject of Jim Crow segregation ensured that race thinking would retain and increase its potency.”[lxxi]  Eric Foner pointed out that Progressives “bore the marks of their nineteenth-century origins” and thus “the idea of ‘race’ as a permanent, defining characteristic of individuals and social groups retained a powerful hold on their thinking.  Consciously or not, it circumscribed the ‘imagined community’ of Progressive America.”[lxxii] 

As Robert Wiebe argued, the challenge of white Americans during the late 18th and 19th centuries was not to reform so much as to “create a social order,” and that social order was a program of Americanization, which included the expansion of a corporate-capitalist State, the dissemination of a WASP nationalism (Americanism), and the trained loyalty of the American public through the public schools and a coordinated civic society.  By the early 20th century, this State would come to infuse, unite, and control the parameters of foreign and domestic policy under a neo-liberal rhetoric of welfare capitalism, consumer affluence, and technocratic professionalism.[lxxiii]  However, the large-scale initiative of Americanization would not be uncontested, nor would it be rhetorically or programmatically uniform.  As a consensus identity emerged and was inculcated within the public school system, the margins of American society were infused by minority populations who struggled for their own human dignity and opportunity within the American system.  The Progressive project of Americanization would be the ideological center of heated debate over American nationalism, citizenship, and the common good.  The notion of a culturally homogeneous, racialized, and class-based democratic citizenship would be challenged by many minority populations.  The early 20th century debate focused on two questions: “Who gets to be an American?” and “What ought America to be?”     

Many white Americans supported a white supremacist view of American nationalism.  After the conquest of the Philippines, President McKinley wrapped U.S. foreign policy in the doctrine of “the white man’s burden.”  He stated that the Filipinos could not be left to themselves because “they were unfit for self-government” and, thus, the Americans had a duty “to take them all, and to educate the Filipinos, and uplift and civilize and Christianize them, and by God’s grace do the very best we could by them.”  Indiana Senator Albert J. Beveridge believed that the “American Republic” was destined, through the will of God and the dictates of the “highest law” of “race,” to be “the most masterful race in history.”  Nell Irvin Painter explained: “Imperialism was elemental, racial, predestined, for God had prepared the English-speaking people, master organizers, for governing what Beveridge called ‘savage and senile people.’”  Even anti-imperialists, who argued against the trappings of empire for many reasons, often framed their critiques of foreign intervention with the same racist assumptions, and focused more on the implications of empire for poor whites in America.  Many Southerners actually felt vindicated by imperial policies, although skeptical about ruling over more non-white people.  Benjamin Tillman argued to his fellows in Congress that “We of the South” had already “borne this white man’s burden of a colored race in our midst.”  In 1883 the Supreme Court had already invalidated the Civil Rights Act of 1875, and by the 1890s there was widespread acceptance of Jim Crow segregation and disenfranchisement laws.  The color line became an increasingly important national preoccupation by the early 20th century as the U.S. became defined more and more as a white man’s nation.  Thus self-proclaimed “progressives” never touched the white supremacy of the South and the de facto “racial hierarchy” of the country as a whole.[lxxiv]

The conservative Progressive Teddy Roosevelt saw the United States as standing on the threshold of “Armageddon,” with the evils of plutocratic industrial power on one side and the evils of the violent mob on the other.  Under the banners of the “New Nationalism” and the “New Freedom,” Roosevelt called for the regulation of society and the economy by an empowered and enlightened federal government that would act as a disinterested arbitrator between conflicting political factions, like labor and capital (of course, more radical voices pointed out the impossibility of a disinterested federal government, as federal policy was often in the hands of industrial capitalists and their appointed voices in the Congress).  Teddy Roosevelt succinctly summarized the ideals of these Progressive reformers: “the object of the government is the welfare of the people.  The material progress and prosperity of a nation are desirable chiefly so far as they lead to the moral and material welfare of all good citizens.”[lxxv]  But as Gary Gerstle has pointed out, Roosevelt’s nationalism fused civic ideals with a racial platform: 1) “political and social equality for all, irrespective of race, ethnicity, or nationality, and a regulated economy that would place economic opportunity and security within the reach of everyone;” 2) the maximizing of “opportunity” for racially superior Americans while also limiting opportunity for racially inferior Americans and immigrants; 3) dealing out “harsh discipline” by means of “marginalization” and “punishment” and/or “Americanization” to “immigrants, political radicals, and others who were thought to imperil the nation’s welfare.”  Rooseveltian nationalism pivoted around a conception of “controlled hybridity” by which both “racial hybridity and purity” and “racial inclusion and exclusion” combined into a more expansive Americanism, but one still marked by racial prejudice, intolerance, and WASP superiority.  Roosevelt embraced many of the new European immigrants, both Catholic and Jewish, but he continued to exclude Afro-Americans and Asians from the “crucible” of America.  Roosevelt adopted Herbert Croly’s conception of “New Nationalism” and used it as a Progressive platform to extend full citizenship only to the new European immigrants, on the condition that they left behind their old cultural affiliations to become “100 percent American.”[lxxvi]

Gerstle traces coercive Americanization programs to Theodore Roosevelt’s conception of racial nationalism.  Roosevelt’s conception of “controlled hybridity” allowed for the assimilation of certain ethnic minorities in America only if they completely Americanized, by which he meant leaving behind European identity, tradition, and loyalty and taking up American identity, tradition, and loyalty: The immigrant “must not bring in his Old-World religious[,] race[,] and national antipathies, but must merge them into love for our common country, and must take pride in the things which we can all take pride in.  He must revere our flag; not only must it come first, but no other flag should ever come second.  He must learn to celebrate Washington’s birthday rather than that of the Queen or Kaiser, and the Fourth of July instead of St. Patrick’s Day…Above all, the immigrant must learn to talk and think and be United States.”  Roosevelt believed that the duty of the American public school should be to turn immigrants [“hyphenated Americans”] into “Americans pure and simple” because it was “an immense benefit to the European immigrant to change him into an American citizen.”  He also supported private voluntary associations in their work of Americanizing the immigrant both outside and inside the school.[lxxvii]

John Higham traced the origins of early 20th century Americanization efforts to the widespread xenophobia and nativism of the 1890s and earlier.  Early forms of nativism congealed into a rampant and rabid nationalist crusade of “America for Americans” and “100 per cent Americanism” during World War I.  Fear of the foreigner gave way to a more ambiguous fear of “disloyalty,” “the gravest sin in the morality of nationalism,” which was any thought that might question the “Absolute and Unqualified Loyalty to Our Country.”  This search for disloyalty focused uncomfortably on “hyphenated Americans” (German-Americans in particular) and their ability to support not only the war effort, but the greater cause of American nationalism.  Infusing the search for disloyalty was a “positive and prescriptive” rhetorical abstraction that did not rise “to the dignity of a systematic doctrine:” “100 per cent Americanism.”  While there was no specific dogmatic or programmatic ritual to prove one’s “Americanism,” there were several assumptions underlying this phrase.  One was a “belligerent” demand for “universal conformity” to the “spirit of nationalism” and “total national loyalty” to the State, which was regulated through “the pressure of collective judgment.”  However, “passive assent to the national purpose was not enough; it must be grasped and carried forward with evangelical fervor” through the “inculcation of a spirit of duty:” “Patriotism therefore was interpreted as service.”  Theodore Roosevelt forcefully supported this sentiment: “We must sternly insist that all our people practice the patriotism of service…for patriotism means service to the Nation…We cannot render such service if our loyalty is in even the smallest degree divided.”  It was at this time in 1917 that “The American’s Creed” was introduced as a classroom ritual in public schools to remind children of the object of their loyalty, but more so to rhetorically instill the virtue of “right-thinking, i.e. the enthusiastic cultivation of obedience and conformity.”[lxxviii]

100 per cent Americanism, as Higham argued, was primarily a rhetorical affair of “propaganda” and “exhortation,” but with the onset of the war nationalists supported the expansion of state powers and “the punitive and coercive powers” of the state to support if not mandate loyalty and conformity.  There were many grass-roots initiatives to suppress German language newspapers, eliminate German from public school curricula, boycott German opera, and rename German foods (sauerkraut became “liberty cabbage”).  There were even many “secret societies” of extralegal militias looking for spies and disloyal subjects.  One reported organization was the Anti-Yellow Dog League (supposedly with a thousand affiliated branches), which was made up of adolescent boys over 10 who searched for disloyal Americans.  Perhaps the most famous extralegal organization was the American Protective League, which boasted 250,000 members and 1,200 dispersed units.  The APL was the Justice Department’s “semiofficial” loyalty and conformity watchdog (they even had official badges), composed mostly of middle-class professionals and subsidized by corporations.  The state and federal governments acted in turn, partly in response to the vitriolic sentiment of the American public.  Congress passed an act which repealed the charter of the German-American Alliance, and many state governments banned the teaching of German.  The Alien Enemies Act of 1798 was revitalized (this statute gave the President “arbitrary” authority over aliens in the U.S. in terms of arresting, restraining, and deporting individuals at will), and the Espionage Act was passed in 1917 (this statute penalized citizens for obstructing the war effort or aiding the enemy via “false statements”).

The Sedition Act was passed in 1918, which made any disloyal opinion illegal (whether against the nation, the flag, the government, or the Constitution) and punishable by twenty years in prison.  This Act was used extensively against radicals in the U.S., as “any radical critic of the war was customarily designated a ‘pro-German agitator.’”  As Higham noted, “the new creed of total loyalty outlawed so many kinds of dissent.”[lxxix]

As far as the immigrant population in America was concerned, there seemed to be a “paradox of American nationalism,” which combined both “fraternity” and “hatred.”  The demands for unity and conformity turned coercive and aggressive mostly towards Germans and radicals, which thereby allowed many immigrant individuals and communities to at least outwardly conform to the nationalist purpose and even join the military.  As Higham argued, “To a remarkable degree the psychic climate of war gave the average alien not only protection but also a sense of participation and belonging,” albeit within an atmosphere of “force of fear and compulsion.”  This charged atmosphere of 100 per cent Americanism survived and thrived after the war as self-proclaimed Americans still searched out disloyalty.  This placed immigrants in a precarious position.  The American Legion formed in 1919 in order “To foster and perpetuate a one-hundred-percent Americanism” and ferret out radical agitation.  Other “Loyal Legions” and vigilante groups (the second Ku Klux Klan re-emerged in 1915 and grew to several million followers in the 1920s) began to conflate disloyalty, radical agitation, and the foreign-born as related problems.  The Big Red Scare of 1919 ignited a fever pitch of nationalist hysteria whereby anti-radical nativism began to indiscriminately target immigrant populations, which in turn began to affect industrial labor relations.  The Red Scare also drove zealots like Attorney General Palmer to push for a general sedition law, which would allow for the prosecution of American citizens as well as the foreign-born for dissenting opinion.  The New York legislature threw out five elected members solely because of their Socialist affiliation.  But when Palmer’s apocalyptic foretelling of revolution did not materialize on May Day 1920, the country began to realize that there was no widespread internal threat, and by the mid-1920s the crest of 100 per cent Americanism began to flow into more peaceful expressions of national fervor.[lxxx]

Political liberals of the time did not often disagree with conservative nationalist ideology, except for the more rabid forms of white supremacy and xenophobia.  Some liberals did, however, disagree over tactics.  By the end of the 19th century liberalism displayed a reformist edge and it organized, as Gary Gerstle has documented, “rational interventions in society and culture,” often by turning “to the state as an institutional medium capable of reconstructing society and of educating citizens.”  Classical liberalism revolved around free markets, limited statism, and bourgeois morality, which often defended corporate capitalism, segregation, and disenfranchisement.  Progressivism was a three-pronged liberal reaction to (a) socialism and labor radicalism, (b) the “extraordinary concentration of power and wealth,” and (c) a diverse influx of immigrating ethnic groups.  Progressives wanted to find ways to promote and protect “freedom of trade and individual liberty” by way of state regulation and welfare, and by way of “guild socialism.”  They also wanted to engage in “cultural reconstruction” because liberals believed in the importance of individual moral character as the foundation of civic virtue.  When dealing with foreigners this “reconstruction” took the form of “Americanization” in order to “culturally and morally transform…aliens into citizens.”  But Progressives were a diverse bunch (“left-leaning Progressives” ranging from socialists to left-leaning pluralists, and “rightward-leaning Progressives” from Americanizers to hard-core nationalists preaching “100 percent Americanism”), and because of these conflicts of purposes and methods they “had difficulty fashioning a cultural politics to which they could all adhere,” which eventually led to a loss of “coherence as a political movement.”[lxxxi]  The Americanization movement, however, was an important liberal focal point during the first decades of the 20th century.  Americanizers ranged from the more conservative and exclusivist “new nationalists” led by Theodore Roosevelt, Herbert Croly, and Frances Kellor, to the more liberal and egalitarian “cosmopolitan pluralists” led by John Dewey, Randolph Bourne, and Jane Addams.

An Americanization movement “emerged” from within the Progressive movement in order to offer “moderate civic nationalist alternatives” to the coercive racial ideology of white supremacists, exclusionists, and nativists who wanted immigration restrictions and limited freedom for immigrants.  Noah Pickus defined the Americanization movement in a positive light as “a wide range of legal, political, medical, civic, and cultural efforts to help immigrants adjust to their new surroundings and to encourage Americans to accept them.”  The Bureau of Naturalization in “An Outline Course in Citizenship” (1916) defined Americanization as the transformation of “uninformed foreigners, not comprehending our language, customs, or governmental institutions, to intelligent, loyal, and productive members of society.”  The Americanization movement was reacting against the sense of social fragmentation and conflict caused by industrial, economic, social, and institutional changes, and it was made dramatically urgent by the massive influx of immigrants and by the strange newness of a “nationally oriented American society.”  Progressive reformers felt an urgent need to reorder society and give to all citizens a new “common identity” – a national identity as Americans.[lxxxii]

The Americanization movement was concerned with “national unity,” but different factions approached this central issue differently.  Pickus broke the Americanization movement into two camps: “right-leaning Progressives” like Theodore Roosevelt, Herbert Croly, and Frances Kellor, and “left-leaning Progressives” like John Dewey, Randolph Bourne, and Jane Addams.  Both wings offered liberal alternatives to immigrant restriction, but the left wing wanted a pluralist and cosmopolitan “international nation,” while the right wing believed in a narrower nationalism that welcomed immigrants only if they “relinquished cultural and political habits thought to be at odds with a robust American identity,” and the right wing was willing to use compulsion and force in order to create and preserve the bonds of national unity.[lxxxiii]

The Americanization movement has been known primarily because of the actions of the more powerful, “mainstream,” and influential right-leaning Progressives, and Pickus focused more on this group in his book.  Under the banner of “New Nationalism,” right-leaning Progressives sought to “eradicate” the ethnic identity of white European immigrants, while disavowing (through silence and segregation) any place for non-white Americans, in order to establish a “uniform national identity” and a fervent sense of patriotism based on WASP principles and culture.  Theodore Roosevelt proudly proclaimed in 1906, “We are making a new race,” and he later added, “The only man who is a good American is an American and nothing else…There is no room in this country for hyphenated Americans.”  Roosevelt admonished, “The immigrant must learn to talk and think and be United States.”  Nationally, the Americanization movement was administered by the newly formed (1905) Bureau of Naturalization (under the leadership of Commissioner Richard Campbell and his deputy Raymond Crist) and the Bureau of Education (under the leadership of Commissioner Philander P. Claxton; Fred Butler, director of the Americanization Division; and, more importantly, Frances Kellor, director of the Division for Immigrant Education, a division completely supported financially by a non-governmental organization, the National Americanization Committee, also led by Kellor).

Until the early 1910s the primary method of Americanization had been teaching immigrants how to be “sufficiently American,” as Fred Butler asserted, “so that they will not be a danger to us.”  However, Noah Pickus has noted that Americanizers turned to more “aggressive” methods by the fall of 1915, symbolically demonstrated through the National Americanization Committee’s change of slogan from “Many People, But One Nation” to “America First.”  Frances Kellor and the NAC were very concerned about people not speaking “the same language,” not “follow[ing] the same flag,” and engaging in “anti-American” activities like “class consciousness and race hatred.”  Americanization efforts sought not only to make citizens of immigrants, but to make all Americans “loyal” with a “respect for authority” because the “security and prosperity” of the nation depended on it.  The NAC organized and promoted social and industrial programs, military preparedness, coercive educational programs, and the recruitment of ethnic leaders, especially members of the ethnic presses.  Frances Kellor had wanted to keep Americanization efforts from “alien baiting” and “repressive measures,” and she argued that Americanization should also accompany increased economic opportunities for immigrants, but Pickus argued that she was “pushed aside by forces that were committed to an ideologically pure Americanism and had no interest in programs that directly aided newcomers.”  In 1919 the government banned NGO support of government agencies and, thus, NAC support for the Division for Immigrant Education came to an end, and Frances Kellor was removed as national coordinator for Americanization efforts.

In 1918 the Bureau of Naturalization began to use naturalization fees to publish and distribute Americanization textbooks, and it established the Division of Citizenship Training led by Deputy Commissioner Raymond Crist.  Crist believed that “nearly all can be transformed through attendance at the public schools into desirable citizenship material.”  He helped coordinate support for Americanization programs in public schools across the nation, and by 1922 more than 750 U.S. cities and towns had some type of Americanization program; however, these programs suffered from high dropout rates, dry fact-based textbooks, and a reliance on rote memorization and recitation.  In 1922 the secretary of labor, James J. Davis, wanted to set up a registration system that would force aliens to register for a fee upon entrance to America, and he also wanted mandatory Americanization programs.

Davis felt strongly that the U.S. had been “making citizenship entirely too cheap” and he wanted to protect Americans from “contact with the mental, moral and physical delinquents of all the world.”  He defended his insistence on coercion in the face of critics by arguing, “If we compel the alien to know America, I have no fear that there will come that change of heart necessary to produce an American citizen.”  The push for more coercive Americanization programs linked Americanization efforts to the simultaneous drive by more nativist and reactionary elements for exclusionary immigration policies.  However, even these coercive efforts crumbled by the early 1920s as federal, state, and local politicians “proved unwilling to support Americanization programs if doing so required them to provide funding.”  But Americanization efforts did not die out; instead they expanded and were folded into the very fabric of American life and public schooling.[lxxxiv]

Noah Pickus has argued that left-leaning Progressives did not have a strong enough political “vision” to battle right-wing nationalism and, thus, their “vagueness and confusion” could not put forth a “clear, coherent, [or] compelling moderate alternative position.”  Thus Americanization devolved into a “zero-sum calculation” that forced immigrants to become “100% American.”  However, Pickus argued that Americanization was not a “coercive and exclusionary project from its inception.”  He argued that it was the “fear and insecurity of the war” that helped “legitimate otherwise objectionable policies,” and he further argued that part of the reason Americanization efforts collapsed was that “many of its proponents were simply not willing to pursue compulsory assimilatory measures to their logical extremes.”  Pickus claimed that the “achievements” of the Americanization movement were “remarkable,” and he listed four: legislation to protect immigrants, “large-scale practical assistance” to immigrants, outreach programs (including the development of adult education), and improvements to the naturalization system.[lxxxv]     

 

Institutionalizing Progressivism and Americanism: Education Reform and the ‘One Best System’

This essay follows close on the heels of our first foray into the historiographical debate over the conceptual terminology of social, cultural, and political “Progressivism.”  This essay will develop a comprehensive yet selective portrait of so-called “Progressive” education so as to outline the major ideological and curricular developments that this term designates (in both theory and practice).  We will also trace the borders of the historiographical debate over the conceptual delineation of Progressive education and, thereby, evaluate its usefulness as a concept for understanding U.S. educational reform programs during the first decades of the 20th century.

 

The Progressive Education Movement: A Short History

The ideological and curricular roots of Progressive education go back centuries, rooted especially in French and German Romanticism.  Early philosophical and educational influences include Jean Jacques Rousseau (1712-1778), Johann Heinrich Pestalozzi (1746-1827), Johann Friedrich Herbart (1776-1841), and Friedrich Froebel (1782-1852).  The term Progressive applied to education in the English language seems to have come from Necker de Saussure’s book L’Education Progressive, ou Etude du Cours de la Vie (Paris, 1836), which was translated into English in London as Progressive Education; or, Considerations on the Course of Life (1839).

American Progressive education is often linked with the earlier nationalist and millennial “propaganda” of the common school reformer Horace Mann, whose mid-19th century common-school movement equated “education” with “national progress.”  Mann combined “Jeffersonian republicanism,” “Christian moralism,” and “Emersonian idealism” within his “total faith” in “the power of education.”  Mann believed universal education would be the “great equalizer” of democratic citizens.  He also saw education as a moderating force that would “balance the wheel” of society while also creating “wealth undreamed of.”  Mann was deeply disturbed by the conflict he saw around him (social, political, economic, and cultural).  He wanted a shared national value system that would ensure a sense of community and a common political identity.  He saw a public, “common” school as the perfect instrument for this mission.  But in order to realize this vision of a public school system, Mann had to form “political coalitions” that often united “disparate interests” in a very “political” program of consensus building.[lxxxvi]

What Horace Mann began, men like William Torrey Harris saw to fruition.  When Harris started his work as a school reformer the idea of “universal education” was still very “radical” to most Americans.  When Harris had finished his career, universal education “had been made the nub of an essentially conservative ideology.”  Harris argued for a broader definition of education as a process of socialization that would inculcate children into the local and emerging “national” culture and prepare them for adulthood as democratic citizens.  His four basic principles of education were: 1) schooling should prepare children to become lifelong learners as adults; 2) the school should teach only what the child would not be taught by family, friends, and associates; 3) the school should teach only such subject matters as would have “a general theoretical bearing on the world in which the pupil lives;” and 4) the school should teach “moral education,” but never “religious education.”[lxxxvii] 

The early formation of American Progressive education as a “movement,” according to self-proclaimed Progressives John Dewey and Robert Holmes Beck, started in Quincy, Massachusetts.  It was here that Colonel Francis W. Parker became the superintendent of schools in 1873 and initiated the “Quincy System” soon thereafter.  This new system of education became a quintessential model for what later reformers would label “Progressive.”  In 1892 the journalist Joseph Mayer Rice ran a series on U.S. public schools for the Forum, which was published as a book in 1893, The Public School System of the United States.  While he did not explicitly mention a Progressive educational movement, he did use the term Progressive many times in relation to notable school reforms and initiatives, especially the “Quincy System” of Colonel Parker.  Also in 1892, several attendees (including John Dewey) of the National Education Association meeting in Saratoga Springs, New York formed the National Herbart Society to promote the educational philosophy of the famous German pedagogue.  A year later G. Stanley Hall published his first major research project on child study, “The Contents of Children’s Minds” (1893).  This research subject would eventually feed into a larger child study movement that would become the major plank of the Progressive education platform: child-centered curriculum and instruction.[lxxxviii]

A Progressive educational “movement” was said to have stirred in earnest by the time John Dewey began his “Laboratory School” in Chicago in 1896 and gave his lectures on The School and Society in 1899.  The movement supposedly congealed between the founding of the Association for the Advancement of Progressive Education (or the Progressive Education Association, PEA) in 1919 and its publication of Progressive Education starting in 1924.  The high-water mark for Progressive education in terms of organizational development and theoretical vitality was during the 1930s.  An impassioned organ of radical Progressive educational theory and practice, The Social Frontier, appeared in print in 1934 as an outlet for Social Reconstructionist thought.  Due to financial insolvency, it was later tempered and incorporated into the PEA as Frontiers of Democracy, which ran from 1939 to 1944.  In 1936 many influential Progressive educators and intellectuals formed the John Dewey Society as a moderate forum to discuss Progressive and liberal philosophy.  The John Dewey Society also started to publish important educational research yearbooks by 1937. 

1938 might have marked the apex of Progressivism in the U.S.  In this year the Progressive Education Association’s enrollment peaked at 10,440 members; Time magazine featured the PEA as a cover story and announced its wide influence; and John Dewey and Boyd Bode both warned fellow Progressives that the movement was dissolving into a non-political, child-centered libertarianism instead of a comprehensive movement for social democracy.[lxxxix]  However, despite its organizational success, the actual impact of Progressive innovations on American education by the 1930s is uncertain.  The celebratory framework of most reformist literature has obscured more concrete evaluations by later historians.[xc]  C.A. Bowers pointed out that due to Progressive educators’ focus on elementary school teachers and classrooms, “the influence of the Progressive education movement was restricted to only a fraction of the nation’s 1 million teachers” – although he argued that one should not discount the wide influence of Progressive intellectuals in teacher-training Education departments.  Bowers estimated that William H. Kilpatrick taught almost 35,000 students between 1909 and 1938.  Larry Cuban has made one estimate of Progressive influence on the practice of public schooling.  He argued that at its peak (between 1920 and 1940) no more than 25% of New York public school teachers “adopted Progressive teaching practices, broadly defined, and used them to varying degrees in the classrooms.”  David Tyack, Robert Lowe, and Elisabeth Hansot argued that, overall, actual Progressive reform in public schools was a mixed bag, and to the extent that concrete Progressive reforms were initiated and retained over a long period of time, they “fared best in relatively prosperous states and districts” and “most affected children from favored social classes.  Ironically, of course, these were the groups least in need of help.”[xci]

It is important to note in more detail the radical group of Progressive educators that organized as a block during the 1930s in opposition to capitalism and New Deal liberalism.  They called themselves “Social Reconstructionists” and they were the radical wing of the Progressive education movement.  The intellectual catalyst and the most important spokesman for this group was George S. Counts, whose call to arms – “Dare Progressive Education Be Progressive?” – was unleashed in 1932.  Taking inspiration from radical social scientists like Charles Beard and Thorstein Veblen, as well as the broader socialist movement, Counts published the first manifesto for the Social Reconstructionist platform in 1932, Dare the School Build a New Social Order?, which was shortly followed by The Social Foundations of Education (1934) and the more tempered writings of William H. Kilpatrick, Education and the Social Crisis (1932) and his edited volume of radical Progressive thought, The Educational Frontier (1933).  Most of the Social Reconstructionists were first active members of the PEA, but between 1931 and 1933 these radicals expressed their desire for more militant social reform through education in the pages of Progressive Education and within PEA committees – most notably the Committee on Social and Economic Problems and its publication, A Call to the Teachers of the Nation (1933).  After Counts self-consciously raised the ideological banner of Social Reconstruction, he helped found The Social Frontier in 1934, which was then the official organ for radical Progressive thought and became a marked contrast to the more moderate views found in Progressive Education.  According to C. A. Bowers, the Social Reconstructionist faction rose to prominence in the wake of the Great Depression and took control of the Progressive education movement by 1947, although by then they espoused a more moderate platform based on democratic values, like deliberation and “democratic living.”  But of course, by this time Progressive education was becoming an embattled cause.[xcii]

By the late 1940s and early 1950s both wings of Progressive education were under widespread attack as the cultural climate in the U.S. narrowed its horizons and punished unpopular opinions.  By mid-century, America was becoming a very “counterprogressive” country.[xciii]  Lawrence Cremin noted, “The surprising thing about the Progressive response to the assault of the fifties is not that the movement collapsed, but that it collapsed so readily.”  In 1951 David Hulburd published This Happened in Pasadena, chronicling the demise of Pasadena’s Progressive superintendent Willard Goslin.  John Dewey died in 1952.  The Progressive Education Association collapsed by 1955.  Progressive Education (financed by the John Dewey Society after the end of the PEA) issued its last publication in July 1957.  And the John Dewey Society published its last yearbook in 1962 (but the organization remains active to date).  Despite the speedy demise of the movement within a decade, Lawrence Cremin was soberly optimistic about its importance.  In 1961 he noted, “the transformation” Progressive educators were able to achieve in the school system “was in many ways” “irreversible.”  He hinted that Progressive education would be back, if in fact it ever completely left: “the authentic Progressive vision remained strangely pertinent” – perhaps “awaiting” a “reformulation and resuscitation that would ultimately derive from a larger resurgence of reform in American life and thought.”  Cremin uttered these words quite self-consciously as the first comprehensive chronicler of the history of Progressive education.[xciv]

 

Historiography of the Progressive Education Movement

As an academic pursuit in the United States, the History of Education is a relatively new field of study.  It has been around for only about 100 years and it is still arguably fighting for its status as a major disciplinary category of history.  It was originally linked to the Philosophy of Education in the late 19th century and began to emerge on its own with the publication of Source Book of the History of Education for the Greek and Roman Period (1901), which was written by a sociologist named Paul Monroe.  Monroe was asked to research the History of Education by the Dean of Teachers College at Columbia University, James Earl Russell, and Monroe would write several volumes thereafter.  Due to Monroe’s work, the History of Education emerged as a disciplinary field of study.  The first institution to offer doctoral degrees in History of Education was Teachers College at Columbia University.  Teachers College alumni produced several influential dissertations on the History of Education during the first two decades of the 20th century.[xcv] 

It was Ellwood Patterson Cubberley, the first dean of the School of Education at Stanford University, who took hold of the History of U.S. Education and strove to make it not only a thriving academic discipline, but also a professional “science.”  His monumental work toward this end was Public Education in the United States (1919).  It was an important early contribution toward the so-called “scientific” history of the early 20th century, although it suffered from the same flawed conceptions of “science” and “objectivity” as did other “scientific” works of history that emerged at the time.[xcvi]  Under the rhetoric of “science,” Cubberley’s work offered a selective and celebratory “Whig” interpretation of educational history, and it served as a campaign tool for his own part in the Progressive educational crusade.  Despite the efforts of scholars like Cubberley, the History of Education remained a small sub-field for the first half of the 20th century, and most of the major organs of historical research, including the American Historical Association, would publish only a few articles on the subject.[xcvii]

It was not until the 1960s and the breakthrough scholarship of Bernard Bailyn and Lawrence A. Cremin, combined with the launching of the journal History of Education Quarterly, that the History of Education became a respected sub-field within the academy.[xcviii]  By this time the historical community was going through a transvaluation of values, as professional and epistemological standards were changing.  Much of the new history and historiography of U.S. education challenged old Whiggish pieties and introduced a much more complicated, fragmented, and often radical critique of American education.  The 1960s historiographical debate within the history profession, especially within the education community, stoked the flames of a cultural divide that would smolder well into the 21st century.[xcix]

One of the seminal works of this formative period was The Transformation of the School: Progressivism in American Education, 1876 – 1957 (1961) by Lawrence Cremin.  It was an important study of the history of Progressive education, and in many ways it remains unsurpassed.[c]  In this prizewinning book[ci] Cremin tried to sketch a full picture of not only the educational and theoretical principles of the movement, but also its intellectual and historical generation.  Like the last chapter’s survey of the historiographical literature on the larger conception of “Progressivism,” we will now focus particularly on various conceptions of Progressive education so as to get some clarity about the meaning and significance of the term “Progressive” as it related to education and educational reform.  Thus, we will be restricting our historiographical discussion to one central question: What was Progressive education?  To the extent that Americanization was involved within the Progressive educational program, it will be mentioned as a topical subject, but a full analysis of the history and meaning of Americanization programs and a review of the literature on this topic will come later in the chapter.

In The Transformation of the School, Cremin was quite clear that Progressive education was a “many-sided effort” and “marked from the very beginning by a pluralistic, frequently contradictory, character.”  He cautioned his reader that he would offer no “capsule definition of Progressive education” because “none exists, and none ever will; for throughout its history Progressive education meant different things to different people.”  However, with this caution in mind, Cremin offered several definitions with which one could define this movement.  Progressives were “moderate” reformers who believed in democracy and wanted to use education as “an adjunct to politics in realizing the promise of American life.”  He described Progressive education as “part of a vast humanitarian effort to apply the promise of American life – the ideal of government by, of, and for the people – to the puzzling new urban-industrial civilization that came into being during the latter half of the nineteenth century.”  As such it was a “many-sided effort to use the schools to improve the lives of individuals” in four distinct ways: (1) a “broadening” of the school to meet and treat all areas of the community; (2) applying the new “scientific” research of educational professionals inside the classroom; (3) reshaping a student-centered curriculum to meet the needs of a diverse student body; (4) instilling a “radical faith that culture could be democratized” and thereby training responsible citizens to lead the country to progress and prosperity.  A quintessential expression of the Progressive ethos came from Jane Addams, whom Cremin quoted in his introduction: “We have learned to say that the good must be extended to all of society before it can be held secure by any one person or any one class; but we have not yet learned to add to that statement, that unless all men and all classes contribute to a good, we cannot even be sure that it is worth having.”[cii]

Cremin argued that Progressive education and its pedagogical agenda could best be defined by summarizing the seven founding principles of the Association for the Advancement of Progressive Education (or PEA).  PEA’s 1919 statement of purpose proclaimed, “The aim of Progressive Education is the freest and fullest development of the individual, based upon the scientific study of his mental, physical, spiritual, and social characteristics and needs.”  The principles of this organization included: (1) children should be free to naturally develop according to both individual self-expression and the social needs of the community; (2) the learning process should include a) hands-on direct experience, b) a holistic conception of knowledge and its practical application, as well as c) self-reflexivity; (3) the teacher should guide the social and intellectual development of the child, and this necessitates a) a well-trained and creative teacher, b) a stimulus-rich learning environment, and c) small class sizes; (4) learning assessments should include both “objective and subjective reports” on the “physical, mental, moral, and social” aspects of the child’s development; (5) the overall wellbeing and health of the student is a primary concern; (6) the school should communicate and cooperate with the home in educational, developmental, and extracurricular endeavors; (7) the Progressive school should be a “laboratory” of “new ideas” and it should take the lead in educational initiative.[ciii]

Cremin also evaluated the specific impacts of Progressive initiatives within the U.S. public school system.  He listed 10 points of measurable change: (1) an “extension” of education on all levels whereby more and more children were steadily attending school from kindergarten on through high school; (2) the school system shifted to six years of elementary school, three years of junior high, and three years of high school; (3) a “continuing expansion and reorganization of the curriculum at all levels;” (4) an expansion of extracurricular activities; (5) “more variation and flexibility in the grouping of students;” (6) the learning environment – the classroom – became more active, informal, and mobile; (7) teaching materials, including textbooks, expanded to increase the interest and learning of the student; (8) the architecture of schools changed to accommodate gymnasiums, playgrounds, athletic fields, and such; (9) teachers became better trained and certified – in a word, professionalized; (10) school administration became more centralized, professionalized, and bureaucratic.[civ]

There were also some notable failures of Progressive education, which Cremin noted: (1) because of its success and the diversity of its practitioners, it eventually suffered from schisms and the distortion of its comprehensive aims; (2) Progressives were better able to articulate “what they were against than what they were for;” (3) Progressive reforms often demanded too much time and ability from teachers; (4) after reforms were initiated, Progressives were often tied to specific programs and could not “formulate next steps;” (5) a failure to adequately deal with the conservative post-war climate; (6) the professionalization of educators and administrators brought isolation from reform coalition partners in the public who were key in backing and initiating reform programs; and finally, Cremin argued, (7) Progressive educators became too attached to Progressive initiatives and too detached from the “continuing transformation of American society.”[cv]

In the third volume of Lawrence Cremin’s award-winning series,[cvi] American Education: The Metropolitan Experience, 1876 – 1980 (1988), he revisited his definition of “education” and how its meaning in the American context was tied to both nationalism and reformism.  In this volume Cremin noted that by the late 19th century education was becoming increasingly valued by the public at large, and so educational reforms were increasingly becoming political conflicts.  But at the same time, Cremin pointed out, the “American paideia” had not been settled or formalized and, thus, “Americans were still in the process of defining what it meant to be an American.”  However, this did not stop the growing corporate state and its elite WASP representatives from fashioning their own version of American identity as an Anglo-Saxon “manifest destiny,” which was being actively carried over the continent and across the seas as a form of “cultural imperialism” (accompanying, of course, more traditional forms of economic and political imperialism as well).  But struggling alongside this push for a dominant American paideia modeled on WASP cultural values were “alternative American paideias” fomented by African Americans, Native Americans, and immigrant communities.  This created a “complicated” educational terrain as competing socio-cultural groups fought over the right to transmit their own diverse cultural value systems.

It is within this context that “Americanization” programs were launched both within and outside of the public school system, mostly by Progressive forces.  The overarching purpose of these programs was to bring a homogenized ideological order to the newly conceived “nation” and, thereby, solidify a dominant American identity with which to inculcate both children and adults so as to “assimilate” the population into what Progressive reformers believed to be the “dominant American community.”  But Cremin also noted the “pluralistic” character of the many (often “contradictory”) Progressive “movements,” and thus he dwelt a great deal on how Progressivism also contained a strand of “liberalism” that sought to “democratize the concept of culture” and promote an “inclusive politics” that addressed the “problems of inequality” within the U.S.[cvii]

The last work by Lawrence Cremin that we will note is “Education as Politics,” a lecture given in 1989.  Cremin made it clear (within the highly charged standards and multicultural educational debates of the 1980s) that “education has always served political functions.”  More specifically, he claimed the educational endeavor eternally focuses on the “future character of the community” and to that extent education can never be separated from politics: “It is impossible to talk about education apart from some conception of the good life; people will inevitably differ in their conceptions of the good life, and hence they will inevitably disagree on matters of education; therefore the discussion of education fall[s] squarely within the domain of politics.”

Cremin argued that U.S. education has always been politicized, especially by Progressive reformers, but he tried to make the argument that it became “increasingly politicized” in the wake of Progressivism, after World War II, as many diverse groups “with differing conceptions of the good life” escalated the battle over “the nature and character of education.”  These battles ensued, Cremin pointed out, because of a longstanding U.S. Progressive tradition of using the system of education to try to “solve” all sorts of socio-political problems, “and in so doing to invest education with all kinds of millennial hopes and expectations.”  Cremin mentioned social critics like Hannah Arendt, who pointed out that educational systems are limited in their ability to change the world, yet she noted that this has not stopped successive waves of Americans from trying to use education for just that purpose.[cviii]  When people battle over educational systems and curriculum, Cremin argued, they are really debating “alternative views of the good life,” especially what “kind of America they would prefer to live in and what it might mean to be an American.”

Cremin believed Dewey to be the great philosopher of American social and political ideals in relation to its educational practices, but Dewey was not the only intellectual force to make the connection between education and politics.  Cremin argued that a “distinctively American paideia” molded out of WASP values, nationality, and patriotism became the norm during the 19th century, and it demanded a “relentless” program of cultural and political “assimilation:” “the more different the newcomers from the British-American model, the more intense the manifestations of concern.”  But the process and programs of “Americanization,” Cremin argued, did not have the desired effect.  First of all, for all the rhetoric of a unified WASP paideia, it was never completely realized, and it was often “loosely and variously defined:” The American norm to which school children were “supposed to be assimilating often proved confusing and elusive.”  Second, the American paideia began to change in relation to the ever-evolving context of American society.  And finally, deep-seated racism in all parts of the U.S. gave rise to many severe restrictions and rejections of specific minority communities based on their assumed inferiorities.  This in turn gave rise to many protest movements over the course of the 20th century and a vigorous debate over “precisely what it meant to be an American.”  Cremin ended his essay by noting that American identity has always “inevitably depend[ed]” on the complex and changing “interaction” of the diverse U.S. population.  He also reiterated the limited, yet central, role of education within past and present debates on Americanism: “Education cannot take the place of politics, though it is inescapably involved in politics, and education is rarely a sufficient instrument for achieving political goals, though it is almost always a necessary condition for achieving political goals.”[cix]

If Lawrence Cremin was the first major historian of U.S. education, his seminal reputation was eclipsed not a generation later by the work of David B. Tyack, professor of Education and History at Stanford University.  Tyack has authored and co-authored a host of influential works that have focused on various reform initiatives during the 19th and 20th centuries.  We will be surveying several of his major works.

His first major book was The One Best System: A History of American Urban Education (1974).  This book focused on the “politics” of education, by which Tyack meant “who got what, where, when, and how.”  Tyack wanted to study not only the decision makers who initiated reform, but also those segments of the American population (the “poor and dispossessed”) who were marginalized from the political process and, thereby, often the passive recipients of reform programs.  Being largely left out of political decisions, the poor were often “victimiz[ed]” “predictab[ly] and regular[ly]” by “systematic” reform initiatives that were not drafted or implemented in their interests.  And further, these “victims” of systematic injustice were often blamed for their own marginalization.  In framing his discussion around the issue of justice, Tyack’s study invoked (while criticizing) Progressive principles.  He primarily sought to expose the “systemic injustice” at the root of Progressive reforms, which meant a focus not on individuals per se but on the institutions within the “social system” that created and reinforced an atmosphere of injustice:

“It is more important to expose and correct the injustice of the social system than to scold its agents.  Indeed, one of the chief reasons for the failures of educational reforms of the past has been precisely that they called for a change of philosophy or tactics on the part of the individual school employee rather than systemic change – and concurrent transformations in the distribution of power and wealth in the society as a whole…Despite frequent good intentions and abundant rhetoric about “equal educational opportunity,” schools have rarely taught the children of the poor effectively – and this failure has been systematic, not idiosyncratic.  Talk about “keeping the schools out of politics” has often served to obscure actual alignments of power and patterns of privilege.  Americans have often perpetuated social injustice by blaming the victim, particularly in the case of institutionalized racism…The search for conspiracies of villains is a fruitless occupation; to the extent that there was deception, it was largely self-deception.  But to say that institutionalized racism, or unequal treatment of the poor, or cultural chauvinism were unconscious or unintentional does not erase their effects on children.”

Tyack was also lending his skills as a scholar toward a broader initiative of “social justice,” which he argued (also working out of a Progressive conception) could be found “in the old goal of a common school, reinterpreted in radically reformed institutions.”[cx]

Tyack looked mostly at school reforms in a growing urban society.  Administrative Progressives believed that the older systems of rural schools in the U.S. were too haphazardly organized, inefficient, substandard, and too “subordinated” to community interests.  Reformers, especially urban reformers, thought that rural communities were backwards and ignorant of the complex needs of modern society.

Progressive reformers “blended economic realism with nostalgia, efficient professionalism with evangelical righteousness” so as to initiate a complex re-ordering, nationalization, and professionalization of the public school system.  They wanted to engineer the “one best system” of education that could create a “standardized, modernized ‘community’ in which leadership came from the professionals.”  While cloaked in the rhetoric of democracy, the needs of society, and the education of all, Progressive school reforms in urban areas were more about reconstituting the nature of authority in order to effect a “transfer of power from laymen to professionals,” and thereby create a nationalized (and standardized) educational bureaucracy.  The results of this restructuring did lead to “better school buildings, a broader and more contemporary course of studies, and better qualified teachers and administrators,” while also giving “country youth greater occupational mobility” and introducing them to “different life-styles.”[cxi]

But there was also a darker side to urban reforms.  In a search for the “one best system,” administrative Progressives continually stressed “order” and “standardization.”  It was a program of “institutionalization” to combat the social chaos of modernity in urban America.  William T. Harris, superintendent of schools in St. Louis, asserted in his School Report for 1871, “The first requisite of the school is Order: each pupil must be taught first and foremost to conform his behavior to a general standard.”[cxii]  School modernization and professionalization were modeled on the factory system’s bureaucratic division of labor, and they often reinforced principles like punctuality, chain of command, coordination, systematizing, hierarchical organization, impersonal rules, regularized procedures, objective standards, efficiency, rationality, and precision.  In some cases reformers sought professional bureaucracies so as to promote a more equalized “meritocracy” that would serve all segments of the urban community impartially and fully.  However, the “rational” bureaucratic systems of education often “reinforced racial, religious, and class privilege,” as well as normalizing the “subordination” of students and teachers to the authority of white, male school administrators.  WASP professionals simply assumed that their values and interests as “honest and competent experts” were universal goods and, thus, under their control “public education was the most human form of social control and the safest method of social renewal.”[cxiii]

Prefiguring a later book, David Tyack and Elisabeth Hansot published “From Social Movement to Professional Management: An Inquiry into the Changing Character of Leadership in Public Education” (1980).  In this article Tyack and Hansot “interpret[ed] changing forms of leadership in public education” from the 19th to the 20th centuries.  The common school reformers largely shared a “Protestant-republican ideology” and engaged in an evangelical process of “nation building” through a “millennial” crusade to create a “righteous society.”  Common school reformers were led by charismatic leaders whose main tools were exhortation and persuasion based on a shared Protestant-republican ideology: “leadership in public education largely took the form of guiding a decentralized social movement because the chief task was the creation of common schools through the mobilization of opinion and effort at the local level.”  20th century reformers believed in “social efficiency,” by which they meant organizational reforms resulting in “new structures and processes of schooling that would enable public education to mesh smoothly and efficiently with a corporate society.”  These professional school men sought to “take the school out of politics” by centralizing school authority, consolidating children in larger schools, standardizing curriculum, and normalizing a bureaucratic-business model of education: “Believing that the basic structure of society was just and Progressive, the new leaders thought that they knew how to bring about a smoothly running, socially efficient, and stable society in which education was the major form of human engineering.”

Tyack and Hansot emphasized that these two movements were “not so sharply distinct” and that there was “significant overlap between the two eras.”  Both movements shared in the continuity of organizational structuring and expansion that started with the common school leadership.  Tyack and Hansot argued that the grass-roots common school movement was the “most impressive case of institution building in American history.”  Its success was largely due to a homogeneous leadership core, which shared similar ideological orientations and social and economic interests.  These reformers wanted to create a national system of Christian common schools in which a “Protestant paideia” would “express and perpetuate” their shared socio-cultural values and “civic purpose.”  Tyack and Hansot argued that part of the “genius” of this movement “was that its leaders were able to wrap their cause in a noncontroversial Americanism,” which legitimated their effort by consecrating the Protestant-republican ideology as both a “social mandate” and a national mission.  Early 20th century reformers worked within the earlier common-school tradition while engineering an organizational “revolution” so as to reconfigure the established American paideia for an industrial, corporate-capitalist nation-state.[cxiv]

Tyack and Hansot later expanded “From Social Movement to Professional Management” into a book on the same topic.  In Managers of Virtue: Public School Leadership in America, 1820 – 1980 (1982), Tyack and Hansot re-examined the 19th century common school movement that created the U.S. educational system.  In structuring their conceptual framework, Tyack and Hansot incorporated much of the “radical critique” of public schooling that historians had written since the late 1960s.[cxv]  Tyack and Hansot argued that 19th century common school reformers saw their educational program as part of a larger mission of consolidating and consecrating a “Christian nation” based on “patriotism, godliness, and prosperity.”  The project of American nationalism converged with the reformers’ visions of the Kingdom of God, whereby an idealized version of the republic demanded righteous citizens engaged in a providential project.  Common school reformers rarely acknowledged their own socio-cultural “blinders” and pontificated as if they spoke for all Americans, thereby programmatically trying to assimilate citizens and immigrants alike into a chauvinistic WASP “version of Americanism.”  In the words of one enthusiastic commentator: “America is Protestantism…Protestantism is Life, is Light, is Civilization, is the spirit of the age.  Education with all its adjuncts, is Protestantism.  In fact Protestantism is education itself.”  Tyack and Hansot argued that the American common school movement was the “most ambitious and successful social movement” of the 19th century.  By century’s end, it was able to create “more schooling for more people than in any other nation and resulted in patterns of education that were remarkably uniform in purpose, structure, and curriculum, despite the reality of local control in hundreds of thousands of separate communities.”[cxvi]

Progressive reformers around the turn of the 20th century carried on similar activities, but with a slightly different focus.  They sustained the “earlier moral earnestness and sense of mission” of the common school reformers, although Progressives exchanged “much of the specifically religious content” for a more secular nationalism.  Progressives sought to “control the course of human evolution scientifically through improving education.”  Progressives used a rhetoric of “moral charisma and millennial hope” to sanctify their “dream” of “professionalism” and “social efficiency.”  Believing whole-heartedly in the “myth” of progress, Progressives saw themselves as “social engineers who sought to bring about a smoothly meshing corporate society,” and thereby, “redesign” the public schools to complement this project.  Of course this meant “constraining” public oversight in the schooling process so that public education could become a “professionalized” endeavor that prepared students for their subordinate places in the emerging, modern mass-industrial society.[cxvii]

Tyack and Hansot described administrative Progressives as part of a self-conscious leadership elite (several prominent administrative Progressives described their select group as the “educational trust”).  They saw themselves as “professional managers” who were able to reshape the public school system “according to canons of business efficiency and scientific expertise.”  These administrative Progressives used a rhetoric of “science and business efficiency” in order to reshape the discourse of public schooling in terms of “problems to be solved by experts.”  They believed that “experts would run everything to everyone’s benefit.”  This rhetoric helped legitimize institutional reforms whereby educational power was “consolidated” in “large and centralized organizations” that were modeled after corporate structures: “In seeking to depoliticize education, in moving the regulation of education upward and inward in urban and state bureaucracies, in basing legitimation for new authority on scientific expertise, the new managers in education were following patterns of action and thought pioneered in the corporate sector of business.”  And while the schools were operating more and more like corporate organizations, they were also legitimizing the gross inequality and hierarchy of an industrial mass-society under the cover of a meritocratic equality of opportunity that was supposedly being taught in the public schools.  But Tyack and Hansot made clear that the administrative Progressives were contested at every turn and their vision of public schooling was not the only administrative program.  However, “the ideology of depoliticized expertise splintered opposition and defused the effectiveness of protest” and thus the “ideology of professionalism” was able to entrench the vision and program of administrative Progressives within the centralized, bureaucratic public school system that remains to this day.[cxviii]

In later work, David Tyack and Elisabeth Hansot, along with Robert Lowe, researched Progressive education during the Great Depression in Public Schools in Hard Times: The Great Depression and Recent Years (1984).  Their emphasis fell on the “complex interaction” of the “political economy of public education” during the Great Depression years and, specifically, how the process and organization of schooling was affected by the tug and pull of “local governance and finance, of growing assertions of state power, and of national influence of various kinds exerted largely through powerful private organizations.”  They demonstrated how “pluralistic patterns of interests and power” orchestrated “quite different results in different places.”  Tyack et al. also discussed the 1930s as the possible “high point” of Progressive education, but acknowledged that different historians have used the “foggy concept” to refer to “many different ideas and practices,” so it’s quite hard to make an argument for its peak.

The conceptual muddle of “Progressivism” was not helped by the reformers’ penchant for negative ideological maneuvering (what they were against) instead of positive programmatic statements (what they were for).  There was also the added difficulty of distinguishing between “what leaders said” and “what actually happened behind the schoolhouse door.”  The authors noted that Progressive education as a historical concept refers to many “kinds of reformers” who “thought of themselves as Progressive,” who defined the significance of “Progressive” in many different ways, and who worked for organizational and curricular modification to meet the needs of changing historical circumstances as they saw them.  Social Reconstructionists, reformist administrators, libertarians, and liberals all had a different vision and program of Progressive education.  To the extent that “Progressive” reforms in education happened during the Great Depression, it was most significantly a “classroom affair, a new kind of interaction between the teacher and the students,” most likely highly varied between different classrooms, schools, and districts, but also limited by the power of traditional teaching practice and by cutbacks due to fiscal retrenchment.

The authors also noted that as specific cultural and historical contexts dictated, “Progressive methods could be used to serve conservative ends”; specifically, they mentioned how “Progressive” reforms rarely if ever confronted the structural inequalities associated with race and class.  The black school reformer and Progressive Horace Mann Bond articulated this issue clearly at the time (he has often been left out of most historical discussions of “Progressive” education, as have other black school reformers of the period).  In “The Curriculum and the Negro Child,” Bond wrote: “The schools have never built a new social order, but have always in all times in all lands been the instrument through which social forces were perpetuated.”  Tyack et al. maintained that no significant widespread “Progressive” changes occurred during the Great Depression years.  The organizational and curricular operations of public schools “changed very little,” and to the extent there were reforms initiated, they can be seen as “short-term dislocations” in the midst of “long-term continuity.”[cxix]

Outside of the preeminent work of the two leading History of Education scholars, Lawrence Cremin and David Tyack, there have been many other important works published on both Progressive education history and the larger history of educational reform that surrounds this particular movement.  One important early work was by C. A. Bowers in 1969, The Progressive Educator and the Depression: The Radical Years.  Bowers argued that there were two factions within the Progressive educational movement.  The more powerful and mainstream faction represented a romantically oriented “cult of the child” and articulated a child-centered pedagogy.  The other faction came to be known as the “Social Reconstructionists.”  They wanted the schools to be part of a larger effort to address current social problems so as to use the schools to reform society.  The Social Reconstructionists used the rhetoric of class struggle to advocate a platform of social planning and socialistic collectivity.

When George S. Counts gave his landmark speech, “Dare Progressive Education Be Progressive?” in 1932, he was both criticizing the movement’s political neutrality and urging Progressive educators, specifically members of the PEA, to forsake moderate liberal reformism in order to embrace more radical educational, social, and political pieties.  Counts of course meant the rejection of capitalism so that schools could embrace and propagate socialism.  To further this mission, Counts wanted teachers to become political actors inside the nation’s classrooms and, thereby, not be afraid to use “indoctrination” to “check and challenge” capitalist dogma.  Counts believed that schools would indoctrinate students no matter what and, thus, the question became: whose interests would the public school curriculum serve?

The Social Reconstructionists had a very definite idea.  In a PEA pamphlet drafted by the Committee on Social and Economic Problems, A Call to the Teachers of the Nation (1934), they exhorted teachers to reject capitalism and renew American democracy: “[teachers] owe nothing to the present economic system, except to improve it; they owe nothing to any privileged caste, except to strip it of its privileges…a powerful organization, militantly devoted to the building of a better social order and to the fulfillment…of the democratic aspirations of the American people.”  Bowers called this “one of the most extreme and utopian statements to be made by any group during the depression” – even more so than the 1934 Manifesto of the Communist Party of the U.S.A.[cxx]

Bowers critiqued the Social Reconstructionists, usually by surveying the criticisms of their contemporaries; Progressive educationalists like John Dewey and Boyd Bode made many trenchant critiques.  Bowers noted several.  The Social Reconstructionists had a “ubiquitous sense of mission,” which harkened back to the evangelical millennialism of the common school reformers; they often espoused a simplistic utopianism; and they had a romantic conception of the “power of education to eradicate the evil in the world.”  Bowers also called the Social Reconstructionists “poor social analysts” because they “lacked an understanding of the teacher’s actual position in society:” “Even though teachers had no real protection from being dismissed arbitrarily by school boards – and they thus possessed neither economic security nor the ability to formulate significant policy – the Social Reconstructionists viewed them as a force capable of directing social change.”  Bowers argued that these educational radicals took a position too extreme to align themselves with labor and too infatuated with the schools to fit well with the Communists, which made their call for teachers to lead the class struggle seem ridiculous to most observers.  Bowers quipped, “the editors’ messianic zeal had led them far down the road of absurdity.”  Alienating themselves from other Progressives and ignored by other radicals, the Social Reconstructionists eventually abandoned their radical socialism.  They took a conservative turn during the war, which intensified afterwards.  Calls for class war were exchanged for slogans urging the saving of democracy and the fighting of totalitarianism.  Ironically, after their journal folded, the more moderate Social Reconstructionists took the field as the most powerful and influential Progressive educators and exerted an important authority over curricular debates in the late 1940s.  The message had now become community centered schools, democratic deliberation, democratic cooperation, and fostering “democratic living.”  This “new doctrine” would have a wide and lasting imprint on the American public schools, but would eventually be rhetorically co-opted by more conservative forces in the 1950s.[cxxi]

In “Education and Progressivism,” Joel Spring argued that “Progressivism” had been used and defined so broadly, specifically in Cremin’s use of the term in The Transformation of the School, that it was “a valueless definition since it literally includes everyone.”  Spring criticized the “lack of clarity” and “confused picture” that this “vague” and “obscure” term identified.  He instead called for a more “sharply defined” conceptual terminology of educational reform based on the particulars of various reformist ideologies.  Specifically, Spring suggested that reformers’ visions “of the good life” – the ultimate purposes reformers were trying to produce in changing individuals and society – could be the best way to conceptualize distinct “reform” movements.  Spring focused on one example in his article: the movement for “social efficiency.”[cxxii]

Herbert M. Kliebard followed Spring’s lead in 1986 when he published the first of three editions of his very influential book, The Struggle for the American Curriculum.  Kliebard completely denied the existence of a Progressive educational “movement:”

“The more I studied [Progressive education] the more it seemed to me that the term encompassed such a broad range, not just of different, but of contradictory, ideas on education as to be meaningless.  In the end, I came to believe that the term was not only vacuous but mischievous.  It was not just the word “Progressive” that I thought was inappropriate but the implication that something deserving a single name existed and that something could be identified and defined if we only tried.”

Instead he argued for competing “interest groups” with “distinct” “ideological positions” and “agendas for action.”  These factions contemporaneously co-existed in often “antagonistic” ways, each with its own reform agenda, although sometimes they were able to bury differences in order to form “temporary coalitions around a particular reform.”  During what has been called the Progressive era, these antagonistic factions “struggled for control of the American curriculum,” and the 20th century became an educational “battleground.”  Often these groups were fighting over the core issue of “differing forms of knowledge” legitimating specific cultural values.  Kliebard focused on only four interest groups that represented the major educational divisions at the turn of the century.  The most powerful was the entrenched “humanist” group; the three reform groups challenging humanist hegemony were the child study movement, the social efficiency movement, and the social meliorists.  Outside the fray, yet infused within it, Kliebard uniquely argued, was the towering figure of John Dewey, who, while not directly allied with any one group, helped define and critique the parameters of 20th century educational reform.[cxxiii]

Kliebard refined and articulated his epistemological position with regards to the conceptual territory of “Progressivism” in a 1993 “Afterword” to the 2nd edition entitled “The Search for Meaning: Curriculum Conflict in the Context of Status Politics.”  He claimed that Progressive education was no more than a “mélange of reforms” that have been “lumped together” under a common term.  This was due in large part to Lawrence Cremin’s seminal use of the phrase.  While Cremin warned against any one definition, he equated it with “the educational phase of American Progressivism writ large.”  Edward A. Krug’s two volumes on the Shaping of the American High School prefigured the turn of direction that would occur in the 1970s when the vagueness and vacuity of the phrase “Progressive movement” was questioned (by Filene and Spring among others), ultimately to be rejected and jettisoned by an influential minority within the historical community.  In its place came two new epistemological uses.  One was a restricted definition of “Progressive” attached to narrower historical entities, like Tyack’s use of “administrative Progressives.”  The other use focused on the “politically and socially regressive nature” of many so-called “Progressive” reforms.  Kliebard noted tongue in cheek: “We are left with the feeling that much of what went on in the Progressive era was socially and politically, and perhaps even pedagogically, regressive.”  Thus, instead of using the term “Progressive,” historians like Kliebard have looked for ideologically distinct social, political, and educational “movements” that are much more clear and distinguishable in their affiliations, goals, programs, and practices – “persons identified with a movement, in other words, see themselves as sharing common programs or beliefs.”  Using this methodology and narrowing the definition of a “movement” à la Peter G. Filene, Kliebard questioned “Progressivism” out of existence: “Once a movement is understood in this way, one can then go on to determine whether the term Progressive can legitimately be applied to such a collective, but it is not clear at all that such a collective exists…In short, neither in terms of the coherence of the program for reform nor in its membership nor in its overall ideology can a definition of Progressivism as a social and political movement be articulated.”

Instead of using the terminology of Progressivism, Kliebard formulated his own position, which rested on three points.  First, Progressivism cannot be defined “in terms of stable attributes.”  Second, specific ideological subgroups can be identified and their more “consistent and recognizable ideological positions” can be conceptualized.  And third, all reform issues could be complicated by reform coalitions that could consist of a blending of various distinct ideological sub-groups.  Thus, Kliebard’s conception of “Progressive education” was a broadly sweeping “reaction against traditional structures and practices but with multiple ideological positions and programs of reform.”  This broad “reaction” is composed of distinct and “reasonably coherent subgroups and movements,” but in no way do all these pieces “add up to one Progressive education movement.”  Hence the central term “struggle” in the title of Kliebard’s book.  The American curriculum was “contested terrain” and “the prize for which the various interest groups competed.”[cxxiv]

The political scientist Paul E. Peterson wrote The Politics of School Reform, 1870 – 1940 (1985) in which he used quantitative methods to study three different urban school systems (Atlanta, Chicago, and San Francisco) in an effort to examine the particulars of late 19th and early 20th century educational reform in three unique historical contexts.

In an effort to caution against generalizations, Peterson argued, “diverse participants focused on those specific objectives in which they had the greatest stake.  Although some related their specific demands to larger views of the good society, their demands were met by counterclaims with alternative visions:” “Each of these groups had their own distinctive sets of interests; no stable alliance among any two of them was able to determine policy choice in all situations; instead, outcomes in particular instances fluctuated as different coalitions came together in an ever-changing series of uneasy alliances.”  School policy was a constant battleground between competing factions.  In order to gain “legitimacy,” Peterson argued, school officials tried to “separate themselves, as institutions, from particular groups and factions:” “No one social group held sufficient economic and political power to dictate the course of school policy.  The ultimate winners in such an uncertain contest were, of course, the schools themselves.  As organizations, they could only prosper from contests and conflicts among competing interests.”  It is out of this complex historical environment that the “politics of institutionalization” took place, whereby, urban educational leaders sought “expansion and professionalization” so as to make public schools an “organized system of autonomous power” within politically divided, fiscally strained, and ethnically contentious communities.[cxxv]

Peterson’s study paid particular attention to a “threefold system” of social “stratification” in industrial America differentiated by class, status, and political power – especially in relation to the “noticeably inegalitarian” “structure of educational institutions:”[cxxvi] education was a class-based institution that “declar[ed] one’s social worth” and “validat[ed] the status of social groups.”  Education was “a prize to be won by each social group in order for that group’s culture to be affirmed, legitimated, and perpetuated.”  To the extent that public schooling became an agent of “cultural imperialism,” Peterson argued, it did so not by “compulsory instruction” but by “the exclusion of a group from public schooling.”  Peterson criticized the historical argument that 19th century public schools were used to control and train a “docile work force.”  Instead he argued that public school officials “ignored” ethnic immigrants and the poor “until adequate facilities had been extended to the more favored:” “Instead of insisting on attendance in publicly controlled institutions, they allowed foreigners to go to their own schools.  Instead of keeping potential troublemakers under their watchful eyes, the poorest, most outcast segments of the community went uneducated altogether.”  But as public schools became more and more “open” and “responsive” to changing community needs, the common pattern of school reform in relation to racial minorities was to give “separate” or “inadequate facilities,” or to keep them “completely excluded from education.”[cxxvii]

Peterson argued that “were it not for widespread citizen involvement in politics, it is likely that the status differences in a culturally pluralistic society would have led to systematic repression of minorities.”  Local organizations, business, and labor were also at work in expanding the school curriculum: politically powerful ethnic minorities were able to get bilingual education, like the Germans in San Francisco and Chicago; business leaders argued for cheap “basic” education and also manual training; and labor wanted both vocational and a diversified liberal arts curriculum.  Peterson noted,

“By the end of the century the debate over the purposes of public education was subtly shifted from questions of cultural incorporation and citizenship to those of compatibility with the demands of the labor market.  Thus businessmen could attack foreign-language instruction, music, and some forms of manual training as frivolous departures from the fundamental purposes of public education at the same time that they called for additional courses in the practical skills required for growing industrial economies.  Working-class and ethnic groups, on the other hand, defended the differentiated curriculum as an essential ingredient of a democratic society.  At the same time, these groups sought practical courses that would widen avenues of economic opportunity.  School officials, for their part, maneuvered to protect and expand their organization in the context of these changing political pressures.”

This diverse political context was also complicated by the clash of ethnic groups in an American environment of “native dominance” by self-professed Anglo-Saxons.  Peterson argued, based on the evidence he found, “schools were uninterested in (or incapable of) systematic ethnic discrimination” in terms of access to classrooms and resource allocation because school officials were mostly concerned with consolidating their institutional autonomy in the face of hostile local party machines – although he did qualify this statement by acknowledging that quantitative data cannot “address the quality of the educational experience of children from various ethnic groups,” where “ethnic discrimination” most likely happened.[cxxviii]

But growing acceptance of ethnic diversity within the public school system was not the whole story.  At the same time, many schools across the country practiced a systemic exclusion and segregation of specific minority populations.  Peterson focused on the institutional treatment of black, Japanese, and Chinese populations.  Peterson’s general explanation for the segregation and exclusion of particular ethnic minorities in the U.S. was the lack of political power: “If the group could not impose sanctions on elected officials, the schools were content to provide only the legal minimum, ignoring the barrage of pleas and petitions from the minority.  In most cases, political resources were difficult to accumulate because racial minorities either were explicitly denied the right to vote or were left out of the dominant political coalitions.”  After emancipation blacks in the South were eager for education, but during the later 19th century, they were not only educated separately in segregated and overcrowded facilities (often excluding many students because there was not enough room), but those facilities were also “markedly inferior” and school supplies were often lacking.  Blacks were also excluded from secondary schools until 1920.  But they had a strong desire for schooling and a measure of political power, which they were able to use effectively up until 1892 in order to receive “concrete” educational concessions.  However, blacks began to be systematically disenfranchised in 1892 when the Jim Crow South initiated the white primary and voter restrictions and, thus, from 1892 until the 1940s blacks found it even harder to improve their meager system of segregated education in the South.  In Chicago blacks were able to integrate somewhat into the public schools because they were such a small minority, although when the black population increased by the 1920s de facto segregation ensued and their segregated schools suffered in similar ways as did southern black schools.

Because they were such a small minority in San Francisco, blacks were integrated into the public school system in 1875.  But the Chinese, constituting about 9% of the population of San Francisco in 1880, were systematically prevented from any public education until 1884, when a lawsuit allowed segregated schooling, which became the norm well into the 20th century.  Japanese students were allowed to attend integrated schools in San Francisco only because of the considerable support of the Japanese government, which used diplomatic leverage with President Roosevelt.  Peterson emphasized that many minority populations in the U.S. had to first fight for their right to public schooling (which usually resulted in segregated schools), then they had to fight for educational improvements, and finally they had to fight for integration.  Minority success in each stage was the result of “changes in their political status,” and as minorities “gained their political rights, their rights to public education also came to be recognized:” “Reform was much more – and much less – than a class struggle, and reformers were often much more – and much less – than a class-conscious elite who imposed their interests and values on a resistant working-class majority.  Reform was itself as complex, uncertain, and pluralistic as many of the other forces shaping urban schools.”[cxxix]

In 1981 William J. Reese published an award-winning paper, “‘Partisans of the Proletariat’: The Socialist Working Class and the Milwaukee Schools, 1890 – 1920,”[cxxx] in which he argued that many histories of “Progressivism” and “Progressive education” have focused too much on “new” middle class professionals and, thereby, have ignored other social groups active at the time, like the urban poor, labor groups, and local socialist parties.  He argued that “studies written from the top of the educational system down are certainly valuable, though limited in terms of understanding the process of social change in the schools.”[cxxxi]  Reese argued that the poor and laboring classes were not simply “powerless” and therefore “victimized” by an urban elite.  He suggested instead that at the local level radical politics and third-party movements had some political success, and that the “Progressive” era was at the same time “the golden age of Socialism and labor radicalism.”

Reese examined Milwaukee, which in 1910 was the first city in the U.S. to be politically swept by a socialist party.  Reese detailed the diversity of the socialist “working class”[cxxxii] and how through a complex historical process it became “intertwined” and engaged in a “symbiotic relationship” with non-socialist groups (middle-class women’s groups, Progressive civic groups, and other voluntary associations) in order to form coalitions to address specific reform issues.  Through the process of reform coalition, these diverse reform groups interacted and influenced each other socially and politically, and while they differed fundamentally on “ultimate ends,” they were able to come to some agreement and find common ground on “immediate programs” like adding free lunch programs or playgrounds to the public school.  The Progressive education “movement” from the 1890s to the 1920s, Reese argued, was no more than an “amalgamation of different groups of people who had assembled at different points in time in response to the unique circumstances of Milwaukee politics,” and when the times changed during WWI and the coalitions fell apart, the “pieces” of the movement “could not be pieced together again.”

Reese and Kenneth Teitelbaum revisited socialist educational reformers in another article a few years later, “American Socialist Pedagogy and Experimentation in the Progressive Era: The Socialist Sunday School” (1983).[cxxxiii]  In this article Reese and Teitelbaum emphasized the socialist commitment to education.  They noted that while socialist groups and parties did align in political coalitions with “liberal Progressives and other radicals” over public school reform issues, they also had strong educational initiatives of their own, like the international Socialist Sunday school movement.  These schools sought in most cases to supplement the public school education of working class children by teaching them democracy, “the socialist spirit,” and “cooperative effort,” so as to instill in them the socialist cause and hopefully produce “good rebels.”  The authors argued for a more diverse understanding of educational reforms during the Progressive era and claimed that the “significance of the Socialist Sunday schools lies in their very existence” as a “dynamic opposition movement to the public school influences of the day.”

Reese expanded these early efforts on socialistic reform groups and published his important study, Power and the Promise of School Reform: Grassroots Movements During the Progressive Era (1986).  In this book Reese focused on the diversity of school reformers during the Progressive era and argued that school reform was “a battleground between various contending interests.”  School reform was such a contentious issue because a “single system of schools tried to serve a plurality of competing interests.”  Reese’s study looked at the “social conflict” and partisan wrangling over specific educational reforms in Rochester, Toledo, Milwaukee, and Kansas City.  He emphasized how actual reforms came into being through the “interaction between many competing forces:” “school innovation and reform were produced by interaction, resistance, adaptation, and accommodation, with the power of capital clearly in a dominant though never unchallenged position.”[cxxxiv]

Reese noted that many prominent middle class, professional, and business elites regarded the public schools as the foundation of a stable social and economic order, and also, as a reporter for Kansas City’s Democratic Times claimed, the “handmaiden of economic growth.”  But the rising control and centralization by urban elites was contested at every turn by many grassroots organizations.[cxxxv]  Reese called this process a “dialectics of school reform.”  There was a “constant exchange,” Reese argued, between “those who would centralize and those who would decentralize power.”  There was also cooperation as “shifting coalitions” would come together temporarily on different issues to campaign for municipal reform.  Reese noted one issue in particular that was popular and was able to unite various ideological groups: the overall expansion of the social functions of public schooling, like playgrounds, lunches, and medical care.  But with the coming of WWI the “spirit of civic activism” collapsed and the community became polarized, thus undermining “faith in cooperation” and bringing to an end the “remarkable era of grassroots Progressivism.”[cxxxvi]

The black historian and Progressive Horace Mann Bond published “Education in the South” in 1939.  In this article he agreed with the unabashed fascist Lawrence Dennis that schools were often the “instrument of a dominant elite” and that these elites have used education as a form of “social control.”  While he criticized Dennis, Bond criticized American education even more when he wrote: “The concept of social forces has not been neglected in application to educational institutions in America as a whole.”  But Bond emphasized the South, where the “dominant planting aristocracy” had used public schools “to maintain both the structure of social classes and that of racial caste” in order to protect their economic and social interests.  Bond noted that “the masses of white people in Southern States have, slowly and grudgingly, fought toward the achievement of systems of universal education for white children,” but blacks were left largely outside the push for reform.  Bond ended his article by saying that black education may improve, but as long as the “determination of control” lay with powerful, white, racist elites, “we may expect to flow inevitably educational structures that are the instruments of the dominant social and economic class which creates and controls them.”[cxxxvii]

Taking a page from Bond, James D. Anderson published an important addition to the Progressive education literature, although it was not really about Progressive education.  It was rather an indictment of the educational establishment, which failed to enact truly “Progressive” reforms as far as the second-class education of blacks in the South was concerned.  In The Education of Blacks in the South, 1860 – 1935 (1988), Anderson argued for a new understanding of American education in relation to its tortured history with African Americans:

“It is crucial…to recognize that within American democracy there have been classes of oppressed people and that there have been essential relationships between popular education and the politics of oppression.  Both schooling for democratic citizenship and schooling for second class citizenship have been basic traditions in American education…Black education developed within this context of political and economic oppression.”

Anderson made it clear that during the late 19th and early 20th centuries both Northern and Southern whites were in many ways “white supremacists” and “insisted on a second-class education” for blacks in order to accommodate them for “subordinate roles in the southern economy.”[cxxxviii] 

Anderson argued that in black educational circles Booker T. Washington stood virtually alone in pandering to white gradualism by developing the Hampton-Tuskegee Idea, which offered only industrial education.  Most black educators, black families, and black students wanted a liberal arts style education, just like the majority of white students received.  In regards to education, and much else, Anderson characterized blacks as a “responsible and politically self-conscious social class.”  But due to their subordinate and disenfranchised position, blacks were largely unable to get what they wanted educationally (not to mention politically).  Both white Southern educationalists and Northern educational philanthropists shared a certain “unity of belief in white supremacy,” which largely restricted (and sometimes outright forced) the channels of black education into segregated, inferior, and mostly industrial education.  Many white Southerners felt that school was “inappropriate” for blacks because “learning will spoil the nigger for work.”  Those white Southerners who conceded the need for black education wanted an educational system that would properly control blacks so as to keep them a permanent class of exploited labor.  Northern white missionaries and philanthropists were infused by a combination of white supremacy, paternalism, and democratic idealism.  They wanted blacks to have the Hampton-Tuskegee model of education so that blacks would become skilled, secure, and satisfied in their position as exploited labor.  Not surprisingly, the Hampton-Tuskegee model of education often resembled slave labor, with the “educational” curriculum consisting of 10-11 hours of agricultural work a day (for 6-7 days a week) supplemented with some evening classes for the more intellectually gifted.  Anderson concluded his study by focusing on the frustrated struggle of blacks for educational opportunity: “The education of blacks in the South reveals that various contending forces sought either to repress the development of black education or to shape it in ways that contradicted blacks’ interests in intellectual development.  The educational outcomes demonstrate that blacks got some but not much of what they wanted.  They entered emancipation with fairly definite ideas about how to integrate education into their broader struggle for freedom and prosperity, but they were largely unable to shape their future in accordance with their social vision.”[cxxxix]

In the late 1970s and early 80s Ronald K. Goodenow wrote a series of articles dealing with Progressive education and questions of race and ethnicity.  In these articles he made clear that “Progressivism” is a “complex and shifting phenomenon” that “defies easy definition,” and thus he warned that historical “over-generalization is dangerous.”[cxl]  We will be looking at two of his papers that dealt with the broader themes of Progressivism covered in this essay.

His article, “The Progressive Educator, Race and Ethnicity in the Depression Years: An Overview,” dealt with two scholarly omissions in the historical literature on the Progressive era.  Few historians had scrutinized the views of white Progressive educators on race and ethnicity, and few had looked at the “contribution of blacks and ethnics to Progressive education.”  Goodenow noted that some members of the PEA and many social reconstructionists did discuss racial discrimination and attempted to theorize ethnic conflict, although they generally organized their views around an assimilationist/Americanization framework.  Goodenow argued that there were two basic positions that Progressive educators took: “social-structural and institutional” determinants of racial discrimination (Dewey, Counts, Mabel Carney, and Buell Gallagher), and “cultural and psychological” causes of racial prejudice (Kilpatrick and Rugg).  The PEA as an organization discussed race and ethnicity within the confines of the Commission on Intercultural Education (1936-38), but the initiatives of this commission generally ignored structural-institutional determinants of racism and ended up stressing a depoliticized “cultural contribution” approach in an effort to promote national unity, tolerance, and democracy.

This article also looked at Southern Progressivism, which as an educational program was mostly concerned with the “modernization” of Southern schools, i.e. standardized curriculum, teacher professionalization, and centralized control of schools.  Outside of a few notable exceptions (Mabel Carney and Buell Gallagher), little effort was made to address race in the South except, of course, to reinforce segregationist and paternalist social control.  One Southern state curriculum guide explicitly stated that blacks were “a constant menace to the health of the community, a constant threat to its peace and security, and a constant cause of and excuse for the retarded progress of the other race.”  Despite the pious and often empty rhetoric of white reformers, which could serve conservative as well as Progressive ends, blacks were highly interested in Progressivism and generally saw “considerable potential” in using Progressive-democratic rhetoric to their advantage.  By turning Progressive rhetoric against white moderates it became “more difficult for them openly to oppose democratic change.”  There were also liberal black critics of Progressivism, like Horace Mann Bond, who criticized most Progressive programs for not addressing the structural-institutional determinants of racial oppression and for assuming that a “democratic social order” existed in which blacks could democratically seek to address their grievances and fulfill their aspirations.

Goodenow revisited Southern Progressivism three years later in “Paradox in Progressive Educational Reform: The South and the Education of Blacks in the Depression Years.”  Goodenow argued that Southern Progressivism was concentrated on “modernization while concurrently maintaining fundamentally racist patterns that themselves were contradictory to much Progressive ideology.”  The main programmatic efforts of Southern Progressives addressed standardized curriculum, scientific management, teacher professionalization, and centralized state control.  Within these programs “tolerance” was often used as a rhetoric for segregation and social control.  Blacks were to be trained “for loyalty, essentially menial tasks, and continued segregation.”  Goodenow condemned much of the Progressive program and its democratic rhetoric as “[Booker T.] Washington’s accommodationism in modern garb.”  The PEA as an organization generally avoided the race issue, but several of its members confronted racial discrimination either directly (Counts, Dewey, Mabel Carney, and Buell Gallagher) or in more oblique ways (Kilpatrick).

Goodenow also claimed that “historians of Progressivism have totally ignored” the literature of black Progressives like W. A. Robinson, Doxey Wilkerson, Alain Locke, Charles Johnson, and Horace Mann Bond.[cxli]  Some black Progressives used Progressive rhetoric and methods for consciousness raising and social change.  Others, like Bond, argued that Progressive educational reform was futile unless the institutional structure of segregation and racism was attacked: “Let us confess that the schools have never built a new social order, but have always in all times in all lands been the instruments through which social forces were perpetuated.”  In a racist society ruled by racist “social forces,” Bond argued, all educational reform, whatever the rhetoric, would be structured in favor of whites.  In summary, Goodenow condemned Progressivism in the South as a form of “social control,” while he praised it in its role of offering “opportunity to create a more democratic social conscience among whites and a heightened demand for justice among blacks.”  He also praised black Progressives like Bond who criticized and exposed the paradoxes of Progressivism by “testing its democratic ideology against real conditions of oppression.”     

By 1992 the debate on Progressive education had come full circle, and Mustafa Emirbayer was basically fleshing out and expanding Lawrence Cremin’s original position.  In “Beyond Structuralism and Voluntarism: The Politics and Discourse of Progressive School Reform, 1890 – 1930”[cxlii] Emirbayer started with Cremin’s landmark conception of Progressive education as “the educational phase of American Progressivism writ large,” and re-proposed a monolithic interpretation of this movement.  He seemingly defined educational Progressives in a very general way: “inspired by Dewey’s vision, a wide range of educators, parents, and community leaders came together during the late nineteenth and early twentieth centuries in an impassioned crusade to transform American public schooling.”  With this definition he overlooked or ignored pluralistic arguments that denied a monolithic movement and, despite his claim for an empirical foundation, his sociological and political science framework drove an overly deterministic conception that often resulted in superficial and simplistic analysis.[cxliii]  He also based his conceptual framework on one historical context, Boston, and claimed that “school reform unfolded in not dissimilar ways in many other school systems across the country,” although he did admit that his “generalizations” did not “extend as readily to the South.”  Despite these serious failings, his overall analytical framework is intriguing and is very similar to the overall conclusion that I will be drawing at the end of this essay, so his argument merits a closer look.

Emirbayer put forth a conception of Progressivism as “discursive acts[cxliv] by state-building elites,” and he situated his concept within a critical synthesis of two general trends that he found “inadequate.”  He weighed the strengths and weaknesses of both the school of “structuralist” analysis (Bowles and Gintis, Katz, Nasaw, and Peterson) and the school of “cultural” analysis (Cremin, Kaestle, Tyack and Hansot).  He argued that structural analysis over-determined institutional power at the expense of human actors, failed to account for the historical timing of Progressive reforms, and neglected the importance of cultural factors.  He also argued that cultural analysis tended to “err in the direction of one-sided voluntarism” and ignore “objective constraints on voluntaristic action.”

Emirbayer broke the Progressive education movement down into three contexts: curricular and pedagogical reforms at the local and national levels; local initiatives to reform the political and administrative structure of schools; and the professionalization of teaching and administration, including organization building.  He claimed that “each of these diverse streams of educational Progressivism manifests its own distinctive rhythm and trajectory.  But we can nonetheless group them all together under a common banner because…they all shared a common, unifying discourse, a similar set of concerns expressed in the ideals and images of civic republicanism, Protestant millennialism and liberal individualism.”  Progressives used very influential “cultural discourses” to unite disparate groups into a “broad-based coalition” to achieve the “larger goal” of creating “a new moral basis for American society.”  Emirbayer noted that Progressive education reforms “long outlasted” other reform movements of the Progressive era because of a unique “agenda.”  Progressive education debates represented discursive “struggles” of “oppositional and dominant groups” that battled over different visions and legitimations of the “sacred center” of the “public sphere.”  Both “administrative” and “pedagogical” Progressives were “driven by” a “state-building ideology,” which infused their moral crusade for a corporate welfare state that they envisioned would unite a fragmented urban-industrial republic.  Progressive educators and administrators were working towards a “new moral order” to check the “corruption” and “decay” of older social institutions so as to preserve and consecrate some type of “normative order” at the “sacred center” of American society:

“In their optimistic view, educational reform would help to redeem commonly shared American values and bring ever closer to reality the new ‘democratic’ society that was the true American destiny…As ‘the educational phase of American Progressivism writ large,’ the discourse of the Progressive school reformers embodied both the ‘social control’ dimension so typical of Progressive rhetoric in general, and its more hopeful and millennialist aspiration to a new ‘national community’…school reformers envisioned a generalized Christian spirituality as the basis for an ‘intentionally progressive’ democracy striving toward ever ‘more perfect union.’”

The actualization of the Progressive educational reform was often an “Americanization” program of “socialization” intended for both native and immigrant students.  The socialization process of the curriculum also included differentiation and tracking so as to reinforce class-based structures of the American economy.  The end result of these reforms was “often profoundly undemocratic” and “culturally oppressive.”  Emirbayer gave Progressive educational reformers credit for being successful in “forging a broad-based coalition” around their distinctive “vision,” which far outlasted all other Progressive reform initiatives and helped usher in a measure of “social stability” over the course of the 20th century.

Before we conclude this essay, we will look at two recent articles that have placed Progressive education within an international context and therefore complicate any conceptual usage of the term.  Marjorie Lamberti studied Progressive education in Imperial Germany at the turn of the century in “Radical Schoolteachers and the Origins of the Progressive Education Movement in Germany, 1900-1914.”[cxlv]  Lamberti chronicled the rise of the neue Pädagogik (new pedagogy) and the Arbeitsschule (child-centered school) through the efforts of two predominant strains of Progressive reformers in Germany: radical reformers in Bremen and Hamburg, and more moderate Progressives in Saxony.  Both schools of thought combined a critique of religious instruction in the schools (most wanted it brought more in line with Modernist scholarship rather than eliminated, though some of the radicals wanted it eliminated) with a broader critique of teaching practices that were teacher-centered, fact-oriented, and out of step with the new research in psychology.  These Progressives drew upon German strains of Progressive pedagogy, German culture, and the new research in psychology at German universities, but several influential leaders had also been influenced by John Dewey’s work, especially The School and Society (1899).  The more moderate majority of German Progressives focused on a child-centered, learning-by-doing pedagogy that tailored curriculum and instruction to the developmental and psychological needs of the child, while also increasing the professionalization and autonomy of teachers as child development experts.  Although Progressives represented a minority of German teachers, they had a deep impact on the profession and were able to convince the German Teachers’ Association to adopt the “new pedagogy” during the national congress in May 1912, whereby active learning was added to this organization’s program of reform.  This was seven years before the American Progressive Education Association was even founded.

Jürgen Herbst reviewed the English translation of a German handbook, which centered on the international context of Progressive education.[cxlvi]  The book lacked a clear focus and covered several somewhat successful European Progressive educators and educational movements as well as some less successful attempts in other parts of the globe.  In pondering the international aspect of Progressive education and the editor’s conceptual befuddlement, Herbst rhetorically raised the question of “how far we want to extend the circle that includes activities we might want to classify under progressive education.”  “Are there no viable criteria of inclusion and exclusion?  Does everything fit?”  Herbst analyzed this question by way of a chapter on the development of progressive education in Europe by Jürgen Oelkers.  Herbst summarized that ever since the Reformation “academic institutions were run by governmental authorities in the interest and for the benefit of the state,” and thus European Reformpädagogik had existed alongside the state in a “symbiotic relationship” as a “continuous structure” of counter-pedagogical practice stressing “the individualistic spirit” in “antagonistic” relation with the standardization of nationalism.  This suggests that since the Reformation Progressive education has been a social institution that has vied with nationalists over competing visions of the public sphere contained within the centralized organization of the state.  In light of this conceptualization Herbst suggested that “it may well be time now to ask whether there is such a thing as a theory of progressive education and, if there is, whether we should begin to debate and define it.”

To conclude this discussion of Progressive education, it would be helpful first to restate the conclusions of the last chapter.  It is clear that there were many reformist groups of various political and ideological stripes at the turn of the 20th century, of which Progressivism was but one example.  As a culturally homogeneous and economically secure social class (although uneasy in their security), Progressive reformers had the ability, education, and socio-economic resources to create many diverse voluntary organizations, including educational organizations, which they used to further various social, economic, political, and cultural causes.  Progressives were animated on the whole by a Republican-Populist-Protestant infused ideological orientation that often blended capitalist, scientific, and professional methods, all under a politicized and racialized banner of WASP “Americanism.”

Progressives sought many types of social change and aligned themselves with various other ideological groups to achieve reform coalitions on specific issues and initiatives, but they were primarily concerned with devising a clear and efficient order to harness modernity and industrialization under the tri-partite control of 1) a regulatory State integrated with 2) WASP civic associations and business corporations, and directed by 3) a technocratic elite.  “Americanization” as a nationalistic and cultural identity was the new order the Progressives sought.

The Progressive educational “movement,” to the extent that one can call it a movement outside of the organizational activities of PEA members and their associates, was most explicitly a general educational trend towards a more humane and child-centered pedagogy often couched in the language of socialization and democracy – a general educational trend that was spreading across Europe as well.  But Progressive education in the U.S. was also a cultural movement that sought to define a WASP America in its own ideology[cxlvii] and interests and, thereby, to socialize and acculturate American minorities into the dominant Anglo culture (to the extent that different minority groups were deemed worthy of acculturation in specific geographical contexts).  Many minorities were deliberately excluded from Americanization or were offered inclusion on very demeaning, second-class terms.  However, more liberal and radical strands of the Progressive movement, especially within its educational manifestations, articulated a more inclusive, community oriented, democratic, tolerant, and multicultural dimension to the Americanization program.

Although often couched in paternalistic, class-based, and racist language, these more liberal rhetorics of Americanization offered up democratic ideals that inspired minority populations to challenge the rhetorical Progressive platitudes of freedom, equality, and justice against the tarnished realities of the status quo.  And arguably as minority populations mobilized, minority leaderships organized, and civil demonstrations multiplied, the more liberal Progressives began to modify their conceptions of the WASP Americanization program and replace it with a more inclusive and multicultural conception – so much so that over the course of the 20th century the liberal state’s executive, legislative, and judicial branches would actually articulate and consecrate the civil rights of all Americans for the first time in the nation’s history.  Of course the more liberal Progressive rhetoric and the rising mobilization of minorities was countered and contested by a more conservative majority, and thus ensued over the course of the 20th century and into the 21st century a struggle – a cultural war – not only for the American paideia, but for the very meaning and “sacred center” of America.  The Progressive Americanization movement is an unfinished project that defines the parameters of the 21st century, which as I write is still the outline of a contested battlefield, and education, as always, is at the center of the political struggle to define the cultural conception of a nation.  At the heart of the conflict is a WASP culture that is losing control – losing the ability to exclusively define and delineate the moral order that is supposed to unite a nation.  The roots of this conflict lie at the foundation of the Progressive era.  The early 20th century Progressive movement, to the extent that there was a unified movement, embraced many offensive strategies to protect and preserve their WASP culture: discrimination, segregation, centralization, corporatization, and above all else public and private programs of “Americanization.”

 


Endnotes

[i] John R. Commons, “Progressive Individualism,” American Magazine of Civics, 6 (June 1895), 561-74.  Albion Small, “The Meaning of the Social Movement,” American Journal of Sociology, 3 (Nov. 1897), 340-54.

[ii] Daniel T. Rodgers, “In Search of Progressivism,” Reviews in American History 10 (Dec 1982), 113-132.  Rodgers’ discussion of the origins of the term can be found in footnote 1.

[iii] John D. Buenker, “Rejoinders,” in Progressivism (Cambridge, MA: Schenkman Publishing Company, Inc., 1977), 113.

[iv] For a good, concise historiography of Progressivism up to the 1970s see William G. Anderson, “Progressivism: An Historiographical Essay,” The History Teacher 6 (May, 1973), 427-52.

[v] Richard Hofstadter, The Age of Reform, From Bryan to F.D.R. (New York: Vintage Books, 1955), 5.

[vi] Ibid.  3, 10-14, 23-59, 133.

[vii] A “status revolution” is perhaps Hofstadter’s most contentious argument and it has been widely criticized by later historians of the period.  Buenker, Burnham & Crunden (1977); Link & McCormick (1983); Chambers II (2000).

[viii] Ibid.  5-6, 8-9, 11, 15-17, 19, 21, 135, 149, 152, 163-64, 182, 185-87, 196, 203, 206, 211-12, 216, 288-301.  These pages contain Hofstadter’s major descriptions of Progressivism, Progressives, and the Progressive Movement.  Hofstadter quotes “evangelistic psychology” from Frederic C. Howe’s The Confessions of a Reformer (1925).

[ix] Robert H. Wiebe, The Search For Order, 1877 – 1920 (New York: Hill and Wang, 1967), 112-13, 128-29, 154-56, 161-68, 170, 174, 181, 198-99.

[x] James Weinstein, The Corporate Ideal in the Liberal State: 1900-1918 (Boston: Beacon Press, 1968).

[xi] Weinstein argued that socialism was the “only serious ideological alternative to [the] politics of social responsibility” used by progressive and corporate coalitions, although he criticized the socialist tendency to place faith in the regulatory state without a full understanding of its corporate capitalist backers (117, 132).

[xii] Ibid., ix-xiii, 33, 58, 61, 143, 212, 252.

[xiii] Peter Filene, “An Obituary for ‘The Progressive Movement,’” American Quarterly 22 (1970): 20-34;  John D. Buenker, John C. Burnham, and Robert M. Crunden, “Introduction,” In Progressivism (Cambridge, MA: Schenkman Publishing Company, Inc., 1977), iv-viii.

[xiv] John D. Buenker, John C. Burnham, and Robert M. Crunden, Progressivism (Cambridge, MA: Schenkman Publishing Company, Inc., 1977).

[xv] John C. Burnham, “Essay,” in Progressivism (Cambridge, MA: Schenkman Publishing Company, Inc., 1977), 3-29.

[xvi] Burnham is quoting Clyde Griffen, “The Progressive Ethos,” in The Development of an American Culture, eds. Stanley Coben and Lorman Ratner (Englewood Cliffs, N.J., 1970): 120-149.

[xvii] Burnham argued against claims linking progressivism to welfare statism: “Equating the extension of governmental power for social justice purposes, or what came to be called welfare statism, to the spirit of progressivism is therefore an error.  It is true that many Americans admired German cameralism and socialism.  And many Americans did come to think that the neutral state would have to intervene more actively to maintain traditional liberty and freedom in society and so become a service state.  But to portray the attitudes of progressives toward political activity and power as anything beyond ambivalence is to distort the movement beyond recognition” (15).

[xviii] Robert M. Crunden, “Essay,” in Progressivism (Cambridge, MA: Schenkman Publishing Company, Inc., 1977), 71-103.

[xix] Crunden summarized Erikson’s theory in this way: “Erikson has demonstrated suggestively how crises in childhood and youth can combine especially with religious milieus to produce effective political movements, and to create moral frames of reference in which certain values and reactions seem to be taken for granted.  He has also placed his considerable prestige behind the contention that great leaders articulate and find ways of resolving the important psychological conflicts in the culture of their time” (72).  Crunden draws from Erikson, Childhood and Society (New York, 1950, 1963); Young Man Luther (New York, 1958); Gandhi’s Truth (New York, 1969).  See also Robert M. Crunden, “Freud, Erikson and the Historian: A Bibliographical Survey,” Canadian Review of American Studies vol. 4, no. 1 (Spring, 1973): 48-64.

[xx] Frederic C. Howe, The Confessions of A Reformer (1925; reprint, Chicago, 1967), 12-17.  Crunden, “Essay,” Progressivism, 98-99.

[xxi] Robert M. Crunden, Ministers of Reform: The Progressives’ Achievement in American Civilization, 1889 – 1920 (1982; reprint, Urbana: University of Illinois Press, 1984), ix-x, 39-40, 64-68, 164, 274-277.

[xxii] John D. Buenker, “Essay,” in Progressivism (Cambridge, MA: Schenkman Publishing Company, Inc., 1977), 31-69.

[xxiii] Buenker wrote: “In a larger sense, Americans turned to politics because it was the only forum the nation possessed for ameliorating the conditions wrought by industrialization, immigration, and urbanization and for accommodating the competing demands of various economic, ethnic, and geographic groups…In a highly competitive society there was not a real sense of community to sustain concern for the less fortunate.  For better or worse, only politics provided an arena where conflicting groups could face each other under established ground rules and attempt to resolve their differences.  The political system, alone of America’s institutions, was based upon the existence of pluralism and diversity; it was constructed by compromise and specially designed to provide a means of accommodating conflicting interests” (46-47).

[xxiv] Ibid., 31-40, 43, 56, 59, 63.

[xxv] Daniel T. Rodgers, “In Search of Progressivism,” Reviews in American History 10 (Dec 1982), 113-132.

[xxvi] Rodgers argued that Progressives did not “share a common creed or a string of common values,” but instead shared a “cluster of ideas” and “three distinct social languages.”  These languages were a “rhetoric of antimonopolism,” “an emphasis on social bonds and the social nature of human beings,” and “the language of social efficiency.”  Rodgers said the Progressives were great “users” of ideas as a “set of tools” with which they made “progressive social thought distinct and volatile” as they brought all three of the reformist languages together into a powerful and “dynamic” “constellation” “from which they drew their energies and their sense of social ills, and within which they found their solutions” (122-27).

[xxvii] Arthur S. Link and Richard L. McCormick, Progressivism (Wheeling, IL: Harlan Davidson, Inc., 1983), 1-10, 21-22, 72, 79, 84, 96-104.

[xxviii] Nell Irvin Painter, Standing at Armageddon: The United States, 1877 – 1919 (New York: W. W. Norton & Company, 1987).

[xxix] Ibid., xii-xiii, xix, xxiv, xl, xliii, 279-80.

[xxx] Ibid., xxviii, 8, 136, 258, 268.

[xxxi] Ibid., 70-71, 231-52.

[xxxii] Ibid., Ch 5 & 7.

[xxxiii] John D. Buenker, “Sovereign Individuals and Organic Networks: Political Cultures in Conflict During the Progressive Era,” American Quarterly 40 (Jun, 1988), 187-204.

[xxxiv] Alan Dawley, Struggles for Justice: Social Responsibility and the Liberal State (Cambridge: Belknap Press of Harvard University Press, 1991), 1-13, 30-31, 62, 71-73, 105, 128-38, 163-65, 370, 394.

[xxxv] Chambers noted in his bibliography that he was not able to read Dawley’s Struggles for Justice in researching the first edition of his book.  I would argue that Dawley has presented one of the clearest and most comprehensive treatments of the era and the subject of Progressivism to date.

[xxxvi] John Whiteclay Chambers II, The Tyranny of Change: America in the Progressive Era, 1890 – 1920 (1992; reprint, New Brunswick, NJ: Rutgers University Press, 2000), xi-xiii, 132-47, 150-51, 157, 169-71.  Chambers summed up nicely the “meaning of the Progressive Era:” “In the Progressive Era, large numbers of Americans concluded that the problems accompanying industrialization meant that they could no longer rely solely on Providence or evolution for automatic progress.  They lost their faith in the long-held utilitarian concept of a natural harmony of self-interests and in the functioning of a self-regulating society…With optimism and the sense of power that came from developments in science, technology, and organizational theory, the new interventionists decided that it was necessary to modify the concept of unrestricted individualism and the marketplace.  They thought that intervention and intelligent direction could ensure continued growth and progress that would be consistent with the ideal of an efficient and liberal democratic society…Interventionists created new mechanisms for dealing with the problems caused by blind social forces or powerful, self-interested individuals or groups…interventionists employed organization and intervention as tools for achieving their goals and imposing conscious direction on society…The dominant development of the era was the emergence of an interventionist mood on a national scale.  The need for some kind of purposeful, collective intervention…the organization of economic and social power.  The local, informal group so characteristic of small-town and agrarian society was superseded as the basic framework of American life by immensely larger, hierarchically structured formal organizations…the organizational or bureaucratic revolution….Although people at all levels of society sought to influence the forces affecting their lives, particularly in the immediate environment in which they lived, the poor and the unorganized had little or no influence in the national political system” (275-82).

[xxxvii] William Deverell, “The Varieties of Progressive Experience,” California Progressivism Revisited, ed. William Deverell and Tom Sitton (Berkeley: University of California Press, 1994): 1-11.

[xxxviii] Gary Gerstle, “The Protean Character of American Liberalism,” The American Historical Review 99:4 (Oct 1994): 1043-1073.

[xxxix] Richard L. McCormick, “Public Life in Industrial America, 1877 – 1917” in The New American History, ed. Eric Foner (Philadelphia: Temple University Press, 1997), 107-132.

[xl] In the same volume, Alan Brinkley described the “broad conflict” of the time as the “diverse” responses of various groups that coalesced into a “broad pattern of protest,” whereby, “’localistic’ people were struggling to preserve control of both the economic and the cultural institutions that governed their lives in the face of encroachments from the modern, bureaucratic order” (137-40).  Alan Brinkley, “Prosperity, Depression, and War, 1920 – 1945,” in The New American History, ed. Eric Foner (Philadelphia: Temple University Press, 1997), 133-158.

[xli] McCormick noted that the “concept” of “Progressivism” “still dominates the interpretive literature on the early twentieth-century United States” and that for better or worse the “concept is inescapably embedded in the language of contemporaries and the writings of historians.”  While there were “varied, fervent efforts to solve the problems caused by urbanization and industrialization,” the efforts of Progressives were distinctly powerful and long lasting.  Progressives were largely native born, urban, middle and upper middle class, and rooted in evangelical Protestantism.  They sought to use the social sciences to “eradicate social conflicts” and also to temper the excesses of capitalism (121-22).

[xlii] Robert D. Putnam, Bowling Alone: The Collapse and Revival of American Community (New York: Simon & Schuster, 2000), 367-401.  The Economist noted in 2005 that “voluntary associations have been the secret ingredient of American social dynamism since the country’s foundation…civic associations made Americans better informed, safer, richer and better able to govern themselves and create a just and stable society.”  This publication commented on Putnam’s thesis and argued for new signs of civic participation in the U.S.  “The Glue of Society: Americans are Joining Clubs Again,” in A Survey of America, The Economist, 16 July 2005, 13-17.

[xliii] Michael McGerr, A Fierce Discontent: The Rise and Fall of the Progressive Movement in America, 1870 – 1920 (Oxford: Oxford University Press, 2003).

[xliv] McGerr described the “progressive” ideology as part of the “middle-class alienation from working-class and upper-class culture.”  He wrote, “Progressivism was the way in which these Victorian men and women came to answer the basic questions of human life that have confronted all people in all times and places: What is the nature of the individual?  What is the relationship between the individual and society?  What are the proper roles of men, women, and the family?  What is the place of work and pleasure in human life?”  The answers to these questions “added up to a novel set of guiding values, a new ideology for the middle class: Victorianism gave way to progressivism” – “Rethinking domesticity, rejecting individualism, reconsidering work and pleasure, and redesigning the body” (343 footnote 73, xiv, 42, 64).

[xlv] Ibid., xiii-xvi, 42, 64, 67-68.

[xlvi] Nancy MacLean, Behind the Mask of Chivalry: The Making of the Second Ku Klux Klan (Oxford: Oxford University Press, 1994), 79, 33.  MacLean argued that one “common core goal” of the Klan was “securing the power of the white petite bourgeoisie in the face of challenges stemming from modern industrial capitalism.  The Klan sought to deny political rights to those whom it perceived as threats to that power” (141).  MacLean also made it very clear that “extreme conditions” can very easily lead to a “reactionary politics:” “Under conditions of economic uncertainty, sharply contested social relations, and political impasse, assumptions about class, race, gender, and state power so ordinary as to appear ‘common sense’ to most WASP Americans could be refashioned and harnessed to the building of a virulent reactionary politics able to mobilize millions” (186).

[xlvii] Ibid., 10-11, 52-74.

[xlviii] Ibid., 149-73.  MacLean wrote: “Vigilante Violence was the concentrated expression of that culture, of the brutal determination to maintain inherited hierarchies of race, class, and gender that Klansmen sought to conceal with a mask of chivalry” (173).

[xlix] Ibid., 125-48, 166-67; David Roediger, “Whiteness and Ethnicity in the History of ‘White Ethnics’ in the United States,” in Towards the Abolition of Whiteness: Essays on Race, Politics, and Working Class History (London: Verso, 1994), 189.  See also Nell Irvin Painter, Standing at Armageddon: The United States, 1877 – 1919 (New York: W. W. Norton & Company, 1987): Ch 12.

[l] C. Vann Woodward, The Strange Career of Jim Crow (1955; reprint, Oxford: Oxford University Press, 2002), 90-93; Anonymous Klansmen quoted in MacLean, Behind the Mask of Chivalry, 132-34, 161; Hofstadter, The Age of Reform, 178; John Higham, Strangers in the Land: Patterns of American Nativism, 1860-1925 (1955; reprint, New Brunswick: Rutgers University Press, 1998), 170-71, 173, 175-77; Nell Irvin Painter, Standing at Armageddon: The United States, 1877 – 1919, Ibid; McGerr, A Fierce Discontent, 182-218; Dawley, Struggles for Justice, 105-38, 254-94; McCormick, “Public Life in Industrial America,” 124-26; Brinkley, “Prosperity, Depression, and War,” 139-40.

[li] C. Vann Woodward noted in 1954 the related platforms of “Negrophobia and progressivism” in the South: “The omission of the South from the annals of the progressive movement has been one of the glaring oversights of American historians…The blind spot in the Southern progressive record – as, for that matter, in the national movement – was the Negro, for the whole movement in the South coincided paradoxically with the crest of the wave of racism…the typical progressive reformer rode to power in the South on a disfranchising or white-supremacy movement.” The Strange Career of Jim Crow, Ibid., 90-91. 

[lii] McGerr, A Fierce Discontent, 216-17.

[liii] David R. Roediger, Working Toward Whiteness: How America’s Immigrants Became White (New York: Basic Books, 2005), 70.

[liv] Eric Foner, The Story of American Freedom (New York: W. W. Norton & Co., 1998), 185.  Foner pointed out many criticisms of the “underside of the Progressives’ outlook,” like how “their talk of reconstructing society masked a set of managerial attitudes in which democratic values were ‘subordinated to technique.’”  He also pointed out that because of the Progressives’ homogenized cultural and racial assumptions, they were “ill-prepared to develop a coherent defense of minority rights against majority or governmental tyranny” (176, 78).

[lv] Daniel T. Rodgers, “An Age of Social Politics,” in Rethinking American History in a Global Age, ed. Thomas Bender (Berkeley: University of California Press, 2002), 250-73.

[lvi] Michael Kazin argued, “On the national level, it would be hard to disentangle the history of the Left from the history of American reform.”  He also quoted Will Herberg who wrote, “It would not be too much to say that socialist agitation and propaganda have constituted the single most influential factor in the advance of American social reform.  Untiring socialist criticism of existing conditions have invariably served as the main force in opening the way for reform legislation.”  Michael Kazin, “The Agony and Romance of the American Left,” The American Historical Review, 100 (Dec 1995): 1510; Will Herberg, “American Marxist Political Theory,” Socialism and American Life, 1, Donald Drew Egbert and Stow Persons, eds. (Princeton, N.J., 1952): 521.

[lvii] Foner, The Story of American Freedom, Ibid., 141.

[lviii] Gary Gerstle, “Liberty, Coercion, and the Making of Americans,” The Journal of American History 84:2 (Sept 1997): 530.

[lix] Robert Wiebe, “Framing U.S. History: Democracy, Nationalism, and Socialism,” in Rethinking American History in a Global Age, ed. Thomas Bender (Berkeley: University of California Press, 2002), 236-49.

[lx] Daniel T. Rodgers, “An Age of Social Politics,” in Rethinking American History in a Global Age, ed. Thomas Bender (Berkeley: University of California Press, 2002), 250-73.

[lxi] Michael Kazin argued, “On the national level, it would be hard to disentangle the history of the Left from the history of American reform.”  He also quoted Will Herberg who wrote, “It would not be too much to say that socialist agitation and propaganda have constituted the single most influential factor in the advance of American social reform.  Untiring socialist criticism of existing conditions have invariably served as the main force in opening the way for reform legislation.”  Michael Kazin, “The Agony and Romance of the American Left,” The American Historical Review, 100 (Dec 1995): 1510; Will Herberg, “American Marxist Political Theory,” Socialism and American Life, 1, Donald Drew Egbert and Stow Persons, eds. (Princeton, N.J., 1952): 521.

[lxii] Foner, The Story of American Freedom, Ibid., 141.

[lxiii] Gary Gerstle, “Liberty, Coercion, and the Making of Americans,” The Journal of American History 84:2 (Sept 1997): 530.

[lxiv] Nancy MacLean, Behind the Mask of Chivalry: The Making of the Second Ku Klux Klan (Oxford: Oxford University Press, 1994), 79, 33.  MacLean argued that one “common core goal” of the Klan was “securing the power of the white petite bourgeoisie in the face of challenges stemming from modern industrial capitalism.  The Klan sought to deny political rights to those whom it perceived as threats to that power” (141).  MacLean also made it very clear that “extreme conditions” can very easily lead to a “reactionary politics:” “Under conditions of economic uncertainty, sharply contested social relations, and political impasse, assumptions about class, race, gender, and state power so ordinary as to appear ‘common sense’ to most WASP Americans could be refashioned and harnessed to the building of a virulent reactionary politics able to mobilize millions” (186).

[lxv] Ibid., 10-11, 52-74.  Richard Hofstadter, The Age of Reform, From Bryan to F.D.R. (New York: Vintage Books, 1955).

[lxvi] Ibid., 149-73.  MacLean wrote: “Vigilante Violence was the concentrated expression of that culture, of the brutal determination to maintain inherited hierarchies of race, class, and gender that Klansmen sought to conceal with a mask of chivalry” (173).

[lxvii] Ibid., 125-48, 166-67; David Roediger, “Whiteness and Ethnicity in the History of ‘White Ethnics’ in the United States,” in Towards the Abolition of Whiteness: Essays on Race, Politics, and Working Class History (London: Verso, 1994), 189.  See also Nell Irvin Painter, Standing at Armageddon: The United States, 1877 – 1919 (New York: W. W. Norton & Company, 1987): Ch 12.

[lxviii] C. Vann Woodward, The Strange Career of Jim Crow (1955; reprint, Oxford: Oxford University Press, 2002), 90-93; Anonymous Klansmen quoted in MacLean, Behind the Mask of Chivalry, 132-34, 161; Hofstadter, The Age of Reform, 178; John Higham, Strangers in the Land: Patterns of American Nativism, 1860-1925 (1955; reprint, New Brunswick: Rutgers University Press, 1998), 170-71, 173, 175-77; Nell Irvin Painter, Standing at Armageddon: The United States, 1877 – 1919, Ibid; McGerr, A Fierce Discontent, 182-218; Dawley, Struggles for Justice, 105-38, 254-94; McCormick, “Public Life in Industrial America,” 124-26; Brinkley, “Prosperity, Depression, and War,” 139-40.

[lxix] C. Vann Woodward noted in 1954 the related platforms of “Negrophobia and progressivism” in the South: “The omission of the South from the annals of the progressive movement has been one of the glaring oversights of American historians…The blind spot in the Southern progressive record – as, for that matter, in the national movement – was the Negro, for the whole movement in the South coincided paradoxically with the crest of the wave of racism…the typical progressive reformer rode to power in the South on a disfranchising or white-supremacy movement.” The Strange Career of Jim Crow, Ibid., 90-91. 

[lxx] McGerr, A Fierce Discontent, 216-17.

[lxxi] David R. Roediger, Working Toward Whiteness: How America’s Immigrants Became White (New York: Basic Books, 2005), 70.

[lxxii] Eric Foner, The Story of American Freedom (New York: W. W. Norton & Co., 1998), 185.  Foner pointed out many criticisms of the “underside of the Progressives’ outlook,” like how “their talk of reconstructing society masked a set of managerial attitudes in which democratic values were ‘subordinated to technique.’”  He also pointed out that because of the Progressives’ homogenized cultural and racial assumptions, they were “ill-prepared to develop a coherent defense of minority rights against majority or governmental tyranny” (176, 78).

[lxxiii] Robert Wiebe, “Framing U.S. History: Democracy, Nationalism, and Socialism,” in Rethinking American History in a Global Age, ed. Thomas Bender (Berkeley: University of California Press, 2002), 236-49.

[lxxiv] Nell Irvin Painter, Standing at Armageddon: The United States, 1877 – 1919, Ibid., Ch 5 & 7.  See also: George M. Fredrickson, White Supremacy: A Comparative Study in American and South African History (Oxford: Oxford University Press, 1981).

[lxxv] Nell Irvin Painter, Standing at Armageddon: The United States, 1877 – 1919, Ibid., xxviii, 8, 136, 258, 268.

[lxxvi] Gary Gerstle, American Crucible: Race and Nation in the Twentieth Century (Princeton: Princeton University Press, 2001): 4-9, 43, 46-51, 71.

[lxxvii] Ibid., 53-59, 72, 93-94.

[lxxviii] John Higham, Strangers in the Land: Patterns of American Nativism, 1860-1925 (1955; reprint, New Brunswick: Rutgers University Press, 1998): 196, 200, 204-05.

[lxxix] Ibid., 206-23.

[lxxx] Ibid., 215-16, 222-33.

[lxxxi] Gary Gerstle, “The Protean Character of American Liberalism,” The American Historical Review 99:4 (Oct 1994): 1043-1073.

[lxxxii] Noah Pickus, True Faith and Allegiance: Immigration and American Civic Nationalism (Princeton: Princeton University Press, 2005): 64-65, 71-74; Edward George Hartmann, The Movement to Americanize the Immigrant (1948; reprint, New York: AMS Press, 1967); John Higham, Strangers in the Land: Patterns of American Nativism, 1860-1925 (1955; reprint, New Brunswick: Rutgers University Press, 1998); Robert A. Carlson, “Americanization as an Early Twentieth-Century Adult Education Movement,” History of Education Quarterly 10:4 (Winter 1970): 440-64.

[lxxxiii] Noah Pickus, True Faith and Allegiance: Immigration and American Civic Nationalism, Ibid., 64-65, 73-84 (left-leaning Progressives), 85-123 (right-leaning Progressives); Jonathan Hansen, “True Americanism: Progressive Era Intellectuals and the Problem of Liberal Nationalism,” In Americanism: New Perspectives on the History of an Ideal, Michael Kazin and Joseph A. McCartin, eds. (Chapel Hill: The University of North Carolina Press, 2006): 73-89.

[lxxxiv] Ibid., 90-123, 220 (footnote 59).

[lxxxv] Ibid., 120-23.

[lxxxvi] Lawrence A. Cremin, The Transformation of the School: Progressivism in American Education, 1876 – 1957 (New York: Vintage Books, 1961), 8-14; Lawrence A. Cremin, American Education: The National Experience, 1783 – 1876 (New York: Harper & Row, Publishers, 1980), 133-75; Jonathan Messerli, Horace Mann: A Biography (New York: Alfred A. Knopf, 1972).

[lxxxvii] Cremin, The Transformation of the School, 14-31; Lawrence A. Cremin, American Education: The Metropolitan Experience, 1876 – 1980 (New York: Harper & Row, Publishers, 1988), 158-65.  In 1871 William Torrey Harris wrote to the Board of Directors of the St. Louis Public Schools: “The spirit of American institutions is to be looked for in the public schools to a greater degree than anywhere else…If the rising generation does not grow up with democratic principles, the fault will lie in the system of popular education.”

[lxxxviii] Cremin, The Transformation of the School, 355-58; Herbert M. Kliebard, The Struggle for the American Curriculum, 1893-1958, 3rd ed (1986; reprint, New York: RoutledgeFalmer, 2004): 1-25.

[lxxxix] John Dewey, Experience and Education (New York: Macmillan, 1938); Boyd Bode, Progressive Education at the Crossroads (New York: Newson, 1938).

[xc] There were several early histories of Progressive education that were produced while the movement was still widely influential, but they were written primarily by Progressive educators who had an obvious interest in writing the history of their own cause.  The first was Edward H. Reisner’s “What is Progressive Education?” in Teachers College Record (1933-4) and then Merle Curti’s The Social Ideas of American Educators (1935).  A few years later R. Freeman Butts published The College Charts Its Course (1939).  Robert Holmes Beck wrote the first dissertation on Progressive education at Yale University in 1941, “American Progressive Education, 1875 – 1930.”  The last early history of the movement written by a partisan was Harold Rugg’s Foundations for American Education (1947).  C.A. Bowers reported in 1969 that “most of the sources that deal with Progressive education are books and articles written by professors of education.  Unfortunately, they proved little help in determining how widely their contents were accepted among classroom teachers.”  Bowers stated the “desirability” of a study on “how much influence the theoreticians actually had on the practitioners in the classroom.”  The Progressive Educator and the Depression: The Radical Years (New York: Random House, 1969), x.

[xci] Cremin, The Transformation of the School, 355-58; Larry Cuban, How Teachers Taught: Constancy and Change in American Classrooms, 1890 – 1980 (New York: Longman, 1993), 75; Bowers, The Progressive Educator and the Depression, 11; David Tyack, Robert Lowe, and Elisabeth Hansot, Public Schools in Hard Times: The Great Depression and Recent Years (Cambridge, MA: Harvard University Press, 1984): 152, 158.  Tyack et al. also note: “To the degree that Progressive educators succeeded in retaining old programs or installing new ones, they had to work within severe fiscal constraints in most districts.  And the success of publicized reforms probably obscured the conservatism of the great mass of American public schools.”

[xcii] David Tyack, Robert Lowe, and Elisabeth Hansot have succinctly criticized the Social Reconstructionist agenda: “The reconstructionists challenged the existing order by a powerful alternative vision of America, but their strategy seemed naïve to many radicals, their goal seemed dangerous to many conservatives, and their grasp of educational realities seemed tenuous to many fellow school people.  Socialism was the road not taken.” Public Schools in Hard Times: The Great Depression and Recent Years, 47-48.

[xciii] Peter Novick, That Noble Dream: The “Objectivity Question” and the American Historical Profession, 332, 370.  Novick used the term “counterprogressive” to characterize primarily the change of interpretive framework within the historical community, which was reacting against the Progressive historiography of Charles Beard and Carl Becker.  But he also extended its use to include the reaction against Progressive educationalists like John Dewey: “By the 1950s counterprogressivism extended to the conviction that John Dewey had had a pernicious influence on American education, and that to combat ‘populist’ anti-intellectualism, one had to return to a more traditional curriculum, and restore the authority of academic elites.”  C. A. Bowers noted, “A heavy barrage of criticism was being leveled at Progressive education by an awakened and highly concerned public.  Dissatisfaction with Progressive education had been growing among interested and vocal members of the American public since the early forties, but it was not until 1949 that they began a direct assault on the philosophy and practice of Progressive schools.  The attack was so sweeping that little escaped condemnation.”  The Progressive Educator and the Depression, 242. 

[xciv] Ibid., 347-53; Daniel Tanner, Crusade for Democracy: Progressive Education at the Crossroads (New York: State University of New York Press, 1991); C. A. Bowers, The Progressive Educator and the Depression, 242.  Bowers noted: “The idea that the schools should be used to overcome the problems of racial integration, a high divorce rate, and chronic poverty, as well as to help America beat the Russians to the moon indicates that at least part of the social reconstructionist philosophy of education has become accepted as the ‘conventional wisdom’ of our society” (253).

[xcv] Thomas Woody earned his PhD in the History of Education in 1918 at Teachers College and went on to become an early and prolific writer of educational history.  He wrote many books on the history of education, both European and American.  James Mulhern, “Perspectives,” History of Education Quarterly 1 (June 1961): 1-4.

[xcvi] Peter Novick, That Noble Dream: The “Objectivity Question” and the American Historical Profession (Cambridge: Cambridge University Press, 1988).

[xcvii] Ellen Condliffe Lagemann, “Does History Matter in Education Research? A Brief for the Humanities in an Age of Science,” Harvard Educational Review 75 (Spring, 2005), 9-24. 

[xcviii] The History of Education Society transformed an earlier publication, History of Education Journal, which was founded in 1951 under the editorship of Claude Eggertson, into a more academic organ with the launching of History of Education Quarterly in 1961 under the editorship of Ryland W. Crary at the University of Pittsburgh.

[xcix] For historiographical debate see Novick, That Noble Dream; Robert Harrison, “The ‘new social history’ in America” in Making History: An Introduction to the History and Practices of a Discipline, ed. Peter Lambert and Phillipp Schofield (London: Routledge, 2004): 109-20; Peter Charles Hoffer, “Part I: Facts and Fictions” in Past Imperfect: Facts, Fictions, Fraud – American History from Bancroft and Parkman to Ambrose, Bellesiles, Ellis, and Goodwin (New York: PublicAffairs, 2004): 11-130; Gary B. Nash, Charlotte Crabtree, and Ross E. Dunn, History on Trial: Culture Wars and the Teaching of the Past (1997; reprint, New York: Vintage Books, 2000).  For some specific mention of this debate within educational historiography see Diane Ravitch, The Revisionists Revised: A Critique of the Radical Attack on the Schools (New York: Basic Books, 1978); Jeffrey E. Mirel, “Introduction” in William J. Reese, Power and the Promise of School Reform: Grassroots Movements During the Progressive Era (1986; reprint, New York: Teachers College Press, 2002): ix-xvi; Herbert M. Kliebard, “Afterword: The Search for Meaning in Progressive Education: Curriculum Conflict in the Context of Status Politics” in The Struggle for the American Curriculum, 1893 – 1958, 3rd ed (New York: RoutledgeFalmer, 2004).

[c] C.A. Bowers called Cremin’s book “the most important history of the Progressive education movement, particularly in its early phases.”  The Progressive Educator and the Depression, 259.  Herbert M. Kliebard argued, “Cremin succeeded in establishing history of education as an integral part of cultural and social history, and the writing of history of education has never really been the same since his book appeared.”  The Struggle for the American Curriculum, 272.  On the 30th anniversary of the work, John L. Rury argued that the book’s “appearance did much to make educational history a credible subfield of American history, and one open to new research and interpretation.” “Transformation in Perspective: Lawrence Cremin’s Transformation of the School,” History of Education Quarterly 31 (Spring 1991): 66-76.

[ci] It won the Bancroft Prize in American History in 1962.

[cii] Cremin, The Transformation of the School, viii-x, 88-89.

[ciii] Ibid., 240-45.

[civ] Ibid., 306-8.

[cv] Ibid., 347-51.

[cvi] The second volume, American Education: The National Experience, 1783 – 1876, was awarded the Pulitzer Prize for General Nonfiction in 1981.

[cvii] Lawrence A. Cremin, American Education: The Metropolitan Experience, 1876 – 1980 (New York: Harper & Row, 1988), 10-14, 110, 150, 178, 196, 228, 442-44.  Cremin also used the term “American Victorianism” to describe the Americanization program of “standardizing” culture based on “ethnic, religious, and racial ethnocentrism” so as to “convey its outlook upon the world and thereby enforce its standards and patterns of behavior” (442-44).

[cviii] Cremin quoted Hannah Arendt, “The Crisis in Education,” Partisan Review (Fall 1958): 494-95.

[cix] Lawrence A. Cremin, “Education as Politics” in Popular Education and Its Discontents (New York: Harper & Row, 1990), 85-127.  The three essays in this book were based on lectures given at the Harvard Graduate School of Education in 1989.

[cx] David B. Tyack, The One Best System: A History of American Urban Education (Cambridge, MA: Harvard University Press, 1974), 3-12.

[cxi] Ibid., 19-27.

[cxii] Tyack quoted William T. Harris, St. Louis School Report for 1871, 31-32.

[cxiii] Ibid., 28-43, 60-65, 72-77, 109, 127-131, 146-47.

[cxiv] David Tyack and Elisabeth Hansot, “From Social Movement to Professional Management: An Inquiry into the Changing Character of Leadership in Public Education,” American Journal of Education 88 (May 1980): 291-319.

[cxv] David Tyack and Elisabeth Hansot, Managers of Virtue: Public School Leadership in America, 1820 – 1980 (New York: Basic Books, 1982).  Tyack and Hansot wrote, “Many people (ourselves included) have become newly aware, thanks to the radical analysis, of ideological frameworks and class interests too much taken for granted.”  They mention in particular Michael B. Katz, The Irony of Early School Reform: Educational Innovation in Mid-Nineteenth Century Massachusetts (Cambridge: Harvard University Press, 1968).  In summarizing the “radical historians” Tyack and Hansot wrote: “They have sought to demystify public education, to scatter the fog of sentiment that covered harsh realities.  They have argued that its basic structure was hierarchical and elitist, not democratic; that its operation was class-biased, racist, and sexist; that it was imposed by elites, not created democratically by educational statesmen and their allies; that its ideology was suffused with notions of social control, often covert; that tinkering with minor improvements would not set it right; and that, most important, its claim of being able to right the basic inequities of American life was a legend” (9).

[cxvi] Ibid., 5, 17, 21-22, 73-76.

[cxvii] Ibid., 3-8.

[cxviii] Ibid., 106-111, 206, 226.  David Tyack and Thomas Timar reiterate much of this argument in their brief for the National Commission on Governing America’s Schools, “The Invisible Hand of Ideology: Perspectives from the History of School Governance,” Education Commission of the States (Jan 1999): 1-23.

[cxix] David Tyack, Robert Lowe, and Elisabeth Hansot, Public Schools in Hard Times: The Great Depression and Recent Years (Cambridge, MA: Harvard University Press, 1984): 56-57, 91, 150, 162-63, 180, 189-90.

[cxx] C. A. Bowers, The Progressive Educator and the Depression: The Radical Years (New York: Random House, 1969): ix-x, 4-5, 15, 20, 41.  Bowers noted that the editors of The Social Frontier did not agree with Roosevelt’s New Deal plan to implant a welfare state within a capitalistic society.  Their plan was to organize teachers and then participate with the labor movement in larger unionizing efforts, while also giving students in the classroom a “labor orientation” towards the issues of the day.  They even warned their readership that there might be violence, in which case teachers should feel justified that the “onus will fall on the shoulders of those few who cannot gracefully surrender their privileges in the face of a popular decision” (134, 140).

[cxxi] Ibid., 48-51, 144, 151, 181, 201-54.

[cxxii] Joel Spring, “Education and Progressivism,” History of Education Quarterly 10 (Spring 1970): 53-71.

[cxxiii] Herbert M. Kliebard, The Struggle for the American Curriculum, 1893 – 1958, 3rd ed. (1986; reprint, New York: RoutledgeFalmer, 2004), xiv, xviii-xix, 1-52.

[cxxiv] Herbert M. Kliebard, “Afterword: The Search for Meaning in Progressive Education: Curriculum Conflict in the Context of Status Politics,” in The Struggle for the American Curriculum, 1893 – 1958, 3rd ed. (1986; reprint, New York: RoutledgeFalmer, 2004), 271-92.  Kliebard emphasized his point that the curriculum is a “battleground:” “Whatever else the curriculum may be in terms of what actually gets taught to children, it is also the arena where ideological armies clash over the status of deeply held convictions…The question of whose cultural and moral values will emerge as dominant…the curriculum in any time and place becomes the site of a battleground where the fight is over whose values and beliefs will achieve the legitimation and the respect that acceptance into the national discourse provides.”

[cxxv] Paul E. Peterson, The Politics of School Reform, 1870-1940 (Chicago: The University of Chicago Press, 1985), 4, 15, 22-23, 207.  Peterson nicely summarized the complicated notion of reform in this complex environment: “Reformers’ policies were as often rejected as approved.  When adopted, they were frequently amended; when promulgated, they were not always implemented” (203).

[cxxvi] Peterson conceptually included race and ethnicity under the heading of social “status.”  He explained that although “school politics had become increasingly marked by class conflict in the first decades of this century, questions of race and ethnicity did not instantly disappear.  Especially in the South, race relations remained so significant a concern that class issues were never vigorously articulated:” “ethnic conflicts could interrupt a politics of class” (18-19).

[cxxvii] Ibid., 6, 8-9, 12, 21-23.

[cxxviii] Ibid., 6, 53-71, 73-75, 92.  Peterson argued that “rather than a long-term pattern of favoritism, we see early discrimination giving way to increasing acceptance of the larger immigrant groups” (91).

[cxxix] Ibid., 95-117.

[cxxx] William J. Reese, “’Partisans of the Proletariat’: The Socialist Working Class and the Milwaukee Schools, 1890 – 1920,” History of Education Quarterly, 21 (Spring 1981): 3-50.

[cxxxi] He went on to write: “But what is missing even in recent historiography is an appreciation of the radical politics and third-party movements which periodically swept many cities in the early 1900s; a recognition of how people from many different social classes and ethnic backgrounds once struggled collectively, if for different reasons and with sometimes contrary results, for reforms easily dismissed by some historians today as examples of ‘social control’; and a sense of how immigrants and the urban poor themselves shaped the social life of the school and the contours of the past” (5).

[cxxxii] Reese argued, “the ‘working class’ has never been a single, monolithic, or static entity.  Since America was populated by individuals with diverse ethnic, religious, and racial backgrounds, several working class populations have always existed simultaneously.  It is therefore impossible for a historian to identify a single ‘working class’ influence on education, for none has ever existed…In Milwaukee, the Socialist working class grew by accretion, increased its ideological sophistication over time, and represented diverse, shifting elements of laboring people” (6).

[cxxxiii] Kenneth Teitelbaum and William J. Reese, “American Socialist Pedagogy and Experimentation in the Progressive Era: The Socialist Sunday School,” History of Education Quarterly 23 (Winter 1983): 429-454.

[cxxxiv] William J. Reese, Power and the Promise of School Reform (1986; reprint, New York: Teachers College Press, 2002), 1-2, 123, 130, 213-14.

[cxxxv] Reese characterized these grassroots reformers as a “multidimensional political movement:” “A variety of motivation, perceptions, personalities, and interests converged in the making of grassroots Progressivism … Grassroots Progressivism, therefore, had its middle-class and feminine as well as working-class and Socialist roots, growing together in the 1890s like entangling vines that crossed but did not always join.  The Social Gospel and Progressive religion added the final stimulus to the growth of municipal reform” (123, 70).

[cxxxvi] Ibid., 9, 70, 80-81, 118, 121-23, 133, 214, 222-226.

[cxxxvii] Horace Mann Bond, “Education in the South,” Journal of Educational Sociology 12 (Jan 1939): 264-74.

[cxxxviii] James D. Anderson, The Education of Blacks in the South, 1860 – 1935 (Chapel Hill, NC: The University of North Carolina Press, 1988), 1-2, 279.

[cxxxix] Ibid., 13, 15, 20-21, 67, 92, 285.

[cxl] Ronald K. Goodenow, “The Progressive Educator, Race and Ethnicity in the Depression Years: An Overview,” History of Education Quarterly 15 (Winter 1975): 365-94; “The Progressive Educator on Race, Ethnicity, Creativity, and Planning: Harold Rugg in the 1930s,” Review Journal of Philosophy and Social Science 1 (Winter 1977): 105-28; “The Progressive Educator as Radical or Conservative: George S. Counts and Race,” History of Education Quarterly 17 (Winter 1977): 45-57; “Racial and Ethnic Tolerance in John Dewey’s Educational and Social Thought: The Depression Years,” Educational Theory 26 (Winter 1977): 48-64; “The Paradox in Progressive Educational Reform: The South and the Education of Blacks in the Depression Years,” Phylon 39 (March 1978): 49-65; “The Southern Progressive Educator on Race and Pluralism: The Case of William Heard Kilpatrick,” History of Education Quarterly 21 (Summer 1981): 147-70.

[cxli] Historians of the Progressive era and Progressive education began to take more concerted note of ethnic minorities by the 1970s.  David Tyack for one has devoted much space to ethnic minorities, including blacks, within many of his educational histories.  Ronald E. Butchart has traced the rich historiography of African American education, and expertly categorized and analyzed the subject up until the late 1980s.  Ronald E. Butchart, “‘Outthinking and Outflanking the Owners of the World’: A Historiography of the African American Struggle for Education,” History of Education Quarterly 28 (Autumn 1988): 333-66.

[cxlii] Mustafa Emirbayer, “Beyond Structuralism and Voluntarism: The Politics and Discourse of Progressive School Reform, 1890 – 1930,” Theory and Society 21 (Oct 1992): 621-64.

[cxliii] Emirbayer often made statements or used the pronoun “they” to refer to “Progressives” and then made generalizations that are highly suspect, given that not all “Progressives” would have agreed with or argued for a particular position.  For instance, he claimed “they proposed the reorganization of classroom instruction so that it would promote each student’s capacities for social interaction and creative problem-solving” (625).  For a more complicated conception of “Progressive” education see Kliebard, The Struggle for the American Curriculum.  Since Emirbayer claimed that “educational research has neglected the microscopic domain of curriculum and pedagogy,” it is curious that he did not find, read, or reference Kliebard’s groundbreaking book.  Even the conservative Diane Ravitch referenced Kliebard in her summary book on the subject, Left Back: A Century of Battles Over School Reform (New York: Touchstone, 2000), 33, 54, 529.  The omission of Kliebard is also troubling given the close similarity between Emirbayer’s “struggle” thesis and Kliebard’s conception of curricular “struggle.”

[cxliv] Emirbayer argued that Progressive “discourse” was a “major element behind the transformation of public school systems and of moral and civic education:” “to formulate precisely such a discourse, to refashion old symbols, images, and ideals into a new agenda for redeeming the unfulfilled promise of American education.”  See also Daniel T. Rodgers, “In Search of Progressivism.”

[cxlv] Marjorie Lamberti, “Radical Schoolteachers and the Origins of the Progressive Education Movement in Germany, 1900 – 1914,” History of Education Quarterly 40 (Spring 2000): 22-48.

[cxlvi] Jurgen Herbst, review of Progressive Education Across the Continents: A Handbook, ed. Hermann Rohrs and Volker Lenhart, History of Education Quarterly 37 (Spring 1997): 45-59.

[cxlvii] Carl F. Kaestle, “Ideology and American Educational History,” History of Education Quarterly 22 (Summer 1982): 123-37.  Kaestle defined the progressive ideology as a “moral culture based on Anglo-American Protestantism, republicanism, and capitalism” that asserted “centralist, assimilationist, and moralistic” values and “cultural preferences.”  He called progressive reformers “hegemonic” because “they were didactic and ethnocentric” and tried to “promote publicly” their cultural value system through public education (128, 130).

What Was Americanization?

An Historiography of a Concept, Social Movement, and Practice

originally written 2011

While the use of the term “Americanization” has increased in academia, scholarly study of the Americanization movement (1910 – 1920) and of specific Americanization practices has been largely neglected, especially by historians.  There was a flood of writing on the subject during the first couple decades of the 20th century as it became one of the dominant political discourses of the time.  Industrial, government, and education policy makers rushed to create national, state, and local coordinating bodies operating both within and independently of government agencies.  An outpouring of political, educational, and editorial documents filled the popular, scholarly, and government media of the day.  There were also several scholarly studies and evaluations of various Americanization efforts, conducted from different disciplinary perspectives: educational studies of the development and effectiveness of Americanization programs; political studies of the administrative networks, public policies, and political ramifications of various Americanization initiatives; and sociological studies of Americanization as both a socio-political movement and a socio-cultural phenomenon otherwise labeled “assimilation” or “acculturation.”[i]  One important collection of studies was sponsored by the Carnegie Corporation, which commissioned a 10-volume series called Americanization Studies: The Acculturation of Immigrant Groups Into American Society.  This collection was published from 1920 to 1924 at a cost of some $200,000.[ii]

Historical treatment of the Americanization movement, however, was slow in coming.  There was some initial treatment by Merle E. Curti in The Roots of American Loyalty (1946), and under Curti’s direction his graduate student, Edward G. Hartmann, wrote The Movement to Americanize the Immigrant in 1948.[iii]  Hartmann’s book would become the definitive history of the Americanization movement, and it remains to this day the best of the few historical monographs on the subject.

Hartmann’s rather narrow focus chronicled the rise of private reform and educational policy organizations that were concerned about the assimilation of the immigrant during the first decade and a half of the 20th century.  These reformers were able to influence the Bureau of Naturalization and the Bureau of Education in order to create formal governmental agencies to oversee and coordinate state and local Americanization initiatives, organize Americanization conferences, and also supply posters, pamphlets, and textbooks.  This “social movement” or “crusade” began as a “positive program” of education to meet the “problem” of immigration in the U.S., and it reached its pinnacle in the years 1915-16 as the U.S. geared up for entry into the war.  But a national hysteria concerning foreigners and anti-Americanism swept the country during and after the war up until 1920, which gave Americanization initiatives a more “negative,” fearful, and coercive focus.  It was also during this time, specifically in 1919, that federal funding for Americanization efforts was cut back, which caused the Bureau of Education to discontinue Americanization activities and left the Bureau of Naturalization as the sole federal body in charge of Americanization programs.  The Bureau of Naturalization’s activities were confined primarily to creating and distributing published materials (including textbooks), monitoring local Americanization activities, and working with the public schools to incorporate Americanization programs in the standard national curriculum.  It was also during 1918-21 that Americanization efforts became more professionalized through academic departments of education and sociology, and Americanization was grafted as one plank onto a broader public school initiative to create an adult educational system, which Hartmann argued was perhaps the greatest legacy of the movement.  The ideology of “Americanism” (and its various rhetorical forms) was rarely defined by reformers except in terms of the foreigner leaving behind the old ways in order to adopt a vague American identity.  Hartmann argued that this lack of definition underscored a common cultural identity shared by reformers and their audience, whereby the “mission” of Americanization and the values of Americanism were taken for granted as self-evident norms, and thus Hartmann compared it to other idealistic national “crusades” like abolitionism, women’s suffrage, civil service reform, and the common school movement.[iv]

The next historical treatment of the subject came in a chapter of John Higham’s superb book on U.S. nativism and nationalism, Strangers in the Land (1955).[v]  Higham discussed how the broader currents of xenophobia, nativism, and nationalism during the 1890s coalesced into a rampant and rabid nationalist crusade of “America for Americans” and “100 per cent Americanism” during and after World War I.  Fear of the foreigner gave way to a more ambiguous fear of “disloyalty,” “the gravest sin in the morality of nationalism,” which was any thought that might question the “Absolute and Unqualified Loyalty to Our Country.”  This search for disloyalty focused uncomfortably on “hyphenated Americans” (German-Americans in particular) and their ability to support not only the war effort, but the greater cause of American nationalism.  Infusing the search for disloyalty was a “positive and prescriptive” rhetorical abstraction that did not rise “to the dignity of a systematic doctrine:” “100 per cent Americanism.”  While there was no specific dogmatic or programmatic ritual to prove one’s “Americanism,” there were several assumptions underlying this phrase.  One was a “belligerent” demand for “universal conformity” to the “spirit of nationalism” and “total national loyalty” to the State, which was regulated through “the pressure of collective judgment.”  It was during 1917 that “The American’s Creed” (“I believe in the United States of America as a government of the people, by the people, for the people…”) was introduced as a classroom ritual in public schools to remind children of the object of their loyalty, but more so to rhetorically instill the virtue of “right-thinking,” i.e. the enthusiastic cultivation of obedience and conformity.  100 per cent Americanism, as Higham argued, was primarily a rhetorical affair of “propaganda” and “exhortation,” but with the onset of the war nationalists supported the expansion of state powers and “the punitive and coercive powers” of the state to support, if not mandate, loyalty and conformity.[vi]

The works of Higham and Hartmann are still the definitive historical treatments of Americanization and the Americanization movement, but neither bothered to historically analyze or reconstruct actual Americanization programs at the micro level of local and institutional practice.  There have been few scholarly treatments since Hartmann and Higham that have revisited the Americanization movement, and fewer still that have conducted original analysis of the extensive primary documents at the federal, state, local, or institutional level.[vii]  There have been no books or scholarly monographs on the subject of the Americanization movement since 1948, and Hartmann and Higham remain to this day the most cited references on this subject.  By the 1960s and 70s there was a rise of scholarly activity on numerous subjects related to the concept of Americanization, the Americanization movement, or various types of Americanization practices and programs.  But in order to find this literature researchers must range over many academic disciplines and fields of study, and one finds mostly fragmented and narrow treatments that have little if any connection beyond their disciplinary discourses.

There is one monograph of note from this period that must be mentioned because it has gained a reputation in the literature.  Robert A. Carlson’s The Quest for Conformity: Americanization through Education (1975; revised and expanded in 1987) was a somewhat influential study, and it has been moderately cited by various scholars.[viii]  However, its reputation is somewhat baffling because the book is severely flawed as a work of scholarship.  For one, Carlson’s title is misleading because the book is not really about education.  It is a revisionist history of Americanization as a broader form of cultural indoctrination.  Although the book does discuss education throughout, it is highly generalized, and it does not actually analyze educational processes per se, except to note that cultural indoctrination was conducted through schools and other educational forms.  The whole book suffers from a penchant for overgeneralization (many intricate topics get rushed over in a few paragraphs), and in light of Hartmann’s and Higham’s work, Carlson’s book has nothing really to offer except its theoretical framework.  Carlson’s book is the only work to fully contextualize Americanization within the full scope of American history and, thereby, to argue that “Americanization” has been a central preoccupation of political and cultural leaders.  But his basic argument consists of condemning all Americanizers as agents of “cultural genocide.”  In Carlson’s formulation, Americanization (and seemingly the whole of U.S. educational history) was nothing but a “policy of genocide of non-Caucasians.”  Carlson’s thesis is an overgeneralized and, in light of subsequent scholarship, false structural account of U.S. history positing a singular and monolithic WASP society “Americanizing” non-whites via a one-way process of cultural imperialism.[ix]

Carlson’s work highlights an important problematic within the literature on Americanization.  The very term “Americanization” has always been, and continues to be, substantially ambiguous.  It is akin to other widely used socio-political slogans like “republican,” “progressive,” or “liberal.”[x]  Most historians have treated the term “Americanization” as synonymous with the assimilation or integration of immigrants into mainstream American socio-political culture.  Richard Hofstadter offhandedly linked “naturalization and Americanization” in his work on the progressive era.[xi]  Alan Kraut broadly situated the term “Americanization” as an “ideology of mobility” permeating discussions of both cultural assimilation and specific forms of socialization via the institution of schooling.[xii]  Kraut’s use of “Americanization” signified the gradual and conflict-ridden process (the “cultural tug of war”) of assimilating the immigrant within American society.  Kraut’s broad usage is representative of the majority of scholars in history, literature, and the social sciences.[xiii]  Since the mid-1990s historians and social scientists have also acknowledged Americanization-as-assimilation as a racialized process imbued with white supremacy.  David Roediger has argued, “The process of Americanizing European immigrants acquired a sense of whiteness and of white supremacy,” and thus there was a general conflation of “whiteness with Americanism.”[xiv]

Gary Gerstle’s “Liberty, Coercion, and the Making of Americans” (1997) traced the origins of the Americanization-as-assimilation concept all the way back to the 18th century French-American farmer, J. Hector St John De Crevecoeur.[xv]  Gerstle argued that Crevecoeur’s conception of assimilation in Letters from an American Farmer (1782) was one of the “most influential meditations on what it means to become an American.”  Not only did the Crevecoeurian myth help define the early 20th century ideal of the “melting pot,” but it also influenced the way 20th century sociologists and historians conceptualized theories of assimilation (often using the term “Americanization”), which in turn had an influence on public policy and national debates. 

Crevecoeur’s conception of Americanization pervaded the work of Robert E. Park and the Chicago school sociologists, but this school also criticized one part of the myth: the notion that assimilation was quick and easy.  Their work took place during a general “recoil” of liberal social scientists disturbed by the “reaction and intolerance” of the Americanization drives during World War I and the Red Scare.  Because the term Americanization gained such “a bad, nativist odor” after the war, it was dropped from the vocabulary of many liberal reformers and social scientists.  Many social scientists also began to believe that immigrant cultures were “resistant to assimilation,” which meant “no magical fusing” of cultures via the melting pot was taking place.  The work of Oscar Handlin was a product of this critical environment, and he would serve as an important transitional figure leading to the eventual dismantling of the Crevecoeurian assimilationist theory by the “new” immigrant historians of the 1960s and 70s (Frank Thistlethwaite, Rudolph J. Vecoli, and Herbert Gutman).  However, some scholars have been charged with “resurrecting” parts of the Crevecoeurian myth, such as assimilation’s “emancipatory impulse.”[xvi]

Invoking radical scholars of the 1960s and the new scholarship of David R. Roediger and others, Gerstle criticized neo-Crevecoeurian scholars for not focusing enough on the complexity and constraints (class, gender, race, nation) of the Americanization process, by which “social forces external to the immigrant” play a very significant, if not the most significant, role in the Americanization of immigrants.  Gerstle argued that “structure[s] of power” limited the options of (and often coerced) immigrants during the assimilation/Americanization process.  He reviewed the work of more critical “new” immigrant scholars (like Gutman, Vecoli, Hoerder, Bodnar, Morawska, and Gabaccia) who viewed Americanism as a “cultural strategy” deployed by the wealthy and powerful (employers, natives, ethnic middle-class allies) to “augment” their privileged position.  Americanization was thus a “surrender” or “capitulation” to “a capitalist order,” which could have positive effects for some immigrants (the few who could “make capitalism work for them”), but which had the negative effect for most of “acquiescing” in their own “oppression.”  Gerstle argued: “The elites were intent on becoming Crevecoeurian ‘new men’; the masses wanted to remain who they were.” 

Gerstle also acknowledged the complexity of Americanization because, “as indifferent or hostile to America” as immigrants could be, “a majority of the new immigrants stayed” and many of them went on to acquire not only an “American identity,” but also a “profound patriotic awakening.”  Gerstle criticized the overly optimistic accounts made by Fuchs, Sollors, and Hollinger, who seemed to argue for a theory of personal agency and a fluidity of identity that did not take into account the restrictiveness of structural constraints (especially race).  Gerstle argued, “race, even more than class and gender, still limits the options of those who seek to become American.”  Gerstle clearly believed that “historical circumstances and social structures undermined experiments in the fashioning of identity,” and he looked to newer studies on gender and working-class Americanism (including his own), which have created a “synthesis between agency and structure” and, thereby, demonstrated how “Americanization involves both inventiveness and constraint”: America was not “simply a Crevecoeurian land of possibility,” it was also “a land of constraint.” 

Despite the important social scientific and historical usage of Americanization as a term for assimilation, this terminology has carried hidden costs.  What has been lost is the specific historical context in which the term “Americanization” gained its wide currency.  Part of the difficulty for a contemporary historian researching the Americanization movement and specific Americanization educational practices is the wide ahistorical usage of the term “Americanization” in a diverse array of studies on immigration, assimilation, nationalism, and cultural socialization.[xvii]  In this literature, “Americanization” becomes anything and everything concerned with the social, cultural, and political transformations of individuals and ethnic groups in America.  This ambiguous usage was also revitalized in the late 20th century culture wars and given new currency as either a generalized act of cultural imperialism or an equally generalized act of national solidarity.[xviii]  While the usage of the term in its assimilationist sense is important for studying U.S. nationalism and conflicts over national identity and culture, it has also distracted from, if not distorted, our knowledge of the historical emergence and evolution of the early 20th century Americanization movement and its specific educational, social, legal, and institutional practices.  There is much work to be done on Americanization as a diverse and contradictory “progressive” social movement tied to specific macro historical contexts and micro institutional and individual practices.  The Americanization movement and specific Americanization practices have been largely neglected by social scientists and historians over the last fifty years.

While there has been some very good work done in various areas, almost all of this scholarship is overly narrow, fragmented, and alienated from the larger body of diverse and disconnected literature.  The most visible scholarship on the Americanization movement is found within the numerous histories and historiographies of the “Progressive Era” of U.S. history.  While most historical works on the Progressive era give some treatment of the Americanization movement, the treatment is often brief and general, and sometimes the subject is blurred within a more general discussion (as noted above) of immigration and assimilation.[xix]  There are many good historical treatments of the Americanization movement in various scholarly articles and in parts of historical books; however, these treatments, as noted, are fragmented along disciplinary lines and very partial in their accounts.  For instance, there have been important political treatments of the Americanization movement in relation to law or the federal government.[xx]  There have been several treatments of the Americanization movement in relation to education, usually elementary education, but sometimes in relation to adult education or to educational processes more generally.[xxi]  There have also been historical treatments of immigrants and education that do not directly confront or even mention the Americanization movement.[xxii]  There have been several treatments of the Americanization movement in relation to citizenship or citizenship education.[xxiii]  There have been many treatments of Americanization in relation to economic institutions, like businesses, factories, and labor camps.[xxiv]  Historians have also explored the Americanization movement in relation to institutions and organizations like social settlement houses and the Catholic Church.[xxv]  There have also been many studies of Americanization in relation to gender and minority cultures.[xxvi]  This particular literature has seen the largest growth in the last quarter century, but it also suffers from the most fragmentation, as many of these studies are completely isolated from each other and from the larger national and international (U.S. imperialism) Americanization movement(s).  And finally, both Americanization-as-assimilation and the Americanization movement make many appearances in various histories of immigration and histories of ethnic groups in America.[xxvii] 

What one learns from this vast and fragmented literature is that there has been little attempt to bridge disciplinary lines or topical studies in order to articulate a full and complex understanding of Americanization both as an international, national, state, and local movement and as a concrete historical practice on the institutional, programmatic, and individual level.  Hartmann’s seminal treatment, and many other important works since, have focused specifically on national and state-level activities with very little attempt to integrate the national, state, and local levels together.  The Americanization movement was a highly localized affair and, as Hartmann demonstrated, both government and private Americanization agencies on the national and state levels did their best to coordinate an ungovernable and highly dispersed grassroots movement.  Most historical treatments of the Americanization movement either depict a monolithic WASP society trying to Americanize various ethnic groups or offer a highly detailed and localized history of a specific Americanization program with no mention of larger state and national affairs.

Given the resurgence of immigration as a national issue and the armed intervention and social reconstruction efforts of American troops in the Middle East, there needs to be greater awareness and understanding of the Americanization movement and its legacy.  But in order to articulate the importance of this subject with the aim of directing further studies, the field desperately needs a synthesis that can tie the myriad studies of the national, state, and local levels to the splintered studies of various “Americanized” ethnic groups and the specific programs of socio-political localities.  Before this work can take place, however, there needs to be a systematic search for all the scholarly work done on this historical movement, which is currently buried in various disciplinary and topical niches. 

Once some synthetic historiographical and historical works appear, then it will be important to start filling in the gaps of what we don’t know.  For instance, greater attention needs to be paid to the connections between individual, institutional, local, state, national, and international initiatives in order to more fully explain how the Americanization movement emerged, “moved,” evolved, and transformed educational policy and practice.  New studies would also need to incorporate the work of sociologists, political scientists, and psychologists in order to analyze historical data via new social scientific theories.  Some of this theoretical work could include conceptions of ideology and nationalism, organizational and state theory, and critical race theories.  Social movement theory would be an especially useful tool with which to explain “structures and processes, established and emergent organizations, institutionalized authority, and transgressive contention” as well as the “connections between local or specialized fields and broader societal systems.”[xxviii] 

Another useful theoretical framework is Kevin J. Dougherty’s theory of the relative autonomy of the state, which argues that state officials (including educational administrators and teachers) had their own agendas that were “relatively autonomous” of interest group pressure (business, foundations, professional organizations, and popular coalitions), but were also influenced directly and indirectly by these interest groups through resource dependence and ideology.[xxix]  Using Dougherty’s theory, a historian could demonstrate how national and state organizations propagated potent nationalist and cultural ideologies of Americanism and offered financial support for local Americanization initiatives, while at the same time demonstrating the “relatively autonomous” decisions and programs actually conducted by educational administrators, teacher training programs, and individual teachers in specific programs. 

There also needs to be much more detailed study of localized contexts of Americanization as an educational, and not just a political or cultural, endeavor.  This means a more systematic study of the educational processes that specific educators used in attempting to Americanize specific ethnic groups in specific localities.  It means a more detailed and focused look at teachers, teaching methods, curriculum, curriculum designers, educational materials and contexts, and funding.  It also means looking into the institutional and organizational histories of normal schools and teacher training programs in order to see how teachers were prepared to become Americanizers.  Educational and curricular purposes also need to be explored in relation to theories of ideology and, more particularly, theories of nationalism as an ideology, cultural system, and site of cultural conflict.[xxx]

There has also been almost no work done on the larger effects or antecedents of the Americanization movement.  Very few studies link the continental Americanization movement to U.S. imperialism.  Robert A. Carlson’s The Quest for Conformity touched on this connection, and several disconnected articles focusing on single colonized groups have discussed the issue of cultural and martial imperialism via Americanization in U.S. colonies like Puerto Rico, Hawaii, Guam, and the Philippines, and in the continental Americanization of Native Americans and African Americans.  An important research question, which has not been fully addressed by any historian, is the antecedent relationship between the 20th century Americanization movement and the 19th century Americanization efforts forced on Hawaiians, Filipinos, Native Americans, and freed black slaves.[xxxi]  Another important focus that has been almost completely ignored is the effect of the Americanization movement on the development of civic education in the public schools, and on the development of adult education in the newly formed community colleges.[xxxii]  While there has been some treatment of the Americanization movement in the public schools by historians such as David Tyack and others (see footnotes 16 and 17), to my knowledge there have been no historical studies linking the Americanization movement to the widespread emergence of adult education or the origins of the community college.[xxxiii]

Understanding the emergence of American national identity, the expansion of the liberal state, the institutionalization of American identity in Americanization programs, and the growing geo-political power of the United States are all interconnected issues for 20th century historians and 21st century policy makers.  David A. Hollinger used the older warning of David M. Potter as a point of departure to reevaluate the historian’s usage of the nationalist paradigm.[xxxiv]  Hollinger warned, “Nations can easily turn historians into tools,” but he added, “Nations are not the only formations that threaten to turn historians into tools.  Nonnational and antinational movements and solidarities can do the same.”  Historians are always negotiating ideological, cultural, personal, or conceptual allegiances, and they “select and deselect with every sentence.”  Besides, Hollinger noted, “there is still substantial room for a national narrative that speaks to the American public, and that even has among its several purposes the critical maintenance of the United States considered as a political solidarity.”  Hollinger stated quite directly: “to study the nation is not necessarily to be an ideological nationalist.”  And he explained how there are many opportunities to describe power, inequality, and human agency within national narratives, as well as room to deconstruct and historicize national identities: “How has the United States drawn and redrawn its social borders to accommodate, repel, or subjugate this or that group, in defiance of its egalitarian and individualistic self-image?”  The study of Americanization, the early 20th century Americanization movement, and the multiple practices of Americanization programs by institutions and individuals comprise a complex ecology whereby issues of nationalism, internationalism, citizenship, patriotism, education, social control, exclusion, inequality, and social justice all come to the fore.  This Gordian knot waits for future historians not to cut, but to unwind, in order to trace the complicated contradictions of American nationalism and progressive politics which haunt this country still.



Endnotes

[i] Herbert A. Miller, The School and the Immigrant (Survey Committee of the Cleveland Foundation, 1916); Howard C. Hill, “The Americanization Movement,” American Journal of Sociology XXIV (May 1919): 609-627; Isaac B. Berkson, Theories of Americanization: A Critical Study with Special Reference to the Jewish Group (New York: Teachers College Press, 1920); Frank V. Thompson, Schooling of the Immigrant, Carnegie Corporation Americanization Studies (New York: Harper & Brothers, 1920).

[ii] This series was reissued in toto under the editor William S. Bernard: Publication No. 125, Patterson Smith Reprint Series in Criminology, Law Enforcement, and Social Problems (Montclair: Patterson Smith, 1971).  For a review and short history of this series see: Milton M. Gordon, “The American Immigrant Revisited,” Social Forces 54:2 (Dec 1975): 470-74.

[iii] Edward George Hartmann, The Movement to Americanize the Immigrant (1948; reprint, New York: AMS Press, 1967).

[iv] Hartmann, ibid., 236, 252-53, 261-70.

[v] John Higham, Strangers in the Land: Patterns of American Nativism, 1860-1925 (1955; reprint, New Brunswick: Rutgers University Press, 1998).

[vi] Higham, ibid., 196, 200, 204-05.

[vii] See footnote 19.

[viii] Robert A. Carlson, The Quest for Conformity: Americanization through Education (New York: John Wiley and Sons, 1975).  Carlson expanded this book somewhat into a more general treatment of Americanization as a whole: The Americanization Syndrome: A Quest for Conformity (New York: St. Martin’s Press, 1987).  However, the latter book is essentially the same work, with the same basic portrait and the same basic primary and secondary source material.  See John W. Briggs, review of The Quest for Conformity, by Robert A. Carlson, History of Education Quarterly 28:4 (Winter 1988): 689-91.

[ix] Carlson, ibid., 12, 15, 93, 141.

[x] For discussions of the term “republicanism” see: Linda K. Kerber, “The Republican Ideology of the Revolutionary Generation,” American Quarterly 37 (Autumn 1985): 474-495; Joyce Appleby, “Republicanism and Ideology,” American Quarterly 37 (Autumn 1985): 461-473; Daniel T. Rodgers, “Republicanism: The Career of a Concept,” The Journal of American History 79 (June 1992): 11-38.  For a discussion of the term “progressivism” see: John D. Buenker, John C. Burnham, and Robert M. Crunden, Progressivism (Cambridge, MA: Schenkman Publishing Company, Inc., 1977); Daniel T. Rodgers, “In Search of Progressivism,” Reviews in American History 10 (Dec 1982), 113-132; Arthur S. Link and Richard L. McCormick, Progressivism (Wheeling, IL: Harlan Davidson, Inc., 1983); James T. Kloppenberg, Uncertain Victory: Social Democracy and Progressivism in European and American Thought, 1870 – 1920 (Oxford: Oxford University Press, 1986); John Whiteclay Chambers II, The Tyranny of Change: America in the Progressive Era, 1890 – 1920 (1992; reprint, New Brunswick, NJ: Rutgers University Press, 2000); Alan Dawley, Struggles for Justice: Social Responsibility and the Liberal State (Cambridge: Belknap Press of Harvard University Press, 1991).  For a discussion of the term “liberalism” see: Gary Gerstle, “The Protean Character of American Liberalism,” The American Historical Review 99:4 (Oct 1994): 1043-1073; James T. Kloppenberg, Uncertain Victory: Social Democracy and Progressivism in European and American Thought, 1870 – 1920 (Oxford: Oxford University Press, 1986); James T. Kloppenberg, The Virtues of Liberalism (Oxford: Oxford University Press, 1998).

[xi] Richard Hofstadter, The Age of Reform: From Bryan to F.D.R. (New York: Vintage Books, 1955), 181.

[xii] Historians have also focused on other cultural media that socialized immigrants.  Jackson Lears described how “ethnocentrism reinforced professionalism” in the advertising business and how advertisements, as a medium of “manipulation” and “control,” “showed recent immigrants how to assimilate to ‘American’ ways.”  Fables of Abundance: A Cultural History of Advertising in America (New York: Basic Books, 1994), 205, 253.  Eric Foner noted, “The department store, dance hall, and motion picture theater were as much agents of Americanization as the school and workplace.”  Eric Foner, The Story of American Freedom (New York: W. W. Norton & Co., 1998), 191.  See also Rob Kroes, “American Empire and Cultural Imperialism: A View from the Receiving End,” Rethinking American History in a Global Age, Thomas Bender, ed. (Berkeley: University of California Press, 2002): 295-313.

Lawrence Cremin also focused on several educative media during the early 20th century.  “Media of Popular Communication,” American Education: The Metropolitan Experience, 1876-1980 (New York: Harper & Row, 1988), 322-72.

[xiii] Alan M. Kraut, The Huddled Masses: The Immigrant in American Society, 1880 – 1921, 2nd ed. (1982; reprint, Wheeling: Harlan Davidson, Inc., 2001), 120, 125, 128-29, 155.

[xiv] David R. Roediger, Towards the Abolition of Whiteness (1994; reprint, London: Verso, 2000), 187-90; David R. Roediger, Working Toward Whiteness: How America’s Immigrants Became White (New York: Basic Books, 2005), 84-85, 91, 143.

[xv] Gary Gerstle, “Liberty, Coercion, and the Making of Americans,” The Journal of American History 84:2 (Sept 1997): 524-58; J. Hector St John De Crevecoeur, Letters from an American Farmer (1782; reprint, Oxford: Oxford University Press).

[xvi] Gerstle specifically indicts Lawrence H. Fuchs and Werner Sollors, but Donna R. Gabaccia placed Gerstle in the same camp.  Gabaccia argued that Gerstle “remains as much a neo-Crevecoeurian as the scholars he criticizes” because Gerstle approached Crevecoeurian assimilationist theory too much on its “own terms.”  For instance, Gabaccia points to Gerstle’s focus on only Europeans (ignoring other minorities like Blacks, Native Americans, and Latinos), and his focus on the nation state (ignoring transnational and diaspora elements).  Gabaccia argued that Gerstle should have extended his critique of assimilation, coercion, structural constraints, and the power of the nation state to include a “critique of national historiography” and “the nation itself.”  Donna R. Gabaccia, “Liberty, Coercion, and the Making of Immigration Historians,” The Journal of American History 84:2 (Sept 1997): 570-75.

[xvii] Milton M. Gordon, Assimilation in American Life: The Role of Race, Religion, and National Origins (New York: Oxford University Press, 1964); Nathan Glazer and Daniel Patrick Moynihan, Beyond the Melting Pot: The Negroes, Puerto Ricans, Jews, Italians, and Irish of New York, 2nd ed. (1963; reprint, Cambridge: MIT Press, 1974); Nathan Glazer, “Is Assimilation Dead?” Annals of the American Academy of Political and Social Science 530 (Nov 1993): 122-36; Russell A. Kazal, “Revisiting Assimilation: The Rise, Fall, and Reappraisal of a Concept in American Ethnic History,” The American Historical Review 100:2 (Apr 1995): 437-71; Dennis J. Downey, “From Americanization to Multiculturalism: Political Symbols and Struggles for Cultural Diversity in Twentieth-Century American Race Relations,” Sociological Perspectives 42:2 (Summer 1999): 249-78; For a good sample of this literature see: George E. Pozzetta, ed.  Assimilation, Acculturation, and Social Mobility (New York: Garland Publishing, 1991).

[xviii] For liberal condemnation of the term see: Michael Walzer, “What Does it Mean to Be an ‘American?’” Social Research (1990), reprinted in Michael Walzer, What It Means to Be an American: Essays on the American Experience (New York: Marsilio, 1996).  For conservative use of the term see: Arthur M. Schlesinger, Jr., The Disuniting of America: Reflections on a Multicultural Society, revised ed. (1991; revised, New York: W. W. Norton, 1998); E. D. Hirsch Jr., “Americanization and the Schools,” The Clearing House 72:3 (Jan/Feb, 1999): 136-39; Samuel P. Huntington, Who Are We? The Challenges to America’s National Identity (New York: Simon & Schuster, 2004).

[xix] There is only one book-length work on the Progressive era that gives substantial treatment to the Americanization movement: John F. McClymer, War and Welfare: Social Engineering in America, 1890-1925 (Westport, CT: Greenwood, 1980).  McClymer has also published an important article, “The Federal Government and the Americanization Movement, 1915-1924,” Prologue: The Journal of the National Archives 10 (Spring 1978): 23-41.  This article was republished along with several other important articles on the Americanization movement in an anthology edited by George E. Pozzetta called Americanization, Social Control, and Philanthropy (New York: Garland Publishing, 1991).  Gary Gerstle is one of the few historians of 20th century America who has given the Americanization movement extended treatment in several works, especially in American Crucible: Race and Nation in the Twentieth Century (Princeton: Princeton University Press, 2001).  See also: Desmond King, Making Americans: Immigration, Race, and the Origins of the Diverse Democracy (Cambridge: Harvard University Press, 2000).  There is a problem, however, with all these fine treatments of the Americanization movement.  All of these works rely primarily on the federal archives of the departments of Immigration and Naturalization (Record Group 85, National Archives), Labor (Record Group 174, National Archives), and Education (Record Group 12, National Archives).  Thus, the historical treatment of the Americanization movement in these works focuses mostly on the federal level with little or no treatment of state, local, or institutional/organizational levels (although there is often mention of national institutions like the public school system or national organizations like the Daughters of the American Revolution).  For a brief discussion of the usage of these federal archives see: Noah Pickus, True Faith and Allegiance: Immigration and American Civic Nationalism (Princeton: Princeton University Press, 2005): 206-07, footnote 36.  One should also note that most of the literature on the Americanization movement focuses on persons, organizations, and events from the East Coast or Midwest, but there were active Americanization campaigns in the West and Southwest, especially California.  Noah Pickus argued incorrectly that “Americanization was primarily an eastern and midwestern phenomenon,” which “largely ignored other [non-European] immigrants, such as those from China or Mexico” (Ibid., 206, footnote 35).  There is a large but fragmented body of research on the Americanization of peoples from Mexico, Japan, Guam, the Philippines, Puerto Rico, and Hawaii, as well as Native Americans.  Often this literature links (although rather generally) the Americanization movement to U.S. imperialism.  See footnote 26 in this essay.

[xx] Kenneth B. O’Brien Jr., “Education, Americanization and the Supreme Court: The 1920’s,” American Quarterly 13:2 (Summer, 1961): 161-171; John F. McClymer, “The Federal Government and the Americanization Movement, 1915-1924,” Prologue: The Journal of the National Archives 10 (Spring 1978): 23-41.

[xxi] Lawrence A. Cremin, The Transformation of the School: Progressivism in American Education, 1876 – 1957 (New York: Vintage Books, 1961); Robert A. Carlson, “Americanization as an Early Twentieth-Century Adult Education Movement,” History of Education Quarterly 10:4 (Winter 1970): 440-64; David Tyack, The One Best System: A History of American Urban Education, Part 4.4, “Americanization: Match and Mismatch” (Cambridge: Harvard University Press, 1974); Robert A. Carlson, The Quest for Conformity: Americanization through Education (New York: John Wiley and Sons, 1975); John F. McClymer, “The Americanization Movement and the Education of the Foreign-Born Adult, 1914-25,” in American Education and the European Immigrant: 1840-1940, edited by Bernard J. Weiss (Urbana, IL: University of Illinois Press, 1982), paper originally prepared for the 12th annual Duquesne History Forum, Oct. 18-20, 1978; Vincent P. Franklin, “Ethos and Education: The Impact of Educational Activities on Minority Ethnic Identity in the United States,” Review of Research in Education 10 (1983): 3-21; David Tyack, Thomas James, and Aaron Benavot, “Moral Majorities and the School Curriculum: Making Virtue Mandatory, 1880-1930,” Law and the Shaping of Public Education, 1785-1954 (Madison: University of Wisconsin Press, 1987): 154-76; Michael R. Olneck, “Americanization and the Education of Immigrants, 1900-1925: An Analysis of Symbolic Action,” American Journal of Education 97 (Aug 1989): 398-423.

[xxii] Michael R. Olneck and Marvin Lazerson, “The School Achievement of Immigrant Children: 1900-1930,” History of Education Quarterly 14:4 (Winter 1974): 453-82; Raymond A. Mohl, “The International Institutes and Immigrant Education, 1910-40,” in American Education and the European Immigrant: 1840-1940, edited by Bernard J. Weiss (Urbana, IL: University of Illinois Press, 1982), paper originally prepared for the 12th annual Duquesne History Forum, Oct. 18-20, 1978; Nicholas V. Montalto, “The Intercultural Education Movement, 1924-41: The Growth of Tolerance as a Form of Intolerance,” in American Education and the European Immigrant: 1840-1940, edited by Bernard J. Weiss (Urbana, IL: University of Illinois Press, 1982), paper originally prepared for the 12th annual Duquesne History Forum, Oct. 18-20, 1978; Nicholas V. Montalto, A History of the Intercultural Education Movement, 1924-1941 (New York: Garland Press, 1982).

[xxiii] Michael Kammen, A Machine That Would Go of Itself: The Constitution in American Culture (New York: Vintage Books, 1987): 235-48; Rogers M. Smith, Civic Ideals: Conflicting Visions of Citizenship in U.S. History (New Haven: Yale University Press, 1997): Ch 12; Jeffrey Mirel, “Civic Education and Changing Definitions of American Identity, 1900 – 1950,” Educational Review 54:4 (2002): 143-152; Noah Pickus, True Faith and Allegiance: Immigration and American Civic Nationalism (Princeton: Princeton University Press, 2005): Ch 4-6.

[xxiv] Gerd Korman, Industrialization, Immigrants, and Americanizers: The View from Milwaukee, 1866 – 1921 (Madison: The State Historical Society of Wisconsin, 1967); Stephen Meyer, “Adapting the Immigrant to the Line: Americanization in the Ford Factory, 1914-1921,” Journal of Social History 14 (1980): 67-82; James R. Barrett, “Americanization from the Bottom Up: Immigration and the Remaking of the Working Class in the United States, 1880 – 1930,” The Journal of American History 79:3 (Dec 1992): 996-1020; Gilbert G. Gonzalez, “Labor and Community: The Camps of Mexican Citrus Pickers in Southern California,” The Western Historical Quarterly 22:3 (Aug 1991): 289-312.

[xxv] Rivka Shpak Lissak, Pluralism and Progressives: Hull House and the New Immigrants, 1890 – 1919 (Chicago: University of Chicago Press, 1989); Ruth Hutchinson Crocker, Social Work and Social Order: The Settlement Movement in Two Industrial Cities, 1889 – 1930; Philip Gleason, “The Catholic Church in American Public Life in the Twentieth Century,” Logos: A Journal of Catholic Thought and Culture 3:4 (2000): 85-99.

[xxvi] On gender see: Eileen Boris, “Reconstructing the ‘Family’: Women, Progressive Reform, and the Problem of Social Control,” in Gender, Class, Race and Reform in the Progressive Era, Noralee Frankel and Nancy S. Dye, eds. (Lexington, KY: The University Press of Kentucky, 1991): 73-86.  On Latinos see: George Sanchez, “‘Go After the Women:’ Americanization and the Mexican Immigrant Woman, 1915 – 1929,” Stanford Center for Chicano Research, Working Paper Series No. 6 (June 1984): 1-32, revised and reprinted in Unequal Sisters: A Multi-Cultural Reader in U.S. Women’s History, ed. Ellen Carol DuBois and Vicki L. Ruiz (New York: Routledge, 1990); Reinhard R. Doerries, “The Americanizing of the German Immigrant: A Chapter from U.S. Social History,” American Studies 23:1 (1978): 51-59; Mario T. Garcia, “Americanization and the Mexican Immigrant, 1880-1930,” Journal of Ethnic Studies 6:2 (Summer 1978): 19-34; George J. Sanchez, Becoming Mexican American: Ethnicity, Culture and Identity in Chicano Los Angeles, 1900 – 1945 (Oxford: Oxford University Press, 1993): Ch 4 & 5; Guadalupe San Miguel Jr. and Richard R. Valencia, “From the Treaty of Guadalupe Hidalgo to Hopwood: The Educational Plight and Struggle of Mexican Americans in the Southwest,” Harvard Educational Review 68:3 (1998): 353-412.  On Native Americans see: David Wallace Adams, “Fundamental Considerations: The Deep Meaning of Native American Schooling, 1880 – 1900,” Harvard Educational Review 58:1 (Feb 1988): 1-28; Michael C. Coleman, American Indian Children at School, 1850-1930 (Jackson: University Press of Mississippi, 1993); David Wallace Adams, Education for Extinction: American Indians and the Boarding School Experience, 1875-1923 (Lawrence: University Press of Kansas, 1995); Donal F. Lindsey, Indians at Hampton Institute, 1877-1923 (Urbana: University of Illinois Press, 1995).  On Japanese see: David K. Yoo, Growing Up Nisei: Race, Generation, and Culture among Japanese Americans of California, 1924-49 (Urbana: University of Illinois Press, 2000).  On Puerto Ricans see: Pedro Caban, “Subjects and Immigrants During the Progressive Era,” Discourse 23:3 (2001): 24-51.  On Filipinos see: Anne Paulet, “To Change the World: The Use of American Indian Education in the Philippines,” History of Education Quarterly 47:2 (May 2007): 173-202.  On Hawaiians see: Manette K. P. Benham and Ronald H. Heck, Culture and Educational Policy in Hawai’i: The Silencing of Native Voices (Mahwah, NJ: Lawrence Erlbaum, 1998); C. Kalani Beyer, “The Connection of Samuel Chapman Armstrong as Both Borrower and Architect of Education in Hawai’i,” History of Education Quarterly 47:1 (Feb 2007): 23-48.  On African Americans see: James D. Anderson, The Education of Blacks in the South, 1860-1935 (Chapel Hill: University of North Carolina Press, 1988).

[xxvii] Leonard Dinnerstein and David M. Reimers, Ethnic Americans: A History of Immigration and Assimilation (New York: New York University Press, 1977); Thomas J. Archdeacon, Becoming American: An Ethnic History (New York: The Free Press, 1983); John Bodnar, The Transplanted: A History of Immigrants in Urban America (Bloomington: Indiana University Press, 1985); Leonard Dinnerstein and David M. Reimers, Natives and Strangers: Blacks, Indians, and Immigrants in America (Oxford: Oxford University Press, 1990); Alan M. Kraut, The Huddled Masses: The Immigrant in American Society, 1880 – 1921, 2nd ed. (1982; reprint, Wheeling: Harlan Davidson, Inc., 2001); David R. Roediger, Working Toward Whiteness: How America’s Immigrants Became White (New York: Basic Books, 2005).

[xxviii] Doug McAdam and W. Richard Scott, “Organizations and Movements” in Social Movements and Organization Theory.  Gerald F. Davis, Doug McAdam, W. Richard Scott, and Mayer N. Zald, eds.  (Cambridge: Cambridge University Press, 2005): 38.

[xxix] Kevin J. Dougherty, The Contradictory College: The Conflicting Origins, Impacts, and Futures of the Community College (1994; reprint, Albany, NY: State University of New York Press, 2001): 15-39, 105-6, 183-88, 239-42, 273-86.

[xxx] On ideology see several essays by Clifford Geertz, especially “Ethos, World View, and the Analysis of Sacred Symbols,” “Ideology As a Cultural System,” “The Politics of Meaning,” and “Common Sense as a Cultural System” in The Interpretation of Cultures: Selected Essays by Clifford Geertz (New York: Basic Books, 1973) and Local Knowledge: Further Essays in Interpretive Anthropology (New York: Basic Books, 1983).  See also: John B. Thompson, Studies in the Theory of Ideology (Berkeley: University of California Press, 1984) and Ideology and Modern Culture (Stanford: Stanford University Press, 1990); J. M. Beach, Studies in Ideology: Essays on Culture and Subjectivity (Lanham: University Press of America, 2005): Part I and II.  On social scientific theories and histories of nationalism see: Liah Greenfeld, Nationalism: Five Roads to Modernity (Cambridge: Harvard University Press, 1992); Guido Zernatto, “Nation: The History of a Word,” Review of Politics 6 (1944): 351-66; Max Weber, Wirtschaft und Gesellschaft in From Max Weber: Essays in Sociology, H. H. Gerth and C. Wright Mills, eds. (1946; reprint, Oxford: Oxford University Press, 1958), 171-79; Louis Wirth, “Types of Nationalism,” The American Journal of Sociology 41 (May 1936): 723-37; Hans Kohn, “The Nature of Nationalism,” The American Political Science Review 33 (Dec 1939): 1001-21; Chong-Do Hah and Jeffrey Martin, “Toward a Synthesis of Conflict and Integration Theories of Nationalism,” World Politics 27 (April 1975): 361-86; Isaiah Berlin, “Nationalism: Past Neglect and Present Power,” Against the Current: Essays in the History of Ideas, in The Proper Study of Mankind: An Anthology of Essays, Henry Hardy and Roger Hausheer, eds. (1979; reprint, New York: Farrar, Straus and Giroux, 1997): 581-604; Benedict Anderson, Imagined Communities: Reflections on the Origin and Spread of Nationalism (1983; reprint, London: Verso, 1991); Eric Hobsbawm, Nations and Nationalism since 1780: Programme, Myth, Reality (1990; reprint, Cambridge: Cambridge University Press, 2000); Liah Greenfeld, “The Trouble with Social Science,” Critical Review 17:1-2 (2005): 101-16.

[xxxi] C. Kalani Beyer wrote an interesting article connecting Samuel Chapman Armstrong to both the Americanization of blacks and Hawaiians in “The Connection of Samuel Chapman Armstrong as Both Borrower and Architect of Education in Hawai’i,” History of Education Quarterly 47:1 (Feb 2007): 23-48.  There is also an interesting article that links social studies curriculum and civic training in the public schools to the Americanizing efforts directed at African Americans at the manual-training Hampton Institute in Virginia.  This raises interesting questions about Americanization and social control in relation to oppressed, non-white minorities/non-citizens and the similar and/or differentiated treatment of citizen and/or “white” children in public schools.  Michael Lybarger, “Origins of the Modern Social Studies: 1900 – 1916,” History of Education Quarterly 23:4 (Winter 1983): 455-68.

[xxxii] Emory S. Bogardus, Essentials of Americanization (Los Angeles: University of Southern California Press, 1919); Lawrence A. Cremin, The Transformation of the School: Progressivism in American Education, 1876-1957 (New York: Vintage, 1961): 66-75; Morris Janowitz, The Reconstruction of Patriotism: Education for Civic Consciousness (Chicago: University of Chicago Press, 1983); Amy Gutmann, Democratic Education (Princeton: Princeton University Press, 1987): 104-107; Joel Westheimer, ed., Pledging Allegiance: the Politics of Patriotism in America’s Schools (New York: Teachers College Press, 2007).

[xxxiii] Paul H. Sheats, “Adult Education for Victory and Peace,” Journal of Educational Sociology, special issue on The Foreign Born – Their Citizenship 17:1 (Sep, 1943): 28-35; Caroline A. Whipple, “Adult Education and the Public Schools,” Journal of Educational Sociology 19:1 (Sep, 1945): 20-26.

[xxxiv] David M. Potter, “The Historian’s Use of Nationalism and Vice Versa,” The American Historical Review 67 (July 1962): 924-50; David A. Hollinger, “The Historian’s Use of the United States and Vice Versa,” in Rethinking American History in a Global Age, Thomas Bender, ed. (Berkeley: University of California Press, 2002): 381-95.