A Reckoning in Higher Education

In a recent eventcast, City Journal contributing editor Kay Hymowitz, Wall Street Journal reporter Josh Mitchell, and Ohio University economist Richard Vedder joined Brian Anderson to discuss student debt, degree inflation, and policy solutions to bring down higher-education costs and improve outcomes for all.






Review of “The Aristocracy of Talent” by Adrian Wooldridge


The Aristocracy of Talent: How Meritocracy Made the Modern World, by Adrian Wooldridge (Skyhorse, 504 pp., $24.99)

Olympics fans aside, meritocracy doesn’t have many friends these days. Social-justice advocates view the meritocracy as a swindle, giving white people an excuse to hoard their privilege and leaving minorities only crumbs. On the right, populists look at a recession, forever wars in Iraq and Afghanistan, and a globalized economy that has taken away their jobs and destroyed their towns and regard the designated experts with disgust. Crème-de-la-crème meritocrats such as Harvard professor Michael Sandel and Yale law professor Daniel Markovits decry the smugness, entitlement, and soul-draining rat race promoted by our machinery of higher education, the very system that gives them their own prestige.

The great virtue of The Aristocracy of Talent: How Meritocracy Made the Modern World, by longtime Economist editor and writer Adrian Wooldridge, is that while acknowledging the harsh truths of these critiques, it forces us to ponder the next question only tepidly addressed by others on this beat: If not meritocracy, then what? How should societies allocate status and the power to make the big decisions?

Scour Wooldridge’s expansive history of the conundrum, and you’ll be hard-pressed to find satisfying alternatives. In fact, from his telling, you might conclude that raw evolutionary psychology rather than studied political science or ethics best explains how most societies have operated. Families and clans, not individuals, were “the basic unit of society,” he writes. Sons inherited land and titles, daughters were bartered for more, and both passed their unearned privileges onto their own children regardless of their progeny’s character, intelligence, or interest in doing certain jobs. Likewise, serfs grew their superiors’ food, and servants dressed their masters in silk breeches for no reason other than that they were born to do so. If a peasant was blessed with Einstein’s brains or Lincoln’s political wisdom, it would make no difference in his or her life path; “tillers tilled and thatchers thatched,” as Wooldridge writes. No doubt the hoi polloi grumbled about their masters, but the arrangement was widely accepted as natural and just. It was individual ambition that represented a danger to what was thought to be a God-given social order.

Of course, the powers-that-were had every reason to advance the idea that inherited privilege was divinely ordained—but the truth is, for many centuries favoring kin was about the only game around. Plato was the first to imagine a system that would give power to those more worthy. The guardians of his Republic would be “men of gold”: those with natural talents, pedigree be damned. Notably, he believed the family was the biggest threat to the just polity; kin would always embrace kin. To short-circuit this stubborn fact, he proposed taking future guardians of each generation away from their parents in order to prepare them for leadership through intensive physical, intellectual, and philosophical training. (Plato doesn’t specify how he would locate the prodigies.)

The only other proto-meritocratic social order came from Asia. As early as the tenth century, the Chinese developed their famous exam system that, with many modifications, continues to sort the wheat from the chaff today. A grueling, multiyear undertaking, it allowed farmers’ sons the chance to escape the dead-end bleakness of village life and become “mandarins” in the Forbidden City. Still, the emperor inherited his position.

In other parts of the world, a few exceptional low-born strivers could bypass the ancient barriers of entrenched hierarchy. England had an unofficial system of “sponsored social mobility,” in which a lord or church worthy would take notice of a clever plebeian and mentor him to prominence. Cardinal Wolsey, the son of a wool seller, groomed Thomas Cromwell, who later became Henry VIII’s consigliere and fixer—a stunning rise for the son of a blacksmith. Noblemen lent support to talented artists and thinkers from undistinguished backgrounds. The Duke of Buccleuch was patron to the philosopher Adam Smith; ironically, his protégé was part of the Scottish Enlightenment, which would weaken the logic of inherited nobility that had given the Duke his riches. Wooldridge speculates that the emergence of larger and more complex states also challenged the ancien regime as states found themselves in need of more capable bureaucrats than a pampered, inbred aristocracy could produce. After all, kings and dukes needed shrewd minions to administer and collect taxes to support their palaces and wars.

Understood in the context of this tenacious history, meritocracy was a genuinely radical idea. It took bloody revolutions, religious wars, and centuries of dispute to undo the old system of lineage and shift toward modern thinking about merit. The Protestant Reformation played a role in this process by releasing individual conscience from the all-powerful word of priests and kings. Calvinism moved individual hard work to the center of moral and religious life. Enlightenment philosophers harped on the corruption and mediocrity of titled aristocrats of their day. The philosophes began the nature–nurture debate that continues today: were the talented just born that way—“natural aristocrats”—or were their gifts the result of training and education? In either case, the individual was to be judged and rewarded according to his merit, not family ties.

The American Revolution was a hinge moment in this history. Though the fact has become obscured amid today’s race-based revisionism, the revolution was first and foremost dedicated to overthrowing the dynastic habits of the Old World and its fusty, corrupt lords and dukes. Enlightenment-influenced American revolutionaries and the French who followed soon after believed that ordinary folks of talent and excellence should be recognized and given opportunities to lead. It helped that geography removed settlers in the New World from old lineages. With the important exceptions of African slaves and American Indians, Americans were all immigrants who chose to move to a distant, unknown wilderness; they had reason to be impressed by innovators who would help them adapt to their environment and create new ways of dealing with old problems. Though they never fully surmounted either nepotism or racial bias, they had to be a pragmatic people.

Wooldridge makes the case that meritocratic ideals were crucial to the emergence of the modern world. Without a commitment to recognizing and rewarding individual ambition and talent, there would be no mass education to sort and develop each person’s abilities; no Industrial Revolution; no important innovations in transportation, medicine, or construction; and perhaps only a forme fruste of liberal democracy. The British introduced grammar schools to educate bright working- and middle-class kids; France expanded its baccalaureate to include ambitious country boys and girls; and as it industrialized, the U.S. became more meritocratic. In 1883, Congress passed the Pendleton Act, establishing a Civil Service Commission intended to weed out Tammany Hall–style favoritism and discover competent bureaucrats; England had a similar law.

The meritocracy reached its zenith in the early and mid-twentieth century. One accelerant was the emergence of IQ and other aptitude tests between the two world wars. The tests remain extremely controversial, especially on the left. But Wooldridge notes that until the 1960s it was radicals who were most enthusiastic about the science of intelligence. Socialists such as Beatrice and Sidney Webb viewed it as a way to smash class barriers and expose those hidden diamonds in the rough. Many successful people today who grew up poor and might have missed out on scholarships and elite schooling but for standardized testing would agree with them.

Whatever its potential advantages, the advent of IQ testing also signaled a coarsening of meritocratic ideals. From Plato on, merit had a moral as well as an intellectual dimension. When the founders spoke of talent, it was frequently accompanied by the word “virtue”; the Victorians emphasized the “duty” required of their elites. Public service remained an elite profession well into the twentieth century. But by World War II, testing enthusiasm and an increasingly technocratic, bureaucratic, and globalized economy left the moral dimension homeless and built up the prestige of raw mental dexterity. This technological hubris has given us a society dominated by Silicon Valley titans and Ivy League royalty.

Further demoralizing meritocracy today is the enduring human instinct to put family and clan first. By the turn of the twenty-first century, the rich had turned elite higher education into a gilded ghetto for their children via an extravagant menu of private schooling, tutoring, internships, service trips to Central America, and letters of recommendation from well-connected family friends. (Wooldridge mentions that Christopher Hitchens told him how much he hated having to write letters of recommendation to D.C.’s elite preschools for friends.) Wooldridge argues that the “marriage of merit and money” and the ruling class’s success in rigging the system for their friends and family is at the root of populist revolts like Brexit and the election of Donald Trump.

Given its decadent state, is meritocracy, like the ancien regime in the eighteenth century, heading for the guillotine? It sometimes seems that way. The education establishment is losing confidence in its meritocratic mission. “We reject ideas of natural gifts and talents,” declares the current draft of the California Math Framework. Gifted programs and selective-exam schools are being hunted down like big game. Higher education wobbles between its established purpose of finding and growing young talent and the conflicting goal of advancing social-justice egalitarianism. Wooldridge holds out hope that a “wiser,” “remoralized” meritocracy, cleansed of nepotism and elite hoarding, is still possible. He points to the success of Asian countries like China and Singapore newly committed to their own forms of merit-based hierarchies.

The question for Americans: Which approach is better suited to confront the immense technological, governmental, and global challenges that we face?







Dr. Biden’s Lesson



Among the many eruptions of outrage that distracted us from the dread of the past year was one provoked in December by an 800-word Wall Street Journal op-ed titled “Is There a Doctor in the White House? Not if You Need an M.D.” Written by prolific author and veteran wit Joseph Epstein, it mocked First Lady–to-be Jill Biden for wanting to be called “Dr. Biden.” The honorific should be reserved for the kind of doctor who could save your life when your appendix bursts, he wrote, not for a doctor of education—an Ed.D., as the degree is commonly known.

Epstein touched a cultural nerve. A Biden administration spokesperson described the article as a “disgusting and sexist attack” and demanded an apology and a retraction. The Guardian, late-night host Stephen Colbert, and MSNBC all jumped in to defend the First Lady’s honor. Northwestern University, where Epstein taught for 30 years, still has a message on its website assuring visitors that his “misogynistic views” are not its problem, since Epstein hasn’t been a lecturer there since 2003. Social media added to the chorus: Dr. Biden “worked [her] rear end off for years to earn that,” tweeted Audrey Truschke, an associate professor of South Asian history at Rutgers University. Let her “shout it from the rooftops.”

The outrage was ephemeral but also revealing. Stuck in the stock framework of sexism and unduly reverent of academic title and prestige, Team Dr. was tone-deaf to the cultural and political moment. The controversy unfolded a mere four years after a presidential election that exposed an ominous social and economic chasm between college-educated and less diplomaed Americans. It coincided with a lame-duck president who “love[d] the poorly educated” rallying his base to help him undermine the results of an election that had not gone his way. But instead of minding the polarizing education gap, Dr. Biden and her advocates stood staunchly on the side of a powerful education-industrial complex and the professional-managerial class that it nurtures. In fact, the First Lady’s Ed.D. epitomizes today’s rampant degree inflation and meritocratic jockeying, which—ironically, given the politics of her husband’s administration—weighs most heavily on young adults, especially the least advantaged.

Start with those degree-inflation numbers. Between 1980 and 2017, the share of adults with at least a four-year college degree doubled, from 17 percent to 34 percent. The Great Recession intensified the trend, since people often choose to return to school to burnish their résumé when finding jobs is tough. From 2010 to 2019, the percentage of people 25 and older with a bachelor’s degree or higher increased by 6 percentage points, to 36 percent, where it sits today.

The more surprising part of the story is that the college degree is declining in status: postgraduate degrees are now where the real action is. The coveted B.A. from all but the most elite schools has become a yawn, a Honda Civic in a Tesla world. It’s not just metaphorical to say that a master’s degree is the new bachelor’s degree: about 13 percent of people aged 25 and older have a master’s, about the same proportion that had a bachelor’s in 1960. Master’s mania began to spread through the higher-education world in the later 1990s, but it picked up steam during the Great Recession, even more than the bachelor’s did. From 2000 to 2012, the number of M.A.s granted annually jumped 63 percent; bachelor’s degrees rose only 45 percent. In 2000, higher-ed institutions granted an already-impressive 457,000 master’s degrees; by last year, the number had grown to 839,000. And while the Ph.D. remains a much rarer prize, its numbers have also been setting records. Some 45,000 new doctoral degrees were awarded in 2000, a number that, by last year, had more than doubled, to 98,000.

People find graduate degrees enticing for various reasons. Someone with a graduate degree will be more likely to find stable employment, will get more of a boost into higher-level positions than will B.A.s, will have bragging rights, and, most decisively, will earn fatter paychecks. Even as the growth of the college premium—the difference between expected earnings for a worker with a college degree and one with a high school diploma—shows signs of slowing down, the master’s premium has marched ever upward. According to “The Economic Value of College Majors,” a 2015 study by Georgetown University, college grads with a bachelor’s degree earn an average annual salary of $61,000 over their career. Strivers with a graduate degree can look forward to about $78,000. The earning advantage of college graduates over high school graduates increased only 6 percent from 2000 to 2013, but those with graduate degrees saw a 17 percent increase in their relative earnings over college grads.

The premium varies by field, of course. A master of fine arts (MFA) is a notoriously poor bet, though that doesn’t appear to have scared off many aspiring poets and painters (who tend to have wealthier parents). The number of MFAs has risen every year in the last decade, even though, in a number of states, early-career MFA recipients earn less than people with only an associate’s degree. Still, most fields can advertise a nice return for their students: a master’s in computer systems administration, for instance, will earn its beneficiaries a median annual salary of $88,000; a bachelor’s in that subject earns only $70,000. Education administrators see a 44 percent median boost if they get a master’s; preschool and kindergarten teachers, 43 percent; librarians, 30 percent; and flight engineers, 20 percent. Those who go to elite schools might earn somewhat more.

For individual students, a graduate degree can be a solid bet. For American society as a whole, though, hyper-credentialism has been a slow-motion disaster. The paper chase has put young people and their parents in a demoralizing, self-perpetuating arms race. Early in the twentieth century, high school diplomas were relatively rare and their recipients well compensated: in 1915, high school grads earned 45 percent more than dropouts. But by 1950, 59 percent of Americans were making it to the graduation ceremony. As the supply of high school–educated workers swelled, their wage premium fell to about 20 percent.

When every person has a high school diploma, the shrewd 18-year-old will find some other way to stand out. At first, this meant going to college. Today, however, a college degree is merely the baseline for high-paying jobs, so even college students need to look for ways to distinguish themselves. For a while, the double major served that purpose. These days, it won’t get you very far: the percentage of students at higher-ranked schools majoring in two different subjects is nearing half. In The Case Against Education, George Mason University economist Bryan Caplan likens the dynamic to people in the front row of an audience standing up to get a better view. That leads those behind them to stand, and those behind them, and so on. In the end, no one is better off.

This all-too-human behavior is a major force behind the master’s degree gold rush. With more college grads crowding the high-end labor market, the degree loses its prestige. As the college-wage premium weakens for recent graduates, students clamor for master’s degrees, and with a growing number of people with such degrees knocking on their doors, employers look at applicants with a bachelor’s as second class. A growing number of job listings include a master’s as the “expected or preferred level of entry,” particularly in STEM occupations and health care. In 2016, the Bureau of Labor Statistics predicted that employment in master’s-level occupations would grow almost 17 percent by 2026.

The credential arms race is also a culprit in “elite overproduction.” Coined by Peter Turchin, a professor of ecology and evolutionary biology at the University of Connecticut, the term refers to the excess number of job-searching college graduates compared with the positions in the high-paying managerial, technical, and professional vocations that they’ve been banking on. A lot of disappointed aspiring elites go back to school for postgraduate work; others take jobs that, in a different time, they might have gotten straight out of high school. The New York Fed reports that rates of underemployment among recent college graduates—sometimes called “malemployment,” referring to grads working jobs that don’t require a college degree—have hovered between 40 percent and 50 percent since the 1990s; that’s more than 8 percentage points higher than for older grads. Ohio University economist Richard Vedder finds legions of “surplus elites” working as parking-lot attendants, bartenders, salespeople, and janitors.

Postgraduate degrees are expensive, which usually means student loans. The financial burden falling on graduate students plays a far bigger role in the student debt crisis than people realize. The growth in the numbers of master’s degrees and Ph.D.s has closely paralleled the growth in student debt. Over the past 25 years, net tuition and fees have risen more steeply for master’s students than for undergrads. Though graduate students are only 19 percent of student borrowers, they account for 40 percent of student debt. Graduate students can borrow almost unlimited funds in unsubsidized loans, for which they will be burned by interest rates two times higher than those paid by undergrads. (The government pays interest for subsidized loans while a student is in school.) Twenty-five percent of graduate students borrow almost $100,000; 10 percent of them borrow more than $150,000. Many will have a ball and chain of debt following them for years, even decades, of their adult lives—which, as we’re now witnessing, leads to demands for new government programs.

College graduates may have plenty to dislike about the arms race, but the biggest losers are workers at the bottom of the labor market. The more credentials, the less hope for the child of a single mother who works as a health-care aide or for a father disabled in a mining accident. The more time and money needed to get a mid-skilled job, the more likely that people will give up and continue packing orders at an Amazon warehouse. Degree inflation widens the nation’s class divide.

Research by Julie Posselt and Eric Grodsky shows that, as of 2010, 45 percent of people with a master’s degree or higher came from families in the highest income quartile. Twenty years earlier, that figure was only 30 percent. “Educational inheritance is striking not just at the college level,” they write. “Those from homes in which the more educated parent had a doctorate or professional degree are increasingly overrepresented” among those who get Ph.D.s and professional degrees. “Their share of the top 1 percent of the income distribution is greater still, at 62 percent,” they write. The rich get richer; they also get Ph.D.s, M.D.s, and J.D.s. Meantime, the other two-thirds of Americans get a high school diploma or, at best, an associate’s degree.

Those who do find a way to continue up the degree ladder pay a high price. First, consider the opportunity costs: while they’re sitting in a statistics classroom, they aren’t earning money to take care of younger siblings, grandparents, and perhaps children of their own. The more direct costs, of course, are tuition and living expenses. The prospect of debt deters low-income students from pursuing degrees that could lead them to a lucrative career, according to a 2020 paper, “Inequality and Opportunity in a Perfect Storm of Graduate Student Debt.” Those who do take out loans for a degree start off their careers in the red, which can suppress wealth accumulation. About half of all master’s and Ph.D. students drop out before completing their degrees; the numbers are higher for those in the humanities and for racial minority and lower-income students. Dropping out often means the worst of both worlds: plenty of debt but no degree to show for it.

Worse still, credential inflation is squeezing less educated young adults out of jobs on which they once relied. Many jobs were available to high school grads only a decade or two ago but now demand at least a four-year degree, bringing more despair in working-class communities and more polarization in the country. According to the Wall Street Journal, more than 40 percent of manufacturing workers now have a college degree, up from 22 percent in 1991. Experts predict that college-educated workers will fill more than half of American manufacturing positions within the next three years.

Mid-skilled non-factory jobs are becoming equally inhospitable to someone without the means or interest to spend four more years (at a minimum) in a classroom. The Accreditation Council for Occupational Therapy Education, for example, plans to increase entry-level degree requirements for occupational therapy assistants from an associate’s to a bachelor’s degree and to raise the requirement for full-fledged occupational therapists from a master’s to a doctorate. Respiratory therapists might still squeak by with an associate’s degree in smaller markets, but the American Association for Respiratory Care is also proposing such a shift. Of the two nursing accrediting organizations, one permits an associate’s for practical nurses, but the other accredits only bachelor’s and master’s degree programs. Ohio University’s Vedder cites a quip predicting that within a decade or two, a master’s degree in “janitorial studies” will be needed to get a job as a custodian.

What’s driving this hyperactive credentialism? Why would young people want to prolong their years in a droning classroom at so much cost to themselves? Why would employers be willing to pay more for workers if they don’t have to?

The most benign answer is the most widely repeated: jobs require more education in today’s high-skilled economy. Postgraduate degrees in computer and biological sciences are among the most sought after; in both fields, students must deal with complex new technologies, specialties, and innovations. The numbers of master’s and Ph.D.s in nursing have exploded for similar reasons. Nurses don’t just train to take pulses and monitor blood pressure; they specialize in anesthesia, sports medicine, management, research, pediatrics, and midwifery, among other areas. In 2004, a mere 170 nurses received a doctor of nursing degree from a mere four schools. Fifteen years later, 357 schools awarded the degree to 7,037 students, with another 36,000 in the pipeline. Another 124 schools are in the planning stages, and, given the army of aging baby boomers, more will undoubtedly be needed.

But increasingly complex technological skills are only one part of a destructive feedback loop—in which colleges and universities, students, employers, unions, and meritocratic anxiety mutually reinforce one another.

Universities are the prime mover. Over the past decades, many have struggled to balance their books amid state cuts, regulatory requirements, and a customer base resistant to hikes in already obscene tuitions. Graduate programs are a good answer to the cash shortage: graduate students ease the strain on expensive infrastructure, they generally don’t need dormitories, and they often prefer evening, part-time, or online classes. Between 2008 and 2016, the share of students pursuing a master’s degree entirely online tripled, from 10 percent to 31 percent. After the Covid-19 pandemic, those numbers will likely rise. Graduate students can borrow considerably more federal money than undergrads can, and they don’t create the same level of inter-institution competition, relieving pressure for elaborate gyms, student centers, and other amenities. “Master’s degrees represent the strongest opportunity for revenue growth for many institutions,” promises a report from EAB, a university advisory firm. The market for the master’s, the organization estimates, adds up to a hefty $43.8 billion.

Small wonder that higher-ed institutions have embarked on a graduate program development spree. According to the Urban Institute report “The Rise of the Master’s Degree,” the number of distinct master’s fields granting at least 100 degrees per year rose from 289 in 1995 to 514 in 2017. Anthony Carnevale, director of the Georgetown Center on Education and the Workforce, describes the process: “Educators say: ‘Let’s offer a program in this. Let’s offer a program in that.’ If they can get students to sign up, it’s a money-maker. . . . Every institution, especially universities, has to offer every credential. And if somebody down the road invents a new one, they have to replicate it.” He adds a surprising afterthought: “All this goes on with no actual proof of outcome.”

Master’s disciplines are often divided into subspecialties that may or may not actually prepare people for available jobs. You don’t just get a master’s in computer science, for example; you get a degree in AI, video-game design, machine learning, “human–computer interaction,” cloud computing, cybersecurity, data science, digital-interactive media, information systems, hardware and computer architecture, medical image computing, or something else. Degrees in the health sciences swell master’s course catalogs to the size of a 1940 New York City telephone book. At George Washington University, one can choose among degrees in clinical operations and health-care management, clinical research administration, clinical and translational research, correctional health administration, health-care quality, regulatory affairs, and biomedical informatics, among others.


University postgraduate catalogs are like curiosity shops that reflect administrators’ marketing calculations as much as student need or aspiration. Consider just a few of the curios for sale: a master’s in planning and management of natural hazards at the University of New England; a master’s in outdoor adventure and expedition leadership at Southern Oregon University; and a master’s in racetrack industry at the University of Arizona. Vincennes University in Indiana had a bowling-management master’s that was axed in 2015. (Apparently, millennials and Zoomers find bowling too analog.) Even the most classroom-avoidant students might stop to consider a master’s in sexuality studies at San Francisco State University. Visual and performing arts departments have been at the drawing board as well: Iowa State University gives an MFA in “creative writing and environment,” and New York University and Boston University offer video-editing degrees. Try the University of Connecticut if you’re looking to get a higher degree in puppetry. Arts-management programs are as easy to find as a McDonald’s. Florida State University offers a unique museum education and visitor-centered curation degree. Also popular are master’s degrees (or Ph.D.s) in student affairs for people who want to, among other things, advise students whether to get a master’s (or a Ph.D.). Can a master’s in master’s degrees be far behind?

Doctoral studies have also taken a turn toward the puzzling. Ph.D. programs in recreation and leisure studies have popped up in almost every part of the country. The University of Utah has a Ph.D. program in parks, recreation, and tourism. Ph.D.s in enology—the study of wine—are also popular. Michigan State University offers a Ph.D. in packaging (at the university’s School of Packaging). Less practical is the Ph.D. in the history of consciousness at UC Santa Cruz, though it could be useful for a career in radical activism—one of the early graduates was Huey Newton, cofounder of the Black Panthers.

This is not to say that students in even the most curious-sounding programs aren’t learning anything. It’s just that they’re learning things that shouldn’t require prepping for the GRE, years of study for credits of dubious value, and sweat-inducing amounts of debt—not to mention a metastasizing higher-ed apparatus. In fact, today’s postgraduate structure is a relatively recent invention. Back in the day, postgraduate programs were a rarefied business designed to prepare students to become professors or to do high-level research at universities and pharmaceutical and technology companies. The point was to master a discipline, to add to its body of knowledge, and, if teaching, to pass it along to a new generation of students. Postgraduate academic work wasn’t something that the large majority of graduates going into white-collar jobs had any interest in. They knew that they could count on their employers to do the training.

No longer. For a long time, companies “hired for potential,” not for specific job skills, explains Peter Cappelli, Wharton professor of management and author of Will College Pay Off? Everyone understood that firms would train new employees who would stay on until it was time to receive the proverbial gold watch at the retirement party. Companies didn’t do much outside recruitment, since almost all hiring for more senior positions occurred in-house. The mutually satisfying arrangement fell apart in the 1980s as companies started restructuring and laying off workers. Executives concluded that training was a disposable and risky expense. Cappelli quotes one executive asking: “Why should I train my employees when my competitors are willing to do it for me?”

Universities jumped into the vacuum. The potential for new sources of income sent them scurrying to design novel products to compensate for the business sector’s retreat. That helped transform campuses into “training machines for American industry at the high-skill end,” as Katherine Newman, formerly a dean at Johns Hopkins University, has put it. Universities add new courses and degrees to satisfy a chronically shifting labor market, whether or not they know exactly what companies will be looking for. Even worse, the new system passed along the substantial cost of early-career training to young adults at a point in life when their parents and grandparents would have been earning a paycheck, starting a family, and buying a home. Now, barring a cushy trust fund, their kids are poring over the terms not of mortgages but of student loans.


How much of this “up-credentialing” reflects a genuine need for higher-level skills that can be learned only in a university classroom? Surely some of it. Advanced manufacturing really does require workers who understand advanced math and physics and who can do sophisticated coding for machines and robots.

But perverse market forces are at play, too. Employers tend to increase degree requirements when they have plenty of candidates to choose from—that is, when unemployment is high—because they can, not because of productivity. Employers, universities, and students seem to have convinced themselves that credentials are a proxy for skills. But as the Urban Institute’s Robert Lerman, perhaps the country’s leading specialist on apprenticeships, observes, that is an academic’s understanding of “skills.” It doesn’t describe much of the flexible problem-solving, careful focus, and experience that trade workers—or, say, occupational therapists—need in order to do their jobs well.

A 2014 study by Burning Glass Technologies, an analytics software company, supports the idea that, in many cases, up-credentialing has nothing to do with hard skills. The report discovered a large gap between the degree qualifications listed in help-wanted ads and the degrees held by those already employed. Sixty-five percent of online postings for executive assistant and secretarial positions, for example, required a bachelor’s, even though only 19 percent of those currently doing those jobs had the degree. Another study found that job descriptions requiring a bachelor’s often look virtually identical to those that don’t. Most large firms now use automated hiring software that simply filters out people without a college degree, no matter their relevant experience. It doesn’t seem to matter that it takes longer to find a new employee with a degree than one without, that it will cost the company 11 percent to 30 percent more in salary and benefits, and that employers often find that nongraduates with experience perform nearly as, or equally, well.

And don’t forget status. Every organization, from the Senate to the local McDonald’s, has its hierarchy; but higher education, the linchpin of meritocracy, increasingly shapes American status. Its intricacies could fill the residents of the House of Windsor with envy: adjunct, instructor, assistant professor, associate professor, professor, university professor, chaired professor, non-tenure track, tenure track, tenured . . . not to mention the de facto prestige rankings of the institutions themselves and the degrees they bestow on aspiring workers. Education has become central to sorting America’s young people into their proper socioeconomic position, and few educated people are immune to the glamour of the right degree from the right university.

Which brings us back to l’affaire Dr. Biden. One of the striking, though unspoken, facts about the controversy was the déclassé reputation of the degree that was the object of the dispute. Among the academic cognoscenti, schools of education are generally looked down on as the barefoot country cousin of the arts and sciences. In knowledgeable circles, the Ed.D., the diploma proudly claimed by the First Lady, was not something to brag about. H. L. Mencken, who never spared a rotten word for American education, reserved particular scorn for schools of education, or “normal schools,” as they were known at the time: “What you will find is a state of mind that will shock you. It is so feeble that it is scarcely a state of mind at all,” he snarled in a 1928 essay. Biden had already collected two master’s degrees before she began her doctoral work at the University of Delaware. Ed.D. training would not improve her teaching, since the program is largely for people pursuing careers in school administration. She couldn’t have been doing it for the money. So what was the point?

According to her husband, she wanted the degree because she liked the title.

“She said, ‘I was so sick of the mail coming to Senator and Mrs. Biden,’” the president has recounted. “‘I wanted to get mail addressed to Dr. and Sen. Biden.’ That’s the real reason she got her doctorate.” Dr. Biden might not fully agree with this account, but the foofaraw over Epstein’s article suggests that her real goal was what she wrongly imagined to be a respectable place in the meritocratic hierarchy.

This is not to say that postgraduate education is nothing more than an exercise in social climbing; social climbing is built into the system that’s supposed to prepare graduates for adult life. Unlike a Ph.D., a master’s won’t get you called “Dr.” But degrees stratify groups by their very nature. A master’s in business administration may not be much to brag about to outsiders, but it signals higher rank, however modest in the scheme of things, to the mere B.A.s at office meetings. Nurses might pursue higher degrees for good reasons—more challenging work, more control over work hours—and they may not care about status at all. But prestige cares about a Ph.D.

Degrees determine whether you can write prescriptions, conduct research, manage and teach other nurses, and shape hospital or public-health policy, or whether you can simply take patients’ temperature and pulse, change bandages, and file paperwork. These differences shape people’s understanding of their place in a given hierarchy. They also determine whether you can be called “Dr.,” an honorific that confuses outsiders. In fact, M.D. organizations have fought for sole possession of the title in clinical settings and have persuaded legislators in some states to pass laws to that effect.


One final force driving hyper-credentialism: the profound failure of our K–12 system. Complaints about the dearth of workplace-ready high school grads suffuse employer surveys. The kids are grammatically illiterate; they stutter and look at their feet during interviews; they don’t come to work on time or finish a project as scheduled. In a Burning Glass survey of mid-skilled companies, one HR staffer explained: “There’s something that comes with being a college student, a lot of maturity and knowing how to work with different people. They know how to communicate and to express themselves.” High school graduates should know how to communicate and to express themselves, to cooperate within a team, and to follow through on assignments. But they don’t.

The Biden administration wants government to provide 17 free years of education. That’d be like adding four stories to a crumbling building. A recent National Bureau of Economic Research working paper shows that college GPAs have been rising since the 1990s, as have college graduation rates. The authors conclude that grade inflation is behind both trends. Colleges may think that they’re doing their students a favor by making it easier for them to get their diplomas; they’re actually just intensifying a credential arms race. Poorly educated college grads mean more master’s students, more student debt, and more years of quasi-adult status. Those who don’t go to college will never be able to achieve the polish that their school systems have failed to demand of them.

Jeremiads about higher education’s role in perpetuating inequality almost always include recommendations for more career and technical education and apprenticeships for kids for whom the college-for-all model is not working. I won’t break the pattern. The next generation desperately needs these alternatives, and parents want them. Gallup found that 46 percent of parents prefer other secondary options “even if there were no barriers to their child earning a bachelor’s degree.” Yet those parents say that they don’t hear much about such options; it’s always college, college, college. In a plea for more “work-based programs” for non-college-graduating kids—remember, that’s two-thirds of them—Annelies Goger of the Brookings Institution points out that federal funding for public colleges and universities was $385 billion in 2017–18, compared with $14 billion for employment services and training. The math doesn’t make sense.

The kids who could benefit most from work-based programs will never be called “Dr.” Maybe that’s not such a bad thing.

Photo: Recent college graduates are having a harder time in the job market as the value of a bachelor’s degree declines. (JOSE LUIS MAGANA/REUTERS/NEWSCOM)






Why the New Child-Tax Credit Won’t Live Up to the Hype


If the experts are right, the United States is about to perform one of the great policy feats in the nation’s history. Starting in July, under a newly retrofitted Child Tax Credit (CTC), the IRS will begin sending monthly checks of $250 per child ($300 for children under six) to families with adjusted gross incomes under $150,000. Part of the Covid-era American Rescue Plan, the revamped CTC will benefit the large majority of American families with kids, but its biggest impact will be felt by the approximately 10 million children below the poverty line. The prediction is eye-popping: the new law, the consensus has it, will slash child poverty in half.

No question, that would be transformational. Up until now, the U.S. has held the dubious distinction among wealthy nations of having the most austere social policies and one of the highest child-poverty rates. The revised CTC could bring the U.S. more in line with its peer countries by supporting parents with a child allowance and assuaging some of poverty’s more palpable threats to children’s well-being like hunger, malnutrition, and homelessness. Poverty researchers point to studies—some applicable, some less so—suggesting that the money will dramatically change the futures of America’s poor kids.

I’m skeptical.

My doubts stem from something most Americans don’t like talking about: children in the U.S., particularly in lower-income households, are far more likely to grow up in unstable families—with a revolving cast of stepparents, half-siblings, stepsiblings, divorces, separations, and short-term romantic partners—than kids anywhere else. These “complex families,” as they’re sometimes called, can be every bit as damaging to children as poverty itself.

Sound exaggerated? It’s not. True, complex families have been a growing part of American life since the social revolutions of the late 1960s and 1970s, but few outside the small circle of family researchers have understood the extent of this change. Up until 2007, the Census Bureau asked respondents only whether children in the household were living with “one parent” or “two parents.” Two parents could refer to anything from a married couple celebrating their silver anniversary to a mother and a cohabiting partner who might or might not be the child’s father. Thus, the data gave a misleadingly benign impression of family arrangements. In 2014, the bureau began collecting data on multiple-partner fertility—that is, children by more than one partner. With these upgrades, we now have a clearer picture of the role complex families play in American poverty.

When it comes to multi-partner fertility and complex families, the U.S. is truly number one. More than one in six American kids live with a step- or half-sibling by age four. Middle-class readers might assume that this has to do with divorce and remarriage, but for most children, instability begins with cohabiting or single parents. Family complexity is especially common among single—or never-married—men and women, regardless of whether they were cohabiting at the time of first birth. Half the children born to cohabiting but unmarried parents will see them break up by their third birthday, compared with only 11 percent of kids born to married parents. About 21 percent of married parents report having a child with another partner, but 59 percent of unmarried couples have at least one child with another partner. Sixty percent of single parents go on to have a second child with another partner within ten years of the first birth.

Of course, affluent parents also divorce and remarry, and many middle-class couples make happy second marriages, with thriving children and warm relationships between ex-spouses and step- and half-siblings. Plenty of single mothers, including those with low incomes, also manage to give their kids a stable home life. But like seemingly everything else in America these days, complex families tend to be a class marker—one that helps produce and perpetuate inequality.

Child poverty closely tracks nonmarital childbearing and multi-partner fertility. About 58 percent of poor children live in households headed by unmarried mothers; 60 percent of the firstborn children of those mothers will have at least one half-sibling by age ten. Meantime, middle-class and wealthy kids are almost always born to married parents, whose divorce rates have declined markedly over the past decades. Their chances of growing up with the two parents who carried them home from the maternity ward are fairly high. However, those middle-class children whose married parents do divorce are at high risk of downward mobility; in fact, 28 percent of poor adults spent at least some of their childhoods in a two-parent middle-class family. As adults, they are at a higher risk of becoming single parents themselves and having children with two or more partners. In short, multiple-partner fertility, family instability, and poverty all appear to be passed into future generations.

Predictably, the complex-family income gap parallels an education gap. Men with bachelor’s or graduate degrees are considerably more likely to be living with their biological children than less educated men. In one study, 64 percent of male participants with a high school degree or less had a child with more than one partner (almost three-fourths of those births were nonmarital), compared with 36 percent of men with some college. (Interestingly, men having children with more than one partner are neither more nor less likely to be employed than men having children with just one partner.) Men who have spent time in prison are two times as likely to have children by multiple partners.

Along with education and income, race is also part of the complex-family divide; blacks are twice as likely as whites to have children with two or more partners (29.6 percent vs. 14.7 percent). Half of all black children live with a single parent, compared with 28 percent of Hispanics, 18 percent of whites, and 9 percent of Asians. Black men are also more likely to live with a partner’s minor children (16.4 percent) than with their biological child (9.9 percent). Black children are more likely to see their married parents divorce than kids of other demographic groups, which may help explain a troubling trend in the downward mobility of black men.

If higher poverty rates were the only downside of complex families, we could simply make the CTC permanent (as Democrats want) and watch it work its magic. But studies of children in complex families are all but unanimous in finding that instability, whether attached to poverty or not, damages kids in ways that ripple into their adult lives. Children suffer a break with a loved parent, usually a father. Unmarried fathers tend to become less involved with a child after the relationship with the mother ends. When the mother enters a new romantic relationship, the father retreats even further out of the picture. Child-support payments become spotty or stop altogether. The same thing happens when the father finds another partner, especially one with whom he has another child. Nor is a mother’s new partner likely to be considered Father of the Year. Such parents are less likely to spend time with their partner’s kids, or to do the ordinary things parents do with kids, like eat dinner with them or take them to a park or a movie. If stepfathers go on to have a biological child of their own with the mother, they often lose interest in their stepchild.

More worrying still, sexual and physical abuse are more common in homes with an adult male biologically unrelated to the children. In complex families, shared custody between biological parents—the enlightened answer to the daddy problem created by divorce and separation—is, well, complex. The chemistry between new spouses and siblings can be toxic, and visitation can be fraught for both logistical and emotional reasons. Parents or their new partners may need to move to new cities for a job, for extended family needs, or just to start over. Weekend scheduling is hard enough for intact families, with extracurricular activities and family birthdays; try adding another child in a separate household with another set of siblings and some stepparents.

Researchers studying children who are coping with family turmoil of this sort see behavioral problems identical to those they find when they study poor children. Several studies have found that children in elementary school who have experienced two “transitions”—a separation, a re-partnering, the introduction of stepsiblings—are not only more impulsive and aggressive than kids who experience no family disruptions; they also have lower grades and achievement scores. The family-go-round, to use sociologist Andrew Cherlin’s phrase, has been associated with lower verbal ability, attention-deficit problems, and poorer overall school readiness. Multiple-partner fertility was “robustly related” to delinquency and other behavior problems at age nine, regardless of whether a mother was married at that time. In later childhood and adolescence, having a parent with children from multiple partners correlates with early sexual activity and pregnancy. Studies have found a “dose effect” for transitions—that is, the more transitions a child experiences, the more the child’s risk grows. More surprisingly, perhaps, studies show that kids react to new stepsiblings in the household by becoming more aggressive—and that’s apart from the effect of the mother’s or father’s new partner. A child whose parents have divorced and whose mother remarries may be at higher risk of negative outcomes, but a child whose mother has another child after remarrying is at higher risk still.

Several theories, some overlapping, look to explain why the family-go-round is so hard on kids. The kids could be simply reacting to their mothers’ own stress and depression, both of which are more common among women with children by multiple partners. It makes sense that a mother preoccupied by a missing child-support check, the excitement of a new romance, or a stepchild’s temper tantrums would become less emotionally available to her own child. Another theory has it that complex families scramble conventional roles and norms, leaving children disoriented and anxious about whom they can count on and for what. When a new male partner with his own child moves in with a mother and her biological child, it confuses understandings of parent-child relationships, discipline, routines, and care. Adults themselves are in new, unregulated territory. Should a mother expect her partner to be a substitute father to her child, a friend, or more of a roommate? Should he discipline a child for cursing him out—and how? What kind of support should she expect from the extended family of her child’s father if he’s no longer living with her, especially once he has a new partner and child? And what are a stepfather or new partner’s financial responsibilities to his new family? The questions proliferate when a new family forms, but the answers are elusive.

A related, evolution-influenced theory rests on the assumption that the young of all species are wired to adapt to their environment. When that environment is unpredictable, they develop strategies that may not work in a more reliable ecosystem. Children whose family attachments and household arrangements are volatile have no reason to trust routines or develop the kind of foresight that might lead to a better future. That explains the impulsive and aggressive behaviors researchers find in kids from complex families. If life throws dangerous curveballs at you no matter what you do, why not have sex without a condom with the girl you met last week, steal that bike you want, or punch out a guy who has been bothering you? “The lesson of an unpredictable environment is that if the future is more rather than less uncertain, then efforts to mitigate risk are less likely to pay off in terms of enhancing reproductive fitness,” observes psychologist Jay Belsky and coauthors in a paper on unpredictable childhoods and their relation to early pregnancy. “After all, efforts toward such ends take time, effort and, critically, energy, so if such investments are less likely to generate anticipated payoffs, as would be the case in more rather than less unpredictable environments, then parents should adjust their rearing accordingly. The same goes for children when it comes to regulating their own development.”

Can the extra income promised by the CTC change the life trajectories of kids in complex families? An optimist might argue that more money will lower stress levels in the household and improve the chances that parents will stay together. This could be true in some cases. But we have little reason to believe that lifting people from poverty to near poverty—the most realistic outcome—will solidify most parents’ relationships or lessen their all-too-human longing for a new partner and another baby. In fact, multi-partner fertility is more cause than effect of money woes. “[R]elative economic well-being is not predictive of a birth to a second partner,” Lindsay Monte, a scholar at the Census Bureau, writes in a recent paper. “However, women are subject to significantly greater economic stress after the transition into multiple partner fertility.”

Helping struggling families keep their refrigerators stocked and their rent money paid may be the humane and socially beneficial thing to do. But the same goes for giving kids as much family stability as possible so that they can develop the strengths they need to meet the challenges of adult life. There’s no silver-bullet policy solution for that one.







Richard Alba’s Book on Demographic Change Challenges Conventional Wisdom


The Great Demographic Illusion: Majority, Minority, and the Expanding American Mainstream, by Richard Alba (Princeton, 336 pp., $29.95)

A few weeks ago, Variety published a short news item about the cat-eyed beauty Anya Taylor-Joy after she won a Golden Globe for her starring role in The Queen’s Gambit, a Netflix series. The piece was entirely forgettable, but one bit got my attention. An italicized correction at the end of the article read: “A previous version identified Anya Taylor-Joy as a person of color. She has said she identifies as a white Latina.” Two questions: a person of color? Unless my television needs adjusting, Taylor-Joy is alabaster pale. “White Latina?” According to Wikipedia, the actress’s father is of Scottish heritage. But her mother is part English and part Spanish, and the family lived in Buenos Aires during her early childhood. That explains it . . . I guess.

Richard Alba’s new book, The Great Demographic Illusion: Majority, Minority, and the Expanding American Mainstream, helps untangle identitarian knots of this sort. The City University of New York sociologist and immigration scholar shows how official policies have, intentionally or not, distorted America’s debates about racial and ethnic identity as well as the country’s self-understanding. Alba is clearly a man of the Left, but readers will still come away questioning important parts of conventional progressive wisdom.

The author begins by recalling a series of reports from the Census Bureau during the 2000s that announced that the United States was well on its way to becoming a “majority minority” country. The announcement was clickbait. The Left met it with barely disguised celebration, seeing a satisfying comeuppance for those anti-immigrant bigots on the wrong side of history and anticipating the end of oppressive white dominance. For its part, the provincial (as opposed to cosmopolitan) Right, having already been sucker-punched by globalization, responded to the news with what columnist Charles Blow called “white extinction anxiety.” Alba implies that the two sides fed off each other: hearing words like “extinction” and sensing the schadenfreude from the other side, this portion of the white Right grew more brazenly resentful, which, in turn, lent more credence to the Left’s charges of right-wing racism. The rest is Trumpian history.

Thankfully, Alba is more interested in the majority-minority claim than in the president who proved adept at exploiting it—and, as his title foretells, that claim is illusory. Ten percent of all married people now have a partner of a different race or ethnicity, he finds; likewise, more than 10 percent of babies born in the United States have mixed parentage. Fully 80 percent of those mixed-heritage children have one non-Hispanic white parent. Yet official Census data categorize those children as minorities even when they identify as white.

This little-understood accounting trick shows just how much seemingly small political-bureaucratic decisions shape public perception, not to mention Twitter wars. As the country's mixed-race and mixed-ethnicity population grew, complaints about the check-one-race instructions became harder to ignore, and officials began to consider adding a mixed-race option. According to the author, civil rights groups, fearing a loss of clout in anti-discrimination cases, strongly objected to the idea. A compromise was reached just in time for the 2000 census: yes, mixed-race Americans could check more than one box (usually meaning white plus a racial or ethnic minority), but they would still be counted as minorities. "Just as with the one-drop rule" that had once applied to blacks, Alba observes acidly, "the minority side was to take precedence over the white side." It's worth noting, though Alba does not, that college-admissions offices also appear to use the one-drop rule when it advances their diversity goals.

To be fair to the bureaucrats, the question of how to categorize population groups was bound to be a difficult one, given the demographic churn of the United States. This was especially true after the 1965 Hart-Celler Act abolished earlier immigration quotas for non-Europeans. Until the mid-nineteenth century, the Census designers didn't think very hard about race; they defined Americans as either free people, which generally meant "white," or slaves, synonymous with "black." It was only in 1860 that officials added a separate category—for American Indians. By the end of the century, both Chinese and Japanese had made their way onto the checklist; Filipinos, Koreans, and "Hindus," as Indians were categorized, arrived a few decades later. In the 1960s, activists introduced "Asian-American" as a gesture of empowerment and solidarity with other nonwhite people. As an official classification, the term is problematic, to say the least; it packs proudly disparate groups, such as Chinese, Thais, Indians, and Filipinos, into a single category. Officials compromised by providing checkboxes for six of the largest ethnicities as well as a box marked "Other Asian" for smaller ones. There is also a separate box for "Native Hawaiian and Pacific Islander."

Hispanics, by far the nation's largest minority group, have also been a consistent headache for Census demographers. Unlike blacks or Asians, Hispanics are not a racial group; they can be black, white, or (part) Asian. In the early days, Mexicans were simply written down as "white" on Census forms. In 1930, the bureau decided to give Mexicans their own racial classification. The gesture didn't go over well. Both the Mexican government and Mexican-American leaders insisted that they be classified as white, and the bureau returned to the status quo ante. Then, just as it had for Asians, the civil rights movement of the 1960s changed Hispanic attitudes. Seeing strength in numbers, Mexican and Puerto Rican activists lobbied for a new shared designation; after 40 years of minor adjustments, the entry now appears as "Spanish, Latino, or Hispanic origin." Why Asians identify themselves by country of origin while Hispanics do not is unclear.

Hispanics are not only the largest minority group but also one of the most likely to intermarry. About 40 percent of all recent intermarriages and mixed births involve a Hispanic of any race and a non-Hispanic white; mixed-race children are likely to marry whites, further diluting their Latino "blood." (In one of a number of gestures toward wokeness, Alba uses the gender-neutral label "Latine," now popular among LGBT activists.) Largely because of intermarriage, such "ethnic attrition" has been common among Hispanics over the generations, particularly among educated Mexicans. Pew estimates that 11 percent of those with Hispanic origins don't identify as such.

And Hispanics are not alone in this process of attrition. Asians, particularly well-educated ones, very commonly marry whites. (Those couples are far more likely to be Asian female–white male than the other way around.) More surprisingly, given their geographic isolation, over half of Native Americans intermarry. The children of American Indian and white parents are far more likely to identify as white than as Indian. Seventy percent of white–American Indian and white–Asian children eventually marry white Americans. The trend diminishes tribal numbers and threatens the federal benefits that sustain the tribes.

So how can our Babel of a country adapt to all this? Alba examines two ways of thinking about the question. The first—and, on the left, the most entrenched—is critical race theory. Its proponents argue that whites, relying on assumptions of their innate racial superiority, enslaved, colonized, and ruled people of color for centuries and continue to dominate minority populations both in the U.S. and throughout the West. If they're right, then barring a profound break with a racist past, the U.S. will continue to be a country where white supremacy reigns.

Alba believes that this model does not accurately describe what's happening in American life. The children of new immigrants of nonwhite descent are becoming socially integrated into a shifting mainstream; they are moving up in education, income, and class status, and they vote in large numbers, just as previous major waves of immigrants did. In the post–World War II era, a rapidly expanding economy eased the assimilation of formerly excluded Jews and Italians. Growing numbers of white-collar jobs and new housing developments made it possible to have what Alba calls "non-zero-sum" upward mobility; no group had to lose in order for another to rise. Today, he argues, something similar is happening as baby boomers age out of the labor force, making room for immigrant newcomers. More than half of new Ph.D.s go to members of immigrant-origin groups. Top corporate positions have diversified.

This is not assimilation in the conventional sense. Minorities are not being recruited into a white power structure, as critical race theorists say happened with earlier Southern and Eastern European immigrants. "Assimilation" implies that a minority group melds into the majority and becomes part of a homogenized whole. Instead, Alba argues, immigrants and their children are joining a mainstream that is increasingly mixed and no longer exclusively white, and they are changing it in noticeable ways. Groups are "de-categorizing" themselves or, as in the case of "white Latina" Anya Taylor-Joy, blurring familiar boundaries.

Blacks are the one exception. They remain more segregated than other minority groups. They intermarry at lower rates, and even when they do, the resulting families tend to stay among their own: they are more likely to describe themselves as black, to have black friends, to remain closer to their black family than to their white one, and to have children who go on to marry other blacks. They are not complete outliers to the assimilation model; their intermarriage rates have risen significantly, if from a lower base. Moreover, middle-class and higher-income blacks tend to live in integrated neighborhoods. But their movement into the mainstream has been slow and piecemeal. They are not reaching top jobs to the same extent that immigrant-origin groups are. While elite colleges have seen their immigrant numbers soar, African-American numbers have barely budged since 1980; those who do enroll are disproportionately from African-immigrant families. And though more blacks are getting college degrees, half come from lower-ranked, for-profit institutions.

Alba is somewhat coy about the depth of his challenge to the current regime of identity politics. In a rational world, his book could help clear our toxic air of the promiscuous use of terms like “white supremacy” and “people of color.” I’m not counting on it, though. The Wall Street Journal ran a profile of the author, but The Great Demographic Illusion has not been reviewed by any major news outlet since its September publication. It seems that many Americans prefer their illusions.






There Goes the Neighborhood School


I’ve always been a sucker for the romance of the neighborhood school. When I was a child, I walked by myself to the old stone building in suburban Philadelphia that housed the nearest elementary school. I memorized multiplication tables and traced cursive letters in classes filled with kids who lived within a mile or so of me; I ate after-school snacks in their kitchens, and on snow days, met up with them for sledding on a hill that our older brothers and sisters had told us about before they started acting cool. This all sounds like cornball nostalgia, I know, but a neighborhood school really does have the potential to turn strangers into neighbors and friends.

So when I moved to Brooklyn with my husband and our young children in the early 1980s, I didn't hesitate to register at the local elementary school. The school looked like a jail, with its severe brick facade and barred windows, but it was known as a star in a struggling education system that had few of them. People paid heart-pounding sums of money to live in the catchment area; those who couldn't afford it tried using an aunt's or a friend's address. In some respects, and despite its appearance, the school delivered on my hopes. I waited outside at the 3 PM pickup time and shared local gossip with other parents; on the way home, we stopped at the supermarket, where we sometimes ran into teachers grabbing groceries; my kids went to classmates' homes for playdates, all within walking distance of our house. My husband and I still get together for every Super Bowl with a couple we met when our sons were in kindergarten together. Those sons are now 42.

A few years into our family's relationship with that neighborhood school, Big New Ideas started percolating in the brains of New York City educators. By the time my second child started second grade, the traditional classroom had been discarded in favor of reading and writing "workshops." Students chose the books that they wanted to read. They kept journals in which they wrote about those books but also about themselves, their families, and their thoughts. "Every child is an author," a teacher explained as I stared at the jumble of barely decipherable words in my daughter's journal during a conference. Grammar and spelling didn't matter for now, she assured me; the important thing was that my daughter felt "good about being a writer." To be fair, the class did do a "unit" on penguins. They read books with penguin characters, drew pictures and wrote stories about penguins, and learned how penguins keep warm in the Antarctic. Why penguins? I haven't a clue.

I learned about the next Big New Idea—multicultural education—one day when that same daughter burst into the house. “Mommy!” she exclaimed. “Did you know that Thomas Jefferson had slaves?” How was that solitary historical fact the one that had wandered into her “student-centered” classroom? Were there penguins at Monticello? “Yes, I knew that,” I answered. “Do you know what else he did?” She did not. We had been told that multiculturalism would teach children respect for other cultures, a reasonable goal in our diverse city. How telling a child who still clung to her “binky” at bedtime about Jefferson’s slaves would accomplish this was beyond me. Our son had recently graduated from our little elementary school, and with some parental help, we placed him in a private middle school; we did the same with the Jefferson expert after fourth grade; when our youngest could barely read in third grade, we bolted the neighborhood school for good.

Now one of my daughters lives nearby, with her own children. Her eldest started kindergarten this year at the neighborhood school in a turn-of-the-century building at the end of her block. The city's schools chancellor, meanwhile, has urged teachers to confront "white supremacy culture," which includes "worship of the written word." My grandson is white—the only white child in his immigrant-filled class, as it happens—and though he can be a little tyrant, the chances of his joining the Proud Boys are negligible.

No matter. I predict he’ll be taking a bus to a private school in some other part of Brooklyn soon.
