Philosophy instructor, recreational writer, humorless vegetarian.

15 years after a viral tweet, Detroit has its RoboCop statue

Funded in 2011, the Kickstarter project overcame a host of problems, including securing the rights, finding a site, and the sculptor's battle with colon cancer.

Elite Colleges Have an Extra-Time-on-Tests Problem

Administering an exam used to be straightforward: All a college professor needed was an open room and a stack of blue books. At many American universities, this is no longer true. Professors now struggle to accommodate the many students with an official disability designation, which may entitle them to extra time, a distraction-free environment, or the use of otherwise-prohibited technology. The University of Michigan has two centers where students with disabilities can take exams, but they frequently fill to capacity, leaving professors scrambling to find more desks and proctors. Juan Collar, a physicist at the University of Chicago, told me that so many students now take their exams in the school’s low-distraction testing outposts that they have become more distracting than the main classrooms.

Accommodations in higher education were supposed to help disabled Americans enjoy the same opportunities as everyone else. No one should be kept from taking a class, for example, because they are physically unable to enter the building where it’s taught. Over the past decade and a half, however, the share of students at selective universities who qualify for accommodations—often, extra time on tests—has grown at a breathtaking pace. At the University of Chicago, the number has more than tripled over the past eight years; at UC Berkeley, it has nearly quintupled over the past 15 years.

The increase is driven by more young people getting diagnosed with conditions such as ADHD, anxiety, and depression, and by universities making the process of getting accommodations easier. The change has occurred disproportionately at the most prestigious and expensive institutions. At Brown and Harvard, more than 20 percent of undergraduates are registered as disabled. At Amherst, that figure is 34 percent. Not all of those students receive accommodations, but researchers told me that most do. The schools that enroll the most academically successful students, in other words, also have the largest share of students with a disability that could prevent them from succeeding academically.

“You hear ‘students with disabilities’ and it’s not kids in wheelchairs,” one professor at a selective university, who requested anonymity because he doesn’t have tenure, told me. “It’s just not. It’s rich kids getting extra time on tests.” Even as poor students with disabilities still struggle to get necessary provisions, elite universities have entered an age of accommodation. Instead of leveling the playing field, the system has put the entire idea of fairness at risk.

Forty years ago, students with disabilities could count on few protections in higher education. Federal law prohibited discrimination against disabled students, but in practice schools did little to address their needs. Michael Ashley Stein, a disability-rights expert who teaches at Harvard Law, recalled the challenges of attending law school as a student using a wheelchair in the 1980s. “I sat in the back of the classroom, could not enter certain buildings in a normal way, became the first person on the law review with a disability, and dragged myself up the stairs,” he told me.

The Americans With Disabilities Act, passed in 1990, was meant to make life fairer for people like Stein. The law required public and private institutions to provide reasonable accommodations to individuals with “a physical or mental impairment” that “substantially limits one or more major life activities.”

Change was slow at first, in part because Supreme Court rulings narrowed the scope of the law. Professors I spoke with told me that, even in the early 2000s, they taught only a handful of students with disabilities. Then, in 2008, Congress amended the ADA to restore the law’s original intent. The government broadened the definition of disability, effectively expanding the number of people the law covered. It also included a list of major life activities that could be disrupted by a disability (“learning, reading, concentrating, thinking,” among others) and clarified that individuals were protected under the ADA even if their impairment didn’t severely restrict their daily life.

In response to the 2008 amendments, the Association on Higher Education and Disability (AHEAD), an organization of disability-services staff, released guidance urging universities to give greater weight to students’ own accounts of how their disability affected them, rather than relying solely on a medical diagnosis. “Requiring extensive medical and scientific evidence perpetuates a deviance model of disability, undervalues the individual’s history and experience with disability and is inappropriate and burdensome under the revised statute and regulations,” AHEAD wrote.

Schools began relaxing their requirements. A 2013 analysis of disability offices at 200 postsecondary institutions found that most “required little” from a student besides a doctor’s note in order to grant accommodations for ADHD. At the same time, getting such a note became easier. In 2013, the American Psychiatric Association expanded the definition of ADHD. Previously, the threshold for diagnosis had been “clear evidence of clinically significant impairment.” After the release of the DSM‑5, the symptoms needed only to “interfere with, or reduce the quality” of, academic functioning.

Recently, mental-health issues have joined ADHD as a primary driver of the accommodations boom. Over the past decade, the number of young people diagnosed with depression or anxiety has exploded. L. Scott Lissner, the ADA coordinator at Ohio State University, told me that 36 percent of the students registered with OSU’s disability office have accommodations for mental-health issues, making them the largest group of students his office serves. Many receive testing accommodations, extensions on take-home assignments, or permission to miss class. Students at Carnegie Mellon University whose severe anxiety makes concentration difficult might get extra time on tests or permission to record class sessions, Catherine Samuel, the school’s director of disability resources, told me. Students with social-anxiety disorder can get a note so the professor doesn’t call on them without warning.

The types of accommodations vary widely. Some are uncontroversial, such as universities outfitting buildings with ramps and providing course materials in braille. These allow disabled students to access the same opportunities as their classmates. Some students get approved for housing accommodations, including single rooms and emotional-support animals.

Other accommodations risk putting the needs of one student over the experience of their peers. One administrator told me that a student at a public college in California had permission to bring their mother to class. This became a problem, because the mom turned out to be an enthusiastic class participant.

Professors told me that the most common—and most contentious—accommodation is the granting of extra time on exams. For students with learning disabilities, the extra time may be necessary to complete the test. But unlike a wheelchair ramp, this kind of accommodation can be exploited. Research confirms what intuition suggests: Extra time can confer an advantage to students who don’t have a disability.

Complicating matters is the fact that the line between having a learning or psychological disability and struggling with challenging coursework is not always clearly defined. Having ADHD or anxiety, for example, might make it difficult to focus. But focusing is a skill that the educational system is designed to test. Some professors see the current accommodations regime as propping up students who shouldn’t have perfect scores. “If we want our grades to be meaningful, they should reflect what the student is capable of,” Steven Sloman, a cognitive-science professor at Brown, told me. “Once they’re past Brown and off in the real world, that’s going to affect their performance.”

No one is more skeptical of the accommodations system than the academics who study it. Robert Weis, a psychology professor at Denison University, pointed me to a Department of Education study that found that middle and high schoolers with disabilities tend to have below-average reading and math skills. These students are half as likely to enroll in a four-year institution as students without disabilities and twice as likely to attend a two-year or community college. If the rise in accommodations were purely a result of more disabled students making it to college, the increase should be more pronounced at less selective institutions than at so-called Ivy Plus schools.

In fact, the opposite appears to be true. According to Weis’s research, only 3 to 4 percent of students at public two-year colleges receive accommodations, a proportion that has stayed relatively stable over the past 10 to 15 years. He and his co-authors found that students with learning disabilities who request accommodations at community colleges “tend to have histories of academic problems beginning in childhood” and evidence of ongoing impairment. At four-year institutions, by contrast, about half of these students “have no record of a diagnosis or disability classification prior to beginning college.”

No one can say precisely how many students should qualify for accommodations. The higher prevalence at more selective institutions could reflect the fact that wealthy families and well-resourced schools are better positioned to get students with disabilities the help they need. Even with the lowered bar for a diagnosis, obtaining one can cost thousands of dollars. And as more students with disabilities get help in middle and high school, that could at least partially explain their enrollment at top colleges.

Still, some students are clearly taking advantage of an easily gamed system. The Varsity Blues college-admissions scandal showed that there are wealthy parents who are willing to pay unscrupulous doctors to provide disability diagnoses to their nondisabled children, securing them extra time on standardized tests. Studies have found that a significant share of students exaggerate symptoms or don’t put in enough effort to get valid results on diagnostic tests. When Weis and his colleagues looked at how students receiving accommodations for learning disabilities at a selective liberal-arts school performed on reading, math, and IQ tests, most had above-average cognitive abilities and no evidence of impairment.

A parent in Scarsdale, New York, who works in special education told me that it’s become common for parents of honors students to get their kids evaluated so they can have extra time on tests. The process usually starts when kids see that their peers have accommodations—or when they bring home their first B. “It feels in some ways like a badge of honor,” she said. “People are all talking about getting their children evaluated now.” In 2019, a Wall Street Journal analysis found that one in five Scarsdale High School students was considered disabled and eligible for accommodations on college entrance exams—a rate more than seven times higher than the national average.

Several of the college students I spoke with for this story said they knew someone who had obtained a dubious diagnosis. Hailey Strickler, a senior at the University of Richmond, was diagnosed with ADHD and dyslexia when she was 7 years old. She was embarrassed about her disabilities and wary of getting accommodations, until her sophomore year of college. She was speaking with a friend, who didn’t have a disability but had received extra time anyway. “They were like, ‘If I’m doing that, you should definitely have the disability accommodations,’” Strickler told me.

“We know that people will act as they are incentivized to act,” Brian Scholl, a Yale psychology and cognitive-science professor, told me. “And the students are absolutely incentivized to have as much extra accommodations as they can under any circumstances.” Students who receive extra time on the LSAT, for example, earn higher average scores than students who don’t.

Even if students aren’t consciously trying to gain an unfair edge, some seem to have convinced themselves that they need extra help. Will Lindstrom, the director of the Regents’ Center for Learning Disorders at the University of Georgia, told me that the fastest-growing group of students who come to him seems to be those who have done their own research and believe that a disability is the source of their academic or emotional challenges. “It’s almost like it’s part of their identity,” Lindstrom said. “By the time we see them, they’re convinced they have a neurodevelopmental disorder.”

Lindstrom worries that the system encourages students to see themselves as less capable than they actually are. By attributing all of their difficulties to a disability, they are pathologizing normal challenges. “When it comes to a disorder like ADHD, we all have those symptoms sometimes,” Lindstrom told me. “But most of us aren’t impaired by them.”

One recent Stanford graduate told me that when she got mononucleosis as a freshman, she turned to the disability office: Because she couldn’t exercise, she was struggling to focus in class. Though she’d always been fidgety, she’d never had academic issues in high school—but high school had been easier than Stanford. The office suggested that she might have ADHD, and encouraged her to seek a diagnosis. A psychiatrist and her pediatrician diagnosed her with ADHD and dyslexia, and Stanford granted her extra time on tests, among other accommodations.

Collar, the University of Chicago physics professor, said that part of what his exams are designed to assess is the ability to solve problems in a certain amount of time. But now many of his students are in a separate room, with time and a half or even double the allotted time to complete the test. “I feel for the students who are not taking advantage of this,” he told me. “We have a two-speed student population.”

Most of the disability advocates I spoke with are more troubled by the students who are still not getting the accommodations they need than by the risk of people exploiting the system. They argue that fraud is rare, and stress that some universities maintain stringent documentation requirements. “I would rather open up access to the five kids who need accommodations but can’t afford documentation, and maybe there’s one person who has paid for an evaluation and they really don’t need it,” Emily Tarconish, a special-education teaching-assistant professor at the University of Illinois at Urbana-Champaign, told me. “That’s worth it to me.”

Tarconish sees the growing number of students receiving accommodations as evidence that the system is working. Ella Callow, the assistant vice chancellor of disability rights at Berkeley, had a similar perspective. “I don’t think of it as a downside, no matter how many students with disabilities show up,” she told me. “Disabled people still are deeply underemployed in this country and too often live in poverty. The key to addressing that is in large part through institutions like Berkeley that make it part of our mission to lift people into security.” (One-third of the students registered with Berkeley’s disability office are from low-income families.) At the University of Chicago, members of a committee to address the surge in accommodations don’t even agree on whether a problem exists, Collar told me.

The surge itself is undeniable. Soon, some schools may have more students receiving accommodations than not, a scenario that would have seemed absurd just a decade ago. Already, at one law school, 45 percent of students receive academic accommodations. Paul Graham Fisher, a Stanford professor who served as co-chair of the university’s disability task force, told me, “I have had conversations with people in the Stanford administration. They’ve talked about at what point can we say no? What if it hits 50 or 60 percent? At what point do you just say ‘We can’t do this’?” This year, 38 percent of Stanford undergraduates are registered as having a disability; in the fall quarter, 24 percent of undergraduates were receiving academic or housing accommodations.

Mark Schneider, the former head of the educational-research arm of the Department of Education, told me that three of his four grandkids have “individualized education programs,” the term of art for accommodations at the K–12 level. “The reward for saying that you have a disability, versus the stigma—the balance between those two things has so radically changed,” he said. Were it not for that shift, he added, his grandchildren might not be receiving the benefits and services they need. But at the very least, the rewards are not evenly distributed. As more elite students get accommodations, the system worsens the problem it was designed to solve. The ADA was supposed to make college more equitable. Instead, accommodations have become another way for the most privileged students to press their advantage.


This article appears in the January 2026 print edition with the headline “Accommodation Nation.”

istoner
2 days ago
It's true in my experience that gaming the accommodations system is not a big problem at public 2-years. But there's still a problem. Accommodations are overwhelmingly for mental health and learning disabilities. (As in, close to 100%. Over the last 8 years I've had one wheelchair user and one Deaf student who used ASL interpreters. Out of roughly a dozen accommodations/semester.)

The trouble is that some of the most common accommodations for mental health and learning disabilities directly undermine student learning. One of the most common is permission to turn in ANY assignment up to 2 or 3 days late. That renders all formative assessments and class preparation assignments useless. Another common accommodation is permission to miss class, which is about the worst thing a struggling student can do.

Supporting students with mental illness and learning disabilities is a genuinely tough nut to crack, but I worry that the current approach, more often than not, does more harm than good.

Doctor, Doctor

1. People are arguing again about whether people with academic doctorates can call themselves doctor. Both sides are wrong. The correct position is twofold: first, yes, academic doctorates are every bit as real as medical doctorates, and academic doctors should insist on the right to be called doctor; second, although you should insist on the right, you shouldn’t actually ask people to call you doctor 99% of the time.

2. Like most doctors, I would never ask to be called doctor except in the following circumstances:

a. Casually mentioning that I am a doctor when asked, and then, with a genial hand swish, indicating that, of course, such formalities will not be necessary.

b. Correcting someone who is being rude to me and throwing them off balance: “Well, Mr Bear, if you paid attention, you would…” “It’s Doctor Bear, actually.”

c. In the title section of a few forms, because sometimes people treat you nicer and give you freebies. I ain’t going to turn that down.

d. In situations of extreme formality, meeting a president or prime minister, an awards ceremony, that sort of thing.

3. Insisting on formality is rather against the playful enterprise of inquiry, particularly in my native discipline of philosophy. If Socrates had, say, insisted that people acknowledge his daimonion when speaking with him, it would have held up the process and ruined the mood.

4. I have heard the argument from some members of minorities that they like to use the title doctor as a way of gaining respect. I certainly don’t begrudge them that choice, but I tend to think the strategy backfires, even when used for an understandable reason. Take this poem by Dr Susan Harlan:

No, you can’t call me

By my first name,

And yes, I know that

A male professor

Told you that titles

Are silly

Because a certain genre

Of man

Is always dying

To performatively

Divest himself

Of his easily won

Authority

5. Pinning anything on a poem is hard, but it seems to me arrogant to claim any stranger’s academic status was easily won. I am a person who will happily discard any title in most contexts. There is nothing easily won about my authority, such as it is. I could tell you my story about collapsing on the floor of the philosophy common room, terrified by the bodies circling me, near psychotic with OCD. I could tell you many such stories. None of those difficulties I faced stopped me from waving off authority.

6. Leave that aside, though. The more basic problem is that, particularly in the world of etiquette, power is in the hands of the giver [something that, incidentally, I was taught on my first day of university by a classicist riffing off Homer]. The powerful person is the generous person. Thus, while I don’t begrudge insisting on one’s title as a strategy to try to win respect when faced with oppressive disrespect, I doubt it will work. There’s a Catch-22 here. Speaking in favour of dispensing with a form of respect is all good and well when you have enough respect to dispense with. But does it work if that respect is in question? On balance, though, while structural oppression creates a terrible situation for those seeking respect, I really don’t think correcting people for addressing you as they’d address anyone else, or giving a speech about how they must address you, will help at all. I’m almost certain it will make things worse. But again, do what you have to do. Outside the oppressed, however, anyone else insisting on the use of a title gets an eye roll from me.

7. Finally

Please stop

Writing poetry

That is just

Ordinary language

But with

Line

Breaks

8. But although I may not use the title much, the right to do so is important. It is right that the title exists, even though using it would usually be gauche; I don’t use it much because the title doesn’t exist to honour me; it exists to honour the institution.

9. Like any self-respecting doctor (an actual doctor, not one of those apple-phobic medical quacks), I will not allow my thousand-year-old title to be stolen by the white coats, who were only granted the right to use the title as a courtesy in the 18th century. HURRUMPH. Ribbing aside, medical doctorates are legitimate, and the story of the thousand-year-old academic doctorate usurped by medical upstarts is rather oversimplified. However, medical doctors, like academics, also shouldn’t insist on being called doctor, except perhaps in a limited range of circumstances. No one likes the surgeon or family doctor who is fastidious about this stuff in a social setting. Even in a hospital, no one likes the doctor who insists on the title from the nurses, even when the patients aren’t around. Anyone who thinks it’s fine for a medical doctor to insist on a title in a social context, but not an academic, is simply chasing the approval of those with conventional social status. Weak, weak.

10. The history of the term matters only in what it illustrates. And what it illustrates is precisely why I do insist on the right. As a society, we do not take knowledge — and knowledge and skills moulded into the very particular shapes necessary to make new knowledge — seriously enough. We have been taking it less seriously over time- hence the suggestion, previously unthinkable, that there is nothing exceptional about having completed a doctorate. That change says something troubling about the decline in our respect for organised knowledge.

11. I use the term organised knowledge deliberately. I do not say the decline in our respect for knowledge, because if I did so, someone would say “I respect knowledge plenty, just not these stuck-up buffoonish apparatchiks called academics”. Yes, academics often are buffoons, but no, you don’t get to respect the abstract idea of knowledge while having no respect for the flawed, limited, faltering, but nonetheless wondrous human institutions that try to grasp at it. Disagree with them as much as you like- as much as you can, even! But respect the attempt, and what it has got us: organised knowledge matters.

12. I’m going to say some stuff that will sound melodramatic, because it is; however, it is literally true. I heard someone say during the recent debate that because medical doctors have power over life, they deserve a title, whereas academic doctorates are not so serious. Academic inquiry- whether physics, theology, literary criticism, biochemistry, psychology, or engineering- has power over the lives of civilizations, and over the arc of the future. If you look around an ordinary room, almost everything you see will trace its history back to academic inquiry through multiple lines of descent: the plan of the wall, the translation of the Bible on the shelf, the chemical composition of the paint, the ancestors of the boilerplate novel, and, of course, the ideas in the heads of all the room’s inhabitants. This is especially true (à la Keynes’s quip) of the minds of those who think themselves too practical for such things. The process of inquiry to which a doctorate is an apprenticeship is foundational, in some way or another, to all you see around you: the institutional systems, the culture, the artifacts, and the understanding of the natural world so integral to all of it. A doctorate means that someone has devoted, at a minimum, twenty years of study to join the community of academic inquiry. Yes, I make you cringe, but I’m not wrong. The failure to see the stakes in academic doctorates in comparison to medicine or law amounts to seeing things in days, months, and years, not decades, centuries, and millennia.

13. And what of the humanities? Occasionally, I meet someone who finds it odd that my doctorate is in philosophy, “that’s a fake subject”. No, bud, we’ve been here since the beginning- we existed before all other disciplines. Name another subject that you think is more real than ours- we philosophers invented it. I didn’t write something the length of two books to put up with this.

14. But what’s the point of appealing to history when you can appeal to the present? I could refer, for example, to this:

Philosophy Majors and the GRE: Updated Data (w/updates) - Daily Nous

To show bona fides. But that won’t do, I’d like to vindicate not just philosophy, but the humanities as a whole.

15. I will concede that much humanities coursework should be harder- sure. However, you do not get to the end of a decent program without working hard with skills that most people lack. “Ah, but some people have fake academic doctorates”. I mean, sure, some probably do have fake doctorates, but they’re the exception. Moreover, people from the disciplines people think of when they say this- gender studies, anthropology, continental philosophy, etc.- are, in my experience, typically intellectually serious people who have thought hard and gained a lot of insights. That’s not to deny there are gradients of intellectual seriousness among doctors and their specialisations. Some people with a PhD make me raise an eyebrow or two, but then again, every fifth medical doctor is a dullard, too.

16. Ultimately, the people who disrespect the humanities do not dislike the humanities because they are weak, but because they are strong. They resent that what they regard as fluffy nonsense has so much power over culture. Indeed, a lot of the groups that have most often resented the humanities- STEMlords, Republicans- weave seamlessly between calling it a useless area of study- a road out of the job market leading its walker to ruin- and a mysteriously powerful, ever-present force that, for reasons they cannot grok, controls all of culture with its irrelevant nonsense. Right now, Republicans are angry that, cringe and disorganized as the left is, the right could not hold onto a sense of cool for even six months. Remember this cover of New York magazine?

They are angry that the creators of all their favourite products, from Warhammer 40k to Trench Crusade, hate their guts. They are continuously frustrated at some level by the fact that everything they cling to, from Christianity to the American founding myth, was set up by the progressives of those eras. They have about five authors they cycle through relentlessly (Mishima, Evola, Spengler, Schmitt, Nietzsche- sub out for Chesterton depending on religious taste). Animated, but without life as such- needing to suckle on culture that was made for others- made against them- a kind of socially boorish vampire. Perhaps they should listen to the wordcels; they might learn something.


Photos: When the Polar Bears Move In

A polar bear rests in front of an abandoned research station on Kolyuchin Island, off Chukotka in Russia’s Far East, on September 14, 2025. (Vadim Makhorov / AP)

Polar bears stand on the porch of the abandoned research station, on September 14, 2025. (Vadim Makhorov / AP)

An aerial view of the abandoned research station on Kolyuchin Island. (Vadim Makhorov / AP)

Polar bears gather inside part of the abandoned research station. (Vadim Makhorov / AP)

A polar bear peers out from one of the abandoned structures, on September 18, 2025. (Vadim Makhorov / AP)

An aerial view of the abandoned research facility on Kolyuchin Island, seen on September 14, 2025. (Vadim Makhorov / AP)

A polar bear rests near the abandoned structures, on September 18, 2025. (Vadim Makhorov / AP)

A polar bear yawns on the porch of one of the abandoned buildings, on September 18, 2025. (Vadim Makhorov / AP)

Polar bears rest and walk around a decaying structure on Kolyuchin Island, on September 18, 2025. (Vadim Makhorov / AP)

Two of the bears gather on a porch, seen on September 14, 2025. (Vadim Makhorov / AP)

What Are We Going to Do With 300 Billion Pennies?


What, exactly, is the plan for all the pennies?

Many Americans—and many people who, though not American, enjoy watching from a safe distance as predictable fiascoes unfold in this theoretical superpower from week to week—find themselves now pondering one question. What is the United States going to do with all the pennies—all the pennies in take-a-penny-leave-a-penny trays, and cash registers, and couch cushions, and the coin purses of children, and Big Gulp cups full of pennies; all the pennies that are just lying around wherever—following the abrupt announcement that the country is no longer in the penny game and will stop minting them, effective immediately?

The answer appears to be nothing at all. There is no plan.

The U.S. Mint estimates that there are 300 billion pennies in circulation—which, if true, means that the Milky Way galaxy contains about three times more American pennies than stars. How, you ask, could the plan for 300,000,000,000 coins be “nothing”? The Mint, you say, issued a formal press release about striking the final cents. Surely, you insist, that implies some sort of strategy, or at least is evidence of logical human thought and action?

Wow—you are talking like a baby angel raised by puppies in a beachfront palace with no right angles, who has never attempted to wrench useful information out of a government agency’s public-affairs officer. I would give anything to spend 30 narcotic minutes in your gumdrop world. Let me take your round little face between my hands and squeeze it tight as I scream this:

That’s not how things work with pennies.

It is my miserable fate to possess more miscellaneous information about U.S. one-cent coins than, possibly, any other person on this planet. This is not a boast. The information I command is data no one without a neurodevelopmental disorder would ever yearn to know; it is a body of knowledge with no practical use for anyone. I contracted this condition last year, as I spent several months attempting to ascertain why, in the year 2024, one out of every two coins minted in the United States was a one-cent piece, even though virtually no one-cent pieces were ever spent in the nationwide conduction of commerce, and, on top of that, each cost more than three cents apiece to manufacture.

In search of the answer, I interviewed former directors of the Mint, members of Congress, professors of metallurgical engineering and of law, economists, charity workers, multiple manufacturers of those machines that transform regular pennies into souvenir smushed pennies, scrap-metal recyclers, historians, lobbyists, the CEO of Coinstar, coin collectors, sociologists, government auditors, and the paranoid goblins who perform the opaque work of the Federal Reserve. The initial draft of the story I filed for a popular New York City–based publication was 20,000 words long. (Sadly, all of the best parts everyone would have loved were cut by my psychotic editor, whose No. 1 passion in life was removing 13,000 perfect words from my first drafts; I’m not worried about him reading these words, because a low-class butcher like that doesn’t possess enough humanity to subscribe to The Atlantic—though, if you happen to know William, I would thank you not to send him a gift link to this article.) And what I learned was that there was no sane reason why.

[Watch: Death to pennies]

The simplest way to say this is that everyone directly involved in making billions of pennies every year knew that it was pointless to do so, and also thought that it was legally impossible to stop. Specifically, they thought they were bound to make pennies until Congress issued a law ordering them to cease (which, everyone agreed, was unlikely to ever happen). This, I discovered months into my research, did not appear to be true.

I published my theory—that Title 31, Section 5111 of the Code of Laws of the United States of America empowers the secretary of the Treasury to order that no pennies be minted—last September. On a U.S. Mint webpage that appears to have been created earlier this week (simultaneously with the announcement that the final penny coins intended for circulation had been struck in Philadelphia), this long-overlooked section of the U.S. Code is cited as the legal justification for halting penny production.

Another thing I learned daily over the course of my reporting: No one cares about pennies. Finding people to speak with me about them was nearly impossible, even, as in the case of the U.S. Mint’s public-affairs officers, when it was ostensibly their job. (The Mint, which has annually unleashed billions of unwanted and unused pennies upon the nation, gave me the strong impression of being embarrassed to be associated with the coins in any way. Indeed, a retired Mint spokesperson confirmed this.)

There were logical reasons not to care: 300 billion pennies—all of them still and indefinitely legal currency—constitute approximately zero percent of the total money supply of the United States (0.0 percent if rounding to one decimal place). The millions of dollars the government loses by paying more than three cents to manufacture one-cent coins represents an infinitesimal fraction of 1 percent of the government’s several-trillion-dollar budget. And these days, most people barely encounter the coins. According to government reports, the large majority of the pennies that have ever been minted in this country either have undergone what is termed “disappearance” (yes, this is the official wording) or are “sitting” in Americans’ private homes.

The core problem with the pennies turned out to be a mostly psychological horror. In my article, I described this situation as “the dumbest thing I ever heard” and “the Perpetual Penny Paradox”—both terms I stand by a year later.

Most pennies produced by the U.S. Mint are given out as change but never spent; this creates an incessant demand for new pennies to replace them, so that cash transactions that necessitate pennies (i.e., any concluding with a sum whose final digit is 1, 2, 3, 4, 6, 7, 8, or 9) can be settled. Because these replacement pennies will themselves not be spent, they will need to be replaced with new pennies that will also not be spent, and so will have to be replaced with new pennies that will not be spent, which will have to be replaced by new pennies (that will not be spent, and so will have to be replaced). In other words, we keep minting pennies because no one uses the pennies we mint.
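The condition above (a total whose final digit is anything but 0 or 5) is just a divisibility check: a cash transaction can be settled without pennies exactly when the total is a multiple of 5 cents. A trivial sketch (the function name is mine, purely illustrative):

```python
def needs_pennies(total_cents: int) -> bool:
    """True if a cash total cannot be settled without one-cent coins.

    A total requires pennies unless it is a multiple of 5 cents,
    i.e., unless its final digit is 0 or 5.
    """
    return total_cents % 5 != 0
```

So a $1.05 or $1.10 bill can be paid with nickels, dimes, and quarters, while $1.01 through $1.04 cannot.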

But the problem was not solely the theoretical terror of the infinite. These tokens are also a physical burden, adding 2.5 grams of weight apiece to Americans’ cup holders and winter-coat pockets and junk drawers. In a practical sense, penny coins are worthless. (Many people, who do not themselves purchase anything with pennies, assume that very poor people probably use them—disregarding the fact that they have never seen a poor person settling a bill with hundreds of pennies, which would take a very long time to amass, on top of being cumbersome to carry around; the sociologists I interviewed, who study extreme poverty, expressed skepticism that anyone lives off hoards of one-cent coins.) Effectively, they are trash—trash that Americans pay the government (via taxes) to manufacture, at a loss, and then foist back on us; millions of pounds of trash for which we, every time we have ever accepted a penny coin at checkout, have tacitly agreed to provide free private stowage, in perpetuity.

So we’ve stored the pennies—under the floor mats in our RAV4s, in our empty water-cooler jugs. We’ve had to. Mint officials told federal auditors in 2019 that, if even a fraction of the nation’s never-spent pennies were simultaneously spent or cashed in, the deluge of change would be “logistically unmanageable” for the federal government. For one thing, there would likely not be enough space to store them in our nation’s bank vaults.

That’s the first thing I thought of when I read the news that the Mint had produced its last pennies for circulation: What are they going to do about the vaults? I went to the Mint website and read its press release. Then I read through every item on the neatly formatted Penny FAQs page. Then I realized they weren’t going to do anything about the vaults, because there was no plan at all to do anything except stop making pennies.

This isn’t how it usually works when a smoothly running country elects to retire some portion of its currency. Canada, unsurprisingly, provides a seemingly perfect model: When the cost to manufacture Canadian pennies reached 1.6 cents apiece, in 2012, the government announced that it would cease production of the coins and gradually withdraw them from circulation. Simultaneously, the government debuted a robust public-information campaign, explaining to Canadians the logic behind its decision and publishing guidance (including little pictures) for how to round out cash transactions in the absence of pennies. To date, the Canadian Mint has recycled more than 15,000 tons of pennies, redeemed by the public for their face value. Recycling the metal from Canadian pennies (mainly copper and steel) helped offset the cost of trucking billions of unwanted pennies across the nation. And, of course, it kept the coins out of landfills.
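Canada's rounding guidance directs that cash totals be rounded to the nearest 5 cents (1 and 2 round down, 3 and 4 round up, and symmetrically for totals ending in 6 through 9), while non-cash payments stay exact. A minimal sketch of that rule as I understand it (the function name is mine, not taken from the Canadian government's actual materials):

```python
def round_cash_total(total_cents: int) -> int:
    """Round a non-negative cash total to the nearest multiple of 5 cents.

    Adding 2 before integer division sends 1 and 2 down, 3 and 4 up,
    matching the rounding table Canada published when it retired its penny.
    """
    return (total_cents + 2) // 5 * 5
```

So a $1.01 or $1.02 cash bill becomes $1.00, while $1.03 or $1.04 becomes $1.05; totals already ending in 0 or 5 are unchanged.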

[Derek Thompson: The amazing history of the most notorious U.S. coin]

But it’s unclear whether anyone would bother recycling U.S. pennies, which, although copper-plated, are made mostly of zinc. Recycled zinc fetches only about a quarter as much as recycled copper; nearly 1 million tons of copper are recycled in the U.S. each year, versus only about 165,000 tons of zinc. On top of this, a Canadian Mint official told me, copper and zinc are “very hard” to separate.

The good news, which is also very bad news, is that smelting zinc—extracting it from rock—is what a professor from the Colorado School of Mines described to me as “a very unclean, toxic process.” The best-case scenario for recycling U.S. pennies, then, would perhaps be to find industrial manufacturers willing to pay for old penny material in exchange for avoiding the hassle, expense, and danger of harvesting fresh zinc. The worst-case scenario would seem to be that, only a few days ago, we stopped manufacturing billions and billions and billions of hazardously produced zinc disks that have no practical use and are also unsellable as scrap.

Incredibly, the penny’s end seems poised to be more ignominious than even the phrase worst-case scenario might suggest. A scenario is a sketch of a possible future event; worst-case implies some sense of order and codification—that multiple scenarios have been considered, and ranked by degrees of badness. Worst-case scenario implies that someone is thinking, beyond the present moment, about what can and should be done; it implies an intention, or at least a wish, to avoid the worst-case scenario.

That option, like all others, appears to have been removed from the table. The tabletop, in fact, is entirely bare. The government has issued no guidance about how cash transactions calculated to the cent should work in what will soon be a cent-coin-less country (as soon as the final batch of pennies plunges out of circulation—which will happen with shocking speed). “We are not aware of any plans to issue rounding guidance,” Andrew Von Ah, the director of physical infrastructure for the Government Accountability Office, told me on Friday; in 2019, the GAO released a report in which an association representing the nation’s banks especially emphasized the need for public education and rounding guidance “before suspending the penny.” The GAO, Von Ah said, is likewise “not aware of any plan to remove pennies from circulation”—nor indeed “of any plan to mitigate any potential issues with penny suspension.”

So: No one is coming to collect all the useless pennies. And no one is explaining how to get along without them. The government, in other words, is treating the pennies the way it has for decades: by making them Americans’ problem.

But perhaps the most alarming thing Von Ah told me was not about the lack of planning for the penny’s sudden demise. It was about the possibility that our pointless cent might someday rise from the dead. “The Mint could decide to restart production of the penny in the future,” Von Ah said—or was this a warning?—“if it is determined there is a need to do so.”


Hiss
