Philosophy instructor, recreational writer, humorless vegetarian.

Saturday Morning Breakfast Cereal - Hierarchy


Hovertext:
No, it doesn't belong in the bodily needs section. If anything, it's in opposition to bodily needs.


istoner
3 hours ago
The most brutally efficient gag panel ever?
Saint Paul, MN, USA
Lythimus
1 hour ago
I think a medspa would have been more apt. Still, *chef's kiss*
Destrehan, LA

How AI Killed a 133-Year-Old Princeton Tradition


In 1876, an editorial in Princeton’s newly founded campus newspaper, The Princetonian, argued against the use of proctors to monitor exams. Proctoring was “a means of bad moral education,” the author wrote. Treat students as presumptively dishonest, and some would become so; treat them as honorable, and they would learn to behave honorably. And so the editorial board suggested a different approach: “Let every man write at the end of his paper a pledge that he has neither given nor received help, and let professors and tutors address themselves to some better business than watching for fraud.”

That proposal was eventually embodied in Princeton’s famous Honor Code, adopted in 1893 and modified only lightly in the ensuing 133 years. When students take their final exams, professors leave the room. Students write down a pledge not to cheat. They are expected to report anyone who does. Any student accused of impropriety comes before a jury of their peers.

The Honor Code had a good run. F. Scott Fitzgerald (who enrolled at Princeton in 1913 but did not graduate) once wrote that violating it “simply doesn’t occur to you, any more than it would occur to you to rifle your roommate’s pocketbook.” The code lasted through two world wars, the upheaval of the 1960s, the disillusionment of Watergate, and even the rise of search engines and SparkNotes. It finally met its match in generative AI. Yesterday, after the rise of AI-facilitated cheating became too obvious to ignore, Princeton’s faculty voted to begin proctoring exams again. Technically, the Honor Code is still in place. Students will still sign a pledge that they didn’t cheat. But now professors will be watching to make sure they’re telling the truth. The Honor Code can’t run on the honor system anymore.

[Rose Horowitch: What an Ivy League education really gets you]

Even at Princeton, obviously, some students have always cheated. Fitzgerald himself was scandalized when, during a campus visit a decade after his time at the university, a member of the football team told him that his roommate knew of unreported Honor Code violations. (Shortly thereafter, a fellow alumnus shared the same suspicion with the famous novelist.) “The implication was that these were many,” Fitzgerald wrote to the dean. Back then, however, academic dishonesty was constrained not only by codes of conduct but by the amount of effort it required. A student who wanted to cheat had to go to the trouble of finding someone who would let them copy their answers.

The internet and the shift to doing work on computers rather than by hand dramatically lowered the barriers to cheating. A study of thousands of students at Rutgers University found that, in 2017, a majority copied their homework answers from the internet. AI has taken that dynamic to new extremes. It can mimic any writing style, produce a unique essay, and add in typos to make it appear human-authored. The available detectors are not foolproof. Studies have consistently found that teachers are worse than they think at detecting AI usage. “It’s a temptation,” Anthony Grafton, a longtime Princeton history professor who retired last year, told me. “I can imagine the student with the devil over his or her left shoulder and the angel over his or her right shoulder.”

Since generative AI became widely available, in fall 2022, Princeton has seen rising academic dishonesty. The Committee on Discipline, which has jurisdiction over take-home assignments, found 82 students responsible for academic violations in the 2024–25 academic year, compared with 50 students in 2021–22. Those are just the students who get caught; the real numbers are undoubtedly much higher. In the school newspaper’s survey of graduating seniors, which 501 students responded to, 30 percent said that they had cheated, 28 percent said that they had used ChatGPT on an assignment when it was not allowed, and 45 percent said that they knew of cheating by a peer and chose not to report it. Michael Laffan, a Princeton history professor, told me that he has sat in coffee shops near campus and watched as students copied responses from ChatGPT and passed them off as their own.

The ease of AI-enabled cheating seems to be imparting a “bad moral education” of its own. Cheating has become more visible, Nadia Makuc, a senior at Princeton and former chair of the Honor Committee, told me. Students post about violating the Honor Code on Fizz, the campus’s anonymous social-media app. That makes students who play by the rules feel like suckers. “There’s an air of people cheating on take-homes and people just using ChatGPT,” Makuc said. “As long as people think there is more cheating, it encourages more cheating.”

[Ian Bogost: College students have already changed forever]

Princeton’s professors are finally trying to reset the system. Proctors are just one component. In the past year, the number of take-home exams at Princeton has declined by more than two-thirds. Next year, the economics department will require its majors to do an oral defense of their research projects, Smita Brunnermeier, the director of undergraduate studies, told me. David Bell, a history professor, has also added oral exams and switched from short take-home papers to in-class writing in blue books. One of his colleagues in the history department requires students to write their papers in Google Docs so that he can review the stages of their composition.

In short, what the 1876 editorial called a “system of suspicion and surveillance” is making a comeback. “It does change something about the student-faculty relationship,” William Aepli, a graduating senior and the former chair of the group that represents students accused of violating the Honor Code, told me. “It’s one thing to have proctoring from the very beginning. It’s another thing to have this tradition of self-proctoring exams and trust that students abide by the Honor Code, and then to take that away.”

Bell told me that AI has made him more wary of his students, and that they can tell. When he changes his assignments to keep them from cheating, they understand that he doesn’t trust them. “Inevitably, all the solutions involve a greater degree of surveillance—that’s the one thing in common,” he said. “Maybe we’ll just have to get used to this new kind of police state of instruction. But I’m not eager to see where this leads.”

Much of higher education’s value rests on the assumption that cheating is an exception, not the rule. A diploma is meaningless if employers and graduate programs can’t trust that graduates learned something in college. Prospective students and their families must believe that their tuition dollars will purchase a good education. And taxpayers need to trust that public-school students are getting something from their four years of subsidized education. Rampant AI use breaks down these signals. “It is bad policy to suspect a man of being a rogue in order to be sure that he is a scholar,” The Princetonian warned in 1876. Perhaps so. But the alternative is even worse.


this comic is inspired by... MY DAD, who thinks it's impossible for anyone to tell - much less me, who has known his voice my entire life - when he uses AI


May 8th, 2026: I saw the best minds of my generation, and they're doing great! They're really having a good time of it and it's so nice to see.

– Ryan


tick tock, motherlover


May 6th, 2026: I may have spoken too soon re: it feeling like spring, since my weather just told me "cooling over the next few days." Why? WHY

– Ryan


Orion is rarely seen like this.



istoner
8 days ago
Wow!

Grieving What AI Has Taken from Learning


“I wonder if these people have ever seen a student’s face when they finally understand something for the first time.”

Jane Sloan Peters, a professor of religious studies and a historical theologian at the University of Mount Saint Vincent, was talking with her students about changes she has made to her teaching to safeguard student learning from artificial intelligence when “a wave of sadness washed over me, and I actually got choked up in front of the class.”

“Before AI,” I told them, “students used to work hard to come up with their own ideas. I’d help, and they’d struggle, but they’d come to something that was their own. That doesn’t happen anymore, and I grieve that.” Then I felt embarrassed and went on teaching as though nothing had happened.

Her reflections on this experience will resonate with many Daily Nous readers. She identifies grief as one of the many feelings she has been having about how AI is altering education.

[Trenton Doyle Hancock, detail of untitled etching from “Bye and Bye”]

She writes:

AI promises great gains, but many educators sense that with its advent, we have lost so much. In this particular instance, students have lost the freedom to sit comfortably in a space of silence and uncertainty, a space as dark and rich as the spring soil in which seedlings are born. And I have lost the joy of sitting with them, encouraging them, watching as their thoughts take root and grow…

The [American Psychological Association] defines grief as “the anguish experienced after significant loss, usually the death of a beloved person.” We are witnessing the death not of a beloved person, but of love as the grounds of education. Love is the heart of a liberal education—a love of the truth, as well as the kind of friendship-love (philia) between teachers and students that makes it possible to pursue the truth together.

AI sycophants would have you believe that teachers like me are simply scared of a new system that will expose their personal deficiencies and outdated pedagogical methods—intractability about AI is ultimately self-interested self-preservation. Dear teacher, you are not fooling anyone but yourself—those lecture notes belong in the bin.

I wonder if these people have ever seen a student’s face when they finally understand something for the first time. What’s more, I wonder if they’ve ever seen a student experience the unique delight in not knowing. Certainly, the unknown can produce confusion and frustration. But sometimes I see in students’ faces a flash of something like the relish of a traveler who knows the journey ahead will be just as delightful as the destination. 

Professor Peters says that this “delight in not knowing” is something that’s especially valuable to cultivate in students studying theology, and I imagine many Daily Nous readers think the same is true of philosophy.

At the end of her piece, which you can read in its entirety here, she says: “despite being aware of the real losses in education, I still believe the love and wonder at the heart of education can be salvaged, somehow. The question is, how?”

(via Zena Hitz)

The post Grieving What AI Has Taken from Learning first appeared on Daily Nous.

istoner
13 days ago
I am not optimistic that higher ed in anything like its current form can be saved. I would definitely call the feeling grief, with an unhealthy dollop of rage. Students in my classes used to learn something, and I took some life satisfaction from believing that I had made some small contribution to the path of their lives. Their lives might be slightly richer and more rewarding for practicing some basic philosophical skills, grappling with texts that they struggled at the outset to understand, having their minds blown, at least for a moment or two, and so on. None of that happens when all of their interactions with texts, each other, and me are mediated by AI. Without a ban on tech during college studies, I don't see that changing. And higher-ed administrators will continue to believe AI is a boon that supercharges learning until the entire system has been burned to ashes.