Socrates ponders his death

I teach religious studies. This means that I get the following question with some frequency: “What happens when we die?” My response is rarely satisfying: “I don’t know. I haven’t died yet.” I don’t say this to brush off a serious question or to be coy. This is the response that is most authentic. To claim to know anything more than this would be to lie, at least for me. Maybe someone out there knows what happens.

Answering with Socrates
I was asked the aforementioned question both this week and last. Incidentally, over the past few days, I reread Plato’s Apology. This is Socrates’ defense of himself before the Athenians who would vote that he was guilty of corrupting the young men of Athens and denying the commonly received gods. When it comes time for sentencing, we find Socrates rejecting an opportunity to go into exile: “But surely, Socrates, after you have left us you can spend the rest of your life in quietly minding your own business” (Tredennick and Tarrant translation, p. 66 [37e]). But Socrates believes he has been doing what is right and what is good when he goes around challenging commonly held assumptions. He speaks of it as if it were a divine calling. To accept a form of exile would be to abandon this mandate and to fail to do what is right and good.

Here we get Socrates’ famous line about the unexamined life (p. 66 [38a]):

“If on the other hand I tell you that to let no day pass without discussing goodness and all the other subjects about which you hear me talking and examining both myself and others is really the very best thing that a man can do, and that life without this sort of examination is not worth living, you will be even less inclined to believe me.”

So, rather than save his life and undermine his own message and mission, Socrates accepts the sentence of the death penalty. He states that unlike the sophists who argue to win, “I would rather die as a result of this defense than live as the result of the other sort.” Also, “…the difficulty is not so much to escape death; the real difficulty is to escape wickedness, which is far more fleet of foot.” These statements (p. 67 [38d-e]) turn into a warning to the Athenians: he, being slow, has been caught by death but has escaped wickedness, while they, being quick, have escaped death but have been caught by wickedness, which is far worse.

But what I want to highlight is what he says about death that is relevant to the questions I’ve received about the afterlife. Socrates warns that to assume that death is an evil is to make a mistaken claim because we don’t know (p. 69 [40b-c]). Then he states (p. 69 [40c]):

“We should reflect that there is much reason to hope for a good result on other grounds as well. Death is one of two things. Either it is annihilation, and the dead have no consciousness of anything; or, as we are told, it is really a change: a migration of the soul from this place to another.”

Socrates proposes it might be like the deepest, dreamless sleep you’ve experienced. How refreshing! “If death is like this, then, I call it gain: because the whole of time, if you look at it in this way, can be regarded as no more than a single night.” In other words, death puts us into a peaceful, everlasting state of rest (p. 69 [40e]). Or, “…on the other hand death is a removal from here to some other place, and if what we are told is true, that all the dead are there, what greater blessing could there be than this, gentlemen of the jury?” He imagines he might spend time philosophizing with the greats, like Hesiod and Homer (p. 69 [40e-41a]).

What About Scary Visions of the Afterlife?
Notably, he entertains no scary or torturous visions of the afterlife. I imagine that this may have something to do with his assumption that goodness is from the gods and that depictions of badness or wickedness coming from the gods should be rejected (see “Euthyphro and Goodness” and “Would Plato approve of children reading the story of ‘Noah’s Ark’?”). Or, it may (also) be that he recognizes himself as a good man. He says (p. 70 [41d]), “…nothing can harm a good man either in life or after death, and his fortunes are not a matter of indifference to the gods.” If we entertain theories of divine evil, or of divine goodness that looks nothing like what we can recognize as goodness, then we must entertain horrific visions of the afterlife. Or we must be confident that we were good people, all things considered. Materialist visions of death, or certain Buddhist ones, or even something like Calvinistic ones seem to be a different discussion altogether.

Socrates and Christ?
I have hope that death isn’t final but I’m aware that it could be. I hope that there’s some sort of continuation of personality after death. But I don’t know what happens. I hope that I can reunite with those I love. I’m not sure how it would work though. Frankly, I don’t even have the faintest idea about what happens. I know what different theologians and religious systems have suggested but I can’t tell you who’s right and who’s wrong.

This confuses some because I claim to be a Christian but I’m a Christian who has drawn a bold line between what I want to happen and what I think I can say will happen. And I’m a Christian in the sense that I try to ask myself, “Do I want to see a world that looks something like what Jesus imagined when he spoke of the ‘Kingdom of God’?” As long as my answer is “yes”, I’ll try to be a Christian. All veneration/worship of Jesus is an attempt to recenter my affections in a world that tries to draw our eyes to power, influence, wealth, etc. That said, the theology and metaphysics of my religion are to me what poetry is: (potentially) beautiful, symbolic speech about things we sense, feel, experience, seek, hope for, etc., but that we can’t explain concretely or logically or scientifically. It’s a categorical error to turn our poetic theology into something scientific and systematic, in my view. If you read about the resurrection of Jesus across the four canonical Gospels—ignoring non-canonical Gospels for this exercise—you’ll find an evolving narration of what can be categorized at best as an “apocalyptic” event. (To call it a “historical” event seems both misaligned with what historians are doing and underwhelming in light of what Christianity has been claiming.) On Easter, I’ll say “he is risen!” but what I think I mean is “I hope what Jesus’ followers experienced after his death is a small window into what might await us after death!” I hope but I don’t know. In other words, I feel greater kinship with Socrates on this matter than with St. Paul. My hopes aren’t the same as epistemic claims. I’ve come to accept that I’ll live with the doubts of Good Friday until the day death comes for me.

Euthyphro and Goodness

This past week, I reread Plato’s Euthyphro. This is the source of the famous “Euthyphro Dilemma” or “Euthyphro Problem”:

In Euthyphro, the question is whether the gods command what is good/holy because those commands are right/just, or whether what the gods command is right/just, and therefore good/holy, because the gods command it. If the former, then it would seem that the gods must submit to a higher standard, meaning that there is something greater than the gods, namely goodness, or justice, etc. If the latter, then what we call good/holy/right is merely a matter of power: we have to do what the gods tell us, and this can be arbitrary; this can change over time. The gods could say “don’t murder” today but then “murder!” tomorrow, and the rightness of it all would be determined by their divine positionality.

According to the polytheist Euthyphro, “…what’s holy is whatever all the gods approve of” (Tredennick and Tarrant translation, p. 20). For the monotheist, no agreement among multiple gods is needed: this uniformity is accomplished by the one god. As the video above presented it though, monotheism doesn’t escape the question: Is something right because God commands it, or does God command it because it’s right? If the former, it seems arbitrary; if the latter, it seems that God is held to a standard greater than God. Most monotheists that I know respond with an argument that goes something like this: “What God commands is good because God’s commands are based on God’s nature, which is inherently good.”

I’m not opposed to this argument but I think it closes the door on more fundamentalist readings of sacred texts. Let me turn to something Socrates says to Euthyphro to explain. Euthyphro is prosecuting his father for murder. Euthyphro claims that what he is doing follows divine commands. He gives the example of how Zeus himself castrated his father Cronus because Cronus “had unjustly swallowed his sons”. In other words, Cronus had done an evil, so Zeus was justified in harming his own father.

The point I want to make has nothing to do with whether in the context of the myth, Zeus was justified. Instead, it’s Socrates’ response that interests me, as I noted above. Socrates says, “whenever someone talks like this about the gods, I find it very difficult to accept” (he says it in the form of a question but it implies his view, p. 14). Socrates finds this depiction of Zeus and Cronus problematic. Recall that recently I wrote a post titled “Would Plato approve of children reading the story of ‘Noah’s Ark’?” where I shared how in The Republic, Socrates says that God must be presented as “good”. Returning to Euthyphro, in response to the story of Zeus and Cronus, Socrates asks, “…do you really believe that these things happened like this?”

I find that if anyone is going to respond to the Euthyphro Dilemma with the claim that God’s commands are good because they come from God’s good nature, then they have to reject many of the Bible’s stories’ theologies (as well as the similar stories of other sacred texts). The response of the apologetically inclined will be to say that God’s goodness is different from ours. Maybe, but to what degree? Humanicide? Or if we are to speak of doctrines like eternal damnation? If God’s goodness includes these horrid acts—acts that none of us would call good if God wasn’t attached to them—then the word “goodness” becomes meaningless, and we should abandon any theologizing.

Socrates pushes Euthyphro to consider whether he understands holiness and divine justice. Euthyphro’s appeal to stories like Zeus castrating Cronus is unconvincing to Socrates because such stories present an inferior depiction of the divine. Socrates’ main goal is to help Euthyphro realize that he doesn’t know what the gods want because he doesn’t understand holiness and its relationship to justice. He needs epistemic humility. Likewise, modern religious people who speak of a good God doing things that we’d clearly define as bad in any other context seem not to understand divine goodness, if such a thing exists. They need epistemic humility. And they should be hesitant to appeal to sacred myths that depict the divine as having a lower standard of good than our own. Our own standard of good may not meet divine standards (because God is so extremely good) but surely it shouldn’t be clearly superior to them either. I would never exterminate almost all of humanity nor would I burn anyone for an extended period of time, let alone for an eternity. Such theologies make God wicked by any standard. And the only recourse is to retreat into arguing that what God says is good is good because God is more powerful than us and God declared it. If this is so, then the dilemma hasn’t been addressed at all.

AI, reading, and the humanities

In a recent episode of the podcast “The Philosopher’s Zone” with David Rutledge titled “AI and Reading”, UMass-Lowell philosophy professor John Kaag was interviewed about a new project of his: “Rebind”. For your convenience, here’s a trailer for the product:

I admit, this sounds kind of interesting and I’d be interested in reading some of the books they have ready with the commentary and chat features, just to see what the experience is like.

In the interview, Kaag addresses a few topics that many of us in education know already. He talks about how learning needs to move away from information dumping and regurgitation. He addresses the problem of the perceived inaccessibility of many of the classic texts. He reminds us that the humanities have been in decline for a while now, so the AI revolution—if that’s what’s happening—can’t be blamed for growing disinterest. He’s positive toward AI. He sees it more as a solution than a problem.

As I listened, a few things came to mind:

  1. Kaag talks about how teaching may need to be a little more personal, a little more 1-to-1. The problem with this suggestion is practical. Class sizes are growing. If your institution—public, public charter, or private—isn’t growing, it either has a strong endowment/tax base or it’s dying. So it’s unlikely that class sizes can shrink to the place where teachers can do the type of 1-to-1 educating that Kaag suggests. We’d have to do a complete overhaul of our current system.
  2. Kaag sees the rise of AI’s significance and necessity as inevitable. My response would be that this is likely true, but I think that the inevitability of AI’s significance can be embraced in a healthy manner and an unhealthy one. Large Language Models (LLMs) scrape the Internet for their data. We humans created that data. If we collectively become too reliant on LLMs, I fear that this will hinder human creativity. Yes, we learn from others. Yes, our learning is the ingestion and realigning of things we learn. But we do this as embodied creatures with agendas, goals, desires, motives, inspirations, imaginations, etc. We do this with an almost endless variety of purposes, as each of us contributes something unique. Our collective “hive-mind” is what it is because of individuality and uniqueness, in part. I don’t see that in LLMs. Will LLMs have less and less truly unique and creative insight upon which to draw if we humans outsource our thinking to computers?
  3. Much of what LLMs produce is akin to what the philosopher Harry Frankfurt calls “bullshit” (neither truth nor lies, both of which indicate intentionality, but just content that is careless about whether it is true or false). See the recent article “ChatGPT is Bullshit” by Michael Townsend Hicks, James Humphries, and Joe Slater in the journal Ethics and Information Technology.
  4. This may mean that there are stages to our developing skills and postures with regard to learning and learning with AI. My gut says that we should try to create a setting where young people have to develop their own independent thinking abilities in preparation for using those thinking abilities to engage what AI has to offer as active contributors and not just passive consumers. In my classes, students read from paper and handwrite on paper. Hypothetically, let’s say that in the freshman/sophomore years, all reading and writing is done this way in class under the supervision of educators. Then, as students become juniors and seniors in high school, preparing for the independence of their college and/or professional years, we focus on teaching them how to take their own original ideas and interface them with AI. An argument could be made that we have them wait until college to do this and that high school focuses completely on reading/writing in a traditional, almost pre-Internet way.

I’m sympathetic to the idea that the humanities need to embrace AI in order to be relevant in the future because this is what current cultural and market forces demand. But I’m hesitant to abandon the boring, laborious parts of learning that lay the groundwork for the human brain because I worry that the quicker we outsource our thinking and creativity to AI, the sooner we’re going to realize we’ve placed ourselves in a collective spin where little creativity, innovation, or new thought can flourish.

Why do I blog?

Blogging may be an outdated form of media. I don’t think it’s dead like, say, MySpace. There remain many popular blogs out there. I presume their readership is mostly Gen X and older Millennials. But even if it isn’t dead, it’s not popular. You don’t start a blog in 2024 if you want to get a message to the masses. You get on TikTok, I presume.

The most “relevant” social media platform with which I engage is Instagram. Facebook is ads mixed with sadness, though it’s how I remain connected to many people. Threads is coming alive but nothing I share seems interesting to the people on there…or the algorithm! “X” is scary. I left that dystopia long ago. I’m not going to touch something like Snapchat. And though I have peers who have done well with TikTok, I’m not interested.

This is because I don’t blog for a big audience. I blog to keep myself writing with frequency. I blog because unlike keeping a private Word document to record my thoughts, occasionally people can read what I write here, enjoy it, share it, and even respond to it. But I don’t look for that sort of response in the same way social media influencers do. It’s more like when blogging first began in the late 2000s and there was the joy of being able to write and be read by a handful of people with similar interests. That was my favorite part of blogging culture and it remains so.

It’s funny because for a long while, I had a blog that was very popular by blog standards. I know these stats don’t match the stats someone might get on YouTube or TikTok, but my most “successful” blog has seen almost 1.5 million views in its lifetime and about half a million unique visitors. There was a day back in 2013 when over seven thousand people visited.

This blog was central to me finding my way when I moved to San Antonio. One person who read it, Greg Richards, directed “College Missions” for the Diocese of West Texas of the Episcopal Church (for whom I work indirectly now). He was my first connection with the denomination that is tied to the school where I work and he was one of the people who wrote me a recommendation when I applied for the job I’ve had for more than eight years now. Another person was Dr. Rubén Dupertuis at Trinity University here in town. He gave me two opportunities to be a “Teaching Intern” which helped my resume. Also, he wrote me a recommendation letter. So, my old blog helped me network somewhere new. This networking helped me find the job that I have now. I’m grateful for that old blog!

I had a few other blogs that started, failed to gain any readership, and/or were closed because I gave up on the themes upon which they were anchored. (For example, when I was on the doorstep of leaving Pentecostalism permanently, I abandoned a blog with the clever name “Azusa Remixed” that tried to gather together Pentecostal and Pentecostal-friendly but also forward-thinking writers to talk about a future for Pentecostalism. When I knew my vision wasn’t going to match reality, and that reality was that I didn’t belong in Pentecostal circles anymore, I shut down the project. On a side note, the old saying that the Internet doesn’t forget isn’t true. If you google “Azusa Remixed” you’ll find nothing about my blog that I can see, though there’s some connection to an anime character!)

I think Twitter was the beginning of the end of blogging supremacy as a novel way to communicate on the still young Internet. Now it’s something older people like me do. My old blog sits there without a new contribution since 2014 but it still gets about four times as many visitors every week as my current one. If you’re a reader of this blog, I’m grateful for you but clearly “readership” in the abstract isn’t my goal. My goal is to process my thoughts through writing. Blogging was the method of writing that has been the most successful at helping me develop consistency. So, because I value the connection between writing and thinking, and blogging helps me maintain that connection, I continue to blog.

Anecdotal evidence about phones in the classroom

I’m not a psychologist or a social scientist. But my own experience in the classroom has made me pay attention to the claims of people like Jonathan Haidt and Jean Twenge. Both have sounded the alarm with regard to adolescent (over)use of smartphones. I’ve confiscated student phones only to have my pocket buzz incessantly. I wondered how anyone could focus with notification after notification from Snapchat, Instagram, and TikTok vying for their attention. I’ve seen my students sit around together but not speak to each other as each stared into their phone. Adults do this sort of thing too, but as Haidt, Twenge, and others have noted: we had a chance to live through our brain’s important developmental stages before getting smartphones. Gen Z didn’t get the opportunity. For this reason, Haidt, Twenge, et al., have argued for causation between smartphone use/addiction and the ongoing mental health crisis we see among America’s youth (for example, see Haidt’s “End the Phone-Based Childhood Now”).

My wife and I have seen the children of parents who raised their kids without smartphones and tablets and of those who allowed them. Our experience told us that there are drastic differences in these kids’ ability to wait, be patient, delay gratification, hold conversations, read books, be creative, and just enjoy being children with imaginations. Our kid won’t have a smartphone or a tablet at their disposal. If they use one at all in daycare or school, we’ll ask for limits. My plan is to keep these technologies out of their lives as long as I can.

For this reason, I was surprised when a recent episode of Freakonomics (“Is Screen Time as Poisonous as We Think?”) interviewed Andrew K. Przybylski of Oxford University who seemed to brush these concerns aside. I think his main point was that phones aren’t the end-all, be-all of Gen Z’s mental health crisis. But as I listened to him, I thought what he was saying didn’t match my experience at all. You see, this year our school went phone free. And I don’t know how many students are going to our student counselor. And I can’t tell you whether they feel happier in general. I can tell you what I see in the classroom though: they’re more focused; they contribute to class conversations more freely; they seem to have more patience when reading; they seem less stressed and distracted; they seem more in the moment. Several of my colleagues have noticed the same thing.

Our school is using Yondr. The kids were not happy about this at the beginning of the year, but more and more are admitting to my colleagues that they kind of enjoy the freedom. Maybe Przybylski would agree that this can be good. Maybe his point has little to do with phones in schools and more to do with the smartphone-mental health causation argument. But a few weeks into this new school year, I think our school’s decision to remove phones has been one of the best ones we’ve made in years. The students seem happier!

Phones weren’t allowed last year, technically. We told the kids to keep them “off and away” during class. They could take them out between classes. This meant that in reality many students still had their phones on their bodies all day. All those notifications grabbed their attention endlessly from their pockets, making them want class to be over so that they could hurry to check their social media. Now my students often lose track of time, as they lack phones and smart watches, and I rarely use computers in my class. Also, I don’t have a clock on my wall. The few students with traditional watches keep time, but quite often it’s clear that they don’t know how much time has passed in class. This has made a huge difference.

I teach at a relatively affluent private school. My experience is limited to one demographic of kids. I don’t want to claim to be diving into the big-picture psychology and social science of adolescents and phones. But for our school, and for my students, the removal of phones has been a gift. As an adult, I’ve noticed that when I spend too much time on social media, I feel worse about things. When I stare at my phone for too long, it’s rarely a good sign. As I try to use my phone and social media less, my brain feels freer, happier. If this is how things are for my forty-two-year-old brain, I can’t imagine that a fourteen- to eighteen-year-old brain doesn’t benefit at least as much from time away from phones and social media. For that reason, as the debate goes forward in universities and research labs, I’m going to go with my experience and root for limiting phone/social media use by young people.

Handwriting is good for the brain

I’ve mentioned a few times that my students handwrite their notes, and their exit tickets, and pretty much everything. The #1 reason for this? I can’t compete with all the tabs open on their Internet browser. When I used to allow computers, engagement seemed impossible. It wasn’t clear that they were listening to me. It was easy to have a classmate send their notes by email or messenger, which could then be copied and pasted. This changed after I banned computers. Students became more likely to participate in class. And even if you copy your classmate’s notes, you have to take time to write them out yourself, which leads to more learning than copying-and-pasting.

This leads me to the #2 reason: I think handwriting is better for learning. I can’t remember the book that I read years ago on this subject—it’s likely outdated and out of print now—but recent science seems to confirm that handwriting notes helps learning stick! For instance, earlier this year the article “Handwriting but not typewriting leads to widespread brain connectivity: a high-density EEG study with implications for the classroom” was published in the journal Frontiers in Psychology, and the authors (F. R. (Ruud) Van der Weel and Audrey L. H. Van der Meer) conclude:

“When writing by hand, brain connectivity patterns were far more elaborate than when typewriting on a keyboard, as shown by widespread theta/alpha connectivity coherence patterns between network hubs and nodes in parietal and central brain regions. Existing literature indicates that connectivity patterns in these brain areas and at such frequencies are crucial for memory formation and for encoding new information and, therefore, are beneficial for learning. Our findings suggest that the spatiotemporal pattern from visual and proprioceptive information obtained through the precisely controlled hand movements when using a pen, contribute extensively to the brain’s connectivity patterns that promote learning.”

I’m not a scientist so I can’t evaluate these findings but PubMed has many other articles that seem to be making the same claim. My experience is anecdotal but even if there weren’t studies like this one that seem to support my hunch, I know I would continue to have my students write by hand because of the difference that I’ve experienced. And because I’m not as interesting as whatever is on tabs 7, 12, 28, and 39.

What do we want for our students? Dispositional growth!

Lately, I’ve been writing a lot about shifts in my thinking regarding what I teach and how I teach it. I tend to be an introspective and retrospective person by nature. This has been supercharged by the news that I received several months ago that by the end of the year, I’ll be a “dad”. I began to wonder, “What kind of education will I want my kid to have when they enter high school?” Also, “If I were to be my child’s teacher someday, what/how would I want to teach them?” These questions haven’t led to a midlife crisis. I enjoy what I’m doing as a teacher. I have no desire to do anything else. But I’ve thought a lot about the future relevance of what I’m teaching currently, and I’ve second-guessed the viability of the field of study to which I’ve dedicated so much of my life, or at least questioned whether I’m interested in the questions I would need to keep asking in order to stay engaged.

Additionally, I’ve reflected on the environment I hope to create in my classroom but also outside my classroom, that is, what place I think education should have within the context of an adolescent’s life. In a recent post, “Homework, rigor, and being the ‘chill’ teacher”, I wrote about how I try to measure the success of my teaching in ways that don’t align naturally with the current modus operandi of American education. What do I want to see? I want to see students learning how to read: to read thoughtfully, carefully, and intentionally. I want them to become accustomed to taking notes and using those notes. I want them to practice putting what they’ve learned into their own words, so that they take ownership of their knowledge rather than thoughtlessly parroting how others say it, or worse, outsourcing their learning to emerging AI or the top Google search results. I plan to help students learn to develop arguments (in the philosophical sense) where they can show their reasoning. I want students to be mentally tired at times in my class, but I want the culture of my classes to be such that when they look back on their experiences, it felt “easy” because what I was trying to teach them became natural to them, and while they were pushed to stretch themselves, they weren’t driven to anxiety. These are ideals, aspirations.

I purchased a book titled The Art of Teaching Philosophy: Reflective Values and Concrete Practices to help me think through these ideals and aspirations. One essay captured what I’ve been feeling.

That first paragraph from David W. Concepción’s essay (pp. 189-196) grabbed my attention. That’s what I’ve been trying to articulate. I want to focus on dispositional growth. And the exercise at the beginning of the essay grasps what I’ve been seeking. What do I want to stick with my students when they reflect on the experience of my classes years after they take them? Those are my “learning objectives”!

I completed the exercise. My gut response is something like this:

  1. I want them to use their knowledge to increase their inward happiness but also the outward good that they will do in the world
  2. I want them to become more thoughtful/self-examined/self-aware
  3. I want them to develop an open posture toward learning
  4. I want them to become more tolerant/less dogmatic
  5. I want them to develop and sharpen their critical thinking skills

Concepción’s chapter addresses concerns that administrators may have that such dispositional goals are immeasurable. He argues convincingly that all learning assessments are “[inferential] through proxy”. Furthermore, he provides guidance for how objectives that matter to philosophers and other teachers under the umbrella of the humanities, such as increased “curiosity, intellectual humility, comfort with ambiguity, and fair-mindedness” (p. 191), can be measured, and the types of assessments/rubrics that would do the job. I won’t spoil the chapter for those who are interested. I will recommend it! I think Concepción is exactly right that our real learning objectives have to do with the type of people we help our students become. The information we provide them is necessary but not sufficient in itself. As C. Thi Nguyen writes in another essay in the book (“On Writing Fun, Joyful, Open-Ended Exams”, pp. 297-303; here, p. 298): “Many of us have come to think that good pedagogy is not just about the transmission of information. It is also about trying to encourage a mindset to foster intellectual virtue.”

This may be uncomfortable to say, especially in our current political environment where teachers are frequently targeted, often accused of “indoctrination” (a rich, though pitiful, accusation, as any teacher who has struggled to get students to complete work or pay attention in class for more than a few minutes is aware), but education includes values as much as (though likely more than) it includes information. If we teachers are honest, we don’t teach because we think we can compete with Wikipedia, Google, or ChatGPT; we teach because we believe we can model and impart intellectual virtues that help our students grow into flourishing humans. As soon as we admit this to ourselves, and articulate it aloud, we’ll begin to see that dispositional goals are the most important goals for most of us.

One more brief comment about homework

As many of us adults realize that working more and more does little for us or our mental health, there has been growing consideration of “right to disconnect” laws. These laws protect you from employers who may expect you to answer emails or do tasks when you’re technically off the clock. I support these types of laws. Though I have workaholic tendencies, mostly because my interests, hobbies, and work overlap (religion, philosophy, education…I do that for a living, but these are some of my primary curiosities in life as well), I do believe that we should “work to live”, not “live to work”.

In my last post (see “Homework, rigor, and being the ‘chill’ teacher”), I explained why I don’t give my students homework. I conceded that homework is likely needed for some areas of study that need day-to-day practice: languages, mathematics, some sciences. And there are some challenging classes into which students self-select like Advanced Placement (AP) classes. But if the day-to-day practice isn’t needed, and students haven’t chosen a more challenging class, then I suggested that homework may not be useful.

I want to add this question: what do we want to teach students about work/life balance? In adulthood, there are those who want certain challenges. There are those who want to work from sunrise to sunset. But that’s their choice (like signing up for AP classes). If, like me, you support “right to disconnect” laws because they allow adults who want to have lives outside of their work to have those lives, then it follows that we should be open to giving high school students some of this same respect.

This may not apply to higher education. We opt into higher education. But high school (and middle and elementary school) is mandatory. We may need to give students required homework to prepare them for college and the workplace, but we also need to consider whether we should show them the same respect we hope to receive as adults who want free time to pursue interests outside the demands of our “produce, produce, produce more” culture.

Homework, rigor, and being the “chill” teacher

Earlier this week, our Head of School shared April Rubin’s Axios article, “Schools rethink homework”, on LinkedIn. I read it because I abandoned giving homework a few years ago. The article discusses the pros and cons of homework. Two primary concerns regarding the giving of homework are (1) the ongoing mental health struggles of America’s youth and (2) the rise of AI, which tempts students to find shortcuts around their learning. Between the risks of burning out our kids and AI’s relativizing of homework’s value, some schools, even whole districts, have abandoned homework completely. California is asking schools to evaluate “the mental and physical health impacts of homework assignments”. I don’t know whether the complete removal of homework is good for our students, but I do think we need to ask ourselves what we think homework accomplishes.

Why I stopped assigning homework
When we returned from the pandemic, it was apparent to me that students struggled to learn at home compared to in school. I knew during the pandemic that many of my students were finding clever ways to check boxes and get the work done, but it was less clear whether they were learning much. My reaction to what the pandemic taught me about teaching high schoolers was (1) to limit the use of computers, because I can’t compete with the attractiveness of dozens of open tabs in my students’ browsers, and (2) to decide that learning with me as their teacher was far superior to asking them to learn by themselves at home. Today, my students do a ton of handwriting. Almost everything is done on paper like it’s 1993. And I tell my students on the day that we go over the syllabus: “I want your commitment for the one hour and fifteen minutes we’re together every day, and then when you leave this classroom, your time is your time; I won’t take any of that from you.” In my estimation, most of my students agree to this bargain and uphold their end of the deal.

My class in the school’s ecosystem
A related reason that I abandoned homework is that I teach religious studies. Don’t misunderstand me: I think that what I teach is as relevant to my students’ education as anything that my colleagues teach. What I don’t think is that they need to spend several hours at home going more in-depth in order to learn what I want them to learn. I could be wrong, but my main goal has been to teach them ways of thinking, even postures toward learning, rather than just information. I teach them how to think about religion but not so much what to think about religion. This is best done in the community of my classroom. If students are curious to learn more about something we discuss in class—and that does happen—there’s no stopping them from learning more at home. But I haven’t found that forcing them to take more work home has ever sparked their curiosity.

One reason that I don’t know if I’m against homework, full stop, is that I don’t teach math, science, languages, or AP (Advanced Placement) classes. Those classes may demand more day-to-day work. To be good at Calculus may require practicing every day. To learn Spanish may require practice every day. Now, if someone goes on to major in religious studies in college, they should be thinking about religious studies every day, but for the purposes of high school religious studies—something most students in schools across America don’t study, and if they do, it’s almost always from a purely confessional vantage point—it seems unnecessary. If my students must have homework, I would rather that they work on their Calculus or Spanish at home. We can talk about Buddhist rituals in class tomorrow!

Is it bad to be the “chill” teacher?
Every semester, students who have already taken my classes tell me, “I miss your class!” Even students who seemed like they weren’t all that engaged. For many, this has to do with what we learned and how we learned it. But I get nervous at times because students will tell me that my class was a “GPA-booster”. (I’m not a difficult grader. Mostly, I grade for effort and work completed. If you show up, put forth effort, and do the work I ask you to do, then you’re going to get most of your grade right there.) I had a student tell me, “Your class was so ‘chill’!” My immediate response was, “Oh no!” Why? Because I know the humanities are often seen as less serious and less rigorous than STEM subjects. And many educators may see religious studies as frivolous or excessive. But when I asked the student to clarify what was meant, I was told that it had more to do with my creating a low-pressure environment where learning was enjoyable. My class didn’t stress them out.

Being the “fun” teacher isn’t always a compliment. But it can be. If students have fun learning, this isn’t bad. If they’re having fun because nothing academic is happening, then that’s a problem. I know from student testimonials and the observations of my colleagues that learning is happening, so I’ve learned to embrace the designation of “chill teacher” since I know what it means now: I’m not burning my students out. In part, I think the decision to ditch homework plays a role.

Do students know how to measure their own learning?
A colleague told me today that he overheard students talking about my classes that they took last year. In adolescent speech, one said something to the effect of “we didn’t have to do much for that class”, to which another student responded, “He did have us take a ton of notes.” I’ve had conversations with my students about how I teach, and I get this sense: students measure classes by (1) how difficult they are and (2) how demanding they are of the student’s time. My classes are neither. Yet students will comment on how much I drive the class and how I use all the time, often finishing right around the bell so that there’s no wasted time. They’ll complain about all the reading and writing when they’re my students, and this is what I emphasize: a lot of reading and a lot of writing. Not long papers. But a lot of note-taking. A lot of shorter writing responses ranging from two, to five, to ten sentences, where I ask them to put what they’re learning into their own (hand-written) words or to consider scenarios where they’d apply what they learned. It’s interesting to me that they find being in my class demanding at the time, even stretching, but also “chill” and, in retrospect, one of the classes that gave them the most room to breathe.

When the road ends without major finals, or AP tests, or something like the SAT, there’s no “score” that helps my students see that they’ve developed a more sophisticated understanding of the complexities of religion or of how to read the types of complicated texts we find in the Bible. For this reason, students don’t see something objective that shows them they’ve changed. That’s something that, as a teacher, I see (and sometimes don’t see) in their writing, in their discussions, etc. That my students go from expressing their frustration with trying to learn difficult, complex ideas to reflecting on my class as one in which they felt comfortable, less stressed, and “chill” is a positive in my eyes. If I can teach them about hermeneutics, ancient history, genres of biblical literature, Hindu cosmologies, Buddhist rituals, the diversity of Judaism, etc., and they turn around and say, “that wasn’t so bad,” that seems positive to me. When they were done, like climbing a hill to see a beautiful sunset, the difficulty faded into the light of their newfound knowledge and their reshaped worldviews. They forget how much their intellectual muscles were strained to get there. I’ll take that all day, every day.