Tweaks to how I’ve been teaching the Bible

As readers of this blog are aware, one of my great frustrations over the past several years has been my inability to find a satisfying way to teach biblical studies in a high school setting. Comparative religion? Check! Theory of religion? Check! American religion? Check! Even philosophy? Check! All these topics have been doable; not perfect, always, but doable! But most of my classes on the Bible have been frustrating. They’ve been the hardest when it comes to maintaining attention, managing my classroom, creating discussion, etc.

The harsh feedback of one student last year was something like this: “This class goes too deep; it covers too much”. We used to offer up to two semesters’ worth. The last versions of these classes were known as “The Hebrew Scriptures” and “The Christian Scriptures”. Loosely, they covered the Tanakh/Old Testament and then the New Testament, dabbling a little in non-canonical literature.

The decision was made to streamline the religious studies catalog going into this year. This included creating a standard class that all students must take as part of their religious studies credit (“Philosophy for Human Flourishing”). And it meant that we’d have a single semester offering of the Bible (“Introduction to the Bible”).

I’m only a quarter of a year into it, so maybe I’m speaking too soon, but I think this was the correct decision. Here are some of the changes that have occurred in how I teach the Bible now that I’ve got half the time to do it in:

  1. I spend a unit talking about how we got the Bible: ancient writing, scribal culture, the role of the printing press in standardizing and democratizing access to the Bible, and how we get modern English translations. This was received surprisingly well.
  2. I focus on the basic basics. I mean, I essentially outline the Bible from Abraham, to Moses, to David, to Jesus. I don’t assume any biblical literacy going in. This is wise. I’ve noticed a steep decline in biblical literacy. I assume no pre-knowledge and explain everything like it’s the first time my students are hearing it. Those with some pre-knowledge are able to contribute by asking questions and making observations that thicken the class discussion.
  3. I focus on the canon. While I do explain non-canonical literature, most students in high school taking a class on the Bible (in a private school in Texas) want to study the Bible, not early Jewish and Christian literature in the abstract. I’ll miss reading the Infancy Gospel of Thomas with my students, but I do think most people who enjoy non-canonical literature do so because of their familiarity with canonical literature.
  4. I’ve moved away from deep hermeneutical theory. Now, I will say that for many of my students, the hard work of hermeneutics was the most transformative part of my class. Students may have hated going through the lessons on how we read the Bible in an academic context, but they often expressed that this is where they learned the most. On the other hand, some students struggled and shut down during those lessons, which, for better or worse, I fronted my semesters with. Also, for those for whom the Bible is such a sacred object that they’re almost afraid to read it (such actions should be left to a priest or pastor, right?), those lessons could cause them to become defensive. This isn’t to say that I’ve moved away from reading the Bible academically. But instead of explaining how this is done, I just try to model it for them.
  5. I’m more open to my students exploring the Bible as an object of their faith. I think I often taught to the teenager I was, and not to the ones I had in the room with me. I needed someone to deconstruct certain fundamentalisms for me. The toxic presentation of the Bible that I experienced in my youth and college years needed fixin’. And I think I tried to introduce my students to academic biblical studies in order to preemptively help them avoid some of the pitfalls of fundamentalist hermeneutics. I still try to be the teacher who gets my students to think about historical, cultural, and other contextual matters; I still try to help them see the challenges of interpretation. But I’m not teaching teenage me. My audience is different, and I think they come to class with a healthier relationship to the Bible, maybe because they haven’t been force-fed it. They want to understand the text, and for many, maybe most, this is not because they want to study the Bible academically, but because they want a basic understanding of the sacred text of their faith. So, I’m trying to be more accommodating to that interest.
  6. I’ve gone back to physical Bibles. I used to print out excerpts. But I think there’s something about holding a book that leads us to take the act of reading more seriously, especially a book like the Bible.
  7. The final thing is outside of my control: class size. Usually, my classes are 20+. I know my public and Catholic school colleagues are probably thinking, “cry me a river,” but 20+ is a lot. This semester, my “Introduction to the Bible” classes are 11 and 14, and next semester, 14 and 20. The smaller numbers have made the classes more conducive to reading a text closely as a group.

We’ll see if these changes continue to have a positive impact, but I will say that even as I’ve watered down the academic side of things, a lot, I’m having more fun teaching the Bible than I’ve had in years!

In defense of “tryhards”

This morning, I saw a post on Threads from user @matt_dean94 that said, “Anti-intellectualism is so ingrained in western culture that we have expressions like ‘know-it-all’, ‘smarty pants’, ‘geek’, ‘nerd’ etc. to insult the clever kids at school”. It reminded me of something I’ve heard in my classroom the past couple of years. Students have been using the insult “tryhard” to dismiss kids who put in effort. “Tryhard” is a pejorative that suggests that someone puts in too much effort or cares too much. Yes, it can be used for someone who is a “poser” (though I think there’s another discussion to be had as to whether being a poser is a bad thing) but often I’ve heard it used against students who just want to do well.

What’s the alternative to being a “tryhard”?
A few weeks ago, I addressed the use of “tryhard” when I heard it. I asked this question: What’s the alternative? Even if you’re “naturally” intelligent—whatever that may mean—to call another kid a “tryhard” is to admit that you intentionally underperform. What would be the rationale for that other than fear of failing? It seems unlikely that there is one. When we don’t try on purpose, it’s so that when we do fail we can tell ourselves, “Well, of course I failed, because I didn’t try.” To try and to fail can be devastating to our pride. In other words, to be something other than a “tryhard” is a defense mechanism used by someone who is afraid to take a risk; to put themselves out there; to fail.

To call someone else a “tryhard” is to brag that whatever success one has as a non-tryhard is handed to them by parents, teachers, their school, their college counselors, etc. It’s to admit that we’re willing to be carried by others and then take credit for it later. It’s to say that we got lucky enough to be part of systems—familial, educational, etc.—that can guarantee our success. It would appear to me that while being called a “tryhard” is supposed to be a criticism, to be a non-tryhard can’t be anything other than a criticism. It’s like bragging about nepotism. (And yes, sadly, I recognize that we do live in a world where people brag about benefitting from nepotism.)

Gary Plummer, or someone
If my memory serves me correctly, when I was about 12 years old, I was watching the 49ers on TV and there was a player being interviewed. I think the player was Gary Plummer, whose last game was in 1994. Again, I can’t claim that this is what I heard with precision, but it’s an impression that’s stuck with me for decades and that I’ve long attributed to Plummer. The player said something like, “Thank you to all the players who had so much more talent than I did but who didn’t put in the work.” In essence, the player was saying that he wasn’t the most naturally talented but that he was willing to outwork his peers. That willingness to put in the effort paid off and gave him opportunities he wouldn’t have had otherwise. That stuck with me and my mind has recalled it many times over the years. Plummer, or the player who I’m associating with Plummer, was a “tryhard” and it gave him a full NFL career.

Now, I don’t want to be heard as promoting “hustle culture” or being a workaholic. I don’t advocate for those mindsets. But there’s nothing wrong with wanting to do well. There’s nothing wrong with wanting to give one’s best. There’s nothing wrong with knowing you gave your all, put in full effort, and gave yourself a chance to fully experience something. For this reason, being a “tryhard” is a compliment, not a criticism.

Trying hard doesn’t guarantee success…but that’s not the point
Being a “tryhard” doesn’t guarantee success. Our world is full of ignorant and incompetent people who are in positions of authority or have become wealthy. (This is why Plato warned that if wise people don’t govern, they’re guaranteed to be ruled over by fools!) Life is too complicated for all the good things to go to those who put in the effort. This is why meritocracy can be such a misleading ideology that lets people down. (A great book on that topic is Michael J. Sandel’s The Tyranny of Merit.) But something I want my students to understand, and something I’ll teach my son (who was born almost two weeks ago and is doing great!), is that all you can control is your integrity. All you can control is knowing that you gave your best and stayed true to your values; that you lived fully in every context and gave yourself the opportunity to discover what you may enjoy in life. Yes, it’s true that you may hate Algebra no matter how hard you try…but you could end up loving Algebra, and you’ll miss every single one of these potential experiences if you preemptively decide to save face by refusing to become vulnerable enough to find out what you truly enjoy in life. Youth is when you have this opportunity. It fades with adulthood. So, be a “tryhard” while you can. You never know what a little effort will help you discover about yourself and the world.

Song of the Day

This week I was talking to one of our seniors who took my classes when she was either a freshman or a sophomore. She told me that those were difficult years for her but then she shared something that made my day. She told me that on the days that she had my class she would brighten up a bit because she thought, “I wonder what song will be played in LePort’s class today?” She’s referring to a daily tradition of mine that I derived from my friend and mentor Ruben Dupertuis. I begin each class with a “Song of the Day” that plays toward the end of the passing period between classes, as students enter my classroom. The song is connected in some way to that day’s lesson content. It could be the artist, the song title, the album cover/music video, or an excerpt from the lyrics displayed on the screen.

I’ve turned “Song of the Day” into a daily extra credit (“Bonus Point”) opportunity where I allow up to five students to try to tell me what the connection is. It functions as a fun pregame show, if you will, for the lesson’s content. But it’s also a culture builder. It creates a warmth to the classroom as they enter. Or, at least that’s what I hope it creates! And I think for many it works to get them thinking about what we’re about to learn as they listen to their peers try to bridge the gap between the song and the title of the lesson written on a whiteboard.

For the aforementioned student, it was just the idea of a class beginning with music that brightened her day. I don’t think my pedagogical goals were being accomplished, given all that she was experiencing, but as I’ve learned over the past eight years as a high school teacher, your main priority is helping young people become adults. What you teach does matter. I don’t want to downplay that at all. The subject-matter matters! But your goal in high school is different from that of a college professor teaching students who happen to be majoring in your field of expertise. Most of your students won’t go on to become the same type of professional that you are. (So far, only one of the several hundred students that I’ve taught has gone on to seminary. Two others minored in religious studies and another minored in philosophy. I could be wrong but I think that’s the extent of the students who have gone on to focus on the type of content that I teach once they graduated.) They will become contributors to our society, which I hope will remain a functioning democracy. Sometimes the best you can do is help them continue forward through their rough patches. That may mean that your classroom feels like a place where they can be happy during unhappy days.

But there’s a pedagogical method to the madness as well. Music helps our brain make connections and memories. I’ve had students walk past my classroom and tell me, “I remember that song and we talked about…”! They don’t remember all the details but they have some retention. I was never one to memorize Bible passages, or lines from plays, etc., but what I do remember all the way from childhood was information that was connected to music (for example, I can tell you the “fruits of the spirit” because I was taught it to music as a kid).

On a final note, I don’t think teachers need to start each class with music but I do wonder how much music could improve a class. I imagine teaching modern American history and dropping certain songs into different lessons that were important at the time. My guess is that this would enliven any class but also tie the content to music which should help students remember what was taught a little better!

AI in the/my classroom

The use of Artificial Intelligence (AI) in the classroom is something that all faculties, from elementary to graduate school, need to address. Last week our upper school faculty broke into groups to do just this. It seemed fruitful but nowhere near final. I’ll admit that I’m something of an AI-skeptic. I won’t pretend that I understand how it all works but I do try to read articles and listen to podcast episodes where experts address the rapid changes that we’re seeing. To the best of my ability, I’ve formulated an opinion not so much on whether AI should be used in the classroom but whether it should be used in my classroom. I want to put those thoughts down somewhere, so here we go.

What do we mean by “AI”?
One problem with this discussion is that everything seems to be “AI” now. As one podcast I was listening to pointed out: AI has become a marketing label. It’s useful for gaining venture capital. It’s helpful for selling your product. AI means so many different things (does Word use AI? Grammarly? ChatGPT? and are these products all doing the same thing?) that a broad acceptance or denouncement is impossible. (I’m sure it’s linked below but I can’t remember which one of the podcasts this point is from!) Personally, I’m most concerned with “Large Language Models” or “LLMs”.

Is AI’s relevance the same for all subjects?
One thing I noticed during our faculty discussion is that my colleagues who teach in our “English” or “Social and Religious Studies” departments emphasized the dangers of AI while my colleagues who teach STEM topics emphasized the benefits. The educational goals of the humanities stand in tension with many of the educational goals of STEM. I’ve noticed that many STEM teachers are prone to celebrate what humans can do with new scientific discoveries and technological advances whereas many humanities teachers tend to sound the alarm with regard to what these discoveries and advances might do to our humanity. (On this note, I highly recommend Scott Stephens and Shannon Vallor’s discussion: “What is AI doing to our humanity?”) This isn’t always the case. Some people involved in the humanities are convinced that the humanities need to embrace things like AI (e.g. “AI, reading, and the humanities”). They may be correct, though as I’ll discuss below, I think the answer to the question of “Is AI good for us?” depends on the context in which it’s being asked.

Again, I return to my favorite “Jurassic Park” meme to explain how humanities teachers often feel about what’s happening in the world of STEM:

In a recent interview with Sean Illing (see “Yuval Noah Harari in the eclipsing of human intelligence”), Yuval Noah Harari talked about his new book Nexus: A Brief History of Information Networks from the Stone Age to AI. He frames history around information networks. Harari isn’t an alarmist but he’s concerned about the impact of AI (one information network) on democracy (another information network). This goes beyond Russian spam bots on X/Twitter and other social media. If someone like Harari is sounding the alarm, we should listen. The more we teach our students to outsource their own thinking to AI systems, or even Google search results, the less we should be surprised when we’re surrounded by people who are easily manipulated by technology for the simple reason that it’s technology!

For reasons like this, I won’t speak to what my colleagues in mathematics or the sciences are doing. I will say that those of us who teach students to read, write, philosophize, theologize, engage in politics, compile history, create art, etc., should be very concerned about what AI could do to our students’ brains.

Is AI’s dominance inevitable?
Another argument I heard for using AI in the classroom goes something like this: the dominance of AI is inevitable, it’s the future, so we better spend time teaching students how to use it. I’m not so sure that I’m convinced that this is true. One book that I want to read soon is AI Snake Oil: What Artificial Intelligence Can Do, What It Can’t, and How to Tell the Difference. One of the authors, Prof. Arvind Narayanan of Princeton University, was interviewed by Anthony Funnell (see “AI snake oil—its limits, risks, and its thirst for resources”), and I came away from listening to that interview wondering if many of us are buying into the marketing campaigns of the Elon Musks and Sam Altmans of the world who hope to continue making a profit off of convincing us that they can see the future. Musk has been promising self-driving Teslas for a while now, and we know that hasn’t been going well, but if Musk or Altman tells investors and consumers that they don’t know if and when the technology will mature, they’ll lose those investors and consumers. It’s important for them to convince us that we’re missing the train to the future and that they’re driving it!

Does AI need to be paired with maturity?
Let’s concede that AI’s dominance is inevitable, for the sake of argument. This doesn’t automatically answer whether or not students should use these tools in our classrooms. There are many things that may be inevitable for our students when they’re older. I would be shocked to see a third grade teacher putting a kid behind the wheel of a car because driving is inevitable! Similarly, if students haven’t learned how to read, write, analyze, etc., yet, it’s educational malpractice to emphasize tools for which they’re not ready!

There are stages of our development when handwriting is really good for students (see “Handwriting is good for the brain”). There are stages of development when less is more with regard to technology use and accessibility (see “Anecdotal evidence about phones in the classroom”). And I think there are stages in our development when once we’ve learned the basic skills that the humanities teach us, we may be ready for using AI. Personally, I’m happy for my students to wait until college and I’m satisfied with punting to the colleges and universities that have way more resources for dealing with student use of AI. When kids go to college, they have to make all sorts of decisions about how they spend their time, who they spend it with, etc., that we don’t ask them to make in high school.

I’ve heard some compare hesitancy to embrace AI with hesitancy to embrace the Internet in the 1990s. I don’t think this is the same thing but I do think that such a claim makes an unintentional observation. All of us wish we would’ve known how the Internet would be weaponized for things like misinformation, bullying, algorithms that feed on anger, etc. If we could go back and prepare ourselves for the ugly side of Internet use, we would. This is my warning! We know that LLMs bullshit (see “ChatGPT is Bullshit” by Michael Townsend Hicks, et al., and “Are LLMs Natural Born Bullshitters” by Anand Jayprakash Vaidya). They don’t know any better. If we don’t try to help our students develop skeptical thinking skills (see below), we’re feeding them to AI systems that have no way of caring whether or not what is being said is true or false. As J. Aaron Simmons has written about bullshitters (see “I’d Rather Be a Liar”):

“In contrast to the liar, the bullshitter doesn’t even care about truth at all. They are not intending to deceive their audience, but rather the bullshitter attempts to motivate behavior in their audience that supports their own self-interest.”

Systems like ChatGPT have one “goal”: engagement. They’re not concerned with truth, as Vaidya wrote in the article linked above:

“Research suggests that LLMs, left to their own devices, are natural-born bullshitters. The tendency for LLMs to hallucinate has only been reduced through reinforcement learning from human feedback. Without human intervention, they appear to lack the ability to control or reduce their hallucinations through training unaided by humans. Even if their hallucination rate is low, it might be that they have a fundamental disposition to bullshit as a result of the fact that they think* as opposed to think as well as care* as opposed to care for the truth.”

In other words, whatever seems “human” about LLMs is because we humans remain involved. One analogy Vaidya gives is helpful. He writes, “Just as we can say a car ‘runs’, when it is clear to everyone that the underlying mechanics of a functioning car and a running animal are fundamentally different, we can also apply words like ‘think’, ‘assert’, ‘understand’, and ‘know’ to LLMs without losing sight of the underlying mechanical and structural differences. Mental life need not be human mental life to be mental life.” Hence, the asterisks next to “think” and “care” in the above quote. LLMs “think” and “care” like us only in the sense that cars “run” like us.

Creating Skeptical Thinkers/Avoiding AI’s “Mirror”
Personally, I don’t think many adolescents are ready to discern what bullshitters like ChatGPT are feeding them. This means that those of us who are fighting for the future of the humanities need to be very intentional in teaching our students to be skeptical thinkers. What do I mean by this? Well, I mean something like what Prof. Jamil Zaki of Stanford University calls “hopeful skepticism” which he contrasts with cynicism:

“…hopeful skepticism is about applying a scientific mindset. Like a scientist, hopeful skeptics seek out facts and evidence instead of relying on feelings and fears. And rather than being fatalistic, they are critical and curious instead.”

We need to teach students to have a skeptical mindset that doesn’t just accept things at face value but, again, seeks “out facts and evidence” and is “critical and curious”. I can use ChatGPT this way. I can use Google search results this way. But my students could become easily susceptible to just embracing whatever ChatGPT or Google feeds them. If we don’t prepare them for this (which may mean walking them through the use of LLMs in our classes but doesn’t necessitate making that jump), we’ll be in trouble as a society. We’ll face a future where LLMs, like dogs returning to their vomit, consume AI-generated information so that the cycle of information is AI feeding AI feeding AI. As Shannon Vallor argues in (another book I need to read) The AI Mirror, “today’s powerful AI technologies reproduce the past”. They reflect past, cumulative human knowledge (see the interview already linked above: “What is AI doing to our humanity?”). Whether they can create new knowledge is to be determined, but we shouldn’t outsource the creativity of the human brain to AI any more than we should start talking to someone’s reflection in a mirror while ignoring the person/people being reflected. When it comes to thinking, we’re still superior.

AI, reading, and the humanities

In a recent episode of the podcast “The Philosopher’s Zone” with David Rutledge titled “AI and Reading”, UMass-Lowell philosophy professor John Kaag was interviewed about a new project of his: “Rebind”. For your convenience, here’s a trailer for the product:

I admit, this sounds kind of intriguing, and I’d be interested in reading some of the books they have ready with the commentary and chat features, just to see what the experience is like.

In the interview, Kaag addresses a few topics that many of us in education know already. He talks about how learning needs to move away from information dumping and regurgitation. He addresses the problem of the perceived inaccessibility of many of the classic texts. He reminds us that the humanities have been in decline for a while now, so the AI revolution—if that’s what’s happening—can’t be blamed for growing disinterest. He’s positive toward AI. He sees it more as a solution than a problem.

As I listened, a few things came to mind:

  1. Kaag talks about how teaching may need to be a little more personal, a little more 1-to-1. The problem with this suggestion is practical. Class sizes are growing. If your institution—public, public charter, or private—isn’t growing, it either has a strong endowment/tax base or it’s dying. So it’s unlikely that class sizes can shrink to the place where teachers can do the type of 1-to-1 educating that Kaag suggests. We’d have to do a complete overhaul of our current system.
  2. Kaag sees the rise of AI’s significance and necessity as inevitable. My response would be that this is likely true, but I think that the inevitability of AI’s significance can be embraced in a healthy manner or an unhealthy one. Large Language Models (LLMs) scrape the Internet for their data. We humans created that data. If we collectively become too reliant on LLMs, I fear that this will hinder human creativity. Yes, we learn from others. Yes, our learning is the ingestion and realigning of things we learn. But we do this as embodied creatures with agendas, goals, desires, motives, inspirations, imaginations, etc. We do this with an almost endless variety of purposes, as each of us contributes something unique. Our collective “hive-mind” is what it is because of individuality and uniqueness, in part. I don’t see that in LLMs. Will LLMs have less and less truly unique and creative insight upon which to draw if we humans outsource our thinking to computers?
  3. Much of what LLMs produce is akin to what the philosopher Harry Frankfurt calls “bullshit” (neither truth nor lies, both of which indicate intentionality, but just content that is careless about whether it is true or false). See the recent article “ChatGPT is Bullshit” by Michael Townsend Hicks, James Humphries, and Joe Slater for the journal Ethics and Information Technology.
  4. This may mean that there are stages to our developing skills and postures with regard to learning and learning with AI. My gut says that we should try to create a setting where young people have to develop their own independent thinking abilities in preparation for using those thinking abilities to engage what AI has to offer as active contributors and not just passive consumers. In my classes, students read from paper and handwrite on paper. Hypothetically, let’s say that in your freshman/sophomore years, all reading and writing is done this way in class under the supervision of educators. Then, as students become juniors and seniors in high school, preparing for the independence of their college and/or professional years, we could focus on teaching them how to take their own original ideas and interface them with AI. An argument could be made that we should have them wait until college to do this and that high school should focus completely on reading/writing in a traditional, almost pre-Internet way.

I’m sympathetic to the idea that the humanities need to embrace AI in order to be relevant in the future because this is what current cultural and market forces demand. But I’m hesitant to abandon the boring, laborious parts of learning that lay the groundwork for the human brain because I worry that the quicker we outsource our thinking and creativity to AI, the sooner we’re going to realize we’ve placed ourselves in a collective spin where little creativity, innovation, or new thought can flourish.

Anecdotal evidence about phones in the classroom

I’m not a psychologist or a social scientist. But my own experience in the classroom has made me pay attention to the claims of people like Jonathan Haidt and Jean Twenge. Both have sounded the alarm with regard to adolescent (over)use of smartphones. I’ve confiscated student phones only to have my pocket buzz incessantly. I wondered how anyone could focus with notification after notification from Snapchat, Instagram, and TikTok vying for their attention. I’ve seen my students sit around together but not speaking to each other as each stared into their phone. Adults do this sort of thing too, but as Haidt, Twenge, and others have noted: we had a chance to live through our brain’s important developmental stages before getting smartphones. Gen Z didn’t get the opportunity. For this reason, Haidt, Twenge, et al., have argued for causation between smartphone use/addiction and the ongoing mental health crisis we see among America’s youth (for example, see Haidt’s “End the Phone-Based Childhood Now”).

My wife and I have seen the children of parents who raised their kids without smartphones and tablets and the children of those who allowed them. Our experience told us that there are drastic differences in these kids’ ability to wait, be patient, delay gratification, hold conversations, read books, be creative, and just enjoy being children with imaginations. Our kid won’t have a smartphone or a tablet at their disposal. If they use them at all in daycare or school, we’ll ask for limits. My plan is to keep these technologies out of their lives as long as I can.

For this reason, I was surprised when a recent episode of Freakonomics (“Is Screen Time as Poisonous as We Think?”) interviewed Andrew K. Przybylski of Oxford University, who seemed to brush these concerns aside. I think his main point was that phones aren’t the end-all, be-all of Gen Z’s mental health crisis. But as I listened to him, I thought what he was saying didn’t match my experience at all. You see, this year our school went phone-free. And I don’t know how many students are going to our student counselor. And I can’t tell you whether they feel happier in general. I can tell you what I see in the classroom though: they’re more focused; they contribute to class conversations more freely; they seem to have more patience when reading; they seem less stressed and distracted; they seem more in the moment. Several of my colleagues have noticed the same thing.

Our school is using Yondr. The kids were not happy about this at the beginning of the year, but more and more are admitting to my colleagues that they kind of enjoy the freedom. Maybe Przybylski would agree that this can be good. Maybe his point has little to do with phones in schools and more to do with the smartphone-mental health causation argument. But a few weeks into this new school year, I think our school’s decision to remove phones has been one of the best ones we’ve made in years. The students seem happier!

Phones weren’t allowed last year, technically. We told the kids to keep them “off and away” during class. They could take them out between classes. This meant that in reality many students still had their phones on their bodies all day. All those notifications kept grabbing their attention from their pockets, making them want class to be over so that they could hurry to check their social media. Now my students often lose track of time as they lack phones and smartwatches, and I rarely use computers in my class. Also, I don’t have a clock on my wall. The few students with traditional watches keep time, but quite often it’s clear that the rest don’t know how much time has passed in class. This has made a huge difference.

I teach at a relatively affluent private school. My experience is limited to one demographic of kids. I don’t want to claim to be diving into the big-picture psychology and social science of adolescents and phones. But for our school, and for my students, the removal of phones has been a gift. As an adult, I’ve noticed that when I spend too much time on social media, I feel worse about things. When I stare at my phone for too long, it’s rarely a good sign. As I try to use my phone and social media less, my brain feels freer, happier. If this is how things are for my forty-two-year-old brain, I can’t imagine that a fourteen- to eighteen-year-old brain doesn’t benefit at least as much from time away from phones and social media. For that reason, as the debate goes forward in universities and research labs, I’m going to go with my experience and root for limiting phone/social media use by young people.

Handwriting is good for the brain

I’ve mentioned a few times that my students handwrite their notes, and their exit tickets, and pretty much everything. The #1 reason for this? I can’t compete with all the tabs open on their Internet browser. When I used to allow computers, engagement seemed impossible. It wasn’t clear that they were listening to me. It was easy to have a classmate send their notes by email or messenger, which could then be copied and pasted. This changed after I banned computers. Students became more likely to participate in class. And even if you copy your classmate’s notes, you have to take time to write them out yourself, which leads to more learning than copying-and-pasting.

This leads me to the #2 reason: I think handwriting is better for learning. I can’t remember the book that I read years ago on this subject—it’s likely outdated and out of print now—but recent science seems to confirm that handwriting notes helps learning stick! For instance, earlier this year the article “Handwriting but not typewriting leads to widespread brain connectivity: a high-density EEG study with implications for the classroom” was published in the journal Frontiers in Psychology, and the authors (F. R. (Ruud) Van der Weel and Audrey L. H. Van der Meer) conclude:

“When writing by hand, brain connectivity patterns were far more elaborate than when typewriting on a keyboard, as shown by widespread theta/alpha connectivity coherence patterns between network hubs and nodes in parietal and central brain regions. Existing literature indicates that connectivity patterns in these brain areas and at such frequencies are crucial for memory formation and for encoding new information and, therefore, are beneficial for learning. Our findings suggest that the spatiotemporal pattern from visual and proprioceptive information obtained through the precisely controlled hand movements when using a pen, contribute extensively to the brain’s connectivity patterns that promote learning.”

I’m not a scientist so I can’t evaluate these findings but PubMed has many other articles that seem to be making the same claim. My experience is anecdotal but even if there weren’t studies like this one that seem to support my hunch, I know I would continue to have my students write by hand because of the difference that I’ve experienced. And because I’m not as interesting as whatever is on tabs 7, 12, 28, and 39.

What do we want for our students? Dispositional growth!

Lately, I’ve been writing a lot about shifts in my thinking regarding what I teach and how I teach it. I tend to be an introspective and retrospective person by nature. This has been supercharged by the news that I received several months ago that by the end of the year, I’ll be a “dad”. I began to wonder, “What kind of education will I want my kid to have when they enter high school?” Also, “If I were to be my child’s teacher someday, what/how would I want to teach them?” These questions haven’t led to a midlife crisis. I enjoy what I’m doing as a teacher. I have no desire to do anything else. But I’ve thought a lot about the future relevance of what I’m teaching currently, and I’ve second-guessed the viability of the field of study to which I’ve dedicated so much of my life, or at least whether I’m still interested in the questions I would need to keep asking in order to stay engaged.

Additionally, I’ve reflected on the environment I hope to create in my classroom but also outside my classroom, as in, what place I think education has within the context of an adolescent’s life. In a recent post, “Homework, rigor, and being the ‘chill’ teacher”, I wrote about how I try to measure the success of my teaching in ways that don’t align naturally with the current modus operandi of American education. What do I want to see? I want to see students learning how to read: to read thoughtfully, carefully, and intentionally. I want them to become accustomed to taking notes and using those notes. I want them to practice putting what they’ve learned into their own words, so that they take ownership of their knowledge rather than thoughtlessly parroting how others say it, or worse, outsourcing their learning to emerging AI or the top Google search results. I plan to help students learn to develop arguments (in the philosophical sense) where they can show their reasoning. I want students to be mentally tired at times in my class, but I want the culture of my classes to be such that when they look back on their experiences, it felt “easy” because what I was trying to teach them became natural to them, and while they were pushed to stretch themselves, they weren’t driven to anxiety. These are ideals, aspirations.

I purchased a book titled The Art of Teaching Philosophy: Reflective Values and Concrete Practices to help me think through these ideals and aspirations. One essay captured what I’ve been feeling:

That first paragraph from David W. Concepción‘s essay (pp. 189-196) grabbed my attention. That’s what I’ve been trying to articulate. I want to focus on dispositional growth. And this exercise at the beginning of the essay grasps what I’ve been seeking. What do I want to stick with my students when they reflect on the experience of my classes years after they take them? Those are my “learning objectives”!

I completed the exercise. My gut response is something like this:

  1. I want them to use their knowledge to increase their inward happiness but also the outward good that they will do in the world
  2. I want them to become more thoughtful/self-examined/self-aware
  3. I want them to develop an open posture toward learning
  4. I want them to become more tolerant/less dogmatic
  5. I want them to develop and sharpen their critical thinking skills

Concepción’s chapter addresses concerns that administrators may have that such dispositional goals are immeasurable. He argues convincingly that all learning assessments are “[inferential] through proxy”. Furthermore, he provides guidance on how objectives that matter to philosophers and other teachers under the umbrella of the humanities, such as increased “curiosity, intellectual humility, comfort with ambiguity, and fair-mindedness” (p. 191), can be measured, along with the types of assessments/rubrics that would do the job. I won’t spoil the chapter for those who are interested. I will recommend it! I think Concepción is exactly right that our real learning objectives have to do with the type of people we help our students become. The information we provide them is necessary but not sufficient in itself. As C. Thi Nguyen writes in another essay in the book (“On Writing Fun, Joyful, Open-Ended Exams”, pp. 297-303; here, p. 298): “Many of us have come to think that good pedagogy is not just about the transmission of information. It is also about trying to encourage a mindset to foster intellectual virtue.”

This may be uncomfortable to say, especially in our current political environment where teachers are frequently targeted, often accused of “indoctrination” (a rich, though pitiful, accusation, as any teacher who has struggled to get students to complete work or pay attention in class for more than a few minutes is aware), but education includes values as much as (though likely more than) it includes information. If we teachers are honest, we don’t teach because we think we can compete with Wikipedia, Google, or ChatGPT; we teach because we believe we can model and impart intellectual virtues that help our students grow into flourishing humans. As soon as we admit this to ourselves, and articulate it aloud, we’ll begin to see that dispositional goals are the most important goals for most of us.

One more brief comment about homework

As many of us adults realize that being worked more and more does little for us and our mental health, there has been growing consideration of “right to disconnect” laws. These laws protect you from employers who may expect you to answer emails or do tasks when you’re technically off the clock. I support these types of laws. Though I have workaholic tendencies, mostly because my interests, hobbies, and work overlap (religion, philosophy, education…I do that for a living but these are some of my primary curiosities in life as well), I do believe that we should “work to live”, not “live to work”.

In my last post (see “Homework, rigor, and being the ‘chill’ teacher”), I explained why I don’t give my students homework. I conceded that homework is likely needed for some areas of study that need day-to-day practice: languages, mathematics, some sciences. And there are some challenging classes into which students self-select like Advanced Placement (AP) classes. But if the day-to-day practice isn’t needed, and students haven’t chosen a more challenging class, then I suggested that homework may not be useful.

I want to add this question: what do we want to teach students about work/life balance? In adulthood, there are those who want certain challenges. There are those who want to work from sunrise to sunset. But that’s their choice (like signing up for AP classes). If, like me, you support “right to disconnect” laws because they allow adults who want to have lives outside of their work to have those lives, then it follows that we should be open to giving high school students some of this same respect.

This may not apply to higher education. We opt into higher education. But high school (and middle and elementary school) are mandatory. We may need to give students required homework to prepare them for college and the workplace, but we also need to consider where to show them the same respect we hope to receive as adults who want free time to pursue interests outside the demands of our “produce, produce, produce more” culture.

Homework, rigor, and being the “chill” teacher

Earlier this week, our Head of School shared April Rubin’s Axios article, “Schools rethink homework”, on LinkedIn. I read it because I abandoned giving homework a few years ago. In the article, the pros and cons of homework are discussed. Two primary concerns about assigning homework are (1) the ongoing mental health struggle of America’s youth and (2) the rise of AI, which tempts students to find ways to shortcut their learning. Between the risk of burning out our kids and AI’s relativizing of homework’s value, some schools, even whole districts, have abandoned homework completely. California is asking schools to evaluate “the mental and physical health impacts of homework assignments”. I don’t know whether or not the complete removal of homework is good for our students, but I do think we need to ask ourselves what it is that we think homework accomplishes.

Why I stopped assigning homework
When we returned from the pandemic, it was apparent to me that students struggled to learn at home compared to when they were in school. I knew during the pandemic that many of my students were finding clever ways to check boxes to get the work done, but it was less clear whether they were learning much. My reaction to what the pandemic taught me about teaching high schoolers was (1) to limit the use of computers, because I can’t compete with the attractiveness of dozens of open tabs on my students’ browsers, and (2) to conclude that learning with me as their teacher was far superior to asking them to learn by themselves at home. Today, my students do a ton of handwriting. Almost everything is done on paper like it’s 1993. And I tell my students on the day that we go over the syllabus: “I want your commitment for the one hour and fifteen minutes we’re together every day, and then when you leave this classroom, your time is your time; I won’t take any of that from you.” In my estimation, most of my students agree to this bargain and uphold their end of the deal.

My class in the school’s ecosystem
A related reason that I abandoned homework is that I teach religious studies. Don’t misunderstand me: I think that what I teach is as relevant to my students’ education as anything that my colleagues teach. What I don’t think is that they need to spend several hours at home going more in-depth in order to learn what I want them to learn. I could be wrong, but my main goal has been to teach them ways of thinking, even postures toward learning, rather than just information. I teach them how to think about religion but not so much what to think about religion. This is best done in the community of my classroom. If students are curious to learn more about something we discuss in class—and that does happen—there’s no stopping them from learning more at home. But I haven’t found that forcing them to take more work home has ever sparked their curiosity.

One reason that I don’t know if I’m against homework, full stop, is that I don’t teach math, science, languages, or AP (Advanced Placement) classes. Those classes may demand more day-to-day work. To be good at Calculus may require practicing every day. To learn Spanish may require practice every day. Now, if someone goes on to major in religious studies in college, they should be thinking about religious studies every day, but for the purposes of high school religious studies—something most students in schools across America don’t study, and if they do, it’s almost always from a purely confessional vantage point—it seems unnecessary. If my students must have homework, I would rather that they work on their Calculus or Spanish at home. We can talk about Buddhist rituals in class tomorrow!

Is it bad to be the “chill” teacher?
Every semester, I have students who have taken my classes already tell me, “I miss your class!” Even students who seemed like they weren’t all that engaged. For many, this has to do with what we learned and how we learned it. But I get nervous at times because students will tell me that my class was a “GPA-booster”. (I’m not a difficult grader. Mostly, I grade for effort and work completed. If you show up, put forth effort, do the work I ask you to do, then you’re going to get most of your grade right there.) I had a student tell me, “Your class was so ‘chill’!” My immediate response was, “Oh no!” Why? Because I know humanities are often seen as less serious and less rigorous than STEM subjects. And many educators may see religious studies as frivolous or excessive. But when I asked the student to clarify what was meant, I was told that it had more to do with creating a low pressure environment where learning was enjoyable. My class didn’t stress them out.

Being the “fun” teacher isn’t always a compliment. But it can be. If students have fun learning, this isn’t bad. If they’re having fun because nothing academic is happening, then that’s a problem. I know from student testimonials and the observations of my colleagues that learning is happening, so I’ve learned to embrace the designation of “chill teacher” since I know what it means now: I’m not burning them to the ground. In part, I think the decision to ditch homework plays a role.

Do students know how to measure their own learning?
A colleague told me today that he overheard students talking about my classes that they took last year. In adolescent speech, one said something to the effect that “we didn’t have to do much for that class”, to which another student responded, “He did have us take a ton of notes.” I’ve had conversations with my students about how I teach and I get this sense: students measure classes by how (1) difficult they are and (2) demanding they are of the students’ time. My classes are neither. Yet students will comment on how much I drive the class and how I use all the time, often finishing right around the bell so that there’s no wasted time. They’ll complain about all the reading and writing when they’re my students, and this is what I emphasize: a lot of reading and a lot of writing. Not long papers. But a lot of note taking. A lot of shorter writing responses ranging from two to five to ten sentences where I ask them to put what they’re learning into their own (hand-written) words or to consider scenarios where they’d apply what they learned. It’s interesting to me that, at the time, they find being in my class demanding, even stretching, but in retrospect “chill” and one of the classes that gave them the most room to breathe.

When the road ends without major finals, or AP tests, or something like the SAT, there’s no “score” that helps my students see that they’ve developed a more sophisticated understanding of the complexities of religion or how to read the types of complicated texts we find in the Bible. For this reason, students don’t see something objective that shows them they’ve changed. That’s something that as a teacher I see (and sometimes don’t see) in their writing, in their discussions, etc. That my students go from expressing their frustration with trying to learn difficult, complex ideas, to reflecting on my class as one in which they felt comfortable, less stressed, and “chill” is a positive in my eyes. If I can teach them about hermeneutics, ancient history, genres of biblical literature, Hindu cosmologies, Buddhist rituals, the diversity of Judaism, etc., and they turn around and say, “that wasn’t so bad,” that seems positive to me. When they were done, like climbing a hill to see a beautiful sunset, the difficulty faded into the light of their newfound knowledge and their reshaped worldviews. They forget how much their intellectual muscles were strained to get there. I’ll take that all day, every day.