Whose opinions matter?

In reference to the opinions of others, there are two pitfalls we need to avoid:

  1. We shouldn’t concern ourselves with everyone’s opinions. Doing so leads to severe anxiety and freezes us because we become afraid to act. Yes, people have their views about the actions of others, and people will have their thoughts about your actions, but eventually people will forget about what you did because they’re in their own head most of the time. This means they’re not thinking about you all that much. It doesn’t take very long for most people to forget why they were celebrating or denouncing you!
  2. We shouldn’t edge toward sociopathic behavior. (If someone is a diagnosed sociopath, that’s a different conversation to be had…with a psychologist.) The “only God can judge me” mindset forgets that we humans are communal. Our actions and decisions impact others. We don’t live in a bubble. We depend on others, no matter how individualistic we may be. If our words or deeds are harming others, and they tell us, we should listen to what they have to say. If we’re working with others in any capacity and we’re taking their contributions for granted, thinking ourselves more important than we are, we need a reality check.

The tricky thing is knowing when someone’s opinions should matter to us and to what degree. I think the opinions of strangers should matter least. The opinions of those who are invested in your life day to day should matter more. But there’s a caveat: it could be a stranger who offers you a message of hope, and it can be our loved ones who are most harmful toward us. That’s with regard to personal matters. When it comes to broader scientific questions, the source of expertise changes.

In Plato’s Crito, Socrates sits in jail awaiting his execution with a friend named Crito, who arrived before Socrates awoke in order to convince him that he should escape prison because his sentencing was unjust (see the video above for a great overview). Crito throws several arguments against the wall to see if any of them will stick and convince Socrates to flee with his help. One of Crito’s arguments is that if Socrates dies, people will judge his friends for being cowards who didn’t help him: “it [will] appear that we have let you slip out of our hands through some lack of courage and enterprise on our part” (Tredennick and Tarrant translation, p. 83 [46a]).

Socrates responds to this concern in a number of ways. First, he comments that it has never been his way to “accept advice from any of my ‘friends’ except the argument that seems best on reflection” (p. 84 [46b]). In other words, just because a friend says something doesn’t mean we must agree. We should be thoughtful and discerning about even what our friends say. The converse is also true: our enemies aren’t wrong just because they’re our enemies.

Second, we should give certain people standing and not others: “Was it always right to argue that some opinions should be taken seriously but not others?” Socrates asks the question with the assumption that, yes, we should weigh the opinions of some as worth more than others (p. 84 [46d]).

Third, we should choose the opinions we entertain based on their soundness: “one should regard the sound ones and not the flawed”. By this Socrates means, “The opinions of the wise being sound, and the opinions of the foolish flawed” (p. 85 [47a]). The “wise” here are reintroduced a few sentences later as “the one qualified person” (p. 85 [47b]). This may be the greatest challenge because if we’re seeking wisdom, then it may be difficult for us to know who to listen to. There’s no easy solution to this problem. It must be determined on a case-by-case basis. But we’re better off trusting our doctor’s medical advice than some random person on Reddit. We’re safer trusting the credentialed experts in their field than influencers on TikTok. As Socrates says about ignoring the experts, “…if he disobeys the one man and disregard his opinions and commendations, and prefers the advice of the many who have no expert knowledge, surely he will suffer some bad effect?” Crito affirms (p. 85 [47c]): “Certainly.”

Not everything has to do with objective realities “out there” though. If your significant other or kids question how you use your time, they may know you better than you know yourself. They’re telling you that how you use your time isn’t showing that you value them, and they’re assuming that you want to value them. So put down the Xbox controller for the evening.

In Socrates’ situation, he reminds Crito (p. 86 [48a]), “…what we ought to worry about is not so much what people in general will say about us but what the expert in justice and injustice says, the single authority and with him the truth itself.” Again, this doesn’t guarantee rightness. The expert can be wrong and the novice right, though the odds are against it. But Socrates’ mindset is the right one. We should try to discern which voices are the most likely to give us the best guidance. This avoids the error of caring about what everyone thinks, but it also avoids the equally dangerous mistake of thinking we are our only guide and so shouldn’t care about what anyone thinks. Both errors are black-and-white, dogmatic stances. The right way is the far more complicated, contextual way that asks us to think through the advice we’re receiving and who is giving it.

If TL;DR, consider instead:

Socrates ponders his death

I teach religious studies. This means that I get the following question with some frequency: “What happens when we die?” My response is rarely satisfying: “I don’t know. I haven’t died yet.” I don’t say this to brush off a serious question or to be coy. This is the response that is most authentic. To claim to know anything more than this would be to lie, at least for me. Maybe someone out there knows what happens.

Answering with Socrates
I was asked the aforementioned question both this week and last. Incidentally, over the past few days, I reread Plato’s Apology. This is Socrates’ defense of himself before the Athenians, who would vote that he was guilty of corrupting the young men of Athens and denying the commonly received gods. When it comes time for sentencing, we find Socrates rejecting an opportunity to go into exile: “But surely, Socrates, after you have left us you can spend the rest of your life in quietly minding your own business” (Tredennick and Tarrant translation, p. 66 [37e]). But Socrates believes he has been doing what is right and what is good when he goes around challenging commonly held assumptions. He speaks of it as if it were a divine calling. To accept a form of exile would be to abandon this mandate and to fail to do what is right and good.

Here we get Socrates’ famous line about the unexamined life (p. 66 [38a]):

“If on the other hand I tell you that to let no day pass without discussing goodness and all the other subjects about which you hear me talking and examining both myself and others is really the very best thing that a man can do, and that life without this sort of examination is not worth living, you will be even less inclined to believe me”

So, rather than save his life and undermine his own message and mission, Socrates accepts the sentence of the death penalty. He states that unlike the sophists who argue to win, “I would rather die as a result of this defense than live as the result of the other sort.” Also, “…the difficulty is not so much to escape death; the real difficulty is to escape wickedness, which is far more fleet of foot.” These statements (p. 67 [38d-e]) turn into a warning to the Athenians: he, being slow, will be caught by death but escape wickedness, while they, being fast, will escape death but be caught by wickedness, which is far worse.

But what I want to highlight is what he says about death that is relevant to the questions I’ve received about the afterlife. Socrates warns that to assume that death is an evil is to make a mistaken claim because we don’t know (p. 69 [40 b-c]). Then he states (p. 69 [40c]):

“We should reflect that there is much reason to hope for a good result on other grounds as well. Death is one of two things. Either it is annihilation, and the dead have no consciousness of anything; or, as we are told, it is really a change: a migration of the soul from this place to another.”

Socrates proposes it might be like the deepest, dreamless sleep you’ve experienced. How refreshing! “If death is like this, then, I call it gain: because the whole of time, if you look at it in this way, can be regarded as no more than a single night.” In other words, death puts us into a peaceful, everlasting state of rest (p. 69 [40e]). Or, “…on the other hand death is a removal from here to some other place, and if what we are told is true, that all the dead are there, what greater blessing could there be than this, gentlemen of the jury?” He imagines he might spend time philosophizing with the greats, like Hesiod and Homer (p. 69 [40e-41a]).

What About Scary Visions of the Afterlife?
Notably, he entertains no scary or torturous visions of the afterlife. I imagine that this may have something to do with his assumption that goodness is from the gods and that depictions of badness or wickedness coming from the gods should be rejected (see “Euthyphro and Goodness” and “Would Plato approve of children reading the story of ‘Noah’s Ark’?”). Or, it may (also) be that he recognizes himself as a good man. He says (p. 70 [41d]), “…nothing can harm a good man either in life or after death, and his fortunes are not a matter of indifference to the gods.” If we entertain theories of divine evil, or of divine goodness that looks nothing like what we can recognize as goodness, then we must entertain horrific visions of the afterlife. Or we must be confident that we were good people, all things considered. Materialist visions of death, or certain Buddhist ones, or even something like Calvinistic ones seem to be a different discussion altogether.

Socrates and Christ?
I have hope that death isn’t final but I’m aware that it could be. I hope that there’s some sort of continuation of personality after death. But I don’t know what happens. I hope that I can reunite with those I love. I’m not sure how it would work though. Frankly, I don’t even have the faintest idea about what happens. I know what different theologians and religious systems have suggested but I can’t tell you who’s right and who’s wrong.

This confuses some because I claim to be a Christian but I’m a Christian who has drawn a bold line between what I want to happen and what I think I can say will happen. And I’m a Christian in the sense that I try to ask myself, “Do I want to see a world that looks something like what Jesus imagined when he spoke of the ‘Kingdom of God’?” As long as my answer is “yes”, then I’ll try to be a Christian. All veneration/worship of Jesus is an attempt to recenter my affections in a world that tries to draw our eyes to power, influence, wealth, etc. That said, the theology and metaphysics of my religion are to me what poetry is: (potentially) beautiful, symbolic speech about things we sense, feel, experience, seek, hope for, etc., but that we can’t explain concretely or logically or scientifically. It’s a categorical error to turn our poetic theology into something scientific and systematic, in my view. If you read about the resurrection of Jesus across the four canonical Gospels—ignoring non-canonical Gospels for this exercise—you’ll find an evolving narration of what can be categorized at best as an “apocalyptic” event. (To call it a “historical” event seems both misaligned with what historians are doing and underwhelming in light of what Christianity has been claiming.) On Easter, I’ll say “he is risen!” but what I think I mean is “I hope what Jesus’ followers experienced after his death is a small window into what might await us after death!” I hope but I don’t know. In other words, I feel greater kinship with Socrates on this matter than with St. Paul. My hopes aren’t the same as epistemic claims. I’ve come to accept that I’ll live with the doubts of Good Friday until the day death comes for me.

Euthyphro and Goodness

This past week, I reread Plato’s Euthyphro. This is the source of the famous “Euthyphro Dilemma” or “Euthyphro Problem”:

In Euthyphro, the question is whether the gods command what is good/holy because it is right/just, or whether what the gods command is right/just, and therefore good/holy, simply because the gods command it. If the former, then it would seem that the gods must submit to a higher standard, meaning that there is something greater than the gods, namely goodness, or justice, etc. If the latter, then what we call good/holy/right is merely a matter of power: we have to do what the gods tell us, and this can be arbitrary; it can change over time. The gods could say “don’t murder” today but then “murder!” tomorrow, and the rightness of it all would be determined by their divine positionality.

According to the polytheist Euthyphro, “…what’s holy is whatever all the gods approve of” (Tredennick and Tarrant translation, p. 20). For the monotheist, such agreement among the gods isn’t needed; uniformity is accomplished with the one god. As the video above presented it though, monotheism doesn’t escape the question: Is something right because God commands it, or does God command it because it’s right? If the former, it seems arbitrary; if the latter, it seems that God is held to a standard greater than God. Most monotheists that I know respond with an argument that goes something like this: “What God commands is good because God’s commands are based on God’s nature, which is inherently good.”

I’m not opposed to this argument but I think it closes the door on more fundamentalist readings of sacred texts. Let me turn to something Socrates says to Euthyphro to explain. Euthyphro is prosecuting his father for murder. Euthyphro claims that what he is doing follows divine commands. He gives the example of how Zeus himself castrated his father Cronus because Cronus “had unjustly swallowed his sons”. In other words, Cronus had done an evil, so Zeus was justified in harming his own father.

The point I want to make has nothing to do with whether, in the context of the myth, Zeus was justified. Instead, it’s Socrates’ response that interests me, as I noted above. Socrates says, “whenever someone talks like this about the gods, I find it very difficult to accept” (he says it in the form of a question but it implies his view, p. 14). Socrates finds this depiction of Zeus and Cronus problematic. Recall that recently I wrote a post titled “Would Plato approve of children reading the story of ‘Noah’s Ark’?” where I shared how in The Republic, Socrates says that God must be presented as “good”. Returning to Euthyphro, in response to the story of Zeus and Cronus, Socrates asks, “…do you really believe that these things happened like this?”

I find that if anyone is going to respond to the Euthyphro Dilemma with the claim that God’s commands are good because they come from God’s good nature, then they have to reject many of the Bible’s stories’ theologies (as well as the similar stories of other sacred texts). The response of the apologetically inclined will be to say that God’s goodness is different from ours. Maybe, but to what degree? Humanicide? Or if we are to speak of doctrines like eternal damnation? If God’s goodness includes these horrid acts—acts that none of us would call good if God wasn’t attached to them—then the word “goodness” becomes meaningless, and we should abandon any theologizing.

Socrates pushes Euthyphro to consider whether he understands holiness and divine justice. Euthyphro’s appeals to stories like Zeus castrating Cronus are unconvincing to Socrates because they present an inferior depiction of the divine. Socrates’ main goal is to help Euthyphro realize that he doesn’t know what the gods want because he doesn’t understand holiness and its relationship to justice. He needs epistemic humility. Likewise, modern religious people who speak of a good God doing things that we’d clearly define as bad in any other context seem not to understand divine goodness, if such a thing exists. They need epistemic humility. And they should be hesitant to appeal to sacred myths that depict the divine as having a lower standard of good than our own. Our own standard of good may not meet divine standards (because God is so extremely good), but surely it shouldn’t be clearly superior to them either. I would never exterminate almost all of humanity, nor would I burn anyone for an extended period of time, let alone for an eternity. Such theologies make God wicked by any standard. And the only recourse is to retreat into arguing that what God says is good is good because God is more powerful than us and God declared it. If this is so, then the dilemma hasn’t been addressed at all.

AI, reading, and the humanities

In a recent episode of the podcast “The Philosopher’s Zone” with David Rutledge titled “AI and Reading”, UMass-Lowell philosophy professor John Kaag was interviewed about a new project of his: “Rebind”. For your convenience, here’s a trailer for the product:

I admit, this sounds kind of interesting, and I’d be curious to read some of the books they have ready with the commentary and chat features, just to see what the experience is like.

In the interview, Kaag addresses a few topics that many of us in education know already. He talks about how learning needs to move away from information dumping and regurgitation. He addresses the problem of the perceived inaccessibility of many of the classic texts. He reminds us that the humanities have been in decline for a while now, so the AI revolution—if that’s what’s happening—can’t be blamed for growing disinterest. He’s positive toward AI. He sees it more as a solution than a problem.

As I listened, a few things came to mind:

  1. Kaag talks about how teaching may need to be a little more personal, a little more 1-to-1. The problem with this suggestion is practical. Class sizes are growing. If your institution—public, public charter, or private—isn’t growing, it either has a strong endowment/tax base or it’s dying. So it’s unlikely that class sizes can shrink to the point where teachers can do the type of 1-to-1 educating that Kaag suggests. We’d have to do a complete overhaul of our current system.
  2. Kaag sees the rise of AI’s significance and necessity as inevitable. My response would be that this is likely true, but I think the inevitability of AI’s significance can be embraced in a healthy manner or an unhealthy one. Large Language Models (LLMs) scrape the Internet for their data. We humans created that data. If we collectively become too reliant on LLMs, I fear that this will hinder human creativity. Yes, we learn from others. Yes, our learning is the ingestion and realigning of things we learn. But we do this as embodied creatures with agendas, goals, desires, motives, inspirations, imaginations, etc. We do this with an almost endless variety of purposes, as each of us contributes something unique. Our collective “hive-mind” is what it is because of individuality and uniqueness, in part. I don’t see that in LLMs. Will LLMs have less and less truly unique and creative insight upon which to draw if we humans outsource our thinking to computers?
  3. Much of what LLMs produce is akin to what the philosopher Harry Frankfurt calls “bullshit” (neither truth nor lies, both of which indicate intentionality, but content that is careless about whether it is true or false). See the recent article “ChatGPT is Bullshit” by Michael Townsend Hicks, James Humphries, and Joe Slater in the journal Ethics and Information Technology.
  4. This may mean that there are stages to our developing skills and postures with regard to learning and learning with AI. My gut says that we should try to create a setting where young people have to develop their own independent thinking abilities in preparation for using those abilities to engage what AI has to offer as active contributors and not just passive consumers. In my classes, students read from paper and handwrite on paper. Hypothetically, let’s say that in your freshman/sophomore years, all reading and writing is done this way in class under the supervision of educators. Then, as students become juniors and seniors in high school, preparing for the independence of their college and/or professional years, we focus on teaching them how to take their own original ideas and interface them with AI. An argument could be made that we have them wait until college to do this and that high school focuses completely on reading/writing in a traditional, almost pre-Internet way.

I’m sympathetic to the idea that the humanities need to embrace AI in order to be relevant in the future because this is what current cultural and market forces demand. But I’m hesitant to abandon the boring, laborious parts of learning that lay the groundwork for the human brain because I worry that the quicker we outsource our thinking and creativity to AI, the sooner we’re going to realize we’ve placed ourselves in a collective spin where little creativity, innovation, or new thought can flourish.

Why do I blog?

Blogging may be an outdated form of media. I don’t think it’s dead like, say, MySpace. There remain many popular blogs out there. I presume their readership is mostly Gen X and older Millennials. But even if it isn’t dead, it’s not popular. You don’t start a blog in 2024 if you want to get a message to the masses. You get on TikTok, I presume.

The most “relevant” social media platform with which I engage is Instagram. Facebook is ads mixed with sadness, though it’s how I remain connected to many people. Threads is coming alive but nothing I share seems interesting to the people on there…or the algorithm! “X” is scary. I left that dystopia long ago. I’m not going to touch something like Snapchat. And though I have peers who have done well with TikTok, I’m not interested.

This is because I don’t blog for a big audience. I blog to keep myself writing with frequency. I blog because unlike keeping a private Word document to record my thoughts, occasionally people can read what I write here, enjoy it, share it, and even respond to it. But I don’t look for that sort of response in the same way social media influencers do. It’s more like when blogging first began in the late 2000s and there was the joy of being able to write and be read by a handful of people with similar interests. That was my favorite part of blogging culture and it remains so.

It’s funny because for a long while, I had a blog that was very popular by blog standards. I know these stats don’t match the stats someone might get on YouTube or TikTok, but my most “successful” blog has seen almost 1.5 million views in its lifetime and about half a million unique visitors. There was a day back in 2013 when over seven thousand people visited.

This blog was central to me finding my way when I moved to San Antonio. One person who read it, Greg Richards, directed “College Missions” for the Diocese of West Texas of the Episcopal Church (for whom I work indirectly now). He was my first connection with the denomination that is tied to the school where I work and he was one of the people who wrote me a recommendation when I applied for the job I’ve had for more than eight years now. Another person was Dr. Rubén Dupertuis at Trinity University here in town. He gave me two opportunities to be a “Teaching Intern” which helped my resume. Also, he wrote me a recommendation letter. So, my old blog helped me network somewhere new. This networking helped me find the job that I have now. I’m grateful for that old blog!

I had a few other blogs that started, failed to gain any readership, and/or were closed because I gave up on the theme upon which each was anchored. (For example, when I was on the doorstep of leaving Pentecostalism permanently, I abandoned a blog with the clever name “Azusa Remixed” that tried to gather together Pentecostal and Pentecostal-friendly but also forward-thinking writers to talk about a future for Pentecostalism. When I knew my vision wasn’t going to match reality, and that reality was that I didn’t belong in Pentecostal circles anymore, I shut down the project. On a side note, the old saying that the Internet doesn’t forget isn’t true. If you google “Azusa Remixed” you’ll find nothing about my blog that I can see, though there’s some connection to an anime character!)

I think Twitter was the beginning of the end of blogging supremacy as a novel way to communicate on the still young Internet. Now it’s something older people like me do. My old blog sits there without a new contribution since 2014 but it still gets about four times as many visitors every week as my current one. If you’re a reader of this blog, I’m grateful for you but clearly “readership” in the abstract isn’t my goal. My goal is to process my thoughts through writing. Blogging was the method of writing that has been the most successful at helping me develop consistency. So, because I value the connection between writing and thinking, and blogging helps me maintain that connection, I continue to blog.

Anecdotal evidence about phones in the classroom

I’m not a psychologist or a social scientist. But my own experience in the classroom has made me pay attention to the claims of people like Jonathan Haidt and Jean Twenge. Both have sounded the alarm with regard to adolescent (over)use of smartphones. I’ve confiscated student phones only to have my pocket buzz incessantly. I wondered how anyone could focus with notification after notification from Snapchat, Instagram, and TikTok vying for their attention. I’ve seen my students sit around together but not speak to each other as each stared into their phone. Adults do this sort of thing too, but as Haidt, Twenge, and others have noted: we had a chance to live through our brain’s important developmental stages before getting smartphones. Gen Z didn’t get the opportunity. For this reason, Haidt, Twenge, et al., have argued for causation between smartphone use/addiction and the ongoing mental health crisis we see among America’s youth (for example, see Haidt’s “End the Phone-Based Childhood Now”).

My wife and I have seen the children of parents who raised their kids without smartphones and tablets and of those who allowed them. Our experience told us that there are drastic differences in these kids’ ability to wait, be patient, delay gratification, hold conversations, read books, be creative, and just enjoy being children with imaginations. Our kid won’t have a smartphone or a tablet at their disposal. If they use one at all in daycare or school, we’ll ask for limits. My plan is to keep these technologies out of their lives as long as I can.

For this reason, I was surprised when a recent episode of Freakonomics (“Is Screen Time as Poisonous as We Think?”) interviewed Andrew K. Przybylski of Oxford University who seemed to brush these concerns aside. I think his main point was that phones aren’t the end-all, be-all of Gen Z’s mental health crisis. But as I listened to him, I thought what he was saying didn’t match my experience at all. You see, this year our school went phone free. And I don’t know how many students are going to our student counselor. And I can’t tell you whether they feel happier in general. I can tell you what I see in the classroom though: they’re more focused; they contribute to class conversations more freely; they seem to have more patience when reading; they seem less stressed and distracted; they seem more in the moment. Several of my colleagues have noticed the same thing.

Our school is using Yondr. The kids were not happy about this at the beginning of the year, but more and more are admitting to my colleagues that they kind of enjoy the freedom. Maybe Przybylski would agree that this can be good. Maybe his point has little to do with phones in schools and more to do with the smartphone-mental health causation argument. But a few weeks into this new school year, I think our school’s decision to remove phones has been one of the best ones we’ve made in years. The students seem happier!

Phones weren’t allowed last year, technically. We told the kids to keep them “off and away” during class. They could take them out between classes. This meant that in reality many students still had their phones on their bodies all day, all those notifications endlessly grabbing their attention from their pockets and making them want class to be over so they could hurry to check their social media. Now my students often lose track of time since they lack phones and smartwatches, and I rarely use computers in my class. Also, I don’t have a clock on my wall. The few students with traditional watches keep time, but quite often it’s clear that they don’t know how much time has passed in class. This has made a huge difference.

I teach at a relatively affluent private school. My experience is limited to one demographic of kids. I don’t want to claim to be diving into the big-picture psychology and social science of adolescents and phones. But for our school, and for my students, the removal of phones has been a gift. As an adult, I’ve noticed that when I spend too much time on social media, I feel worse about things. When I stare at my phone for too long, it’s rarely a good sign. As I try to use my phone and social media less, my brain feels freer, happier. If this is how things are for my forty-two-year-old brain, I can’t imagine that a fourteen- to eighteen-year-old brain doesn’t benefit at least as much from time away from phones and social media. For that reason, as the debate goes forward in universities and research labs, I’m going to go with my experience and root for limiting phone/social media use by young people.

Handwriting is good for the brain

I’ve mentioned a few times that my students handwrite their notes, and their exit tickets, and pretty much everything. The #1 reason for this? I can’t compete with all the tabs open on their Internet browser. When I used to allow computers, engagement seemed impossible. It wasn’t clear that they were listening to me. It was easy to have a classmate send their notes by email or messenger, which could then be copied-and-pasted. This changed after I banned computers. Students became more likely to participate in class. And even if you copy your classmate’s notes, you have to take time to write them out yourself, which leads to more learning than copying-and-pasting.

This leads me to the #2 reason: I think handwriting is better for learning. I can’t remember the book that I read years ago on this subject—it’s likely outdated and out of print now—but recent science seems to confirm that handwriting notes helps learning stick! For instance, earlier this year the article “Handwriting but not typewriting leads to widespread brain connectivity: a high-density EEG study with implications for the classroom” was published in the journal Frontiers in Psychology, and the authors (F. R. (Ruud) Van der Weel and Audrey L. H. Van der Meer) conclude:

“When writing by hand, brain connectivity patterns were far more elaborate than when typewriting on a keyboard, as shown by widespread theta/alpha connectivity coherence patterns between network hubs and nodes in parietal and central brain regions. Existing literature indicates that connectivity patterns in these brain areas and at such frequencies are crucial for memory formation and for encoding new information and, therefore, are beneficial for learning. Our findings suggest that the spatiotemporal pattern from visual and proprioceptive information obtained through the precisely controlled hand movements when using a pen, contribute extensively to the brain’s connectivity patterns that promote learning.”

I’m not a scientist so I can’t evaluate these findings but PubMed has many other articles that seem to be making the same claim. My experience is anecdotal but even if there weren’t studies like this one that seem to support my hunch, I know I would continue to have my students write by hand because of the difference that I’ve experienced. And because I’m not as interesting as whatever is on tabs 7, 12, 28, and 39.

What do we want for our students? Dispositional growth!

Lately, I’ve been writing a lot about shifts in my thinking regarding what I teach and how I teach it. I tend to be an introspective and retrospective person by nature. This has been supercharged by the news that I received several months ago that by the end of the year, I’ll be a “dad”. I began to wonder, “What kind of education will I want my kid to have when they enter high school?” Also, “If I were to be my child’s teacher someday, what/how would I want to teach them?” These questions haven’t led to a midlife crisis. I enjoy what I’m doing as a teacher. I have no desire to do anything else. But I’ve thought a lot about the future relevance of what I’m teaching currently, and I’ve second-guessed the viability of the field of study to which I’ve dedicated so much of my life, or at least whether I’m still interested in the questions that I would need to keep asking in order to continue being engaged.

Additionally, I’ve reflected on the environment I hope to create in my classroom but also outside my classroom, as in what place I think education has within the context of an adolescent’s life. In a recent post, “Homework, rigor, and being the ‘chill’ teacher”, I wrote about how I try to measure the success of my teaching in ways that don’t align naturally with the current modus operandi of American education. What do I want to see? I want to see students learning how to read: to read thoughtfully, carefully, and intentionally. I want them to become accustomed to taking notes and using those notes. I want them to practice putting what they’ve learned into their own words, so that they take ownership of their knowledge rather than thoughtlessly parroting how others say it, or worse, outsourcing their learning to emerging AI or the top Google search results. I plan to help students learn to develop arguments (in the philosophical sense) where they can show their reasoning. I want students to be mentally tired at times in my class, but I want the culture of my classes to be such that when they look back on their experiences, it felt “easy” because what I was trying to teach them became natural to them, and while they were pushed to stretch themselves, they weren’t driven to anxiety. These are ideals, aspirations.

I purchased a book titled The Art of Teaching Philosophy: Reflective Values and Concrete Practices to help me think through these ideals and aspirations. One essay captured what I’ve been feeling.

That first paragraph from David W. Concepción‘s essay (pp. 189-196) grabbed my attention. That’s what I’ve been trying to articulate. I want to focus on dispositional growth. And this exercise at the beginning of the essay grasps what I’ve been seeking. What do I want to stick with my students when they reflect on the experience of my classes years after they take them? Those are my “learning objectives”!

I completed the exercise. My gut response is something like this:

  1. I want them to use their knowledge to increase their inward happiness but also the outward good that they will do in the world
  2. I want them to become more thoughtful/self-examined/self-aware
  3. I want them to develop an open posture toward learning
  4. I want them to become more tolerant/less dogmatic
  5. I want them to develop and sharpen their critical thinking skills

Concepción’s chapter addresses concerns that administrators may have that such dispositional goals are immeasurable. He argues convincingly that all learning assessments are “[inferential] through proxy”. Furthermore, he provides guidance for how objectives that matter to philosophers and other teachers under the umbrella of the humanities, such as increased “curiosity, intellectual humility, comfort with ambiguity, and fair-mindedness” (p. 191), can be measured, and the types of assessments/rubrics that would do the job. I won’t spoil the chapter for those who are interested. I will recommend it! I think Concepción is exactly right that our real learning objectives have to do with the type of people we help our students become. The information we provide them is necessary but not sufficient in itself. As C. Thi Nguyen writes in another essay in the book (“On Writing Fun, Joyful, Open-Ended Exams”, pp. 297-303; here, p. 298): “Many of us have come to think that good pedagogy is not just about the transmission of information. It is also about trying to encourage a mindset to foster intellectual virtue.”

This may be uncomfortable to say, especially in our current political environment where teachers are frequently targeted, often accused of “indoctrination” (a rich, though pitiful, accusation, as any teacher who has struggled to get students to complete work or pay attention in class for more than a few minutes is aware), but education includes values as much as (though likely more than) it includes information. If we teachers are honest, we don’t teach because we think we can compete with Wikipedia, Google, or ChatGPT; we teach because we believe we can model and impart intellectual virtues that help our students grow into flourishing humans. As soon as we admit this to ourselves, and articulate it aloud, we’ll begin to see that dispositional goals are the most important goals for most of us.

One more brief comment about homework

As many of us adults realize that being worked more and more does little for us and our mental health, there has been growing consideration of “right to disconnect” laws. These laws protect you from employers who may expect you to answer emails or do tasks when you’re technically off the clock. I support these types of laws. Though I have workaholic tendencies, mostly because my interests, hobbies, and work overlap (religion, philosophy, education…I do that for a living but these are some of my primary curiosities in life as well), I do believe that we should “work to live” not “live to work”.

In my last post (see “Homework, rigor, and being the ‘chill’ teacher”), I explained why I don’t give my students homework. I conceded that homework is likely needed for some areas of study that need day-to-day practice: languages, mathematics, some sciences. And there are some challenging classes into which students self-select like Advanced Placement (AP) classes. But if the day-to-day practice isn’t needed, and students haven’t chosen a more challenging class, then I suggested that homework may not be useful.

I want to add this question: what do we want to teach students about work/life balance? In adulthood, there are those who want certain challenges. There are those who want to work from sunrise to sunset. But that’s their choice (like signing up for AP classes). If, like me, you support “right to disconnect” laws because they allow adults who want to have lives outside of their work to have those lives, then it follows that we should be open to giving high school students some of this same respect.

This may not apply to higher education. We opt into higher education. But high school (and middle and elementary school) are mandatory. We may need to give students required homework to prepare them for college and the workplace, but we also need to consider where to show them the same respect we hope to receive as adults who want free time to pursue interests outside the demands of our “produce, produce, produce more” culture.