This is a collection of my thoughts from reading:
A. B. Heim, G. Lawrence, R. Agarwal, M. K. Smith, and N. G. Holmes, Perceptions of interdisciplinary critical thinking among biology and physics undergraduates, Phys. Rev. Phys. Educ. Res. 21, 010138 (2025).
The paper is available on the PaperCast at https://open.spotify.com/episode/394yVw46zKSA8gPYLKLaGh?si=7bb3c2173d9545aa
Structure
This post is a series of thoughts on different topics that arose as I read the paper.
Tasks as the basis of research as well as curricular design and sharing
Listening to this paper on interdisciplinary curricular design, one justification the authors give for focusing on learning tasks is that tasks are a manageable unit for curricular design research. Tasks are simpler than entire courses, and once designed, they can be sequenced into a full course. An additional argument is that approaching a course holistically is simply too complex. While I agree that designing a course by first defining overarching goals and objectives, then breaking those down by unit, and only then developing tasks to meet the established goals is more complex, I believe it produces a much stronger result. My initial experience designing 131 is a case in point: I designed tasks first and strung them together afterward, and the result was not an effective course.

Thus, while I recognize the value of research on tasks and understand why they serve as a convenient focal point for curricular design research, I struggle with the idea that tasks should be the starting point. This perspective connects with ideas I’ve been developing for my new graduate TA training system. Graduate TAs often focus on designing lectures and other individual tasks, but I want to encourage a more holistic perspective: thinking critically about course goals and objectives first.

In summary, while I agree that tasks are a critical part of curricular design and a convenient unit of research, I do not believe they are an appropriate starting point. That approach feels backward to me. It’s like designing the perfect brick before knowing what you’re going to build. If you’re building a house, that brick might be useful. But if you’re building a boat, it may not help, and trying to use it anyway could leave you with a boat made of bricks, which is ineffective at best.
Physics education, especially for biology students and even engineers, is at a point where curriculum reconsideration may not be absolutely necessary, but is certainly warranted. We need to ask: what are we teaching, and why? This line of thinking is influenced by the Hake paper [1], which emphasizes that active learning is where learning happens. However, active learning takes time. This means that designing a course requires hard decisions: what do you teach, and—more importantly—what do you drop? I would encourage others, as Redish [2] does in his reconsideration of introductory physics for biologists, to start by digging into what you want students to learn and why—then move on to the tasks.
The “Earthworm Problem” as a Level-2 Problem
This refers to:
Alice Churukian, David Smith, Colin Wallace, Duane Deardorff, Daniel Young, and Laurie McNeil, Living Physics Portal: PALS Breathing Worms, https://www.livingphysicsportal.org/details/d7a0eea7-e6a5-4fb9-a644-c276ee0968bf.
Later in the same paper, they discuss the Nexus Physics curriculum’s well-known earthworm problem, which compares the rate of oxygen absorption through the skin with the rate of oxygen consumption in the body: a scaling-law problem. For isometric scaling, surface area grows as R² while volume grows as R³. For a cylinder of fixed length, the lateral surface area scales linearly with the radius, while the volume (or cross-sectional area) scales as R². That mismatch is the point here: as the worm gets thicker, consumption (volume) outpaces absorption (skin area), which caps how thick a skin-breathing worm can be.
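The intersection the problem asks students to find can be sketched in a few lines. This is a minimal illustration, not the Nexus task itself; the absorption and consumption constants `a` and `b` are made up solely to show the scaling, not physiological values:

```python
import math

# Model a worm as a cylinder and work per unit length.
# Oxygen supply through the skin scales with lateral surface area: 2*pi*R
# Oxygen demand scales with volume (cross-sectional area): pi*R**2
# The constants below are illustrative, not physiological data.
a = 1.0   # absorption rate per unit skin area (arbitrary units)
b = 50.0  # consumption rate per unit volume (arbitrary units)

def supply(R):
    return a * 2 * math.pi * R   # grows linearly with R

def demand(R):
    return b * math.pi * R ** 2  # grows quadratically with R

# Supply meets demand where a*2*pi*R = b*pi*R**2, i.e. R = 2a/b.
R_max = 2 * a / b
print(f"Critical radius: {R_max:.3f} (demand outpaces supply beyond this)")
```

The quadratic always overtakes the line, which is the whole biological punchline: a sufficiently thick worm cannot breathe through its skin.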
While the problem is solid in concept, I still find it somewhat lacking. Perhaps not inauthentic, but not as engaging as it could be. The main issue is that students are told exactly what to do: create this equation, graph that one, identify where the lines intersect. While this applies physics problem-solving techniques to a biological context—which the paper classifies as “level two”—I don’t like that the path is laid out so explicitly.
In my IPLS course, I want students to figure out how to approach problems. They may need some guidance, but ultimately, I want them to be able to look at a biological or chemical situation and apply physics tools independently. The goal is for them to encounter something in a biology or chemistry course and think, “I’ve seen something like this before in physics. I know how to use physics to better understand this phenomenon.”
Importantly, I want them to apply this reasoning to new contexts—not just replicate what we’ve done in class. That’s why my exams use novel scenarios. If they can look at something like bird vs. bumblebee flight and think about the role of Reynolds number, or apply physics when studying blood flow in chemistry or biology, then they’re doing what I hope they’ll do. If an instructor mentions the electrical properties of blood, they should be able to draw on physics problem-solving to deepen their understanding—not just memorize the concept.
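The bird-versus-bumblebee comparison reduces to a one-line estimate of the Reynolds number, Re = vL/ν. A rough sketch, where the speeds and length scales are illustrative guesses rather than measured values:

```python
def reynolds(speed_m_s, length_m, kinematic_viscosity=1.5e-5):
    """Reynolds number Re = v*L/nu; default nu is for air near room temperature."""
    return speed_m_s * length_m / kinematic_viscosity

# Illustrative numbers only: a small bird vs. a bumblebee.
re_bird = reynolds(10.0, 0.10)  # ~10 m/s, ~10 cm wing chord
re_bee = reynolds(3.0, 0.01)    # ~3 m/s, ~1 cm wing scale

print(f"Bird:      Re ~ {re_bird:.0f}")  # tens of thousands: inertia dominates
print(f"Bumblebee: Re ~ {re_bee:.0f}")   # thousands: viscosity matters much more
```

The order-of-magnitude gap, not the precise values, is what I want students to notice: the two animals fly in qualitatively different fluid regimes.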
So, while the earthworm problem has value, I believe it would be stronger as a more open-ended task. For example: “Why do earthworms have limited cross-sectional area per unit length?” Something that invites critical thinking, not just procedural execution.
Presumed Uniformity of Introductory Biology Curricula
I also want to mention another critique of the paper: its assumption about the universality of biology curricula. In adapting Nexus materials at UMass Amherst, I’ve found that our students arrive in IPLS with different backgrounds than those at the University of Maryland. This makes me wonder whether biology curricula vary more than physics curricula across the country.
My conversations with biology colleagues suggest that variation exists even within UMass. For example, Laura Francis emphasizes a mechanistic and quantitative approach, while Caleb Rounds focuses on data literacy and graph interpretation. Others still follow a more traditional memorization-based approach. These differences suggest limitations in the universal applicability of the paper’s ideas.
For this reason, I believe authentic IPLS development requires close collaboration with local biology and chemistry faculty. These partnerships help align course design with the actual experiences of students. A good example is how my focus on HOMO-LUMO transitions in organic chemistry—used to motivate quantum mechanics—emerged from conversations with Laura. Using shared language and figures, and teaching this content simultaneously in chemistry and physics, reinforces its authenticity for students.
In short, one must be cautious when trying to generalize IPLS implementations from one institution to another. Biology prerequisites and emphasis vary, and effective course design must reflect this.
The Importance of Language in Problem Authorship
Another interesting example from the paper involves two different prompts for protein unfolding. The first is essentially a physics problem imposed on a biological context—no more authentic than a cheetah chasing an antelope. While it might be fun, it doesn’t feel authentic to many students, and in my experience, some even resent such forced pairings.
The second prompt, involving the energy landscape, feels much more authentic. In addition to structural differences, a key factor is the use of biologically native language. This aligns with ideas from my mutual mentoring work: authenticity is enhanced when you speak the language of the students.
This is something that I find missing in many IPLS discussions. Language and conventions matter. For example, I used to describe HOMO-LUMO transitions as “excitation from ground to first excited state”—the physicist’s language. Calling it a HOMO-LUMO transition, though, immediately felt more authentic to students. I don’t have hard data to support this, but my classroom experience strongly suggests it helps.
The Value of Level-One Tasks
One final point: what the paper calls “level one” tasks can actually be counterproductive. Many students are perceptive. They recognize when a problem is a forced attempt to make physics seem relevant. When it feels fake, they may conclude that physics has nothing to offer biology. In fact, the paper eventually makes this same point.
What I’ve found effective are tasks where students can draw relevant biological, chemical, or medical conclusions: for example, calculating eyeglass prescriptions from eyeball size, predicting the absorption lines of organic molecules, or deriving the roughly −70 mV resting membrane potential. These are meaningful, relevant facts, but students often don’t know their origins. When they see physics helping them understand such concepts, engagement increases.
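The membrane-potential example is the kind of calculation I mean. Here is a sketch using the Goldman-Hodgkin-Katz equation with textbook-typical mammalian ion concentrations and permeability ratios; the specific numbers are illustrative defaults, not the values from any particular course:

```python
import math

def ghk_potential_mV(T_kelvin=310.0):
    """Resting membrane potential from the Goldman-Hodgkin-Katz equation."""
    k_B = 1.381e-23  # Boltzmann constant, J/K
    e = 1.602e-19    # elementary charge, C
    kT_over_e_mV = 1000 * k_B * T_kelvin / e  # ~26.7 mV at body temperature

    # Typical mammalian concentrations (mM) and relative permeabilities.
    P = {"K": 1.0, "Na": 0.04, "Cl": 0.45}
    outside = {"K": 5.0, "Na": 145.0, "Cl": 110.0}
    inside = {"K": 140.0, "Na": 15.0, "Cl": 10.0}

    # Cl- is an anion, so its inside/outside concentrations swap roles.
    num = P["K"] * outside["K"] + P["Na"] * outside["Na"] + P["Cl"] * inside["Cl"]
    den = P["K"] * inside["K"] + P["Na"] * inside["Na"] + P["Cl"] * outside["Cl"]
    return kT_over_e_mV * math.log(num / den)

print(f"Resting potential: {ghk_potential_mV():.0f} mV")  # lands near the -70 mV students memorize
```

This is exactly the payoff I aim for: the number students have memorized in biology falls out of thermal physics and a few ion concentrations.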