How Old is the Shepherd?

We asked 101 high schoolers the following question: 

There are 125 sheep and 5 dogs in a flock.
How old is the shepherd?

The question is an invitation to take a closer look at the kinds of mathematics that we are asking students to engage with in our math classrooms today. What does it mean for us as educators when students give responses like 130 because 125 + 5 = 130, or 25 because 125/5 = 25? Moreover, what does it mean when we expect these responses from students?

I first heard about the shepherd question through Robert Kaplinsky, though the question has its origins in research by Professor Kurt Reusser from 1986, possibly earlier.

Context

The data was collected via an online survey (on account of school closures due to COVID-19) given to students in China earlier this year. Our initial goals were:

  • To collect some data regarding sense making in mathematics amongst our high schoolers (grades 10 – 12)
  • To analyze the data and assess what this means for us as educators. Are there differences in responses amongst the different grade levels? Are there gaps in student learning that we need to address? If so, how might we begin to address those gaps? 
  • To use this activity as a way to begin a dialogue with students and teachers about sense-making in mathematics 

The survey was conducted via Microsoft forms.

Results


Of the 101 students who were surveyed:

  • 25 stated there was not enough information to answer the problem, or did not supply a numerical response based on the constraints identified in the problem
  • 74 gave numeric responses
  • 2 students did not answer the question (e.g. one student wrote “nice question”) 

At first, this data seems consistent with the results from Kaplinsky’s experiment with 32 eighth graders, in which 75% gave numerical responses produced by randomly adding, subtracting, dividing, or multiplying 125 and 5. Upon closer examination, however, we see that of the 73 students who gave numeric responses, 28 used random math procedures, thus not making sense of the problem, but 45 gave some sort of reasoning independent of the problem to support their numeric responses.
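As a quick sanity check on the headline numbers, the breakdown above can be tallied like so (a minimal sketch using the counts reported in this post; the category labels are my own shorthand):

```python
# Survey counts as reported above (101 high schoolers, grades 10-12).
responses = {
    "not enough info / no numeric answer": 25,
    "numeric answer": 74,
    "did not answer": 2,
}

total = sum(responses.values())  # should be 101 students

for label, count in responses.items():
    # e.g. "numeric answer: 74 (73.3%)"
    print(f"{label}: {count} ({100 * count / total:.1f}%)")
```

The roughly 73% of students giving numeric answers is what invites the comparison to Kaplinsky's 75% figure, though as noted, the sub-categories tell a more nuanced story.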

Sample Responses

Students that gave numeric responses by combining 125 and 5 via random math operations (not making sense): 
Note that a couple of students pointed out that there seemed to be an issue with the problem, but proceeded to give an answer anyway.

Students that did not provide a numeric response (making sense): 

One response in particular really blew me away:

Not only did this student state that the question did not give enough information to provide a specific answer, they also used what information was presented in the problem, along with sources to support their thinking, to deduce an age range for the shepherd! Wow. How can I get the rest of my students here?

This is the point where I began to see another category emerge: students who provided a numeric response, but justified their answers outside the expected range of numeric responses such as:

  • 125 + 5 = 130
  • 125 – 5 = 120
  • 125/5 = 25 
  • Or other such random combinations of numbers. Note that no one responded with 125*5 = 625, as they seemed to realize this would be a ridiculous age for a shepherd, although one student did arrive at 956000 through a series of random additions and subtractions.

I referred to this new category of responses as “supported guesses.” To be honest, I had difficulty categorizing some of these responses as “making sense” after seeing the superstar example above, but ultimately decided that anything other than random math was a step in the right direction, although you will probably agree that some responses show more evidence of reasoning than others:


My Take-Aways

I was struck by the number of students who treated this as a “trick” question and thus gave responses that varied widely in creativity and depth of thinking. As I mentioned, I found it difficult to categorize some of these responses, and after reading the superstar response above, my expectations rose (not necessarily a bad thing, but it definitely made categorizing more difficult).

Some factors worth considering: 

  • The responses were collected via an online survey. Would results have differed if this was done face to face? Did students try looking up the problem before attempting to solve it? 
  • Students may have felt they lacked intellectual autonomy, not wanting to question the questioner out of respect for authority.
  • The old “my math teacher is asking me this, so I must calculate something” trick. 
  • Cultural factors may be at play here. Perhaps students have seen some version of this problem before, thus accounting for the varied supported guesses observed. 


It is also worth noting that in a follow-up reflection activity, some students pointed out the need for teachers to ask less ambiguous problems, a few attributed their responses to a poor understanding of the problem due to language barriers, and a fair number mentioned the importance of practicing different kinds of problem solving to develop critical-thinking skills. This, I think, is a step in the right direction.