How Old is the Shepherd?

We asked 101 high schoolers the following question: 

There are 125 sheep in a flock and 5 dogs.
How old is the shepherd?

The question is an invitation to take a closer look at the kinds of mathematics that we are asking students to engage with in our maths classrooms today. What does it mean for us as educators when students give responses like 130 because 125 + 5 = 130 or 25 because 125/5 = 25? Moreover, what does it mean for us as educators when we expect these responses from students? 

I first heard about the shepherd question through Robert Kaplinsky, though the question has its origins in research by Professor Kurt Reusser from 1986, possibly earlier.


The survey was administered online (on account of school closures due to COVID-19) to students in China earlier this year. Our initial goals were: 

  • To collect some data regarding sense making in mathematics amongst our high schoolers (grades 10 – 12)
  • To analyze the data and assess what this means for us as educators. Are there differences in responses amongst the different grade levels? Are there gaps in student learning that we need to address? If so, how might we begin to address those gaps? 
  • To use this activity as a way to begin a dialogue with students and teachers about sense-making in mathematics 

The survey was conducted via Microsoft Forms.



Of the 101 students who were surveyed: 

  • 25 stated there was not enough information to answer the problem, or did not supply a numerical response based on the constraints identified in the problem
  • 74 gave numeric responses
  • 2 students did not answer the question (e.g. one student wrote “nice question”) 

At first, this data seems consistent with the results from Kaplinsky’s experiment with 32 eighth graders, in which 75% gave numerical responses by randomly adding, subtracting, dividing, or multiplying 125 and 5. Upon closer examination, however, we see that of the 73 that gave numeric responses, 28 used random math procedures, thus not making sense of the problem, while 45 gave some sort of reasoning independent of the problem to support their numeric responses. 
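For readers who want to check the arithmetic, the category counts reported above can be tallied in a quick sketch. The numbers are taken directly from this post; the category labels are my own shorthand, not part of the original survey.

```python
# Tally of survey response categories, using the counts reported in the post.
counts = {
    "not enough info / no numeric answer": 25,
    "numeric response": 74,
    "no answer": 2,
}

total = sum(counts.values())
assert total == 101  # matches the 101 students surveyed

for category, n in counts.items():
    print(f"{category}: {n} ({n / total:.0%})")

# Within the numeric responses, the post distinguishes two sub-categories:
numeric_breakdown = {"random math procedures": 28, "supported guesses": 45}
print(sum(numeric_breakdown.values()))
```

Note that the two numeric sub-categories sum to 73, one fewer than the 74 numeric responses reported, which may explain the slight discrepancy in the paragraph above.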

Sample Responses

Students that gave numeric responses by combining 125 and 5 via random math operations (not making sense): 
Note that a couple of students pointed out that there seemed to be an issue with the problem, but proceeded to give an answer anyway.

Students that did not provide a numeric response (making sense): 

One response in particular really blew me away: 

Not only did this student state that the question did not give enough information to provide a specific answer, they used what information was presented in the problem, along with sources to support their thinking, to deduce an age range for the shepherd! Wow. How can I get the rest of my students here? 

This is the point where I began to see another category emerge: students who provided a numeric response, but justified their answers outside the expected range of numeric responses, such as:

  • 125 + 5 = 130
  • 125 – 5 = 120
  • 125/5 = 25 
  • Or other such random combinations of numbers. Note that no one responded with 125 * 5 = 625, as they seemed to realize this may be a ridiculous age for a shepherd. One student did, however, provide a response of 956,000 through a series of random additions and subtractions.

I referred to this new category of responses as “supported guesses.” To be honest, I had difficulty categorizing some of these responses as “making sense” after seeing the superstar example above, but ultimately decided that anything other than random math was a step in the right direction, although you will probably agree that some responses show more evidence of reasoning than others: 


My Take-Aways

I was quite blown away by the number of students that treated this as a “trick” question and gave responses that varied widely in creativity and depth of thinking. As I mentioned, I found it difficult to categorize some of these responses, and found that after reading the superstar response above, my expectations rose (not necessarily a bad thing, but it definitely made categorizing more difficult). 

Some factors worth considering: 

  • The responses were collected via an online survey. Would results have differed if this was done face to face? Did students try looking up the problem before attempting to solve it? 
  • Students feeling like they did not have intellectual autonomy; not wanting to question the questioner due to respect for authority. 
  • The old “my math teacher is asking me this, so I must calculate something” trick. 
  • Cultural factors may be at play here. Perhaps students have seen some version of this problem before, thus accounting for the varied supported guesses observed. 

It is also worth noting that in a follow-up reflection activity with students, some pointed out the need for teachers to ask less ambiguous problems, a few attributed their responses to poor understanding of the problem due to language barriers, while a fair number mentioned the importance of practicing different kinds of problem solving to develop critical thinking skills. This, I think, is a step in the right direction.


  1. I’d like to posit that the problem isn’t the students, it’s the question. The goal of an assessment is to establish whether students have successfully met a learning goal. This question fails significantly at that because it isn’t based around a learning goal. It’s based around misrepresenting context.
    This question isn’t written in a way to open the kind of discourse you are hoping students engage in. It’s written in the vernacular of a typical mathematical word problem. Its goal is to deceive, not invite critical thinking. If you were hoping to see how students could extrapolate a plausible answer based on limited contextual information, you should have asked them to do that.
    Imagine if this question was written instead as “A shepherd has 125 sheep and 5 dogs, what additional information might be needed to find the age of the shepherd?”
    This question invites you to invent. The original question does not. It is similar to asking the question “There are 125 sheep and one dog for every 25 sheep, how many sheep are there?” This is an equally stupid question preying on pretext rather than analysis.
    The real solution here is to ask questions in a way that invite discussion, that students are prepared to encounter, and have a specific goal you are assessing.



    1. Thanks for your comment, Marc! You bring up some good points worth considering, and a few students brought up a similar point about changing the wording in their reflections. I agree that the question itself may be flawed in some ways; however, do you think there is something to be said for getting students to a point where they feel comfortable respectfully disagreeing with a question?
      In that sense, perhaps it isn’t necessarily a problem with the questions we ask, but rather thinking about how we can create a class atmosphere that focuses less on calculation and procedural knowledge, and more on critical thinking and problem solving? Just a thought.



  2. I’m not sure if I’m more surprised or less surprised that you had these results. I get that this happens everywhere, but with another country plus an online format, I thought it could be a bit different.
    Thanks for sharing this!


