Asking the right questions

effective questioning sketchnote

I presented at the 2017 NSW Secondary Deputy Principals Association Conference this week on embedding effective questioning into assessment for learning. According to research, teachers ask 400 questions a day, wait less than one second for a reply, and ask mostly lower order questions that require students to recall facts. The research also shows that increasing the number of higher order questions leads to more on-task behaviour, better responses and more speculative thinking from students.

There are other reasons why teachers ask questions, like waking up the student daydreaming at the back of the class, or asking students to repeat the instructions for an activity to make sure they know what to do. These are fine, as long as teachers know the reasons for those questions (and these types of questions do not dominate class time).


Strategic questioning is key to assessment for learning. While questioning is essential for students in all grade levels, teachers can take the opportunity of new syllabuses and school based assessment requirements for the HSC to re-think how they design and implement assessment for learning in Stage 6. However, questioning is often viewed as an intuitive skill, something that teachers “just do”. At a time when many teachers are creating new units of work and resources for the new Stage 6 syllabuses, it may be a good opportunity to look at strategic questioning and embed some quality questions and questioning techniques.

What do good questions look like?

For assessment for learning, there are two main reasons why teachers ask questions:

  1. To gather evidence for learning to inform the next step in teaching
  2. To make students think

For these questions to be effective, it matters how the question itself is designed, how the question is asked, and how the responses are collected and analysed to inform the next step in teaching and learning. Here are some strategies:

Hinge questions

Hinge questions are often multiple choice questions (though they don’t have to be). The teacher asks them around the middle of the lesson to decide whether the class has understood the critical concepts of the lesson well enough to move on. Hinge questions have four essential components:

  1. The question is based on a critical concept for that lesson that students must understand.
  2. Every student must respond to the question.
  3. The teacher is able to collect every student’s response and interpret the responses in under 30 seconds. (This is why many hinge questions are multiple choice).
  4. Prior to the lesson, the teacher must have decided what teaching and learning follows for:
    • the students who have answered correctly
    • the students who have answered incorrectly

Here is an example of a hinge question:

hinge question example

The question assesses students’ understanding of validity, reliability and accuracy in scientific investigations. Many students confuse the three concepts. This hinge question can be used in a lesson on investigation design where validity, reliability and accuracy have been explained. Towards the end of this explanation (typically around the middle of the lesson), the question can be asked to all students. The teacher can then decide on the next steps for students who “get it” and those who don’t. For this question, the correct answer (key) is B. Note that the wrong answers (distractors) in a hinge question must be plausible, so that students cannot get the right answer with the wrong thinking. A really good hinge question has distractors that each reveal a specific misconception.
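
To make the “collect and interpret in under 30 seconds” idea concrete, here is a minimal sketch of tallying class responses against the misconception each distractor is designed to reveal. The options, misconception labels and class responses below are all hypothetical, not taken from the example question.

```python
# Sketch: tallying hinge-question responses by misconception.
# All options, misconceptions and responses are illustrative.
from collections import Counter

# Key is "B"; each distractor maps to the wrong thinking it exposes.
misconceptions = {
    "A": "confuses reliability with validity",
    "B": None,  # correct answer
    "C": "confuses accuracy with reliability",
    "D": "thinks repetition alone guarantees validity",
}

# Responses read off the mini whiteboards (hypothetical class).
responses = ["B", "A", "B", "C", "B", "A", "B", "B", "C", "B"]

tally = Counter(responses)
correct = tally["B"]
print(f"{correct}/{len(responses)} answered correctly")
for option, count in sorted(tally.items()):
    if misconceptions[option]:  # skip the key, report each distractor
        print(f"{count} chose {option}: {misconceptions[option]}")
```

A glance at the output tells the teacher not just how many students got it wrong, but which wrong thinking to address in the follow-up.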

Here is another example of a hinge question from Education Scotland.

hinge question maths example

For this question, the key is B. The annotated blue boxes show the wrong thinking behind each distractor.

So how do you implement hinge questions? How do you ask them so that every student responds and you can collect and interpret their responses, and decide the next step in under 30 seconds?

No hands up

The first thing to do is to create a class culture of “No Hands Up”. Students can only put up their hands to ask questions, not to answer them. Either everyone answers or the teacher selects who answers. When the teacher selects, it must be done randomly so that every student is accountable for answering the question. This ensures that it is not just the “Lisa Simpsons” of the class who answer while the daydreamers opt out. To make this happen, teachers can use mini whiteboards and a randomisation method.

Mini whiteboards can be purchased or made cheaply by laminating pieces of white paper. For hinge questions, students write down their response (A, B, C, D, etc.) and hold up their whiteboards when the teacher says so. This allows the teacher to scan every board, and so every student’s response, to see approximately how many students have understood the critical concept. The teacher can then decide what activities the class can do while intervening with those students who do not understand. The key to hinge questions is to intervene during the lesson.

As Dylan Wiliam says,

It means that you can find out what’s going wrong with students’ learning … If you don’t have this opportunity, then you’ll have to wait until you grade their work. And then, long after the students have left the classroom.

Alternatively, you can use digital tools like Plickers, Kahoot and Mentimeter. I personally find mini whiteboards the easiest to implement.

While hinge questions require everyone to respond, other questions are more suited to randomly selecting a student to respond. Teachers can use these strategies:

  • Digital random name generator from tools like Classtools and Class Dojo.
  • Writing each student’s name on paddle pop sticks and selecting a stick out of a cup
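
As a rough illustration of the paddle-pop-stick approach, here is a minimal digital picker. The class names are made up, and this is just one way to guarantee every student is drawn once before anyone is drawn again.

```python
# Sketch: a digital "paddle pop sticks in a cup" picker.
# Names are hypothetical; each student is drawn once per cycle.
import random

class StickCup:
    def __init__(self, names):
        self.names = list(names)
        self.cup = []

    def draw(self):
        if not self.cup:                  # cup empty: refill and shuffle
            self.cup = self.names.copy()
            random.shuffle(self.cup)
        return self.cup.pop()             # take one stick out

cup = StickCup(["Amy", "Ben", "Caleb", "Dana"])
picked = [cup.draw() for _ in range(4)]
print(picked)  # a random order, but every student appears exactly once
```

Unlike calling on students purely at random each time, this mirrors the physical cup: no name repeats until the whole class has answered.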

paddle pop sticks

Higher order questions

Selecting a student at random to answer is more suited to higher order questions. The key is to create and pre-plan higher order questions to take to class, to avoid asking too many lower order ones. There are numerous strategies for planning a sequence of lower order to higher order questions. There are heaps of resources for using Bloom’s question stems (just Google it). The strategy I find less popular, but more accessible to students, is the Wiederhold question matrix.

question matrix

Questions are created by combining a column heading with a row heading, e.g. What is…, Where did…, How might…
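
The combining step can be sketched in a few lines of code. The headings below are an illustrative subset of the matrix, not the full Wiederhold table.

```python
# Sketch: generating question stems by crossing matrix headings.
# These headings are an illustrative subset of the question matrix.
from itertools import product

question_words = ["What", "Where", "Who", "Why", "How"]
verbs = ["is", "did", "can", "would", "might"]

# Every (row, column) pairing becomes a stem.
stems = [f"{q} {v} ..." for q, v in product(question_words, verbs)]
print(len(stems))   # 25 stems from 5 x 5 headings
print(stems[0])     # "What is ..."
```

Even this small subset yields 25 stems, which shows why the matrix is a quick way to move a question sequence beyond recall.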

Teachers can put a stimulus in the middle of the table for students to create their own question, like this source I found via Kate Littlejohn for Stage 6 Modern History.

question matrix history

Some sample questions include:

  • What is an ally? What is an opponent?
  • Who decides who is an ally and who is an opponent?
  • What is WWI? Where did it happen?
  • Why did WWI happen?
  • How would you decide who paid the highest price in WWI? What criteria would you use?
  • How might the numbers in each category compare if a world war happened today?

Neither hinge questions nor sequences of higher order questions are easy to create. It is worthwhile for teachers to build a bank of hinge questions and higher order questions as they collaboratively create units of work and resources.

You can find more information and resources on questioning in assessment for learning here.

Wait, wait and wait

Lastly, regardless of what questions you are asking (hinge, higher order questions, questions to wake up students), remind yourself to wait. Wait at least 3 seconds for lower order questions and more than 3 seconds for higher order questions; the longer the better.

Potential of hinge questions in flipped learning

As an interesting note, I think hinge questions can be very useful in flipped learning. The hinge questions can be asked at the start of the lesson to assess who has understood the concept from the instructional videos and who hasn’t so the teacher can decide on how the rest of the lesson should run. Hinge questions can also be incorporated into the instructional video at key points so that the video continues in a certain way if students answer correctly and in another way if students answer incorrectly.

Formative assessment with hexagons

Formative assessment is something I’ve been putting a lot more emphasis on over the past few years. I’m so sick of relying on end-of-topic exams alone to gauge what students have learnt. I want my students to continuously question how they are going and adjust their learning accordingly. This is one of the reasons that my faculty has embarked on a Structure of Observed Learning Outcomes (SOLO) journey this year. One way that many teachers using SOLO assess student learning is with SOLO hexagons.

SOLO hexagons involve placing the major concepts or ideas from a topic individually onto hexagons. Students then work individually or in groups to connect the hexagons, and they must justify why they have made each connection. It is how students have connected the hexagons, and their justification of WHY they have done it that way, that allows both the teacher and the student to assess the learning and thinking against the SOLO taxonomy (although the hexagon activity still works with no understanding of SOLO).

Here’s a video showing one way of using the SOLO hexagons in a UK science class.

Here’s an explanation of how to use SOLO hexagons from the SOLO guru, Pam Hook.

I changed the hexagon activity slightly to suit the needs of my students. The picture shows the instructions that my students received.

instructions for hexagon activity

And here are the hexagons my students used (note that the hexagons were pre-cut for students and placed into zip lock bags with the above instruction card). My students worked in groups of 2 to 4. I used the SOLO hexagon generator to create the hexagons.

Here are some samples of the hexagons my students made.

photos of student hexagon arrangements

Some things I noticed were:

  • My students were all fantastic at explaining each hexagon concept
  • Some groups connected all the nervous system concepts and the endocrine system concepts together, showing they understood that the nervous system and endocrine system work together. However, all groups kept the immune system concepts completely separate. I had spent a lot of class time making it explicit that the nervous system and the endocrine system work together to control and coordinate the body, and the students’ project was to make a fact sheet about how a particular disease/health issue affected both systems. Yet they seemed to think that the immune system works on its own, completely separate from the other systems.

From this activity we discussed their SOLO levels of understanding and how they can use their hexagon connections to see whether they were at a unistructural level, multistructural level, relational level or extended abstract level. Most students concluded they were at a relational level for most concepts and some thought they were extended abstract for some parts of the topic.

The SOLO hexagon activity is definitely something I will use again with my students. Now that they have done it once, the next time will run even better. Feedback from students was that they enjoyed talking about science with each other and that they learnt a lot from each other just by listening to what others had to say about each concept.

 

Small changes can make a huge difference

Over the past few years I have been constantly changing the way I teach, due to the introduction of 1:1 laptop initiatives in some classes and a continually developing understanding of how students learn. In a lot of cases it has involved turning things upside down and completely rewriting units of work. This is tiring. Worth it, but tiring. I found out recently, though, that small, minor changes can make a huge difference too. The Student Research Project (SRP) has been around since I was in high school. It’s an oldie but a goodie. The SRP involves students planning, doing and reporting on an experiment of their choice. It is a compulsory activity for all Year 7-10 students in NSW, Australia: each student must do at least one SRP in Years 7-8 and another in Years 9-10. By doing the SRP, students learn how to design a fair experiment, a must-have skill for all scientists! See here for more info on the SRP.

It was the Year 8s’ turn to do the SRP in September this year. The traditional way of doing the SRP is for students to choose an experiment, plan it, do it and then submit a written report. This year my faculty decided to revamp it rather than rehash the status quo. This didn’t involve major changes that would stress everyone out; it involved a few tweaks that would have the most impact. As always, we gave students the choice of whatever experiment they wanted. My class did experiments ranging from the water absorption of different types of soils, to whether particular types of video games improve people’s reaction times, to using Garry’s Mod to run a simulated experiment. However, instead of forcing students to write a report, we let them choose how to present their SRP findings in whatever medium they wanted. Some students still chose to submit a written report (but shared it as a Google document to make the feedback process more efficient) while other students chose to create Prezis or videos. Students had to justify why their chosen medium would be the most effective in communicating their findings to others. At the conclusion of the SRP, students shared their findings with their class over a two-day conference, just like real scientists.

In the presentations I would usually get students to give each other feedback (one medal and one mission) by writing it down on a piece of paper, which I would take home, collate and then give back to students. This was a really inefficient way of doing it. Students had to wait at least 24 hours to get peer feedback and it took me time to type up the students’ comments. This time I decided to create a backchannel on Edmodo that students used to give feedback to each presenter. Students did this using laptops. A designated student had the role of creating a post for each presenter, and the whole class would then reply to that post with a medal and a mission for the presenter. Doing it this way meant that presenters got their feedback as soon as they finished presenting; they didn’t have to wait until the next day after I’d collated the class’s feedback. Students really liked the immediacy of the feedback they got from the Edmodo backchannel. One student who made a video for his SRP was ill over the two days of presentations; his video was still shown and he was able to receive feedback on it at home from his peers via the backchannel.

A sample of the Edmodo backchannel

So with just a little tweaking, the good ol’ SRP has been thrust into the 21st century. I didn’t have to completely rewrite it or turn it upside down. Just by adding Google Docs, more student choice and Edmodo, the SRP became a far better learning process for students. In the end-of-term evaluations, many students across all Year 8 classes identified the SRP as their favourite activity this term because it gave them choice, it let them use technology and they learnt by doing.

Next time I’d like to have students sharing their findings with a global audience, or at least with an audience beyond their class. But one small step at a time 🙂

Student Research Project – crowd-sourcing feedback

This is a draft version of a Year 8 assessment task called the Student Research Project. It is quite a big task, spanning over a month, in which students plan, conduct, analyse and present a scientific experiment.

This assessment task has already gone through a few feedback cycles within my school, but I’d like some feedback on it from educators, parents, scientists or anyone beyond that. The task is designed so that it caters for a range of teachers and students. For example the task leaves it up to the teacher and their students to decide HOW they will present the task (they can submit it as a traditional word-processed document or they can make a video, etc). The task can also be turned into project-based learning for those classes that have gone down that path.

Using video as evidence of learning

Today my Year 8s used lollies and toothpicks to model elements, molecules, compounds and mixtures. This isn’t anything new; lots of teachers and students have done it before. However, I decided to let students film themselves explaining how the lolly models they made represent elements, molecules, compounds and mixtures, as evidence of learning. For one group, I recorded a question-and-answer conversation on my iPad.

The video showed that this student understood to a certain extent how particles are arranged in elements, molecules, compounds and mixtures. The student did accurately use the lollies for this, but upon questioning, she was confused about how many different types of particles made up her lolly models of compounds and mixtures.

I’d like this type of evidence of learning to be prominent in schools. As a system, I think we rely too heavily on written exams and assignments to elicit student understanding of concepts. Videos such as the one shown above are much more powerful for giving feedback to students and for use as evidence of learning. Eventually I’d like each teacher in my faculty to have a collection of videos like this for professional discussions on our students’ learning.

Write down everything you know … NOW

exam room

In the past few weeks the following things have happened that have annoyed me and made me reflect:

  • I completed an exam for my uni subject as part of my postgraduate studies
  • Year 10 students completed their School Certificate exams
  • Year 7 and 9 students completed yearly exams

Those who follow me on Twitter know all too well my opposition to completing an exam for uni. The exam was for a subject called “Social networking and online communities”, and it consisted of multiple choice questions, short answer questions and one essay question. The subject was meant to teach us how to build and sustain a successful online community whose members share and collaboratively create knowledge. In my tweets and my uni forum posts, I complained that this end-of-semester test did nothing but assess our ability to memorise and regurgitate information. It didn’t actually test my understanding of online communities or my ability to create and sustain them. For example, I memorised that ethnography involves participant observation, but I have no idea what that means. Still, I was able to memorise it and regurgitate it in the exam, so I got a mark for it. While the content of the uni course was actually quite interesting, studying for the exam ruined the learning experience.

Meanwhile, in my last lesson with my Year 10 class before the School Certificate exams, one student asked, “Do we have science after the School Certificate?” I said yes. This student replied “But what’s the point?”

That really upset me. This student saw our science lessons as nothing more than a way to pass a test; after the test, learning doesn’t matter. School is supposed to be a place where we nurture the curiosity of young people, a place where students want to learn. School isn’t supposed to be somewhere you go to pass an exam and then somehow become “free”. Yet for many of our students, school has become a place where they cram in as many facts as they can, spill them out in an exam and forget them as soon as they leave the exam room. And what for? So they can get a piece of paper at the end. As a uni student, I hated being treated this way. Besides educational institutions, who else would insist that writing answers as fast as you can in a set time frame is an accurate way of finding out what you know and can do? It’s not as if you get the exam back, either. All you get is a piece of paper with a grade and/or a number. You have no idea which areas you are good at and which areas you can improve on (and how).

So why do schools do it? Why do we as teachers insist on exams?

I’m not saying that tests don’t have their place in education. Regular tests can give lots of useful information to students and teachers, but why can’t we have other assessments that hold the same value in the community as exams? Why can’t we use portfolios, interviews or collaborative assignments that are weighted the same as exams? There must be better alternatives than sitting our students in a hall and telling them “write down everything you know now … you have two hours”. With less emphasis on exams, students would probably enjoy learning at school a lot more. Isn’t that what school is for?

Enhancing formative assessment & personalised learning – add on benefits of gamification

It has been two weeks since I implemented gamification in my Year 10 Science class. Five out of six teams have completed the first two quests and have been awarded the “Cool Scientist” achievement badge and the password to level up to Quest 3. Engagement and motivation have definitely increased for 99% of the students. I now get nervous when I log onto Edmodo because I know there’ll be heaps of work uploaded by students with comments such as “please mark asap”. At the end of every lesson, almost every student submits one or two pieces of work on Edmodo for me to mark. I have to be honest: marking their work every night has been hard work. However, because the students are handing in quality work so regularly, I can easily analyse their areas of strength and areas for improvement.

Before I go into this further, I want to emphasise that every teacher, including myself, knows the benefits of formative assessment. (For non-education readers, formative assessment is about regularly finding out what students can and cannot do through class tasks, with students given detailed written feedback. In many ways it is more effective than making students sit an end-of-topic exam.) However, many teachers know how difficult it is to gather student work regularly for assessment. Many classrooms involve students doing a task and then the teacher going through the answers with the whole class. Students mark the answers themselves, and many do not know what they need to improve on or, more importantly, how they can improve.

So back to gamification. Since the students are so keen to submit their work, I have an opportunity after every lesson to see whether they “get it”. What I found is that designing scientific experiments is much harder for this class than I expected. I also found that they cannot construct tables that present data in a way that shows trends. While most students understood independent, dependent and controlled variables, some still didn’t. From this I was able to provide detailed written feedback via Edmodo for each student after every lesson, and to plan mini-lessons at the start of each lesson to go through the concepts they needed to improve on. This was followed by students working in teams on their quests.

I can see so much potential in using gamification to enhance formative assessment, which in turn feeds better personalised learning. When I implement gamification for the next topic, I want to use it to enhance personalised learning. Here’s my idea: when students complete quests in the game, there are multiple parallel levels (tasks) that I as the teacher can give students depending on their needs. For example, the next topic is chemical reactions. If a student is capable of completing word equations, I can give them the next level of writing chemical equations with chemical symbols as their “level up”. For a student who needs more time with word equations, I will provide more levels of practising word equations. Points and levelling up are tailored for each student. I know this is a very ambitious plan and I’m still ironing out some ideas, but I think using gamification to engage and motivate, enhance formative assessment and better inform personalised learning can reap great benefits for our students.
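
The parallel-levels idea could be sketched as a simple rule for choosing each student's next quest level. The level names and the mastery threshold below are hypothetical placeholders, not part of the actual plan.

```python
# Sketch: choosing each student's next "level" from their last result.
# Level names and the 0.8 mastery threshold are illustrative only.
def next_level(current_level, score, threshold=0.8):
    """Level up on mastery; otherwise give more practice at this level."""
    levels = ["word equations", "symbol equations", "balancing equations"]
    i = levels.index(current_level)
    if score >= threshold and i + 1 < len(levels):
        return levels[i + 1]            # mastered: level up
    return current_level                # more practice at this level first

print(next_level("word equations", 0.9))  # "symbol equations"
print(next_level("word equations", 0.5))  # "word equations"
```

The point of the rule is that two students finishing the same quest can receive different next tasks, which is exactly the tailored points-and-levelling idea described above.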