Noun-phrasing is a strategy used by psychologists to encourage a desired behavior, for example voting. Simply put, asking someone 'to vote' is much less effective than using the noun version: asking that same person 'how it feels to be a voter'.
The same approach has now been applied to children: rather than being asked to vote, they were asked to help.
The experiment involved discussing the topic of helping with three groups of children: one group discussed that 'some kids are helpers' (the noun condition), another discussed that 'some kids choose to help' (the verb condition), and a third group did not discuss helping at all. The children were then placed in a home-like setting, surrounded by toys, and four different help situations were simulated (putting away toys, opening a container, cleaning up a mess, and picking up spilled crayons); the researchers counted the number of occasions on which children actually helped. This was, of course, done experimentally: the key comparison was between the noun-phrasing condition and simply 'asking kids to help'.
The outcomes, as you would expect, indeed show that kids in the noun condition helped considerably more often than those in the verb condition, a difference of 29%. There was no difference between the other two groups, which makes the noun-based intervention stand out even more.
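To make the comparison concrete, here is a minimal sketch of how the helping rates per condition and the noun-versus-verb difference could be tallied. The observation data below is entirely made up for illustration; it does not come from the study.

```python
from collections import defaultdict

# Hypothetical records: (condition, helped) for each simulated help
# situation a child faced. All numbers are invented for illustration.
observations = [
    ("noun", True), ("noun", True), ("noun", False), ("noun", True),
    ("verb", True), ("verb", False), ("verb", False), ("verb", False),
    ("control", True), ("control", False), ("control", False), ("control", False),
]

def helping_rates(obs):
    """Return the fraction of occasions on which children helped, per condition."""
    counts = defaultdict(lambda: [0, 0])  # condition -> [times helped, total occasions]
    for condition, helped in obs:
        counts[condition][0] += int(helped)
        counts[condition][1] += 1
    return {cond: helped / total for cond, (helped, total) in counts.items()}

rates = helping_rates(observations)
difference = rates["noun"] - rates["verb"]
```

With real data, the same tally over the four help situations would yield the reported gap between the noun and verb conditions.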
Why does this work? The basic hypothesis of these two studies is explained below:
Probably this week or the next, edX will roll out what I think is a transformational tool for teachers: A/B testing functionality inside the course development environment. It allows teachers to do, as you might guess, A/B testing, which means showing different versions of course content to randomized groups of students. Within one course, teachers can experiment with different types of videos, test motivational strategies, implement psychological interventions, and see the results in student engagement nearly instantly and in achievement over time. See the screenshot below for an impression of the interface.
Why transformational? It enables teachers to better understand what works and how content should be created for maximum effect, through a rather rigorous process of showing different versions to randomized groups of students. Not only can the click data be used to determine "the best version"; combined with the available student data, analyses can be run to understand how different versions work for different subgroups of students. For example, if you want to know the effect of different message strategies on motivating students to do their homework, a result could be that <Imadethisup>a direct call to action is more effective among younger US students, while a nudging strategy proves effective among Asian student populations.</Imadethisup>
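The core mechanics behind such a tool are straightforward: assign each student deterministically to a variant, then break engagement down by variant and subgroup. The sketch below is my own illustration of that idea, not edX's actual implementation or API; the experiment name, subgroup labels, and event data are all hypothetical.

```python
import hashlib
from collections import defaultdict

def assign_variant(student_id, experiment, variants=("A", "B")):
    """Deterministically bucket a student into a variant by hashing their id.
    The same student always sees the same version of a given experiment."""
    digest = hashlib.sha256(f"{experiment}:{student_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

def engagement_by_subgroup(events):
    """events: iterable of (variant, subgroup, engaged) tuples.
    Returns {(variant, subgroup): engagement rate}."""
    counts = defaultdict(lambda: [0, 0])  # (variant, subgroup) -> [engaged, total]
    for variant, subgroup, engaged in events:
        counts[(variant, subgroup)][0] += int(engaged)
        counts[(variant, subgroup)][1] += 1
    return {key: engaged / total for key, (engaged, total) in counts.items()}

# Hypothetical usage: bucket a student, then compare engagement per subgroup.
variant = assign_variant("student-1", "homework-message")
rates = engagement_by_subgroup([
    ("A", "younger", True), ("A", "younger", False),
    ("B", "younger", True), ("B", "younger", True),
])
```

Hashing rather than random assignment is a common design choice here: it needs no stored assignment table, yet guarantees students are not shuffled between versions mid-course.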
Traditional one-directional lectures are moving online, and ever larger, more diverse populations of students (including adults) need to be served. Tools like this will become invaluable for providing a level of personalization and for enhancing teachers' ability to understand their product and their student population.
I will use this tool in a collaboration with Stanford University's Psychology Department to design an experiment addressing self-affirmation theory in our upcoming MOOC on Next Generation Infrastructures. We will keep you posted about that.
More info on the A/B testing tool for edX here.