Spare a thought for the ‘experimenter effect’ in user research

Do you ever think about the impact of the experimenter effect (or Hawthorne effect) when you’re running face-to-face user research?

Here’s a quick test.

First, go and check your Analytics package to see how many users check your site’s Terms and Conditions before accepting them. My guess is that the number will be roughly 1-3% (maybe lower).

Now, take a look at the notes from your last few usability research projects. How many users diligently looked at the Terms and Conditions while you were watching over their shoulder? In my last few projects, it’s been 10-30%.

So, that’s roughly 10x more in my case. Pretty substantial. This is a perfect example of how people adjust their behaviour in face-to-face research sessions. As soon as you pay someone to sit in a room with you, give them a task and watch them intently, they will start doing and saying what they think you want.

The experimenter effect is unavoidable. I’m a huge advocate of face-to-face research, but this is one of the method’s biggest weaknesses (and, by the same token, one of the biggest strengths of Analytics).

What steps do you take to mitigate the experimenter effect?

Comments below please!