
Secrets of Human-Capital Management 【August 2019】

Bumps on the road to evidence-based HR

By David Creelman

 

HR often has to deal with managers who have strong opinions about people management; that can be a problem when those opinions are not based on the facts. A manager may feel “the only way to motivate these people is to pay them more” or “my team shouldn’t need to be praised” even though research has shown otherwise.  Thus, it’s a relief to see the slow shift towards evidence-based management. I’m always delighted when I see a manager or HR professional base their recommendation on evidence, whether that be the academic research on goal setting, a survey of market pay rates, or an internal study on retention of seasonal staff.


We are on the road to evidence-based HR, which is a good thing, but we’ll need to steel ourselves for some bumps in the road.

 

What we think is evidence-based isn’t always so

Everyone “knows” that research has shown interviewers usually make a decision in the first few minutes of an interview. I needed more detail on this and was surprised to find that this well-known fact wasn’t a fact at all. Research by Frieder et al. found that only 30% of interviewers reported making a decision by the fifth minute of the interview. The idea that interviewers make snap decisions seems to be based on a single study conducted decades ago with a small sample. That one weak study made its way into the folklore of management. (See Frieder, R. E., Van Iddekinge, C. H., & Raymark, P. H. (2016). How quickly do interviewers reach decisions? An examination of interviewers' decision-making time across applicants. Journal of Occupational and Organizational Psychology, 89(2), 223-248.)


More recently, Harvard professor Amy Cuddy gave a TED talk claiming that research had shown that standing in the “Wonder Woman pose” would increase confidence (see the TED talk “Your body language may shape who you are”). This idea, so appealing, spread rapidly. Unfortunately, further research cast doubt on the finding. To add to the confusion, even more recent research suggests there may be something to Cuddy’s conclusion after all.


A more complex case involved an experiment with Israeli fighter pilots that appeared to show punishment was more effective than praise in improving performance. In the experiment, pilots who had an unusually good training run were praised, while those with a particularly poor run were punished. What happened? Performance of the punished pilots improved while that of the praised pilots declined. It seems like a definitive finding until you recognize that if you have an unusually good run the next one is likely to be worse, and if you have a particularly bad run the next one is likely to be better. The phenomenon is called “regression to the mean” and it invalidated the conclusion that punishment was better than praise. (See Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185, 1124-1131.)
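Regression to the mean is easy to demonstrate with a quick simulation. The sketch below (a minimal Python illustration; the pilot count and score distribution are invented for the example) applies no feedback at all, yet the "punished" group still improves and the "praised" group still declines, purely because extreme scores tend to be followed by more ordinary ones.

```python
import random

random.seed(42)

# Simulate two independent training runs for many pilots.
# Each run's score is pure noise around the same underlying skill,
# and no feedback (praise or punishment) is applied between runs.
n_pilots = 10_000
run1 = [random.gauss(0, 1) for _ in range(n_pilots)]
run2 = [random.gauss(0, 1) for _ in range(n_pilots)]

# "Praised" group: pilots whose first run was unusually good (top 10%).
# "Punished" group: pilots whose first run was unusually bad (bottom 10%).
cutoff_hi = sorted(run1)[int(0.9 * n_pilots)]
cutoff_lo = sorted(run1)[int(0.1 * n_pilots)]

praised = [(a, b) for a, b in zip(run1, run2) if a >= cutoff_hi]
punished = [(a, b) for a, b in zip(run1, run2) if a <= cutoff_lo]

def avg(xs):
    return sum(xs) / len(xs)

praised_change = avg([b - a for a, b in praised])    # expected: negative
punished_change = avg([b - a for a, b in punished])  # expected: positive

print(f"Praised group change:  {praised_change:+.2f}")
print(f"Punished group change: {punished_change:+.2f}")
```

Because the two runs are statistically independent here, the apparent "effect" of punishment is entirely an artifact of selecting on extreme first-run scores.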

 

We could go on and on with examples of things that people commonly believe are backed by research that, in fact, are not. These myths include the so-called “first mover advantage” and the idea that 70% of change efforts fail.

 

What to do?

When we present something as evidence-based and it turns out not to be true, two things happen:

  • It undermines our credibility
  • It discourages us from being evidence-based

Those are bad outcomes, so let’s see how we can avoid them.  The first step is to recognize that evidence-based management is in its infancy and so of course there will be rough spots. The Wright brothers’ plane was pretty lousy, but that didn’t lead us to believe that the whole project of human flight should be abandoned. So don’t be discouraged when you do make missteps.


The second step is to do a little research before you quote a “well-known fact”. I found the critique of Amy Cuddy’s work simply by Googling “Critique of Amy Cuddy”, and I found the research on “snap hires” by starting with a search in an academic database on “How fast do interviewers make decisions”. It takes some playing around to find the right search terms, but it’s not that difficult. If you don’t have time to do this yourself, then hire a graduate student on Upwork and ask them to check whether the fact you have in mind is largely true. (Note: these things can turn into huge research projects if you just ask the student to dig into a topic, and normally that isn’t what you are after. Ask the student to spend perhaps 1 to 3 hours looking into the topic to ensure you are not way off base with regard to the fact you are citing.)

 

The third step is to be modest in presenting your conclusions and to cite your sources. If you say “apparently most interviewers make snap decisions”, then your credibility would remain largely intact even if someone did dig up research showing otherwise.


Conclusion

We are better off making decisions based on the best available evidence rather than someone’s opinion.  Right now, that’s still hard to do and there is a lot of misinformation floating around. Don’t be discouraged. HR is getting better at this all the time and the resources that make it practical to assess the best possible evidence are improving rapidly. The trick is to balance your enthusiasm for evidence with a recognition that there will be some big bumps on the road.


David Creelman is CEO of Creelman Research. He is best known for his workshops on People Analytics, Evidence-based Management and the Future of Work.  You can connect with Mr. Creelman on LinkedIn or email him at dcreelman@creelmanresearch.com

 
