Do women improve decision-making on boards?

25 Nov 2023 | A fact is not data, A statement is not fact, Confirmation bias

Last week, Harvard Business Review published an article entitled “Research: How Women Improve Decision-Making on Boards”. It was widely shared on LinkedIn and someone tagged me in it, given my research on diversity, equity, and inclusion. When I became Managing Editor of the Review of Finance, I appointed the first women to its board of editors in our 20-year history, so I’d like to believe the findings. However, it’s important not to take claims at face value, particularly when we’d like them to be true, because confirmation bias may be at play. I read both the article and the research underpinning it, and unfortunately the evidence doesn’t come close to supporting the headlines. My analysis below is based purely on the quality of the evidence, not on the claims themselves or on my beliefs about diversity.

1. The Study Doesn’t Measure Decision-Making Quality

Despite the title (“improve decision-making”), the study doesn’t measure the quality of decision-making. Nor does it measure the quantity of decision-making. Nor does it measure anything to do with decisions at all, nor anything related, such as firm performance. It simply asks board members how women behave in the boardroom. Reported performance is very different from actual performance, as I have explained in a previous post. Given the unstoppable push towards more board diversity, and the widespread claims that diversity improves performance (even if based on cherry-picked evidence), respondents may well report superior behaviour because they believe it to be true.

The underlying paper has a much more accurate title, “Women Directors and Board Dynamics: Qualitative Insights from the Boardroom”. However, as usual, HBR went with something more click-baity.

2. The Sample Selection is Highly Skewed

For a study on board diversity, the sample selection is highly non-diverse. Out of the 49 directors interviewed, 73% (36) were women and 27% (13) were men. Moreover, the individual responses cited in the paper are highly skewed. For the “supporting evidence” for the topic “Politics in board meetings”, the authors cite statements from five women and no men. For “Point of attention”, it is eight women and no men. For “Openness to different points of view”, it is eight to one; for two other questions it is six to two; for two more it is five to two. The study essentially finds “female directors claim female directors behave better”. (Similarly, a study should not overrepresent men, as that might equally lead to biased findings.)

Moreover, how the 49 directors were chosen is highly suspect. The authors first reached out to “personal contacts through our universities and alumni networks”. Since a study claiming that diversity improves performance is more likely to be published, a respondent may be skewed towards giving answers that would benefit his or her alma mater. Even worse, they then “asked initial interviewees to suggest other board members from their networks”. This may lead to the complete opposite of cognitive diversity, as interviewees are more likely to suggest others who will reinforce their responses.

I am very open to qualitative research, and have conducted it myself in a recent survey of directors and investors on CEO pay. However, in qualitative research, where you obtain people’s subjective opinions, it is crucial to ensure an unbiased sample. In our survey, we attempted to contact every director of a FTSE All-Share company and every fund manager of a UK equity fund.

3. The Data Collection is Highly Suspect

First, the authors conduct interviews with open-ended questions, so their interpretation of the responses is highly subjective and may be skewed by their own biases. Indeed, this seems to be the case. Consider, for example, the response “I have never sat on a board where a woman says nothing. Whereas I have sat on boards where men say nothing”, which is highlighted not only in the paper but also in the HBR article. This is used as evidence that women contribute more.

In my TED talk, I recommend asking yourself how you’d react if a statement were the opposite. What if a respondent said “I have never sat on a board where a man says nothing. Whereas I have sat on boards where women say nothing”? This might be interpreted as evidence that men talk too much and feel they have to say something regardless of expertise. Indeed, in the very same table row of the paper, there is the quote “‘I have absolutely no knowledge in this area, so I am looking forward to hearing what you have to say.’ And I thought to myself that not many men would have said something like that.” Thus, listening is initially praised, but a couple of lines down it is condemned as a lack of contribution.

Second, there are no statistical tests anywhere in the paper. We don’t know whether the number of respondents who claimed women contributed more is significantly higher than the number who did not. The paper only contains cherry-picked quotes, and every single quote given is in favour of the authors’ hypothesis (for a paper on diversity, the views presented are notably non-diverse). This contrasts with a standard empirical paper, where authors will often include a couple of specifications with insignificant results.
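To illustrate the kind of test that is missing (with hypothetical numbers, since the paper reports none), suppose 30 of the 49 directors had claimed that women contribute more. A simple exact binomial test, sketched in Python below, asks whether that count is significantly above a 50/50 split:

```python
from math import comb

def binom_tail(k: int, n: int) -> float:
    """Exact one-sided P(X >= k) under a fair-coin null (p = 0.5)."""
    return sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n

# Hypothetical: 30 of 49 respondents favour the hypothesis.
p_value = binom_tail(30, 49)
print(f"p-value: {p_value:.3f}")  # roughly 0.08: not significant at the 5% level
```

Even with this generous hypothetical split, the result would not be statistically distinguishable from chance at conventional levels, which is exactly why such tests matter.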

Third, the interviews are conducted in person, and both authors are women. Even though none of the interviewees is identified in the paper, it would still be very unlikely for a respondent to say out loud that women are less effective directors.

For these reasons, interviews are almost never used as the primary data collection method in economics. The main way of obtaining qualitative insights is through surveys, in which questionnaires are distributed to respondents. This approach has the following advantages:

  • Every respondent is given the same questions.
  • Every question is worded in exactly the same way. This is particularly important since slight variations in the question, or even in tone of voice, can skew a respondent’s answer. Researchers rigorously beta-test the wording beforehand to ensure it introduces no bias.
  • Respondents fill in the surveys themselves, without the pressure of researchers being in the room.
  • Respondents choose a number (often between 1 and 5) for how strongly they agree with a statement, meaning no subjective interpretation by the researcher is necessary (or possible).
  • Since the responses are numerical, they can be statistically analysed.

Sometimes survey researchers will conduct follow-up interviews with a small proportion (e.g. 5%) of respondents to find out why they answered the way they did, to ensure correct interpretation of their responses. But the responses have already been made, and the main inferences are obtained from a statistical analysis of the numerical responses.
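As a sketch of that final statistical step, here is how numerical 1-5 agreement scores might be compared across two respondent groups. All numbers below are made up for illustration; none come from the study:

```python
from math import sqrt
from statistics import mean, stdev

# Hypothetical 1-5 agreement scores for a statement such as
# "women directors contribute more", split by respondent group.
group_a = [5, 4, 5, 4, 4, 5, 3, 4]
group_b = [3, 3, 4, 2, 3]

def welch_t(a: list, b: list) -> float:
    """Welch's t statistic for two independent samples with unequal variances."""
    var_a = stdev(a) ** 2 / len(a)
    var_b = stdev(b) ** 2 / len(b)
    return (mean(a) - mean(b)) / sqrt(var_a + var_b)

t_stat = welch_t(group_a, group_b)
print(f"mean A: {mean(group_a):.2f}, mean B: {mean(group_b):.2f}, t = {t_stat:.2f}")
```

In practice one would convert the t statistic into a p-value (e.g. via scipy.stats), but the point stands: numerical responses permit exactly the significance tests that open-ended interviews rule out.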

Unfortunately, these concerns mean that we can take little away from the study. Women directors might still contribute more than male directors. And even if they don’t, there could be societal, moral, or ethical reasons for pursuing boardroom diversity. But this particular paper does not make either case.
