Does board diversity really improve environmental performance?

16 Dec 2023 | A statement is not fact, Confirmation bias, Data is not evidence, Diversity, Evidence is not proof

My LinkedIn feed came alive with this post:

[Screenshot of Francesca Gino’s LinkedIn post]

Francesca Gino is the Harvard Business School professor accused of faking data in her research. HBS is conducting an investigation into the matter and I am reserving judgement until its outcome; this post is about something completely different. My TED talk highlighted that even if data is 100% accurate, we may make misleading inferences from it. Gino’s post spreads misinformation by making claims that are not at all justified by the actual evidence. I went to the HBR article that Gino’s post linked to, and the study on which the article was based. Unfortunately, all three (the study, the HBR article written by the study’s author, and Gino’s post on the HBR article) make elementary errors.

1. The Study Does Not Measure ESG Ratings

Here’s the title of the HBR article:

[Screenshot of the HBR article’s title]

and here’s the abstract of the paper:

[Screenshot of the paper’s abstract]

The HBR article mentions ESG ratings, and the abstract mentions ESG three times. However, the paper never studies ESG ratings; it only studies the Environmental scores from a single data provider and ignores both Social and Governance performance. You cannot extrapolate from a single tree to the entire forest, particularly since E, S, and G are often unrelated to each other and sometimes conflict.

The start of Gino’s post correctly describes the study as being about environmental performance. However, she later refers to “sustainability” twice (which is typically used to mean ESG as a whole), and to “innovation”, which is something completely different.

This is an example of the first misstep up the Ladder of Misinference: a Statement is not Fact because it may not be Accurate.

2. The Study Ignores Many Common Factors

What if we only care about environmental performance and not ESG performance? Sure, the study, the HBR article, and (to a lesser extent) Gino were a bit sloppy with their writing, but this doesn’t change the fact that the study finds a correlation between diversity and environmental performance. However, all three mistake this correlation for causation, claiming that diversity causes better environmental performance:

  • Study: Its title ends with “Board Diversity Helps Improve Firm Sustainability” and the abstract recommends “legislation aimed at improving board diversity”.
  • HBR article: It claims that “board diversity tangibly and positively affects a firm’s environmental sustainability”.
  • Gino’s post: Its first paragraph states that “Increasing board diversity boosts a firm’s environmental performance.”

The only control variable that the author uses (aside from board metrics and fixed effects) is 12-month Fama-French risk-adjusted stock performance. Thus, a huge number of common factors could be driving both board diversity and environmental performance:

  • Profitability. More profitable firms can invest more in environmental performance. They may also have the time and space to invest in board diversity.
  • Financial strength. This can drive both outcomes, for similar reasons. Several measures are available, such as leverage, cash holdings, and whether the firm pays a dividend.
  • Corporate governance. Well-governed companies improve environmental performance, and also have more diverse boards.
  • Management quality. This can drive both outcomes, for similar reasons.

In addition to common factors, there could be reverse causality, where firms with stronger environmental performance attract more diverse directors. Given the push for diversity, diverse directors may be able to take their pick of firms, and they may be more likely to choose companies with strong environmental performance to avoid the reputational damage of being associated with a firm that suffers an environmental scandal.
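
To make the omitted-variables concern concrete, here is a minimal sketch of the kind of check one could run. It is not the author’s actual specification, and every variable name (env_score, board_diversity, ff_adj_return, roa, leverage, cash_holdings, pays_dividend, gov_score) is a hypothetical placeholder. The idea is to compare the diversity coefficient before and after adding the common factors listed above: if it shrinks materially, the baseline correlation was at least partly driven by omitted variables.

```python
# A minimal sketch, not the author's actual specification. All column names
# are hypothetical placeholders for whatever data a replication would use.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("firm_year_panel.csv")  # hypothetical firm-year panel

# Baseline: roughly the specification described above - board diversity plus
# 12-month risk-adjusted returns, with firm and year fixed effects.
baseline = smf.ols(
    "env_score ~ board_diversity + ff_adj_return + C(firm_id) + C(year)",
    data=df,
).fit(cov_type="cluster", cov_kwds={"groups": df["firm_id"]})

# Augmented: add candidate common factors (profitability, financial strength,
# governance). A materially smaller diversity coefficient here would suggest
# the baseline correlation reflects omitted variables rather than diversity.
augmented = smf.ols(
    "env_score ~ board_diversity + ff_adj_return + roa + leverage"
    " + cash_holdings + pays_dividend + gov_score + C(firm_id) + C(year)",
    data=df,
).fit(cov_type="cluster", cov_kwds={"groups": df["firm_id"]})

print("Baseline diversity coefficient: ", baseline.params["board_diversity"])
print("Augmented diversity coefficient:", augmented.params["board_diversity"])
```

Even then, such a check only deals with observable common factors; it cannot rule out unobservable ones, nor the reverse causality just described.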

This is an example of the third misstep up the Ladder of Misinference: Data is not Evidence because it may not be Conclusive.

3. The Articles Make Claims That Extend Far Beyond the Findings

Even if the study had perfectly nailed down its results, and there were no problem with mistaking correlation for causation, the articles make many claims that extend way beyond the actual analysis conducted. The following are examples from the HBR article:

  • “Shareholders should understand that DEI is not just about improving diversity, but embracing the whole host of benefits that come along with it. Leadership diversity may bring broad-based benefits to society by promoting sustainable business practices.” But the author only considers a single outcome, not “a whole host of benefits”, “broad-based benefits”, or “sustainable business practices”.
  • “Business leaders ought to bolster minority directors and give them the space to champion issues they care about.” This is a huge stretch. Directors do not have the prerogative to champion issues that they themselves care about. Some argue that directors are accountable to shareholders and thus should only champion issues their long-term shareholders care about (and many indeed have concerns beyond financial returns, including environmental performance). Others argue that directors are accountable to stakeholders and so should champion issues that their stakeholders (e.g. employees) care about. Either way, directors should not champion their own pet causes.
  • “Regulators might consider ways to promote board diversity … the implementation of diversity rules would accelerate levels of diversity, bring positive environmental benefits, and ensure minority directors have a strong voice in the board room.” Again, this is a huge stretch. Even if there were a causal link between diversity and performance, this does not imply a case for regulation, as this would prevent firms from choosing the optimal level of diversity to maximise performance. This optimum likely varies from firm to firm rather than being one size fits all.

This is an example of the fourth misstep up the Ladder of Misinference: Evidence is not Proof because it may not be Universal.

4. The Interview Results Are Unreliable

The “mixed methods approach” described in the abstract of the paper involves supplementing the quantitative analysis with qualitative interviews. Unfortunately, as I wrote in a prior post, such interviews are unreliable as they suffer from confirmation bias. The author describes the interview results in his HBR article: “I showed my results to several business leaders from a social minority background. … Interestingly, few were surprised by what they saw”. As Mandy Rice-Davies is often paraphrased, “They would say that, wouldn’t they?” – it’s unsurprising that minority leaders would believe that minority directors improve performance. In addition to interviewing board members, the author undertook “interviews with consultants, especially with those focused on Diversity, Equity, Inclusion (DEI) efforts and sustainability”, which makes it unsurprising that they would claim a link between DEI and sustainability.

In total, the study interviews 11 people. All 11 are women; the author does not follow his own advice to obtain a diversity of perspectives. Over half of them have DEI listed as an area of expertise (see Table 1).

The Bigger Picture

The average reader won’t have the time to delve into the weeds of a study and understand its methodology. In my TED talk, I highlighted that a simple short-cut is to examine the credentials of the authors. The author is an Associate Consultant at Bain, which is an entry-level position, and the study appears to be an undergraduate thesis. It is incredible that HBR spread the results of an unvetted, unchecked undergraduate thesis to its substantial readership. I have written extensively about the merits of peer review, but have also recognised that peer review is slow and we can’t always wait until a study has crossed the finishing line before writing about it; thus, it’s legitimate to cover working papers by professors with expertise in research. But, while this might be a fine undergraduate thesis, the author has few research credentials.

Most of the blame does not lie with the author. If I had been invited to write an article for HBR when I was an Analyst at Morgan Stanley, I would have jumped at the chance, rather than admitting my lack of expertise and declining. Perhaps being invited to write such an article would have made me think that my undergraduate thesis was special, and I would have got carried away and innocently made claims that extended beyond the study. It is HBR that is at fault here for amplifying such a flimsy study. How is it that HBR ended up inviting an associate consultant to write an article based on his undergraduate thesis? Because of confirmation bias: they start with the view that diversity pays off, and thus invite articles that claim this, no matter how flimsy. Unfortunately, this is far from the first time that HBR has done this: see here for the problems in another diversity article they published.

Gino then amplified the HBR article further. Despite the recent controversy, Gino remains a LinkedIn Top Voice and has a large following. She made further claims not even suggested by the author, arguing that the study is “groundbreaking”, when in fact multiple articles have purported to find benefits of diversity. For example, this Reuters article claimed that diversity improves environmental performance and was widely shared on LinkedIn, even though the study that Reuters wrote about didn’t exist. People shared the article without checking whether there was a study behind it.

I recognise that I have written three posts highlighting the flaws in studies claiming that diversity improves performance. This is not due to any ideological bias against diversity; as an ethnic minority, I would benefit from diversity initiatives, and my own research shows a positive link between DEI and performance. Instead, my goal is only to highlight the importance of rigorous research. Unfortunately, given confirmation bias, flimsy papers claiming the benefits of diversity are more common than flimsy papers claiming its costs.
