
Psychological safety

Overview

Psychological safety is a shared belief within a workgroup that people can speak up without being ridiculed or sanctioned. When psychological safety is present, team members worry less about the potential negative consequences of expressing a new or different idea; as a result, they speak up more and are more motivated to improve their team or company. No topics are felt to be “taboo” (subject to an unspoken understanding that certain issues are not to be discussed and resolved). The concept has been developed over the past 20 years by psychology researcher Amy Edmondson [Edmondson 1999], building on earlier research on workplace engagement [Kahn 1990] and on individuals’ confidence in their ability to manage change within an organization [Schein and Bennis 1965].

In the absence of psychological safety, people will hesitate to speak up when they have questions or concerns related to safety. This can lead to under-reporting of incidents, to poor quality of investigation reports (since people do not feel that it is safe to mention possible anomalies which may have contributed to the event), and to poor underlying factor analysis (it is easier to point the finger at faulty equipment than at a poor decision made by the unit’s manager). It can hinder organizational learning [Edmondson 2018].

Psychological safety is a component of a just culture, an atmosphere of trust in which people are encouraged, even rewarded, for providing essential safety-related information (including concerning mistakes made), and in which they are also clear about:

  • where the line must be drawn between acceptable and unacceptable behaviour, and
  • who gets to draw that line.

[Edmondson 1999] found that high-performing medical teams reported more mistakes than their low-performing counterparts. Note the classic counter-intuitive effect here: you would expect a lower-performing team to make more mistakes, and it probably does, but it reports only a small proportion of them. The research determined that the higher-performing teams were characterized by higher levels of psychological safety: their members were more willing to report errors to their superiors, without fearing consequences such as blame or ridicule.
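The arithmetic behind this kind of comparison can be sketched as follows. This is a hypothetical illustration only, not the method or data of [Edmondson 1999]: the survey items, the three-item scale, the team names and the error rates are all invented.

```python
# A minimal, hypothetical sketch: score a short Likert-style survey of team
# psychological safety, aggregate to team level, and compare with reported
# error rates. Items, scale and numbers are invented for illustration.
import numpy as np
import pandas as pd

# Individual responses on a 1-7 agreement scale; "blame" is negatively worded.
responses = pd.DataFrame({
    "team":     ["A", "A", "B", "B", "C", "C"],
    "speak_up": [6, 5, 3, 2, 5, 4],   # "It is safe to raise problems on this team"
    "blame":    [2, 1, 6, 5, 3, 3],   # "Mistakes are held against you" (reverse-scored)
    "help":     [6, 6, 3, 2, 5, 5],   # "It is easy to ask other team members for help"
})

# Reverse-score the negatively worded item on the 1-7 scale.
responses["blame"] = 8 - responses["blame"]

# Team-level psychological safety: mean item score within each team.
psych_safety = responses.groupby("team")[["speak_up", "blame", "help"]].mean().mean(axis=1)

# Hypothetical reported-error rates (errors reported per 1000 procedures).
reported_errors = pd.Series({"A": 12.0, "B": 4.0, "C": 9.0})

# The counter-intuitive pattern described above: teams with higher
# psychological safety *report* more errors (positive correlation).
print(np.corrcoef(psych_safety, reported_errors.loc[psych_safety.index])[0, 1])
```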

The concept has been recognized as important for team performance and degree of innovation by research at Google. Organizational rituals put in place at Google’s “moonshot” division X to improve psychological safety and encourage risk-taking at a business level included celebrating teams that killed one of their projects, casting failure as success [Bergmann and Schaeppi 2016].

A literature review of studies on this concept [Newman, Donohue, and Eva 2017] found that psychological safety has impacts on:

  • level of communication, knowledge sharing and voice behavior within teams;

  • learning behaviour (this impact is related to research on social learning, the way in which individuals learn through interactions in a social group);

  • performance, innovation and creativity;

  • work attitudes of individuals, such as organizational commitment.

Link with organizational silence

The lack of psychological safety is one factor that can lead to “organizational silence” (also called “employee silence”), a concept developed by management theorists at the beginning of the 2000s [Morrison and Milliken 2000; Milliken, Morrison, and Hewlin 2003; Pinder and Harlos 2001]:

the withholding of any form of genuine expression about the individual’s behavioral, cognitive and/or affective evaluations of his or her organizational circumstances to persons who are perceived to be capable of effecting change or redress.

Earlier research in management recognized that organizations are typically intolerant of dissent and send a message (sometimes explicit, often implicit) that employees shouldn’t “rock the boat”, or should “be team players”.

Note that psychological safety is a group phenomenon, and is a component of the culture of small work teams. It is influenced by the organizational culture (the set of beliefs, assumptions, values, norms, artifacts, symbols, actions, and language patterns shared by all members of an organization), by the professional group’s subculture, and by the more local characteristics of the specific work group. It is likely also to be influenced by the national culture (see box below). Psychological safety is not the same as interpersonal trust, which concerns two individuals, even if there is some overlap between the consequences of the two phenomena.

Subcultures within an organization are formed when members with common identities or job functions gather around their own interpretations of the organizational culture. Organizational theorist Edgar Schein identified three subcultures that are present in many organizations: executives, whose primary concern is financial performance; engineers, who resolve problems using technology and specialist knowledge; and operators, who run the systems within the organization. There are often also much narrower subcultures present within complex organizations.

Influence of the national culture

Influential research by social psychologist Geert Hofstede studied the impact of national culture on values in the workplace, based on the analysis of a large database of employee “value scores” collected within IBM between 1967 and 1973, covering 50 countries. He used factor analysis to identify a number of underlying dimensions along which cultural values can be analyzed: individualism vs collectivism, power distance, uncertainty avoidance, masculinity vs femininity, indulgence vs restraint, and long-term vs short-term orientation (the latter two dimensions having been added in later studies) [Hofstede 2001].

Most research on the concept of psychological safety has been undertaken in Western countries that score low on attributes such as collectivism, power distance and uncertainty avoidance; these low scores tend to favour the expression of new ideas. National cultures in many Asian countries score much higher on these attributes, which seems likely to lead to differences at the work team level. Hofstede cautioned against attempting to apply organizational theories developed in one country to other countries, in particular American theories that carry a cultural bias such that “the extreme position of the United States on the Individualism scale leads to other potential conflicts between the U.S. way of thinking about organizations and the values dominant in other parts of the world” [Hofstede 1980, 61].
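For readers curious about the statistical technique mentioned in the box, the sketch below applies factor analysis to a synthetic country-by-item matrix of workplace value scores. It only illustrates the dimensional-reduction idea; it does not use Hofstede’s data or reproduce his exact procedure, and all numbers are randomly generated.

```python
# Illustration of the dimensional-reduction idea behind factor analysis,
# applied to synthetic data (not Hofstede's survey data or exact method).
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)

# Synthetic data: 50 countries x 20 workplace-value items (standardized scores),
# generated from 4 hidden "cultural dimensions" plus noise.
n_countries, n_items, n_dims = 50, 20, 4
latent = rng.normal(size=(n_countries, n_dims))
loadings = rng.normal(size=(n_dims, n_items))
scores = latent @ loadings + 0.5 * rng.normal(size=(n_countries, n_items))

# Extract 4 factors; each country receives a position on each recovered
# dimension, analogous to a power-distance or individualism score.
fa = FactorAnalysis(n_components=n_dims, random_state=0)
country_positions = fa.fit_transform(scores)

print(country_positions.shape)   # (50, 4): one score per country per dimension
print(fa.components_.shape)      # (4, 20): loading of each survey item on each factor
```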

Possible interventions

Interventions that aim to modify features of the organizational or workgroup culture are increasingly popular among management consultants. It is worth being wary of interventions that aim to change something as difficult to define and as deeply rooted as organizational culture (the management consultants will have invoiced and be long gone before any changes to the company’s culture appear). Indeed, business school professor André Spicer argues that “most management initiatives remain utterly bereft of any evidence base. The result is that managers largely operate on the basis of superstition rather than fact” [Spicer 2017]. Nevertheless, there is a fair amount of evidence (including the study undertaken at Google mentioned above) that psychological safety can be improved by targeted interventions.

When psychological safety is low, it may be improved by:

  • training for managers in encouraging feedback from their colleagues and adopting a more participatory management style (empowering employees to take part in organizational decision-making, encouraging workers to voice their concerns). This is also called “leader inclusiveness”: the words and deeds exhibited by leaders that invite and appreciate others’ contributions, or the extent to which leaders demonstrate openness and accessibility in their interactions with followers [Nembhard and Edmondson 2006]. Leader inclusiveness has been shown to be linked to the quality of learning from failures within a team, and to team performance [Hirak et al. 2012];

  • training for team members in voicing their concerns in an explicit manner, being assertive without being confrontational;

  • incentives for reporting incidents and making suggestions.

The first two points are commonly addressed by Crew Resource Management (CRM) training, as developed in the aviation industry since the early 1980s and now spreading to other industry sectors.

One factor that can affect psychological safety is the authority gradient (or power distance, a related concept used by Hofstede, as described in the box above): the way in which decision-making power is distributed, or perceived to be distributed, within a work team. A steep authority gradient (a leader with an overbearing approach to decision-making, who makes no effort to integrate the opinions of other team members) will discourage team members from contributing their input to decisions and reduce the level of feedback, decreasing psychological safety. CRM and human factors training also commonly address the impact of authority gradients on teamwork, decision-making and safety.

Asiana Airlines Flight 214, a Boeing 777 which crashed on approach to San Francisco airport in 2013, led to three deaths and numerous injuries. The investigation found that the pilots mismanaged their descent and realized too late that the aircraft was too low and too slow, after relying excessively on aircraft automation and holding an incorrect mental model of the plane’s automated systems. The investigation report also states “Among the other issues raised by the investigation are some that long have concerned aviation officials, including hesitancy by some pilots to abort a landing when things go awry or to challenge a captain’s actions”, indicating that a lack of psychological safety on the flight deck may have contributed to the accident.

British Midland Flight 92, commonly known as the Kegworth disaster, crashed near Kegworth in the United Kingdom in 1989 while attempting an emergency landing, killing 47 people on board. A fan blade had broken in the aircraft’s left engine, damaging the air conditioning system and filling the cabin with smoke. The pilots believed this indicated a failure of the right engine, because most 737 aircraft ventilate the passenger cabin using air from the right-hand side, but the 737-400 model they were flying used air from both engines. The pilots therefore shut down the only working engine and demanded full power from the damaged one, which led it to catch fire and fail completely. The captain had made an announcement to the passenger cabin indicating that he had shut down the right-hand engine, which puzzled several passengers who could see smoke and flames coming from the left engine. Although they pointed this out to the cabin crew (who reported that they had not heard the pilots say “right-hand engine”), the cabin crew did not pass this concern on to the pilots. There was at the time a strong division of responsibilities between pilots and cabin staff, which contributed to this failure in communication; recommendations made by the investigation report led to better coordination between cabin crew and flight crew.

In an article titled Silence that may kill: When aircrew members don’t speak up and why, [Bienefeld and Grote 2012] present a study undertaken with a European airline that analyzed why aircraft crew members were not always speaking up about safety-critical events in flight. Their study, using anonymous questionnaires, found that participants (pilots, pursers and flight attendants) spoke up only 52% of the time regarding safety-critical information, and outlines the main reasons given for the silence: differences in status, fear of damaging relationships, feelings of futility, lack of experience in the job role, negative impact on others, and fear of punishment.

The level of psychological safety on the Deepwater Horizon offshore drilling rig, whose 2010 explosion in the Gulf of Mexico (Macondo well) killed 11 people and generated an environmental catastrophe, seems to have been low. A survey of workers on the rig a few weeks before the disaster, concerning “safety management and safety culture” issues, found that almost half of the crew members surveyed felt that the workforce feared reprisals for reporting unsafe situations [USNC 2011, 224].

Criticism of the concept

Some research, reviewed in [Newman, Donohue, and Eva 2017], has identified possible negative effects of psychological safety (though more in the “too much of a good thing” sense than an outright rejection of the notion):

  • Teams high in both psychological safety and utilitarianism may be more likely to engage in unethical behavior [Pearsall and Ellis 2011]: once an unethical but advantageous option has been raised as a possibility, team members may be more likely to choose it when making decisions. (Utilitarianism is an ethical framework in which the decision-maker examines primarily a decision’s outcomes and consequences in determining the best course of action. It contrasts with what these authors call formalism, sometimes known as deontological ethics, which relies on past precedent, societal norms and rules in making decisions.)

  • High levels of interpersonal trust can lead to lower team performance due to lower levels of monitoring within self-managed teams [Langfred 2004]. This result was only found in teams where individual autonomy was high; when individual autonomy was low (team members working together much more), higher levels of trust led to improved performance.

Conclusion

Psychological safety is important, but not sufficient to ensure that an organization works effectively on safety issues. In particular, if the concerns that people raise by speaking up and reporting possible risks do not lead to changes and visible improvements, people will over time give up on reporting. This latter issue concerns information flow within the organization, decision-making concerning resource allocation, and questions of power, all of which are typically more difficult to change than interpersonal trust and the local climate within work groups.

Image credit: my hobby, CC BY-NC licence

References

Bergmann, Bastian, and Joe Schaeppi. 2016. A data-driven approach to group creativity. Harvard Business Review. https://hbr.org/2016/07/a-data-driven-approach-to-group-creativity.

Bienefeld, Nadine, and Gudela Grote. 2012. Silence that may kill: When aircrew members don’t speak up and why. Aviation Psychology and Applied Human Factors 2(1):1–10. DOI: 10.1027/2192-0923/a000021.

Edmondson, Amy C. 1999. Psychological safety and learning behavior in work teams. Administrative Science Quarterly 44(2):350–383. DOI: 10.2307/2666999.

Edmondson, Amy C. 2018. The fearless organization: Creating psychological safety in the workplace for learning, innovation, and growth. Wiley, ISBN: 978-1119477242.

Hirak, Reuven, Ann C. Peng, A. Carmeli, and J. M. Schaubroeck. 2012. Linking leader inclusiveness to work unit performance: The importance of psychological safety and learning from failures. The Leadership Quarterly 23(1):107–117. DOI: 10.1016/j.leaqua.2011.11.009.

Hofstede, Geert. 1980. Culture’s consequences: International differences in work-related values. 1st ed. Sage, ISBN: 978-0803914445.

Hofstede, Geert. 2001. Culture’s consequences: Comparing values, behaviors, institutions, and organizations across nations. 2nd ed. Sage, ISBN: 978-0803973244.

Kahn, William A. 1990. Psychological conditions of personal engagement and disengagement at work. Academy of Management Journal 33(4):692–724. DOI: 10.5465/256287.

Langfred, Claus W. 2004. Too much of a good thing? Negative effects of high trust and individual autonomy in self-managing teams. Academy of Management Journal 47(3):385–399. DOI: 10.5465/20159588.

Milliken, Frances J., Elizabeth W. Morrison, and Patricia F. Hewlin. 2003. An exploratory study of employee silence: Issues that employees don’t communicate upward and why. Journal of Management Studies 40(6):1453–1476. DOI: 10.1111/1467-6486.00387.

Morrison, Elizabeth W., and Frances J. Milliken. 2000. Organizational silence: A barrier to change and development in a pluralistic world. Academy of Management Review 25(4):706–725. DOI: 10.5465/AMR.2000.3707697.

Nembhard, Ingrid M., and Amy C. Edmondson. 2006. Making it safe: The effects of leader inclusiveness and professional status on psychological safety and improvement efforts in health care teams. Journal of Organizational Behavior 27(7). John Wiley & Sons, Ltd.:941–966. DOI: 10.1002/job.413.

Newman, Alexander, Ross Donohue, and Nathan Eva. 2017. Psychological safety: A systematic review of the literature. Human Resource Management Review 27(3):521–535. DOI: 10.1016/j.hrmr.2017.01.001.

Pearsall, Matthew J., and Aleksander P. Ellis. 2011. Thick as thieves: The effects of ethical orientation and psychological safety on unethical team behaviour. Journal of Applied Psychology 96(2):401–411. DOI: 10.1037/a0021503.

Pinder, Craig C., and Karen P. Harlos. 2001. Employee silence: Quiescence and acquiescence as responses to perceived injustice. In Research in Personnel and Human Resources Management, edited by Ferris Gerald, 331–369. Emerald Group Publishing Limited, ISBN: 978-0762308408.

Schein, Edgar H., and Warren G. Bennis. 1965. Personal and organizational change through group methods: The laboratory approach. Wiley, ISBN: 978-0471758501.

Spicer, André. 2017. Business bullshit. Routledge, ISBN: 978-1138911673.

USNC. 2011. Report to the president: National commission on the BP Deepwater Horizon oil spill and offshore drilling. US Government Publishing Office, ISBN: 978-0160873713. https://www.gpo.gov/fdsys/pkg/GPO-OILCOMMISSION/pdf/GPO-OILCOMMISSION.pdf.
