Implicit Bias and Why It Matters to the Field of Political Methodology

The following post is written by Hazel Morrow Jones and Janet M. Box-Steffensmeier of the Ohio State University. Jones is Associate Provost for Women’s Policy Initiatives and Director of The Women’s Place. She is also Professor of City and Regional Planning, and Professor of Geography by courtesy appointment. Box-Steffensmeier is Director of the Program in Statistics and Methodology, University Distinguished Scholar, Professor of Political Science, and Professor of Sociology by courtesy appointment. She received the Political Methodology Career Achievement Award in 2013.

What is Implicit Bias?

The human brain is amazing.  It handles over 11 million pieces of information every moment (Staats 2013), yet we are consciously aware of only about 40 of those pieces.  The brain processes all of the others without our conscious knowledge, deciding what is important for us to notice, what we can ignore, how things fit together, what they mean, and how we should react.  Being trained scientists does not change the fact that our brains are doing a great deal of work of which we are unaware.

This ability is enormously helpful to survival and to efficiently handling all of the information we need to deal with.  For example, most of us can stand in front of a class and lecture without being conscious of which muscles are working, how our balance is keeping us upright, what is going on in the hallway outside the room (assuming we are focused on what we are saying),  and so on.  We are consciously focused on how to get information across and whether the students appear to be grasping the material – our brain is taking care of everything else.

A telling example was played out recently on a television show called “What Would You Do?” In this show three young actors were sent into a park separately to “steal” a bicycle.  One actor was a young white man, one a young black man and one a young white woman.  The young white man worked for quite a while, including using various tools, to break the lock on the bicycle he was “stealing” and was seen by many people before someone finally stopped him.  The young black man hardly had a chance to start working on the lock when he had gathered a crowd who all insisted that he stop.  The young white woman was working on breaking the lock when a man stopped and offered to help her!

These kinds of reactions are often based on attitudes that we are not aware of and that we may not want.  They go by different names in the literature:  schemas (Nosek, Banaji, and Greenwald 2002; Fiske, Cuddy, Glick, and Xu 2002), unconscious attitudes (Staats 2013), implicit bias (Staats 2013), implicit attitudes (Petty, Fazio, and Brinol 2009), and blind spot (Banaji and Greenwald 2013), for example. Some authors argue that there are subtle differences between these phrases (see, for example, Equality Challenge Unit 2013), but for the purposes of this article we will use them interchangeably.

These unconscious attitudes are part of being human.  We may not all have exactly the same ones, though people raised in the same culture with similar experiences are likely to share many of them.  No one is entirely free of biases.  Most importantly, we may not be aware of our biases, or blind spots, and they affect our behaviors in ways we may not be aware of and ways we might be very unhappy about.

Assessing Implicit Bias

In an outstanding literature review on the topic, the Kirwan Institute for the Study of Race and Ethnicity at The Ohio State University argues that implicit bias has certain key attributes (Staats 2013).  First, it is unconscious.  Second, it causes one’s judgment to move away from neutral toward a positive or negative assessment of something or someone.  Third, it can come into play without any intention on the part of the person holding the bias.  Finally, implicit biases are “robust and pervasive” (p. 7).

In our experience working with university audiences on this topic, the most difficult part seems to be accepting that each of us has these blind spots whether we want them or not.  Most audiences grasp the idea that people have unconscious attitudes that affect their behaviors and may cause them to behave in ways that run counter to the values of equity we espouse.  People can often also accept, intellectually, that they have biases themselves.  When we discuss ways to counteract those biases, however, it becomes clear that most people do not actually believe their behavior is influenced by attitudes of which they are unaware, and some become quite angry at the idea.  It takes a significant amount of self-awareness and open-mindedness to accept that one is biased and to work on de-biasing.

One of the major research teams working in this area is Project Implicit at Harvard University. This team has developed and tested an assessment that allows individuals to understand their own biases – and awareness is the first step to being able to counteract these unconscious attitudes. The Implicit Association Test (IAT) offers the opportunity to quickly and privately discover something about one’s own biases and how they compare with others’.  For example, the assessment of the degree to which one associates specific genders with science or with liberal arts indicates that 72% of those who have taken it have at least a slight automatic association of male with science and female with liberal arts (26% have a strong association and 28% a moderate association).  Only 10% of those taking the assessment have the opposite association – at least a slight association of female with science and male with liberal arts.  We should note that the IAT is not the only assessment available, and though it has been criticized in the literature (Mitchell and Tetlock 2006), it is the most common method of assessment (Blair et al. 2013).

Implicit Bias and Gender Equity in the Academy

Most universities, their leaders, and their faculty members espouse the ideal of gender equity. Likewise, this has been an important goal for the Political Methodology Section. We have seen change in the proportions of female faculty and administrators in the United States, but those changes have been slow and not always consistent over time or across different parts of the campus. For example, The Ohio State University has seen a continuous increase in the proportion of women in the faculty ranks (over at least the last 13 years) to the point where the tenure-track faculty is now 32% female, still a distressingly low number.  However, the colleges in the Science, Technology, Engineering, Mathematics and Medicine (STEMM) cluster are only 23% female.  Even within that cluster, the colleges ranged in 2011 from 17% female faculty in the College of Engineering and 18% in Math and Natural Sciences to 90% in the College of Nursing. The OSU Political Science Department is 29% female, close to the discipline average of 32%; the Political Methodology Section is about 20% female (Breuning and Sanders 2007, 348). For the Political Methodology Conference, 10-20% of paper authors or coauthors have been women (Dion and Mitchell 2012).

Universities have often worked very hard to create policies that support women and men equally in their academic careers (e.g. Philipsen and Bostic 2010) and to recruit gender diversity for the faculty (e.g. NSF’s ADVANCE program). OSU’s Political Science Department has had a strong reputation and culture in mentoring, especially for women. Much of that credit goes to the chairs (Paul Beck, Kathleen McGraw, Herb Weisberg, and Rick Herrmann). There is universal agreement that the Founders of Political Methodology are also exemplars in mentoring. The support of Brian Humes, NSF Program Director, as well as the support of the founders and leaders of Political Methodology led to the Visions In Methodology (VIM) Conferences.

In spite of these policies and the good intentions of most of those involved, we find ourselves asking with Virginia Valian “Why So Slow?” (1999) when looking at the level of women’s involvement in the subfield, the field, and academia in general. One part of the answer may be implicit bias, so it is important to be aware of it.  These biases affect everything from the pre-collegiate pipeline to the treatment of graduate students to hiring, promotion, and tenure decisions.  In spite of good intentions, the implicit biases of those involved (both men and women) create barriers for women. There is a great deal of literature on this topic and significant work on practical applications as well; we cannot review even a portion of it in this article. A good review of the literature is the Equality Challenge Unit’s Unconscious Bias and Higher Education (2013), a publication from the UK.  We will focus on a few articles of particular importance.

We know that everyone is subject to unconscious bias and that those biases can cause us to act in ways that are counter to our explicit values.  Biases may particularly appear when we are under time pressure (Bertrand, Chugh, and Mullainathan 2005), when we are stressed (Reskin 2005; Payne 2006), and when significant ambiguity is present (Payne 2006).  Time pressure, stress, and ambiguity are facts of academic life, and thus we often find ourselves in the kinds of situations in which unconscious biases are most likely to appear.

Moss-Racusin, Dovidio, Brescoll, Graham, and Handelsman (2012) use a randomized, double-blind study to examine bias in science faculty members assessing a student’s application for a position as laboratory manager.  The application materials were identical except for male or female names.  Both male and female faculty members rated the male student more competent and more hirable and also were willing to offer a higher salary and more mentoring to the male student.  This result supported earlier work by Steinpreis, Anders, and Ritzke (1999) looking at hiring an assistant professor. Again, both male and female professors rated the male applicant superior (two to one) over the identically qualified female applicant.

In discussing these results with faculty audiences, we all ask ourselves: how could this happen?  Would the results have been the same had we been involved in the study?  Our conclusion is that they probably would have been, and we can imagine scenarios that might lead to this result.  One sees the name at the top of the CV, and a frame of reference or context is established that one is not even aware of.  Then, for example, if one sees a coauthored article on the CV, the frame of reference for the male applicant might lead to comments such as “interesting topic” or “good journal.”  The frame of reference for the female applicant might instead lead one to think something like “I wonder if the coauthor is her advisor?”  In addition, there is likely to be a double whammy for a female coauthor on a quantitatively sophisticated publication. Note that there is nothing wrong with any of those points – all are relevant to the search – but the positive ones came out in the context of the male applicant and the questioning one came out in the context of the female applicant.  Adding up a lot of these “molehills” can lead to “mountains” of evidence against a female candidate (Valian 1999), even though her record is equal to that of the male candidates getting rave reviews. The frame of reference creates a context in which her work is viewed differently.

Letters of reference are another place where we can check ourselves for possible biases. Trix and Psenka (2003) examined letters of reference for successful applicants for faculty positions in medical schools.  They found that letters for men were longer and contained more references to the curriculum vitae, publications, patient interactions, and work with colleagues.  Letters for women were shorter, contained more references to personal life, and contained more doubt-raising phrases.  Looking back at the letters of reference we have written can give us a window into the possibility that our own biases show up in ways we would rather they did not. The Visions in Methodology Conferences, which are “designed to address the broad goal of supporting women who study political methodology,” have discussed this research and other related topics.

The final general issue we want to raise is the concept of “stereotype threat” (Steele, Spencer, and Aronson 2002; Stone, Lynch, Sjomeling, and Darley 1999).  This refers to the subtle pressure a person can feel when they know they are in a minority in a given situation.  For example, Shih, Pittinsky, and Ambady (1999) find that Asian American girls who are reminded (by a simple questionnaire) that they are Asian American before taking a math test do better on the test than Asian American girls who are not reminded.  On the other hand, Asian American girls reminded before the test (by a different simple questionnaire) that they are girls do worse than Asian American girls who are not reminded of either fact.  This sheds light on the feeling some women report of not being at their best in interacting with their male professional colleagues – being one of only a few women can leave one constantly reminded that one is the odd person out, and that pressure can lead to being less successful than one would be without those reminders.

The combination of gender and math is likely a reason the Political Methodology Section sees lower numbers of women than other sections of the American Political Science Association. Perhaps reflection upon the literature on implicit bias has implications for equalizing success at the Political Methodology Meetings and in our classrooms, such as general encouragement and positive reinforcement. Most likely, the problem starts much earlier in the pipeline, and thus outreach at even earlier levels, including “Motivating Politics as a STEM Discipline for Middle and High School Students through Participatory Experiments and Demonstrations,” is important.  The Motivating Politics Program was a collaboration among the Political Methodology Section, NSF, and the Midwest Political Science Association (with great thanks to James Rogers at Texas A&M and Shane Nordyke at the University of South Dakota as the leaders).

Relatedly, Ely (1994) discusses how being part of an underrepresented group in a community can lead some members of that group to avoid identifying with it in order to enhance their own status. That is, sometimes women contribute to these biases when they try to distance themselves from other women within the community.  To us, this is a complicated issue. It does help explain some of the reactions to the Women’s Dinner kickoff at the Methods Meetings, which was designed to provide a friendly welcome at the start of the conference but has met with mixed success. In contrast, VIM (a conference only for women, with both substantive and professionalization topics) has had uniform success.

Although we have focused on gender issues in implicit bias, the same mental processes are at play in all kinds of other inequities, including those involving race, sexual orientation, age, weight, and many other personal characteristics that have no particular relevance to qualifications for a position.  These ideas are crucial to bringing more diversity into academic fields and to helping others succeed.  All of us, but particularly those of us who have a role in these decisions or have an opportunity to mentor, have a responsibility to become aware of our implicit biases, to take steps to de-bias ourselves, and to set up processes that diminish the ability of biases to play a role.

Moving Forward

NSF’s ADVANCE institutions provide many examples of changes to policy and practice.  For example, the University of Michigan’s STRIDE program offers excellent examples of best practices for de-biasing faculty search processes.  See also Washington State University’s ADVANCE Program on mentoring, Texas A&M’s ADVANCE Program on retention and promotion, and the University of California at Davis’ ADVANCE Program on creating a level playing field for success. The Ohio State University’s Kirwan Institute has resources on de-biasing.

Many of these ideas are easily adapted to recruitment searches in Political Methodology, to invitations extended to speakers in our departments (avoiding homophily tendencies), or even to recruiting women discussants at the Political Methodology Meetings or at Political Methodology Section panels at other major professional meetings.  The recent leadership of the Political Methodology Section has not only supported, but pushed for, such innovation and change. The most important steps start with the individual: educating oneself and decision makers about implicit bias, because awareness serves to reduce reliance on stereotypes (Correll 2013).  We hope that this article starts to illuminate that process and the progress already made with programs such as Motivating Politics and VIM.


References

ADVANCE at WSU. 2014. Accessed March 1, 2014.

ADVANCE UCDAVIS. 2013. Accessed March 1, 2014.

Banaji, M.R. and A.G. Greenwald. 2013. Blindspot: Hidden Biases of Good People. New York, NY: Delacorte Press.

Bertrand, M., D. Chugh, and S. Mullainathan. 2005. “Implicit Discrimination.” The American Economic Review 95(2): 94-98.

Blair, I.V., E.P. Havranek, D.W. Price, R. Hanratty, D.L. Fairclough, T. Farley, H.E. Katz, and J.F. Steiner. 2013. “Assessment of Biases Against Latinos and African Americans Among Primary Care Providers and Community Members.” American Journal of Public Health 103(1): 92-98.

Breuning, M. and K. Sanders. 2007. “Gender and Journal Authorship in Eight Prestigious Political Science Journals.” PS: Political Science & Politics 2: 347-351.

Correll, S. 2013. “Creating a Level Playing Field.” Accessed March 1, 2014.

Dion and Mitchell. 2013. Accessed March 1, 2014.

Ely, R. 1994. “The Effects of Organizational Demographics and Social Identity on Relationships Among Professional Women.” Administrative Science Quarterly 39: 203-38.

Equality Challenge Unit. 2013. Unconscious Bias and Higher Education. Accessed March 1, 2014.

Fiske, S.T., A. Cuddy, P. Glick, and J. Xu. 2002. “A Model of (Often Mixed) Stereotype Content: Competence and Warmth Respectively Follow From Perceived Status and Competition.” Journal of Personality and Social Psychology 82(6): 878-902.

Mitchell, G. and P.E. Tetlock. 2006. “Antidiscrimination Law and the Perils of Mindreading.” Ohio State Law Journal 67(5): 1023-1121.

Moss-Racusin, C.A., J.F. Dovidio, V.L. Brescoll, M.J. Graham, and J. Handelsman. 2012. “Science Faculty’s Subtle Gender Biases Favor Male Students.” Proceedings of the National Academy of Sciences 109(41): 16474-16479.

Nosek, B.A., M.R. Banaji, and A.G. Greenwald. 2002. “Harvesting Implicit Group Attitudes and Beliefs from a Demonstration Web Site.” Group Dynamics: Theory, Research, and Practice 6(1): 101-115.

Payne, B.K. 2006. “Weapon Bias: Split-Second Decisions and Unintended Stereotyping.” Current Directions in Psychological Science 15(6): 287-291.

Petty, R.E., R.H. Fazio, and P. Brinol. 2009. “The New Implicit Measures: An Overview.” In R.E. Petty, R.H. Fazio, and P. Brinol (Eds.), Attitudes: Insights from the New Implicit Measures (pp. 3-18). New York, NY: Psychology Press.

Philipsen, M.I. and T.B. Bostic. 2010. Helping Faculty Find Work-Life Balance. San Francisco, CA: John Wiley & Sons.

Project Implicit, Harvard University. 2011. Accessed March 1, 2014.

Reskin, B. 2005. “Unconsciousness Raising.” Regional Review 14(3): 32-37.

Shih, M., T.L. Pittinsky, and N. Ambady. 1999. “Stereotype Susceptibility: Identity Salience and Shifts in Quantitative Performance.” Psychological Science 10(1): 80-83.

Staats, C. 2013. “State of the Science: Implicit Bias Review.” Accessed March 1, 2014.

Steele, C.M., S.J. Spencer, and J. Aronson. 2002. “Contending with Group Image: The Psychology of Stereotype and Social Identity Threat.” Advances in Experimental Social Psychology 34: 379-440.

Steinpreis, R.E., K.A. Anders, and D. Ritzke. 1999. “The Impact of Gender on the Review of Curricula Vitae of Job Applicants and Tenure Candidates: A National Empirical Study.” Sex Roles 41(7/8): 509-528.

Stone, J., C.I. Lynch, M. Sjomeling, and J.M. Darley. 1999. “Stereotype Threat Effects on Black and White Athletic Performance.” Journal of Personality and Social Psychology 77(6): 1213-1227.

STRIDE, University of Michigan. 2013. Accessed March 1, 2014.

Trix, F. and C. Psenka. 2003. “Exploring the Color of Glass: Letters of Recommendation for Female and Male Medical Faculty.” Discourse & Society 14(2): 191-220.

Valian, V. 1999. Why So Slow? Cambridge, MA: MIT Press.

Visions In Methodology. 2014. Accessed March 1, 2014.

The Women’s Place, Ohio State University. 2013. Accessed March 1, 2014.

“What Would You Do?” May 27, 2010. Accessed March 1, 2014.


About Megan Shannon

Assistant professor of political science at Florida State University