A week ago, a crappy picture of a dress popped up on the interwebs and sparked a seemingly trivial controversy. Before the dress was confirmed to be blue and black, people debated its true colors and split themselves into #teamblueandblack and #teamwhiteandgold, baffled that others could see the dress so differently.

This story reminds us that vision is an interpretation of the world, not objective reality.

Vision

Most of us think very little about how we perceive color. We don’t think about the fact that our experience of color is just our brains transforming light into usable information, or the fact that nothing is objectively blue or black or white or gold. Typically, our color vision serves us very well, helping us distinguish between foods that are more or less edible, or snow that is white or yellow.

Although we’ve all seen optical illusions, whether it’s a vase with a face or a Magic Eye poster, our vision generally seems predictable. Besides, we expect that illusions will mess with us, but we don’t expect our vision to fail us in everyday life. As a result, many people were probably experiencing what’s called a meaning threat. Anna Kendrick even tweeted, “If that's not White and Gold the universe is falling apart. Seriously what is happening????” She wasn’t the only person who reacted that way.

The world just ain’t right…

Meaning threats make us question our worlds and cause us to experience confusion, which we often react to with hostility, aggression or denial. When something we take for granted fails us, it suggests that our worlds are no longer predictable. This is scary.

So, instead of being annoyed that I’m still talking about #thedress, consider how a crack in our perceived reality evokes threat in all of us. Even if this story had no effect on you, there are probably other parts of your life where a meaning threat pops up and elicits a similar reaction.

Question your beliefs

Use this story as an example of how we experience our beliefs as facts. Think about disagreements you’ve had with others: maybe you’re each perceiving things differently, and neither of you is objectively right. Cultivate a sense of okayness about the subjectivity of your perceptions, and use this information to see the world a bit differently.

References

Lilienfeld, S. O., Lynn, S. J., Namy, L. L., Woolf, N. J., Cramer, K. M., & Schmalz, R. (2011). Psychology: From Inquiry to Understanding. Toronto: Pearson Canada.

Proulx, T., Heine, S. J., & Vohs, K. D. (2010). When is the unfamiliar the uncanny? Meaning affirmation after exposure to absurdist literature, humor, and art. Personality and Social Psychology Bulletin, 36, 817–829.

vanDellen, M. R., Campbell, K. W., Hoyle, R. H., & Bradfield, E. K. (2011). Compensating, resisting, and breaking: A meta-analytic examination of reactions to self-esteem threat. Personality and Social Psychology Review, 15, 51-74.


Critical Thinking, Part 3

Let’s say, for the sake of this post, that I believe in dragons (and I’m not talking Komodo). How, you might ask, could I continue to believe something that is not backed by the facts?

1. I could, and most definitely would, look for information that confirms my existing beliefs, and take that evidence at face value (1).

Last week, I saw a picture on the Internet of a “baby dragon” named Agni, which means fire in Sanskrit. Seriously, awesome name for a dragon, am I right? (2) Fired up by the potential that dragons are real, I could avoid critiquing the potentially shoddy evidence put forth in that fanciful article and search for MOAAAR information telling me that dragons are real. As I noted in last week’s post, people are motivated to prove themselves right and do so by searching only for evidence that confirms their beliefs, processing information through the appropriately named “myside” bias (1).

2. I could ignore, or simply avoid looking for, information that contradicts my beliefs.

If conflicting evidence does happen to present itself, I would be less likely to notice it in the first place, because I’m more motivated to find information that is in line with my “dragons are real” hypothesis. However, if I were presented with information telling me how unlikely it is that dragons exist, I could argue that whoever told me was not a reputable source while painstakingly tearing down their “terrible” argument (1).

“Wait a minute,” you say, “I’m too smart to do any of this.” Unfortunately for all of you Mensa candidates, intelligence does not protect people from these biases, and some theorists have argued that smart people can be even more biased (3). Their reasoning is that intelligence provides people with an enhanced ability to find an explanation for anything (in other words, an enhanced capacity for B.S.). Intelligence does help, however, when people are asked to take their prior beliefs out of the equation (1).

So, what’s the solution?

· Be aware of these biases.

· Try to set your prior beliefs aside and commit to even-handed evaluation.

· I also think it’s important to cultivate an attitude of learning. This means that we have something to gain even when we’re wrong. It feels bad to come up with evidence highlighting our wrongness, but it feels less bad when we’re able to view it as an opportunity to improve our understanding of the world.

· Lastly, ask yourself honestly whether your explanations are B.S., because they really, really could be.

Also, maybe dragons do exist and the government is trying to hide it from us on account of all the gold they’re hoarding….

 

References

1. Stanovich, K. E., West, R. F., & Toplak, M. E. (2013). Myside bias, rational thinking, and intelligence. Current Directions in Psychological Science, 22, 259-264.

2. www.theauditorium.com/blog/2012/03/06/dragons-discovered-in-indonesia.

3. Shermer, M. (2003). Why smart people believe weird things. Skeptic, 10(2), 62-73.
