Imagine being able to take a history textbook and rewrite any one sentence of your choosing. Which sentence would have the biggest impact on world history? It’s easy to think of wars, conquests, elections, and the like, but on a more macroscopic level, what if you wrote that Pangaea never separated into distinct continents? This single revision raises many important questions. Would humans still have the incredible biological diversity we see today? Would we have borders, or would we live together as one? And perhaps the most important question of all: What would this mean for the development of culture?
The question of how our culture developed is never an easy one to answer, but it is certainly an important one to ask. Many people fall into the trap of treating culture as something we merely study, never recognizing that humans also have a natural tendency to create culture, which in turn plays a part in creating us. In fact, culture represents a major intersection between our mind and body. This interplay implies that the development of culture, typically thought of as a historical discipline, has deep connections to biology.
How exactly can we understand our present biology using culture? To start, there is strong evidence suggesting that stress and depression have a causal relationship with deadly diseases, and, interestingly enough, these conditions are more common in Western cultures. Why is this the case? One explanation for this phenomenon may be Western cultures’ focus on individualism and materialism. These harmful priorities didn’t just appear out of nowhere. To really understand why we are where we are today—in a society with social divisions and skyrocketing mental health issues—we don’t necessarily need to make a new biological discovery. Instead, we can look to history for answers. While America’s tradition of rugged individualism and personal liberties is well known, the origins of European individualism are less apparent. Through the Protestant Reformation, many in Europe began to reject the Catholic Church and placed greater emphasis on the relationship between God and the individual. Later, the Enlightenment gave rise to political individualism, which would grow to characterize much of European politics. The events that contributed to the development of individualism and materialism interact in complex ways, and we can see the biochemical consequences of these psychological forces by studying neurobiology.
In a 2007 study, Western subjects showed higher medial prefrontal cortex activity when asked to describe themselves, while Chinese participants exhibited higher activity in the same region when asked about both themselves and someone else. The medial prefrontal cortex is involved in decision making and memory consolidation. The study’s results therefore suggest that people from non-Western cultures are likely to take others into consideration when making a decision, while people from Western cultures focus primarily on themselves in the decision-making process. This is not much of a surprise: individualistic cultures prime people to think of themselves as a central unit, while collectivist cultures prime people to think about the community. As Todd F. Heatherton puts it, “Culture can have an impact on how the self is constructed on a neural level,” and new discoveries in neuroscience show he couldn’t be closer to the truth.
It is important to note that there is more nuance to culture than initially meets the eye. Understanding the differences between cultures does not mean that a scientist can use an fMRI scan to guess which culture someone belongs to. As with anything else in life, there exists a spectrum. Thus, the extent to which our understanding of the brain demonstrates the individualism of the West or the interconnectedness of the East varies with both the region and the individual. In fact, the lines of culture are becoming increasingly blurred as social media pulls us all toward a shared global culture.
When looking at the human lifespan as a whole, it may initially seem as if we do the bulk of our learning in our early years, but in reality we continue learning and developing throughout our lives. Most learning and neurogenesis (the development of new neurons) does, in fact, occur early in life, when we are learning to walk, talk, and perform other basic functions. (This is part of the reason it’s easier to learn languages when you’re young.) However, scientists have recently discovered that neurogenesis also peaks during puberty. Why would that be? It is partly an adaptive mechanism. During puberty, we gain a heightened understanding of our culture, which prepares us to enter the so-called “reproductive marketplace,” in which we compete with others to find a mate and reproduce. Culture dictates the conditions favorable to our mate-finding endeavors. It is through these “formative years” that we acquire more cultural information, redefine our schemas, and cultivate an identity that will serve us in the reproductive marketplace.
It is clear that cultural standards and norms can cause numerous issues when they conflict with our inner identity. In fact, one study found that our neural activity is shaped more by our cultural values than by the behaviors present in a given situation, suggesting that our brain’s patterns can reflect cultural values before we choose to act. This means that culture can become imprinted in our biology, and our thoughts become culturally based patterns. While this evidence is empirically valuable, it is a phenomenon that many of us have likely experienced in our own lives time and time again. As children, we first encounter certain cultural values and standards during the period of peak neurogenesis: early childhood. From that moment on, the imprint of culture can continue to be reinforced throughout our lives.
This conditioning benefits us in many cases. Do you ever think twice about going to a bowl called a toilet to urinate? No; it is a cultural standard. In some cases, however, the conditioning can be particularly damaging. For instance, a person may fall into a repeated negative thought pattern that leads to depression, while popular media only reinforces that pattern and the cultural standards that initiated it.
However, cultural conditioning doesn’t seal our fate entirely. In fact, one of the most remarkable characteristics of the brain is its neuroplasticity, or ability to change. One of the most promising dimensions of neuroplasticity is just beginning to emerge through neurobic exercises, which are designed to change the neural pathways we develop over our lives. A study published in September 2020 found that elderly-home residents who performed neurobic exercises showed improved memory and fewer depressive tendencies compared to those who had not undergone the treatment. This is still a relatively new but promising area of study, and one that therapists now often draw on.
Hopefully, it is now easier to see how the study of culture represents an important intersection between history, psychology, and biology. Historical study tracks the development of our culture over time, and psychobiological research examines how that history affects us on a neurological level. The overlap of these disciplines can generate profound conclusions, such as that of the “reproductive marketplace,” which allow us to understand ourselves the way we are meant to be understood: as a collection of past experiences and present brain activity.