Who Has Midlife Crises and Why

Psychologist Carl Jung spoke of a process called ‘individuation’, whereby one gains an elevated degree of self-awareness and is therefore able to take crucial steps toward cultivating his ideal personality (i.e. ‘self-actualization’ in Maslowian terms). In layman’s terms, this process is called a ‘midlife crisis’. My proposal is that this is a period of growth that everyone experiences, and the sooner it happens, the easier it is to overcome.

According to social convention and many professional circles of psychology, a midlife crisis is considered a bad thing. For example, a psychiatrist named Sue may claim to have seen this condition many times before. She describes it empirically as stress at work and in the family that has accumulated over time and is then suddenly unleashed in different forms. This places the blame on the individual for not communicating his inner thoughts and feelings as they arose, so Sue will offer her therapy services to fix the problem by teaching better communication.

A neurotherapist named Ben might also claim to have seen this many times before, but he will take a more materialist approach. Ben will confine the problem to the brain by assuming that something simply went wrong with the patient’s neural functioning, and that the matter is beyond the patient’s control. He might suggest that the only solution is to undergo neurotherapy in his clinic to realign normal neural pathways in the frontal lobe of the brain.

Both Sue and Ben, as well as most people in general, see this crisis as a problem that needs to be fixed, and believe that the only way to do that is via the specific methods in which they have been trained. “I understand. Let me handle it. You can trust me.” is what they will tell their potential patient. Given their wall of shiny degrees in their cozy, inviting offices, it is difficult to turn down their offer no matter the cost, as long as they can convince you that you need it.

More likely than not, both Sue and Ben are acting in their own self-interests first. They are businesspeople as well as medical professionals. Indeed, the term ‘crisis’ itself carries a derogatory tone, and the professionals have learned to capitalize on that. Their outward warmth, their technical language, their comfortable offices, their alleged understanding of the situation, etc. are tactics that they use to keep their business running. That is not to say that their practices are completely useless, but rather that either service will likely have more or less the same effect on the very same condition because neither comes close to attacking the root of the issue. In fact, they unknowingly focus on fixing the same exact thing (outward communication of inward feelings), since language expressions are actually channeled through the frontal lobe of the brain!

Meet my friend Jay. Jay is 38 years old, and he is an officer in the military. To this point, Jay has led a respectable life of service and duty. He is a devout Christian, goes to church every Sunday, and does community service with his church. He worked hard in high school and in Boy Scouts; he graduated and became an Eagle Scout; he went to college, worked hard, graduated, joined the military as a lieutenant, worked hard, got married, worked hard, had two kids, and then he continued to work hard to maintain that for the years following. Jay is a doer: Make a decision, work hard at it, and you will lead a successful life.

Jay never really questioned the position he was in, and things seemed to be going great, but then, seemingly out of nowhere, he began to have what is commonly known as a midlife crisis. He became a bit depressed and self-conflicted. His temper shortened, and he frequently had emotional outbursts at his wife and kids. With some reluctance, he finally agreed to grant his wife’s request and seek help. He began going to Sue, the psychiatrist, both alone and with his wife. Things seemed to improve for a day or two following each session, but then he would revert to his ordinary behavior. Sue’s methods weren’t really working for Jay. He grew impatient and started to believe that the process was being prolonged and that he was spending more money than he needed to.

Jay began to seek other forms of help, and then he discovered Ben’s neurotherapy practice. Upon first meeting Ben, he felt a bit more confident moving forward. Ben explained, using much technical jargon, how important the brain is in processing information and making decisions. Though the claim that the brain is important is true (indeed, the brain is necessary), Ben went on to convince Jay further that his methods were “more scientific” than traditional therapy because they are “backed by modern neuroscientific research”. Jay became convinced that neurotherapy was the answer, and he began treatment. After a few months, however, as Jay’s optimism wore off, so did his patience; his behavior took the same turn that it had taken after psychiatric therapy. He began to feel misled into thinking that these therapists were offering a sure-fire, algorithmic solution – a solution that was actually, in some sense, a scam. It turns out that he was right.

The absolute root of a “crisis” is unknown to Sue and Ben because it is, in the conventional sense, unknowable. A crucial part of it deals with knowledge that does not likely have its foundations in the material world, nor is it solvable by simply making a few practical, sure-fire adjustments in one’s everyday life. Therefore, it should come as no surprise that most people like Jay have so much trouble wrapping their minds around something that is different in nature from their materialism-based work and education and their practical, habit-based personal lives, especially when the people in whom they put their trust and money are misleading them. It is difficult for them to realize that there is more to themselves than their brains, bodies, and the feedback they gather from the external social and material world. This was exactly Jay’s predicament. He wanted to put his trust in a system to manage his life from the outside in, but nothing was working. He was forced to turn inward and deal with it himself.

There is a continuous process of personality development in everyone, and without its sufficient maturation, one simply cannot optimally handle the stresses of life. Understanding a midlife crisis, or any crisis for that matter, and taking steps to solve it is a personal journey. It requires one to discover, embrace, and cultivate the auxiliary side of the personality in conjunction with the continuing development of the dominant side. I am certainly not claiming to have solved this puzzle for everyone; rather, it is each person’s job to solve his own puzzle for himself. There is indeed a highly effective model one can keep in mind to better understand the self and its place in the world: the cognitive functions as described by Carl Jung.

Immediately, one might question this method. Good. You should, but don’t question it without knowing anything about it, or in a way that presupposes bias. It is a continuously developing theory outside of institutional psychology. The reason for this is simply that it does not seem to fit the existing ideology of institutional science on a broader scale: materialism – the view that all reality in the universe is founded on and comprised of quantifiable matter and energy. I have explained in several previous posts, just as several professional scientists and philosophers have explained in recent years, why science must move past the materialist worldview in order to progress, no matter the cost. That is not up for debate, so I will forestall any further discussion on the matter by saying this: To dismiss Jungian psychology on the basis of there being “no evidence” for it presupposes that the only evidence is the type that materialism relies on. This is circular reasoning. There has, in fact, been no materialist attempt to disprove it to begin with. In other words, to stick to such an unsupported principle is to assume it is “guilty until proven innocent”, as in wrong-until-proven-by-materialism. The premise for my proposal here is about people. All people are unique, but there are baseline psychological tendencies by which we operate. This is, as we should all agree, obvious upon any amount of close observation of one’s social environment. That, I will submit, is in itself a form of evidence worthy of discussion. Having said that…

Each person’s dominant cognitive function, according to Jung, is either introverted or extroverted, and either a mode of judgment or perception. There are two ways of making judgments (thinking and feeling) and two modes of perception (sensing and intuiting). If one’s dominant function is inwardly perceptive, say, introverted intuiting (Ni), then his auxiliary (secondary) function will be an outward mode of judgment, either extroverted thinking or feeling (Te/Fe), to balance out the dominant function.
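The pairing rule just described is mechanical enough to sketch in code, for readers who find such things clarifying. This is purely my own illustration – the function name and the two-letter encoding (e.g. ‘Ni’, ‘Te’) are conventions I am borrowing for the sketch, not anything Jung himself wrote down:

```python
# A minimal sketch of the dominant/auxiliary pairing rule described above:
# the auxiliary takes the opposite attitude (introverted vs. extroverted)
# and the opposite category (judging vs. perceiving) of the dominant.

JUDGING = {"T", "F"}      # thinking, feeling
PERCEIVING = {"S", "N"}   # sensing, intuiting

def auxiliary_options(dominant):
    """Given a dominant function like 'Ni' or 'Te', return the two
    possible auxiliary functions under the pairing rule."""
    func, attitude = dominant[0], dominant[1]
    other_attitude = "e" if attitude == "i" else "i"
    other_category = JUDGING if func in PERCEIVING else PERCEIVING
    return sorted(f + other_attitude for f in other_category)

# A dominant Ni (introverted perceiving) pairs with an extroverted
# judging auxiliary, either Te or Fe:
print(auxiliary_options("Ni"))  # ['Fe', 'Te']
```

Note that the rule only narrows the auxiliary down to two candidates; which of the two a given person actually develops (Jay’s Te dominant pairs with Si rather than Ni, for instance) is a matter of the individual, not the rule.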

Of course, everyone necessarily has the capacity to both perceive and make judgments, to extrovert output and introvert input, to think and feel, to sense and intuit; we otherwise would not be able to survive in any social or professional setting. We all do all of those things to varying degrees. One of those functions, however, is naturally dominant. It is our own personal “standard operating procedure” under normal conditions. When we are confronted with a crisis, we are forced to operate with more depth; i.e. we must work harder to deal with the death of a loved one than to decide what to wear to church, obviously. This does not mean we abide by our SOP more closely than usual. In fact, it implies the opposite: that we must be more flexible about our dominant function. We need balance between our dominant modes of perception and judgment in order to optimally deal with stressful situations. The auxiliary function is what we all struggle with cultivating at some point in our young-adult to middle-aged lives. It is the more repressed of the two, but we must use it in support of our dominant function if we are to deal with crises healthily.

Whether one is introverted or extroverted in general depends on whether his dominant function is introverted or extroverted. An introvert will likely develop his extroverted auxiliary function earlier in life than an extrovert will develop his introverted auxiliary function because, especially in extrovert-dominated western societies like the United States, functioning in an extroverted fashion is forced upon introverts. Extroverts more easily fit in right from the start, but they have their personal crises later in life.

Jay, for example, is Te (extroverted thinking) dominant, which means he is an extrovert with left-brained thinking tendencies. He is outgoing and decisive, and he abides by cold, hard, logical systems (e.g. mathematics, law, protocol, etc.) to make judgments about reality. This is very useful in his military environment, which values this type of rule-based reasoning very highly. He has a wide circle of social and professional connections and makes a good living. From the outside looking in, he is viewed as a success by his peers; the American dream is very Te-focused, and Te-dominants (and Fe-dominants) are the most likely to buy into it. However, on a more personal level, as he is learning in his midlife, he is only outwardly, not inwardly, organized. An introverted thinking-dominant (Ti) personality, by contrast, will have a well-structured, internal set of logical rules and principles, but to other people, he may seem outwardly messy and disorganized because he dismisses conventional rules.

For his entire life to this point, Jay has identified himself with the rules set for him (by his commanding officer at work, by the Bible in his moral decisions, and by his wife at home). He lived the first half of his life constantly focused on planning for the future and managing himself in an outward fashion. He was accustomed to getting things done – acting now and thinking later. Now that things have settled down, there is no more planning to be done. What is he to do?

The answer is: Don’t do anything. Think. Process. Reflect. Jay’s most obvious problem is that he was not able to turn inward and think independently, apart from the rules set before him. He had been so busy living up to standards external to himself that he had never even considered himself to be a conscious, independent, introspective being. In fact, he was afraid to because he naively associated introspection with feelings, and feelings with weakness. That, after all, is the popular opinion in American culture.

Jay’s midlife crisis is common among all left-brained judging (Te- or Fe-dominant) personalities, who encompass about half of the American population according to psychologist David Keirsey, a leader in modernizing Jung’s principles in the ’70s and ’80s. This process manifests itself in different ways and at different times.

First things first: we need to change our terminology. This crisis is not really a “crisis” at all; it is a period of growth whereby the extrovert discovers the introverted side of his or her personality, or the introvert attempts to align his internal rules with outer reality. Jay’s dominant function, as I have mentioned, is called extroverted thinking. It is a way of making judgments: being quickly decisive and taking impartial action based on established rules. What he lacks is a cultivated ability to inwardly process the information that he is acting on. That function is a mode of perception. Jay’s perceiving function, once cultivated, will act as the support for his decision-making and will improve that process to a huge degree. The perceiving function specific to Jay is called introverted sensing (Si). This function collects data based on personal experience, traditions, and principles for their own sake. His personality suits the military and other managerial positions perfectly. When his auxiliary Si is underdeveloped, he follows the rules and doesn’t question them, while almost entirely neglecting his own interests.

What it means for Jay to develop his auxiliary Si function is to improve the way he collects and interprets data and flexibly adapts his existing principles to the constantly-changing environment. This is an internal process. It will improve the way he perceives himself in relation to the data as well as the way he perceives the data itself. He will use this introverted Si perception in conjunction with his dominant Te judgment to make well-rounded decisions.

I used Jay as an example because he possesses the most common Jungian personality construction among men in the United States (ESTJ, according to Myers-Briggs). The most common type for females (ESFJ) is very similar (Fe/Si dominant/auxiliary instead of Te/Si). If you don’t relate to Jay or his Fe counterpart, that is fine. There are 14 other personality types in the Jungian system. And that is not to take anything away from the individuals within each of those categories. As with anything, there is an immeasurably wider variety of uniqueness among individuals within each group than there are generalized differences among the groups themselves. Having said that, Jungian cognitive typology is no more than a guideline, albeit a very effective one, to keep in mind as one deals with the struggles of life. At the same time, however, don’t blame anyone other than yourself if you reject the system out of principle alone amid a personal crisis.

Cheers!

Current Methods of Usage – Language as a Collective Social Skill

Language has developed as a collective social skill to the extent that society needs it to function. Different dialects develop in different regions out of the necessities those regions are subjected to. Languages spoken by small bands in the rural Amazon are structurally simple compared to English, which is spoken in most of the developed world. Amazonian lifestyles are also structurally simpler than the complicated (but certainly no better) lifestyles of the developed west. This makes sense. Their language is suited to their lifestyle. Their lifestyle has one main focus: survival.

Suppose one were to raise himself in the wild, isolated from all other humans; he would not be able to create a complex private language because he would not need to. He may develop some way of communicating with the nature around him (e.g. mimicking bird calls to attract birds so he can catch them for food), but his language would be nothing like the one we understand. He would need no complex grammatical rules or extensive vocabulary to survive in the wild because there is nothing in the wild that would reciprocate understanding of such a language.

Communication as we know it could never occur. It would not need to. However, the isolated Amazonian would be communicating with the birds, in a sense, if they respond in the way he hopes so that he can catch them to eat. (Whether or not this is considered language can be debated, but if the goal of language is to communicate, then language and communication should be equivalent.) He is using his bird call as a tool to attract a bird just as I am using English to convey an idea to you now. Both he and I can be successful or not in achieving our respective goals. Whether or not we are successful can depend on any number of circumstances. In fact, the Amazonian could very possibly communicate with the bird more effectively than I am now communicating with you. Therefore, he (and the bird) would be more proficient in his language than I am in mine. In fact, I would hope that to be the case so I can further support the claims of this essay!

To “Know” a Language is NOT to have “Knowledge”

We have taken for granted that language is knowledge when it should, in fact, be thought of as a skill. We cannot imagine a world in which we have no knowledge of language, but that is because we have developed the skill of using it so well. We are so good – too good – at using this skill. We can lie to and manipulate others to achieve our ends. In fact, this is a tactic in capitalistic business rhetoric. The main focus of such business is not productivity, conversation, or healthy relationships. The focus can be reduced to one entirely superficial entity: money. Everyone wants as much as they can get, so they employ tactics of rhetoric (i.e. linguistic manipulation) in order to achieve that goal. It is only the loudest and most cunning who succeed at this, not the smartest, most thoughtful, or most honest.

In the Amazon, on the other hand, the goal is survival. There is no place for wasting resources or time. Nor is there a place for expressions of language which are irrelevant to the tasks at hand. The precise reason that there is so much excess language in English and other western languages is that our lifestyles are not as directly oriented toward primal survival. Our irrelevant distractions have given rise to irrelevant expressions of language.

Language, more broadly, is something that we take for granted. It is difficult, sometimes almost impossible, to communicate complex ideas without language, so we are misled to believe that such ideas cannot even exist without our mastery of a complex language. This is not the case. Our experiences of the world, the patterns we draw from those experiences, and our creative, subjective manipulation of those patterns are what formulate our ideas. We use language simply (and sometimes not so simply) to express our understanding. So, in this sense, expression in general, not our mechanical ability to produce words, is the real evolutionary phenomenon of humans. Every bit as impressive and complex as our ability to express ourselves using written or oral language are our abilities to express ourselves using musical instruments, paintbrushes, sports equipment, hammers and nails, and our bodily movements in dance. Language is a tool, and like any tool, we can misuse it by lying, manipulating, and mistreating others, or, more preferably, we can use it honestly.

The Slate, the Chalk, and the Eraser

Prerequisite reading: “WARNING: Your Kid is Smarter Than You!”

A mark of good critical thinking, let’s say, as it applies to science, is that it is always attempting to prove itself wrong. It challenges its most fundamental assumptions when unexpected results arise. We can do this in our everyday lives when we make decisions and formulate our own views. We only truly challenge ourselves by trying to find flaws in our own reasoning rather than by trying to confirm our views. It is easy to confirm our beliefs.

Let’s take astrology as a personal-scientific example. Sparing you the details: based on what little research has been done to refute it, astrology is seen as invalid, and therefore a pseudoscience, by the standards of modern mechanistic science. However, that does not preclude one from believing in it – from confirming it or any of its insights to oneself. Now, one is not thinking critically by simply believing that astrology is a pseudoscience (or that it is legitimate science). That would be to put too much trust in other people’s thinking. What reasons can you give to support your own belief, and what does it mean?

One can wake up every morning, read their daily horoscope, and upon very little reflection, come up with a reason or two for how that horoscope applies to his or her life. On one hand, those reasons might be good ones, founded on an abundance of personal experience. The horoscope’s insights might serve as something to keep in mind as one goes about his or her day, and that can be a very helpful thing. On the other hand, however, the reasons might be mere, self-confirming opinions. They might be the result of the person’s ideological belief in astrology in general. That can be harmful if the person attempts to apply astrological insights to contexts in which they are inapplicable. This is an example of how the confirmation of a specific belief, not the belief in itself, can be good or bad, helpful or harmful, depending on how one thinks about it and the reasons he or she gives for it. The question of whether it is right or wrong, correct or incorrect, is neither important nor provable.

In order to formulate views that are not mere opinions, we must expose ourselves to views that oppose the ones we already hold dear to our hearts. This is difficult for adults. Most of us have been clinging to the same beliefs since we were children or young adults. This is where children have a huge advantage. They don’t yet have views of their own. The sky is the limit to how they can think and what they might believe. Their handicap, though, is that they do not control what they are exposed to. They cannot (or perhaps, should not) search the internet alone, drive themselves to the library, proficiently read, or precisely express themselves through writing or speech. They are clean slates, and that ignorance not only gives them huge potential, but it also leaves them extremely vulnerable.

The Analogy

You may have heard this analogy before, but I will attempt to add a bit of depth to it.

A child’s mind is a slate, as are those of adults (though, arguably, much less so). It is a surface on which we can take notes, write and solve equations, draw pictures, and even play games. We can create a private world with our imaginations. For all intents and purposes, there are no innate limits to how we can use our slates. Maximizing our potential, and that of children, is up to the tools we use.

First, we need something to write with, but we shouldn’t use just any writing tool. Chalk is meant to be used on slate because it is temporary. It can be erased and replaced. If one were to write on a slate with a Sharpie marker, the markings would be permanent. One could not simply erase them to make room for others. A slate has a limited amount of space.

Though our minds may not have a limited amount of space in general (there is not sufficient evidence that they do), there is a limit to how much information we can express at any given moment. That, not our mind in general, is our slate – that plane of instant access. The writing tool is our voice – our tool of expression. If we write with a Sharpie, our words cannot be erased. We leave no room to change our minds in the face of better evidence to the contrary. If we write with chalk, we can just as clearly express our ideas, but we also leave our ideas open to be challenged, and if necessary, erased and changed. It is also easier, for in the process of formulating our ideas with chalk, we need not be so algorithmic. We can adjust our system accordingly as we learn and experience new things.

The smaller the writing on the slate is, the more one can fit, but the more difficult it is to read. Think of a philosopher who has a complexly structured system of views. One detail leads into the next, and they all add up to a bigger-picture philosophy. It might take reading all of it to understand any of it. That can be difficult and time-consuming, and not everyone has the patience for it. The larger the words on the slate, however, the easier it is to read, but the less there will be, so it risks lacking depth. Think of a manager-type personality who is a stickler for rules. He is easy to understand because he is concise, but he may lack the ability to explain the rules. People are irritated by him when he repetitively makes commands and gives no reasons for them. Likewise, children are annoyed when their parents and teachers make commands with no reasons to support them, or at least, no good ones (e.g. “because I said so”).

So, the slate represents the plane of instant access and expression of information, and the writing tool, whether it be chalk or a Sharpie, represents our voice – our tool for expressing information and ideas. What does the eraser represent? The eraser represents our willingness to eliminate an idea or bit of information. It represents our willingness to refute our own beliefs and move forward. It represents the ability to make more space on our slate for better, or at least more situation-relevant, information. It represents reason. If one writes with chalk, the eraser – reason – holds the power. If one writes with a Sharpie, the eraser becomes useless.

The Analogy for Children

I explained in my last post, “WARNING: Your Kid is Smarter Than You”, that it is important for parents and teachers to teach their kids how to think – not what to think – but I did not offer much advice on how to actually do that. I will not tell anyone, in detail, how to raise or educate their children. Each child has a different personality and needs to be catered to through different means. I will, however, offer a bit of general advice based on the analogy above.

The way to teach children how to think (after already having done it for yourself, of course, which is arguably much more difficult) is NOT to hand the kids Sharpies, for they will never learn to use an eraser. Their statements and beliefs will be rigid and lack depth of understanding. Granted, this might make them a lot of money in the short term, but it will also significantly reduce their flexibility when they encounter real-life situations (outside of the institutions of school and work) that require them to think for themselves. This will inevitably limit their happiness from young adulthood onward.

Instead, simply hand them a piece of chalk. It is not even important to hand them an eraser initially. Kids will figure out, after much trial and error, their own way to erase their slates. Eventually, they will find on their own that the eraser is a very efficient method for doing so. In literal terms, they will express themselves and reason through their problems until they find the most efficient methods – by thinking for themselves, but only as long as they have the right tool.

WARNING: Your Kid is Smarter Than You!

Everyone is born with some capacity for critical thinking, but most people lose the skill over time. Children, specifically those aged 3-5, happen to be the best at it. This can be proven by a single word: ‘why’.

When someone asks a ‘why’-question, they are asking a question of reason, which is to say they are thinking critically to some degree. Children do this much more openly than adults, which is why most adults think children are simply being pests when they do. That is incorrect. The root of their questioning is philosophical. Children challenge assumptions, premises, and claims more openly than anyone. They are learning as much as they can about the world, and they demand reason to back up that knowledge. They are not lazy in the way that they tend to develop beliefs. Unfortunately, most parents do not share such genuine, open curiosity, nor are they readily able to cater to it. This is most obvious in grandparents, as the saying goes, “you can’t teach an old dog new tricks”. Elderly people tend to be the most firmly set in their ways and resistant to new ideas. Who can blame them? Thinking is calorie-intensive. Quite frankly, old people just don’t have the energy for it. Parents and teachers, however, have an important job to do. They have no excuse.

Though a child’s tendency to ask these types of questions will persist for some time, his continuing to do so will depend greatly on how open and able his parents and teachers are to dealing with it. In a perfect world, adults would take this as an opportunity to think critically about those questions themselves. Instead, they get frustrated or annoyed, make up a poor answer (e.g. “because I said so”), and send their kid straight to the TV or to bed – whatever it takes to keep them occupied and out from under their skin. This is an uninspired and very resistant approach to parenting. The child’s curiosity is repressed, and he gradually stops asking questions and starts submitting more and more to an ideology. The more naive children give in more quickly to the rules set before them. Others might become rebellious. Those rule-followers are certainly no smarter than the rebels, despite what social convention will tell you. Either way, their guardians’ repression has a lasting, negative effect on how they think.

I would like to now disclose that I do not have any children of my own, and I do not plan to have children in the foreseeable future. On that basis, someone who is guilty of the above might already feel offended and accuse me of holding a non-credible opinion on the matter. I would like to think that the contrary is true for two main reasons. First, I am a good planner. I am fully aware of the challenges of raising a child, and that is precisely why I am responsible enough to take the necessary precautions to prevent having one. Second, experience isn’t everything. I can observe the effects of bad parenting with a high level of objectivity because my thoughts about the matter are not distorted by the feelings caused by having a child of my own – feelings which unavoidably inhibit one’s ability to reason well.

Having said that, as you are a rational, autonomous agent, let me tell you a story.

I have a friend who has a four-year-old daughter. Immediately, there is a problem: he did not intend to have her. No, the fact that so many other people accidentally have children does not excuse him. That would be to commit the bandwagon fallacy. Nor does the fact that he is married and is financially able to support his daughter excuse him. In fact, he and his wife had planned on holding out for five to seven years after their marriage to have a child, as they were aware of their not being ready. Instead, they ended up getting pregnant within only one year of their marriage. The child was not planned, and my friend was not ready for the challenge of raising her. This is obvious upon close observation.

What does it mean for one to “be ready” to raise a child? That seems like a personal, descriptive question to which everyone has their own unique answer. That is true in a sense, but there is also a very normative aspect to this question. What “readiness” should mean here is that one is willing to accept the intellectual challenge of teaching a little person how to think – not what to think. That involves not shrugging every time the child asks ‘why’ but also, more crucially, asking ‘why’ for oneself. There is a modern saying that goes, “grade school teaches one what to think whereas college teaches one how to think”. My argument is that by the time someone reaches college age, they have already become a person to a degree, with their own thoughts, feelings, and system of beliefs. Therefore, it is almost certainly too late to teach them how to think. Small children ask the most critical questions. Parents should help them improve that ability then, before they have subscribed to an ideology that will most likely be founded in poor reasoning. The obstacle here is that the parents have previously adopted certain beliefs and have therefore surrendered their own ability to think well, much less their ability to teach it to a child. Leading by example is vital, as kids learn by copying.

My friend is no exception. He holds some rather radical beliefs – mainly those of scientism and atheism, which normally go hand-in-hand. Therefore, he is not the type, no matter the subject, to be truly open to the question ‘why’. His beliefs dictate specific answers to those questions, i.e. that all knowledge in the universe, including that of supernatural entities (such as God), has been or will be confirmed or falsified on the basis of physical, quantifiable matter.

The other day, my friend’s daughter was at preschool when some of her classmates were talking about a discussion they had had in Sunday School the weekend before. When she got home that afternoon, she began to ask her father questions about God. She wasn’t doing so in a way that presupposed God’s existence, nor was she making any such claims. She was simply asking out of genuine curiosity, as children do with everything. To this point in her life, she had never even heard of God because my friend, being a serious atheist, had kept all sources of religion out of her reach at home. So, as you might imagine, he was quite disturbed that she was asking these questions. He felt he had done all he could at home to keep religion out of her life, and now she was confronting him, backing him into a corner. His quick-fix decision was to, first, reject her questioning, and second, become more militant in forcing scientism upon her. He went out and bought children’s books about Darwinian evolution to fill the gap left by the absence of religious materials (e.g. Bible storybooks). His hope was that she would believe in science (actually, scientism) instead of religion.

My friend, on an elusive yet vital note, is trapped in a very conflicted way of thinking. He wants his daughter to “think according to reason”, as he says, but he also wants her to believe in some very specific ideologies. The two, at least in principle, cannot coexist. As I have explained in earlier posts, reason and ideology are nearly polar opposite mindsets. If one is to reason well, he should find that no general ideology is worth submitting to. There are only specific, situational exceptions to that fact. For example, when one takes a math test, he tunes into the deductive, mathematical way of thinking. When he takes a history test, he tunes into the material he studied for that test. Each way of thinking is useful in its own contexts. If he tries to apply math to the history test, or vice versa, he will fail the test.

On a more obvious note, my friend’s attempt to relentlessly control what his daughter is exposed to is a hopeless endeavor. She is going to get out of the house and away from her parents, as she already has to a degree. She is going to experience the world. She is going to have conversations with people who have views that conflict with her own. Most of all, she is going to be challenged. If she is taught what to think (whether evangelical Christianity, scientism, atheism, democratic or republican ideologies, etc.), she will be defenseless in such encounters. She will only be able to think and express herself according to those strict systems of thought, and that will be very limiting.

This approach to parenting, in some form or another, is widespread in the western world, and it is wrong. It is like trying to understand how the brain of a rat works by killing the rat, taking the brain out, and observing it in a non-working state, independent of the body. When one attempts to control all of the variables, the resulting conditions fail to represent those of the real world, for the real world is precisely that which contains all the variables, uncontrolled! Anything learned via such a method cannot be meaningfully applied in the real world.

How these analogies and examples can help us improve things, I will soon explain. There are constructive methods and solutions. The details of those methods will be for the individual parents and teachers to determine. All I will do is offer insights. You know your children the best, so adapt the concepts in your own way toward the one common goal: development of flexible thinking and viewpoints. There is a route for everyone. It is up to you to carve it for your children and for yourself.

There is not one generalized system of government, education, and economy that will satisfy all individuals. The ways individuals see things can change instantaneously. Creating a better world starts with better-thinking individuals. We can only hope that future systems will adapt accordingly.

To be continued…


Reason – The Business of Philosophy

“To say that a stone falls to Earth because it is obeying a law makes it a man and even a citizen.”  -C. S. Lewis

People who believe in science as a worldview rather than a method of inquiry – I call them scientismists – are fascinated by science because they cannot grasp it, just as all people who are not magicians are fascinated by magic. What little understanding they do have of it is, in principle, superficial. The difference between people’s perception of science and that of magic is that magic can always be explained. Magic plays a trick on one’s perception. That is magic’s nature as well as its goal. Science, on the other hand, cannot always be figured out. There simply is not a scientific explanation for everything (or even for most things). Nor is it science’s goal to explain everything! Science is an incremental process of collecting empirical data, interpreting it, and attempting to manipulate aspects of the environment accordingly for (mostly) human benefit. It is experimental and observable. It is, as I will explain, inductive. Unfortunately, sometimes unknowingly, human subjectivity intervenes in at least one of these three steps, exposing science’s limits through ours. So, where does reason fit into this process?

What “Reason” is NOT

One problem with scientism is that it equates science and reason. This is incorrect. Although philosophers of science, most of whom are scientists themselves, have debated the definition of science since it was called ‘Natural Philosophy’, there is one thing that we do know about it and about the difference between it and reason. Science deals with questions of ‘how’. It describes the inner workings, the technicalities, of observable processes and states of affairs. Reason deals with questions of ‘why’. It explores lines of thinking – fundamental goals, purposes, and meanings – for those processes and states of affairs as well as for many other non-scientific processes and states of affairs. Having said that, reason is necessary for science, but it is immeasurably broader.

Science cannot alone answer why-questions. Claiming that it can is a mark of scientism. Why is that?

I will illustrate this with an example from Dr. Wes Cecil’s 2014 lecture about scientism at Peninsula College:

Engineering, which is a type of science that has its foundations in calculus, can tell us how to build a bridge. Engineering can build the biggest, longest, strongest bridge one could possibly imagine. How will the bridge look? We marry science and art to make the bridge beautiful as well as functional. So, even at this first stage of building a bridge – design – science cannot stand independent from even art, which seems so much more abstract.

Furthermore, why do we need to build a bridge? This is a question of reason, not of science. The answer seems to be “to get to the other side of the river”. But what the engineer (who is also a businessman who wants to land the deal for this highly lucrative project) might neglect is that building a bridge is not the only way to get to the other side of the river. Perhaps a ferry would be an easier, more cost-effective option. The engineer can tell us how to build a ferry too, but making the decision between the bridge and the ferry, ultimately, is not the engineer’s business.

Even once the decision has been made to build the bridge, several more questions arise: Who will pay for the bridge? How will they pay for it? Where exactly will the bridge be? Who will be allowed to use it: motorized vehicles only? Bikes? Pedestrians? These are not scientific questions, nor are most questions in our everyday lives. They are economic, ethical, and political questions that, much like the scientific question of how to build the bridge, require some application of reason, but they cannot themselves be equated with reason. Reason is something as different as it is important to these goals, processes, and states of affairs.

What is Reason?

Reason is a skill and a tool. It is the byproduct of logic. Logic is a subfield of philosophy that deals with reasoning in its purest forms. So, if someone wants to believe that science and reason are the same thing, then they are clearly admitting that science is merely a byproduct of a subfield of philosophy. I am sure that most scientismists’ egos would not be willing to live with that. Although something like that claim may well be the case, it is not what I am attempting to prove here. Let’s focus on reasoning.

We say that an argument is valid when the truth of the claim follows from the truth of its evidence. There is a symbolic way to express this. For example:

If p, then q; p; Therefore q.

What we have here is not a statement, but rather, a statement form called Modus Ponens. It is a formula into which we can plug anything for the variables p and q, and whether or not the resulting statements are true, the argument will be valid according to the rules of logic. Try it for yourself! But remember, ‘validity’ and ‘truth’ are not the same thing.
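The point that validity is a property of the form, independent of the truth of what we plug in, can even be checked mechanically. Here is a minimal sketch (the function names and approach are my own illustration, not from any logic library): it enumerates every truth assignment for p and q and confirms that no assignment makes both premises true while the conclusion is false.

```python
from itertools import product

def implies(a, b):
    # The material conditional: "if a, then b" is false
    # only when a is true and b is false.
    return (not a) or b

def modus_ponens_is_valid():
    # An argument form is valid when no truth assignment makes
    # every premise true while the conclusion is false.
    for p, q in product([True, False], repeat=2):
        premises_true = implies(p, q) and p  # premises: "if p then q" and "p"
        if premises_true and not q:          # conclusion: "q"
            return False  # found a counterexample
    return True

print(modus_ponens_is_valid())  # True: the form holds no matter what p and q stand for
```

Notice that the check never asks whether p or q is actually true of the world; it only asks whether the conclusion could fail while the premises hold. That is exactly the distinction between validity and truth.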

The example above describes deductive reasoning; it is conceptual. Immanuel Kant called the knowledge we gain from this process a priori – knowledge which is justified independently of experience. Mathematics is a classic example of deductive reasoning. It is a highly systematic construction that seems to work independent of our own experience of it, and that we can also apply to processes like building a bridge.

There is another type of reasoning called inductive reasoning. It is the process of reasoning based on past events and evidence collected from those events. The type of knowledge that one gains from inductive reasoning, according to Kant, is called a posteriori. This is knowledge that is justified by experience rather than by a conceptual system. For example: We reason that the sun will rise tomorrow because it has every day for all of recorded human history. We also have empirical evidence to explain how the sun rises. However, the prediction that the sun will rise tomorrow is only a prediction, not a certainty, despite all the evidence we have that it will rise. The prediction presupposes that not one of countless possible events (the sun burns out, an asteroid knocks Earth out of orbit, Earth stops rotating, etc.) will occur to prevent it from happening.

Illusions of Scientism

The mistake that scientism makes is that it claims that the methods of science are deductive when they are actually inductive. Reductive science (that which seeks to explain larger phenomena by reducing matter down to smaller parts) most commonly makes this mistake. More often than not, those “smallest parts” are laws or theories defined by mathematical formulas. Scientismists believe that the deductions made by mathematical approaches to science produce philosophically true results. They do not. The results are simply valid because they work within a strict, self-justifying framework – mathematics. But how applicable is mathematics to the sciences, and how strong is this validity?

“The excellent beginning made by quantum mechanics with the hydrogen atom peters out slowly in the sands of approximation in as much as we move toward more complex situations… This decline in the efficiency of mathematical algorithms accelerates when we go into chemistry. The interactions between two molecules of any degree of complexity evades mathematical description… In biology, if we make exceptions of the theory of population and of formal genetics, the use of mathematics is confined to modelling a few local situations (transmission of nerve impulses, blood flow in the arteries, etc.) of slight theoretical interest and limited practical value… The relatively rapid degeneration in the possible uses of mathematics when one moves from physics to biology is certainly known among specialists, but there is a reluctance to reveal it to the public at large… The feeling of security given by the reductionist approach is in fact illusory.”

– René Thom, mathematician

Deductive reasoning and its systems, such as mathematics, are human constructs. However, how they came to be should be accurately described. They were not merely created, because that would imply that they came from nothing. Mathematics is very logical and can be applied in important ways. However, the fact that mathematics works in so many ways should not delude us into thinking that it was discovered either, for that would imply that there is some observable, fundamental, empirical truth to it. This is not the case either. Mathematics and the laws it describes are found nowhere in nature. There are no obvious examples of perfect circles or right angles anywhere in the universe. There are also no numbers. We can count objects, yes, but no two objects, from stars to particles of dust, are exactly the same. What does it mean when we say “here are two firs” when the trees, though of the same species, have so many obvious differences?

According to Gottlob Frege, what a statement about a number asserts is a concept, because any application of it is deductive. So, I prefer to say of such systems that they were developed. They are constructed from logic for a purpose, but without that purpose – without an answer to the question ‘why do we use them?’ – they are nonexistent. Therefore, there is a strong sense in which the application of such systems is limited to our belief in them. Because we see them work in so many ways, it is difficult not to believe in them.

Physics attempts to act as the reason, the governing body of all science, but it cannot account for all of the uncertainty that scientific problems face. Its mathematical foundations are rigid, and so are the laws that they describe. However, occurrences in the universe are not rigid at all. They are random and unpredictable and constantly evolving. Therefore, such “laws” are only guidelines, albeit rather useful ones.

As Thom states, “the public at large” is unaware of the lack of practical applications of mathematics to science, and it is precisely that illusion of efficiency that scientism, which is comprised of both specialists and non-specialists, takes for granted. It is anthropocentric to believe that, because we understand mathematics, a system we developed, we can understand everything. Humans are not at the center of the universe. We’re merely an immeasurably small part of it.

The Solution

In the same way René Thom explains that mathematical formulas do not directly translate to chemistry and biology, deductive reasoning, more generally, has very limited application in most aspects of our everyday lives. Kids in school ask, “Why am I learning algebra? I’ll never use it.” It turns out, they are absolutely right. Learning math beyond basic addition, subtraction, multiplication, and division is a waste of time for most. What they should be learning instead are the basics of reasoning. Deduction only proves validity, not truth, and induction has even greater limits, as David Hume and many others have pointed out. People, especially young children, are truth-seekers by nature, which is to say they are little philosophers.

There is a solution: informal logic, the study of logical fallacies – the most basic errors in reasoning. Informal logic is widely accessible and universally applicable. If people are to reason well, informal logic is the most fundamental way to start, and start young we should. Children, in fact, have a natural tendency to do this extremely well.

To be continued…

“Ideology and the Third Realm” – What is Philosophy?

In Dr. Alva Noë’s book Varieties of Presence, many important aspects of perception are discussed. He makes a convincing case that we achieve contact with the world through skill-based action. Our understanding of a new experience is a collective recurrence, both conscious and unconscious, of past experiences. It is a dense work that deserves the attention of other contemporaries who concern themselves with matters in cognitive science and philosophy of mind. Perhaps I will do a full review of this book at a later date, but for now I would like to focus on a matter addressed in the final chapter entitled “Ideology and the Third Realm” which takes an important departure from the philosophy of consciousness and neuroscience.

What this chapter does is something that every philosopher should do periodically: broadly revisit the fundamental importance of philosophy as it relates to the context of his work. I will be a bit more general than that since I am not a “professional” philosopher. The role that philosophy plays in the world seems to be constantly changing. But is it? Perhaps it is only the popular understanding of what philosophy is that changes. I think that is, in part, the case, but it has more to do with the uses of philosophy. Some of those uses have remained constant since the beginning of recorded thought while others change by the minute. For this reason, it is impossible to pin down. But one need not pin it down. Philosophy exists to be used, and it is a set of skills that will hopefully never become extinct. There is no dictionary definition that can sufficiently explain it, much less emphasize the field’s vital presence. I will give a general overview of the chapter but mainly share my thoughts about what philosophy is and why it is not only relevant, but necessary. Before I continue, I should define an important term which will be mentioned several times in this piece.

Q.E.D. (Latin) quod erat demonstrandum – meaning “that which was to be demonstrated”

Many people, in and out of academia, naively think that philosophy deals with questions that warrant a Q.E.D. response. When you take a philosophy course, you often have to write at least one argumentative essay in which you choose a position of a philosopher whom you have read, attempt to prove him wrong, and then attempt to formulate a complete view of your own with supporting evidence. This way of “doing philosophy” is popular in undergraduate, base-level courses. It helps you to develop reasoning skills that can be applied anywhere. This is important, no doubt, but this is not where philosophy ends. Why? First, writing is not even necessary for “doing philosophy”. The only thing that is necessary, I would argue, is thinking. Thinking must be assisted by reasoning, but this is only the start.

This does not imply that we should identify the philosopher as one who locks himself up in his ivory tower and speculates about a deluded, idealized world. To philosophize well, one must also be able to communicate his ideas in some way, and that will involve language, whether spoken or written. This is one reason philosophy courses are difficult: one must already have a certain level of reading, writing, and speaking proficiency to succeed. The full title of the final chapter of Noë’s book is “Ideology and the Third Realm (Or, a Short Essay on Knowing How to Philosophize)”. Since language is such a crucial part of this issue, I will begin by taking a language-based example from that chapter:

‘The King’s carriage is drawn by four horses’ is a statement about what?

a) the carriage;  b) the horses;  c) the concept it asserts;  d) other

Immediately, one might think that the answer is ‘a) the carriage’. This seems completely logical, given how most of us understand language. ‘Carriage’ is the subject of the sentence, so any terms that follow should (theoretically) describe it. It is certainly not ‘b) the horses’ because that is the object receiving the action, and nor can the answer be ‘c) the concept it asserts’ because nine out of ten people in the room don’t know what the hell that means. Right? Good. It’s settled.

Gottlob Frege had other ideas. He thought that a statement about numbers is a statement about a concept. When we attempt to answer the question about the subject matter of the “king’s carriage” statement, we are speaking in conceptual terms. We are not using the statement to assert anything. So, the answer must be ‘c’. He gives more reasons for this, of course, and he makes us realize that there is a sense in which we become confused about what we mean when we say ‘The king’s carriage is drawn by four horses’. However, despite the piercing quality of Frege’s argument, we have a much stronger sense that we are unconvinced by his theory of language.

The problem with Frege’s claim, for most of us, seems to be that he had a preconception of the meaning of the statement ‘the king’s carriage is drawn by four horses’ before he was even asked the question. He had already established that any statement about a number, without exception, is a statement about a concept, so he was able to answer the question without thinking. The problem with our rejection of his claim is that we are doing exactly the same thing. We also answered without thinking. We held the preconception that every sentence is about its subject. This preconception is guided by the larger logical construction by which we understand language, and it is certainly no more correct than Frege’s view simply because nine out of ten people in the room agree that it is (that would be to commit ad populum). We take our theory of language for granted, and perhaps Frege takes his for granted too. There seems to be no Q.E.D. conclusion here. What we are all doing, if we become inflexible, if we stick to our answer to the question without sufficient evidence to support it, is committing what I call the ideological fallacy.

However, subscribing to ideologies is not always a fallacious thing. It is only when an ideology is applied in a dogmatic way that it becomes wrong. When an evangelical Christian lives by Jesus’ principle, “love your enemies”, that can have very positive effects. It may minimize conflict in his life. It may allow him to stand strong in the face of racial adversity. It may allow him to accept people more openly, and very often the favor will be returned. However, the favor is not always returned if the Christian is careless and thoughtless. Despite his belief that he loves his enemies, participating in radical evangelical activism would intrude on others and create more conflict, leaving his conception of “love” open to question. It takes Christianity out of context and misapplies it to the world in a negatively ideological way. There is nothing about the beliefs in themselves that is illogical, destructive, or even wrong. It is how they are used that determines that.

I will use another example. Evolutionary biology can study the preserved skeletons of million-year-old Homo erectus and learn about how we sapiens, three evolutionary stages later, came to be. This could contribute to our understanding of how humans will continue to evolve (or devolve). However, evolutionary biology can only contribute a small piece to the puzzle of predicting the future of humankind. It needs influence from many other fields to even begin to solve any of its own problems. So, when Richard Dawkins uses the broad concept of evolution to attempt to disprove creationism in any one of its countless forms, he is taking his work out of context and applying it in a radical, dogmatic, negatively ideological way. There is nothing about evolutionary biology, as a field, that is wrong. It is a highly useful method of inquiry. But there is still plenty we do not know about how humans have evolved. We generally just accept that they did with the minimal evidence that we have, just as the evangelical accepts his own conception of loving his enemies based solely on Jesus’ teachings. In this case, both parties look equally silly.

Of course, the example above presents two extreme cases. Although we answer the “king’s carriage” question one way and Frege answers it in another, and we seem to have to agree to disagree, there is still a sense in which both sides think the issue is objective in nature and that it deserves further discussion. In order to have this discussion in a logical, respectful, open manner, we must become philosophers, and one need not go to school to achieve this. Alva Noë wonders how we might categorize our dealing with the “king’s carriage” question. It is not in the realm of the material (e.g. biology), nor is it in the realm of belief (e.g. religion). It seems to be within some third realm. Noë begins to explain with this quote:

The point is not that Frege or we are entitled to be indifferent to what people say or would say in answer to such a questionnaire. The point is that whatever people say could be at most the beginning of our conversation, not its end; it would be the opportunity for philosophy, not the determination of the solution of a philosophical problem. (Noë, 173)

“At most…”, Noë says, “(what other people say is) the beginning of our conversation… the opportunity for philosophy…” This is another reason philosophy is so difficult! At the very most, when our view stands in opposition to another, we may only have the opportunity to do philosophy. We rarely get there. When we do get there, two or more people are concerning themselves with the third realm of a problem. What is the third realm? It is the realm of possibilities with minimal influence from ideologies. It is abstractly objective yet, as I will explain later, not in the realm of matters Q.E.D.

Where is this third realm? Well, ‘where’ is the wrong question. Bertrand Russell once said of philosophy that it is “in the no-man’s land between science and religion” because it always seems to be under scrutiny from both sides. Perhaps, in some cases, this is correct. It can serve as a mediator between two extremes, but, on the surface, this only explains one of unlimited applications of philosophy.

Upon first reading or hearing Russell’s quote, one might be inclined to place philosophy in between science and religion because it deals with reason over belief (like science) and thought without quantifiable experimentation (like religion). This would be a shallow interpretation that lacks crucial insight. Russell was perhaps a bit too concise for the average interpreter. He did not mean, as I understand him, that philosophy is inside the space between science and religion. His remark has deeper implications which resonate with those of Noë (despite the fact that Russell was a logical positivist and Noë is a phenomenologist, so they would probably have a duel for other reasons). Explaining philosophy has nothing to do with where we should fit it in relation to other fields. It has to do with how we can apply its skills, and in that way it is most unique. Those skills are skills of thought. Developing those skills first requires one to look inward, rid himself of bias, and then turn outward to consider all possibilities. This is still only the beginning. Once we achieve this skill of thought, what do we do with it? We continue to practice and improve it. How? The answer is simple, but the application seems, in some cases, impossible. We communicate.

We share our ideas with others who have, to some degree, developed the skill of clear thinking. Of course, communication, whether written, oral, or otherwise, is a practical skill in itself that develops naturally, mostly prior to but also simultaneously with the skill of thinking. We tend to adapt our ability to communicate only to the situational extents that we need it, and that can be limiting. When doing philosophy, anyone can participate, but only to the extent that they can think clearly. Philosophy tests those limits, which is why both science and religion so often scrutinize it. Though they deal with subject matter that seems contradictory, (mechanistic) science and religion do have one general thing in common: dogmatic ideology. Philosophy, on the other hand, is perhaps the only field that makes the elimination of dogmatism one of its primary goals.

Doing philosophy is not only about increasing the degree to which people can think, but about being open to different forms of thought as well. What is fortunate in this regard is that each person in the conversation, if one is to find himself in such a conversation, has probably achieved their skill of thought through different means. For example:

There may be one who developed his thinking through philosophy itself, who rigorously studied informal logic to learn how not to commit errors in reasoning. He may also be able to contribute the history of thought to the conversation and explain why certain schools of thought are obsolete in academic philosophy. There might also be a more scientifically-minded person who, in a graduate school lab, performed the same experiment under the same conditions hundreds of times, but got variable results. He questioned why this was happening (if the laws of physics are supposed to be constant), so he turned his research to the inconsistencies and realized that uncertainty transcends mathematical equations. He is now able to think more broadly about his work. There might also be a Buddhist in the group who practices intensive meditation. He can shut out influence from his sensory world and walk on hot coals without getting burned, or submerge himself in freezing-cold water without developing hypothermia. He is able to clear his mind of all unnecessary matter. Each person achieves the same thing – to think clearly, skeptically, critically – through different means. They each learn from one another and gain a broad range of insights.

Also, and perhaps most importantly, each person in the conversation should be genuinely interested in learning new perspectives in order to improve their own points of view. There is a sense in which someone may have achieved access to the third realm of conversation to a lesser degree than the others, and at a deeper point in the discussion, he gets flustered and has to back out. This is perfectly fine as long as he does back out, at least until his temper cools (if he does not back out, he will disrupt the conversation). He has pushed his boundaries of clear thinking to a level that the others have not, and that can be a very constructive or destructive thing, depending on his mindset. But it is vital that all parties directly involved maintain their composure throughout the conversation. If there are any unsettled nerves, it is almost certain that at least one participant is not being genuine, but rather, is too outwardly focused and is perhaps ultimately trying too hard to prove himself right or the others wrong. Although such a participant might seem to contribute insight to the conversation, he will inevitably expose himself as operating from within an ideology, thereby rendering himself a nuisance. Philosophy is no activity for the pretentious or egocentric, contrary to popular belief. In fact, the absolute contrary is the case.

Do any philosophical questions warrant a Q.E.D. response? (Does philosophy ever prove anything?)

No. In case this is not already clear, there are, in a sense, no “philosophical questions”. There are only philosophical approaches to questions. Approaching the third realm of a problem requires one to be, as stated earlier, abstractly objective (or perhaps objectively abstract). There are limits to how objective one can be, no doubt, but the aim of advancing thought is to learn more and more about the world and how those in it think, so that we can improve on that, both individually and collectively. It exposes dogmatism and reveals the sheer greyness in any concrete matter. Need I give examples as to when this might be useful? I challenge anyone to give an example of when it is not, and thereby present an opportunity for doing philosophy! This is why philosophy is so widely applicable.

To draw an analogy – toward the end of Noë’s final chapter, he mentions Immanuel Kant’s aesthetic view that the reality of one’s response to a work of art is based in feeling – it is not contingent on one’s ability to explain it. Similarly, Clive Bell described a “peculiar aesthetic emotion” that must (first) be present in something for it to be considered art. It is that feeling you get when you listen to a beautiful composition, watch a film that evokes tears, or look at Picasso’s Guernica after you have heard the gruesome story behind the painting. I had experienced this aesthetic emotion many times, but it was my former professor at the University of New Orleans, Rob Stufflebeam, who, whether he intended to or not, led me to realize that all of those experiences involved the same exact emotional response. Perhaps only those who have experienced it will recognize it; it is certainly something that need not, and often cannot, be explained.

Likewise, a philosophical approach to a problem is, at its very best, not an emotional experience as with art, but an all-encompassing intellectual experience. It is not a heated argument, nor is it even a controlled debate. It is a respectful, open-ended discussion about ideas between two or more people in an intimate setting. It raises the awareness of each person involved to a broad level of skepticism that, perhaps very strangely, brings with it an aura of contentment. It is obviously not the same feeling one gets with the peculiar aesthetic emotion, but it is parallel in the sense that when you are part of it, you really know. That reality seems to transcend explanation.

Final Thoughts

Alva Noë has developed this idea about perception: “The world shows up for us, but not for free. We achieve access to it through skill-based action.” It is a combination of developing our conceptual and practical skills that allows us to understand the world and live in it. Achieving access to the third realm of a question, as I would consider it, is one of those countless skills. It comes more easily to some than to others. Just as one person might naturally have an ideal physiological makeup for learning how to swim (lean build, broad shoulders, large feet, etc.), another person’s brain might seem to be better wired for clear thinking. Everyone, to some degree, can learn to swim with the proper amount of training. Likewise, everyone can, with practice, think clearly. The more one practices by looking inward, ridding himself of bias, and working up the courage to subject himself to critique, the more he can contribute to the conversation in his own unique way. How much one wants to participate is solely up to him, but to not participate at all is to miss out on a hugely important (and my personal favorite) part of the human experience.