Tinder Fun With a Feminist

I’m Britton, as you should know, and below you’ll find the bio I wrote for my Tinder profile. If you don’t know what Tinder is, then get your head out of the sand and read about it here.

[Screenshot: 2017-05-22 16.49.15]

I was in New Orleans the other day, getting my swipe on, and then I came across this fine, older lady.

[Screenshot: 2017-05-22 12.40.00]

The first things, ‘politically progressive’ and ‘the f-word’, I admit, probably should have raised red flags before even her shitty taste in music did. Those terms on their own hint at far-left political views, but the two of them together scream ‘SJW’. However, she was hot, and that’s very rare among feminists, so I read into her words and saw deeper possibilities. I was hoping that maybe we could talk some philosophy, giving her the benefit of the doubt that her knowledge of that subject wasn’t confined to new-wave feminist crap. Hey, maybe she was even a feminist of the second-wave, non-radical kind, and ‘progressive’ just meant that she was kind of liberal and open to reasonable and necessary change. Maybe she’d even have a cat named Elvira. With this optimistic attitude, I swiped right and immediately tested her humor to see how “open” she really was.

[Screenshot: 2017-05-22 12.24.23]

BOOM! No fun or games with this one. Did I “proudly proclaim” that I am politically incorrect? Reread my bio, and let me know. I think I’m just straightforward about what I want out of my Tinder experience. She could have easily swiped me left if my intentions didn’t line up with hers. Looking back, though, maybe I should have ended my first message with a winky face. 😉

[Screenshot: 2017-05-22 12.26.28]

Do you value truth, Jessica? DO YOU? We’ll find out. Also, Jessica, I’ll be addressing you directly from here on. Wait, is it ok that I call you by your name, or would you prefer something else? I don’t want to be too incorrect and risk “invalidating your existence”.

[Screenshot: 2017-05-22 14.12.41]

Yeah, let’s define a term together! That sounds like a fun philosophical exercise. Maybe you’ll even return the favor by asking me how I would define the term, and then we’ll find some common ground, bettering both of our conceptions of the world. Learning stuff is fun! You read philosophy, so you agree, right?

[Screenshot: 2017-05-22 12.29.22]

Annnnnd there it is. You pretty much nailed it, Jessica. I’m guilty of whiteness, so there’s no need to ask me what I think ‘political correctness’ means. Your understanding of how language works, on the other hand, seems a bit strange, and the philosophy you read may be of questionable quality. My credibility on that topic comes from my education in linguistics and the philosophy of language. But you’re attempting to “invalidate” me because I’m… white? Hmmm.

I don’t think that speech is an activity so consciously aimed toward respect, nor do I think it’s a good idea to blindly respect people at all. In fact, it’s dangerous. I’ll spare you the technical linguistic part of the argument because I’m starting to sense that you have a screw or two loose, but I still must address the respect-issue.

Also, how are you so sure that I’m not black or transgender? If you respected me, then you would have asked about my preferred identity because race and gender are determined whimsically and have no biological basis, correct? No, you should have simply requested a dick pic, Jessica. Truth requires evidence, and I have plenty of it.

[Screenshot: 2017-05-22 12.31.40]

So, maybe there’s more to political correctness than your definition, Jessica, and maybe I know some stuff that you don’t. Maybe you’d be interested in hearing it. Maybe if you weren’t so keen on blindly respecting others, then you wouldn’t be so liable to get mugged and raped in a dark alley in New Orleans. Or, maybe you’d like that because you’d become a martyr for your ideology. At this point, you’re not giving me any reason at all to respect you, but I do fear for your safety. After all, you’re right that the world isn’t a very kind place.

[Screenshot: 2017-05-22 14.39.07]

[Screenshot: 2017-05-22 14.40.34]

I figured I’d play the “patriarchy” card since you already accused me of being part of it by virtue of my straightness, whiteness, and maleness. What did you expect? Why did you swipe me right if you hate me by default, unless you wanted to hate-fuck me (shit, I may have missed my shot)? I mean, you’ve seen my pictures. Chances are that I’m not black under my clothes. In fact, I’m even WHITER there. Well, actually, there is a very small part of me that is kind of tan.

[Screenshot: 2017-05-22 12.35.42]

[Screenshot: 2017-05-22 15.00.48]

*ignores grammatical errors and moves on*

I know I’m an asshole, Jessica. There is no need to repeat yourself. But, does being an asshole make me wrong? No, Jessica, you’re the meanie who committed ad hominem. I also didn’t appeal to emotion to argue my point. You just took it that way. Taking offense and giving it are NOT the same thing. That’s Philosophy 101.

But…do save me! Please save me from my problematic ways so I can be more compassionate like you and make the world a more progressive place! Or, do I need a degree in women’s studies to be infected with your profound wisdom? If it’s LSU that infected you, then you’re right that there is no hope for me because I dropped out of that poor excuse for a higher-education institution after just one semester of grad school.

On the other hand, I could help you by revealing your greatest contradiction, and maybe even give you one more chance to get laid by me, knowing well that so few men would have gotten even this far with you. I mean, this is Tinder. Why else would you be here? Yeah, that’s what I’ll do because I want some too. I’ve learned to accept that liking sex makes women delicate flowers and men oppressive misogynists. It’s cool, really, I don’t need to be reeducated. I’ll even let you play the role of misogynist, and I’ll be the victim, and you can oppress deez nuts all you want.

[Screenshot: 2017-05-22 15.11.27]

That’s where it ended. So…

What the hell is going on here?

I don’t think that I need to go into detail about what is going on here. There are plenty of people who have done that very well already. For example, Dr. Jordan B. Peterson in this brilliant snippet from the most popular podcast in the world. The general point I want to make is that we are in a strange place where people like Jessica are multiplying exponentially by the semester, thanks to politically correct ideology infecting universities, business administrations, legislatures, and now even Tinder (as if Tinder doesn’t already have enough spam)! This is the time for talented and capable people, mostly men, to stop ceding power to the people who live in those boxes; they’re wrong, and they’ve snuck their way into power without truly earning it. To stand up for truth is to stand up for yourself. However painful that may be now, it is absolutely necessary for the survival of our species. After all, if we were all angry, 35-year-old feminist virgins, of course humanity would end.

Since we aren’t all like Jessica, one day we will be without these people completely. Let’s give them what they want: spare their feelings, thus depriving them of the open, truth-seeking dialogue that would mold them into stronger moral beings and free them from the narrow and suffocating constraints of the feminist ideology. Since they aren’t open to that sort of thing, they will eventually self-extinguish under their childless philosophy and rot in the miserable hell that they’ve created for themselves.

The Personality-Character Distinction

As you may know, I am a big fan of personality studies. The system that has been researched the most by academics in psychology is the Big 5, and that’s where one who is interested in the most cutting-edge, fact-based research should look, but there are others that have taken off in popularity. Myers-Briggs, Enneagram, and HBDI are just a few that are used by individuals to improve their lives and by consulting groups all over North America to revamp businesses. Despite how little formal, scientific research has been done to confirm the validity of these systems, the results generally speak for themselves. They have all had reasonable success in boosting employee satisfaction and productivity, and they have increased profit for those businesses too. Regardless of whether or not you “believe in” personality, there is something to it, and to explain these systems away because they may not “have all of their facts straight” is to overlook the utility they provide in personal development.

It’s not that the facts don’t matter, but what constitutes ‘fact’ isn’t easy to determine. The average half-life of a scientific fact is only seven years, and that is an average across all scientific fields. As we know, facts in physics tend to hold out longer than those in the social sciences, but when a fact in physics turns out to be wrong, it is often much more broadly and profoundly wrong and thus more difficult to accept because so much has been built on that foundation.
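That seven-year figure can be plugged into the standard exponential-decay formula. Here is a minimal sketch in Python; the function and its names are my own illustration, while the decay formula itself is just ordinary half-life arithmetic applied to the number cited above:

```python
# Exponential decay: the fraction of "facts" still standing after t years,
# given a half-life h, is 0.5 ** (t / h). The 7-year default is the
# average half-life figure mentioned above.

def surviving_fraction(years: float, half_life: float = 7.0) -> float:
    """Fraction of a field's facts expected to survive after `years`."""
    return 0.5 ** (years / half_life)

print(surviving_fraction(7))   # 0.5   -> half overturned after one half-life
print(surviving_fraction(21))  # 0.125 -> seven-eighths gone after three
```

Fields with shorter half-lives (the social sciences, say) would simply use a smaller `half_life` argument, which makes the surviving fraction fall off faster.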

There is nothing that people hate more than an idea which compromises the integrity of their foundation, but when truth happens, we must be willing to change accordingly. This is where science can become its own worst enemy, because after all, it takes humans with subjective goals and motivations to interpret numbers and to make something useful out of those findings. Remember, science is a tool, not a belief system. But that’s a discussion for another time. This is especially important today for social issues in the universities. I think that moving forward, personality research will play a crucial role in creating a stronger foundation for the humanities and social sciences (which are currently corrupted by neo-Marxism) and for better understanding how to sort out this massive mess, which was made very real to me just the other night in the pub.

I was out for some beers, having an argument with a couple of old grad school friends. They were debating why women are underrepresented in philosophy departments when they are overrepresented in the other humanities. Their disagreements seemed to be over narrow social issues, as one might expect from two young, impressionable minds whose opinions haven’t yet been optimized to think outside the social constructionist box of academia. One (the female) argued that direct oppression of women was the cause, and the other (the male) argued that systemic corruption was the problem, which, unbeknownst to them, is more or less the same thing, so their disagreements were fundamentally semantic. When I brought up differences in male and female personalities as a solution, leaving open for discussion what those differences might be (even though I already knew), they seemed to reject it without giving it any serious consideration. Shocker. They didn’t want to accept that people might actually be innately different (as a philosopher type, why wouldn’t you want to be different, I thought?). That is not to say, as I tried to explain, that nurture doesn’t play some crucial role, but they insisted on sticking to the nurture side of the debate while rejecting altogether the nature side. I was even being more centrist about the issue than I should have been because I wanted to facilitate good discussion, but that didn’t work, as it was two versus one.

There were a couple of ironies in their rejection of my ideas. The first is that they were clearly embodying their natural male-female differences in the specific positions they originally took. Generally speaking, women are more agreeable and are more interested in people, while men are more interested in ideas and systems, so it’s no wonder my female friend was defending the group-identity-based female oppression position, and my male friend was defending the politico-systematic corruption position. I dared not point that out, but they became more aggressive once they began to realize that their positions were more or less the same and only founded on semantic disagreement. From that point, their team approach in attempting to defeat me brought up the second irony: that in agreeing with each other in the fashion that they did, they were acting out the group identity role that is so characteristic of people who take the far-left position on social issues, which is something that they had admitted to. They oriented their arguments onto a foundation of equality, kindness, and compassion rather than on a desire to get to the truth, or to let truth present itself through three-way discussion. When I explained the Pareto distribution, the phenomenon whereby, given equal opportunity, people’s natural differences will manifest, causing a necessarily unequal distribution of success, they simply got mad (to make a short story shorter). In my male friend’s defense, he unknowingly proved that his constructionist position was at least somewhat justified by virtue of the simple fact that he is from Seattle. He is a slave of his own cunty-liberal reasoning, after all. My female friend, on the other hand, comes from a conservative family in Georgia, and she carries a gun in her purse, so what the hell is her excuse?
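For readers unfamiliar with the Pareto distribution mentioned above, the statistical shape of the claim is easy to demonstrate. This is a hypothetical Python sketch of my own, not anything from the conversation; the shape parameter alpha ≈ 1.16 is the value commonly said to reproduce the classic “80/20” split:

```python
import random

random.seed(42)  # fixed seed so the illustration is reproducible

# Draw 10,000 "success" scores from a Pareto distribution. The heavy tail
# means a small fraction of draws account for most of the total.
alpha = 1.16  # shape parameter; ~1.16 roughly yields the 80/20 split
scores = sorted((random.paretovariate(alpha) for _ in range(10_000)),
                reverse=True)

top_fifth = scores[: len(scores) // 5]   # the top 20% of "people"
share = sum(top_fifth) / sum(scores)     # their share of total "success"
print(f"Top 20% hold {share:.0%} of the total")
```

Nothing here proves the sociological claim, of course; it only shows what an unequal-by-construction outcome looks like when every draw comes from the same underlying process.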

Anyway, it seemed that the further we went down the rabbit hole, the more we started to talk past one another, for we were operating at different levels of analysis. They thought I was flat out wrong, and I thought they were missing the point, so we were going nowhere fast. They first disagreed with each other about which of the narrow social issues was the cause of the lack of women in philosophy, but they both agreed on the broader presupposition that social constructionism was correct. When I questioned that point, they got angry. This is what we’re supposed to do in philosophy, though – broaden an issue as much as we possibly can in order to find the most reasonable general perspective on which we can ground the known facts. If you can’t think that broadly, or at least keep your emotions in check while others are doing so, then philosophy is not for you. As we are all graduate-level philosophizers, I thought that would have been fun. Well, it was, but it was just a bit dirtier than any of us would have liked!

Looking back, a crucial distinction arose that I now see should have been dealt with from the beginning. That is the distinction between personality and character. Personality is what I consider to be one’s innate, baseline temperament. This is obviously difficult to control for scientifically because there are so many layers of environmental, social, and cultural influence accrued over a lifetime and stacked on top. But, there is still the personality which is your default mode of temperament that goes largely unchanged throughout your life. This is why two or more siblings raised under identical conditions will turn out so different – it’s because they are different. They require different sorts and degrees of attention. How that personality is cultivated, though, encompasses one’s character (which is more or less the same concept as Aristotle’s “State of Character” that he describes in his Nicomachean Ethics). This is where free moral will comes in. One habituates himself into making the right moral decisions to cultivate his virtues, and that forms the character. Perhaps I should call personality temperament, and character personality. Perhaps this semantic point is where my friends didn’t get it. Whatever. Semantics. I’ll be clear from here on.

What the social constructionist gets more right than the radical materialist personality advocate (Eric Braverman, for example) is that, at the end of the day, one’s character is what is important, and that we can habituate ourselves into projecting a certain image that can lead us to a successful and fulfilling life. What they get wrong is that we are slaves to societal norms, that we’re all the same, and that pushing back against the patriarchy is the only thing we can do about it. Funny, this view can be explained from a personality perspective. Social constructionists are liberal in their political views, which implies that they are generally low in the Big 5 trait conscientiousness, which deals with orderliness, industriousness, organization, etc., so they wouldn’t want to put in the necessary work to make positive changes in their lives to begin with. By this logic, they’re simply not allowed to deny the existence of personality. What the materialists correctly presuppose, probably without knowing, is that we should come to understand our baseline temperament, and when cultivating our personality into character, we should not stray from that default mode of being, or else we will live a dishonest and unfulfilling life. What they get wrong, ironically, is that life has no purpose and that we are nothing more than our biology. Pragmatically speaking, this can’t work either. I challenge a materialist to go out into the world and actually attempt to live as though his life has no purpose – as though his thoughts and actions are predetermined by brain functions because he has no free will. One will necessarily fall into a nihilistic, self-deprecating philosophy which would lead to a quick and painful demise, not only for him, but for everyone around him for whom he is a purpose.

Our personality/temperament is our default mode, which we should strip of societal influence to properly understand our potential, that is, if we are individual enough to manage that. Allow Terence McKenna to give you some advice: psychedelic drugs can help. Our character is what we have made of that potential, and it is only a good character if we have taken the time to understand what lies beneath it. What lies beneath includes both good and evil things. Our character — our being — is the ever-evolving vessel we use to navigate the world that only we have the power to control. We cannot wholly exist apart from our environment. Our being is not our nature or our nurture, but it is precisely the abstract interplay between the two, and how we choose to act accordingly, without regret.

The False-Dilemma of the Nature vs. Nurture Debate

Before I begin, allow me to explain what I mean by false dilemma. A false dilemma is an error in reasoning whereby one falsely assumes that the truth of a matter is limited to one of two (or a select few) explanations. For example, the American presidential election. For another example, have you ever been stumped by a question on a multiple-choice test because you saw more than one possible correct answer (or no correct answer at all)? Perhaps you got frustrated because you felt that the test was unfairly trying to trick you? Well, you were probably right. This may have been an instance of your ability to recognize the false dilemma fallacy. Sometimes there are indeed any number of correct answers given any number of circumstances. There is often simply not enough information provided in the question for one choice to clearly stick out as correct. This might lead you to question the test in a broader sense. What is the purpose of this (presidential election, or) test? What is it trying to measure or prove? Without getting into that answer in too much detail (as this is not a post about the philosophical state of academic testing), I can say that such tests aren’t so much concerned with truth and meaning as they are with the specific program they support. That program may or may not have the best interests of the people in mind, and it may or may not be directly governed by the amount of money it can produce in a relatively short period of time. Anyway, that’s another discussion.

In a previous post entitled The Slate, the Chalk, and the Eraser, I compared a child’s mind to a slate, and I argued that as long as we write on it with chalk by teaching him how to think (rather than a permanent marker/what to think), then he will be able to erase those markings to make way for better and more situation-relevant ones in the future, once he develops the ability to make conscious judgments. This is an example that you may have heard before, and it can be useful, but by some interpretations, it may seem to rest on a false presupposition. Such an interpretation may raise the “nature-nurture” question that is so common in circles of science and philosophy. One might argue that if a child’s mind is truly analogous to a slate in the way I have put forth, then I should commit myself to the “nurture” side of that debate. That was not my intention. In fact, that debate, in its most common form, presents a false dilemma, so I can only commit to both or neither side depending on what is meant by ‘nature’ and ‘nurture’. The conventional definitions of these terms are limited in that they create a spectrum on which to make truth-value judgments about objects, experiences, phenomena, etc. We commit to one end of the spectrum or the other, and we take that position as true and the other as illusory. This is similar to the subject-object distinction I described in an earlier post. Perhaps comically, even the most radical (and supposedly-yet-not-so-contrary) ends of scientific and religious belief systems sometimes agree on which side to commit to, albeit for different reasons. That particular conflict, however, is usually caused by a semantic problem. The terms ‘nature’ and ‘nurture’ obviously mean very different things for radical mechanistic scientists and evangelical Christians.

Please keep in mind throughout that I am not criticizing science or religion in general, so I am not out to offend anyone. I am merely criticizing radical misinterpretations of each. Consequently, if you’re an idiot, you will probably misinterpret and get offended by this post as well.

Taking this description a step further, a false dilemma can be committed to any number of degrees. The degree to which it is committed is determined by at least two factors: the number of possible options one is considering and the level of complexity at which one is analyzing the problem. Any matter we might deal with can be organized conceptually into a pyramid hierarchy where the theoretical categorical ideal is at the top, and the further one goes down the pyramid, the more manageable but trivial the matters become. As a rule of thumb, the fewest options (one or two) and the lowest level of analysis (bottom of the pyramid) should give rise to the highest probability of a logical error because the bottom level of analysis has the highest number of factors to consider, and those factors culminate up the pyramid toward the categorical ideal. Fortunately, committing an error at the lowest levels of analysis usually involves a harmless and easily-correctable confusion of facts. Errors committed at higher levels of analysis are more ontological in nature (as the categorical ideals are per se) and can have catastrophic consequences. All sciences and religions structure their methods and beliefs into such pyramid hierarchies, as do we individually. They start with a categorical ideal as their assumption (e.g. materialism for some science; the existence of God for some religion), and they work down from there. However, neither religion nor science is meant to be a top-down process like philosophy (which is likely the only top-down discipline that exists). They’re meant to be bottom-up processes. For science, everything starts with the data, and the more data that is compiled and organized, the more likely we are able to draw conclusions and make those conclusions useful (in order to help people, one would hope). For religion, everything starts with the individual. Live a moral and just life, act kindly toward others, and you will be rewarded through fulfillment (heaven for western religions, self-actualization for eastern religions). These can both be good things (and even reconcilable) if we go about them in the right way. What are the consequences, however, if we go about them radically (which is to say blindly)? In short, for radical belief in a self-righteous God, it is war, and therefore the loss of potentially millions of lives. In short, for radical materialism, it is corruption in politics, education, and the pharmaceutical industry, the elimination of health and economic equality, and the potential downfall of western civilization as we know it. That’s another discussion, though.

For the nature-nurture debate, the false dilemma is the consequence of (but is not limited to) confusion about what constitutes nature and nurture to begin with, and even most people who subscribe to the very same schools of thought have very different definitions of each. First, in the conventional form of this debate, what do people mean by ‘nature’? Biology, as far as I can tell, and nothing more. We each inherit an innate “code” of programmed genetic traits passed down from our parents, and they from theirs, and so on. This code determines our physiology and governs our behavior and interaction with the outside world. Our actions are reactive and governed by our brain-computer, and free will is consequently an illusion. What is meant by ‘nurture’ on the other hand? Our experienced environment, and nothing more. Regardless of our chemical makeup, how we are raised will determine our future. There is no variation in genetics that could make one person significantly different from another if raised in identical fashion by the same parents, in the same time and place. We have no control over the objective environment we experience, so free will still seems to be illusory.

These positions seem equally shortsighted, and therefore, this problem transcends semantics. Neither accounts for the gray in the matter — that reality, whatever that is, does not follow rules such as definitions and mathematical principles. These are conceptions of our own collectively-subjective realities which make it easier for us to explain phenomena which are otherwise unfathomable. On this note, we could potentially consider both nature and nurture phenomenal. That is an objective point on the matter. The first subjective problem is that both positions imply that we don’t have free will. Sure, there are unconscious habits of ancient origins that drive our conscious behavior (e.g. consumption, survival, and reproduction), but there are other more complex structures that these positions don’t account for (e.g. hierarchical structures of dominance, beliefs, and abstract behavior such as artistic production), and those are infinitely variable from person to person and from group to group. This comes back to the point I just made about phenomenal reality and the conceptions we follow in order to explain them as if they are somehow out there in the objective world that we are not part of.

Not to mention, we all take differently to the idea that free will might not exist. Religious people are often deeply offended by this idea whereas many scientists (theoretical physicists in particular) claim to be humbled by it. Both reactions, I would argue, are disgustingly self-righteous and are the direct consequence, not of truly understanding the concept of free will per se, but of whether or not free will simply fits into his or her preconstructed hierarchical structure of beliefs. One should see clearly, on that note, why a materialist must reject free will on principle alone, and a radical Christian must accept it on principle alone. Regardless of the prospect that the religious person has a right to be offended in this case, and that it is contradictory of the scientist to commit to a subjective ontological opinion when that very opinion does not permit one to have an opinion to begin with (nor can it be supported with any sufficient amount of “scientific” evidence whatsoever), the point here transcends the matter of free will itself: that rejecting or accepting anything on principle alone is absurd. This calls into question matters of collective ideological influence. There is power in numbers, and that power is used for evil every bit as often as it is used for good. When individuals, however, break free from those ideologies, they realize how foolish it is to be sheep and to believe in anything to the extent that it harms anyone in any way (physiologically, financially, emotionally, etc.). The scary part about this is that literally any program might trap us in this way (ideologically), and blind us from the potentially-innate moral principles that underlie many of our actions. On that note, we are all collectively very much the same when we subscribe to a program, and we are all part of some program. We are individually very different, however, because we each have the potential to arrive at this realization through unique means. We each have a psychological structure that makes up our personality. It is undeniably innate to an extent, yet only partially biological. This reveals the immeasurable value in developing one’s intrapersonal intelligence through introspection and careful evaluation of one’s own thoughts, feelings, perceptions, and desires.

Furthermore, conventional nature-nurture positions are polarities on a spectrum that doesn’t really exist. If we had clearer definitions of each, perhaps the debate would not present a false dilemma. We should reconstruct those definitions to be inclusive of phenomena — think of these terms as categories for ranges of processes rather than singular processes themselves. If we think of these terms as being on a spectrum, we are led to ask the impossible question of where the boundary is between them. If we think of them as categories, we are forced to embrace the reality that most, if not all, processes can fall into either category given a certain set of circumstances, and thus, those categories become virtually indistinguishable. For example, in the case of inherited skills: practice makes perfect, yet natural talent seems so strongly to exist. If the truth-value-based spectrum between nature and nurture were a real thing, then neither position would be able to account for both nurtured ability and natural talent; it would simply be either/or. This is a consequence of the false dilemma. It leads us to believe that this gray matter is black and white. If one is decent at learning anything, he or she knows that there is only gray in everything.

But is there? I hope I have explained to some conceivable extent why scientific and metaphysical matters should not be structured into a polar truth-spectrum, and why any attempt to do so would likely present a false dilemma. However, it seems more reasonable to apply spectrum structures to value theory matters such as aesthetics, ethics, and even other personal motivators such as love. This, I will explain further in a later post.


Collective Subjectivity = Reality :: The Utility of Phenomenological Thought

In my last post, I explained the differences between and the proper uses of the terms ‘subjective’ and ‘objective’. To recap, these terms do not describe the positions from which one perceives. Of course, everyone perceives subjectively, and objects don’t perceive at all. Therefore, the subject/object spectrum is not a spectrum on which one may judge a matter’s truth-value. The spectrum simply describes the nature of the matter at hand — subjective means “of a subject” and objective means “of an object”. Having said that, how can we define truth more broadly? What determines it?

I think that we can, in many conceivable instances, equate truth with reality. This is based on one of two popular definitions of reality. The first, more popular definition in which we cannot equate truth and reality, and the one I reject, is that of objective, Newtonian-scientific reality. This holds that there are mathematical laws and principles out there in the universe, already discovered or waiting to be discovered, which the forces of nature can be reduced to. Proponents of this view hold “rationality”, in all of its vagueness, as the singular Platonic ideal which dictates what is true, real, and meaningful. It follows from this that mechanistic science holds the key to all knowledge. The problem here is that mechanistic science (not all science) is founded in the metaphysical belief in materialism. Materialism suggests that all reality is comprised of quantifiable matter and energy. Humans, and all living things, are “lumbering robots”, as Richard Dawkins claims. Consciousness, ethics, morality, spirituality, and anything else without a known material basis is subjective in nature and thus superstitious, irrational, and not real. As I have already explained, this worldview rests on a straw-man distinction between what constitutes subjective and objective, for it assumes that this distinction creates a spectrum on which to judge a matter’s truth-value (the more objective, the more true).

Remaining consistent with how I have distinguished subjective and objective is the second, less popular, and in my view, much more useful way of defining truth and reality: what is real is what affords us action and drives us toward a goal. The definition is as simple as that, but its implications have a tremendous amount of depth rooted in the unknown. Instead of holding one Platonic ideal (like rationality) as the key to all truth, there are an infinite number of ideals that humans conceptualize, both individually and collectively, in order to achieve their ends. Therefore, this view affords relevance to a wide range of perspectives even if the nature of the objects being perceived is unknown. The rationalist view, by contrast, is limited to the assumption that the nature of everything has already been determined to fit into one of two metaphysical categories: objective reality or subjective delusion. (This Newtonian theory of reality I have just explained, by the way, is a long-winded way of defining ‘scientism’, a term I often use in my posts.)

Nature doesn’t obey laws; humans do, so we tend to compartmentalize everything else in that way because it makes it easier for us to explain what we want to know and explain away anything we don’t want to know. What we don’t want to know is what we are afraid of, and as it turns out, what we are afraid of is the unknown. So, when anomalies arise, whether personal or scientific, that don’t fit the already-established laws, a Newtonian thinker will categorize them as illusory in order to explain them away. This doesn’t work because even we humans have a propensity to break the laws that we create for ourselves, and this can be a very productive thing. The degree to which this is the case depends on our individual psychological makeups. People who are high in the Big Five personality trait conscientiousness, for example, tend to obey rules because of their innate need for outward structure and order. Those who are low in that trait are more likely to break rules, especially if they are also low in agreeableness, which measures one’s tendency to compromise and achieve harmony in social situations. Openness, on the other hand, the trait correlated with intellect and creativity, allows one to see beyond the rules and break them for the right reasons (when they are holding one back from progress, for example). These are just three of the five broad personality traits backed by an abundance of scientific research that potentially confirms their realness and usefulness, even as a rationalist/Newtonian might perceive them. However, the tendency of someone to break rules as a result of their psychological makeup does not only apply to political laws.
We also create collective social rules among groups of friends, and unconscious conceptual rules for ourselves, in order to more easily understand our environment. Those systems satisfy the same basic human needs and take the same hierarchical forms as political order does; their purposes differ only in how widespread they are.

Regardless of our individual psychologies, there are commonalities that all humans share in terms of which types of goals we have and which types of things drive us toward or away from action. Those things are, therefore, collectively subjective across humanity, and they are what I would like to propose as the most universally real and true things (insofar as anything can be universally real or true at all). This leads me to elaborate further on this goal-oriented view of reality.

Since I used Newton as a scientific lens through which to understand the rationalist theory of reality, I will do the same thing to explain the goal-based theory that I am proposing, this time using Darwin. Philosophically speaking, Darwin did not commit himself to his theories in the same law-like sense that Newton did to his. In fact, many of Darwin’s ideas have recently been found to be rooted in psychology rather than in hard mechanistic biology. His main principle can be summed up like this: nature selects, and we make choices, based on what we judge to be most likely to allow us to survive and reproduce. That is all. Everything else is just detailed justification which may or may not be true or relevant. In fact, Darwin left open the possibility that the details of his evolutionary theory not only could be wrong, but that they probably were, and he was very serious about that. To take all of those details literally leads one into the same logical trap that the “skeptics/new atheists” fall into when they obsess over the details of the Bible: they oversimplify and misrepresent its meaning, and therefore overlook the broader, most important points. These are straw-man arguments, and they demonstrate a persistent, juvenile rejection of intellect.

The reason Darwin’s main evolutionary principle is psychological is that it is consistent with Carl Jung’s idea of the archetype. An archetype is any ancient, unconscious pattern of behavior common among groups, or the entirety, of the human population and their ancestors. The need for all living beings, not only humans, to survive and reproduce is undoubtedly real. It is something we understand very little about, yet it drives an inconceivably wide range of behaviors, most of which are taken for granted to the extent that they are unconscious (e.g. sex drive is causally related to the desire to reproduce). It is not only in the natural world that humans have had to fight desperately for their lives against other species; even among ourselves, in the civilized world, there have been radical attempts to wipe out masses of people because one group saw another group’s ideology as threatening to its own survival and prosperity (e.g. both Hitler and Stalin led such endeavors in the 20th century).

Perhaps, instead, if we equate truth with this archetypal, goal-oriented conception of reality, then we can come to a reasonable conclusion about what constitutes truth: that which affords and drives us to action. That is to say that (capital-T) Truth, in the idealistic, rationalist sense, probably does not exist, and if it does, our five senses will never have the capacity to understand it. The best we can achieve and conceive is that which is true-enough. For what? For us to achieve our goals: survive, reproduce, and make ends meet, and if we are very sophisticated and open, to also introspect, to be honest with ourselves and others, and to live a moral and just life.

Subjectivity vs. Objectivity: Not a Distinction of Truth

I wonder which is worse: the fear of the unknown? Or knowing for sure that something terrible is true?

@pennyforyourbookthoughts

Or, if I might add, the negative, unforeseen consequences of that terrible thing being true?

The answer is: “fear of the unknown”, and it’s a little complicated.

Most things one might know “for sure” lie at either end of the subject/object spectrum. What is known on the subjective end of that spectrum is generally thought to deal with personal or value truths that are understood qualitatively by an individual. What is known on the objective end is generally thought to deal with fact and scientific truth that is understood quantitatively by a group. This is generally correct, but it is only the world of objects that convention accepts as ‘truth’, while the subjective is understood to contain no truth-value at all unless we are speaking about it in material (and thus objective) terms. So, this spectrum actually seems to measure truth: the more objective something is, the more true it is. Here is an interesting misconception that leads me to attempt to make clear the proper uses of these terms.

What does it mean for something to be ‘subjective’ or ‘objective’? First, what they DO NOT describe are points from which one perceives. In other words, ‘subjective’ does not mean “opinion – from the point of view of a particular subject”, and ‘objective’ does not mean “rationally – from the point of view of an object or the world of objects” as, say, Richard Dawkins’ or Ayn Rand’s pseudo-philosophies suggest. They consider the vaguely defined term ‘rationality’ as the universal ideal — Dawkins through materialism and Rand through radical capitalism/individualism. This is shallow and wrong. The reasons for this should be clear. First, everyone perceives subjectively, from their own point of view, and objects don’t have the capacity to perceive to begin with — that is precisely what makes us subjects and things objects! No human perceives at the level of subatomic particles or, by the same token, God. Second, the differences between what constitutes ‘subjective’ and ‘objective’, for the sake of this conversation, depend on how ‘truth’ is defined more broadly. In fact, these terms have nothing to do with truth at all.

Rather, these terms describe the nature of the matter at hand. ‘Subjective’ simply means “dealing with matters of a subject or set of subjects”, and that can range from intrapersonal matters to interpersonal ones. ‘Objective’ means “dealing with matters of an object or set of objects”, and that can range from logical to quantitative to empirical. They DO NOT distinguish any degree of truth. Science, for example, is not objective because it is more true; it is objective simply because it deals with objects. Medicinal practice (which is not a science, by the way), on the other hand, is subjective in nature because it is interpersonal; it deals with human subjects on a case-by-case basis (many physicians do, however, treat their patients as objects, and they in turn view their practice as an objective matter).

This is not to say, however, that each subject perceives and makes judgments to the same degree of truth or accuracy. Each subject analyzes any given situation to the degree that is consistent with their unique set of intellectual capacities, including intrapersonal, interpersonal, conceptual, spatial, and experiential capacities. A good IQ assessment tends to measure a combination of all of those things, but most people are only strong in one or two of those areas. For example, one might have a high level of intrapersonal intelligence (they know themselves well and understand their own mental and emotional states) but lack the ability to impartially deal with other people or objective matters because of how strongly they are affected by the outside world. On the other hand, one might be high in logical or spatial intelligence but lack the ability to admit, or even be aware of, the emotional states or internal biases that govern the way they deal with personal matters (having one capacity does not necessarily imply deficiency in another, as people high in IQ might prove).

Given all of this personality variability among subjects, can an argument be made about the question stated above? Which is worse: fear of the unknown, knowing something terrible is true, or the negative consequences that accompany knowledge? I can only speak about this in a normative fashion. I also must presume that anything “good”, as it pertains to knowledge, should broaden one’s perception, and anything “bad” should narrow it. Knowing anything “for sure”, insofar as that is possible, should be a good thing in that it should teach us something meaningful, whether it is pleasant or not. The goodness of that knowledge, because it is sometimes unpleasant, is not contingent on the goodness of its specific consequences. Nietzsche was correct when he said that “people do not fear being deceived; they fear the negative consequences of being deceived”. The consequences, after all, are merely a result of cause and effect, and any cause can produce any number of variable effects depending on the set of circumstances under which it occurs. It is that potential for unforeseen chaos that people fear, at least on the surface. But, such matters are too variable and trivial to direct action in a meaningful way when certain higher-level truths (e.g. how should we think about x, why does x matter to us, etc.) have not been accounted for, so to simply fear consequences is shortsighted. To know something “terrible”, on the other hand, is usually just a case of knowing one side of a particular occurrence without knowing the reasons it happened or being familiar with any perspectives apart from the first one that is presented. In other words, it is knowledge without understanding.

It is the unknown that contains that crucial knowledge that will afford us understanding and drive us to action. That is where real truth comes from. We should be prepared to face the unknown at any time, for it is all around us, and the world so rarely unfolds as we expect it to. In fact, there is nothing that I can think of that any one person has complete control over. There are an infinite number of effects and consequences that our actions can and will cause, so perhaps having minimal expectations to begin with is the most healthy way to prepare for the future. Do not fear the unknown, for to fear the unknown is to fear truth. Facing the unknown will prevent one from accepting any knowledge as “terrible”, and it will in turn not only minimize negative consequences, but it will open many unforeseen, positive opportunities.

 

“Strange Tools: Art and Human Nature”

Some years ago, I was talking with an artist. He asked me about the science of visual perception. I explained that the vision scientists seek to understand how it is we see so much–the colorful and detailed world of objects spread out around us in space–when what we are given are tiny distorted upside-down images in the eyes. How do we see so much on the basis of so little?

I was startled by the artist’s reply. Nonsense! he scoffed. That’s not the question we should ask. The important question is this: Why are we so blind, why do we see so little, when there is so much around us to see?     –Alva Noë

The quote above is from the Preface of Alva Noë’s latest book, Strange Tools: Art and Human Nature. Noë is a philosopher at UC Berkeley who focuses his research on mind and cognition. I have been a fan of his work for the last year or so, so I was excited when he came out with this latest book, which deals with a subject that I concern myself with in my own work. His work initially caught my attention because he already does very well what I seek to do to some degree: blurring boundaries between disciplines and shattering harmful ideologies. After all, is this not necessary if we are to advance thought?

It turns out that there is a lot that Noë and I agree on concerning art, and there was even more for me to learn concerning the relationship between art and philosophy more generally. He argues that both art and philosophy are transformative in that they force us to look at the world in different ways. As he explains in Chapter 8, a good work of art carries the message “See me if you can!” One cannot understand it with one simple glance. It takes an extended process of organizing and reorganizing our conception of a work of art to fully understand it, just as we must organize and reorganize many aspects of our lives. It is not until we understand the work to this degree that we are qualified to make a critical judgment about it. Art is a transformative tool, like philosophy or language, for shaping our understanding and expression of reality and of ourselves.

In one of his previous works, Out of Our Heads, Noë makes a convincing and nearly irrefutable case that we are not merely our brains. There is more to consciousness than neural functioning inside the brain. When we confine the mind to the brain, we leave out a crucial part of our being. We are not mere “lumbering robots”, as Richard Dawkins argues. We have rights, responsibilities, and the power to make conscious decisions. What exactly is beyond our brains, whatever its nature, is still up for debate. Regardless, this transformation is not something that happens in the art itself, nor does it happen in us in the sense of happening in our brains, as neuroaesthetics would suggest. Rather, art, and also philosophy, happen to us. Yes, there will be correlative changes in brain functioning, but those are merely byproducts of our active engagement with the work.

Art is to the artist as philosophy is to the philosopher. It is the beginning of a conversation that can cause controversy or enlightenment. It might make us uncomfortable at first because it causes us to question our perception as philosophy forces us to question our beliefs. But, it is later humbling, rewarding, and intellectually engaging. It is a tool for thinking critically, and it is strange because it is difficult to understand. Art and philosophy are both Strange Tools.

Who Has Midlife Crises and Why

Psychologist Carl Jung spoke of a process called ‘individuation’ whereby one gains an elevated degree of self-awareness and is therefore able to take crucial steps toward cultivating his ideal personality (i.e. ‘self-actualization’ in Maslowian terms). In layman’s terms, this process is called a ‘midlife crisis’. My proposal is that this is a period of growth that everyone experiences, and the sooner it happens, the easier it is to overcome.

According to social convention and many professional circles of psychology, a midlife crisis is considered a bad thing. For example, a psychiatrist named Sue may claim to have seen this many times before. Sue describes it empirically as stress at work and in the family that has accumulated over time and is then suddenly unleashed in different forms. This places the blame on the individual for not communicating his inner thoughts and feelings as they arose, so Sue will offer her therapy services to fix the problem by teaching better communication.

A neurotherapist named Ben might also claim to have seen this many times before, but he will take a more materialist approach. Ben will confine the problem to the brain by assuming that something simply went wrong with the patient’s neural functioning, and that the matter is beyond the patient’s control. He might suggest that the only solution is to undergo neurotherapy in his clinic to realign normal neural pathways in the frontal lobe of the brain.

Both Sue and Ben, as well as most people in general, see this crisis as a problem that needs to be fixed, and believe that the only way to do that is via the specific methods in which they have been trained. “I understand. Let me handle it. You can trust me.” is what they will tell their potential patient. Given their wall of shiny degrees in their cozy, inviting offices, it is difficult to turn down their offer no matter the cost, as long as they can convince you that you need it.

More likely than not, both Sue and Ben are acting in their own self-interest first. They are businesspeople as well as medical professionals. Indeed, the term ‘crisis’ itself carries a derogatory tone, and the professionals have learned to capitalize on that. Their outward warmth, their technical language, their comfortable offices, their alleged understanding of the situation, etc. are tactics that they use to keep their businesses running. That is not to say that their practices are completely useless, but rather that either service will likely have more or less the same effect for the very same condition, because neither comes close to attacking the root of the issue. In fact, they unknowingly focus on fixing the same exact thing (outward communication of inward feelings), since language expressions are actually channeled through the frontal lobe of the brain!

Meet my friend Jay. Jay is 38 years old, and he is an officer in the military. To this point, Jay has led a respectable life of service and duty. He is a devout Christian, goes to church every Sunday, and does community service with his church. He worked hard in high school and in Boy Scouts; he graduated and became an Eagle Scout; he went to college, worked hard, graduated, joined the military as a lieutenant, worked hard, got married, worked hard, had two kids, and then he continued to work hard to maintain that for the years following. Jay is a doer: Make a decision, work hard at it, and you will lead a successful life.

Jay never really questioned the position he was in, and things seemed to be going great, but then, seemingly out of nowhere, he began to have what is commonly known as a midlife crisis. He became a bit depressed and self-conflicted. His temper shortened, and he frequently had emotional outbursts at his wife and kids. With some reluctance, he finally agreed to grant his wife’s request and seek help. He began going to Sue, the psychiatrist, both alone and with his wife. Things seemed to improve for one or two days following each session, but then he would revert to his ordinary behavior. Sue’s methods weren’t really working for Jay. He got impatient and started to believe that the process was being prolonged, and that he was spending more money than he needed to.

Jay began to seek other forms of help, and then he discovered Ben’s neurotherapy practice. Upon first meeting Ben, he felt a bit more confident moving forward. Ben explained, using much technical jargon, how important the brain is in processing information and making decisions. Though the claim that the brain is important is true (indeed, the brain is necessary), he went on to convince Jay further that his methods were “more scientific” than traditional therapy because they are “backed by modern neuroscientific research”. Jay became convinced that neurotherapy was the answer, and he began treatment. After a few months, however, as Jay’s optimism wore off, so did his patience; his behavior took the same turn that it had before and after psychiatric therapy. He began to feel misled into thinking that these therapists were offering a sure-fire, algorithmic solution that was actually, in some sense, a scam. It turns out that he was right.

The absolute root of a “crisis” is unknown to Sue and Ben because it is, in the conventional sense, unknowable. A crucial part of it deals with knowledge that does not likely have its foundations in the material world, nor is it solvable by simply making a few practical, sure-fire adjustments in one’s everyday life. Therefore, it should come as no surprise that most people like Jay have so much trouble wrapping their minds around something that is different in nature from their materialism-based work and education and their practical, habit-based personal lives, especially when the people who they put their trust and money in are misleading them. It is difficult for them to realize that there is more to themselves than their brains, bodies, and the feedback they gather from the external social and material world. This was exactly Jay’s predicament. He wanted to put his trust into a system to manage his life from the outside-in, but nothing was working. He was forced to turn inward and deal with it himself.

There is a continuous process of personality development in everyone, and without its sufficient maturation, one simply cannot optimally handle the stresses of life. Understanding a midlife crisis, or any crisis for that matter, and taking steps to solve it is a personal journey. It requires one to discover, embrace, and cultivate the auxiliary side of the personality in conjunction with the continuing development of the dominant side. I am certainly not claiming to have solved this puzzle for everyone; rather, it is each person’s job to solve their own puzzle for themselves. There is indeed a highly effective model one can keep in mind to better understand the self and its place in the world: the cognitive functions as described by Carl Jung.

Immediately, one might question this method. Good. You should, but don’t question it without knowing anything about it, or in a way that presupposes bias. It is a continuously developing theory outside of institutional psychology. The reason for this is simply that it does not seem to fit the existing ideology of institutional science on a broader scale: materialism, the view that all reality in the universe is founded on and composed of quantifiable matter and energy. I have explained in several previous posts, just as several professional scientists and philosophers have explained in recent years, why science must move past the materialist worldview in order to progress, no matter the cost. That is not up for debate, so I will prevent any further discussion on the matter by saying this: to dismiss Jungian psychology on the basis of there being “no evidence” for it presupposes that the only evidence is the type that materialism relies on. This is circular reasoning. There has in fact been no materialist attempt to disprove it to begin with. In other words, to stick to such an unsupported principle is to assume the theory is “guilty until proven innocent”, as in wrong-until-proven-by-materialism. The premise for my proposal here is about people. All people are unique, but there are baseline psychological tendencies by which we operate. This is, as we should all agree, obvious upon any amount of close observation of one’s social environment. That, I will submit, is in itself a form of evidence worthy of a discussion. Having said that…

Each person’s dominant cognitive function, according to Jung, is either introverted or extroverted, and either a mode of judgment or perception. There are two ways of making judgments (thinking and feeling) and two modes of perception (sensing and intuiting). If one’s dominant function is inwardly perceptive, say, introverted intuiting (Ni), then his auxiliary (secondary) function will be an outward mode of judgment, either extroverted thinking or feeling (Te/Fe), to balance out the dominant function.
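For readers who like rules spelled out mechanically, the pairing described above is regular enough to sketch in a few lines of code. This is purely an illustrative toy of my own making, not anything from Jung himself; the function and variable names are mine:

```python
# Toy sketch of the dominant/auxiliary pairing rule described above:
# the auxiliary function has the opposite attitude (introverted vs.
# extroverted) and the opposite role (judgment vs. perception) from
# the dominant function.

JUDGING = {"thinking", "feeling"}      # the two modes of judgment
PERCEIVING = {"sensing", "intuiting"}  # the two modes of perception

def auxiliary_candidates(dominant):
    """Given a dominant function as an (attitude, mode) pair, e.g.
    ('introverted', 'intuiting') for Ni, return the possible
    auxiliary functions, sorted alphabetically by mode."""
    attitude, mode = dominant
    opposite_attitude = "extroverted" if attitude == "introverted" else "introverted"
    opposite_role = JUDGING if mode in PERCEIVING else PERCEIVING
    return [(opposite_attitude, m) for m in sorted(opposite_role)]

# The example from the text: a dominant Ni (introverted intuiting)
# is balanced by an extroverted judging function, either Fe or Te.
print(auxiliary_candidates(("introverted", "intuiting")))
# prints [('extroverted', 'feeling'), ('extroverted', 'thinking')]
```

Running it on Jay’s dominant Te, `auxiliary_candidates(("extroverted", "thinking"))`, likewise returns the introverted perceiving functions, Ni and Si, one of which (Si) is discussed below.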

Of course, everyone necessarily has the capacity to both perceive and make judgments, to extrovert output and introvert input, to think and feel, to sense and intuit; we otherwise would not be able to survive in any social or professional setting. We all do all of those things to varying degrees. One of those functions, however, is naturally dominant. It is our own personal “standard operating procedure” under normal conditions. When we are confronted with a crisis, we are forced to operate with more depth; i.e. we must work harder to deal with the death of a loved one than to decide what to wear to church. This does not mean we abide by our SOP more closely than usual. In fact, it implies the opposite: that we must be more flexible about our dominant function. We need balance between our most dominant modes of perception and judgment in order to optimally deal with stressful situations. The auxiliary function is what we all struggle with cultivating at some point between young adulthood and middle age. It is the more repressed one, but it is necessary to use it in support of our dominant function if we are to deal with crises healthily.

Whether one is introverted or extroverted in general depends on whether his dominant function is introverted or extroverted. An introvert will likely develop his extroverted auxiliary function earlier in life than an extrovert will develop his introverted auxiliary function because, especially in extrovert-dominated western societies like the United States, functioning in an extroverted fashion is forced upon introverts. Extroverts more easily fit in right from the start, but they have their personal crises later in life.

Jay, for example, is Te (extroverted thinking) dominant, which means he is an extrovert with left-brained thinking tendencies. He is outgoing, decisive, and abides by cold, hard, logical systems (e.g. mathematics, law, protocol, etc.) to make judgments about reality. This is very useful in his military environment, which values this type of rule-based reasoning very highly. He has a wide circle of social and professional connections and makes a good living. From the outside looking in, he is viewed as a success by his peers; the American dream is very Te-focused, and Te-dominants (and Fe-dominants) are the most likely to buy into it. However, on a more personal level, as he is learning in his midlife, he is only outwardly, not inwardly, organized. An introverted thinking-dominant (Ti) personality, by contrast, will have a well-structured, internal set of logical rules and principles, but to other people, he may seem outwardly messy and disorganized because he dismisses conventional rules.

For his entire life to this point, Jay has identified himself based on the rules that he followed (by his commanding officer at work, by the Bible in his moral decisions, and by his wife at home). He lived the first half of his life constantly focused on planning for the future and managing himself in an outward fashion. He was accustomed to getting things done – acting now and thinking later. Now that things have settled down, there is no more planning to be done. What is he to do?

The answer is: Don’t do anything. Think. Process. Reflect. Jay’s most obvious problem is that he was not able to turn inward and think independently, apart from the rules set before him. He had been so busy living up to standards external to himself, he had never even considered himself to be a conscious, independent, introspective being. In fact, he was afraid to because he naively associated introspection with feelings, and feelings with weakness. That, after all, is the popular opinion in American culture.

Jay’s midlife crisis is common among all left-brained judging (Te or Fe dominant) personalities, who encompass about half of the American population according to psychologist David Keirsey, a leader in modernizing Jung’s principles in the 1970s and ’80s. This process manifests itself in different ways and at different times.

First things first: we need to change our terminology. This crisis is not really a “crisis” at all; it is a period of growth whereby the extrovert discovers the introverted side of his or her personality, or the introvert attempts to align his internal rules with outer reality. Jay’s dominant function, as I have mentioned, is called extroverted thinking. It is a way of making judgments: being quickly decisive and taking impartial action based on established rules. What he lacks is a cultivated ability to inwardly process the information that he is acting on. That function is a mode of perception. Jay’s perceiving function, once cultivated, will act as the support for his decision-making and will improve that process to a huge degree. The perceiving function specific to Jay is called introverted sensing (Si). This function collects data based on personal experience, traditions, and principles valued for their own sake. His personality suits the military and other managerial positions perfectly. When his auxiliary Si is underdeveloped, he follows the rules and doesn’t question them, while almost entirely neglecting his own interests.

What it means for Jay to develop his auxiliary Si function is to improve the way he collects and interprets data and flexibly adapts his existing principles to the constantly-changing environment. This is an internal process. It will improve the way he perceives himself in relation to the data as well as the way he perceives the data itself. He will use this introverted Si perception in conjunction with his dominant Te judgment to make well-rounded decisions.

I used Jay as an example because he possesses the most common type of Jungian personality construction among men in the United States (ESTJ according to Myers-Briggs). The most common type for females (ESFJ) is very similar (Fe/Si dominant/auxiliary instead of Te/Si). If you don’t relate to Jay or his Fe counterpart, that is fine. There are 14 other forms of cognitive functioning, according to Jung. And that is not to take anything away from the individuals within each of those categories. As with anything, there is an immeasurably wider variety of uniqueness among individuals within each group than there are generalized differences among the groups themselves. Having said that, Jungian cognitive typology is not more than a guideline, albeit a very effective one, to keep in mind as one deals with the struggles of life. At the same time, however, don’t blame anyone other than yourself if you reject the system out of principle alone amid a personal crisis.

Cheers!

The Slate, the Chalk, and the Eraser

Prerequisite reading: “WARNING: Your Kid is Smarter Than You!”

A mark of good critical thinking, let’s say, as it applies to science, is that it is always attempting to prove itself wrong. It challenges its most fundamental assumptions when unexpected results arise. We can do the same in our everyday lives when we make decisions and formulate our own views. We only truly challenge ourselves by trying to find flaws in our own reasoning rather than by trying to confirm our views. It is easy to confirm our beliefs.

Let’s take astrology as a personal-scientific example. Sparing you the details, based on what little research has been done to refute it, astrology is seen as invalid, and therefore a pseudoscience, by the standards of modern mechanistic science. However, that does not preclude one from believing in it, or from confirming it or any of its insights to oneself. Now, one is not thinking critically by simply believing that astrology is a pseudoscience (or that it is legitimate science). That would be to put too much trust in other people’s thinking. What reasons can you give to support your own belief, and what does it mean?

One can wake up every morning, read their daily horoscope, and upon very little reflection, come up with a reason or two for how that horoscope applies to his or her life. On one hand, those reasons might be good ones, founded on an abundance of personal experience. The horoscope’s insights might serve as something to keep in mind as one goes about his or her day, and that can be a very helpful thing. On the other hand, however, the reasons might be mere self-confirming opinions. They might be the result of the person’s ideological belief in astrology in general. That can be harmful if the person attempts to apply astrological insights to contexts in which they are inapplicable. This is an example of how the confirmation of a specific belief, not the belief in itself, can be good or bad, helpful or harmful, depending on how one thinks about it and the reasons he or she gives for it. The question of whether it is right or wrong, correct or incorrect, is neither important nor provable.

In order to formulate views that are not mere opinions, we must expose ourselves to views that oppose the ones we already hold dear to our hearts. This is difficult for adults. Most of us have been clinging to the same beliefs since we were children or young adults. This is where children have a huge advantage. They don’t yet have views of their own. The sky is the limit to how they can think and what they might believe. Their handicap, though, is that they do not control what they are exposed to. They cannot (or perhaps, should not) search the internet alone, drive themselves to the library, proficiently read, or precisely express themselves through writing or speech. They are clean slates, and that ignorance not only gives them huge potential, but it also leaves them extremely vulnerable.

The Analogy

You may have heard this analogy before, but I will attempt to add a bit of depth to it.

A child’s mind is a slate, as are those of adults (though, arguably, much less so). It is a surface on which we can take notes, write and solve equations, draw pictures, and even play games. We can create a private world with our imaginations. For all intents and purposes, there are no innate limits to how we can use our slates. Maximizing our potential, and that of children, is up to the tools we use.

First, we need something to write with, but we shouldn’t use just any writing tool. Chalk is meant to be used on slate because it is temporary. It can be erased and replaced. If one were to write on a slate with a sharpie marker, that would be permanent. One could not simply erase those markings to make room for others. A slate has a limited amount of space.

Though our minds may not have a limited amount of space in general (there is not sufficient evidence that they do), there is a limit to how much information we can express at any given moment. That, not our mind in general, is our slate – that plane of instant access. The writing tool is our voice – our tool of expression. If we write with a sharpie, it cannot be erased. We leave no room to change our minds in the face of better evidence to the contrary. If we write with chalk, we can just as clearly express our ideas, but we also leave our ideas open to be challenged, and if necessary, erased and changed. It is also easier, for in the process of formulating our ideas with chalk, we need not be so algorithmic. We can adjust our system accordingly as we learn and experience new things.

The smaller the writing on the slate, the more one can fit, but the more difficult it is to read. Think of a philosopher who has a complexly structured system of views. One detail leads into the next, and they all add up to a bigger-picture philosophy. One might have to read all of it to understand any of it. That can be difficult and time-consuming, and not everyone has the patience for it. The larger the words on the slate, however, the easier it is to read, but the less there will be, so it risks lacking depth. Think of a manager-type personality who is a stickler for rules. He is easy to understand because he is concise, but he may lack the ability to explain the rules. People are irritated by him when he repeatedly gives commands without reasons for them. Likewise, children are annoyed when their parents and teachers give commands with no reasons to support them, or at least, no good ones (e.g. “because I said so”).

So, the slate represents the plane of instant access and expression of information, and the writing tool, whether chalk or a sharpie, represents our voice – our tool for expressing information and ideas. What does the eraser represent? The eraser represents our willingness to eliminate an idea or bit of information. It represents our willingness to refute our own beliefs and move forward. It represents the ability to make more space on our slate for better, or at least more situation-relevant, information. It represents reason. If we write with chalk, the eraser – reason – holds the power. If we write with a sharpie, the eraser becomes useless.

The Analogy for Children

I explained in my last post “WARNING: Your Kid is Smarter Than You” that it is important for parents and teachers to teach their kids how to think – not what to think – but I did not offer much advice on how to actually do that. I will not tell anyone, in detail, how to raise or educate their children. Each child has a different personality and needs to be catered to through different means. I will, however, offer a bit of general advice based on the analogy above.

The way to teach children how to think (after already having done it for yourself, of course, which is arguably much more difficult) is NOT to hand the kids sharpies, for they will never learn to use an eraser. Their statements and beliefs will be rigid and lack depth of understanding. Granted, this might make them a lot of money in the short-term, but it will also significantly reduce their flexibility when they encounter real-life situations (outside of the institutions of school and work) that require them to think for themselves. This will inevitably limit their happiness from young adulthood onward.

Instead, simply hand them a piece of chalk. It is not even important to hand them an eraser, initially. Kids will figure out, after much trial and error, their own way to erase their slates. Eventually, they will find on their own that the eraser is a very efficient tool for doing so. Literally speaking, they will express themselves and reason through their problems until they find the most efficient methods – by thinking for themselves – but only as long as they have the right tool.

WARNING: Your Kid is Smarter Than You!

Everyone is born with some capacity for critical thinking, but most people lose the skill over time. Children, specifically those aged 3-5, happen to be the best at it. This can be demonstrated with a single word: ‘why’.

When someone asks a ‘why’-question, they are asking a question of reason, which is to say they are thinking critically to some degree. Children do this much more openly than adults, which is why most adults think children are simply being pests when they do. That is incorrect. The root of their questioning is philosophical. Children challenge assumptions, premises, and claims more openly than anyone. They are learning as much as they can about the world, and they demand reason to back up that knowledge. They are not lazy in the way that they tend to develop beliefs. Unfortunately, most parents do not share such genuine, open curiosity, nor are they readily able to cater to it. This is most obvious in grandparents; as the saying goes, “you can’t teach an old dog new tricks”. Elderly people tend to be the most firmly set in their ways and resistant to new ideas. Who can blame them? Thinking is calorie-intensive. Quite frankly, old people just don’t have the energy for it. Parents and teachers, however, have an important job to do. They have no excuse.

Though a child’s tendency to ask these types of questions will persist for some time, whether he continues to do so will depend greatly on how open and able his parents and teachers are to dealing with it. In a perfect world, adults would take this as an opportunity to think critically about those questions themselves. Instead, they get frustrated or annoyed, make up a poor answer (e.g. “because I said so”), and send their kid straight to the TV or to bed: whatever it takes to keep them occupied and out from under their skin. This is an uninspired, resistant approach to parenting. The child’s curiosity is repressed, and they gradually stop asking questions and start submitting more and more to an ideology. The more naive children give in more quickly to the rules set before them. Others might become rebellious. Those rule-followers are certainly no smarter than the rebels, despite what social convention will tell you. Either way, their guardians’ repression has a lasting, negative effect on how they think.

I would like to now disclose that I do not have any children of my own, and I do not plan to have children in the foreseeable future. On that basis, someone who is guilty of the above might already feel offended and accuse me of having no credible opinion on the matter. I would like to think that the contrary is true for two main reasons. First, I am a good planner. I am fully aware of the challenges of raising a child, and that is precisely why I am responsible enough to take the necessary precautions to prevent having one. Second, experience isn’t everything. I can observe the effects of bad parenting with a high level of objectivity because my thoughts about the matter are not distorted by the feelings caused by having a child of my own – feelings which unavoidably inhibit one’s ability to reason well.

Having said that, as you are a rational, autonomous agent, let me tell you a story.

I have a friend who has a four-year-old daughter. Immediately, there is a problem: he did not intend to have her. No, the fact that so many other people accidentally have children does not excuse him. That would be to commit the bandwagon fallacy. Nor does the fact that he is married and is financially able to support his daughter excuse him. In fact, he and his wife planned on holding out for five to seven years after their marriage to have a child, as they were aware of their not being ready. Instead, they ended up pregnant within only one year of their marriage. Their daughter was not planned, and my friend was not ready for the challenge of raising her. This is obvious upon close observation.

What does it mean for one to “be ready” to raise a child? That seems like a personal, descriptive question to which everyone has their own unique answer. That is true in a sense, but there is also a very normative aspect to this question. What “readiness” should mean here is that one is willing to accept the intellectual challenge of teaching a little person how to think – not what to think. That involves not shrugging every time the child asks ‘why’, and, more crucially, asking ‘why’ for oneself. There is a modern saying that goes, “grade school teaches one what to think whereas college teaches one how to think”. My argument is that by the time someone reaches college age, they have already become a person to a degree, with their own thoughts, feelings, and system of beliefs. Therefore, it is almost certainly too late to teach them how to think. Small children ask the most critical questions. Parents should help them improve that ability at that point, before they have subscribed to an ideology that will most likely be founded on poor reasoning. The obstacle here is that the parents have previously adopted certain beliefs and have therefore surrendered their own ability to think well, let alone their ability to teach it to a child. Leading by example is vital, as kids learn by copying.

My friend is no exception. He holds some rather radical beliefs – mainly those of scientism and atheism, which normally go hand-in-hand. Therefore, he is not the type, no matter the subject, to be truly open to the question ‘why’. His beliefs dictate specific answers to those questions, i.e. that all knowledge in the universe, including that of supernatural entities (such as God), has been or will be confirmed or falsified on the basis of physical, quantifiable matter.

The other day, my friend’s daughter was at preschool when some of her classmates were talking about a discussion they had in Sunday School the weekend before. When she got home that afternoon, she began to ask her father questions about God. She wasn’t doing so in a way that presupposed God’s existence, nor was she making any such claims. She was simply asking out of genuine curiosity, as children do with everything. To this point in her life, she had never even heard of God because my friend, being a serious atheist, had kept all sources of religion out of her reach at home. So, as you might imagine, he was quite disturbed that she was asking these questions. He felt he had done all he could at home to keep religion out of her life, and now she was confronting him, backing him into a corner. His quick-fix decision was, first, to reject her questioning, and second, to become more militant in forcing scientism upon her. He went out and bought children’s books about Darwinian evolution to fill the gap left by the absence of religious books (e.g. Bible story books). His hope was that she would believe in science (actually, scientism) instead of religion.

My friend, on an elusive yet vital note, is trapped in a very conflicted way of thinking. He wants his daughter to “think according to reason”, as he says, but he also wants her to believe in some very specific ideologies. The two, at least in principle, cannot coexist. As I have clearly explained in earlier posts, reason and ideology are nearly polar opposite mindsets. If one is to reason well, he should find that no general ideology is worth submitting to. There are only specific, situational exceptions to that fact. For example, when one takes a math test, he tunes into the deductive, mathematical way of thinking. When he takes a history test, he tunes into the material he studied for that test. Each way of thinking is useful in its own contexts. If he tries to apply math to the history test, or vice versa, he will fail the test.

On a more obvious note, my friend’s relentless attempt to control what his daughter is exposed to is a hopeless endeavor. She is going to get out of the house and away from her parents, as she already has to a degree. She is going to experience the world. She is going to have conversations with people who have views that conflict with her own. Most of all, she is going to be challenged. If she is taught what to think (whether evangelical Christianity, scientism, atheism, democratic or republican ideologies, etc.) she will be defenseless in such encounters. She will only be able to think and express herself according to those strict systems of thought, and that will be very limiting.

This approach to parenting, in some form or another, is widespread in the western world, and it is wrong. It is like trying to understand how the brain of a rat works by killing the rat, taking the brain out, and observing it in a non-working state, independent of the body. When one attempts to control every variable, the resulting conditions fail to represent those of the real world, for the real world is precisely that which contains all the variables uncontrolled! Anything learned via such a method cannot be meaningfully applied in the real world. In fact, such methods will produce no meaningful results whatsoever.

How these analogies and examples can help us improve things, I will soon explain. There are constructive methods and solutions. The details of those methods will be for the individual parents and teachers to determine. All I will do is offer insights. You know your children the best, so adapt the concepts in your own way toward the one common goal: development of flexible thinking and viewpoints. There is a route for everyone. It is up to you to carve it for your children and for yourself.

There is not one generalized system of government, education, and economy that will satisfy all individuals. The ways individuals see things can change instantaneously. Creating a better world starts with better-thinking individuals. We can only hope that future systems will adapt accordingly.

To be continued…

 

Typology as a Step Toward Critical Thinking

One of the key aims of philosophy, for the individual, is to simply become more open-minded. It is to broaden one’s understanding of what is logical and illogical, rational and irrational, not merely to himself, but actually. This is extremely difficult, so most philosophy course syllabi will include a disclaimer such as this one:

WARNING!
Doing philosophy requires a willingness to think critically. Critical thinking does not consist in merely making claims. Rather, it requires offering reasons/evidence in support of your claims. It also requires your willingness to entertain criticism from others who do not share your assumptions. You will be required to do philosophy in this class. Doing philosophy can be hazardous to your cherished beliefs. Consequently, if you are unwilling to participate, to subject your views to critical analysis, to explore issues that cannot be resolved empirically, using computers, or watching Sci-Fi, then my course is not for you.
Rob Stufflebeam (University of New Orleans)

Harsh? For many, it is. After extensive philosophical examination of our beliefs, via criticism from others or otherwise, we should find that they are founded on many assumptions. Of course, one cannot make any argument without some preexisting assumption(s). Perhaps the challenge, for some, lies in choosing which assumptions to submit to and which to debate. For the philosopher, though, the challenge is much broader and often more difficult. Philosophy isn’t about formulating beliefs from nothing, but rather, if not to develop beliefs which can be justified and maintained in a logically consistent way, to eliminate belief altogether.

It may seem ironic that the aim of the philosopher is precisely to not have “a philosophy” in the conventional sense of the term. I would argue, though, that this is not a conscious aim of philosophy (perhaps that is the conscious aim of art). After all, good philosophers are not grumpy, old, bigoted skeptics in the way some may think. Rather, this unbelief is merely a byproduct of having explored a subject in a philosophical way, i.e. impartially. As I explained in a previous post, there are no “philosophical problems” per se; there are only philosophical approaches to a problem, and one can approach any problem philosophically.

What do philosophical approaches to problems do for us? The short answer is “lots of stuff”. Consider this example: suppose a man named Scott stands at the foot of a deciduous forest in the winter, long after all of the trees’ leaves have fallen off. In front of the forest, in Scott’s plain view, are two large, lush, green coniferous firs. Scott’s wife, Cindy, asks him, “how many trees do you see?”. He answers “two”, for the firs, so green and lush, are the clearest things in his immediate view that resemble what he conceives to be trees.

Cindy’s question initially seemed like a very straightforward, mathematical question. But Scott jumped to the conclusion that the firs, not trees in general, were the objects Cindy wanted him to count. Of course, there are many deciduous trees directly behind the two firs. He could very well have replied ‘a bunch; too many to count’ if he had simply looked past the eye-catching firs to the vast-yet-barren, leafless forest. As we know, the deciduous trees are every bit as alive as the firs; they’re only dormant for the winter. Even if Cindy’s question had specified the firs as the trees for Scott to count, answering in a straightforward way might pose more questions, leading to a philosophical discussion about, say, mathematics (e.g. what is meant by “two firs” when the trees literally have so little in common?).

This is just a metaphoric example, but the point is this: an aim of approaching questions in a philosophical (and similarly, an artistic) manner is to gain the ability to see past what is immediately present to us. After all, what is immediately present to us are often dubious assumptions formulated by culture, nurture, institutions, etc.

Immediately, one might see why this type of “critical thinking” can not only be difficult, but also get us into trouble, and it often does. Not only are individuals’ beliefs founded on assumptions which are very often irrational, but the same is the case for the belief systems of businesses, institutions, and personal relationships. People in these contexts can be very sensitive to criticism. “Power-in-numbers” exists and is very often harmful in a philosophical sense, for collective bodies are generally more easily influenced by foolish belief systems than individuals are (cult mentality). Those who break from the group and question things in a fundamental way are only thinking for themselves. They become outcasts, albeit curious and honest ones. Just as an individual should strive for harmony between his outer world and inner self, so should a group be resistant to any type of dogmatism. How do we achieve this?

There is no sure-fire solution, for if there were, it would follow that all people innately think the same way, and this is obviously not the case. In fact, thinking for yourself, which is to say, thinking differently from everyone else, is absolutely vital if you want to thrive in any regard. Philosophy and critical thinking in general can help if one is up for the challenge, but it is not advisable for just anyone to dive right into philosophical reading and discussion (Philosophy is difficult, and few people have the natural tendency to think openly about sensitive subjects to the extent that one must to be successful in philosophical discussion – see the WARNING above). There are other ways.

Each person has a different mind which presents a new set of challenges – challenges for which they will find solutions only if they come to terms with themselves first. For an outwardly-focused extrovert, this generally means finding comfort in one’s own skin. For an inwardly-focused introvert, it means finding one’s place in the outer world. However, it is much more complicated than that. This is one reason why I think Jungian typology, personality psychology, and light aesthetics present, for the general population, more relatable ways to deal with questions that are normally the concern of ethics and moral philosophy. No one broad ethical theory will satisfy everyone, and I find it nearly impossible to adapt such a theory to a wide range of people in a conceptual sense, and even less so in a practical sense. Typology is an extremely effective method for understanding one’s self and others.

How can each individual maximize his or her ability to think, act, and thrive? First of all, we must acknowledge that every person has his or her own version of the “good life”; it is each person’s goal to figure out what that is and to aspire to it by maximizing his or her cognitive potential. Ethics, then, does not, at least initially, seem to be of much use. This sort of “self-actualization” can also be vital for maximizing one’s participation in philosophical discussion. However, before one subjects him or herself to harsh philosophical criticism, it is advisable to come to know him or herself. Jungian typology is a great method for taking that first step, and then, perhaps, philosophy can pave the rest of the path.

To be continued…