The False-Dilemma of the Nature vs. Nurture Debate

Before I begin, allow me to explain what I mean by false dilemma. A false dilemma is an error in reasoning whereby one falsely assumes that the truth of a matter is limited to one of two (or a select few) explanations. Take, for example, the American presidential election. For another example, have you ever been stumped by a question on a multiple-choice test because you saw more than one possible correct answer (or no correct answers at all)? Perhaps you got frustrated because you felt that the test was unfairly trying to trick you? Well, you were probably right. This may have been an instance of your ability to recognize the false-dilemma fallacy. Sometimes there are indeed any number of correct answers given any number of circumstances. There is often simply not enough information provided in the question for one choice to clearly stand out as correct. This might lead you to question the test in a broader sense. What is the purpose of this (presidential election, or) test? What is it trying to measure or prove? Without getting into that answer in too much detail (as this is not a post about the philosophical state of academic testing), I can say such tests aren't so much concerned with truth and meaning as they are with the specific program they support. That program may or may not have the best interests of the people in mind, and it may or may not be directly governed by the amount of money it can produce in a relatively short period of time. Anyway, that's another discussion.

In a previous post entitled The Slate, the Chalk, and the Eraser, I compared a child's mind to a slate, and I argued that as long as we write on it with chalk by teaching him how to think (rather than with permanent marker by teaching him what to think), then he will be able to erase those markings to make way for better and more situation-relevant ones in the future, once he develops the ability to make conscious judgments. This is an analogy you may have heard before, and it can be useful, but by some interpretations it may seem to rest on a false presupposition. Such an interpretation may raise the “nature-nurture” question that is so common in circles of science and philosophy. One might argue that if a child’s mind is truly analogous to a slate in the way I have put forth, then I should commit myself to the “nurture” side of that debate. That was not my intention. In fact, that debate, in its most common form, presents a false dilemma, so I can only commit to both sides or neither, depending on what is meant by ‘nature’ and ‘nurture’. The conventional definitions of these terms are limited in that they create a spectrum on which to make truth-value judgments about objects, experiences, phenomena, and so on. We commit to one end of the spectrum or the other, and we take that position as true and the other as illusory. This is similar to the subject-object distinction I described in an earlier post. Perhaps comically, even the most radical (and supposedly-yet-not-so-contrary) ends of scientific and religious belief systems sometimes agree on which side to commit to, albeit for different reasons. That particular conflict, however, is usually caused by a semantic problem. The terms ‘nature’ and ‘nurture’ obviously mean very different things to radical mechanistic scientists and to evangelical Christians.

Please keep in mind throughout that I am not criticizing science or religion in general, so I am not out to offend anyone. I am merely criticizing radical misinterpretations of each. Consequently, if you’re an idiot, you will probably misinterpret and get offended by this post as well.

Taking this description a step further, the false dilemma can be committed to any number of degrees. The degree to which it is committed is determined by at least two factors: the number of possible options one is considering and the level of complexity at which one is analyzing the problem. Any matter we might deal with can be organized conceptually into a pyramid hierarchy where the theoretical categorical ideal is at the top, and the further one goes down the pyramid, the more manageable but trivial the matters become. As a rule of thumb, the fewest options (one or two) and the lowest level of analysis (bottom of the pyramid) should give rise to the highest probability of a logical error because the bottom level of analysis has the highest number of factors to consider, and those factors culminate up the pyramid toward the categorical ideal. Fortunately, committing an error at the lowest levels of analysis usually involves a harmless and easily-correctable confusion of facts. Errors committed at higher levels of analysis are more ontological in nature (as the categorical ideals are per se) and can have catastrophic consequences. All sciences and religions structure their methods and beliefs into such pyramid hierarchies, as do we individually. They start with a categorical ideal as their assumption (e.g. materialism for some science; the existence of God for some religion), and they work down from there. However, neither religion nor science is meant to be a top-down process like philosophy (which is likely the only top-down discipline that exists). They are meant to be bottom-up processes. For science, everything starts with the data, and the more data that is compiled and organized, the more likely we are to draw conclusions and make those conclusions useful (in order to help people, one would hope). For religion, everything starts with the individual.
Live a moral and just life, act kindly toward others, and you will be rewarded through fulfillment (heaven for western religions, self-actualization for eastern religions). These can both be good things (and even reconcilable) if we go about them in the right way. What are the consequences, however, if we go about them radically (which is to say blindly)? In short, for radical belief in a self-righteous God, it is war, and therefore the loss of potentially millions of lives. In short, for radical materialism, it is corruption in politics, education, and the pharmaceutical industry, the elimination of health and economic equality, and the potential downfall of western civilization as we know it. That’s another discussion, though.

For the nature-nurture debate, the false dilemma is the consequence of (but is not limited to) confusion about what constitutes nature and nurture to begin with, and even most people who subscribe to the very same schools of thought have very different definitions of each. First, in the conventional form of this debate, what do people mean by ‘nature’? Biology, as far as I can tell, and nothing more. We each inherit an innate “code” of programmed genetic traits passed down from our parents, and they from theirs, and so on. This code determines our physiology and governs our behavior and interaction with the outside world. Our actions are reactive and governed by our brain-computer, and free will is consequently an illusion. What is meant by ‘nurture’ on the other hand? Our experienced environment, and nothing more. Regardless of our chemical makeup, how we are raised will determine our future. There is no variation in genetics that could make one person significantly different from another if raised in identical fashion by the same parents, in the same time and place. We have no control over the objective environment we experience, so free will still seems to be illusory.

These positions seem equally shortsighted, and therefore, this problem transcends semantics. Neither accounts for the gray in the matter — that reality, whatever that is, does not follow rules such as definitions and mathematical principles. These are conceptions of our own collectively-subjective realities which make it easier for us to explain phenomena which are otherwise unfathomable. On this note, we could potentially consider both nature and nurture phenomenal. That is an objective point on the matter. The first subjective problem is that both positions imply that we don’t have free will. Sure, there are unconscious habits of ancient origins that drive our conscious behavior (e.g. consumption, survival, and reproduction), but there are other more complex structures that these positions don’t account for (e.g. hierarchical structures of dominance, beliefs, and abstract behavior such as artistic production), and those are infinitely variable from person to person and from group to group. This comes back to the point I just made about phenomenal reality and the conceptions we follow in order to explain it, as if those phenomena are somehow out there in an objective world that we are not part of.

Not to mention, we all take differently to the idea that free will might not exist. Religious people are often deeply offended by this idea whereas many scientists (theoretical physicists in particular) claim to be humbled by it. Both reactions, I would argue, are disgustingly self-righteous and are the direct consequence, not of truly understanding the concept of free will per se, but of whether or not free will simply fits into one's preconstructed hierarchical structure of beliefs. One should see clearly, on that note, why a materialist must reject free will on principle alone, and a radical Christian must accept it on principle alone. Regardless of the prospect that the religious person has a right to be offended in this case, and that it is contradictory of the scientist to commit to a subjective ontological opinion when that very opinion does not permit one to have an opinion to begin with (nor can it be supported with any sufficient amount of “scientific” evidence whatsoever), the point here transcends the matter of free will itself: rejecting or accepting anything on principle alone is absurd. This calls into question matters of collective ideological influence. There is power in numbers, and that power is used for evil every bit as often as it is used for good. When individuals, however, break free from those ideologies, they realize how foolish it is to be sheep and to believe in anything to the extent that it harms anyone in any way (physiologically, financially, emotionally, etc.). The scary part about this is that literally any program might trap us in this way (ideologically), and blind us from the potentially-innate moral principles that underlie many of our actions. On that note, we are all collectively very much the same when we subscribe to a program, and we are all part of some program. We are individually very different, however, because we each have the potential to arrive at this realization through unique means.
We each have a psychological structure that makes up our personality. It is undeniably innate to an extent, yet only partially biological. This reveals the immeasurable value in developing one's intrapersonal intelligence through introspection and careful evaluation of one's own thoughts, feelings, perceptions, and desires.

Furthermore, conventional nature-nurture positions are polarities on a spectrum that doesn’t really exist. If we had clearer definitions of each, perhaps the debate would not present a false dilemma. We should reconstruct those definitions to be inclusive of phenomena — think of these terms as categories for ranges of processes rather than singular processes themselves. If we think of these terms as being on a spectrum, we are led to ask the impossible question of where the boundary is between them. If we think of them as categories, we are forced to embrace the reality that most, if not all, processes can fall into either category given a certain set of circumstances, and thus, those categories become virtually indistinguishable. E.g. in the case of inherited skills: practice makes perfect, yet natural talent seems so strongly to exist. If the truth-value-based spectrum between nature and nurture were a real thing, then neither position would be able to account for both nurtured ability and natural talent; it would simply be either/or. This is a consequence of the false dilemma. It leads us to believe that this gray matter is black and white. If one is decent at learning anything, he or she knows that there is only gray in everything.

But is there? I hope I have explained to some conceivable extent why scientific and metaphysical matters should not be structured into a polar truth-spectrum, and why any attempt to do so would likely present a false dilemma. However, it seems more reasonable to apply spectrum structures to value theory matters such as aesthetics, ethics, and even other personal motivators such as love. This, I will explain further in a later post.


Current Methods of Usage – Language as a Collective Social Skill

Language has developed as a collective social skill to the extent that society needs to use it to function. Different dialects develop in different regions out of the necessities that those regions are subjected to. Languages spoken by small bands in the rural Amazon are structurally simple compared to English, which is spoken in most of the developed world. Amazonian lifestyles are also structurally simpler than the complicated (but certainly no better) lifestyles of the developed west. This makes sense. Their language is suited to their lifestyle. Their lifestyle has one main focus: survival.

Let us suppose one were to raise himself in the wild, isolated from all other humans; he would not be able to create a complex private language because he would not need to. He may develop some way of communicating with the nature around him (e.g. mimicking bird calls to attract birds so he can catch them for food), but his language would be nothing like the one we understand. He would need no complex grammatical rules or extensive vocabulary to survive in the wild because there is nothing in the wild that would need to reciprocate understanding of such a language.

Communication as we know it could never occur. It would not need to. However, the isolated Amazonian would be communicating with the birds, in a sense, if they respond in the way he hopes so that he can catch them to eat. (Whether or not this is considered language can be debated, but if the goal of language is to communicate, then language and communication should be equivalent.) He is using his bird call as a tool to attract a bird just as I am using English to convey an idea to you now. Both he and I can be successful or not in achieving our respective goals. Whether or not we are successful can be due to any number of circumstances. In fact, the Amazonian could very possibly communicate with the bird more effectively than I am now communicating with you. Therefore, he (and the bird) would be more proficient in his language than I am in mine. In fact, I would hope that to be the case so I can further support the claims of this essay!

To “Know” a Language is NOT to have “Knowledge”

We have taken for granted that language is knowledge when it should, in fact, be thought of as a skill. We cannot imagine a world in which we have no knowledge of language, but that is because we have developed the skill of using it so well. We are so good – too good – at using this skill. We can lie to and manipulate others to achieve our ends. In fact, this is a tactic in capitalistic business rhetoric. The main focus of such business is not productivity, conversation, or healthy relationships. The focus can be reduced to one entirely superficial entity: money. Everyone wants as much as they can get, so they employ tactics of rhetoric (i.e. linguistic manipulation) in order to achieve that goal. It is only the loudest and most cunning who succeed at this, not the smartest, most thoughtful, or most honest.

In the Amazon, on the other hand, the goal is survival. There is no place for wasting resources or time. Nor is there a place for expressions of language which are irrelevant to the tasks at hand. The precise reason that there is so much excess language in English and other western languages is that our lifestyles are not as directly oriented toward primal survival. Our irrelevant distractions have given rise to irrelevant expressions of language.

Language, more broadly, is something that we take for granted. It is difficult, sometimes almost impossible, to communicate complex ideas without language, so we are misled to believe that such ideas cannot even exist without our mastery of a complex language. This is not the case. Our experiences of the world, the patterns we draw from those experiences, and our creative, subjective manipulation of those patterns are what formulate our ideas. We use language to simply (and sometimes not so simply) express our understanding. So, in this sense, expression in general, not our mechanical ability to produce words, is the real evolutionary phenomenon of humans. Every bit as impressive and complex as our ability to express ourselves using written or oral language are our abilities to express ourselves using musical instruments, paintbrushes, sports equipment, hammers and nails, and our bodily movements in dance. Language is a tool, and like any tool, we can misuse it by lying, manipulating, and mistreating others, or, more preferably, we can use it honestly.

Where Is Meaning?

Indeed, to deny that Wittgenstein’s later work improves on his early work is to commit two errors: 1) to overlook or submit to the intellectualist nature of Tractatus; 2) to fail to grasp the crucial insight that his later work provides. Tractatus claims that the better one masters the syntax of a language, the broader his experience and understanding of the world. This is a misled intellectualist view because it values the skill of applying language (as a priori) over and above all other skills and, more importantly, the matters themselves to which language is applied (i.e. any set of circumstances in the world that we attempt to describe). I have only seen shallow and insufficient evidence to support this view. After all, it is the things to which language is applied that matter, not the language itself.

Because there are no limits to how one can experience the world, we should never be misled into believing there are strict boundaries that limit our usage of a word. Our statements are an expression of our understanding. Our statements do not dictate understanding, as early-Wittgenstein thought. In fact, by this notion, we should even be allowed to take a word completely out of context; so long as we are able to communicate to at least one other person whatever idea is present to us by using that word, even if it is definitively unrelated, we would not be using that word incorrectly. In fact, whether we realize it or not, we do this very often.

Whether true or untrue, contemporary schools of thought take for granted that meanings are not in the head. However, it seems clear that anyone’s interpretation of meaning is. It would seem that the most we can agree on is that communication occurs when two or more parties agree on meaning, but they could very well be using identical statements to assert two different things.

Perhaps “where is meaning?” is the wrong question to ask. There is nothing out there in the universe that we can observe in any fashion that dictates meaning. There are no dictionary definitions so precise that, from that definition, we are able to connote everything that is included in the word’s realm of possible references. If definitions were this way, i.e. if they served as rules of meaning, then such a dictionary would be so incredibly large that it could never be printed. Perhaps it would have to be stored online for anyone to access and edit at a moment’s notice, much like Wikipedia. But still, usage among speakers would be dictating the definitions, so what good would these rules be at all? Definitions would begin to overlap more and more until every word would have so many connotations that it would be virtually indistinguishable from several other words. Is this not already the case?

Usage of phrases and words is in a constant state of flux. We collectively, and often unknowingly, adapt to these constant changes so there remains enough continuity for us to effectively communicate what we mean. Since this adaptation process is often subconscious, we need not think about it; we presume meaning by our usage, and we are almost always correct provided we, and those receiving our message, are fluent in that language.

If Tractatus were more accurate than P.I. in describing the fundamental nature of language, then to learn a language would require a lot of memorization, much like one “learns” a foreign language in a classroom. This may allow us to learn something about the concepts of a language, but it does not teach us to effectively use the language within societal contexts. Learning, in this case, would be much more difficult, and for many, impossible.

So, how do we actually learn language? We’ll have to go back to a time that we do not remember, so we must forget everything we now misunderstand about language. I’ll use the most parallel analogy I can think of:

When parents are teaching a child to walk, they do not simply explain to the child how to walk and expect him to be able to do it without practice. Obviously, the child is not yet proficient in grasping such a concept. Nor does a parent grab one of the child’s legs, put it in front of the other, then do the same with the other leg repeatedly, because the child has not yet developed the practical skill of walking, and one cannot learn such a skill in such a forced manner. The child needs a reason to walk, so the parents teach the child to walk by working toward a goal. One parent (let’s say, the father) stands the child up, and the other parent (the mother) kneels down a few feet away, holding her hands out to the child. The father acts as the spotter, and the mother acts as the goal. The child sees his mother, desires to reach her, and he has to walk to get there in the same way that he learned to crawl (or at least his parents will condition him to believe this based on their training methods). The same is true of language. It is the tool we use to communicate because we need to communicate to get what we want or need. We start out, as babies learning language, by blurting out the word ‘bear’ and pointing to our teddy bear in order to achieve the goal of the teddy bear itself. The child says ‘bear’ to express the general idea “I want that teddy bear” or the command “give that teddy bear to me”. He is communicating with the parent in this sense. He is expressing a desire to achieve a goal. He is not merely making a statement (that would be impossible). Language is the road, not the end of the road. There is no language for language’s sake just as there is no walking for walking’s sake. Language is used for a purpose – a goal – in any given situation.

How each person achieves his goal varies greatly. Not all children walk the same. Some are bowlegged, some are pigeon-toed, some drag their feet and trip on their shoelaces, and some cannot walk at all, so they utilize other tools such as wheelchairs. But they each adapt to their handicaps to get what they need – to get from A to B. Likewise, not everyone speaks the same. Some slur their ‘r’s, some pronounce their ‘s’s with a ‘th’ sound, some use poor grammar, and some cannot speak at all, so they learn sign language. Regardless, each adapts to their handicaps and uses language for the same purpose – to communicate.

Language in general is meant to be used practically, not merely to be understood conceptually. Of course, there are logical concepts to understand which will help us be more precise, but the understanding of those concepts is something like our understanding of how to walk: put one foot in front of the other. As long as you practice walking, you will learn the concept of walking to some extent, but it is the act of walking that is fruitful for the individual. Likewise, one learns the concepts of language to the extent that one needs to, but only to that extent. This is why some children (and adults) in school grasp grammar well, and others do not, though they are able to orally communicate to much of the same effect in social and professional circles. Some are more conceptually-minded. Those prefer to master grammar in order to be as precise as possible both in writing and in speech. They will also make better teachers because they can adapt their language usage to a wide range of listeners. Others prefer to stick to practice and master other types of skills, and perhaps they will become better doers. Either way, practice comes prior to understanding in this case (but not necessarily in the case of everything).

And this is the point: it is only by looking to the world that one might be able to explain language. The world is untamed, and so is the way we understand it and attempt to describe it – i.e. so is language itself. We play language games to adapt the meanings of utterances to our world. Otherwise meaning would be of no use to us, and that is certainly not the case.