position paper

I’ve been going through old papers of mine recently and this one struck me as interesting enough to share. I wrote a barebones post on this subject.

Kahvi Patel
Research Paper
April 17th, 2017

In the early 20th century, the separation between scientific and non-scientific theories was vague. The most practical way to mark a theory as scientific was to determine whether it was known a posteriori, that is, whether it relied on an empirical method. By that standard, astrology, given its vast amounts of observed information, qualified as a scientific theory. In 1919, Karl Popper began to seek an alternative way to draw the line; falsifiability was the philosophical term he coined for it. With this theory, Popper laid the groundwork for the modern scientific method and, simultaneously, addressed the problem of induction that had troubled Hume and many other philosophers before him. Popper’s theory of Demarcation has its roots not only in philosophy but also in human nature. As has been the case throughout history, human nature is drawn more towards what Popper distinguished as “pseudo-scientific” theories than towards “scientific” ones. Our inherent desires are what make Popper’s theory of Falsification especially relevant; Demarcation is a tool to fight against our own misleading instincts. In order to counter this widespread malpractice, we must employ the tactics of doubt exemplified by philosophers like Descartes and Socrates, and thereby foster a more logical, and therefore more productive, general consciousness. As Popper originally argued, we must seek falsification, or doubt, rather than confirmation when testing our own theories.

For Popper, the reason for distinguishing between theories was not “doubting the truth of those other theories, but something else” (Popper 32). What bothered Popper were the incessant “confirming instances” surrounding the “pseudo-scientific theories” in question. Freud’s theory of psychoanalysis, for example, seemed to be confirmed by whatever behavior the psychologist was asked to explain. In other words, every event confirmed the theory. This practice has a flaw.

I may illustrate this by two very different examples of human behavior: that of a man who pushes a child into the water with the intention of drowning it; and that of a man who sacrifices his life in an attempt to save the child. Each of these two cases can be explained with equal ease in Freudian and in Adlerian terms. According to Freud the first man suffered from repression (say, of some component of his Oedipus complex), while the second man had achieved sublimation. According to Adler the first man suffered from feelings of inferiority (producing perhaps the need to prove to himself that he dared to commit some crime), and so did the second man (whose need was to prove to himself that he dared to rescue the child). (Popper 35)

This situation demonstrates the ambiguity of Freud’s (and Adler’s) theory. The result, no matter how unexpected or outlandish, can always be explained after the fact. This method stood in direct contrast to another Popper had witnessed: Einstein’s theory of gravitation. Whereas Freud’s theory relied on explanation, Einstein’s relied on prediction. More specifically, one sought confirmation, the other invited falsification. Alarmingly, though, both relied on empirical evidence and the method of induction, and so both, under the old criterion, counted for now as scientific theories. Importantly, Popper did not attempt to discredit the validity of Freud’s theory, although he may have done so inadvertently. His intention was merely to distinguish it. The significance of this separation cannot be overstated. As history progressed, a scientific method was developed around Popper’s falsifiability, and a separation began to emerge between science and everything else.

More than just a method for distinguishing between theories, Falsification has significance for human behavior. In general, humans are programmed to ignore events that do not agree with their beliefs and to pay attention to ones that do. This behavior is called confirmation bias, and it has been demonstrated in many studies, the most famous of which is Peter Wason’s 2-4-6 task. Feel free to attempt this experiment on friends.

He presented subjects with the three-number sequence 2, 4, 6, and asked them to try and guess the rule generating it. Their method of guessing was to produce other three-number sequences, to which the experimenter would respond “yes” or “no” depending on whether the new sequences were consistent with the rule. Once confident with their answers, the subjects would formulate the rule. The correct rule was “numbers in ascending order,” nothing more. Very few subjects discovered it because in order to do so they had to offer a series in descending order (that the experimenter would say “no” to). (Taleb 123)

This experiment demonstrates our inherent instinct to seek confirmation for our own hypotheses, or theories, instead of trying to discredit them. Once the subjects had formulated a rule, their only aim became finding further sequences that fit it, rather than supplying a series that was inconsistent with their hypothesis. Our natural human behavior compels us to seek confirmation. The parallels here are widespread. Today, most people gather their information (and biases) from the news, and there is more news today than ever before. The constant news cycle compounds biases: people have access to countless sources of evidence, their own “number series,” which they use to confirm their opinions, and it becomes much more difficult to attempt to disprove one’s own opinion.
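To make the asymmetry concrete, here is a minimal sketch in Python (my own illustration, not part of the original experiment or paper): a hypothetical subject guesses the rule “even numbers increasing by two,” while the experimenter’s hidden rule is simply “numbers in ascending order.” Every confirming test comes back “yes” and leaves the wrong guess intact; a single test the hypothesis predicts “no” for is enough to falsify it.

```python
# A minimal sketch of Wason's 2-4-6 task, illustrating confirmation bias.
# The hidden rule and the subject's hypothesis below are illustrative assumptions.

def hidden_rule(seq):
    """Experimenter's rule: any three numbers in strictly ascending order."""
    a, b, c = seq
    return a < b < c

def my_hypothesis(seq):
    """A typical subject's guess: even numbers increasing by two (like 2, 4, 6)."""
    a, b, c = seq
    return a % 2 == 0 and b == a + 2 and c == b + 2

# Confirmation strategy: only propose sequences the hypothesis already predicts "yes" for.
for seq in [(2, 4, 6), (8, 10, 12), (20, 22, 24)]:
    print(seq, "->", hidden_rule(seq))   # always True, so the (wrong) hypothesis looks confirmed

# Falsification strategy: propose a sequence the hypothesis predicts "no" for.
probe = (1, 2, 3)
print(probe, "->", hidden_rule(probe))   # also True, which falsifies the hypothesis
# The unexpected "yes" is what points the subject toward the real, broader rule.
```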

A poignant example of this confirmation bias is the political polarization that currently exists in the United States: both sides seek confirmation for their own theories, never falsification. Popper’s Demarcation seems to work against human instinct. For instance, Freud’s theory of psychoanalysis is more easily understandable than Einstein’s theory of gravity and thus more attractive. This inherent preference feeds many human patterns of behavior, such as the illusion of control and hindsight bias. When a need to explain something arises, we are predisposed to justify our theory by seeking confirmation instead of testing it. A striking example is the justification given for the start of World War I. In hindsight, the assassination of Franz Ferdinand and the escalating tension and militarization in various countries seem to predict international conflict; the ensuing World War appears inevitable, but only after the events have transpired. This is hindsight bias. As Niall Ferguson, a professor of history at Harvard, makes clear in his paper “Political risk and the international bond market between the 1848 revolution and the outbreak of the First World War,” the world had not registered these supposedly significant events, as evidenced by the fact that bond prices barely moved. If the hints of the coming conflict were as clear as we now make them out to be, why did the bond markets not react? Was there a sudden, universal lack of desire to make a profit? Or were the events we speak of today not obvious at all? Clearly, the latter is more in line with human nature. The pseudo-scientific explanation is that the war started because of an episode of tension and militarization. We feel the need to justify events after the fact in order to reassure ourselves that we will see the next one coming. Yet until the theories in question can withstand attempts at falsification, this practice of explanation is useless.

When theories are not tested through falsification, they can lead us to misleading conclusions or a false sense of security. In fifth-century BC Athens, Socrates exposed society’s habit of gathering, spreading, and reciting knowledge that rested on illogical grounds. In what we now call “Socratic dialogue,” Socrates reduced his interlocutors’ opinions to a single thesis, which was often self-contradictory. Through this, he proved the Delphic oracle’s pronouncement true: he was, in fact, “the wisest of men.” This title, however, did not come from vast education or far-reaching experience.

I am wiser than this man; it is likely that neither of us knows anything worthwhile, but he thinks he knows something when he does not, whereas when I do not know, neither do I think I know; so I am likely to be wiser than he to this small extent, that I do not think I know what I do not know. (Plato 3)

Socrates was merely the only man who knew the extent of his own ignorance. Today, we face a similar situation. Many of our generally accepted theories and personal opinions are illogical, or are justified by a flawed method. More importantly, the vast majority of society does not realize this. Descartes provides another apt example. His doubt, though admittedly extreme, was necessary to establish a foundation of truth upon which the rest of his beliefs could rest.

It is as though they [my beliefs] had a right to a place in my belief-system as a result of long occupation and the law of custom. These habitual opinions of mine are indeed highly probable; although they are in a sense doubtful, as I have shown, it is more reasonable to believe than to deny them. But if I go on viewing them in that light I shall never get out of the habit of confidently assenting to them. To conquer that habit, therefore, I had better switch right around and pretend (for a while) that these former opinions of mine are utterly false and imaginary. I shall do this until I have something to counter-balance the weight of old opinion, and the distorting influence of habit no longer prevents me from judging correctly. (Descartes 2)

The important parallel here is doubt. To doubt one’s own belief is the most basic and true form of falsification. We must take advice from Socrates and Descartes and question our own theories and opinions in order to arrive at the roots of our beliefs. We must acknowledge our own ignorance in order to take the next step towards progress.

Popper’s philosophy was originally meant to distinguish scientific theories. His philosophy of Demarcation, however, went beyond science and addressed two methods of thinking. Unfortunately for humanity, we are prone to the less reliable of the two: the one that seeks confirmation. This tendency has recurred throughout human history. By analyzing the past and present, we can see how hindsight bias and the illusion of control reveal themselves in society and how they result from instinctual human behavior. With the problem recognized, we may understand it and endeavor to solve it. Doubt is perhaps the truest form of falsification, and individuals today should practice it in order to test, rather than merely justify, their beliefs. Unless we begin challenging our beliefs, instead of seeking confirmation of them, progress of all kinds will slow.


Works Cited
- Popper, Karl. Conjectures and Refutations. Basic Books, 1962.
- Taleb, Nassim Nicholas. The Black Swan: The Impact of the Highly Improbable. Random House, 2007.
- Ferguson, Niall. “Political risk and the international bond market between the 1848 revolution and the outbreak of the First World War.” Economic History Review, vol. 59, no. 1, February 2006.
- Plato. Apology. Translated by G.M.A. Grube, Hackett Publishing Company, 2000.
- Descartes, René. Meditations on First Philosophy. Translated by Jonathan F. Bennett, Cambridge University Press, 1996.

inspired by Nassim Nicholas Taleb and Fooled by Randomness (great read)