Knowledge

I argue that knowledge, as the word is currently used, is a flawed concept and is not useful. It is merely a feature of language that does not correspond to anything in reality.

Throughout history, many formal definitions of knowledge have been proposed in an attempt to formalise its linguistic use. The traditional definition of knowledge as justified true belief has already been shown to be inadequate by the Gettier problems. For completeness, I will raise one Gettier-type counterexample here. Farmer A walks past a field and sees a cow in it. He forms a justified, true belief that there is a cow in the field. Unbeknownst to him, the cow he sees is a cardboard cutout. However, a cow does actually exist in the field, hidden in a ditch and unseen by Farmer A. Farmer A’s belief that there is a cow in the field is therefore both true and justified, yet we would not say that “Farmer A knows that there is a cow in the field”. This illustrates that justified true belief is insufficient as a definition of knowledge.

Further, I argue that even definitions that attempt to take the Gettier problems into account are unsatisfactory.

Take for example Nozick’s definition of knowledge:

S knows that p if and only if:
(1) p is true.
(2) S believes that p.
(3) If p weren’t true, S wouldn’t believe that p.
(4) If p were true, S would believe that p and would not believe that not-p.

Knowledge under this definition is guaranteed to be true, which seems to solve the problem. However, I contend that criteria (3) and (4) can never be verified. Using the farmer and cow example, how could the farmer establish, with certainty, whether there is a cow in the field? Suppose he searches every square metre of the field. There could still be a cow in an underground bunker beneath him, hidden yet still in the field. Or there could be a cow walking silently behind him wherever he goes. More generally, it is always possible to construct a counterexample in which p is not true but S believes that p (or vice versa). Taken to the extreme, this can take the form of a vatted brain, fed signals at all times identical to those the brain of a farmer searching a field would receive. Because of the above, I contend that (3) and (4) can never be fulfilled with certainty. (See footnote 1)

Due to this, I argue that any binary definition of knowledge (where something is either not known, or known with absolute certainty) is unsatisfactory. This covers every definition I have seen. Any binary definition that does not require absolute certainty will admit a Gettier-type counterexample, exploiting whatever area is left unverified. Yet any definition that does require absolute certainty will find that certainty impossible to fulfil, so that no statement is ever known, making the definition useless.

This could be reworded succinctly as:

Nothing can be determined with absolute certainty.

The degree of certainty required for a belief to count as knowledge could be set below absolute certainty, in which case knowledge could be wrong, which is unacceptable. Or it could be set at absolute certainty, in which case nothing can ever be known, which is equally unacceptable.

Unfortunately, the way the concept of knowledge is used in language requires that its definition be binary, and that knowledge be absolutely certain. It is impossible to say that “Person A knows Statement X is true” while Statement X has a chance of being false. This means that any attempt to define knowledge while conforming to its linguistic use must fail.

I thus argue that knowledge, as the concept is currently used, is not a useful concept. Instead, we should speak of beliefs that have a probability of being true. I propose that belief in a statement be measured on a continuous scale strictly between 0 (the statement is certainly false) and 1 (the statement is certainly true). The exact values 0 and 1 are unobtainable. The numerical value of a belief is the perceived probability of the statement being true, as determined by Bayesian reasoning applied to the available evidence. (See footnote 2)

With this definition, the farmer in the example may estimate a probability of 0.95 that there is a cow in the field, based on past experience. It is unlikely, after all, that a fake cow would be present. It can then be said that the farmer believes, with 95% probability, that there is a cow in the field.
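To make this concrete, here is a minimal sketch of the Bayesian update behind such a figure, written in Python. The prior and the two likelihoods (how often a cow-shaped sighting is produced by a real cow versus by something like a cutout) are numbers I have invented purely for illustration; only the structure of the calculation is the point.

# Minimal sketch of a Bayesian update for the farmer example.
# All numbers are assumptions chosen only to illustrate the calculation.
prior_cow = 0.5              # prior belief that a cow is in the field
p_sight_given_cow = 0.99     # chance of a cow-shaped sighting if a cow is present
p_sight_given_no_cow = 0.05  # chance of the same sighting with no cow (e.g. a cutout)

# Bayes' theorem: P(cow | sighting) = P(sighting | cow) * P(cow) / P(sighting)
p_sight = p_sight_given_cow * prior_cow + p_sight_given_no_cow * (1 - prior_cow)
belief_cow = p_sight_given_cow * prior_cow / p_sight

print(f"Belief that there is a cow in the field: {belief_cow:.2f}")  # about 0.95 here

With these invented numbers the posterior works out to roughly 0.95, matching the degree of belief used above; different priors or likelihoods would of course give a different figure.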

In summary, I feel that a definition of knowledge is neither possible nor useful. Certain knowledge does not exist; only uncertain beliefs do.

Footnote 1: In this piece, I argue that no statement can be believed with absolute certainty. I realise that this is not strictly true, but have omitted the exceptions for simplicity. To the best of my knowledge, the only exceptions are statements that are true by definition and statements about present observations. I do not consider past observations certain, because memories can be lost or mistaken.

Examples:
“Cats are animals.” True by definition.
“I am currently observing visual signals (which my mind interprets as a cat).” A present observation.
“I think, therefore I am.” A present meta-observation.

Footnote 2: In fact, from a Bayesian perspective, the failure of binary definitions of knowledge is a direct consequence of Cromwell’s rule. The linguistic requirement that knowledge be absolutely certain is unreasonable from a rational, Bayesian viewpoint.
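As a small illustration of Cromwell’s rule (the sketch and its numbers are my own, added for illustration): under Bayes’ theorem, a prior of exactly 0 or 1 can never be moved by any evidence, which is why demanding absolute certainty leaves no room for rational updating.

# Cromwell's rule in miniature: a prior of exactly 1 (or 0) is immune to evidence.
def bayes_update(prior, p_evidence_given_true, p_evidence_given_false):
    # Return P(hypothesis | evidence) for a single piece of evidence.
    p_evidence = p_evidence_given_true * prior + p_evidence_given_false * (1 - prior)
    return p_evidence_given_true * prior / p_evidence

print(bayes_update(0.95, 0.2, 0.9))  # an ordinary belief is revised downwards by contrary evidence
print(bayes_update(1.0, 0.2, 0.9))   # stays exactly 1.0, whatever the evidence says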


Also posted on /r/philosophy.
