Twentieth-Century Philosophy of Science: A History (Third Edition). Thomas J. Hickey
Citing Kuhn, some sociologists of knowledge, including advocates of the “strong program”, maintain that the social and political forces that influence society at large also influence scientific beliefs. This is truer in the social sciences than in the natural sciences, but sociologists who conclude that empiricism therefore does not control the acceptance of scientific beliefs in the long term are mistaken, because it is pragmatic empiricism that enables wartime victories, peacetime prosperity and, in all times, business profits, as reactionary policies, delusional ideologies and utopian fantasies cannot.
4.23 The “Best Explanation” Criteria
As noted above, Thagard’s cognitive-psychology system ECHO, developed specifically for theory selection, has identified three nonempirical criteria for maximizing the coherence aim. His simulations of past episodes in the history of science indicate that the most important criterion is breadth of explanation, followed by simplicity of explanation, and finally analogy with previously accepted theories. Thagard considers these nonempirical selection criteria to be productive of a “best explanation”.
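Thagard’s ECHO is a connectionist constraint-satisfaction network in which propositions are units, explanatory relations are excitatory links, and contradictions are inhibitory links. The following is only a minimal sketch of that style of computation, not Thagard’s published program: the propositions, link weights and parameter values are invented for illustration. It shows how a theory with greater explanatory breadth settles at higher activation than a narrower rival.

```python
# Minimal sketch of an ECHO-style explanatory-coherence network.
# All propositions, weights and parameters here are illustrative
# assumptions, not Thagard's published example or parameter values.

DECAY = 0.05   # pulls activations back toward zero each cycle
EXCIT = 0.04   # weight on explanatory (coherence) links
INHIB = -0.06  # weight on contradiction (incoherence) links

# Symmetric links between proposition units.  Theory T1 explains both
# pieces of evidence (greater breadth); its rival T2 explains only one.
links = [
    ("T1", "E1", EXCIT), ("T1", "E2", EXCIT),
    ("T2", "E1", EXCIT),
    ("T1", "T2", INHIB),                      # rival theories contradict
    ("E1", "DATA", EXCIT), ("E2", "DATA", EXCIT),
]

act = {"T1": 0.01, "T2": 0.01, "E1": 0.01, "E2": 0.01, "DATA": 1.0}

def neighbors(unit):
    for a, b, w in links:
        if a == unit:
            yield b, w
        elif b == unit:
            yield a, w

def settle(act, cycles=200):
    for _ in range(cycles):
        new = {}
        for u, val in act.items():
            if u == "DATA":                   # evidence source stays clamped
                new[u] = 1.0
                continue
            net = sum(w * act[v] for v, w in neighbors(u))
            if net > 0:
                val = val * (1 - DECAY) + net * (1.0 - val)
            else:
                val = val * (1 - DECAY) + net * (val + 1.0)
            new[u] = max(-1.0, min(1.0, val))
        act = new
    return act

final = settle(act)
# The broader theory T1 settles at higher activation than T2, i.e. it
# is "accepted" as the better explanation by the coherence criterion.
```

After settling, T1 ends with positive activation while the contradicted rival T2 is driven toward rejection, illustrating how breadth of explanation is rewarded by the coherence computation.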
The breadth-of-explanation criterion also suggests Popper’s aim of maximizing information content. In any case there have been successful theories in the history of science, such as Heisenberg’s matrix mechanics and uncertainty relations, for which none of these three characteristics was operative in their acceptance as explanations. And as Feyerabend noted in Against Method in criticizing Popper’s view, Aristotelian dynamics is a general theory of change comprising locomotion, qualitative change, generation and corruption, while the dynamics of Galileo and his successors pertains exclusively to locomotion. Aristotle’s explanations may therefore be said to have greater breadth, but his physics is now known to be less empirically adequate.
Contemporary pragmatists acknowledge only the empirical criterion, the criterion of superior empirical adequacy. They exclude all nonempirical criteria from the aim of science because, while relevant to persuasion in making theories appear “convincing”, such criteria are irrelevant to evidence. Nonempirical criteria are like the psychological criteria that trial lawyers use to select and persuade juries in order to win lawsuits in a court of law, but which are irrelevant to the courtroom rules of evidence for determining the facts of a case. Such prosecutorial lawyers are like the editors and referees of the peer-reviewed academic literature (sometimes called the “court of science”) who ignore the empirical evidence described in a paper submitted for publication.
But nonempirical criteria are routinely operative in the selection of problems to be addressed and explained. For example, the American Economic Association’s Index of Economic Journals indicates that during the years of the Great Depression the number of journal articles concerning the trade cycle fluctuated in close correlation with the national average unemployment rate, with a lag of approximately two years.
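The kind of lagged correlation described here is easy to make concrete. The following sketch uses invented yearly figures, constructed so that article counts track unemployment two years earlier; these are not the actual data from the AEA Index of Economic Journals, only an illustration of how such a lag would be detected.

```python
# Invented yearly series: article counts are constructed to follow the
# unemployment rate of two years earlier.  These are NOT the actual
# figures from the AEA Index of Economic Journals.
unemployment = [3.2, 8.7, 15.9, 23.6, 24.9, 21.7, 20.1, 16.9]  # percent
articles = [10.0, 11.0, 16.4, 27.4, 41.8, 57.2, 59.8, 53.4]    # per year

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def lagged_correlation(x, y, lag):
    """Correlate x(t) against y(t + lag)."""
    return pearson(x[:len(x) - lag] if lag else x, y[lag:])

# The lag that maximizes the correlation identifies the delay with
# which the second series responds to the first.
best = max(range(4), key=lambda lag: lagged_correlation(unemployment,
                                                        articles, lag))
```

With this constructed data the correlation peaks at a lag of two years, mirroring the roughly two-year delay the Index exhibits between unemployment and trade-cycle publications.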
4.24 Nonempirical Linguistic Constraints
The empirical constraint is the institutionalized value that regulates theory acceptance or rejection.
The constraint imposed upon theorizing by empirical test outcomes is the empirical constraint, the criterion of superior empirical adequacy. It is a regulating institutionalized cultural value definitive of modern empirical science, viewed not as an obstacle to be overcome but as a condition to be respected for the advancement of science toward its aim.
There are other kinds of constraints that are nonempirical; these are retarding impediments that must be overcome for the advancement of science, and they are internal to science in the sense that they are inherent in the nature of language. They are the cognition and communication constraints.
4.25 Cognition Constraint
The semantics of every descriptive term is determined by its linguistic context consisting of universally quantified statements believed to be true.
Conversely, given the conventionalized meaning of a descriptive term, the beliefs determining that meaning are reinforced by habitual linguistic fluency, with the result that the meaning’s conventionality constrains change in those defining beliefs.
The conventionalized meanings of descriptive terms thus produce the cognition constraint, which inhibits the construction of new theories and is manifested as a lack of imagination, creativity or ingenuity.
In his Course in General Linguistics (1916) Ferdinand de Saussure, the founder of semiology, maintained that spoken language is an institution, and that of all social institutions it is the least amenable to initiative. He called one of the several sources of resistance to linguistic change the “collective inertia toward innovation”.
In his Concept of the Positron (1963) Hanson similarly identified this impediment to discovery, calling it the “conceptual constraint”. He reports that physicists’ erroneous identification of the concept of a particle with the concept of its charge was an impediment to recognizing the positron. The electron was identified with negative charge and the much more massive proton with positive charge, so that the positron, a particle with the mass of an electron and a positive charge, was recognized only with difficulty and delay.
In his Introduction to Metascience (1976) Hickey identified what he called the “cognition constraint”. The cognition constraint inhibits the construction of new theories, and is manifested as a lack of imagination, creativity or ingenuity. Semantical rules are not just rules. They are also strong linguistic habits with subconscious roots that enable prereflective competence and fluency in both thought and speech, and that make meaning a synthetic psychological experience. Given a conventionalized belief or firm conviction expressible as a universally quantified affirmative statement, the predicate in that affirmation contributes meaning part(s) to the meaning complex of the statement’s subject term. Not only does the conventionalized status of meanings make development of new theories difficult, but any new theory construction also requires greater or lesser semantical dissolution and restructuring.
The cognition-constraint thesis is opposed to the neutral-language thesis that language is merely a passive instrument for expressing thought. Language is not merely passive but rather has a formative influence on thought. The formative influence of language as the “shaper of meaning” has been recognized as the Sapir-Whorf hypothesis, and specifically in Benjamin Lee Whorf’s principle of linguistic relativity set forth in his “Science and Linguistics” (1940), reprinted in Language, Thought and Reality. But contrary to Whorf it is not just the grammatical system that determines semantics, but what Quine called the “web of belief”, the shared belief system such as is found in a dictionary.
Accordingly the more revolutionary the revision of beliefs, the more constraining are both the semantical structure and psychological conditioning on the creativity of the scientist who would develop a new theory, because revolutionary theory development requires relatively more extensive semantical dissolution and restructuring. However, use of computerized