Where to turn if Bad Science is all we’ve got?
Any scientific methodology built on refined methods, as our higher educational institutions train us to do, must by design concentrate both method and perception. This is essential to create a demarcation between real science and pseudoscience. But is such a demarcation sufficient to protect us from bad science, or will it inevitably constrain itself under its own strictures? Today, more than ever before, we must also consider the powerful concentrating effect that institutionalised thinking and corporate-funded research exert on science. Science remains our untapped potential and our only hope of continually evolving a body of knowledge that can be applied to a growing list of new subjects.
It is generally accepted that empirical evidence is the evidence of the senses, of direct observation or measurement, and a pillar of the science we develop. Compare this with rational evidence, which results from deduction, experience or other reasoning, and with anecdotal evidence, which comes from personal testimony and may or may not be reliable, its reliability itself deserving closer scrutiny. Most research in healthcare rests on empirical evidence gathered through narrowly defined avenues, commonly called evidence-based research. We must agree, however, that only when we pay heed to all approaches with potential value for scientific advance can we reliably help to improve the human condition.
We should first consider what some philosophers of science have said on this matter. The Hungarian-born philosopher of science Imre Lakatos (1922–1974) suggested that the distinction between science and non-science, and between good and bad science, should rest on whether a research programme can predict anything new and whether its claims can be tested; a programme that can do neither is bad science and may be degenerating to the point of pseudoscience.
Evidence-based research, our most widely accepted approach to the scientific method, also serves to refine auxiliary conjectures and remains progressive for as long as new facts can be predicted and new tests can emerge from data. The philosopher Sir Karl Popper famously contributed to the philosophy of science the claim that falsifiability is what demarcates science from non-science. Put differently, this also implies that no methodology is entirely free from observational error and can therefore always be falsified and improved upon. What all these claims have in common is that good science should be able to advance by learning from its errors and mistakes, yielding something new to examine.
The foundations of good science become less secure once science itself confirms that no observation (regardless of methodology) is entirely free from measurement error or bias; consequently we may question whether our experimental result was what it appeared to be. Things may also change as we measure them (the Heisenberg uncertainty principle aside), such change occurring in both the observer and the object observed as both continuously evolve. Quantum physics may add further confusion here if we consider recent claims by physicists that 'light knows when we are looking at it and behaves differently when it is being observed'. Slowly it is becoming clear that good science (or at least our expectation of it) is both ephemeral and elusive. In addition, as I explained in Spheres of Perception (2020), time and its inseparable companion change play a significant role in the value and interpretation of everything historically observed and measured. Think of trying to land a plane on an airstrip at Heathrow using only historic predictions of continental drift and weather patterns. Good science must unavoidably progress from its previous flaws; crashing a plane when human lives are at stake is not a good method for eliminating bad science in order to learn more.
Should we arrogantly make the mistake of setting our beliefs concretely on an empirical science and its 'flawless' conclusions, the interactive chemical building blocks of life may soon near saturation, broken down to their smallest measurable elements according to our narrowly defined interpretations. With scientific foundations set on such unfalsifiable theories, measured predictability and fixed rules under fixed conditions, how shall we evolve a good science from there? It seems Popper was right. In such a mechanistic, evidence-based correctness about the physical world we are also (perhaps surprisingly to many) in greater danger of advancing pseudoscience. Soon the concrete matter we so arduously tried to break down, measure under strictly set conditions, define down to its smallest components and cling to may become inadequate, unless we somehow allow for the fact that the subjects being measured and studied also evolve, at different rates over time, as they interact in a complex network. Even this will not be enough: we also need to create more scope for our perceptive abilities to coevolve continuously, in all spheres, attuned to the evolution of the subjects studied.
Lakatos proposed that a scientific revolution occurs when a dominant programme has completely degenerated and is incapable of responding to accumulating anomalies, creating a crisis of confidence in the methodology, in the measurer's basic approach, or in both. A new scientific breakthrough is then needed for a progressive science to explain the emerging anomalies. I think we are close to such a revolution in the natural and physical sciences today. Lakatos warned that such a revolution, if and when it occurs, should be driven by logic and method, not by the irrational mob psychology now emerging. Many still see both evolution and many outcomes of medical science as attempts to demarcate the physical from the metaphysical; consequently, when science fails them they rush to find comfort in the metaphysical, or in the uncertainty of beliefs and remedies without any proof. This is dangerous under current needs and social demands. Should we perhaps blame science for being too realistically harsh, or too unrealistically demarcated from all spheres of existence? If we take our current concept of Darwinian evolution, the cornerstone of much of contemporary biology and medical science, and place it under Lakatos's, Popper's and my own scrutiny here, then it too is overdue for an update. The reason is that the general perception is still of an evolution based on a crude, 'unconscious', mechanical natural selection searching for survival and reproduction in an egocentric gene. With many advanced techniques now available and the genomes of various species mapped, we are coming to understand evolution more as an interactive and 'perceptive' living system, interconnected with living environments.
A new picture is emerging of a perceptive bionetwork continuously adjusting and fine-tuning itself as part of a communicative and interdependent whole, and urgent adjustments are needed to rescue science, unless we surrender to a future of pseudoscience. If we continue down the present road, then besides bad science inevitably leading to regression, we also miss out on a significant and overdue moral advancement, as explained in my new book.
Lakatos’s concepts, Popperian falsification and Kuhn’s claim that science has a cultural drive that may mislead all have one thing clearly in common: the possibility that any strictly evidence-based research programme may exhibit changing fortunes over time, and we need to be prepared for this. A cultural drive that demarcates itself from the gains and benefits of science may conversely be seen as an indicator of bad science, and we currently see this emerging on many fronts. Darwinian evolutionary theory fits the latter pattern in view of new advances in phylogenetics and the witnessing of a ‘communicative’ biochemistry as the essence of life. There is, however, nothing to rule out the possibility that a degenerating programme could stage a spectacular recovery. This does not imply manipulating or reinterpreting research subjects, but rather adopting new perceptions and/or methodologies to help us better understand our evolutionary epistemology. We should pay heed to this, as I pointed out in Spheres of Perception, by also widening our perceptions to circulate in freely evolving spheres, especially in healthcare. These three spheres are the evidence-based (physical world), the uncertain (unproven) and the metaphysical (yet to be perceived). They should all be open to evolve together in a good science interacting within a progressive world. In such a science no demarcations or biases can hamper progressive understanding, which can reliably interconnect on sound principles as part of a continuously evolving, perceptive living network.
These philosophers all hinted that science should be non-restrictive. Lakatos warned that scientists doggedly pursuing a degenerating research programme are guilty of an irrational commitment to a bad science. Popper tells us that if nothing can be falsified, it is bad science. Today exciting research fields have opened up in astrobiology, quantum mechanics and quantum computing, and dark matter and dark energy, besides the escalating evidence mentioned above of a flexible genome interacting with changing living environments. Yet these exciting fields remain largely neglected, lost amongst an array of programmes that find far more security in the familiarity of bad science, with set values operating within inflexible theories, as this offers greater security for financial investments.
It appears, unfortunately, that ‘bad science’ may currently be all that is available for taking the next leap forward. In a post-reductionist era, the material world and all our advancing means of perceiving and measuring it will become restrictive once dissected down to their smallest elements. In biochemistry, physics and chemistry especially, solidity appears less concrete than expected. All we can then turn to is an interactive, interconnected and all-encompassing concern driven by perceptions focused on this escalating change in evolving complexity. The aim will become to understand better how all these narrowly defined elements interconnect and advance or regress in self-generated complexity. For this we will need new methodologies that create space for the undiscovered and unexpected. Such methodologies should not be restrained by cultural fashion or fad, as Kuhn suggested; nor should they rest on institutionalised thinking, existing methods or fixed, non-falsifiable theory, as Popper guided; nor should they be disabled in their ability to claim anything new (to evolve unrestricted ideas), as Lakatos indicated.
In conclusion, we should not neglect the valuable, seemingly dismissive, opinion on this topic offered by the American philosopher Larry Laudan in 1983. He claimed there is no hope of finding a necessary and sufficient criterion for something as heterogeneous as scientific methodology. This widely accepted conclusion of Laudan's opened new doors that we should now all eagerly enter in our pursuit of good science.
There is now also growing recognition that the natural sciences, the social sciences and the humanities all function as parts of the same heterogeneous human endeavour of systematic and critical investigation, all aimed at acquiring the best possible understanding of the workings of nature, objects and people in a rapidly evolving, interconnected living network. The diverse disciplines that form this community of knowledge are increasingly interdependent, as Hansson also implied in 2007. We now understand that all these components (including us) are constantly changing, interactive within a biostructure we are only beginning to understand. Laudan was right in that 'empirical science is facing simultaneous dilemmas all at once as it can only function if perceptive to a heterogeneous environment'. In the current climate of knowledge explosion, and with fund-directed research seeking its support in ‘institutionalised evidence-based’ research, the many unpredictable and marginalised auxiliary theories that lack funding may easily be under-supported or even overlooked. With their potential value lost, we are now dangerously close to being led by a demarcated pseudoscience operating in blind isolation.
To navigate this vast new world with new opportunity rather than despondency, we need to be armed with a reliable, free and truthful science that fully acknowledges our world as perceptive and interconnected in progressive spheres of perception. These spheres constantly evolve and adjust to complexity, both in the physically evident (empirical) and in the uncertain, without neglecting the metaphysical. To achieve this we must be receptive to change without barriers, as an interactive concern adjusting to rapidly evolving heterogeneity. We may define this need for new methodologies as a pending scientific revolution.
References
· Bartley III, W. W., 1968. “Theories of demarcation between science and metaphysics”, pp. 40–64 in Imre Lakatos and Alan Musgrave (eds.), Problems in the Philosophy of Science, Proceedings of the International Colloquium in the Philosophy of Science, London 1965, volume 3, Amsterdam: North-Holland Publishing Company.
· Cioffi, Frank, 1985. “Psychoanalysis, pseudoscience and testability”, pp. 13–44 in Gregory Currie and Alan Musgrave (eds.), Popper and the Human Sciences, Dordrecht: Martinus Nijhoff Publishers.
· Feleppa, Robert, 1990. “Kuhn, Popper, and the Normative Problem of Demarcation”, pp. 140–155 in Patrick Grim (ed.) Philosophy of Science and the Occult, 2nd ed, Albany: State University of New York Press.
· Fuller, Steve, 1985. “The demarcation of science: a problem whose demise has been greatly exaggerated”, Pacific Philosophical Quarterly, 66: 329–341.
· Glymour, Clark and Stalker, Douglas, 1990. “Winning through Pseudoscience”, pp. 92–103 in Patrick Grim (ed.) Philosophy of Science and the Occult, 2nd ed, Albany: State University of New York Press.
· Grove, J.W., 1985. “Rationality at Risk: Science against Pseudoscience”, Minerva, 23: 216–240.
· Gruenberger, Fred J., 1964. “A measure for crackpots”, Science, 145: 1413–1415.
· Hansson, Sven Ove, 1983. Vetenskap och ovetenskap, Stockholm: Tiden.
· –––, 1996. “Defining Pseudoscience”, Philosophia Naturalis, 33: 169–176.
· Kuhn, Thomas S., 1974. “Logic of Discovery or Psychology of Research?”, pp. 798–819 in P.A. Schilpp, The Philosophy of Karl Popper, The Library of Living Philosophers, vol xiv, book ii. La Salle: Open Court.
· Lakatos, Imre, 1970. “Falsification and the Methodology of Scientific Research Programmes”, pp. 91–197 in Imre Lakatos and Alan Musgrave (eds.) Criticism and the Growth of Knowledge. Cambridge: Cambridge University Press.
· –––, 1974a. “Popper on Demarcation and Induction”, pp. 241–273 in P.A. Schilpp, The Philosophy of Karl Popper, The Library of Living Philosophers, vol xiv, book i. La Salle: Open Court.
· –––, 1974b. “Science and pseudoscience”, Conceptus, 8: 5–9.
· –––, 1981. “Science and pseudoscience”, pp. 114–121 in S. Brown et al. (eds.) Conceptions of Inquiry: A Reader, London: Methuen.
· Laudan, Larry, 1983. “The demise of the demarcation problem”, pp. 111–127 in R.S. Cohen and L. Laudan (eds.), Physics, Philosophy, and Psychoanalysis, Dordrecht: Reidel.
· Reisch, George A., 1998. “Pluralism, Logical Empiricism, and the Problem of Pseudoscience”, Philosophy of Science, 65: 333–348.
· Ruse, Michael (ed.), 1996. But is it science? The philosophical question in the creation/evolution controversy, Amherst, NY: Prometheus Books.
· Settle, Tom, 1971. “The Rationality of Science versus the Rationality of Magic”, Philosophy of the Social Sciences, 1: 173–194.
· Thagard, Paul R., 1978. “Why Astrology Is a Pseudoscience”, Philosophy of Science Association (PSA 1978), 1: 223–234.
· –––, 1988. Computational Philosophy of Science, Cambridge, MA: MIT Press.