
For most of human history, explanations of the natural world were bound up in superstition and metaphysics. Fire was treated as a sacred element, stars as signs of destiny, disease as the hand of spirits or divine punishment. The role of “knowledge” was not to test or measure but to harmonise with the assumed order of gods and unseen forces.
Even in the late 16th century, this outlook still dominated European thought. Shakespeare gives us a vivid glimpse of it in The Merchant of Venice (1596–97), where Lorenzo speaks of the “music of the spheres”:
Look how the floor of heaven
Is thick inlaid with patines of bright gold:
There’s not the smallest orb which thou behold’st
But in his motion like an angel sings,
Still quiring to the young-eyed cherubins;
Such harmony is in immortal souls;
But whilst this muddy vesture of decay
Doth grossly close it in, we cannot hear it.
Beautiful as it is, this vision reflects a completely mistaken cosmology. The planets do not sing in angelic choirs. Little more than a decade later, Galileo’s telescope (1609) revealed mountains and craters on the Moon and moons circling Jupiter. The poetry of the heavens gave way to the mathematics of orbits.
(In the same way, our understanding of the Bible can be enriched when we look beyond metaphysics to rational explanation. Read in this light, it offers enduring moral guidance, cultural memory, and psychological insight.)
The Renaissance and the Birth of Science
The Renaissance (14th–16th centuries) marked Europe’s intellectual flowering. Humanists rediscovered ancient texts; artists studied anatomy and perspective; navigators steered by the stars across oceans; engineers designed machines. Most of all, Europeans began to question received authority and to test ideas against direct observation.
This shift matured into the Scientific Revolution of the 16th to 18th centuries, a transformation that gave us a new way of knowing.
Timeline of Transformation
- 1543 — Nicolaus Copernicus publishes De revolutionibus orbium coelestium, proposing a heliocentric universe.
- 1600 — Giordano Bruno is burned at the stake in Rome for heretical ideas, including infinite worlds and cosmic plurality.
- 1609–10 — Galileo Galilei turns a telescope skyward, observing lunar mountains, Jupiter’s moons, the phases of Venus, and Saturn’s puzzling appendages (only later identified as rings).
- 1619 — Johannes Kepler publishes Harmonices Mundi, formulating the third law of planetary motion.
- 1620 — Francis Bacon’s Novum Organum articulates the inductive method.
- 1637 — René Descartes publishes Discourse on Method, advocating systematic doubt and rational inquiry.
- 1661 — Robert Boyle publishes The Sceptical Chymist, separating chemistry from alchemy.
- 1687 — Isaac Newton publishes Philosophiæ Naturalis Principia Mathematica, uniting celestial and terrestrial mechanics.
- 1703 — Georg Ernst Stahl develops the phlogiston theory, a bold attempt to explain combustion.
In less than two centuries, the old world of symbols and angelic harmonies was replaced by laws, mathematics, and experiment.
Alchemy: The Precursor of Chemistry
Before chemistry emerged as a science, Europe’s tradition was alchemy — a mixture of practical craft, speculative philosophy, and mystical symbolism.
- Roots: Alchemy came from ancient Egypt, Greece, and the Islamic world, reaching Europe through translations in the Middle Ages.
- Aims: Its most famous goals were the transmutation of base metals into gold, the discovery of the philosopher’s stone (a substance granting immortality or perfection), and the pursuit of universal medicine.
- Symbolism: Alchemists wrote in allegories and coded language, mixing astrology, numerology, and Christian mysticism with real metallurgical processes.
For all its strangeness, alchemy was not without value. It preserved and developed skills in smelting, alloys, glassmaking, dyes, and acids. In many cases, alchemists were the first to experiment with substances, even if they lacked a coherent theory.
By the 16th century, thinkers like the Swiss physician Paracelsus (1493–1541) began to redirect alchemy toward medicine, stressing the chemical nature of disease and treatment. By the 17th century, natural philosophers like Boyle attacked its mystical excesses. His Sceptical Chymist (1661) criticised the old elements of earth, air, fire, and water, while still drawing on alchemical practice.
Thus alchemy was both an obstacle and a foundation: an obstacle in its reliance on symbolism and secrecy, a foundation in its practical knowledge of materials. Against this backdrop, the phlogiston theory arose.
The Phlogiston Theory
Early chemistry sought a rational framework. In his Physica subterranea (1669), Johann Joachim Becher proposed that materials contained different “earths,” one of which explained burning. Georg Ernst Stahl (1703) refined this into phlogiston: a substance released during combustion, calcination, and rusting.
The point here is not simply the theory itself, but the shift of mindset. What had once been a mystery — fire, transformation, rust — was now being looked at intelligently and systematically, not as an occult force but as a natural process.
For nearly a century, phlogiston held sway. A metal was thought to be its calx (the earthy residue found in ores) combined with phlogiston; smelting with charcoal, itself rich in phlogiston, restored the metal. Combustion was the escape of phlogiston into the air. Even respiration was described as the body venting phlogiston.
Unlike alchemy, this was no mystical symbol-system. It was a rational, testable hypothesis, a genuine step forward in chemical explanation.
Yet contradictions multiplied. Why did metals gain weight when they rusted if they were losing phlogiston? Some defenders insisted phlogiston had “negative weight.” Rather than abandon the theory, they stretched it to breaking point. National pride and personal investment played their part: no one easily abandons a theory into which they have poured their career.
Lavoisier and the Triumph of Oxygen
In the 1770s and 1780s, Antoine Lavoisier overturned phlogiston with painstaking experiments. By weighing substances before and after combustion, he showed that oxygen was entering into combination, not phlogiston departing. He established the principle of conservation of mass, laying the foundations of modern chemistry.
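Lavoisier’s bookkeeping can be sketched with a modern illustration (the magnesium reaction and the figures below are a present-day example, not Lavoisier’s own experiment):

```latex
% Phlogiston prediction: the metal loses phlogiston (phi) on burning,
% so the calx should weigh LESS than the metal:
%     metal  ->  calx + phi        (mass should decrease)
%
% Lavoisier's account: the metal combines with oxygen from the air,
% so the calx weighs MORE, and total mass is conserved:
\[
2\,\mathrm{Mg} + \mathrm{O_2} \;\longrightarrow\; 2\,\mathrm{MgO}
\]
\[
\underbrace{48.6~\mathrm{g}}_{\text{magnesium}}
\;+\;
\underbrace{32.0~\mathrm{g}}_{\text{oxygen}}
\;=\;
\underbrace{80.6~\mathrm{g}}_{\text{magnesium oxide}}
\]
```

The weight gain that embarrassed phlogiston theory is exactly the mass of oxygen absorbed; nothing need be assigned a “negative weight.”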
For this he is remembered as the “father of modern chemistry.” And yet, his life was cut short by the political madness of his age. In 1794, amid the fury of the French Revolution, Lavoisier was guillotined as a hated tax farmer. The presiding judge is said to have declared, “The Republic has no need of scientists” (a remark now widely thought apocryphal).
The irony is bitter: the revolutionaries destroyed what they could not yet understand. The scene recalls Jesus’ words on the cross: “Father, forgive them, for they know not what they do.”
So too with Giordano Bruno, executed in 1600 not purely for astronomy but for daring to speak of infinite worlds that undercut Church authority.
Here is the pattern: entrenched powers — whether church, crown, or mob — often destroy or dismiss what they cannot yet grasp. It is not always fear; more often it is political madness or ideological frenzy. Crushing an idea can be like crushing an elephant hawk-moth larva because it looks threatening, horned and false-eyed, never knowing that it would have become a creature of beauty.

The Lesson for Today
Science advances not by mocking anomalies but by taking them seriously. The phlogiston theory was wrong — but it was usefully wrong, a bridge between alchemy and oxygen. Without such speculative steps, chemistry might never have broken free.
The same principle applies now. When independent researchers present anomalies — ancient maps that seem too accurate, DNA traces that defy migration models, underwater ruins — it is easy for orthodoxy to laugh, to dismiss, to wave them away. But laughter is not inquiry.
Treating anomalies seriously does not mean embracing wild claims. It means building testable research programmes that can confirm or refute them. To dismiss them outright is prejudice; to investigate them is the spirit of science.
History shows us the way: from Copernicus to Newton, from alchemy to phlogiston to oxygen, from superstition to science, progress came because bold thinkers dared to speculate — and because others dared to test.


