The Anti-Scientific Error Of Net Zero Climate Religion

Billions spent. Livelihoods destroyed. Energy systems dismantled. All built on simulations of a chaotic system we cannot experimentally test. When did probabilistic inference become grounds for civilisational restructuring? Net Zero is unfalsifiable scientism: fool's gold chased by ideologues.

Every millenarian movement in history shares the same architecture. A fallen world corrupted by human sin. A prophesied catastrophe arriving on a knowable timeline. A priesthood interpreting sacred texts the laity cannot question. Rituals of penance and self-denial. Moral garments distinguishing the faithful from the damned. Heretics cast out for deviation. And a promised salvation available only through radical, immediate, total submission to the doctrine.

The Montanists of second-century Phrygia prophesied the imminent descent of the New Jerusalem. The Flagellants of the fourteenth century processed through plague-ravaged Europe whipping themselves bloody, convinced suffering would avert divine wrath. The Münster Anabaptists seized a German city in 1534, declared it the New Zion, abolished private property, and burned every book except the Bible. The Millerites of 1844 sold their farms and gathered on hilltops for an ascension that never came. Each movement was absolutely certain. Each was wrong. Each caused tremendous suffering in proportion to that certainty.

Now observe the structure of contemporary climate doctrine and notice the architecture is identical.

The fall: industrial civilisation has sinned against Gaia through carbon emission. The prophecy: catastrophe arriving within twelve years, ten years, by 2030, by 2050 — the deadline shifts but the imminence never fades. The priesthood: climate scientists whose models function as scripture, interpreted by an IPCC magisterium whose assessment reports are received as encyclicals. The penance: net zero pledges, carbon fasting, flight shame, dietary restriction. The moral garments: reusable cups and canvas tote bags worn as visible markers of righteousness, electric vehicles as mobile certificates of virtue. The heretics: anyone questioning the models is a "denier" — a word deliberately borrowed from Holocaust terminology to place scientific scepticism in the same moral category as genocide denial.

And the rituals of desecration. Just Stop Oil activists threw tomato soup at Van Gogh's Sunflowers at the National Gallery. Others hurled mashed potato at a Monet, smeared cake on the Mona Lisa, glued themselves to Constable's Hay Wain, chalked "1.5 is dead" on Charles Darwin's tomb in Westminster Abbey. Activists paraded the four horsemen of the apocalypse — Famine, Pestilence, War, and Death — through the streets of Copenhagen at the 2009 climate summit. Extinction Rebellion's co-founder, Gail Bradbrook, stated explicitly:

You have to use biblical language to talk about what it means to be in a sixth-mass extinction event.

This is not protest. It is iconoclasm — the deliberate destruction of sacred objects to demonstrate the supremacy of the new faith over the old. The paintings are not targeted for their subject matter. They are targeted because they represent cultural inheritance, human achievement, civilisational continuity. The message is theological: your heritage is worthless before the coming catastrophe. Repent.

The historian David Starkey, godfather of English Restorationism and obvious favourite of The Restorationist, identified the pattern precisely:

We have had these throughout history. This time, we got rid of God, but not religion.

The philosopher Alain Badiou went further, calling the discourse of the rights of nature:

a contemporary form of the opium of the people — millenarian terror, concern for everything save the properly political destiny of peoples.

Lucy Biggers spent her twenties as one of America's loudest climate activists — interviewing the movement's child saint, Greta Thunberg, promoting the Green New Deal, pushing plastic bans. In her mid-thirties, she described waking up:

This climate movement — this black and white thinking that I was pushing — was not true, and the truth was much more complex.

She described the movement as providing:

a strong sense of belonging and purpose from being part of a group that she believed was fighting for the right side of history.

Strip the modern vocabulary and you have a textbook description of millenarian conversion: community, purpose, moral certainty, eschatological urgency.

The CenSAMM research centre at the University of Sheffield identifies what it calls "scientific millenarianism" — the point at which mathematical models and precise predictions function as prophecy, and "millenarianism now possesses bona fide scientific components." The critical observation is this: scientific models provide predictions for the future, but what will actually happen remains unknown and dependent on numerous factors. Millenarian belief fills the gap between prediction and future. The space between the IPCC's probability ranges and public certainty is not a scientific space. It is a theological one.

None of this means warming is fictional. What it means is the apparatus surrounding the science — the certainty, the urgency, the moral framework, the suppression of doubt, the punitive response to questioning — is not scientific apparatus. It is religious apparatus. And when religious apparatus drives civilisational policy, history tells us exactly what happens: tremendous damage inflicted with absolute conviction and zero accountability.

What follows is not denial of warming, either. It is something far more dangerous to the current orthodoxy: a demand for epistemic humility proportionate to what is being asked. When you cannot run controlled planetary experiments, when your predictions rest on parameterised guesses about cloud behaviour and ocean circulation, when your attribution methods depend on the very models you are trying to validate — you do not get to speak with the voice of Newtonian mechanics.

You get to be appropriately uncertain. And policy built on uncertainty should look radically different from what we are witnessing. If it's true, why do we have to believe it?

Science is not a democracy. It is not a consensus, despite how frequently the press refer to the "scientific consensus." Science is error reduction.

Karl Popper famously described his own awakening with remarkable precision, recalling his youthful encounter with Marxism and psychoanalysis:

These theories appeared to be able to explain practically everything that happened within the fields to which they referred. The study of any of them seemed to have the effect of an intellectual conversion or revelation, opening your eyes to a new truth hidden from those not yet initiated. Once your eyes were thus opened you saw confirming instances everywhere: the world was full of verifications of the theory. Whatever happened always confirmed it. Thus its truth appeared manifest; and unbelievers were clearly people who did not want to see the manifest truth; who refused to see it, either because it was against their class interest, or because of their repressions which were still ‘un-analysed’ and crying aloud for treatment.

The most characteristic element in this situation seemed to me the incessant stream of confirmations, of observations which ‘verified’ the theories in question; and this point was constantly emphasized by their adherents. A Marxist could not open a newspaper without finding on every page confirming evidence for his interpretation of history; not only in the news, but also in its presentation—which revealed the class bias of the paper—and especially of course in what the paper did not say. The Freudian analysts emphasized that their theories were constantly verified by their ‘clinical observations’.

150 Years of Thermometers Cannot Arbitrate 4.5 Billion Years of Climate

First, let's be clear about something simple. The Earth is a complex system-of-systems and does not have a single climate. It has five major climate zones under the Köppen classification: tropical, dry, temperate, continental, and polar.

Systematic temperature recording spans roughly 150 years. Earth is approximately 4.5 billion years old. Before instruments, we rely on proxies — ice cores, tree rings, sediment layers, coral growth. These are ingenious reconstructions, but they are not thermometers. They measure other phenomena from which temperature is inferred through models of how those phenomena respond to temperature. Ice core isotope ratios depend on assumptions about past precipitation. Tree ring widths depend on moisture, soil nutrients, and competition effects beyond temperature alone. Each proxy introduces uncertainty. Cross-validation reduces some error but cannot eliminate the fundamental problem: past climate reconstruction is inferential archaeology, not direct observation.
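To make the inferential chain concrete, here is a minimal sketch of how a proxy series might be calibrated against the short instrumental overlap and then inverted to "reconstruct" a pre-instrumental temperature. Every number in it is invented for illustration; the point is only that the reconstructed value inherits the regression scatter plus the untestable assumption that the calibration relationship held centuries ago.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Illustrative synthetic data (not real measurements) ---
# 150 years of "instrumental" temperature anomalies (C)
years = np.arange(1870, 2020)
temp = 0.008 * (years - 1870) + rng.normal(0, 0.15, years.size)

# A proxy (e.g. a ring-width index) assumed to respond linearly to temperature,
# plus non-temperature noise (moisture, nutrients, competition).
proxy = 1.0 + 0.6 * temp + rng.normal(0, 0.25, years.size)

# --- Calibration: regress temperature on the proxy over the overlap period ---
slope, intercept = np.polyfit(proxy, temp, 1)
residual_sd = np.std(temp - (slope * proxy + intercept))

# --- "Reconstruction": invert the calibration for a pre-instrumental proxy value ---
old_proxy_value = 1.1                      # hypothetical medieval ring-width index
reconstructed = slope * old_proxy_value + intercept

print(f"calibration slope: {slope:.2f} C per proxy unit")
print(f"reconstructed anomaly: {reconstructed:.2f} C "
      f"(+/- roughly {2 * residual_sd:.2f} C from calibration scatter alone)")
```

Real reconstructions use far more sophisticated statistics, but the structure is the same: temperature is not measured, it is inferred through a fitted relationship, and the stated error bars can only reflect the uncertainty the method can see.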

When someone declares current warming "unprecedented," they mean unprecedented within the reliable portion of the proxy record, given assumptions embedded in proxy interpretation. Consider the Medieval Warm Period: vineyards flourished in England, Norse settlers farmed Greenland, Alpine passes remained open year-round — all without industrial CO₂. The standard response claims medieval warmth was regional, not global. Perhaps. But the confident assertion of global coherence in current warming depends on the same proxy networks used to assess medieval patterns. If the proxies are reliable enough to declare modern warming globally synchronised, they are reliable enough to take the medieval evidence seriously.

Satellite measurements begin only in 1979. Before them, ocean temperatures relied on ship intake readings with known calibration drift and coverage gaps. Land stations moved locations, changed instruments, and were progressively surrounded by urban development. Adjustments correct these problems, but adjustments introduce their own interpretive choices. McKitrick and others demonstrated spatial correlation between temperature trends and socioeconomic development indicators, suggesting urban heat contamination remains incompletely corrected. The work is disputed — as frontier research should be. But the dispute itself confirms the question is unresolved, not settled.

Multiple independent lines of evidence do converge on warming. Ocean heat content shows persistent energy accumulation. Satellite troposphere temperatures confirm warming trends. Cryosphere indicators — glacier retreat, Arctic sea ice decline, earlier spring melt — move in consistent directions. This makes the "nothing is happening" position untenable for anyone engaging honestly.

But convergence on "warming exists" is not confidence in attribution magnitude, sensitivity estimates, or catastrophic projections. Those require additional inferential steps, each carrying its own uncertainty.

No Instrument Measures "Human Causation" — Attribution Is an Inference Stack, Not a Reading

Walk into any laboratory and point at a measuring instrument. A thermometer measures molecular kinetic energy. A voltmeter measures electrical potential. A mass spectrometer counts ions. Now ask: which instrument measures human causation of warming?

None. The quantity does not exist as a directly observable physical property.

Attribution proceeds through model-based inference. Scientists simulate Earth's climate under different forcing scenarios — natural variability alone, solar forcing, volcanic eruptions, greenhouse gases — and compare the fingerprints with observations. When greenhouse gas simulations match observations better than natural-only simulations, attribution is declared. The IPCC states it is "extremely likely" (95%+ confidence) that most observed warming since 1950 is human-caused.
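A toy version of that comparison, with entirely invented series, looks something like the sketch below: regress the observations onto a simulated anthropogenic fingerprint and a simulated natural one, and ask which combination explains them. Real detection-and-attribution studies use optimal fingerprinting over spatial patterns and large model ensembles; this shows only the shape of the inference.

```python
import numpy as np

rng = np.random.default_rng(1)
years = np.arange(1950, 2021)

# --- Invented "fingerprints" (illustrative only) ---
anthro = 0.015 * (years - 1950)                          # steady forced trend
natural = 0.1 * np.sin(2 * np.pi * (years - 1950) / 60)  # slow oscillation
observed = anthro + natural + rng.normal(0, 0.08, years.size)

# --- Regress observations onto the two candidate fingerprints ---
X = np.column_stack([anthro, natural])
coefs, *_ = np.linalg.lstsq(X, observed, rcond=None)

print(f"scaling on anthropogenic fingerprint: {coefs[0]:.2f}")
print(f"scaling on natural fingerprint:       {coefs[1]:.2f}")
# A scaling factor near 1 for the anthropogenic pattern, and a fit that fails
# without it, is what gets reported as "attribution".
```

Notice what the sketch makes explicit: the answer is only as good as the fingerprints fed in. If the model's natural variability is too tame, the regression has nowhere else to put the signal.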

The methodology is scientifically reasonable and the best available given we cannot experiment on planets. But it carries dependencies public communication systematically omits.

Attribution confidence depends on model skill at simulating natural variability. If models underestimate natural swings, they over-attribute observed changes to anthropogenic forcing — and estimating natural variability from 150 years of instruments is inherently insufficient to capture centennial or millennial oscillations.

Aerosol forcing remains highly uncertain. Industrial aerosols exert cooling effects, partially masking greenhouse warming. The IPCC's own assessments show wide uncertainty ranges — wider than for greenhouse gases. If aerosol cooling is weaker than assumed, greenhouse warming must be smaller to match observations. If stronger, warming must be larger. Attribution requires getting this balance right, yet the error bars are enormous.

Cloud feedbacks represent the single largest uncertainty in climate sensitivity. Clouds can cool by reflecting sunlight or warm by trapping infrared radiation. Which effect dominates as climate warms remains incompletely understood. Different models produce different cloud responses, yielding sensitivity estimates the IPCC assesses at 2.5–4.0°C for doubled CO₂. The range matters desperately: at 2.5°C, warming proceeds slowly enough for adaptation and technological development; at 4.0°C, impacts accelerate and tipping risks increase. Policy treating the high end as certain when the assessed range spans a factor of 1.6 is policy running ahead of knowledge.

Observational constraint studies using energy budget methods and historical data often produce lower sensitivity estimates, clustering closer to 2°C or below. Lewis and Curry's 2018 study found best estimates around 1.5–1.8°C. These estimates are disputed — other researchers argue the methods underestimate sensitivity. The existence of the argument is the point. When credentialed researchers using standard methods reach materially different conclusions, the science is contested, not settled.
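The energy-budget arithmetic behind those lower estimates fits in a few lines. The values below are illustrative placeholders of the kind this literature argues over (a standard forcing for doubled CO₂ near 3.7 W/m², plus assumed warming, forcing change, and heat uptake); they are not the published figures from any particular study.

```python
# Energy-budget estimate of equilibrium climate sensitivity (ECS):
#   ECS = F_2x * dT / (dF - dQ)
# where dT is observed warming, dF the change in radiative forcing,
# and dQ the planetary heat uptake. Values below are illustrative placeholders.

F_2x = 3.7    # W/m^2, forcing from doubling CO2 (standard approximation)
dT   = 0.9    # C, assumed observed warming between base and final periods
dF   = 2.5    # W/m^2, assumed forcing change (aerosols make this uncertain)
dQ   = 0.6    # W/m^2, assumed planetary heat uptake, mostly ocean

ecs = F_2x * dT / (dF - dQ)
print(f"Implied ECS: {ecs:.2f} C per CO2 doubling")

# Shrink the assumed aerosol cooling (raising dF) and the implied ECS falls;
# strengthen it and the implied ECS rises. That is why the aerosol error bars
# dominate the disagreement between these estimates and the models.
```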

When public communication compresses "extremely likely given these models and assumptions" into "definitively proven," epistemic disruption occurs. Not because scientists are lying, but because the translation strips away the conditionality required for honest uncertainty.

The Falsifiability Deficit: What Observation Would Change Your Mind?

Karl Popper's demarcation criterion, falsifiability, remains the sharpest tool for separating science from pseudoscience: a claim is scientific only if observations could, in principle, prove it wrong.

General relativity predicted light bending near massive objects. Evolution predicted fossil sequences showing nested hierarchies; J.B.S. Haldane's famous quip was that rabbits in the Precambrian would shatter the framework. These are falsifiable.

What observation would falsify "most recent warming is anthropogenic"? Temperatures stopping? Natural variability allows pauses. Temperatures falling? Volcanic eruptions could override greenhouse forcing. Models over-predicting? Single realisations can diverge from ensemble means.

The difficulty is not necessarily because climate science is pseudoscience. Complex system attribution involves probabilistic reasoning where no single observation definitively falsifies the core claim. But this creates a dangerous situation: if no observation can cleanly disconfirm the attribution claim, disconfirming evidence gets explained away whilst confirming evidence gets amplified. The inference becomes self-sealing — exactly the pattern Popper warned against.

NOAA now explicitly states we cannot confidently detect century-scale trends in Atlantic hurricane frequency after accounting for observational changes. Research by Vecchi and others argues apparent increases are consistent with improved detection, not climate trends. Satellite altimetry since 1993 shows sea level rising at roughly 3 mm per year — measurable, not catastrophic. Extreme projections of metres of rise by 2100 depend on assumptions about rapid ice sheet collapse that remain speculative.
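The sea level arithmetic is worth doing explicitly. A deliberately crude sketch, taking the roughly 3 mm per year altimetry rate at face value and ignoring acceleration (whose magnitude is itself debated):

```python
# Straight-line extrapolation of the satellite-era rate, a deliberate
# simplification that ignores any acceleration, used only to show the gap
# between the measured trend and metre-scale projections.

rate_mm_per_year = 3.0          # approximate satellite altimetry trend since 1993
years_to_2100 = 2100 - 2023

linear_rise_cm = rate_mm_per_year * years_to_2100 / 10
print(f"Linear extrapolation to 2100: about {linear_rise_cm:.0f} cm")

# Reaching, say, 2 metres by 2100 would require an average rate of:
required_rate = 2000 / years_to_2100     # mm per year
print(f"Average rate needed for 2 m by 2100: about {required_rate:.0f} mm/yr")
```

Metre-scale outcomes therefore require the observed rate to increase several-fold, which is exactly the speculative ice-sheet behaviour those projections assume.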

Yet public communication routinely invokes worst-case scenarios as central estimates. Treating extrapolations as foregone conclusions prevents the empirical updating science requires. If observations consistently undershoot projections, humility demands revising downward. Climate science does this internally — IPCC assessments evolve, sensitivity estimates narrow. But the policy and advocacy layers do not track revisions. Once a catastrophic claim enters public discussion, it persists regardless. Failed predictions vanish. New deadlines replace old ones. The ratchet only tightens.

Complex Systems Resist Prophecy — And Planets Are the Most Complex Systems We Know

Economics cannot reliably forecast recessions six months ahead. Epidemiology failed to predict COVID-19 variant trajectories. Meteorology cannot confidently predict weather beyond ten days. All are complex nonlinear systems. Climate belongs in this category.

Climate models discretise continuous systems onto computational grids roughly 100 kilometres across. Processes smaller than grid resolution — convection, cloud formation, boundary layer turbulence — are parameterised using simplified rules tuned to match observations. These are educated approximations, not first-principles physics. Different modelling groups make different choices, yielding different results. For identical forcing scenarios, projected global temperatures can span more than a degree by 2100. Regional precipitation projections show even wider disagreement, with models unable to agree on the sign of change in some areas.

Ensemble averaging smooths this spread, creating an illusion of precision. But averaging does not reduce structural uncertainty — it hides it. If all models share common bias from similar parameterisations, the ensemble mean inherits the bias.
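The point about averaging is easy to demonstrate with invented numbers. In the sketch below every "model" shares the same structural bias; the ensemble mean looks reassuringly tight, and it is wrong by precisely the shared assumption.

```python
import numpy as np

rng = np.random.default_rng(2)

true_warming = 2.0          # hypothetical "real" value we are trying to project
shared_bias = 0.8           # bias common to all models (e.g. a shared cloud scheme)
n_models = 30

# Each model = truth + common bias + its own idiosyncratic error
projections = true_warming + shared_bias + rng.normal(0, 0.4, n_models)

ensemble_mean = projections.mean()
ensemble_spread = projections.std()

print(f"ensemble mean:   {ensemble_mean:.2f} C")
print(f"ensemble spread: {ensemble_spread:.2f} C  (looks reassuringly small)")
print(f"actual error:    {ensemble_mean - true_warming:.2f} C  (the shared bias survives averaging)")
```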

Models initialised in the 1980s produce temperature trends broadly consistent with observations — but "broadly consistent" is not "precisely accurate." Models ran warm during the 1998–2012 hiatus. Explanations included natural variability, underestimated aerosol forcing, or stronger ocean heat uptake. All plausible. Each revealing incomplete initial understanding. This is normal science — but it demonstrates models are works in progress, not oracles.

Directional warming from greenhouse gases is high confidence. Magnitude and timing are medium confidence. Regional impacts are low confidence. Yet policy treats all three as equally certain, optimising for catastrophic scenarios whilst dismissing moderate outcomes as unworthy of preparation. If policy genuinely "followed the science," it would preserve optionality across the range of plausible futures the science describes.

The Replication Crisis That Climate Science Pretends Cannot Touch It

Psychology spent the 2010s watching famous studies collapse under replication attempts. Economics found behavioural findings irreproducible. Biomedical research discovered preclinical landmarks rarely translated clinically. Publication bias, p-hacking, and citation cartels created literatures filled with false positives.

Climate science insists on immunity. The field is supposedly grounded in physics. Measurements are supposedly objective. Models supposedly solve fundamental equations. This is partly true — radiative transfer is not vulnerable to p-hacking. But climate science is not purely physics. It spans geophysics, oceanography, ecology, economics, and increasingly, advocacy. As the field expanded into policy relevance, it absorbed practices from adjacent disciplines.

Boundary erosion created authority confusion. Cogent Social Sciences accepted a hoax paper arguing the penis is a conceptual construct responsible for climate change. Progress in Human Geography published a paper claiming glaciers are gendered constructs requiring feminist analysis. Neither is glaciology. Neither is physics. Both are peer-reviewed, citation-accumulating, prestige-inheriting. When policymakers cite "climate science literature," are they citing radiative forcing calculations or discourse analysis of glacier metaphors?

Model code is often proprietary or poorly documented. Datasets require institutional access. Statistical methods are described in papers but rarely made reproducible through shared code. When outsiders attempt replication, they encounter barriers — not conspiracy, but friction preventing the adversarial testing necessary for error detection.

Climategate revealed the culture plainly. Researchers discussed declining to share data with critics, adjusting series to "hide the decline" in tree ring proxies, and coordinating to exclude dissenting papers from IPCC reports. Independent investigations concluded no scientific misconduct — the phrase referred to a documented divergence problem. But the emails revealed gatekeeping, data resistance, and editorial pressure antithetical to open science.

Steve McIntyre attempted to replicate Michael Mann's hockey stick analysis and found errors in principal component analysis creating artificial hockey stick shapes from random data. The National Research Council confirmed McIntyre's statistical criticisms were valid. The institutional response was not engagement but exclusion. Data requests were denied. Critics were marginalised. This is not how robust knowledge is built. This is how paradigms protect themselves from challenge.

Catastrophe Gets Funded and Moderation Gets Buried

Research funding is competitive. Proposals framing work as addressing existential risk are more fundable than proposals investigating moderate outcomes — not through conspiracy, but through normal institutional prioritisation. Alarming research gets funded, produces papers, receives media attention, increases public concern, and justifies more funding. Moderate findings break the loop. Research concluding warming is slower than expected does not generate headlines.

Researchers respond rationally. Frame work as addressing catastrophe and funding arrives. Frame it as reducing alarm and funding evaporates. Career advancement depends on publication volume, citation counts, and grant success — all correlating with urgency framing.

A study finding stronger hurricane intensification is more publishable than one finding no trend. Research detecting tipping point proximity is more newsworthy than research extending the timeline. Over time, the literature accumulates more alarming findings than moderate ones — not necessarily because reality is alarming, but because alarming findings are more likely published and cited.

IPCC lead author selection favours researchers whose records support urgency. Dissenting views receive token space in chapter caveats whilst executive summaries emphasise consensus. Funding concentrates in established institutions. The result is epistemic monoculture — one modelling approach standard, one sensitivity range accepted, one policy conclusion foregone.

Monocultures are fragile. In agriculture, uniform crops are vulnerable to disease. In finance, correlated strategies amplify systemic risk. In knowledge production, uniform frameworks miss errors alternative approaches would catch. When models produce low sensitivity, the question becomes "what's wrong with this model?" rather than "what if sensitivity is actually low?" The asymmetry is subtle, pervasive, and structurally biased toward high-end estimates.

The Graveyard of Confident Predictions

In 1970, ecologist Kenneth Watt predicted Earth would be 11 degrees colder by 2000. In 1988, the UN warned governments had ten years before rising seas obliterated nations. In 2004, The Guardian reported a Pentagon study warning Britain would face "Siberian" climate by 2020. In 2008, NASA's James Hansen testified Arctic summer sea ice would disappear within five to ten years. In 2018, media cited UN statements claiming 12 years remained to prevent catastrophe.

None occurred.

Every failed prediction came from credentialed experts in major media. Every one shaped public perception. And every one has been quietly memory-holed, replaced by new deadlines carrying identical certainty.

The standard defence separates scientific projections from media exaggeration. Partly true — but the line dissolves when scientists stand beside politicians declaring existential deadlines, publish papers with apocalyptic titles, and issue press releases emphasising worst-case scenarios.

The pattern is systematic and unidirectional. Predictions consistently overshoot. Ice decline is slower than projected. Temperature rise tracks the lower end of ranges. Extreme weather attribution is weaker than headlines suggest. If errors were random, some predictions would undershoot. They rarely do. Systematic bias in one direction indicates something broken in the prediction machinery — institutional and psychological pressures tilting framing toward the fundable, the newsworthy, the urgent.

The proper response is recalibration: examine what went wrong, adjust models, revise confidence, communicate uncertainty honestly.

Instead, failed predictions vanish and new ones arrive with identical certainty. At some point, credibility must be reckoned with. If economic forecasters consistently predicted recessions never arriving, their models would be revised and their authority questioned. Climate predictions deserve identical scrutiny. The fact they do not receive it reveals climate has become insulated from normal accountability standards — and the insulation is political, not scientific.

Germany's €100 Billion Physics Lesson and Britain's £7 Billion Biomass Fraud

On 15 April 2023, Germany permanently shut down its final three functioning, safe, zero-emission nuclear reactors. The Energiewende treated renewables and nuclear as competitors rather than complements. Germany became a net electricity importer. Power prices remained elevated. Natural gas consumption increased.

Research found postponing the nuclear phase-out would have reduced wholesale electricity prices by approximately €9 per megawatt-hour whilst cutting both gas generation and carbon emissions.

The contradiction is absolute. Climate emergency supposedly requires maximising emissions reductions. Nuclear provides zero-emission baseload running continuously regardless of weather on minimal land area. Germany eliminated it whilst increasing gas dependence — because nuclear fails the ideological purity test. Emotional associations with weapons, waste, and Chernobyl override the physics of passive-safe modern reactor designs, manageable waste volumes, and the fundamental irrelevance of Soviet-era graphite-moderated reactor failures to Western light-water technology.

The result: Germany burns lignite whilst importing electricity from French nuclear and Polish coal. Emissions crossed borders. German consumers absorbed the cost. Doctrine overpowered thermodynamics.

Britain follows a parallel path. Drax Power Station in North Yorkshire — once Britain's largest coal plant — is now Britain's largest "renewable" facility. It burns millions of tonnes of wood pellets shipped across the Atlantic from the southern United States. The government has paid it over £7 billion in public subsidy.

The logic collapses on contact with reality. Regrowth takes 30–50 years. Carbon neutrality across half a century does not help if critical thresholds are crossed in the interim. Investigative reporting by The Times and the BBC documented supply chains including whole trees from ecologically valuable southern forests, not sawmill residue. The supply chain — felling, transport, pelletising, transatlantic shipping — is energy-intensive and often fossil-derived. Lifecycle analysis finds biomass frequently worse than coal per unit of energy when accounting for supply chain emissions and carbon debt.
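A back-of-the-envelope carbon-debt calculation shows why the timing matters. Every figure below is an invented placeholder rather than Drax's actual accounts; the structure is the point: the CO₂ leaves the stack today, while the regrowth credit arrives over decades.

```python
# Toy carbon-debt model for biomass: emissions now, regrowth credit spread
# over decades. Every number is an illustrative placeholder.

stack_emissions = 100.0        # units of CO2 released per harvest, at combustion
supply_chain = 15.0            # felling, pelletising, shipping (largely fossil-powered)
regrowth_years = 40            # assumed time for the forest to recapture the carbon

total_debt = stack_emissions + supply_chain
annual_recapture = stack_emissions / regrowth_years   # only the biogenic part regrows

for year in (10, 20, 30, 40):
    outstanding = total_debt - annual_recapture * year
    print(f"year {year:2d}: carbon still in the atmosphere = {outstanding:.0f} units")
# Even on these friendly assumptions the books do not balance for decades,
# and the supply-chain emissions are never paid back at all.
```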

Mississippi communities adjacent to pellet facilities report severe particulate pollution, noise, and respiratory disease. The Times described one town as a "sacrifice zone" — suffering environmental harms so Britain can claim renewable energy credits. Ofgem investigations found governance weaknesses and Drax paid compensation. Internal documents revealed in court cases suggest company officials harboured private doubts about sustainability claims.

Seven billion pounds invested in nuclear would have produced genuinely clean, firm power for 60 years. Instead, it subsidised transatlantic wood shipments to meet arbitrary targets whilst actual emissions barely budged. Policy optimised for appearance. Reality delivered failure dressed in green.

Paying £1 Billion for Electricity Nobody Receives

British consumers paid approximately £1 billion in 2023 for power generated but never used. Scottish wind farms produced electricity the grid could not transmit south. The system paid generators to switch off, then paid gas plants elsewhere to fill the gap. Both payments landed in consumer bills.

The Scotland-to-England transmission bottleneck has been known for years. Wind capacity expanded rapidly. Grid upgrades lagged. The predictable result: generation regularly exceeds transmission capacity. Carbon Tracker estimated curtailed wind in 2023 could have powered roughly one million homes — produced, paid for, and thrown away.
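The scale is easy to sanity-check with round numbers. The sketch below assumes average household electricity consumption of roughly 2,700 kWh a year and takes the £1 billion and one-million-homes figures above at face value; it is an order-of-magnitude check, not an audited calculation.

```python
# Rough sanity check on the curtailment numbers quoted above.
# Assumptions (not official figures): ~2,700 kWh/year per household, and the
# quoted ~1 billion pounds of combined constraint and replacement costs.

cost_gbp = 1_000_000_000
homes_equivalent = 1_000_000
household_kwh = 2_700                      # assumed annual consumption per home

curtailed_twh = homes_equivalent * household_kwh / 1e9
cost_per_mwh = cost_gbp / (curtailed_twh * 1e6)

print(f"Energy discarded: roughly {curtailed_twh:.1f} TWh")
print(f"Implied cost of the dysfunction: about {cost_per_mwh:.0f} pounds per MWh discarded")
# Wholesale power typically trades well below that figure, so consumers are
# paying several times the market price for electricity nobody receives.
```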

The perversity compounds. Consumers fund generation serving no purpose whilst simultaneously funding fossil generation to replace it. More renewable capacity without corresponding transmission creates more curtailment, more balancing costs, and more fossil backup. Policy celebrates each new wind farm announcement whilst ignoring the system dysfunction they create. Renewable capacity statistics increase. Headlines proclaim progress. Bills increase. Emissions barely decline because gas plants run more frequently for backup and balancing.

The fix is obvious: build transmission, match generation deployment to grid capability. It is not happening at necessary pace. Planning delays, local opposition, and regulatory friction slow transmission whilst subsidies make even curtailment-heavy wind sites profitable. One billion pounds per year in deadweight loss producing no social value — whilst lower-income households, spending higher proportions of income on energy, suffer disproportionately.

ULEZ: Surveillance Enforcement of Regressive Taxation Disguised as Environmentalism

London's absurd Ultra Low Emission Zone expanded in August 2023 across all boroughs, creating a camera-enforced perimeter where older vehicles face £12.50 daily charges. Air quality improvements are measurable — nitrogen oxide and particulate concentrations apparently fell. The policy works on its own terms.

But those terms are regressive and authoritarian.

A low-income household owning a 12-year-old diesel faces £8,000–£12,000 replacement costs. Scrappage grants offer perhaps £2,000. The household must find £6,000–£10,000 or pay £12.50 daily — exceeding £3,000 annually, approaching £10,000 over three years. The affluent replace vehicles trivially. The charge slides off those with resources whilst crushing those without.
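The regressivity is simple arithmetic. In the sketch below the driving pattern and vehicle prices are assumptions chosen for illustration, not survey data:

```python
# ULEZ cost comparison for a non-compliant vehicle. Assumed values, for
# illustration only: a commuter driving ~250 days a year inside the zone.

daily_charge = 12.50
driving_days_per_year = 250          # assumed: a weekday commuter

annual_charge = daily_charge * driving_days_per_year
three_year_charge = annual_charge * 3

replacement_cost = 8_000             # assumed mid-range compliant replacement
scrappage_grant = 2_000

print(f"Annual charges:        ~{annual_charge:,.0f} pounds")
print(f"Over three years:      ~{three_year_charge:,.0f} pounds")
print(f"Net replacement cost:  ~{replacement_cost - scrappage_grant:,.0f} pounds")
# Either way the household is several thousand pounds worse off; the only
# question is whether it pays up front or by instalments through the cameras.
```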

The pointless Local Government Ombudsman criticised TfL for mishandling scrappage scheme changes, leaving applicants blindsided by modified eligibility. Administrative incompetence compounding policy regressivity.

The enforcement mechanism is comprehensive surveillance. Automatic number plate recognition cameras record every vehicle entry. Algorithms match plates to compliance databases. No discretion. No appeals for temporary hardship. A parent rushing a sick child to hospital in a non-compliant car receives the same charge as a luxury SUV deliberately avoiding compliance.

Once the infrastructure exists, expansion is trivial. Add zones. Increase fees. Extend hours. Capture more data. Each step is small. Cumulatively, they build a panopticon. The justification is always environment. The implementation is always surveillance. The costs are always regressive. And the people bearing the costs are told they are saving the planet.

Don't Look Up; Just Believe, Or Else

Parliamentary democracies legislate based on evidence. When evidence changes, laws change. When policies fail, they are revised.

Climate policy operates differently.

Net Zero targets are legislated as binding commitments. No observation triggers reversal. If temperatures rise less than projected, Net Zero continues. If economic costs exceed benefits, Net Zero continues. If technological alternatives emerge making mandates obsolete, Net Zero continues. The policy is legislatively immune to evidence — moral commitment dressed as empirical policy.

Democratic governance requires accountability. Accountability requires revising failed policies. Climate activists celebrate binding targets precisely because they prevent future governments from weakening commitments — but democratic volatility is not a bug. It is how error correction works. Voters remove governments implementing bad policy. New governments try different approaches. If voters cannot choose to abandon Net Zero, they are not sovereign. The targets are.

Emergency is the classic justification for suspending normal governance. Every authoritarian expansion claims necessity. Climate emergency is presented as unique — global, existential, irreversible — therefore normal democratic constraints do not apply. This is how democracies die: not through coup, but through emergency governance never ending. Each crisis justifies new powers. Each power becomes permanent.

Once governments can legislate irreversible policies based on expert projections, there is no limiting principle. Pandemic policy becomes permanent biosecurity. Financial stability becomes permanent market intervention. Climate is the leading edge of technocracy displacing democracy.

The demand is simple: proportionality between certainty and coercion.

If climate science is 95% confident in attribution, policy should be 95% reversible. If projections span a factor of two, policy should accommodate both ends. If models disagree on regional impacts, mandates should not assume consensus where none exists.

Build nuclear because it works when wind stops and models err. Maintain gas as insurance against projection failure. Stop subsidising accounting fraud labelled renewable. Make deployment contingent on grid capability, not arbitrary targets. Restore falsifiability — if climate policy cannot specify observations that would invalidate it, it is not science-informed policy. It is faith enforced by law.

The models may be right. The science may be sound. But certainty has been inflated, urgency weaponised, and compulsion normalised. When pensioners freeze whilst politicians claim moral victory, when billions subsidise transatlantic wood-burning, when surveillance cameras enforce regressive charges — doctrine has replaced science and democracy has been suspended by technocrats who will face no penalty when the public pays the price.

The millenarians were always wrong. The Montanists. The Flagellants. The Münsterites. The Millerites. Every one was absolutely certain. Every one demanded total submission. Every one punished doubt. And every one left destruction in proportion to their conviction.

Demand the evidence. Demand the uncertainty ranges. Demand the falsification criteria. Demand epistemic humility before civilisational restructuring.

Climate policy has become climate doctrine. Doctrine must be challenged. Democracy demands it. Science requires it. And the people bearing the costs deserve nothing less.

If it's true, why are we required to believe it, or else?
