Inventing the Future, Building Nothing: Britain's Post-Silicon Paralysis
The UK is yet again falling behind in the technology race, left tending clunky silicon machines that consume energy by the megawatt. The next generation of AI computing abandons the old physics altogether, harnessing light and thermodynamics to break through the von Neumann bottleneck.
Somewhere in a data centre outside London, a machine is burning through enough electricity to power a small town. It's training an artificial intelligence model—the sort everyone now uses to write emails, generate images, or answer questions about medieval crop rotation. The machine costs several million pounds. It produces enough waste heat to warm several dozen homes. And it represents technology already obsolete.
We are living through the death throes of the computer as we know it.
For seventy years, computing advanced by making transistors smaller. Smaller transistors meant faster chips. Faster chips meant better computers. This inexorable march—Moore's Law—turned room-sized calculating machines into pocket supercomputers. But physics has other ideas. You cannot make silicon transistors much smaller than they already are. The laws of quantum mechanics begin interfering. Electrons tunnel through barriers they should not cross. Heat builds up faster than you can remove it.
The silicon chip, the foundation of everything from your smartphone to ChatGPT, has hit a wall.
This would merely be inconvenient if artificial intelligence had not simultaneously become the most important technology since the steam engine. Training a modern AI model requires trillions of trillions of calculations, drawing megawatts of power continuously for months, the consumption of a small industrial facility. The largest AI companies are building power stations to feed their computers. Microsoft has signed a deal to restart a nuclear reactor at Three Mile Island. This is not sustainable. More importantly, it is not scalable.
And whilst Britain wrings its hands about whether to regulate AI chatbots, our competitors are rebuilding the computer itself.
Computing with Heat: The Physics of Thought
Here is where things become strange.
Imagine a marble rolling around a landscape of hills and valleys. Release it anywhere, and it will eventually settle in the lowest valley it can reach. Now imagine the landscape represents a problem you want to solve—finding the best route between cities, or the optimal way to fold a protein, or the right weights for a neural network. The valleys represent solutions. The marble will find one.
This is thermodynamic computing. Instead of processing information through billions of on-off switches, you build a physical system whose natural behaviour solves your problem. You let physics do the work.
Traditional computers operate through brute force. They try solution after solution after solution, checking each one methodically. A thermodynamic computer simply relaxes into an answer, the way a marble rolls downhill or heat spreads through a room. The process uses a fraction of the energy. In theory, it approaches the absolute minimum the laws of thermodynamics allow: the Landauer limit, named after Rolf Landauer, the IBM physicist who in 1961 calculated the minimum energy needed to erase a single bit of information.
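To see the marble at work, consider a minimal sketch in Python. The landscape, step size, and noise level below are invented for illustration, not a model of any real device; the loop performs the noisy downhill roll (Langevin dynamics) that a thermodynamic chip would carry out physically, and the final lines compute the Landauer limit for comparison.

```python
import math
import random

def energy(x):
    # A toy landscape with two valleys; the deeper one sits near x = 2.2.
    return (x**2 - 4)**2 / 8 - x

def grad(x, h=1e-5):
    # Numerical slope of the landscape at x.
    return (energy(x + h) - energy(x - h)) / (2 * h)

def relax(x, steps=5000, step_size=0.01, temperature=0.5):
    # Overdamped Langevin dynamics: roll downhill, jostled by thermal noise.
    # A thermodynamic computer does this physically rather than step by step.
    for _ in range(steps):
        kick = random.gauss(0, math.sqrt(2 * temperature * step_size))
        x = x - step_size * grad(x) + kick
    return x

random.seed(1)
print(f"marble settled near x = {relax(x=-3.0):.2f}")  # typically the deep valley

# Landauer limit: least energy needed to erase one bit at room temperature.
k_B = 1.380649e-23  # Boltzmann constant, joules per kelvin
print(f"Landauer limit at 300 K: {k_B * 300 * math.log(2):.2e} joules per bit")
```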
Most computing today wastes energy fighting against physics. Thermodynamic computing puts physics to work. Among the most prominent companies in the field is the American startup Extropic.
The concept sounds like science fiction until you realise it has already been built. Companies in Japan, Canada, and the United States have demonstrated working thermodynamic processors. They use "Ising machines", devices based on a 1920s model of magnetism, to solve certain optimisation problems thousands of times more efficiently than conventional chips. Other researchers have built probabilistic computers from "p-bits", classical devices that fluctuate randomly between states at room temperature, or from networks of artificial neurons running on thermal noise.
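Reduced to a toy, an Ising machine looks like the sketch below: three coupled spins, each updated as a p-bit that flips probabilistically, biased towards lowering the overall energy while the temperature is gradually reduced. The couplings and cooling schedule are invented for illustration.

```python
import math
import random

# Couplings for a toy three-spin Ising problem (illustrative values).
# Positive coupling rewards agreement, so the lowest-energy states are all-equal.
J = {(0, 1): 1.0, (1, 2): 1.0, (0, 2): 1.0}

def ising_energy(s):
    return -sum(Jij * s[i] * s[j] for (i, j), Jij in J.items())

def local_field(s, i):
    # Net pull on spin i from every spin coupled to it.
    return sum(Jij * (s[b] if a == i else s[a])
               for (a, b), Jij in J.items() if i in (a, b))

def anneal(n=3, steps=300):
    s = [random.choice([-1, 1]) for _ in range(n)]
    for t in range(steps):
        T = max(0.05, 2.0 * (1 - t / steps))  # cool the system gradually
        i = random.randrange(n)
        # p-bit update: probability of spin-up is a sigmoid in the local field.
        p_up = 1 / (1 + math.exp(-2 * local_field(s, i) / T))
        s[i] = 1 if random.random() < p_up else -1
    return s, ising_energy(s)

random.seed(0)
print(anneal())  # typically ([1, 1, 1], -3.0) or ([-1, -1, -1], -3.0)
```

Production devices do the same thing with thousands of coupled elements, all updating in parallel in hardware.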
These are not laboratory curiosities. They are prototypes of production systems.
And Britain? We have published some excellent papers.
Light-Speed Computation: When Photons Replace Electrons
What if instead of running electricity through silicon, you performed calculations with light?
Photons are massless. They generate almost no heat. They travel at—well, the speed of light. You can send multiple wavelengths down the same optical fibre simultaneously, each carrying different information, giving you massive parallelism for free. Most importantly, light excels at exactly the operations AI requires most: matrix multiplication.
Every time an AI model processes an image, translates a sentence, or generates text, it performs enormous numbers of matrix operations—multiplying grids of numbers together. This is what graphics cards spend most of their time doing. It is also what light does naturally when it passes through certain optical components.
Photonic computing uses waveguides, beam splitters, and interferometers—the optical equivalent of wires, transistors, and gates—to build processors out of light. The calculations happen at the speed of light, using almost no energy, generating virtually no heat. Researchers have built prototype optical neural networks that perform inference—running a trained AI model—at a fraction of the energy cost of a GPU.
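The principle can be sketched numerically. In the idealised, lossless model below (the parameter names and values are invented for illustration), a single Mach-Zehnder interferometer acts on two optical amplitudes as a 2×2 unitary matrix; meshes of such devices compose into the large matrices neural networks require.

```python
import numpy as np

def mzi(theta, phi):
    # An idealised Mach-Zehnder interferometer as a 2x2 unitary matrix:
    # theta sets how light splits between the output arms, phi a relative phase.
    return np.array([
        [np.exp(1j * phi) * np.cos(theta), -np.sin(theta)],
        [np.exp(1j * phi) * np.sin(theta),  np.cos(theta)],
    ])

# Light entering the chip: complex amplitudes in two waveguides.
x = np.array([1.0 + 0j, 0.5 + 0j])

U = mzi(theta=np.pi / 4, phi=np.pi / 3)
y = U @ x  # the "multiplication" is simply light propagating through the device

print(np.round(y, 3))
print("power conserved:", bool(np.isclose(np.linalg.norm(y), np.linalg.norm(x))))
```

Because the device is unitary, no optical power is lost in the ideal case: the energy cost of the multiplication is, in principle, only that of generating and detecting the light.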
Think of it this way: a conventional computer chip is like a city where millions of cars (electrons) must drive along roads (wires) to deliver packages (information). Traffic jams occur. Accidents happen. Fuel is wasted. A photonic chip is more like a perfectly coordinated system of lasers and mirrors where information travels unimpeded at maximum speed, consuming almost no energy in transit.
The implications are staggering. Companies like Lightmatter and Celestial AI in the United States are building optical accelerators expected to reduce AI inference energy consumption by orders of magnitude. MIT researchers have demonstrated optical tensor processors. Israeli and Chinese labs have built free-space optical computers where light simply passes through a series of trained transparent masks and emerges having performed a complex calculation.
Zero moving parts. Almost zero energy. Near-instant results.
Britain's contribution to this revolution? We have some very good university research groups.
Two Revolutions, One Indifference
Thermodynamic and photonic computing are not competing technologies. They solve different problems and will likely coexist, possibly in the same systems. Thermodynamic processors excel at optimisation and sampling—finding good solutions to hard problems. Photonic processors excel at the matrix operations underpinning modern AI. Together, they could replace the GPU entirely.
This is not a distant future. The first commercial systems are entering data centres now.
American companies are raising billions. Chinese state investment in photonic computing runs to tens of billions. The European Union has launched research programmes. And Britain—home to Alan Turing, Tim Berners-Lee, and ARM Holdings—is watching from the sidelines.
We are exceptionally good at inventing things and catastrophically poor at building them. We invented the computer and let America build the computing industry. We invented the web and let Silicon Valley monetise it. We led the world in artificial intelligence research in the 1970s, then defunded it. Now we risk doing the same with the technologies poised to succeed silicon computing.
The situation is not entirely bleak. Britain retains world-class research groups in photonics at Southampton, Cambridge, and Oxford. We have expertise in quantum technologies through the National Quantum Technologies Programme. ARM, though Japanese-owned, remains British in culture and talent. We have the intellectual foundation.
What we lack is the will to build.
The Energy Question, or: How to Power an Empire of Thought
Consider the scale of the problem. Training GPT-4 is estimated to have consumed tens of gigawatt-hours of electricity. The energy cost of running all AI services worldwide already rivals that of small countries. If current trends continue, AI could consume several percent of global electricity generation by 2030.
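Where do such figures come from? A back-of-the-envelope calculation shows the shape of the problem. Every number below is an assumption, since the true counts for frontier models are not public, but the arithmetic explains why estimates land in the tens of gigawatt-hours.

```python
# Rough training-energy estimate; every figure is an assumed, illustrative value.
gpus = 25_000        # accelerators devoted to one training run
watts_per_gpu = 400  # average draw per accelerator under load
pue = 1.2            # data-centre overhead: cooling, networking, power conversion
days = 90            # length of the training run

energy_wh = gpus * watts_per_gpu * pue * days * 24
print(f"{energy_wh / 1e9:.1f} GWh")  # ≈ 26 GWh, roughly a small town's annual electricity
```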
This is untenable.
Thermodynamic and photonic systems offer a way out. Optical operations could cut energy consumption per calculation by a factor of a thousand. Thermodynamic processors could approach theoretical efficiency limits. Together, they might let us build AI systems orders of magnitude more powerful than today's whilst using less total energy.
This is not merely an environmental concern. It is strategic. The countries controlling energy-efficient AI infrastructure will dominate the twenty-first century the way oil producers dominated the twentieth. Except this time, the advantage goes not to those with resources in the ground but to those with fabrication capacity and expertise.
China understands this. So does the United States. Both are investing heavily in post-silicon computing. The Americans through DARPA programmes and private venture capital. The Chinese through direct state investment and integration with national industrial policy.
Britain is writing policy papers.
Why Light and Heat Are Not Yet Commodities
Precision remains a problem. Optical systems struggle to match the accuracy of digital electronics. Floating-point arithmetic—the standard way computers handle decimal numbers—does not translate cleanly to light. Thermodynamic systems can be unpredictable; small manufacturing variations change their behaviour. Both technologies require new design tools, new programming languages, new ways of thinking about computation.
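The gap is easy to demonstrate. The sketch below imitates an analogue accelerator by rounding the inputs of a matrix multiplication to roughly 8-bit precision, a representative figure for optical hardware; the matrix size and quantisation scheme are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((256, 256))
x = rng.standard_normal(256)

def quantise(v, bits=8):
    # Snap values onto 2**bits evenly spaced levels, as an analogue device might.
    scale = np.abs(v).max() / (2 ** (bits - 1) - 1)
    return np.round(v / scale) * scale

exact = A @ x
approx = quantise(A) @ quantise(x)  # inputs limited to ~8-bit precision

rel_error = np.linalg.norm(approx - exact) / np.linalg.norm(exact)
print(f"relative error at 8 bits: {rel_error:.2%}")  # versus ~1e-14 % for float64
```

An error of a fraction of a percent is tolerable for inference, but without correction it compounds across the millions of sequential operations that training requires.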
Memory is another bottleneck. Storing large AI models optically is still unsolved. You can perform calculations with light brilliantly, but you must convert results back to electronics to store them. This limits overall system performance.
Manufacturing is hard. Silicon photonics—integrating optical components onto chips—is promising but not yet commodity technology. The fabrication processes are complex and expensive. Thermodynamic devices have no standardised production methods at all.
Software ecosystems take time to mature. CUDA—the programming platform making GPUs useful for AI—took a decade to develop. Neither thermodynamic nor photonic computing has equivalent tools yet. No deep learning frameworks. No optimised libraries. No vast community of developers.
These are real problems. They are also solvable problems, given sufficient investment and expertise.
What We Stand to Lose
Imagine Britain in 2035.
The global AI infrastructure runs on photonic processors manufactured in Taiwan, South Korea, and Arizona. Chinese thermodynamic optimisers power logistics, drug discovery, and materials science. American companies control the software platforms. European data centres import hardware. Britain, having failed to invest, buys access at the pleasure of others.
Our AI systems run on foreign hardware, subject to foreign export controls, dependent on foreign supply chains. Our research institutions are world-class but lack commercial pathways. Our engineers emigrate to California and Shenzhen. We have become a client state in the technology defining the century.
This is not inevitable. But it is the current trajectory.
The window is narrowing. First-mover advantages in hardware compound. Manufacturing expertise takes years to develop. Supply chains are relationship-based and sticky. The companies and countries moving now will set standards, accumulate patents, and capture markets. Those who wait will pay rent.
Britain has perhaps five years to decide whether we wish to be relevant.
A Path Forward, If We Choose It
What would a serious British effort look like?
First, consolidate research. We have excellent groups working in isolation. Establish national programmes linking photonic research at Southampton with quantum expertise at Oxford and AI capabilities at Cambridge and DeepMind. Create clear pipelines from laboratory to fabrication.
Second, invest in manufacturing. Partner with ARM and establish a British silicon photonics foundry. Subsidise initial production runs. Accept that first-generation devices will be expensive and imperfect. This is how industries are born.
Third, train people. We need engineers who understand both photonics and machine learning, physicists who can design thermodynamic computers, software developers who can program them. Update university curricula. Fund PhD programmes. Make Britain the place to learn these technologies.
Fourth, procure strategically. Government purchases can seed markets. Commit to buying British photonic accelerators for NHS AI systems, defence applications, research computing. Accept slightly higher initial costs to establish domestic capability.
Fifth, think longer term than the next election. These technologies will not mature in five years. They require sustained investment over a decade or more. Establish independent funding mechanisms insulated from political cycles.
None of this requires technological miracles. It requires will.
The Stakes of the Twenty-First Century
We are living through a transition as fundamental as the shift from steam to electricity or from analogue to digital. The computer as we know it—the silicon chip, the GPU, the von Neumann architecture—is obsolete. The question is not whether it will be replaced but who will build the replacement.
Thermodynamic and photonic computing will power the AI systems controlling logistics, designing drugs, optimising cities, and running economies. They will determine which nations lead and which follow. They will shape the balance of power for the rest of the century.
Britain can be at the forefront. We have the research base. We have the talent. We have the industrial heritage. What we lack is the conviction to move from papers to products, from prototypes to production.
The choice is simple. We can build the future or buy it from someone else.
One suspects our grandchildren will not forgive us if we choose wrongly.