Nuclear Power for AI Data Centres…

According to the World Nuclear Industry Status Report, whose data can be explored visually here, there are 407 operational nuclear reactors currently generating electricity across the world. Of these, 94 are in the USA, 62 are in China, 57 are in France, and 34 are in Russia. The average age of the world’s operational reactors is 32.6 years, and together they generate ~9% of global electricity. There are ~11,800 data centres worldwide, with a rapidly growing proportion incorporating AI-specific infrastructure. Whereas traditional data centres require 10–15 kW of electricity per rack, AI data centres need 40–250 kW per rack to support the heavy computational demand of AI models. So, where’s this extra electricity coming from? It’s a question brought into sharper focus by the conflict in the Middle East and its potential impact on the availability and price of gas, which is used to generate ~20% of electricity globally.
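The gap between those per-rack figures is easier to appreciate as a back-of-envelope calculation. The sketch below (Python, with a purely hypothetical 1,000-rack facility for illustration) turns the kW-per-rack ranges quoted above into total facility demand in MW:

```python
# Back-of-envelope comparison of data centre power demand, using the
# per-rack figures quoted above. The rack count is a hypothetical example.
RACKS = 1_000  # hypothetical facility size

traditional_kw = (10, 15)  # kW per rack, traditional data centre
ai_kw = (40, 250)          # kW per rack, AI data centre

def facility_mw(per_rack_kw_range, racks=RACKS):
    """Total facility demand in MW for a given per-rack kW range."""
    lo, hi = per_rack_kw_range
    return (lo * racks / 1000, hi * racks / 1000)

print("Traditional:", facility_mw(traditional_kw), "MW")
print("AI:         ", facility_mw(ai_kw), "MW")
```

On these assumptions, a 1,000-rack AI facility could draw 40–250 MW against 10–15 MW for its traditional counterpart, which is precisely the demand gap that existing generation and transmission planning did not anticipate.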

All the major tech giants have been considering this question for some time. They want a reliable electricity supply and low emissions for their AI endeavours and are thus turning to nuclear power. For example, Microsoft wants to restart a Three Mile Island reactor mothballed in 2019, and Meta has signed a trio of nuclear deals securing enough electricity to power ~5 million homes for its AI data centres. It takes decades to build new, large-scale nuclear reactors like those currently connected to electricity grids, and the surge in power demand for AI data centres is outpacing planned new generation and transmission capacity. Amazon and most of the tech giants are thus keen to harness Small Modular Reactors (SMRs) to sustain AI growth. SMRs are new, with just two currently operable worldwide. However, you’ll see from the World Nuclear Association’s SMR project tracker that we can expect many more to come on stream over the next decade.

Nuclear SMRs will thus be key providers of the power for the AI data centres needed to underpin this digital technology’s ever more rapid momentum. Is that a problem? No, provided there’s strict regulatory control before, during, and after SMRs are built and put into service, and that global institutions exist with real teeth to ensure that commercial organisations and nation states do not flout the necessary balance between AI self-interest, the greater good, and the proliferation of nuclear material. That may be a tall order in a world full of conflict, extremism, and volatility, and already embarked on a huge race for AI dominance. SMRs, however, are unproven at scale and things may not go to plan. If SMR delays happen, then we may see AI momentum slow over the coming decade. Electricity, after all, is the blood of the digital world, and if there isn’t enough blood then things are bound to go awry…

Drone – The word of the decade…

Most people try to live the best life they can, and most want to live in a world where rules help their chances of doing so. Most don’t want to live in a world dominated by those who ignore or flout rules to suit their own purpose. The world order, however, is changing, the United Nations appears toothless, and disruptive geriatric leaders are making life hard for everyone. Conflicts around the world are making ordinary people increasingly worried, but anyone who wants to live their best life must focus on the things they can control and change rather than worry about the things they can’t. That’s sound guidance, but easier said than done.

The future is more uncertain today than it has been for many years, and so when an old IT colleague asked what the Badger’s word or phrase of the 2020s would likely be, they didn’t get the answer they expected. They anticipated phrases like ‘Artificial Intelligence’, ‘machine learning’, or ‘deep fake’, but the Badger’s answer was one word, namely ‘Drone’. There are still some years of the decade to go, but on the evidence so far, and with further rapid tech advances inevitable in the coming years, the Badger feels that he’s unlikely to change his mind about his choice of word.

‘Drone’ is a word that’s growing in importance for anyone who wants to live the best life they can. It’s a fascinating word with a range of biological, sonic, technological, and metaphorical uses. For example, drone is a function, a sound, a warning, and a weapon. It can describe the buzz of a bee, the whirr of a machine, a worker, some of humanity’s most advanced tools, and a shadow overhead to be feared by civilian and military personnel alike. Ten years ago, it was mainly used to refer to bees or the experimental technology of unmanned aerial vehicles (UAVs), but at the start of this decade it became used mostly as a descriptor for any autonomous or remotely controlled civilian or military flying object. Today it is a blanket term for any man-made, autonomous or remotely controlled flying object that can perform any civilian or military function. When someone uses the word today, it will mostly be in the context of weapons used in Ukraine and the Middle East, and not bees!

Declaring ‘the word of the decade’ halfway through a decade might be foolhardy, but the Badger’s sticking with it, because he feels that clever, man-made, affordable, flying objects for civilian and military purposes will continue to evolve rapidly and become a historically significant feature of this decade. Meanwhile, the bee population, essential pollinators in nature, is in decline. Somehow the word ‘drone’ highlights that humans have their priorities the wrong way round. If you want to live your best life, then change something – plant something in the garden to attract bees…

Delhi, AI, and a rosy future for IT services companies?

‘For the times, they are a-changin’’, sang Bob Dylan in the 1960s. This is particularly apt today, given four recent matters which, in the broadest sense, have IT at their core.

First, Meta’s CEO has testified for the first time before a jury to defend against accusations that Meta’s social media platforms harm children’s mental health, and that the platforms are designed to prioritise keeping users scrolling to maximise profits. The trial’s outcome could prove seismic. Second, ‘Epstein data’ has triggered an inevitable media and political frenzy and repercussions for some individuals, but it has produced little so far that would stand up in a court of law. Nevertheless, ‘Epstein data’ is a reminder of the dangers of email, and that using services underpinned by IT always leaves a record somewhere. The third matter is the impact of AI-driven fears on the share prices of major IT services companies. Investors are anxious about the future demand for IT consulting/services. At the time of writing, the share prices of Accenture, Capgemini, CGI, Sopra Steria, Tata Consultancy Services, and Infosys have dropped by 42%, 36%, 36%, 32%, 28%, and 27%, respectively, over the last 12 months. The market is taking a sober look at the impact of AI.

And the fourth is the AI Impact Summit in Delhi, the largest ever global gathering of world leaders and tech bosses, which ended with 88 nations signing the ‘Delhi Declaration of AI Impact’. Some have called this the ‘Delhi Magna Carta’ to emphasise that it represents a milestone in global cooperation and consensus about AI’s use for economic growth and social good. The Declaration, however, is not legally binding, and so calling it a Magna Carta is a political metaphor rather than a formal treaty. The Declaration is a political statement of principles which are far from certain to be embedded into national/international laws, standards, and institutions. A hint about why it may ultimately have little influence is captured by a comment from the USA reported in the item here, namely that the USA will not accept ‘global governance of AI’. Why? Because the USA and China are locked in a structural competition over computational power, microchips, AI-enabled defence systems, and the control of global standards. It’s existential for both, and the Declaration doesn’t change that.

Unsurprisingly, ‘For the times, they are a-changin’’ rings even truer today, both geopolitically and for IT services companies and their employees. Dario Amodei, CEO of Anthropic, foresees AI eliminating the jobs of many software engineers. It’s always been important for IT and tech companies to be fleet of foot and for IT people to keep their skills current. The Delhi Declaration highlights that this is more important than ever. With AI-driven transformation gathering pace, the market is showing that a rosy future for IT services companies, and their employees, is not guaranteed…

Potholes – Can IT, tech, and AI help?

The UK Department for Transport’s map showing how England’s Local Authorities rate on keeping local roads in good condition and free from potholes is a little embarrassing. Why? Well, apart from the visual impact of the Red, Amber, Green picture, you’ll see here that only 16 (~10%) of 154 Local Authorities are rated Green, 13 (~8%) are Red, and a whopping 125 (~81%) are Amber. So little Green is embarrassing, especially as most road users, if the Badger’s local community is representative, think their Amber-rated Local Authority should really be Red.

The methodology for these ratings is here. There are three underlying scorecards – Condition, covering the Local Authority’s road conditions; Spend, covering its spend on highways maintenance; and Best Practice, covering how well it follows highways maintenance best practice. It’s this Best Practice component in particular that requires attention, because only 20 (~13%) of 154 Local Authorities are rated Green on it. The Badger’s not surprised, having witnessed the way potholes are repaired in his own locality. They are repaired, then the same ones reappear a month or so later and are repaired again… and then again a month or so after that! Why do the repairs constantly fail? The Badger’s observations suggest there are underlying problems with the reactive nature of his Local Authority’s repair business process, its contractor management, and the professionalism and quality of the repairs themselves. The Badger was a little surprised to read that the Institution of Civil Engineers apparently believes that failing pothole repairs are due to the UK’s moderate climate, with temperatures hovering around freezing in Winter. When the RAC produces its own pothole index, however, the Badger thinks there’s got to be more to the problem than that.

So, can IT and modern tech help with this problem? Well, people can already report a pothole online using their Local Authority’s website – although the mechanism isn’t always easy to find – or by using a tool like FixMyStreet. Local Authorities also already use Highways Asset Management Software packages of one form or another, and so IT and tech is already playing a role, especially if it’s efficiently integrated across the entire ‘cradle to grave’ business process. Is it? Your guess is as good as the Badger’s.

So, what’s the answer to the question? Well, digital innovation and AI in some form seems to be the answer to everything these days, and a case for it helping with potholes can be seen here. So, the answer is ‘Yes’, but with the following important caveat. The whole business process must first be overhauled to be proactive, with embedded professionalism, quality, and contractor management controls. Simply investing in more IT, tech, and AI without doing this would be an expensive mistake that will not improve the pothole situation on our roads or ease public concern.

Have the lessons from the ‘Move fast and break things’ era really been learned?

The first quarter of the 21st century is complete, and so it seems appropriate to reflect on a period of continuously accelerating digital innovation that has transformed how people work, play, communicate, share information, and buy things. The technological change seen so far this century differs markedly from that experienced by previous generations. It’s been fast! Previous generations experienced the impact of technological change much more slowly. The technologies the Badger’s grandfather and great-grandfather were used to in their childhoods, for example, were still central to their lives in their old age. With subsequent generations, it’s become normal for the barely imaginable technologies of their youth to become commonplace in their later life. Just think: if your ancestors could spend a week with you today, most would be wide-eyed and speechless in awe at the digital technology you use!

Digital technology has driven significant changes in society in the last 25 years, and AI will be no different. In that time, the internet has become critical global infrastructure, and the advent of smartphones has blended communication, entertainment, photography, and productivity into a single, pocket-sized device. Personal and professional interactions have become dominated by email, instant messaging, and real-time video calls rather than paper, and the way we store, access, and manage large amounts of data has moved from local, physical items like high-capacity CDs to ‘The Cloud’, where it can be accessed from anywhere at any time. Streaming for entertainment and the online purchasing of goods have become the norm, and cyber capabilities have become crucial for militaries and policing. And then, of course, there’s social media. Whether you love it or loathe it, it’s been an addictive disruptor of everything!

All this, and much more, has happened in barely 25 years. Our lives have become deeply entangled with digital technology, and the world has become more unstable. While this instability can be attributed to economic, climate, pandemic, and geopolitical factors, the digital revolution has, in the Badger’s opinion, played a significant role in societal disruption. Why? Because the early Facebook philosophy of ‘Move fast and break things’ epitomised the ethos of the companies that are today’s tech giants, and of ‘Silicon Valley’ as a whole. This ethos showed scant regard for the overall societal impact of what it produced. As we are now seeing, the societal, ethical, political, and human problems this ethos produces only really manifest themselves many, many years later. With AI continuing the digital revolution in the second quarter of the 21st century, a good question to ask is this: have the lessons from the societal impact of the ‘Move fast and break things’ era been learned and applied in the AI world that will be transformational in the coming decades? AI gives enormous power to those who control it, and so the Badger thinks the answer to this question is obvious. You, however, may think differently…

AI – from ‘build, baby, build’ to ‘bust, baby, bust’?

Every Christmas/New Year period, the BBC’s Radio 4 Today programme invites well known individuals to guest-edit the programme. Each guest focuses on a topic relevant to their interests, experience, and society. Two of the Christmas 2025 guests were inventor, engineer, and businessman Sir James Dyson and the AI pioneer and entrepreneur Mustafa Suleyman. The Badger was driving to visit relatives on the days they were guest-editing and had the Today programme on the radio as background noise on both occasions. He turned the volume up when each man was interviewed because they were intelligent, impressive, and articulate individuals conveying enormous common sense and objectivity, characteristics which seem in short supply today.

Their words resonated with the Badger. Sir James Dyson, for example, likes ‘doers’ rather than ‘talkers’, and Mustafa Suleyman spoke eloquently about AI, saying it must be ‘a tool in the hands of and under the control of humans if it’s to benefit all of humankind’. There are plenty of ‘talkers’ in the world, but it’s ‘doers’ like these two, with vision, objectivity, common sense, and a passion for humankind, rather than politicians, who have the greatest influence on the lives of most people. The Badger agrees that AI is a tool. There are plenty of ‘talkers’ concerned that humans could become subservient to AI, but if we let that happen then we only have ourselves to blame. There’s currently a huge ‘build, baby, build’ rush to construct new, giant, energy-hungry AI data centres and to amass the chips and devices they need to function. Enormous sums are being spent around the world, the technology continues to advance way ahead of any regulation, and AI company stock market valuations are stratospheric. Having worked in IT during the dot.com era, the words of these two men made the Badger ponder the current AI ‘build, baby, build’ surge.

Four conclusions emerged. The first was that such surges often produce over-capacity and ‘bust, baby, bust’ outcomes (cf. China’s property crash) – and the bigger the boom, the deeper and longer the bust! The second was that AI is here to stay, but some huge AI companies will not survive, even though the AI market bubble is not like the dot.com era, when many companies with high valuations had no revenues. Inevitably, when investor appetite for speculative risk tightens for any reason, and it will, a painful correction will happen. The third was that eyebrows should be raised when tech companies arrange for the restart of shuttered nuclear facilities to provide electricity for their new data centres.

The Badger’s last conclusion was that we should question whether the world’s leaders, including those of hyperscale global tech corporations, are the right kind of ‘doers’. Do they have objectivity, common sense, and mankind’s well-being at heart, or are they just examples of Lord Acton’s 1887 line, ‘Power tends to corrupt, and absolute power corrupts absolutely’? Whatever the answer, 2026 looks likely to be a troublesome year…

AI and trust…

Misinformation, disinformation, scams, and questionable videos have been commonplace aspects of social media for years. The Badger, like many, has become distrustful of content pushed to him by algorithms because it is normally not what it appears or purports to be. Three typical examples of content that’s helped to fuel the Badger’s distrust are as follows. The first is spectacular, obviously fake videos of shipping and aircraft incidents that put Hollywood movies to shame. The second is content from activist or political groups that criticise or parody others and promise a better future; such groups are unreliable and frequently blinkered, with short memories. The third is incessant clickbait – life’s too short to waste time clicking such links. Putting it diplomatically, you can tell by now that the Badger’s trust in what’s pushed to his social media feeds is not high.

AI, of course, is increasingly aiding the producers of the content that’s led to this erosion of trust. As this report from the University of Melbourne in Australia highlights, there’s a complex relationship between AI adoption and trust. It reports that while 66% of its survey respondents use AI regularly and believe in its benefits, less than half (46%) trust the AI they use. The Badger aligns with this finding. He’s an occasional user of AI, but he doesn’t trust it. This ‘trust gap’ – as the report highlights – is a critical challenge for AI’s wider adoption.

Reflecting on this has led the Badger to two conclusions. The first is that since anyone can create content with AI tools, it’s inevitable that the volume and sophistication of misinformation, disinformation, scams, and questionable video content in social media feeds will increase further. Soon the question to really ask yourself about social media feeds will no longer be ‘what’s fake?’ but ‘what’s real?’ The second is that this, together with society’s huge energy bill for AI and AI’s unsustainably high stock market valuations, is widening rather than closing the Badger’s ‘trust gap’.

AI tools are here to stay, but as the report above points out, the biggest challenge for AI is trust. As the common adage highlights, trust is the easiest thing in the world to lose, and the hardest thing in the world to get back. At present, it doesn’t feel as if AI is winning the battle for our trust. The Badger’s current overall feeling about the question of trust is nicely summed up by this passage from J.K. Rowling’s book ‘Harry Potter and the Chamber of Secrets’: ‘Ginny!’ said Mr. Weasley, flabbergasted. ‘Haven’t I taught you anything? What have I always told you? Never trust anything that can think for itself if you can’t see where it keeps its brain?’ For the Badger, the last sentence of this passage, written back in 1998, gets to the nub of the AI and trust issue…

The imagination of children – Lego and AI…

While returning home from a stroll through a glorious deciduous wood resplendent with Autumn colour, the Badger saw an interesting book in a charity shop window. He popped in and came out with a carrier bag half-full of Lego bricks of all shapes, sizes, colours, and types rather than the book! The Lego was in great condition at a bargain price, and buying it for his grandson to play with when he visits seemed a no-brainer. On arriving home, the bag was emptied onto a table. There were standard bricks and bases, Technic bricks, wheels, motors, and arms, legs, torsos, heads, and hands from Lego figures, and much more. The Badger was hooked. He spent the rest of the afternoon using his imagination to produce a number of creative masterpieces! Indeed, every day since, the Badger’s improved his masterpieces and created new ones. It’s addictive!

Lego empowers creativity, provides immediate gratification from having built something with your hands, and it helps to develop spatial reasoning, design thinking, and problem-solving. Furthermore, it encourages an understanding of mechanics through trial and error. It’s fun, educational, and great for kids (and adults) of all ages with building things often a collaborative and social activity. Kids, for example, learn from each other when they play with it together and when adults help them. Building Lego models together strengthens the bonds between individuals.

As the Badger built his own masterpieces, he remembered that Lego has been an excellent teaching aid at home and in education establishments for decades, as the recent announcement about a teenager building a robotic hand using Lego illustrates. It also struck him that Millennials were shaped by the emergence of the internet, Gen Z were moulded by social media’s evolution, and that Gen Alpha – his grandson’s generation – will be defined by the rapid expansion of AI use. The Badger senses a danger, however, that Gen Alpha may simply ask AI for ideas and instructions on what to build from a bag of bricks rather than use their own imagination and individuality to create masterpieces. Always inquisitive, the Badger asked Copilot what could be built with a bag of mixed Lego bricks. It replied with ideas and instructions, and thus neatly illustrated that the danger of Gen Alpha outsourcing their imagination, creativity, and physical trial-and-error learning to AI is real.

A recent UK study found that ~22% of 8 to 12 year-old children already use generative AI tools, which – let’s remember – have not been designed from the outset with children in mind. Have the lessons from social media’s impact on children been learned? The answer’s not obvious, which is why the Badger will be encouraging his grandson to produce his own Lego masterpieces without engaging in virtual-world interactions. Another reason, of course, is that the Badger will be able to transfer knowledge and enjoy helping to build the youngster’s creations too…

AI – A ‘Macbeth Moment’?

The Badger was browsing in a shop when Hubble Bubble (Toil and Trouble) by Manfred Mann featured in the piped music. It struck a chord with the recent warnings by JP Morgan’s CEO, the Bank of England, and others, that an AI bubble could pop. Later that day, while clearing a cupboard, the Badger found his old school notes for Shakespeare’s Macbeth, part of the English Literature syllabus of the time. Scribbled notes about the three witches uttering ‘Double, double toil and trouble: fire burn and cauldron bubble’ caught his eye. The coincidental combination of the song title, these scribbles, and the AI warnings triggered some contemplation on the AI bubble.

During the dot.com debacle of the early 2000s, the Badger was a senior member of a UK stock-exchange-listed IT services company. Such companies, investors believed, would benefit from the dot.com boom. The company’s share price thus rose ~tenfold before collapsing back to its original level when the market realised that dot.com companies were massively over-valued, and many had little real revenue, let alone profit. For years following the crash, doing business in the IT sector was tough. The NASDAQ, for example, crashed from around 5000 to 1100, and it took ~15 years to recover. Many dot.coms disappeared, but the likes of Amazon, eBay, Google and others rose from the ashes to become the powerhouses of recovery. Having worked in IT throughout the debacle, the Badger’s instincts are alive to tech bubbles. Today they ring alarm bells.

Whether AI’s a market bubble that bursts, or a transformation that sticks, depends on whether company valuations are grounded in real, scalable business fundamentals or speculative optimism. Either way, AI is unlike anything seen before, so when JP Morgan, the Bank of England, the World Economic Forum, and others have some anxiety, we should take note, especially as, for example, Nvidia, Anthropic, and OpenAI’s market values have risen many-fold in just two years. There’s unprecedented spending on computational infrastructure, massive bets on future productivity gains, and belief that AI will revolutionise everything. The actual return on investment, however, has not been impressive so far. When the UK National Cyber Security Centre advises organisations to have plans to operate their business without access to computers following a cyber-attack, the hype of an AI-dominated future seems a little questionable.

The Badger’s learned from his dot.com era experience that it’s prudent to be wary. If market valuations become detached from fundamentals, or the availability of computational infrastructure stalls, or the promised productivity gains for organisations don’t materialise, or geopolitically driven export controls cause disruption, then any AI bubble will pop, triggering a huge domino effect. AI is facing a ‘Macbeth moment’. Witches making prophecies surround the bubbling AI cauldron, uttering ‘double, double toil and trouble; fire burn and cauldron bubble’. In the play, Macbeth felt a sense of foreboding… as do more and more of today’s leaders…

The Future: microchipped, monitored, and tracked?

The Badger sank onto the sofa after his infant grandson’s parents collected the little whirlwind following a weekend sleepover. The Badger had been reminded that Generation Alpha is the most digitally immersed cohort yet. Born into a world full of tech, they are digital natives from an early age, as was evident during the weekend’s activities. Struck by the youngster’s digital awareness, and especially his independence, curiosity, and eagerness to grasp not just what things are but also why and how they work, the Badger found himself wondering about the digital world his grandson might encounter in the future.

From his IT experience, the Badger knows that change is continuous and disruptive for IT professionals, organisations, and the public alike. Change in the digital landscape over the last 40 years has been phenomenal. All of the following have caused upheavals on the journey to the digital world we have today: the move from mainframes to client-server and computer networks, relational databases, the PC, spreadsheets and word processing packages, mobile networks and satellite communications, mobile computing, image processing, the internet, online businesses, social media, the cloud, microchip miniaturisation, and advances in software engineering. These have changed the way organisations function, how the general public engages with them, and how people interact with family, friends, and others globally. AI is essentially another transformative upheaval, and one that will impact Generation Alpha and future generations the most.

Data, especially personal data, is the ‘oil’ of today’s and tomorrow’s digital world, and the entities that hold and control it will use it to progress their own objectives. With AI and the automation of everything, the thirst for our data is unlikely to be quenched, which should make us worry about the digital world for Generation Alpha and beyond. Why? Because humans in the hands of tech, rather than the other way around, increasingly seems to be the direction of travel for our world. The UK government’s announcement of a digital ID ‘to help tackle illegal migration, make accessing government services easier, and enable wider efficiencies’ has made the Badger a little uneasy about the digital world his grandson will experience. The backlash, exemplified by this petition to Parliament, illustrates the scale of worry that it’s a step towards mass surveillance and state control. Governments, after all, do not have good track records in delivering what they say they will.

As the Badger started to doze on the sofa, he envisaged a future where humans are microchipped and have their lives monitored and tracked in real time from birth to death, as happens with farm animals. He resolved to make sure his grandson learns about protecting his personal data and that he values a life with personal freedom rather than control by digital facilities. The Badger then succumbed to sleep, worn out from activities with a member of Generation Alpha…