OpenAI pausing Stargate UK is hardly a surprise!

As widely reported (see here for example), OpenAI is pausing its multi-billion-dollar Stargate UK project. The project was first announced in September 2025 with the declared purpose of ensuring ‘OpenAI’s world leading AI models can run on local computing power in the UK, for the UK – particularly for specialist use cases where jurisdiction matters. This will help power the UK’s future economy, boost its global competitiveness, and deliver on the country’s national AI Opportunities Action Plan’. The UK government’s AI Opportunities Action Plan had been announced in January 2025 as a focus for ramping up AI adoption to boost economic growth, jobs, and improvements to people’s everyday lives. A year later, in January 2026, a seemingly positive progress update was published. The government’s thus likely to be wringing its hands about OpenAI’s pause. Why? Because it puts a dent in the country’s desire to be an ‘AI superpower’, especially when the company asserts that regulation and high energy costs are obstacles. The Stargate UK pause, however, is hardly a surprise given that the situation faced by OpenAI today is really no different from when the project was announced last September.

OpenAI announced the project on the date President Trump started his state visit to the UK. With tariffs as a backdrop, the pressure on the UK government to make the visit a success was huge, and a centrepiece during the visit was the signing of a technology partnership involving new investment and cooperation on AI. Domestically, the government needed this to promote its growth agenda, but a ‘technology partnership’ and tangible realities are different things. Given the pressure for the visit to be a success, OpenAI’s Stargate UK announcement was part of an overall joint PR strategy – at least that’s what the Badger senses. At that time, the UK had some of the highest electricity costs in the world, and that’s still true today! If there’s one thing an aspirant AI superpower needs, it’s economically competitive electricity, so it can hardly be a surprise when a commercial company focused ‘on the business case and numbers’ decides to hold off further investment. Additionally, there’s uncertainty about changes to UK law to allow AI firms to train their systems using copyrighted works. There’s ongoing investor anxiety about an AI bubble. OpenAI hasn’t yet delivered a profit and is forecast to make losses of ~$44 billion before becoming profitable in 2029. And OpenAI faces massive competition from Google (and others), which raises significant questions about its future. All of these points were material when Stargate UK was announced 7 months ago, and they remain so today.

A sceptic could thus be excused for thinking that the project was driven by a geopolitical public relations necessity in the first place. For the Badger, with his instincts rattling from experience, it’s thus hardly a surprise that Stargate UK is paused…   

Required leadership qualities – Competence, Consistency, Clarity, Communication and Charisma…

Early in the Badger’s IT career, before the internet arrived, training for delivery people – project and team leaders, and technical staff – took place face to face in a group led by a senior delivery person and a professional trainer. Such training was often a one- or two-day event conducted away from the hubbub of the workplace so that participants were not distracted by their normal work activities. At one course the Badger attended, participants were challenged to express the qualities that the members of project teams look for in their delivery leader. Participants, all team or project leaders with various levels of experience, had ten minutes to produce five words for discussion with the course leaders and the wider group.

Many found it more difficult than expected because they struggled to think about delivery leadership from the perspective of team members who were not, and never aspired to be, leaders. Nevertheless, at the end of the exercise and subsequent discussion, the group converged on the following five words as required leadership qualities: Competence, Consistency, Clarity, Communication and Charisma. These words became known as the 5Cs and provided the theme underpinning the rest of the course. Whilst their context related to what delivery people look for from their delivery leader, the Badger’s found over the years that they are a good reference point for what to look for in leaders more generally.

The Badger’s worked for, and with, many senior leaders of all kinds over the years. They all had different personalities, strengths, and weaknesses. Some were more competent than others, some were more consistent and clearer than others, and some were better and more inspiring communicators than others. None were extroverts, but they all had a charisma that you couldn’t quite put your finger on. Underpinned by the 5Cs, the Badger considered some as much better leaders than others. There’s lots of broader and more detailed information available about the traits of good leaders, but the Badger’s routinely used the simple, qualitative 5Cs as his mental ‘initial leadership quality’ checklist over the years to shape an initial opinion – one that has sometimes subsequently changed. Sometimes, however, that initial opinion has not been very flattering and has not changed.

With the 5Cs concept in your psyche, you can’t help but use it to judge leaders who regularly appear on broadcast or social media even though you’ve never met them. Inevitably that’s unfair, but rather than relying on instinct alone, the 5Cs provide some structure for forming an opinion about where that person sits on the POOR to GOOD leadership qualities spectrum. Ego, wealth, and a powerful position are not the same as good leadership qualities. For example, any leader who rants publicly and profanely on social media is unprofessional and sets a bad example for online behaviour. Someone with GOOD leadership qualities would never do this…

Will AI experience a ‘tobacco moment’?

The Badger smiled and then sighed when Meta and YouTube were recently found liable for harming a young woman through the addictive design of their products and their failure to warn users of the risks. The smile was because it’s good to see tech giants not getting their own way. The sigh was because it’s taken far too many years to get to this point. Sensible people have known for years that these apps are designed to keep users compulsively engaged for as long as possible because it’s the clever monetisation of this that underpins their business models.

The Badger recalls the early days of social media when it simply helped people stay in touch, share milestones, and reconnect with old friends. In those days there was a clear divide between real and online life. Conversations ended on leaving a room or putting the phone down, photographs lived in physical albums, and social media was used as a harmless tool and not something that shaped or dominated how we lived. Today things are quite different. Social media has grown in power, profitability, and influence, to such an extent that the average person spends more time online using it than is prudent. What’s changed since those early days is the design of the apps and platforms. Endless scrolling, algorithm-driven recommendations, push notifications, and short video loops aren’t accidental. They’re features engineered to keep people engaged for as long as possible. Indeed, the BBC was reporting way back in 2018 that social media apps were deliberately addictive to users. The Badger thinks all this has certainly eroded the real-world routines, relationships, and boundaries for users over the last decades.

In the Meta and YouTube case, the plaintiffs’ lawyers cleverly focused on how the platforms are designed rather than on what’s posted on them. The two giants plan to appeal, but it’s debatable whether the appeals will succeed. Social media is thus having to grapple with the fact that this could be a reckoning similar to the one the tobacco industry experienced some decades ago. This ‘tobacco moment’ prompted the Badger to muse on whether AI will ultimately experience such a moment too. He concluded that it will. AI has the potential to harm institutions, elections, markets, information ecosystems, and critical infrastructure, and so its reckoning could happen faster and on a more global, structural scale. The possible triggers might relate to bias, misinformation, autonomy, and safety failures. Like the ‘tobacco moment’ for social media, AI’s moment will not be about banning it, but about liability.

A ‘tobacco moment’ isn’t about a single lawsuit. It happens when society collectively decides that an industry has externalized too much harm and the legal, regulatory, and cultural tides all turn at once. It seems foolhardy, therefore, to think that AI will be immune to a ‘tobacco moment’ of its own at some stage in the future…

Rage against the screen…

The Badger’s 6-year-old grandson likes trains! Books about trains, Brio train sets, and Lego trains are favourite toys, but seeing and riding on real trains brings a special sparkle to his eyes. He loves to watch steam engines chuff along the Watercress Line, see historic locomotives in museums, ride miniature railways at visitor attractions, and travel on the regular trains that commuters use every day. He’s fascinated by how trains work, which is great, but his persistent questions about ‘how’ and ‘why’ can sometimes be wearing!

Last weekend the Badger and his grandson did something that didn’t relate to trains. They visited the Tangmere Military Aviation Museum, a small place with a number of static military jets as well as memorabilia from when Tangmere was a World War II RAF fighter base. The visit spawned an observation about 6-year-olds that the Badger had not anticipated. At each exhibit there’s a computer that can be used to engage with the exhibit’s story, pull up photographs, and watch film clips. At many exhibits it’s possible to sit in the cockpit, peer into the fuselage, and use a computerized simulator. The Badger’s grandson observed that planes are engineered and work differently to trains!

It was all fun, but the Badger noticed that his grandson preferred using the computers rather than engaging physically with the exhibits themselves. For example, the Canberra has part of the fuselage removed so visitors can easily lean in to see the environment around the pilot and crew. Adjacent to the jet is a computer showing images streamed from a camera mounted inside the fuselage. The camera can be panned through 360 degrees using a mouse, and the user can zoom in on any part of the pilot and crew area. This 6-year-old used the computer rather than physically looking inside the fuselage. The preference was clear with other exhibits too. Seeing that ‘the screen’ had a greater pull on the youngster than exploring the exhibit physically made the Badger uneasy. If youngsters in their early formative years prefer screens to engaging with the physical real world, then we should surely all be worried.

On the car radio driving home, the Badger listened to the CEO of Mumsnet being interviewed about Mumsnet’s ‘Rage Against the Screen’ campaign, which is calling on politicians to ban social media for under-16s, stop Big Tech using data to target children with addictive algorithms, and put children’s safety and wellbeing ahead of platform profits. The Badger found himself agreeing with the points made. In the UK, you must be 16 years or older to do many things (see here), so why not ban social media for under-16s? If the Badger’s grandson is already ‘virtual rather than physical world first’ at the age of 6, then ‘Rage Against the Screen’ is surely a campaign that needs to succeed…

AI and progress towards nuclear fusion for power generation…

When the radio alarm signals that it’s time to rise and prepare for the day ahead, it’s easy to doze for a few extra minutes without listening to the programme being broadcast from the radio. Sometimes, of course, there’s something in the babble which grabs your attention, sharpens alertness, and forces you to concentrate on whatever’s being said. That’s exactly what happened to the Badger earlier this week. The babble included an item of interest because it related to the Badger’s post-doctoral research many decades ago. That item was about the scientific and engineering drive to harness the power of nuclear fusion for the generation of limitless, sustainable, carbon-free electricity.

The item covered the UK government’s written statement on the UK’s Fusion Strategy, its investment in STEP – building a prototype fusion plant in Nottinghamshire by 2040 – and its investment in the world’s most powerful fusion-dedicated AI supercomputer to accelerate fusion design, modelling, and operations. It asserted that this is the most ambitious push yet to establish the UK’s complete energy independence from foreign price shocks. Investing £45m in this supercomputer, part of a wider government effort in AI and supercomputing infrastructure that has already seen a separate £36m supercomputer investment at the University of Cambridge, is a step along the road. However, let’s face it, it’s a tiny step when the country spends >£60bn on Defence, >£300bn on Welfare, and >£90bn on Education.

Harnessing nuclear fusion is the holy grail of clean, limitless energy. It’s been that way for as long as the Badger can remember, and so has its reputation for always being ’50 years away’. The scientific and engineering challenges to be overcome in order to build and operate a commercially viable nuclear fusion reactor are enormous. However, there have been huge advances over the last 30 years or so, a timeline that in parallel has also seen huge advances in computers and information processing. The latter has already helped enormously in getting fusion to its current position, and there’s little doubt that further computing advances, particularly in AI and machine learning, will continue to accelerate progress towards achieving the holy fusion grail of large-scale, carbon-free, sustainable energy on this planet.

But with the first experimental reactors currently forecast to start operating around 2040 and beyond, usable power from fusion still seems ’50 years away’ in practice. Generation Alpha and their children are thus likely to be the first generations to use power from viable fusion reactors. So, here’s a thought. Enormous amounts of money are being spent on AI across the globe. In comparison, a pittance is being spent on getting to power-generating fusion reactors that will hugely benefit our planet. Unless there’s a 1960s-style ‘let’s go to the moon’ moment for fusion, the Badger can’t help but feel that it will always be ’50 years away’ regardless of investments in dedicated AI supercomputers…

Nuclear Power for AI Data Centres…

According to the World Nuclear Industry Status Report, whose data can be explored visually here, there are 407 operational nuclear reactors currently generating electricity across the world. Of these, 94 are in the USA, 62 are in China, 57 are in France, and 34 are in Russia. The average age of the world’s operational reactors is 32.6 years, and they generate ~9% of global electricity. There are ~11,800 data centres worldwide, with a rapidly growing proportion incorporating AI-specific infrastructure. Whereas traditional data centres require 10-15 kW of electricity per rack, AI data centres need 40-250 kW per rack to support the heavy computational demand of AI models. So, where’s this extra electricity coming from? It’s a question brought into sharper focus by the conflict in the Middle East and its potential impact on the availability and price of gas, which is used to generate ~20% of electricity globally.
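A rough back-of-envelope sketch shows what those per-rack figures mean at facility scale. The per-rack numbers are those quoted above; the 1,000-rack facility size is purely illustrative, not drawn from any real data centre:

```python
# Back-of-envelope comparison of data-centre power demand, using the
# per-rack figures quoted above (10-15 kW traditional, 40-250 kW AI).
# The 1,000-rack facility size is an illustrative assumption.

def facility_demand_mw(racks: int, kw_per_rack: float) -> float:
    """Total electrical demand in megawatts for a given rack count."""
    return racks * kw_per_rack / 1000.0

racks = 1000
traditional = facility_demand_mw(racks, 12.5)   # mid-range traditional rack
ai_low = facility_demand_mw(racks, 40.0)        # low-end AI rack
ai_high = facility_demand_mw(racks, 250.0)      # high-end AI rack

print(f"Traditional: {traditional:.1f} MW")     # 12.5 MW
print(f"AI (low):    {ai_low:.1f} MW")          # 40.0 MW
print(f"AI (high):   {ai_high:.1f} MW")         # 250.0 MW
print(f"Multiple:    {ai_low/traditional:.1f}x to {ai_high/traditional:.1f}x")
```

Even at the low end, an AI-heavy facility of the same rack count draws several times the power of a traditional one, which gives a feel for why the extra-electricity question matters.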

All the major tech giants have been considering this question for some time. They want a reliable electricity supply and low emissions for their AI endeavours and are thus turning to nuclear power. For example, Microsoft wants to restart a Three Mile Island reactor mothballed in 2019, and Meta has signed a trio of nuclear deals securing enough electricity to power ~5 million homes for its AI data centres. It takes some decades to build new, large-scale nuclear reactors like those currently connected to electricity grids, and the surge in power demand for AI data centres is surpassing the planned new generation and transmission capacity. Amazon and most of the tech giants are thus keen to harness Small Modular Reactors (SMRs) to sustain AI growth. SMRs are new, with just two in the world currently operable. However, you’ll see from the World Nuclear Association’s SMR project tracker that we can expect many more to come on stream over the next decade.

Nuclear SMRs will thus be key providers of the power for the AI data centres needed to underpin this digital technology’s ever more rapid momentum. Is that a problem? No, provided there’s strict regulatory control before, during, and after SMRs are built and put into service, and that global institutions exist with real teeth to ensure that commercial organisations and nation states do not flout the necessary balance between AI self-interest, the greater good, and the proliferation of nuclear material. That may be a tall ask in a world which is full of conflict, extremism, and volatility, and is already embarked on a huge race for AI dominance. SMRs, however, are new and things may not go to plan. If SMR delays happen, then we may see AI momentum slow over the coming decade. Electricity, after all, is the blood of the digital world, and if there isn’t enough blood then things are bound to go awry…

Drone – The word of the decade…

Most people try to live the best life they can, and most want to live in a world where rules help their chances of doing so. Most don’t want to live in a world dominated by those who ignore or flout rules to suit their own purpose. The world order, however, is changing, the United Nations appears toothless, and disruptive geriatric leaders are making life hard for everyone. Conflicts around the world are making ordinary people increasingly worried, but anyone who wants to live their best life must focus on the things they can control and change rather than worry about the things they can’t. That’s sound guidance, but easier said than done.

The future is more uncertain today than for many years, and so when an old IT colleague asked what the Badger’s word or phrase of the 2020/30 decade would likely be, they didn’t get the answer they expected. They anticipated phrases like ‘Artificial Intelligence’, ‘machine learning’ or ‘deep fake’, but the Badger’s answer was one word, namely ‘Drone’. There are still some years of the decade to go, but on the evidence so far, and with further rapid tech advances inevitable in the coming years, the Badger feels that he’s unlikely to change his mind about his choice of word.

‘Drone’ is a word that’s growing in importance for anyone who wants to live the best life they can. It’s a fascinating word with a range of biological, sonic, technological, and metaphorical uses. For example, drone is a function, a sound, a warning, and a weapon. It can describe the buzz of a bee, the whirr of a machine, a worker, some of humanity’s most advanced tools, and a shadow overhead to be feared by civilian and military personnel alike. Ten years ago, it was mainly used to refer to bees or the experimental technology of unmanned aerial vehicles (UAVs), but at the start of this decade it became used mostly as a descriptor for any autonomous or remotely controlled civilian or military flying object. Today it is a blanket term for any man-made, autonomous, or remotely controlled flying object that can perform any civilian or military function. When someone uses the word today, it will mostly be in the context of weapons used in Ukraine and the Middle East, and not bees!

Declaring ‘the word of the decade’ halfway through a decade might be foolhardy, but the Badger’s sticking with it, because he feels that clever, man-made, affordable, flying objects for civilian and military purposes will continue to evolve rapidly and become a historically significant feature of this decade. Meanwhile, the bee population, essential pollinators in nature, is in decline. Somehow the word ‘drone’ highlights that humans have their priorities the wrong way round. If you want to live your best life, then change something – plant something in the garden to attract bees…

Delhi, AI, and a rosy future for IT services companies?

‘For the times they are a-changin’’, sang Bob Dylan in the 1960s. This is particularly apt today given four recent matters which, in the broadest sense, have IT at their core.

First, Meta’s CEO has testified for the first time before a jury to defend against accusations that Meta’s social media platforms harm children’s mental health, and that the platforms are designed to prioritise keeping users scrolling to maximise profits. The trial’s outcome could prove seismic. Second, ‘Epstein data’ has triggered an inevitable media and political frenzy and repercussions for some individuals, but it has produced little so far that would stand up in a court of law. Nevertheless, ‘Epstein data’ is a reminder of the dangers of email, and that using services underpinned by IT always leaves a record somewhere. The third matter is the impact of AI-driven fears on the share prices of major IT services companies. Investors are anxious about the future demand for IT consulting/services. At the time of writing, the share prices of Accenture, Capgemini, CGI, Sopra Steria, Tata Consultancy Services, and Infosys have dropped by 42%, 36%, 36%, 32%, 28%, and 27%, respectively, over the last 12 months. The market is taking a sober look at the impact of AI.

The fourth matter is the AI Impact Summit in Delhi, the largest ever global gathering of world leaders and tech bosses, which ended with 88 nations signing the ‘Delhi Declaration of AI Impact’. Some have called this the ‘Delhi Magna Carta’ to emphasise that it represents a milestone in global cooperation and consensus about AI’s use for economic growth and social good. The Declaration, however, is not legally binding, and so calling it a Magna Carta is a political metaphor rather than a formal treaty. The Declaration’s a political statement of principles which are far from certain to be embedded into national/international laws, standards, and institutions. A hint about why it may ultimately have little influence is captured by a comment from the USA reported in the item here, namely that the USA will not accept ‘global governance of AI’. Why? Because it and China are locked in a structural competition over computational power, microchips, AI-enabled defence systems, and the control of global standards. It’s existential for both, and the Declaration doesn’t change that.

Unsurprisingly, ‘the times they are a-changin’’ rings even truer today, both geopolitically and for IT services companies and their employees. Dario Amodei, CEO of Anthropic, foresees AI eliminating the jobs of many software engineers. It’s always been important for IT and tech companies to be fleet of foot and for IT people to keep their skills current. The Delhi Declaration highlights that this is more important than ever. With AI-driven transformation gathering pace, the market is showing that a rosy future for IT services companies, and their employees, is not guaranteed…

Potholes – Can IT, tech, and AI help?

The UK Department of Transport’s map showing how England’s Local Authorities rate on keeping local roads in good condition and free from potholes is a little embarrassing. Why? Well, apart from the visual impact of the Red, Amber, Green picture, you’ll see here that only 16 (~10%) of 154 Local Authorities are rated Green, 13 (~8%) are Red, and a whopping 125 (~81%) are Amber. So little Green is embarrassing, especially as most road users, if the Badger’s local community is representative, think their Amber-rated Local Authority should really be Red.

The methodology for these ratings is here. There are three underlying scorecards – Condition covering the Local Authority’s road conditions, Spend covering their spend on highways maintenance, and Best Practice covering how well they follow highways maintenance best practice. It’s this Best Practice component in particular that requires attention because only 20 (13%) out of 154 Local Authorities are rated Green. The Badger’s not surprised, having witnessed the way potholes are repaired in his own locality. They are repaired, the same ones reappear a month or so later, they are repaired again… and then they reappear a month or so later again! Why do the repairs constantly fail? The Badger’s observations suggest there are likely underlying problems with the reactive nature of his Local Authority’s repair business process, its contractor management, and the professionalism and quality of the repair itself. The Badger was a little surprised to read that the Institution of Civil Engineers apparently believes that failing pothole repairs are due to the UK’s moderate climate, with temperatures hovering around freezing in winter. When the RAC produces its own pothole index, however, the Badger thinks there’s got to be more to the problem than that.

So, can IT and modern tech help with this problem? Well, people can already report a pothole online using their Local Authority’s website – although the mechanism isn’t always easy to find on the website – or by using a tool like FixMyStreet. Local Authorities also already use Highways Asset Management Software packages of one form or another, and so IT and tech are already playing a role, especially if they are efficiently integrated across the entire ‘cradle to grave’ business process. Are they? Your guess is as good as the Badger’s.

So, what’s the answer to the question? Well, digital innovation and AI in some form seems to be the answer to everything these days, and a case for using it to help with potholes can be seen here. So, the answer is ‘Yes’, but with the following important caveat. The whole business process must first be overhauled to be proactive, with embedded professionalism, quality, and contractor management controls. Simply investing in more IT, tech, and AI without doing this would be an expensive mistake that will not improve the pothole situation on our roads or ease public concern.

Electricity – The lifeblood and Achilles heel of the modern world…

Risk, an unavoidable aspect of daily life, is the possibility of something bad happening. Every personal activity and decision we take involves some level of risk. Understanding this, and managing risk responsibly, builds self-confidence, resilience, independence, and fulfilment. Risk is inescapable for businesses and governments too. Most maintain risk registers and have plans to manage the consequences should they happen. The public version of the UK’s National Risk Register, for example, is here. A few days ago, the Badger’s home experienced a power cut following heavy rain in the area. It was the first for many years, and it reminded the Badger of just how dependent we are in today’s world on electricity. It’s the lifeblood of the modern digital world, but also its Achilles heel. The Heathrow shutdown of March 2025, the Iberian grid collapse of April 2025, and Russia’s relentless attacks on Ukraine’s energy infrastructure all illustrate the chaos that can be unleashed when electricity supply is seriously disrupted.

The Badger’s power cut set him thinking. In an age of global belligerence, could an enemy bring societal chaos to the UK without using cyber techniques or nuclear weapons? Well, yes. Simply knock out a significant number of the nation’s electricity production sites. The country’s electricity supply is vulnerable due to many things, including outdated infrastructure, and so an unexpected coordinated attack using conventional weapons on the top dozen or so non-nuclear generation and interconnector sites would cause havoc with our daily lives. If there was also a simultaneous attack on the undersea data cables connecting the UK to the world digitally, then we would experience chaos like never before.

At this point it’s worth emphasising that this is the output of the Badger’s own musing. It is not derived from having any particular insight into the measures the nation uses to protect its critical national infrastructure. But if the Badger thinks this scenario is plausible, then our defence forces and our enemies will have thought of it too, and so hopefully something similar will already be on the country’s private version of the National Risk Register. But here’s the thing. As an individual, do you spend any time thinking about how you would function during a prolonged loss of electricity and online services? Probably not. Should you? Yes, because you’ll get a flavour of the likely impact of a nationwide blackout here.

Is it prudent to have some appropriate fallback items and mechanisms ‘in the back of a cupboard’ to use if such a scenario occurred? Of course it is. When the Badger was a child, before the modern digital world existed, one of his father’s mantras was ‘Always have something to fall back on because you never know what calamity will unfold tomorrow’. These words seem even more relevant today when electricity is the lifeblood of a modern world that’s more dangerous than it’s been for decades.