AI – from ‘build, baby, build’ to ‘bust, baby, bust’?

Every Christmas/New Year period, the BBC’s Radio 4 Today programme invites well-known individuals to guest-edit the programme. Each guest focuses on a topic relevant to their interests, experience, and society. Two of the Christmas 2025 guests were inventor, engineer, and businessman Sir James Dyson and the AI pioneer and entrepreneur Mustafa Suleyman. The Badger was driving to visit relatives on the days they were guest-editing and had the Today programme on the radio as background noise on both occasions. He turned the volume up when each man was interviewed, because both are intelligent, impressive, and articulate individuals conveying enormous common sense and objectivity, characteristics which seem in short supply today.

Their words resonated with the Badger. Sir James Dyson, for example, likes ‘doers’ rather than ‘talkers’, and Mustafa Suleyman spoke eloquently about AI, insisting that it must be ‘a tool in the hands of and under the control of humans if it’s to benefit all of humankind’. There are plenty of ‘talkers’ in the world, but it’s ‘doers’ like these two, with vision, objectivity, common sense, and a passion for humankind, rather than politicians, who have the greatest influence on the lives of most people. The Badger agrees that AI is a tool. There are plenty of ‘talkers’ concerned that humans will become subservient to AI, but if we let that happen then we only have ourselves to blame. There’s currently a huge ‘build, baby, build’ rush to construct new, giant, energy-hungry AI data centres and to amass and use the chips and devices they need to function. Enormous sums are being spent around the world, the technology continues to advance way ahead of any regulation, and AI company stock market valuations are stratospheric. The Badger worked in IT during the dot.com era, and the words of these two men made him ponder further on the current AI ‘build, baby, build’ surge.

Four conclusions emerged. The first was that such surges often produce over-capacity and ‘bust, baby, bust’ outcomes (cf. China’s property crash); the bigger the boom, the deeper and longer the bust! The second was that AI is here to stay, but some huge AI companies will not survive, even though the AI market bubble is not like the dot.com era, when many companies with high valuations had no revenues. Inevitably, when investor appetite for speculative risk tightens for any reason, and it will, a painful correction will happen. The third was that eyebrows should be raised when tech companies arrange for the restart of shuttered nuclear facilities to provide electricity for their new data centres.

The Badger’s last conclusion was that we should question whether the world’s leaders, including those of hyperscale global tech corporations, are the right kind of ‘doers’. Do they have objectivity, common sense, and mankind’s well-being at heart, or are they just examples of Lord Acton’s 1887 line that ‘power tends to corrupt, and absolute power corrupts absolutely’? Whatever the answer, 2026 looks likely to be a troublesome year…

AI and trust…

Misinformation, disinformation, scams, and questionable videos have been commonplace aspects of social media for years. The Badger, like many, has become distrustful of content pushed to him by algorithms because it is often not what it appears or purports to be. Three typical examples of content that’s helped to fuel the Badger’s distrust are as follows. The first is spectacular, obviously fake videos of shipping and aircraft incidents that put Hollywood movies to shame. The second is content from activist or political groups that criticise or parody others and promise a better future; such groups are unreliable and frequently blinkered, with short memories. The third is incessant clickbait; life’s too short to waste time clicking such links. Putting it diplomatically, you can tell by now that the Badger’s trust in what’s pushed to his social media feeds is not high.

AI, of course, is increasingly helping the producers of the content that has led to this erosion of trust. As this report from the University of Melbourne in Australia highlights, there’s a complex relationship between AI adoption and trust. It reports that while 66% of its survey respondents use AI regularly and believe in its benefits, less than half (46%) trust the AI they use. The Badger aligns with this finding. He’s an occasional user of AI, but he doesn’t trust it. This ‘trust gap’ – as the report highlights – is a critical challenge for AI’s wider adoption.

Reflecting on this led the Badger to two conclusions. The first was that since anyone can create content with AI tools, it’s inevitable that the volume and sophistication of misinformation, disinformation, scams, and questionable video content in social media feeds will increase further. Soon the question to ask yourself about social media feeds will no longer be ‘what’s fake?’… but ‘what’s real?’  The second was that this, together with society’s huge energy bill for AI and its unsustainably high stock market valuations, is widening rather than closing the Badger’s ‘trust gap’.

AI tools are here to stay, but as the report above points out, the biggest challenge for AI is trust. As the common adage highlights, trust is the easiest thing in the world to lose and the hardest thing in the world to get back. At present, it doesn’t feel as if AI is winning the battle for our trust. The Badger’s current overall feeling about the question of trust is nicely summed up by this passage from J.K. Rowling’s book ‘Harry Potter and the Chamber of Secrets’: ‘Ginny!’ said Mr. Weasley, flabbergasted. ‘Haven’t I taught you anything? What have I always told you? Never trust anything that can think for itself if you can’t see where it keeps its brain?’  For the Badger, the last sentence of this passage, written more than a quarter of a century ago, gets to the nub of the AI and trust issue…

The imagination of children – Lego and AI…

While returning home from a stroll through a glorious deciduous wood resplendent with autumn colour, the Badger saw an interesting book in a charity shop window. He popped in and came out with a carrier bag half-full of Lego bricks of all shapes, sizes, colours, and types rather than the book! The Lego was in great condition at a bargain price, and buying it for his grandson to play with when he visits seemed a no-brainer. On arriving home, the bag was emptied onto a table. There were standard bricks and bases, Technic bricks, wheels, motors, and arms, legs, torsos, heads, and hands from Lego figures, and much more. The Badger was hooked. He spent the rest of the afternoon using his imagination to produce a number of creative masterpieces! Indeed, every day since, the Badger’s improved his masterpieces and created new ones. It’s addictive!

Lego empowers creativity, provides immediate gratification from having built something with your hands, and helps to develop spatial reasoning, design thinking, and problem-solving. Furthermore, it encourages an understanding of mechanics through trial and error. It’s fun, educational, and great for kids (and adults) of all ages, with building often a collaborative and social activity. Kids, for example, learn from each other when they play with it together and when adults help them. Building Lego models together strengthens the bonds between individuals.

As the Badger built his own masterpieces, he remembered that Lego has been an excellent teaching aid at home and in education establishments for decades, as the recent announcement about a teenager building a robotic hand using Lego illustrates. It also struck him that Millennials were shaped by the emergence of the internet, Gen Z were moulded by social media’s evolution, and Gen Alpha – his grandson’s generation – will be defined by the rapid expansion of AI use. The Badger senses a danger, however, that Gen Alpha may simply ask AI for ideas and instructions for what to build from a bag of bricks rather than use their own imagination and individuality to create masterpieces. Always inquisitive, the Badger asked CoPilot what could be built with a bag of mixed Lego bricks. It replied with ideas and instructions, and thus neatly illustrated that the danger of Gen Alpha outsourcing their imagination, creativity, and physical trial-and-error learning to AI is real.

A recent UK study found that ~22% of children aged 8 to 12 already use generative AI tools, which – let’s remember – have not been designed from the outset with children in mind. Have the lessons from social media’s impact on children been learned? The answer’s not obvious, which is why the Badger will be encouraging his grandson to produce his own Lego masterpieces without engaging in virtual world interactions. Another reason, of course, is that the Badger will be able to transfer knowledge and enjoy helping to build the youngster’s creations too…

AI – A ‘Macbeth Moment’?

The Badger was browsing in a shop when ‘Hubble Bubble (Toil and Trouble)’ by Manfred Mann featured in the piped music. It struck a chord with the recent warnings by JP Morgan’s CEO, the Bank of England, and others that an AI bubble could pop. Later that day, while clearing a cupboard, the Badger found his old school notes for Shakespeare’s Macbeth, part of the English Literature syllabus of the time. Scribbled notes about the three witches uttering ‘Double, double toil and trouble; fire burn and cauldron bubble’ caught his eye. The coincidental combination of the song title, these scribbles, and the AI warnings triggered some contemplation on the AI bubble.

During the dot.com debacle of the early 2000s, the Badger was a senior member of a UK stock-exchange-listed IT services company. Such companies, investors believed, would benefit from the dot.com boom. The company’s share price thus rose ~tenfold before collapsing back to its original level when the market realised that dot.com companies were massively over-valued, and many had little real revenue, let alone profit. For years following the crash, doing business in the IT sector was tough. The NASDAQ, for example, crashed from around 5000 to 1100, and it took ~15 years to recover. Many dot.coms disappeared, but the likes of Amazon, eBay, and Google rose from the ashes to become the powerhouses of recovery. Having worked in IT throughout the debacle, the Badger’s instincts are alive to tech bubbles. Today they ring alarm bells.

Whether AI’s a market bubble that bursts, or a transformation that sticks, depends on whether company valuations are grounded in real, scalable business fundamentals or in speculative optimism. Either way, AI is unlike anything seen before, so when JP Morgan, the Bank of England, the World Economic Forum, and others have some anxiety, we should take note, especially as, for example, Nvidia, Anthropic, and OpenAI’s market values have risen many-fold in just two years. There’s unprecedented spending on computational infrastructure, massive bets on future productivity gains, and belief that AI will revolutionise everything. The actual return on investment, however, has not been impressive so far. When the UK National Cyber Security Centre advises organisations to have plans to operate their business without access to computers following a cyber-attack, the hype of an AI-dominated future seems a little questionable.

The Badger’s learned from his dot.com era experience that it’s prudent to be wary. If market valuations become detached from fundamentals, or the availability of computational infrastructure stalls, or the promised productivity gains for organisations don’t materialise, or geopolitically driven export controls cause disruption, then any AI bubble will pop, triggering a huge domino effect. AI is facing a ‘Macbeth moment’. Witches making prophecies surround the bubbling AI cauldron, uttering ‘double, double toil and trouble; fire burn and cauldron bubble’. In the play, Macbeth felt a sense of foreboding… as do more and more of today’s leaders…

The future – microchipped, monitored, and tracked?

The Badger sank onto the sofa after his infant grandson’s parents collected the little whirlwind following a weekend sleepover. The Badger had been reminded that Generation Alpha are the most digitally immersed cohort yet. Born into a world full of tech, they are digital natives from an early age, as was evident during the weekend’s activities. Struck by the youngster’s digital awareness, and especially his independence, curiosity, and eagerness to grasp not just what things are but also why and how they work, the Badger found himself wondering about the digital world that his grandson might encounter in the future.

From his IT experience, the Badger knows that change is continuous and disruptive for IT professionals, organisations, and the public alike. Change in the digital landscape over the last 40 years has been phenomenal. All of the following have caused upheavals on the journey to the digital world we have today: the move from mainframes to client-server and computer networks, relational databases, the PC, spreadsheets and word processing packages, mobile networks and satellite communications, mobile computing, image processing, the internet, online businesses, social media, the cloud, microchip miniaturisation, and advances in software engineering. These have changed the way organisations function, how the general public engages with them, and how people interact with family, friends, and others globally. AI is essentially another transformative upheaval, and one that will impact Generation Alpha and future generations the most.

Data, especially personal data, is the ‘oil’ of today’s and tomorrow’s digital world, and the entities that hold and control it will use it to progress their own objectives. With AI and the automation of everything, the thirst for our data is unlikely to be quenched, which should make us worry about the digital world for Generation Alpha and beyond. Why? Because humans in the hands of tech, rather than the other way around, increasingly seems to be the direction of travel for our world. The UK government’s announcement of a digital ID ‘to help tackle illegal migration, make accessing government services easier, and enable wider efficiencies’ has made the Badger a little uneasy about the digital world his grandson will experience. The backlash, as illustrated by this petition to Parliament, shows the scale of worry that it’s a step towards mass surveillance and state control. Governments, after all, do not have good track records in delivering what they say they will.

As the Badger started to doze on the sofa, he envisaged a future where humans are microchipped and have their lives monitored and tracked in real time from birth to death, as happens with farm animals. He resolved to make sure his grandson learns about protecting his personal data and values a life of personal freedom rather than one controlled by digital systems. The Badger then succumbed to sleep, worn out from activities with a member of Generation Alpha…

Cyber security – a ‘Holy Grail’?

King Arthur was a legendary medieval king of Britain. His association with the search for the ‘Holy Grail’, described in various traditions as a cup, dish, or stone with miraculous healing powers and, sometimes, providing eternal youth or infinite sustenance, stems from the 12th century. Since then, the search has become an essential part of Arthurian legend, so much so that Monty Python parodied it in their 1975 film. Indeed, it’s common for people today to refer to any goal that seems impossible to reach as a ‘Holy Grail’. It’s become a powerful metaphor for a desired, ultimate achievement that’s beyond reach.

Recently, bad cyber actors – a phrase used here to refer collectively to wicked individuals, gangs, and organisations, regardless of their location, ideology, ultimate sponsorship, or specific motives – have caused a plethora of highly disruptive incidents in the UK. Incidents at the Co-op, Marks & Spencer, Harrods, JLR, and Kido have been high profile due to the nature and scale of the impact on the companies themselves, their supply chains, their customers, and potentially the wider economy. Behind the scenes (see here, for example) questions are, no doubt, being asked not only of the relevant IT service providers, but also more generally about how vulnerable we are to cyber security threats.

While taking in the colours of autumn visible through the window by his desk, the Badger found himself mulling over what these incidents imply in a modern world reliant on the internet, online services, automation, and underlying IT systems. As the UK government’s ‘Cyber security breaches survey – 2025’ shows, the number of bad cyber actor incidents reported is high, with many more going unreported. AI, as the National Cyber Security Centre indicates, means that bad actors will inevitably become more effective in their intrusion operations, so we can expect an increase in the frequency and intensity of cyber threats in the coming years. The musing Badger, therefore, concluded that organisations need to be relentlessly searching for a ‘Holy Grail’ that protects their operations from serious cyber security breaches. As he watched a few golden leaves flutter to the ground, the Badger also concluded that in a world underpinned by complex IT, continuous digital evolution, and AI, this ‘Holy Grail’ will never be found. But that doesn’t mean organisations should stop searching for it!

These damaging incidents highlight again that cyber security cannot be taken for granted, especially when the tech revolution of recent decades has enabled anyone with a little knowledge and internet access to be a bad cyber actor. The UK government has just announced the introduction of a digital ID by 2029. Perhaps it has found a ‘Holy Grail’ that guarantees not only the security of personal data, but also that its IT programmes will deliver on time and to their original budget? Hmm, that’s very doubtful…

AI – Pop goes the weasel!

The Badger’s five-year-old grandson, full of energy, innocence, and inquisitiveness, has been staying for a few days. It’s been fun, tiring, and a reminder that grandparents can be important influencers for Generation Alpha! It was also a reminder that today’s childhood is vastly different to that of previous generations. The Badger’s grandson considers being on WhatsApp video calls, watching kids’ YouTube videos, and engaging with technology like phones, tablets, and laptops in classroom and home settings as routine. This wasn’t the case when the Badger was five, nor was it when the youngster’s Millennial parents were that age!

One evening, just before the lad’s bedtime, the Badger was on the sofa engrossed in the news feed on his smartphone. Reports of anxiety that AI is a stock market bubble about to pop had grabbed his attention. Some reports (like the one here), but certainly not all, derived from a report from MIT noting that most AI investments made by companies have so far provided zero returns. This fuelled concerns, existing in some quarters for a while, that AI is a stock market bubble soon to crash. Many of the reports drew parallels between AI and the dot.com crash of 25 years ago. As a professional in the IT sector at that time, the Badger experienced first-hand the dot.com era and its aftermath, and so he became absorbed in his own thoughts about the parallels. Until, that is, his grandson jumped on the sofa, prodded the Badger’s ribs, and asked to watch a ‘Pop goes the weasel’ cartoon. Struck by how apt ‘Pop goes the weasel’ was as a label for his AI thoughts, the Badger found a suitable YouTube cartoon and the two of them watched it on his smartphone. (A kids’ punk-music version of the rhyme didn’t seem suitable just before bedtime.)

Once the youngster was in bed, the Badger cogitated further on the dot.com era and AI. The late 1990s saw rapid tech advances, with many investors expecting internet-based companies to succeed simply because the internet was an innovation. Companies launched on stock markets even though they had yet to generate meaningful revenue or profits and had no proprietary technology or finished products. Valuations boomed regardless of dodgy fundamentals, and the dot.com crash was thus, to those with objectivity, inevitable. To an extent, some of the same dynamics exist with AI today. It may be a transformative technology, with the likes of ChatGPT having impressive traction with people, but AI is really still in its infancy, striving to show a return on investment in a company setting. The Badger senses, therefore, that AI is likely in ‘sizeable correction’ rather than ‘dot.com crash’ territory. This should be no surprise, because the history of tech stock market valuations suggests, to quote the nursery rhyme, ‘that’s the way the money goes. Pop goes the weasel’…

Youngsters outsourcing their mental effort to technology…

Live Aid happened on Saturday 13th July 1985. If you were a young adult then, do you remember what you were doing that day? Were you there? Did you watch it live on television? The Badger had his hands full that day doing home renovations with a one-year-old baby in the house. He thus only saw snippets of the televised live concert. Last weekend, however, he made up for it by watching the highlights broadcast to celebrate the concert’s 40th anniversary.

Watching the highlights brought home why the music at the concert has stood the test of time. It was delivered by talented people with great skill and showmanship, without today’s cosseting production techniques and tech wizardry. What struck a chord most, however, was the enthusiasm of the Wembley Stadium crowd, the vast majority of whom are now grandparents in, or facing, retirement! People in that crowd had none of the internet access, smartphones, or online services we take for granted today. In 1985 the UK’s first cellular telephone services were only just being introduced by Cellnet and Vodafone, and ‘home computing’ meant the likes of the Sinclair ZX Spectrum and the BBC Micro. A far cry from today! Furthermore, those in that crowd represent a generation that thought for themselves and didn’t have their minds dulled by reliance on digital technology and internet-based online services. Their grandchildren, on the other hand, only know life based around the internet, and they often seem oblivious to the likelihood that their reliance on online things like social media might be dulling their minds, nudging them towards a passivity of thought, and perhaps ultimately causing their brains to atrophy.

Concern about technology dulling human minds isn’t new. In Plato’s Phaedrus, written around 370 BC, for example, Socrates worried that writing would erode a person’s memory! With AI endlessly expanding, however, the potential for today’s youngsters to completely outsource mental effort to technology seems very real. More and more scientific evidence shows that while the human brain is highly adaptable, digital immersion changes attentiveness, the way we process information, and decision-making. Some brain functions weaken due to digital immersion, others evolve, but the Badger thinks that when our digital world provides instant answers, the joy and effort of discovery through independent thought dwindles. Always-available digital content at our fingertips means fragmented attention spans, with contemplation and reflection taking a back seat, especially for youngsters who have no experience of life without today’s online world.

Watching the 40th anniversary highlights thus did more than provide a reminder of the great music of that day. It brought home the fact that today’s grandparents have something precious – a lived experience of independent thought and contemplation without an overreliance on our digital world. It feels, however, that their grandchildren are progressively outsourcing their mental effort to ever more advanced digital technology, which, this grandfather senses, doesn’t augur well for the human race…

Software – The invisible infrastructure of daily life

We’re living in a world where software is the invisible infrastructure of daily life. There doesn’t appear to be any quantitative measure of what proportion of everything we use actually contains software, but it’s not outrageous to assert that almost everything we use depends on it. Whatever the real proportion is, with the advent of AI and the Internet of Things (IoT) it’s only climbing. Software has blossomed from primitive beginnings to become crucial in every facet of life in less than 75 years. In the 1950s software wasn’t a ‘product’ but something bundled with hardware and used primarily for scientific calculations, military simulations, and data processing. Most of it was written in machine code or early assembly languages tailored to specific hardware, and it was written by the same engineers who built the hardware. A large program was a few thousand lines of code loaded manually via cards or magnetic tape. FORTRAN only emerged in 1956, and the term ‘software engineering’ was only coined in the late 1960s.

How things have changed. Today’s internet, search tools, social media, cars, medical devices, satellites, aircraft, trains, weapons, smartphones, and so on depend on many, many millions of lines of code; see here and here for example. Modern health, financial, energy, government, intelligence, and defence capabilities all rely on huge amounts of software. Indeed, any item in our homes that can sync with a smartphone app contains software. In less than 75 years software has changed life, taken over the world, and become professionalised in the way it’s produced. Writing machine code for specific hardware in a way that ‘every byte counts’ has evolved into the professional, ever-changing and improving discipline of software engineering, which incorporates design and development processes, methods, standards, tools, and techniques to ensure that production software meets requirements and is scalable, tested, robust, maintainable, secure, and performant. Software engineering, of which coding is today just a subset, continues to evolve, with, for example, the likes of Microsoft and Google anticipating that AI will render the hand-crafting of code redundant by the end of this decade.

The software woven into our lives in recent decades has brought immense convenience and transformed communication, public services, business, and conflict, as world events illustrate. Today it’s undeniably critical infrastructure, but software, unlike most infrastructure, isn’t something you can tangibly touch! Compared to what existed 75 years ago, the software in use today is a sprawling metropolis riddled with vulnerabilities that bad actors can exploit to cause disruption. Are we safer today than in the 1950s now that everything depends on software? Hmm, that’s debatable, but what the Badger senses is that even with the advent of AI, software engineering as a discipline, and the career prospects for software engineers focused on solutions, security, quality, robustness, and testing rather than coding, are good for many, many years yet…

Once upon a time there was the Strategic Defense Initiative (Star Wars)…

There comes a time when a room at home needs a decorative refresh. That time recently came in the Badger household, and so he deployed his practical skills to refurbish the room himself. The project was planned, agreed with an important stakeholder (the wife), and fastidiously executed. The room’s now in the post-delivery phase, with the small list of defects pointed out at acceptance by the important stakeholder now corrected. Painting walls while listening to good music on the radio proved a more satisfying experience than expected. On finishing one wall, and while stepping back to admire his handiwork, the Badger found himself listening to the broadcaster’s regular news bulletin and sighing deeply on hearing that President Trump had unveiled plans for a ~$175 billion US ‘Golden Dome’ missile defence system. Memories of President Reagan’s 1983 Strategic Defense Initiative (SDI) came flooding back.

The goal of SDI was to develop a system that could intercept and destroy incoming nuclear missiles, effectively shielding the USA from a potential Soviet attack during the Cold War. Many dubbed it ‘Star Wars’ because of its proposed use of space-based technology. At the time, the Badger was working on the software design and development of a Relational Database Management System (RDBMS) product – pretty cutting-edge back then. He remembers thinking that SDI would never come to fruition. Indeed, SDI itself was never fully realised, but its ideas have shaped military technology and policies in missile and space-based defence, cybersecurity strategy, and international collaboration ever since.

Rolling forward 40 years, the world is a quite different place geopolitically, technologically, economically, and militarily. Daily civilian and military life now depends on digital capabilities that didn’t exist in 1983, and continued rapid tech advances, innovation, and AI are changing both domains at a rate never imagined just a few decades ago. Reagan’s SDI and President Trump’s ‘Golden Dome’ share some similarities, but whilst the available tech in 1983 meant the former’s space-based missile defence was largely theoretical, President Trump’s plan benefits from modern, real, sophisticated satellite, space, sensor, and missile technologies. ‘Golden Dome’ revives elements of SDI, but it also suffers from some of the same challenges, particularly around cost, scepticism about its effectiveness, and concern that it dramatically escalates the global arms race. It’s certain, however, that just as happened when SDI was announced in 1983, military and tech sector commercial organisations will be relishing the prospect of picking up ‘Golden Dome’ contracts regardless of whether its stated ambitions will ever fully come to fruition.

But why did the Badger sigh so deeply on hearing about ‘Golden Dome’ on the radio? It was simply an instant reaction to the feeling that it’s another step on the road to creating the Terminator films’ Skynet system for real, and that our species seems intent on a path that could lead to eventual self-inflicted extinction.