Cyber security – a ‘Holy Grail’?

King Arthur was a legendary medieval king of Britain. His association with the search for the ‘Holy Grail’, described in various traditions as a cup, dish, or stone with miraculous healing powers and, sometimes, providing eternal youth or infinite sustenance, stems from the 12th century. Since then, the search has become an essential part of Arthurian legend, so much so that Monty Python parodied it in their 1975 film. Indeed, it’s common for people today to refer to any goal that seems impossible to reach as a ‘Holy Grail’. It’s become a powerful metaphor for a desired, ultimate achievement that’s beyond reach.

Recently, bad cyber actors – a phrase used here to refer collectively to wicked individuals, gangs, and organisations, regardless of their location, ideology, ultimate sponsorship or specific motives – have caused a plethora of highly disruptive incidents in the UK. Incidents at the Co-op, Marks & Spencer, Harrods, JLR, and Kido have been high profile due to the nature and scale of their impact on the companies themselves, their supply chains, their customers, and potentially the economy. Behind the scenes (see here, for example) questions are, no doubt, being asked not only of the relevant IT service providers, but also more generally about how vulnerable we are to cyber security threats.

While taking in the colours of Autumn visible through the window by his desk, the Badger found himself mulling over what these incidents imply in a modern world reliant on the internet, online services, automation and underlying IT systems. As the UK government’s ‘Cyber security breaches survey – 2025’ shows, the number of bad cyber actor incidents reported is high, with many more going unreported. AI, as the National Cyber Security Centre indicates, means that bad actors will inevitably become more effective in their intrusion operations, and so we can expect an increase in the frequency and intensity of cyber threats in the coming years. The musing Badger, therefore, concluded that organisations need to be relentlessly searching for a ‘Holy Grail’ to protect their operations from being vulnerable to serious cyber security breaches. As he watched a few golden leaves flutter to the ground, the Badger also concluded that in a world underpinned by complex IT, continuous digital evolution, and AI, this ‘Holy Grail’ will never be found. But that doesn’t mean organisations should stop searching for it!

These damaging incidents highlight again that cyber security cannot be taken for granted, especially when the tech revolution of recent decades has enabled anyone with a little knowledge and internet access to be a bad cyber actor. The UK government has just announced the introduction of digital ID by 2029. Perhaps they have found a ‘Holy Grail’ that guarantees not only the security of personal data, but also that its IT programmes will deliver on time and to their original budget? Hmm, that’s very doubtful…

AI – Pop goes the weasel!

The Badger’s five-year-old grandson, full of energy, innocence, and inquisitiveness, has been staying for a few days. It’s been fun, tiring, and a reminder that grandparents can be important influencers for Generation Alpha! It was also a reminder that today’s childhood is vastly different to that of previous generations. The Badger’s grandson considers being on WhatsApp video calls, watching kids’ YouTube videos, and engaging with technology like phones, tablets, and laptops in classroom and home settings as routine. This wasn’t the case when the Badger was five, nor was it when the youngster’s Millennial parents were that age!

One evening, just before the lad’s bedtime, the Badger was on the sofa engrossed in the news feed on his smartphone. Reports of anxiety that AI is a stock market bubble about to pop had grabbed his attention. Some reports (like the one here), but certainly not all, stemmed from an MIT report noting that most AI investments made by companies have so far provided zero returns. This fuelled concerns, which have existed in some quarters for a while, that AI is a stock market bubble soon to crash. Many of the reports drew parallels between AI and the dot.com crash of 25 years ago. As a professional in the IT sector at that time, the Badger experienced the dot.com era and its aftermath first-hand, and so he became absorbed in his own thoughts about the parallels. Until, that is, his grandson jumped on the sofa, prodded the Badger’s ribs, and asked to watch a ‘Pop goes the weasel’ cartoon. Struck by how apt ‘Pop goes the weasel’ seemed as a label for his AI thoughts, the Badger found a suitable YouTube cartoon and the two of them watched it on his smartphone. (A kids’ punk-music version of the rhyme didn’t seem suitable just before bedtime.)

Once the youngster was in bed, the Badger cogitated further on the dot.com era and AI. The late 1990s saw rapid tech advances, with many investors expecting internet-based companies to succeed simply because the internet was an innovation. Companies launched on stock markets even though they had yet to generate meaningful revenue or profits and had no proprietary technology or finished products. Valuations boomed regardless of dodgy fundamentals, and the dot.com crash was thus, to those with objectivity, inevitable. To an extent, some of the same dynamics exist with AI today. It may be a transformative technology, with the likes of ChatGPT having impressive traction with people, but AI is really still in its infancy, striving to show a return on investment in a company setting. The Badger senses, therefore, that AI is likely in ‘sizeable correction’ rather than ‘dot.com crash’ territory. This should be no surprise, because the history of tech stock market valuations suggests, to quote the nursery rhyme, ‘that’s the way the money goes. Pop goes the weasel’…

Youngsters outsourcing their mental effort to technology…

Live Aid happened on Saturday 13th July 1985. If you were a young adult then, do you remember what you were doing when the concert happened? Were you there? Did you watch it live on television? The Badger had his hands full that day doing some home renovations while having a one-year-old baby in the house. He thus only saw snippets of the televised live concert. Last weekend, however, he made up for it by watching the highlights broadcast to celebrate the concert’s 40th anniversary.

Watching the highlights brought home why the music at the concert has stood the test of time. It was delivered by talented people with great skill and showmanship, without today’s cosseting production techniques and tech wizardry. What struck a chord most, however, was the enthusiasm of the Wembley Stadium crowd, the vast majority of whom are now grandparents in, or facing, retirement! People in that crowd had none of the internet access, smartphones, or online services we take for granted today. In 1985 the UK’s first cellular telephone services were only just being introduced by Cellnet and Vodafone, and ‘home computing’ meant the likes of the Sinclair ZX Spectrum and the BBC Micro. A far cry from today! Furthermore, those in that crowd represent a generation that thought for themselves and didn’t have their minds dulled by reliance on digital technology and internet-based online services. Their grandchildren, on the other hand, only know life based around the internet, and they often seem oblivious to the likelihood that their reliance on online things like social media might be dulling their minds, nudging them towards a passivity of thought, and perhaps ultimately causing their brains to atrophy.

Concern about technology dulling human minds isn’t new. In Plato’s Phaedrus, written around 370 BC, for example, Socrates worried that writing would erode a person’s memory! With AI endlessly expanding, however, the potential for today’s youngsters to completely outsource mental effort to technology seems very real. A growing body of scientific evidence shows that while the human brain is highly adaptable, digital immersion changes attentiveness, the way we process information, and decision-making. Some brain functions weaken due to digital immersion and others evolve, but the Badger thinks that when our digital world provides instant answers, the joy and effort of discovery through independent thought dwindles. Always-available digital content at our fingertips means attention spans fragment and contemplation and reflection take a back seat, especially for youngsters with no life experience outside today’s online world.

Watching the 40th anniversary highlights thus did more than provide a reminder of the great music of that day. It brought home the fact that today’s grandparents have something precious – a lived experience of independent thought and contemplation without an overreliance on our digital world. It feels, however, as if their grandchildren are progressively outsourcing their mental effort to ever more advanced digital technology, which, this grandfather senses, doesn’t augur well for the human race…

Software – The invisible infrastructure of daily life

We’re living in a world where software is the invisible infrastructure of daily life. There doesn’t appear to be any quantitative measure of what proportion of the things we use actually contain software, but it’s not outrageous to assert that almost everything we use depends on it. Whatever the real proportion is, with the advent of AI and the Internet of Things (IoT) it’s only climbing. Software has blossomed from primitive beginnings to become crucial in every facet of life in less than 75 years. In the 1950s software wasn’t a ‘product’ but something bundled with hardware and used primarily for scientific calculations, military simulations, and data processing. Most of it was written in machine code or early assembly languages tailored to specific hardware, and it was written by the same engineers who built the hardware. A large program was a few thousand lines of code loaded manually via cards or magnetic tape. FORTRAN only emerged in 1956, and the term ‘software engineering’ was only coined in the late 1960s.

How things have changed. Today’s internet, search tools, social media, cars, medical devices, satellites, aircraft, trains, weapons, smartphones and so on depend on many, many millions of lines of code (see here and here, for example). Modern health, financial, energy, government, intelligence, and defence capabilities all rely on huge amounts of software. Indeed, any item in our homes that can sync with a smartphone app contains software. In less than 75 years software has changed life, taken over the world, and become professionalised in the way it’s produced. Writing machine code for specific hardware in a way where ‘every byte counts’ has evolved into the professional, ever-changing and improving, discipline of software engineering, which incorporates design and development processes, methods, standards, tools and techniques that ensure production software meets requirements and is scalable, tested, robust, maintainable, secure, and performant. Software engineering, of which coding is today just a subset, continues to evolve, with the likes of Microsoft and Google anticipating that AI will render the hand-crafting of code redundant by the end of this decade.

The software woven into our lives in recent decades has brought immense convenience and transformed communication, public services, business, and conflict, as world events illustrate. Today it’s undeniably critical infrastructure, but software, unlike most infrastructure, isn’t something you can tangibly touch! Compared to what existed 75 years ago, the software in use today is a sprawling metropolis riddled with vulnerabilities that bad actors can exploit to cause disruption. Are we safer today than in the 1950s, now that everything depends on software? Hmm, that’s debatable, but the Badger senses that even with the advent of AI, software engineering as a discipline – and the career prospects for software engineers focused on solutions, security, quality, robustness, and testing rather than coding – will be good for many, many years yet…

Once upon a time there was the Strategic Defense Initiative (‘Star Wars’)…

There comes a time when a room at home needs a decorative refresh. That time recently came in the Badger household, and so he deployed his practical skills to refurbish the room himself. The project was planned, agreed with an important stakeholder (the wife), and fastidiously executed. The room’s now in the post-delivery phase, with the small list of defects pointed out at acceptance by the important stakeholder now corrected. Painting walls while listening to good music on the radio proved a more satisfying experience than expected. On finishing one wall, and while stepping back to admire his handiwork, the Badger found himself listening to the broadcaster’s regular news bulletin and sighing deeply on hearing that President Trump had unveiled plans for a ~$175 billion US ‘Golden Dome’ missile defence system. Memories of President Reagan’s 1983 Strategic Defense Initiative (SDI) came flooding back.

The goal of SDI was to develop a system that could intercept and destroy incoming nuclear missiles, effectively shielding the USA from a potential Soviet attack during the Cold War. Many dubbed it ‘Star Wars’ because of its proposed use of space-based technology. At the time, the Badger was working on the software design and development of a Relational Database Management System (RDBMS) product – pretty cutting-edge back then. He remembers thinking that SDI would never come to fruition. Indeed, SDI itself was never fully realised, but its ideas have shaped military technology and policy – in missile and space-based defence, cybersecurity strategy, and international collaboration – ever since.

Rolling forward 40 years, the world is a quite different place geopolitically, technologically, economically, and militarily. Daily civilian and military life now depends on digital capabilities that didn’t exist in 1983, and continued rapid tech advances, innovation and AI are changing both domains at a rate never imagined just a few decades ago. Reagan’s SDI and President Trump’s ‘Golden Dome’ share some similarities, but whilst the available tech in 1983 meant the former’s space-based missile defence was largely theoretical, the latter benefits from modern, real, sophisticated satellite, space, sensor, and missile technologies. ‘Golden Dome’ revives elements of SDI, but it also suffers from some of the same challenges, particularly around cost, scepticism about its effectiveness, and concern that it dramatically escalates the global arms race. It’s certain, however, that just as happened when SDI was announced in 1983, military and tech sector commercial organisations will be relishing the prospect of picking up ‘Golden Dome’ contracts, regardless of whether its stated ambitions ever fully come to fruition.

But why did the Badger sigh so deeply on hearing about ‘Golden Dome’ on the radio? It was simply an instant reaction to the feeling that it’s another step on the road to creating the Terminator film’s SKYNET system for real, and that our species seems intent on a path that can lead to eventual self-inflicted extinction.

AI and copyright…

Elton John recently had some sharp words to say about the UK government’s plans to exempt AI technology firms from copyright laws. Apparently, there’s currently a game of ping-pong underway between the House of Commons and the House of Lords regarding this plan. Many writers, musicians, and artists are furious about it, and Elton’s comments caused the Badger to scratch his head and ponder. Why? Because, like many individuals and bloggers, his website’s content could be plundered by AI without his knowledge or permission, regardless of the copyright statement on its home page. With AI models and tools increasingly mainstream, Elton’s words made the Badger realise that he, and probably many others around the globe, should have copyright more prominent in their thoughts.

Copyright law is complex and, as far as the Badger understands it, ‘fair dealing’ or ‘fair use’ allows limited use of copyright material without permission from the copyright owner under specific circumstances. Fair dealing/use is not a blanket permission, and what constitutes it depends on factors such as how much of the material is used, whether its use is justified, and whether it affects the copyright owner’s income. The Badger’s not a lawyer, but he senses that AI and copyright is a legal minefield that will keep experts with digital and legal qualifications in lucrative work for years to come.

As the Badger pondered, he scratched his head again and then asked Copilot if AI used material held on copyrighted websites. The short response was that it (and other AI) follows strict copyright guidelines and only generates brief summaries of copyrighted material, respecting fair-use principles and giving pointers to official sources. To test the efficacy of the answer, the Badger asked Copilot for the lyrics of Elton John’s song ‘Candle in the wind’. Copilot responded with ‘Can’t do that due to copyright’. Typing the same request into the Badger’s browser, however, readily produced the lyrics. Make of that what you will, but it does make you wonder why you would need to use AI like Copilot for this kind of interaction.

At the heart of Elton John’s point is the long-established principle that if a person or an enterprise wants to use copyrighted material in something that produces commercial gain for themselves, then the copyright owner should give prior permission and be paid. AI is a disruptive technology, much of it controlled by the same giant US corporations that already dominate the tech world. AI cannot be ignored, but exempting tech firms from copyright law seems wrong on many different levels. The Badger’s concluded that he should improve his understanding of copyright law, and that AI tech firms must not be exempt from such laws. After all, to take a leaf out of President Trump’s playbook: if you want something, you need permission AND you must pay.

A vintage Fortran source code listing…

The Badger found an old paper Fortran source code listing, in good condition considering its age, at the back of a cupboard this week. It triggered memories of his programming activities early in his IT career. It also caused him to reflect on the changes in IT – and in the way we live and work – that tremendous advances in digital technology have driven over the last 40 years. As illustrated below, this period has been one of continuous, rapid change.

In the 1980s, personal computers began to make their way into businesses and homes. The likes of IBM, Apple, and Microsoft introduced devices that revolutionised how people accessed information and performed tasks. The introduction of graphical user interfaces (GUIs) also made computers more user-friendly, enabling a broader audience to embrace technology. The 1990s brought the birth and expansion of the internet, drastically changing communication, commerce, and entertainment. It brought a new level of connectivity and made information accessible globally at the click of a button. E-commerce giants like Amazon and eBay emerged, transforming the retail landscape and giving rise to online shopping.

The 2000s saw the rise of the mobile revolution. With the introduction of smartphones and tablets, technology became ever more integrated into our work and personal lives. Apple’s iPhone and Google’s Android led the charge, creating app-driven ecosystems that allowed users to perform a myriad of tasks on the go. Mobile internet access became ubiquitous, fostering a new era of social media, instant messaging, and mobile gaming. In the 2010s, cloud computing with Amazon Web Services (AWS), Microsoft Azure, and Google Cloud brought scalable, on-demand computing resources. This facilitated the rise of Software as a Service (SaaS) models, which enable access to software applications via the internet and help businesses reduce infrastructure costs and improve scalability.

In recent years, ‘Big Data’ has meant that organisations can leverage vast amounts of data to gain customer insights, optimise their operations, and make data-driven decisions. AI technologies such as machine learning, natural language processing, and computer vision are also rapidly being integrated into applications from healthcare and finance to autonomous vehicles and smart home devices. In addition, the COVID-19 pandemic accelerated the adoption of remote working and digital collaboration tools, and video conferencing platforms like Zoom and Microsoft Teams have become essential communication and productivity tools.

Anyone working in the IT world over this period has had an exciting time! The Fortran listing reminded the Badger that it was produced when programming was a very human, hand-crafted activity. Source code today is produced differently, and AI will dominate programming in the future. The Badger’s career, spanning all these changes, was challenging, exciting, and creative, and one where dynamism, innovation, teamwork, hard work, and a ‘can do’ mentality were embedded workforce traits. Is that the case today? It has to be in a future dominated by AI.