Youngsters outsourcing their mental effort to technology…

Live Aid happened on Saturday 13th July 1985. If you were a young adult then, do you remember what you were doing when the concert happened? Were you there? Did you watch it live on television? The Badger had his hands full that day doing some home renovations while having a one-year-old baby in the house. He thus only saw snippets of the televised live concert. Last weekend, however, he made up for it by watching the highlights broadcast to celebrate the concert’s 40th anniversary.

Watching the highlights brought home why the music at the concert has stood the test of time. It was delivered by talented people with great skill and showmanship, without today’s cosseting production techniques and tech wizardry. What struck a chord most, however, was the enthusiasm of the Wembley Stadium crowd, the vast majority of whom are now grandparents in, or facing, retirement! People in that crowd had none of the internet access, smartphones, or online services we take for granted today. In 1985 the UK’s first cellular telephone services were only just being introduced by Cellnet and Vodafone, and ‘home computing’ meant the likes of the Sinclair ZX Spectrum and the BBC Micro. A far cry from today! Furthermore, those in that crowd represent a generation that thought for themselves and didn’t have their minds dulled by reliance on digital technology and internet-based online services. Their grandchildren, on the other hand, have only known life based around the internet, and they often seem oblivious to the likelihood that their reliance on online things like social media might be dulling their minds, nudging them towards passivity of thought, and perhaps ultimately causing their brains to atrophy.

Concern about technology dulling human minds isn’t new. In around 370 BC, for example, Socrates worried that writing would erode a person’s memory! With AI endlessly expanding, however, the potential for today’s youngsters to completely outsource mental effort to technology seems very real. A growing body of scientific evidence shows that while the human brain is highly adaptable, digital immersion changes attentiveness, the way we process information, and decision-making. Some brain functions weaken through digital immersion, others evolve, but the Badger thinks that when our digital world provides instant answers, the joy and effort of discovery through independent thought dwindles. Always-available digital content at our fingertips means fragmented attention spans, with contemplation and reflection taking a back seat, especially for youngsters with no experience of life without today’s online world.

Watching the 40th anniversary highlights thus did more than provide a reminder of the great music of that day. It brought home the fact that today’s grandparents have something precious – a lived experience of independent thought and contemplation without an overreliance on the digital world. It feels, however, as though their grandchildren are progressively outsourcing their mental effort to ever more advanced digital technology which, this grandfather senses, doesn’t augur well for the human race…

Fuzzy information? Still make decisions…

Twenty years ago, on the 7th July 2005, four suicide bombers targeted London’s public transport system during the morning rush hour. At 8:50am three bombs detonated within 50 seconds of each other on Underground trains at Aldgate, Edgware Road, and Russell Square, and a fourth detonated an hour later on a double-decker bus in Tavistock Square. Almost 800 innocent people were injured and 52 lost their lives. The Badger remembers that day clearly. At the time of the bombings, he was attending a UK leadership meeting in his firm’s Great Marlborough Street office, completely oblivious to unfolding events.

The UK CEO had started the meeting at 9:30am even though the UK Sales Director was absent and hadn’t called to say they’d be late. They eventually arrived at 10:20am, perspiring heavily having walked from Waterloo because no Underground trains were running. They said, ‘Something serious is happening. There’s sirens everywhere, the Underground isn’t running, and mobile phone networks aren’t working’. The room’s TV was tuned to a news channel, and everyone present scanned the internet, tried their Blackberry devices, and looked at their corporate emails for information. No one could connect to a mobile phone network. When news of the Tavistock Square bus explosion appeared on the TV there was instant recognition that the meeting could not continue, not least because Tavistock Square was just a 4-minute walk from the company’s main London office housing some hundreds of staff.

The Badger, the company’s lead on business continuity crises, activated the company response to the unfolding event. The meeting room became a rudimentary crisis management centre. Its tools were just a conference phone, laptops providing access to corporate email, the news channel on the TV, and Blackberry devices with, at best, intermittent mobile network connectivity. The Badger and a subset of his colleagues spent the next 10 hours in the room dealing with a maelstrom that involved monitoring the terror incident, mobilising business continuity contacts and processes, establishing the well-being of staff and visitors at the company’s London offices and dealing with their needs, ensuring the continuity of projects and services, making and communicating clear decisions relevant to clients, and verifying the continuity of business operations.

It was an intense day full of fuzzy, confusing, and often conflicting information. For the Badger and his colleagues, the experience reinforced the importance of having cool, unemotional heads to make decisions during crises, especially when information is highly fluid. It also reinforced that fuzzy, confusing, or conflicting information should not be used as an excuse for prevaricating on decision-making when there’s overwhelming pressure. ‘Make a decision, move on, and change it if better information emerges’ was the important dynamic. We eventually went home exhausted, having made many more good decisions than bad. It hadn’t been the routine day in the office the Badger had expected. It had been truly unforgettable…

Gold, e-waste, and a dependence on physical innovation…

There’s a large Acer tree in the Badger’s garden with masses of delicate leaves which rustle sweetly when there’s a hint of breeze. In summer it’s a great place to sit in the shade under its branches with a cold beer. In the UK’s heatwave that’s exactly what the Badger did to escape the sun’s rays, read, track online interests, and cogitate about life. He’s probably drunk more cold beer than is prudent, but chilling out in this way allows the mind to be stimulated by something you read, at least that’s the case with the Badger. You can predict neither the trigger in advance, nor how your thoughts will develop to a conclusion once triggered. So, what caught the Badger’s eye and triggered the stream of thought that prompted this post? It was reading that an interdisciplinary team of scientists has found a new and sustainable way to recover gold from e-waste (see here and here).

The Badger’s interest was piqued because of his metals/materials research background prior to a career in IT, during which a latent interest in metals/materials never entirely disappeared. Gold, recovering it from e-waste, and e-waste itself are fascinating topics given this metal’s unique properties. The total amount of gold ever mined would fit into a cube with sides of just 22 metres, and tiny quantities are used in smartphones, computers, and most other electronic devices. E-waste is any electrical or electronic equipment that’s been discarded, working or not. We all have some – perhaps an old MP3 player, smartphone, or tablet – somewhere in a drawer or cupboard. E-waste volumes, containing gold and other important elements, are growing, yet less than 25% of e-waste is collected and recycled. A new, sustainable, cheaper, and less hazardous way of recovering gold from it is an important development, especially if we stop hoarding our old devices in the first place!
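That 22-metre cube figure is easy to sanity-check with a little arithmetic. The sketch below assumes roughly 212,000 tonnes of gold mined in total and a density of about 19,300 kg per cubic metre – both approximate, illustrative figures rather than precise ones:

```python
# Rough sanity check of the "22-metre cube" claim for all gold ever mined.
# Assumed (approximate) figures: ~212,000 tonnes mined in total, and a
# density for gold of ~19,300 kg per cubic metre.
total_mined_kg = 212_000 * 1_000        # ~212,000 tonnes expressed in kg
density_kg_per_m3 = 19_300              # approximate density of gold

volume_m3 = total_mined_kg / density_kg_per_m3   # total volume in m^3
side_m = volume_m3 ** (1 / 3)                    # side of a cube of that volume

print(f"Volume: {volume_m3:,.0f} m^3, cube side: {side_m:.1f} m")
```

Running it gives a cube side of roughly 22 metres, consistent with the figure quoted above.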

Once triggered, where did the Badger’s thoughts end up? They meandered but concluded something about innovation, a subject that seems to be dominated in the mainstream by AI and new services in the virtual digital world. But here’s the point. None of this virtual digital innovation could exist without underlying ‘true physical innovation’ in the world of metals and materials. Without innovation in the science, extraction, processing, manufacturing, and recycling of condensed matter, none of the electronic devices we rely on in the online digital world of today and tomorrow would exist. Youngsters looking for a stable and fertile career path should thus consider the physics and chemistry of metals/materials, because the world today and in the future depends ever more on innovation in this field. One thing, however, is a certainty. You never know what will trigger your thoughts, and where they will take you, if you relax with a cold beer in the shade under a tree in a heatwave…

Software – The invisible infrastructure of daily life

We’re living in a world where software is the invisible infrastructure of daily life. There doesn’t appear to be any quantitative measure of what proportion of everything we use actually contains software, but it’s not outrageous to assert that almost everything we use depends on it. Whatever the real proportion is, with the advent of AI and the Internet of Things (IoT) it’s only climbing. Software has blossomed from primitive beginnings to become crucial in every facet of life in less than 75 years. In the 1950s software wasn’t a ‘product’ but something bundled with hardware used primarily for scientific calculations, military simulations, and data processing. Most of it was written in machine code or early assembly languages tailored to specific hardware, and it was written by the same engineers who built the hardware. A large program was a few thousand lines of code loaded manually via cards or magnetic tape. FORTRAN only emerged in 1957, and the term ‘software engineering’ was only coined in the late 1960s.

How things have changed. Today’s internet, search tools, social media, cars, medical devices, satellites, aircraft, trains, weapons, smartphones and so on depend on many millions of lines of code (see here and here for example). Modern health, financial, energy, government, intelligence, and defence capabilities all rely on huge amounts of software. Indeed, any item in our homes that can sync with a smartphone app contains software. In less than 75 years software has changed life, taken over the world, and become professionalised in the way it’s produced. Writing machine code for specific hardware in a way that ‘every byte counts’ has evolved into the professional, ever-evolving discipline of software engineering, which incorporates the design and development processes, methods, standards, tools, and techniques that ensure production software meets requirements and is scalable, tested, robust, maintainable, secure, and performant. Software engineering, of which coding is today just a subset, continues to evolve, with the likes of Microsoft and Google anticipating, for example, that AI will render the hand-crafting of code redundant by the end of this decade.

The software woven into our lives in recent decades has brought immense convenience and transformed communication, public services, business, and conflict, as world events illustrate. Today it’s undeniably critical infrastructure, but software, unlike most infrastructure, isn’t something you can tangibly touch! Compared to what existed 75 years ago, the software in use today is a sprawling metropolis riddled with vulnerabilities that bad actors can use to cause disruption. Are we safer today than in the 1950s, now that everything depends on software? Hmm, that’s debatable, but what the Badger senses is that even with the advent of AI, software engineering as a discipline, and the career prospects for software engineers focused on solutions, security, quality, robustness, and testing rather than coding, are good for many, many years yet…

Smartwatch, traditional watch, or both?

Is there a smartwatch from the likes of Apple, Samsung, Huawei and others on your wrist? A decade ago, smartwatches were essentially novelties for tech enthusiasts. Today they’re mainstream. In the ten years since Apple unveiled its first watch they’ve become popular, wrist-worn command and control centres for time, date, productivity aids, communication, fitness, and personal health. Globally there are more than 450 million smartwatches in use, and the number is expected to rise to ~750 million by 2029. Many people are turning to smartwatches from traditional mechanical/automatic watches because they do significantly more than just tell the time, and their capabilities continue to expand as technology marches on.

So, does this mean the traditional wristwatch, which first appeared in the 19th century, will soon be obsolete? Many say yes, but the Badger thinks otherwise. A traditional mechanical/automatic watch performs its purpose of providing the time and date extremely well. Accordingly, it’ll be around for many decades yet because it has design simplicity, is robust, doesn’t require frequent battery charging or software updates, and is immune to cyber threats. Traditional watches provide their core function – the time and date – in aesthetically pleasing hardware that can be chosen to suit any lifestyle or occasion. Many think that a traditional watch’s lack of connectivity to today’s online world is a disadvantage. On the contrary, the Badger thinks it’s an advantage.

Smartwatches, of course, come in many guises, but one thing fundamentally drives their design, namely convenient access to the services and information that underpin the rhythm of life in the modern digital world. Their manufacturers routinely enhance their design, functionality, and usability as a wrist-based hub for time, date, and things like voice and message communication, activity and fitness tracking, and personal health monitoring and diagnostics. As a convenient computer on our wrists, however, they are yet another screen that grabs attention. They need regular battery recharges, and software and security updates to protect against cyber threats. And, as with smartphones, there’s always a better model coming soon!

So, are smartwatches rendering traditional mechanical/automatic watches obsolete? No. Why not? Because most people today understand the dangers of the digital world, and they are increasingly aware from world events of the inconvenience and turmoil that can ensue when key energy, communication, and online infrastructure is damaged. Their smartwatch could be rendered useless in such circumstances, whereas a traditional mechanical or automatic watch will continue to deliver its core function, time and date, unabated. So don’t ditch your traditional watch for a smartwatch, have and use both (as the Badger does). You will then always be able to access the time and date on your wrist should a digital disaster occur. The obsolescence of traditional watches is a long way off because in the current world climate it’s prudent to have non-digital contingencies for unexpected digital difficulties…

Everyone is a salesperson…

Good senior leaders and managers often enjoy being invited to speak to employees attending company training courses. The Badger certainly did. His sessions not only always delivered a message relevant to the training course but also provided an opportunity for attendees to ask questions about any subject close to their heart. Their questions were often diverse and required quick thinking to answer, but that’s what made the sessions fun! It was always rewarding to see attendees relax during the sessions, to hear their responses to the Badger’s answers, and to observe body language when the audience stayed silent. It was also pleasing when ‘light bulb’ moments spread across attendees’ faces if an answer triggered a rush of understanding.

As a leader strongly focused on IT delivery, the Badger spoke mainly to training course groups from the business operation, delivery, and technical communities. Their questions were sometimes unusual. For example, on one occasion the Badger was asked, ‘I hear senior people utter their favourite sayings frequently, but which of these have merit because they encapsulate a truth?’ The Badger gathered his thoughts for a moment before rattling off a string of common phrases in use in the company and signalling that they all had merit because they all captured a truth relevant in any company. The string included, for example, the following:

‘What gets measured gets done…’

‘You don’t jump high unless the bar’s set high…’

‘If you bring problems then bring solutions too…’

‘Time is precious, get to the point…’

‘Decisions aren’t about making everyone happy…’

‘Everyone is a salesperson…’

This last one prompted an indignant response from a couple of attendees who were software engineers. They were contemptuous of salespeople and unequivocal that they were not, and never would be, salespeople. When asked whether they interacted with peers in their clients’ organisations on their projects, they answered yes. The Badger pointed out that they were actually representing the company when they interacted with external people, and that made them salespeople of sorts regardless of their job titles. He also highlighted that being ‘sales aware’ during such interactions was important because they were well placed to spot the early signs of potential avenues of further work, which could be fed into the company’s main sales machinery for qualification and potential follow-up by others. They remained unpersuaded, and so the Badger pointed out that without sales the company would fail, and they would be out of a job! Their facial expressions changed as a ‘light bulb’ moment hit home on realising that even technical IT staff must be commercially and sales aware and acknowledge that ‘Everyone is a salesperson’ of sorts. ‘Sales’ is not a dirty word. It is at the heart of a company’s success and the employment of everyone within it. Remember, everyone is a salesperson…

Once upon a time there was the Strategic Defense Initiative (Star Wars)…

There comes a time when a room at home needs a decorative refresh. That time recently came in the Badger household, and so he deployed his practical skills to refurbish the room himself. The project was planned, agreed with an important stakeholder (the wife), and fastidiously executed. The room’s now in the post-delivery phase, with the small list of defects pointed out at acceptance by the important stakeholder now corrected. Painting walls while listening to good music on the radio proved a more satisfying experience than expected. On finishing one wall, and while stepping back admiring his handiwork, the Badger found himself listening to the broadcaster’s regular news bulletin and sighing deeply on hearing that President Trump had unveiled plans for a ~$175 billion US ‘Golden Dome’ missile defence system. Memories of President Reagan’s 1983 Strategic Defense Initiative (SDI) came flooding back.

The goal of SDI was to develop a system that could intercept and destroy incoming nuclear missiles, effectively shielding the USA from a potential Soviet attack during the Cold War. Many dubbed it ‘Star Wars’ because of its proposed use of space-based technology. At the time, the Badger was working on the software design and development of a Relational Database Management System (RDBMS) product – pretty cutting edge back then. He remembers thinking that SDI would never come to fruition. Indeed, SDI itself was never fully realised, but its ideas have shaped military technology and policy in missile and space-based defence, cybersecurity strategy, and international collaboration ever since.

Rolling forward 40 years, the world is a quite different place geopolitically, technologically, economically, and militarily. Daily civilian and military life now depends on digital capabilities that didn’t exist in 1983, and continued rapid tech advances, innovation, and AI are changing both domains at a rate never imagined just a few decades ago. Reagan’s SDI and President Trump’s ‘Golden Dome’ share some similarities, but whilst the available tech in 1983 meant the former’s space-based missile defence was largely theoretical, the latter benefits from modern, real, sophisticated satellite, space, sensor, and missile technologies. ‘Golden Dome’ revives elements of SDI, but it also suffers from some of the same challenges, particularly around cost, scepticism about its effectiveness, and concern that it dramatically escalates the global arms race. It’s certain, however, that just as happened when SDI was announced in 1983, military and tech sector commercial organisations will be relishing the prospect of picking up ‘Golden Dome’ contracts, regardless of whether its stated ambitions ever fully come to fruition.

But why did the Badger sigh so deeply on hearing about ‘Golden Dome’ on the radio? It was simply an instant reaction to the feeling that it’s another step on the road to creating the Terminator films’ Skynet system for real, and that our species seems intent on a path that could lead to eventual self-inflicted extinction.

AI and copyright…

Elton John recently had some sharp words to say about the UK government’s plans to exempt AI technology firms from copyright laws. Apparently, there’s currently a game of ping-pong underway between the House of Commons and the House of Lords regarding this plan. Many writers, musicians, and artists are furious about the plan, and Elton’s comments caused the Badger to scratch his head and ponder. Why? Because, like many individuals and bloggers, his website’s content could be plundered by AI without his knowledge or permission regardless of the copyright statement on its home page. With AI models and tools increasingly mainstream, Elton’s words made the Badger realise that he, and probably many others around the globe, should have copyright more prominent in our thoughts.

Copyright law is complex and, as far as the Badger understands, ‘fair dealing’ or ‘fair use’ allows limited use of copyright material without permission from the copyright owner under specific circumstances. Fair dealing/use is not a blanket permission, and what constitutes this depends on factors such as how much of the material is used, whether its use is justified, and whether it affects the copyright owner’s income. The Badger’s not a lawyer, but he senses that AI and copyright is a legal minefield that will keep experts with digital and legal qualifications in lucrative work for years to come.

As the Badger pondered, he scratched his head again and then asked Copilot if AI used material held on copyrighted websites. The short response was that it (and other AI) follows strict copyright guidelines and only generates brief summaries of copyrighted material, respecting fair use principles and with pointers to official sources. To test the efficacy of that answer, the Badger asked Copilot for the lyrics of Elton John’s song ‘Candle in the Wind’. Copilot responded with ‘Can’t do that due to copyright’. Typing the same request into the Badger’s browser, however, readily produced the lyrics. Make of that what you will, but it does make you wonder why you would need to use AI like Copilot for this kind of interaction.

At the heart of Elton John’s point is the long-established principle that if a person or an enterprise wants to use copyrighted material in something that produces a commercial gain for themselves, then the copyright owner should give prior permission and be paid. AI is a disruptive technology, much of it controlled by the same giant US corporations that already dominate the tech world. AI cannot be ignored, but exempting tech firms from copyright law seems wrong on many different levels. The Badger’s concluded that he should improve his understanding of copyright law, and that AI tech firms must not be exempt from such laws. After all, to take a leaf out of President Trump’s playbook: if you want something, you need permission AND you must pay.

A career as a TikTok/Instagram influencer?

If a student says they intend to develop a career as a social media influencer on TikTok, Instagram, and other platforms, and they ask your opinion on their intent, what would you say? The Badger was put on the spot and asked this question during a discussion with a sizeable group of university students midway through their degree courses. Most in the group were studying various flavours of science, engineering, computing, or IT-based subjects. So, what did the Badger answer?

Well, to create a little time to marshal his thoughts, the Badger asked the group to raise a hand if they thought being a TikTok or Instagram influencer was a career path that needed a degree-level education. Only two students put a hand up. A couple commented dryly that most social media influencers had little underlying talent or expertise and were focused on their egos and on gaining celebrity, notoriety, and money rather than on something beneficial for today’s world. That’s harsh, but it’s an understandable perspective. Whether we like it or not, however, becoming a social media influencer is the aspiration of many young digital natives because it’s seen as an easy and convenient way to generate an income.

So, is being a social media influencer a real career path? Many believe so, ostensibly because some with that label make considerable sums of money through brand partnerships, sponsorships, advertising, and selling merchandise. They also perceive that influencers don’t need high educational qualifications, although they must be adaptable and adept at analysing trends and staying relevant as audience preferences change. There’s no doubt that some influencers have skills in content creation, marketing, and audience engagement, plus a natural charisma and flair for storytelling, but the reality is that only a small percentage succeed in making a reasonable living from their efforts. As in any career, success as an influencer on the likes of TikTok and Instagram requires some competence and skill, and so it would be foolish to suggest that being a social media influencer is not a legitimate career path in today’s world.

The Badger was thus careful when answering the student’s question. He simply passed on the advice given by his father when the Badger was first deciding to further his own education at university, namely: ‘Get the best education you can in a subject you enjoy and are good at. Don’t pre-suppose how you’ll use that in the future, because life has a habit of taking you in unexpected directions’. The students thought this was wise counsel, because none of them thought they would secure jobs directly relevant to their degree subject. That’s a shame, but it was ever thus. They unanimously concluded that if you intend to have a career as a social media influencer, then it’s prudent to get the best education you can first.

VE Day, Gen-Z, resilience and preparedness…

Many have family members who lived through the violence and hardships of World War 2 as civilians or combatants. Their experiences shaped not only their own lives, but also the values they instilled in their children. The Badger’s father, for example, proudly served his country in the military and then worked hard to create a better life for his family once he was demobbed. He was the epitome of that ‘Keep calm and carry on’ and ‘There’s no such word as can’t, try!’ generation, and he brought his children up to embody discipline, standards, hard work, duty, calm objectivity, preparedness, and a sense of right and wrong. These instilled values have served the Badger well over the years. The 80th anniversary of Victory in Europe (VE) Day, a day which saw spontaneous rejoicing and street parties, is being celebrated on Thursday the 8th May 2025. It’s an opportunity to reflect on the sacrifices and resilience of the WW2 generation, civilians and combatants, who resisted tyranny. It will be poignant for the Badger because his father, sadly no longer with us, was unable to celebrate on VE Day at the time.

Life is very different today, as the Badger explained to a couple of Generation Z digital natives last weekend. Homes in the 1940s were different. The internet, social media, instant communication, music and video streaming, electronic games, smartphones, personal computers, online banking, online shopping, robots, and driverless cars were science fiction, and children played physical games that would make today’s health and safety coterie wince. The Gen-Z natives struggled to relate to how life functioned in the 1940s without digital technology. The Badger then asked them two questions: what would you do if a) the UK experienced an electricity blackout akin to that seen recently on the Iberian peninsula, or b) cyber-attacks took out online and critical infrastructure services for a prolonged period? ‘We’ll get by until someone sorts things out’ was the glib response, although they had no real idea about how they would actually get by! This made the Badger wonder about the resilience of our completely digital-native Gen-Z generation. As individuals, perhaps we’ve all become complacent about the risks associated with our dependence on digital services.

In fact, do you know how you would ‘keep calm and carry on’ if digital services suddenly disappeared for a prolonged period? Do you have any personal emergency measures or pack of essentials to fall back on if something catastrophic happened to the electricity grid? Individuals rarely consider such questions even though our digital world is highly complex and believing ‘it’ll never happen’ just reflects naivety. Without their tech will digital-native Gen-Z ever be as resilient, resourceful, and prepared to make sacrifices like those of the 1940s in really tough times? If the Badger’s conversation was anything to go by, the jury’s most definitely out…