Youngsters outsourcing their mental effort to technology…

Live Aid happened on Saturday 13th July 1985. If you were a young adult then, do you remember what you were doing when the concert happened? Were you there? Did you watch it live on television? The Badger had his hands full that day doing some home renovations while having a one-year-old baby in the house. He thus only saw snippets of the televised live concert. Last weekend, however, he made up for it by watching the highlights broadcast to celebrate the concert’s 40th anniversary.

Watching the highlights brought home why the music at the concert has stood the test of time. It was delivered by talented people with great skill and showmanship without today’s cosseting production techniques and tech wizardry. What struck a chord most, however, was the enthusiasm of the Wembley Stadium crowd, the vast majority of whom are now grandparents in, or facing, retirement! People in that crowd had none of the internet access, smartphones, or online services we take for granted today. In 1985 the UK’s first cellular telephone services were only just being introduced by Cellnet and Vodafone, and ‘home computing’ meant the likes of the Sinclair ZX Spectrum and the BBC Micro. A far cry from today! Furthermore, those in that crowd represent a generation that thought for themselves and didn’t have their minds dulled by reliance on digital technology and internet-based online services. Their grandchildren, on the other hand, only know life based around the internet, and they often seem oblivious to the likelihood that their reliance on online things like social media might be dulling their minds, nudging them towards a passivity of thought, and perhaps ultimately causing their brains to atrophy.

Concern about technology dulling human minds isn’t new. Around 370 BC, for example, Plato’s Phaedrus has Socrates worrying that writing would erode a person’s memory! With AI endlessly expanding, however, the potential for today’s youngsters to completely outsource mental effort to technology seems very real. A growing body of scientific evidence shows that while the human brain is highly adaptable, digital immersion changes attentiveness, the way we process information, and decision-making. Some brain functions weaken due to digital immersion, others evolve, but the Badger thinks that when our digital world provides instant answers, the joy and effort of discovery through independent thought dwindles. Always-available digital content at our fingertips means fragmented attention spans, with contemplation and reflection taking a back seat, especially for youngsters with no experience of life without today’s online world.

Watching the 40th anniversary highlights thus did more than provide a reminder of the great music of that day. It brought home the fact that today’s grandparents have something precious – a lived experience of independent thought and contemplation without an overreliance on our digital world. It feels, however, as though their grandchildren are progressively outsourcing their mental effort to ever more advanced digital technology which, this grandfather senses, doesn’t augur well for the human race…

Have Millennials benefited from being the first real ‘digital native’ generation?

Millennials are the first ‘digital native’ generation. They’ve grown up with the internet, mobile telephony, and instant information at their fingertips. The digital world of the 1980s and 1990s, when Millennials were born, laid the foundation for today’s advanced capabilities. As Millennials have moved from childhood to adulthood and parenthood,  and from education to employment and family responsibilities, they’ve embraced the relentless wave of digital advances and made them part of their life’s ecosystem. A quick recap of the tech in the 1980s and 1990s illustrates the scale of the digital revolution they have embraced.

The 1980s saw the rise of PCs like the IBM PC, Apple II, and Commodore 64. All had limited processing power. MS-DOS was the popular operating system, hard drives held a few tens of megabytes, and floppy discs (5.25-inch at first, with the 1.44MB 3.5-inch format arriving late in the decade) were the common removable storage medium. Software had largely text-based user interfaces, and WordPerfect and Lotus 1-2-3 dominated word processing and spreadsheets, respectively. Local Area Networks (LANs) started appearing to connect computers within an organisation, and modems provided dial-up access at a maximum rate of around 2,400 bits per second.
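To put that last number into perspective, here’s a rough, back-of-the-envelope sketch in Python (assuming a nominal 1.44MB disc and ignoring modem protocol overheads) of how long it would take to move one floppy disc’s worth of data down a 2,400 bits-per-second line:

# Back-of-the-envelope transfer time for one floppy disc of data over a
# late-1980s consumer modem (nominal figures; protocol overheads ignored).
disc_bytes = 1_474_560               # a '1.44MB' 3.5-inch floppy disc
modem_bits_per_second = 2_400        # typical top speed of the era
seconds = disc_bytes * 8 / modem_bits_per_second
print(f"about {seconds / 60:.0f} minutes")   # roughly 82 minutes

Today, a file of that size arrives over fibre or 5G in the blink of an eye.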

The 1990s saw PCs become more powerful. Windows became a popular operating system, making software more user-friendly and feature-rich, and Microsoft Office gained traction. CD-ROMs arrived, providing around 650MB of storage to supplant floppy discs, hard drive capacities expanded to several gigabytes, and gaming and multimedia capabilities revolutionized entertainment. Ethernet became standard, computer networks expanded, the World Wide Web, email, and search engines gained traction, and mobile phones and Personal Digital Assistants (PDAs) like the PalmPilot emerged.

Today everything’s exponentially more functionally rich, powerful, and globally connected with lightning-fast fibre-optic and 5G internet connectivity. Cloud computing provides scalable convenience, computing devices are smaller, data storage comes with capacities in the terabyte and petabyte range, and social media, global video conferencing, high-definition multimedia, and gaming are standard. Smartphones are universal, fit in your pocket, combine the functions of many devices into one, and have processing power that far exceeds that of computers that filled entire rooms in the 1980s and 90s.

But has the Millennial generation benefited from being the first real ‘digital native’ generation? Yes, and no. This generation has faced significant hurdles affecting their earning potential, wealth accumulation, and career opportunities. Student loan debt, rising housing costs, rising taxes, the 2008 global financial crisis and its aftermath, the COVID-19 pandemic, soaring energy costs, and now perhaps tariff wars are just some of these hurdles. When chatting recently with a group of Millennials who asserted that their generation’s woes were caused by technology, the Badger pointed out first that these hurdles were not the fault of technology, but of ‘people in powerful positions’, and secondly that they should note Forrest Gump’s line ‘Life is like a box of chocolates; you never know what you’re gonna get’. Whatever you get, you have to deal with…and that’s the same for every generation.

When there’s a new sheriff in town…

‘The lunatics have taken over the asylum’. No, that’s not a jab at the world’s leaders, often hyper-wealthy and drunk on power and their own egos; it’s what a young Badger thought many years ago when his employer appointed a new Chief Executive from outside the company. Soon after their arrival, the new CEO appointed more outsiders to key leadership roles. Unsurprisingly, most of them had worked for the CEO before. The workforce quickly grasped that the ‘new sheriff in town’ and their ‘deputies’ were intent on rapidly and ruthlessly making their mark.

At the time, the Badger was leading his very first systems/software development project. The rationale for the rapid changes made by the new CEO seemed unfathomable to someone who was completely focused on delivering his project. Looking back decades later, having accumulated wide-ranging business and delivery experience, it’s clear the company needed change to sharpen its commercial and financial focus. Indeed, the CEO changed it for the better in these respects, but to the detriment of a great embedded workforce culture that was exceptionally team oriented. Wariness and distrust of the new sheriff and their deputies spread through the company, especially when the scale of the salaries, bonuses, and share options being paid to the new leadership became public knowledge.

The Badger’s respected and long-standing line manager at the time supported the need for change. They were, however, vocal in their dissent about the new CEO’s approach and the chaos it caused. They confided to a number of direct reports, including the Badger, that they expected the new sheriff, who was ruthlessly intolerant of anyone with the temerity to challenge the changes being promulgated, to exit them from the company. They were right. Within a few months, they left the company having signed a compromise agreement. On their last day at work, they gave the Badger two pieces of advice, namely, ‘When you deal with any CEO or senior executive, consider them to be psychopaths until you’re sure they’re not’, and ‘Remember that any CEO or senior executive will be your friend, until it suits them not to be’. These struck a truthful chord and prompted the young Badger to learn about the actual characteristics of a psychopath! (In simple terms these are summarized here, for example). These words of wisdom also spurred the Badger to learn more about human behaviour and to use that learning to good effect throughout the rest of his own delivery and leadership career.

And that’s the key message from this item. If you have an opportunity to learn about the rudiments of human psychology, then take it and use what you learn when interacting with, and observing, others. His line-manager’s advice stood the Badger in good stead over the years. Keep it in mind, especially when there’s a ‘new sheriff’ with a new set of ‘deputies’ in town intent on change…

A vintage Fortran Source code listing…

The Badger found an old paper Fortran source code listing, in good condition considering its age, at the back of a cupboard this week. It triggered memories of his programming activities early in his IT career. It also caused him to reflect on how much IT, and the way we live and work, have changed over the last 40 years as a result of the tremendous advances in digital technology. As illustrated below, this period has been one of continuous, rapid change.

In the 1980s, personal computers began to make their way into businesses and homes. The likes of IBM, Apple, and Microsoft introduced products that revolutionized how people accessed information and performed tasks. The introduction of graphical user interfaces (GUIs) also made computers more user-friendly, enabling a broader audience to embrace technology. The 1990s brought the birth and expansion of the internet, drastically changing communication, commerce, and entertainment. It brought a new level of connectivity and made information accessible globally at the click of a button. E-commerce giants like Amazon and eBay emerged, transforming the retail landscape and giving rise to online shopping.

The 2000s saw the rise of the mobile revolution. With the introduction of smartphones and tablets, technology became ever more integrated into our work and personal lives. Apple’s iPhone and Google’s Android led the charge, creating app-driven ecosystems that allowed users to perform a myriad of tasks on the go. Mobile internet access became ubiquitous, fostering a new era of social media, instant messaging, and mobile gaming. In the 2010s, cloud computing with Amazon Web Services (AWS), Microsoft Azure, and Google Cloud brought scalable, on-demand computing resources. This facilitated the rise of Software as a Service (SaaS) models, which enable access to software applications via the internet and help businesses to reduce infrastructure costs and improve scalability.

In recent years, ‘Big Data’ has meant that organizations can leverage vast amounts of data to gain customer insights, optimize their operations, and make data-driven decisions. AI technologies such as machine learning, natural language processing, and computer vision are also rapidly being integrated into applications from healthcare and finance to autonomous vehicles and smart home devices. In addition, the COVID-19 pandemic accelerated the adoption of remote working and digital collaboration tools, and video conferencing platforms like Zoom and Microsoft Teams have become essential communication and productivity tools.

Anyone working in the IT world over this period has had an exciting time! The Fortran listing reminded the Badger that it was produced when programming was a very human, hand-crafted activity. Source code today is produced differently, and AI will dominate programming in the future. The Badger’s career, spanning all these changes, was challenging, exciting, creative, and one where dynamism, innovation, teamwork, hard work, and a ‘can do’ mentality were embedded workforce traits. Is that the case today? It has to be in a future dominated by AI.

AI – A Golden Age or a new Dark Age?

The Badger’s experimented with Microsoft’s Copilot for a while now, sometimes impressed, but often irritated when the tool ends its answer to a question by asking the user’s opinion on the underlying topic of the question. For example, the Badger asked Copilot ‘When will self-driving cars be the majority of vehicles in the UK?’ Copilot’s answer was sensible and distilled from quoted sources, but it ended with ‘What are your thoughts on self-driving cars? Do you think they’ll revolutionize transportation?’. The Badger wanted an answer to his question, not a conversation that would capture, store, and use his opinion for the tool’s own purposes. Responding with ‘None of your business’ gets the reply ‘Got it! If you have any other questions or need assistance with something else, feel free to ask. I’m here to help’. That last phrase should be supplemented with ‘and make money!’

Overall, his experimentation has made him wonder if AI is leading to a new Golden Age for humanity, or a new Dark Age. So, what’s the answer? A new Golden Age, or a worrying Dark Age? AI and Machine Intelligence advocates, giant businesses investing huge amounts of money in the technology, and even governments with a ‘fear of missing out’ are quick to say it’s the former. The Nobel Laureate Geoffrey Hinton, the godfather of AI, isn’t so sure. He articulates the risks well, and he’s highlighted that it isn’t inconceivable that AI could eventually wipe out humanity. Listening to a recent interview with him on the Today programme, BBC Radio 4’s flagship news and current affairs show, struck a chord. It made the Badger realise that such concerns are valid, and that a Dark Age is a possibility.

So where does the Badger stand on the Golden or Dark Age question? Well, the last 25 years have made us believe tech-driven change is a good thing, but that premise should be challenged. New technology may drive change, but it doesn’t necessarily drive progress, because it’s politics that really determines whether change makes people better off overall. Politicians, however, have struggled woefully to deal with tech-driven change and the new problems it’s created for society so far this century. There’s little sign this is changing for AI. Humans are fallible and can make poor judgements, but if we become reliant on AI to make choices for us, then there’s a real danger that our confidence and capacity to make our own independent decisions will be lost.

The Badger’s answer is thus nuanced. A Golden Age will unfold in areas where AI is a tool providing a tangible benefit under direct human control, but if AI is allowed to become completely autonomous and more intelligent than humans, then a Dark Age is inevitable. Why? Because things with greater overall intelligence always control things of lower overall intelligence. Can you think of an example where the reverse is true?

‘What’s the point of hard work…if the rewards are taken away years later?’

The UK government announced that from April 2027 any remaining unused pension on an individual’s death will count towards their estate for Inheritance Tax purposes. This is a big change, and it caused the Badger to cogitate! Although farmers, in particular, are already angrily protesting, this change affects anyone, employed or planning retirement, with a Defined Contribution (DC) pension scheme, the norm for most companies and auto-enrolment these days.

From April 2027 any unused pension will be included in a person’s estate for tax purposes. The beneficiary of the unused pension also pays Income Tax when they draw on their legacy. This is double taxation, the morality of which is questionable, and it means that if the beneficiary is a higher rate taxpayer, then their effective tax rate could be a whopping 64%. There’s no doubt that the government’s announcement will significantly change workforce attitudes towards pensions, inheritance planning, and retirement over the coming years.
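For readers who like to see the arithmetic behind that 64% figure, here’s a minimal sketch in Python (assuming a flat 40% Inheritance Tax charge and a 40% higher-rate Income Tax band; nil-rate bands, reliefs, and individual circumstances will change the outcome in practice):

# Illustrative 'double taxation' of an unused pension left to a
# higher-rate taxpayer (assumed flat rates; reliefs and allowances ignored).
pension = 100_000                          # hypothetical unused pension pot (pounds)
after_iht = pension * (1 - 0.40)           # 40% Inheritance Tax leaves 60,000
after_income_tax = after_iht * (1 - 0.40)  # 40% Income Tax on drawdown leaves 36,000
effective_rate = 1 - after_income_tax / pension
print(f"Effective tax rate: {effective_rate:.0%}")   # 64%

In other words, under these assumptions only £36 of every £100 of unused pension actually reaches a higher-rate-taxpaying beneficiary.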

This change doesn’t just affect the ‘wealthy’; it affects those in the broad IT industry who are ‘modestly comfortable’ too. If you’re employed in IT then you’re well paid compared with the UK average, you work hard for what you earn, and you’re likely to be in the ‘modestly comfortable’ category. The IT and tech sectors, important for the UK government’s economic growth aspirations, can be challenging but lucrative if you work hard and continuously develop your skills. In the Badger’s experience, IT people do indeed work hard, go beyond the call of duty, and most are prudent and make sacrifices to provide a modestly comfortable future for their loved ones. They deserve their rewards, but many will now find themselves, as reported recently in The i, in a similar situation to that of Louise Rollings, a single mum who worked for decades at an IT company. She comments:

The changes announced in the Budget make it feel as though people like me are being penalised for having worked hard, prioritised, budgeted and made sacrifices all our lives. As it stands, very little space has been left for people who have worked hard all their lives to build up modest estates to feel appreciated and rewarded. What’s the point in hard work if the rewards of all that ambition and determination are taken away in later years?

Quite! If you work hard, make sacrifices, pay all your taxes, save and invest prudently, and contribute to a DC  pension scheme (like governments encourage you to do and where you carry the underlying investment risk), then the question captured in Ms Rollings’ last sentence is very apt. If IT and tech are important for the country’s  economic growth, then the government needs to encourage more and more people in these sectors to work hard. That’s not likely to happen if you know with certainty that the rewards from your effort will be taken away many years later…

A ‘Budget for Growth’ for smaller, tech-centric businesses?

Digital technology – the electronic systems and resources that help us communicate, work, play, travel, and live today – is everywhere. The Badger recently conducted an experiment, not one that meets the rigours of professional research, by asking those he’s met over the last week about what they thought of when hearing the phrase ‘digital technology’. A young checkout operator at a local store, for example, said social media, the internet, their smartphone and its apps, online shopping and online banking. That was pretty much a summary of all the responses from young and old alike. Why the experiment? Simply to test a perception that the general public associates ‘digital technology’ mainly with well-known mega global corporations and big brands. The experiment essentially affirmed that perception.

But here’s the thing. The UK has many medium-sized companies with <250 employees, many of which fall under the umbrella of ‘digital technology’. Such companies, many of them entrepreneurial family businesses, get little profile even though they are part of the UK’s economic bedrock and build digital technology that is used globally yet remains invisible to the general public, despite touching their lives every day. The Badger knows, for example, of a company whose technology enables, controls and cures printed text and images on the packaging used for foodstuffs, medicines, chemicals, and even Christmas wrapping papers! It’s a global leader, employs <250 people, and its systems are built in the UK, installed worldwide, and managed and maintained from this country via the internet. It’s innovative companies like this that are crucial to our rhythm of life and the country’s success.

One of the Badger’s neighbours, who’s mid-career with children at school, is part of the leadership team at a different tech-centric, smaller company. While chatting recently, the Badger asked them how the recent UK budget would impact their company. ‘We’re used to challenges’ they said with a grin, adding that recruitment had been frozen, leavers were not being replaced, maximising automation had become the top priority, and work was being moved to lower cost offshore locations. They then added, ‘Now my pension pot is subject to inheritance tax, there’s little point in striving for more success or providing longer-term financial security for my family. I expect to leave the workforce within a decade to ensure I spend whatever wealth I’ve accumulated because there’s no point doing otherwise anymore’.

The Badger flinched. It seems a) that the recent budget isn’t a ‘budget for growth’ as far as smaller, tech-centric companies are concerned, and b) that the mindset and priorities of strivers in such companies are already changing. Has the UK government’s budget damaged this country’s smaller ‘digital technology’ companies and their employees’ desire to succeed? Time will tell, but the omens don’t look good…

CEEFAX, pocket calculators, and the best music ever…

The postman pushed a package through the letterbox. The delivery of anything by a regular postman is always a surprise these days, especially when no one is expecting it! As the Badger bent down to pick the package up, alarm bells went off in his head as the security training from his career kicked in. Could this be something dangerous? These fleeting concerns were quickly allayed because there was a return-to-sender name and address handwritten across the sealed end. It was from an old friend that the Badger had caught up with recently over Zoom. The package was opened to reveal two CDs holding 40 of his friend’s favourite songs from the 1970s.

The Badger chuckled. His friend is an entertainer who’s passionate about the music of the 1970s, and during their Zoom session they had reminisced about the music and technology of that decade, and their good times together. The friend had sent the CDs to test whether the Badger still has devices that play this ‘old technology’, which first arrived in the early 1980s. The Badger has, and the sounds of the 1970s filled the home for the rest of the day! Tunes like ‘Mouldy Old Dough’ by Lieutenant Pigeon, ‘Sundown’ by Gordon Lightfoot, ‘Joybringer’ by Manfred Mann’s Earth Band, and ‘It’s Only Rock and Roll’ by the Rolling Stones echoed through the house as a reminder that they were part of the soundtrack to the 1970s, a decade of innovation and technological change.

The Badger remembers the BBC’s launch of CEEFAX 50 years ago in 1974! It was a world first allowing viewers to access text-based information on their TV sets – an internet before the internet! The same decade saw the arrival of battery-operated pocket electronic calculators, electronic ignition systems becoming standard on cars, microprocessors, the start of Apple and Microsoft, the 747 Jumbo Jet, Concorde commercial flights, MRI machines, the Sony Walkman, barcodes, floppy discs and email. There were countless scientific and technological advances, and also an oil crisis and the emergence of Punk!

Today’s life is dominated by digital technology that was science fiction in the 1970s. Developments since have been phenomenal and made the Badger’s career in IT always interesting, perpetually challenging, rewarding, and full of learning. So, if you are a student about to start, for example, a new year at University, then work hard, be inquisitive, learn as much as you can, extend your interests and boundaries, and remember that the technology you use today will be obsolete before long. Remember that today’s science fiction is tomorrow’s reality, and that good music will be played for decades and transcend the generations. After all, music from bands like Abba, Pink Floyd, the Eagles, Queen, Blondie and many more is still popular today, proving that the best music ever comes from the 1970s!

In a world of complication, simplicity is best…

The need to replace a broken light switch this week made the Badger think about how the march of digital technology produces a world of complication for the average person. Visiting a local electrical store for a new switch, one that simply turns the lights on or off when you press its rocker, led to a discussion with the store owner, a friend of a friend. They asked if the Badger wanted a ‘normal’ switch or a ‘smart’ one. The Badger said the former. The shop was quiet, so they chatted.

Knowing the Badger’s IT background, the owner expressed surprise that he didn’t want a ‘smart’ switch that controlled lights using a smartphone app. The owner isn’t actually a fan of ‘smart’ lighting products for the home, but they sell them because they are a highly profitable product line, with Millennials apparently the main customers, although sales had dropped recently. The Badger said a conventional switch served his need because it was simple, performed its primary function well (turning a light on or off), and was devoid of complications like needing a smartphone, an app, or a Wi-Fi network, or worrying about data security. The owner chuckled and called the Badger a dinosaur! ‘You won’t be buying one of my ‘smart’ fridges or washing machines then?’ they asked, waving towards models in the store. They knew the answer.

A discussion on the pros and cons of ‘smart’ fridges and washing machines ensued. The owner believes that most customers for these items never use their digital and network features to the full. Most, they asserted, just use the standard functions found on more traditional, cheaper models. The Badger and the owner agreed that competition between manufacturers to add more ‘smart home’ capabilities to their products meant the units have become packed with features that make them more complicated for the average person to use. What’s wrong with a simple-to-use fridge or washing machine that just concentrates on its fundamental purpose at a sensible price? Nothing, they concluded, as the conversation drew to a close with an influx of new customers.

Since the 1980s, when the information technology landscape we have today didn’t exist, a host of technological and societal changes have occurred. Computational power, the internet, the digitisation of data, systems that interact independently, and new business models have had a massive impact, and many people still struggle with the changes and complications to their daily lives. Technology will keep complicating daily life for the foreseeable future, but people are beginning to shun it in favour of the simplicity of traditional and familiar things that work and have done so for years. Do you really need to be able to talk to your fridge and washing machine? Just because modern technology means you can, doesn’t mean you should…

Physics, Chemistry, Mathematics and Light underpin the digital world of tomorrow…

A trip to the supermarket provided a reminder that without physics, chemistry, and mathematics none of our modern tech, internet, and online services would exist. Hardly a revelation, but what triggered this heightened awareness? Well, just the simple act of taking a small bag of spent disposable batteries to a recycling bin in the supermarket’s checkout area. The bin was full to overflowing with used disposable batteries from toys, clocks, TV remotes, cordless computer keyboards, wireless mice, and a myriad of other devices that use replaceable batteries as a power source. The act of depositing his spent batteries reminded the Badger that each one is actually a little capsule of physics, chemistry, and mathematics, and that our digital world depends on these subjects and batteries of one form or another.

On returning home, the Badger reflected on the science, materials, manufacturing, and recycling of these disposable batteries and whether they’ll ultimately be made redundant by newer power source innovations in the decades ahead. After all, the Titanium Citizen Eco-Drive watch on the Badger’s wrist uses solar and artificial light for power rather than replaceable batteries. It’s a technology that dates back to the mid-1970s, so it’s not new. Furthermore, the 1980s pocket calculator sitting on the Badger’s desk is also solar powered with no replaceable batteries. It’s a memento from a major 1980s software development project and it works just as well today as it did back then! The Badger thus found himself wondering why power derived from light sources hasn’t rendered the disposable battery redundant in the last 40 years. Well, making a functionally reliable, manufacturable, commercially viable product that has physics, chemistry, and mathematics at its core takes years of research. The good news is that this lengthy research looks to be bearing fruit, and we may soon see a revolution that makes natural and artificial light the power source for a wide range of our devices, see here and here.

We should not be surprised that the coming years are likely to see a significant change in how our in-home devices, smart tech, and personal computing devices are powered. The use of replaceable batteries looks destined to decline. There’ll ultimately be no more charging cables, and no more trips to the supermarket to recycle spent batteries! Things, of course, are never that clear cut, but if photons hitting panels on a home’s roof can generate electricity for household use, then it’s surely just a matter of time before light will power our gadgets and render disposable batteries redundant.

Fundamentally, power sources – and everything else in our modern digital world – are determined by physics, chemistry, mathematics and years of research. We should never shy away from being educated in these subjects because they – and light – are the seeds that will determine whatever we want the digital world of the future to be…