Work-life balance…

Work can be all-consuming. Organisations emphasise values like ‘employee well-being’ and a ‘people-first culture’, but most really operate with deliverables and timelines as their overwhelming priority. HR departments may advocate for ‘work-life balance’, but business, project, programme, and service delivery leaders invariably push staff for huge effort and heroics to meet a deadline or milestone. In the IT sector, for example, do organisations ever willingly miss a deadline or milestone because of ‘employee well-being’ or their ‘people-first culture’? No.

The Badger’s just had some downtime in Mortehoe on the UK’s North Devon coast. The apartment in which he stayed had wonderful coastal views, and it was while nibbling a scone on its balcony in the afternoon sun that his thoughts turned to work-life balance. Life on the North Devon coast still provides access to all of today’s online services, but the sounds, the sea, the geology, the flora and fauna, and the local lifestyle force relaxation and put work-life balance into perspective. What did the Badger conclude about work-life balance? Simply that it matters. It isn’t just a trendy phrase. It’s a necessity for sustaining energy, protecting mental and physical health, and keeping one’s mind sharp. It matters because burnout reduces productivity and clouds judgement. Downtime helps the brain reset, improving creativity, motivation, and decision-making. It also matters because quality time away from work helps to build a broader perspective on life as a whole.

The Badger concluded years ago that there are three certainties regarding people. The first two are a) people are not machines, and b) they are all different. Some thrive on really intense work periods followed by breaks of really deep rest, while others thrive on a daily structure of predictable routines, boundaries, and pressures interspersed with regular shallower rest periods. We are all different, and so the key to a good work-life balance is simply to adopt a personal rhythm that fuels and refreshes your capability rather than drains it. Finding the rhythm that works for you within the terms of your employment contract is important. There’s a paradox, however. Employment contracts normally include a holiday entitlement to rest and recharge, and yet many people don’t take all their entitlement. The reasons for this are numerous, but sometimes it’s because a) the work culture rewards hustle more than rest, and b) an individual misguidedly thinks that everything will collapse if they take a break. So, what’s the Badger’s third certainty about people? Simple. No one is irreplaceable.

If you accept these people certainties and find your rhythm for work-life balance then you will be healthier, sharper, more productive, and more resilient, and the organisation you work for will perform better too. So, use your holiday entitlement. As the Badger was reminded while nibbling scones in the North Devon sunshine, a break is good for you…

Software – The invisible infrastructure of daily life

We’re living in a world where software is the invisible infrastructure of daily life. There doesn’t appear to be any quantitative measure of what proportion of everything we use actually contains software, but it’s not outrageous to assert that almost everything we use depends on it. Whatever the real proportion is, with the advent of AI and the Internet of Things (IoT) it’s only climbing. Software has blossomed from primitive beginnings to become crucial in every facet of life in less than 75 years. In the 1950s software wasn’t a ‘product’ but something bundled with hardware used primarily for scientific calculations, military simulations, and data processing. Most of it was written in machine code or early assembly languages tailored to specific hardware, and it was written by the same engineers who built the hardware. A large program was a few thousand lines of code loaded manually via cards or magnetic tape. FORTRAN only emerged in the mid-1950s, and the term ‘software engineering’ was only coined in the late 1960s.

How things have changed. Today’s internet, search tools, social media, cars, medical devices, satellites, aircraft, trains, weapons, smartphones and so on depend on many, many millions of lines of code, see here and here for example. Modern health, financial, energy, government, intelligence, and defence capabilities all rely on huge amounts of software. Indeed, any item in our homes that can sync with a smartphone app contains software. In less than 75 years software has changed life, taken over the world, and become professionalised in the way it’s produced. Writing machine code for specific hardware in a way that ‘every byte counts’ has evolved into the professional, ever-changing and improving discipline of software engineering, which incorporates design and development processes, methods, standards, tools, and techniques to ensure that production software meets requirements and is scalable, tested, robust, maintainable, secure, and performant. Software engineering, of which coding is today just a subset, continues to evolve, with, for example, the likes of Microsoft and Google anticipating that AI will render the hand-crafting of code redundant by the end of this decade.

The software woven into our lives in recent decades has brought immense convenience and transformed communication, public services, business, and conflict, as world events illustrate. Today it’s undeniably critical infrastructure, but software, unlike most infrastructure, isn’t something you can tangibly touch! Compared to what existed 75 years ago, the software in use today is a sprawling metropolis riddled with vulnerabilities that bad actors can exploit to cause disruption. Are we safer today than in the 1950s now that everything depends on software? Hmm, that’s debatable, but what the Badger senses is that even with the advent of AI, software engineering as a discipline, and the career prospects for software engineers focused on solutions, security, quality, robustness, and testing rather than coding, are good for many, many years yet…

VE Day, Gen-Z, resilience and preparedness…

Many have family members who lived through the violence and hardships of World War 2 as civilians or combatants. Their experiences shaped not only their own lives, but also the values they instilled in their children. The Badger’s father, for example, proudly served his country in the military and then worked hard to create a better life for his family once he was demobbed. He was the epitome of that ‘Keep calm and carry on’ and ‘There’s no such word as can’t, try!’ generation, and he brought his children up to embody discipline, standards, hard work, duty, calm objectivity, preparedness, and a sense of right and wrong. These instilled values have served the Badger well over the years. The 80th anniversary of Victory in Europe (VE) Day, a day which saw spontaneous rejoicing and street parties, is being celebrated on Thursday 8th May 2025. It’s an opportunity to reflect on the sacrifices and resilience of the WW2 generation, civilians and combatants, who resisted tyranny. It will be poignant for the Badger because his father, sadly no longer with us, was unable to celebrate on VE Day at the time.

Life is very different today, as the Badger explained to a couple of Generation Z digital natives last weekend. Homes in the 1940s were different. The internet, social media, instant communication, music and video streaming, electronic games, smartphones, personal computers, online banking, online shopping, robots, and driverless cars were science fiction, and children played physical games that would make today’s health and safety coterie wince. The Gen-Z natives struggled to relate to how life functioned in the 1940s without digital technology. The Badger then asked them two questions: what would you do if a) the UK experienced an electricity blackout akin to that seen recently on the Iberian peninsula, or b) cyber-attacks took out online and critical infrastructure services for a prolonged period? ‘We’ll get by until someone sorts things out’ was the glib response, although they had no real idea about how they would actually get by! This made the Badger wonder about the resilience of our completely digital-native Gen-Z generation. As individuals, perhaps we’ve all become complacent about the risks associated with our dependence on digital services.

In fact, do you know how you would ‘keep calm and carry on’ if digital services suddenly disappeared for a prolonged period? Do you have any personal emergency measures or pack of essentials to fall back on if something catastrophic happened to the electricity grid? Individuals rarely consider such questions, even though our digital world is highly complex and believing ‘it’ll never happen’ just reflects naivety. Without their tech, will digital-native Gen-Z ever be as resilient, resourceful, and prepared to make sacrifices in really tough times as the generation of the 1940s? If the Badger’s conversation was anything to go by, the jury’s most definitely out…

Have Millennials benefited from being the first real ‘digital native’ generation?

Millennials are the first ‘digital native’ generation. They’ve grown up with the internet, mobile telephony, and instant information at their fingertips. The digital world of the 1980s and 1990s, when Millennials were born, laid the foundation for today’s advanced capabilities. As Millennials have moved from childhood to adulthood and parenthood, and from education to employment and family responsibilities, they’ve embraced the relentless wave of digital advances and made them part of their life’s ecosystem. A quick recap of the tech of the 1980s and 1990s illustrates the scale of the digital revolution they have embraced.

The 1980s saw the rise of PCs like the IBM PC, Apple II, and Commodore 64. All had limited processing power. MS-DOS was the popular operating system, hard drives held a few tens of megabytes, and floppy discs (reaching 1.44MB by the end of the decade) were the common removable storage medium. Software had largely text-based user interfaces, and WordPerfect and Lotus 1-2-3 dominated word processing and spreadsheets, respectively. Local Area Networks (LANs) started appearing to connect computers within an organisation, and modems provided dial-up access at rates of up to 2,400 bits/second.

The 1990s saw PCs become more powerful. Windows became a popular operating system, making software more user-friendly and feature-rich, and Microsoft Office gained traction. CD-ROMs arrived providing 700MB of storage to replace floppy discs, hard drive capacities expanded to several gigabytes, and gaming and multimedia capabilities revolutionised entertainment. Ethernet became standard, computer networks expanded, the World Wide Web, email, and search engines gained traction, and mobile phones and Personal Digital Assistants (PDAs) like the Palm Pilot emerged.

Today everything’s exponentially more functionally rich, powerful, and globally connected with lightning-fast fibre-optic and 5G internet connectivity. Cloud computing provides scalable convenience, computing devices are smaller, data storage comes with capacities in the terabyte and petabyte range, and social media, global video conferencing, high-definition multimedia, and gaming are standard. Smartphones are universal, fit in your pocket, combine the functions of many devices into one, and have processing power that far exceeds that of computers that filled entire rooms in the 1980s and 90s.

But has the Millennial generation benefited from being the first real ‘digital native’ generation? Yes, and no. This generation has faced significant hurdles affecting their earning potential, wealth accumulation, and career opportunities. Student loan debt, rising housing costs, rising taxes, the 2008 global financial crisis and its aftermath, the COVID-19 pandemic, soaring energy costs, and now perhaps tariff wars are just some of these hurdles. When chatting recently to a group of Millennials who asserted that their generation’s woes were caused by technology, the Badger pointed out first that these hurdles were not the fault of technology but of ‘people in powerful positions’, and secondly that they should note Forrest Gump’s line ‘Life is like a box of chocolates; you never know what you’re gonna get’. Whatever you get, you have to deal with it…and that’s the same for every generation.

AI – A Golden Age or a new Dark Age?

The Badger’s experimented with Microsoft’s Copilot for a while now, sometimes impressed, but often irritated when the tool ends its answer to a question by asking the user’s opinion on the underlying topic. For example, the Badger asked Copilot ‘When will self-driving cars be the majority of vehicles in the UK?’ Copilot’s answer was sensible and distilled from quoted sources, but it ended with ‘What are your thoughts on self-driving cars? Do you think they’ll revolutionize transportation?’. The Badger wanted an answer to his question, not a conversation that captures, stores, and uses his opinion for the tool’s own purposes. Responding with ‘None of your business’ gets the reply ‘Got it! If you have any other questions or need assistance with something else, feel free to ask. I’m here to help’. That last phrase should be supplemented with ‘and make money!’

Overall, his experimentation has made him wonder whether AI is leading humanity to a new Golden Age, or a new Dark Age. So, what’s the answer? A new Golden Age, or a worrying Dark Age? AI and Machine Intelligence advocates, giant businesses investing huge amounts of money in the technology, and even governments with a ‘fear of missing out’ are quick to say it’s the former. The Nobel Laureate Geoffrey Hinton, often described as the godfather of AI, isn’t so sure. He articulates the risks well, and he’s highlighted that the possibility of AI eventually wiping out humanity isn’t inconceivable. Listening to him interviewed recently on the Today programme, BBC Radio 4’s flagship news and current affairs programme, struck a chord. It made the Badger realise that such concerns are valid, and that a Dark Age is a possibility.

So where does the Badger stand on the Golden or Dark Age question? Well, the last 25 years have made us believe tech-driven change is a good thing, but that premise should be challenged. New technology may drive change, but it doesn’t necessarily drive progress, because it’s politics that really determines whether change makes people better off overall. Politicians, however, have struggled woefully to deal with tech-driven change and the new problems it’s created for society so far this century. There’s little sign this is changing for AI. Humans are fallible and can make poor judgements, but if we become reliant on AI to make choices for us, then there’s a real danger that our confidence and capacity to make our own independent decisions will be lost.

The Badger’s answer is thus nuanced. A Golden Age will unfold in areas where AI is a tool providing a tangible benefit under direct human control, but if AI is allowed to become completely autonomous and more intelligent than humans, then a Dark Age is inevitable. Why? Because things with greater overall intelligence always control things of lower overall intelligence. Can you think of an example where the reverse is true?

Human Space travel to Mars? Just send Intelligent Machines…

‘Space, the final frontier. These are the voyages of the Starship Enterprise. Its five-year mission – to explore strange new worlds, to seek out new life and new civilizations, to boldly go where no man has gone before’.

In September 1966, these iconic words were heard for the first time when Star Trek arrived on television. They, in essence, remind us that the pursuit of knowledge and the exploration of the unknown are central to what it means to be human. They inspire us to dream big, embrace challenges, and continually seek to expand our understanding. The words were narrated before man landed on the moon, before the internet, before smartphones and laptops, and when the computing power available to astronauts was minuscule compared to that of a mid-range smartphone. Things have changed extraordinarily since 1966, but the opening words to Star Trek episodes are just as relevant to what it means to be human today.

Space travel is difficult, as US billionaires will attest (see here and here, for example). Today’s Space-race is different to that of the 1960s with, for example, the likes of India and China part of the fray. Putting humans back on the Moon is a key objective, and the USA’s Artemis programme intends to do just that within the next few years, if things go to plan. Putting human feet on Mars, as reaffirmed by the USA’s President Trump during his inauguration this week, is also an objective. The Badger, however, senses that it’s unlikely to happen for decades yet, if at all.

Why the scepticism? Well, two things. The first is that putting humans on Mars and bringing them back is much more challenging than returning to the Moon. The second is more fundamental. In the ~60 years since Star Trek’s iconic words were first heard, life and our knowledge of Space have been transformed through technological advances, especially in the sphere of capturing, processing, and using information digitally. Advances in digital technology continue apace, with AI and intelligent machines fast becoming a reality. Indeed, Mr Trump has announced huge investment in Stargate, an AI infrastructure initiative. The automation of everything, with machines becoming as intelligent as humans, raises a question, namely ‘Is prolonged human travel in Space really viable and economically sensible?’

The evidence implies that humans are unsuited to prolonged Space travel (e.g. see here and here). So why send humans to Mars when intelligent machines are a better option? Perhaps a rethink of putting humans on Mars will happen as AI and intelligent machines become mainstream, perhaps it won’t. Meantime the Badger wholly subscribes to the pursuit of knowledge and the exploration of the unknown, but he will enjoy Star Trek for what it is, just imaginative entertainment…

‘Free speech’ and Social Media…

Social media started 2025 with a bang! Mr Musk expressed opinions on X about various UK politicians and UK issues, and Mr Zuckerberg announced the end of Meta’s fact-checking programme and changes to its content moderation policy. These two events produced lots of commentary about social media platforms and ‘free speech’ in the traditional media and in political circles. The Badger sighed on reading much of the discourse because ‘free speech’ existed long before social media platforms. There is a wide variety of views about the importance of social media for ‘free speech’, but the Badger’s view is simple. Society as a whole, through its institutions, laws, and cultural norms, is the bastion of ‘free speech’, not Mr Musk, Mr Zuckerberg, or anyone else who owns a social media platform which, let’s not forget, is a business striving to maximise profit from its users.

Musing on some of the media discourse over a coffee on returning from a walk through a snowy park, the Badger’s thoughts converged on three points. The first was that social media is here to stay and cannot be ignored. With ‘free speech’, however, comes responsibility, and this seems to be in relatively short supply in the social media domain. The second was that social media platforms are businesses, and those that own or run them have a vested interest, an inevitable focus on making money, and an aversion to regulation. Messrs Musk, Zuckerberg, and indeed other leaders of massive corporations, will always have ‘an agenda’, and what they say and how they act will always be determined by that agenda and their vested interest. The relationship between social media and ‘free speech’ must be considered with this in mind.

The third point was more holistic. It embraced more of our current world’s dynamics. Technology, ‘free speech’, and social media may be components of world dynamics, but the recent discourse illustrates something about the wielding of power in today’s world. That something is captured by John Lennon’s words, uttered more than half a century ago. He said ‘Our society is run by insane people for insane objectives. I think we’re being run by maniacs for maniacal ends, and I think I’m liable to be put away as insane for expressing that. That’s what’s insane about it’. He has a point. ‘Free speech’ existed when he said those words, and social media didn’t.

As the Badger finished his coffee, he decided that the key take-away from his musing was simply this: don’t let tech and social media dominate your life. After all, life would go on if social media didn’t exist. Outsourcing one’s life to social media and being a slave to its content is a risky thing to do, but if you do, then keep John Lennon’s words in mind and don’t be naïve about the veracity of the content you consume…

2025 – A year of ‘Strain and Change’…

The festive season is over, and most people are once again embroiled in the routine of normal life. Many start the year mentally refreshed, physically rested, and game for the next challenge, but some do not. And there’s the rub, to use an idiom from Shakespeare, because those starting the year unprepared for a challenge will surely find this year difficult. Why’s that, especially when every year presents challenges that must be dealt with? Well, the omens for 2025 suggest it’s going to be a particularly testing one across a broad range of fronts. As a relative put it over the holiday period, the world order’s changing fast, there’s disgruntlement with political leaders, AI and disruptive advances in digital tech driven by huge corporations continue unabated, retrenchment from the globalisation that’s been the norm for years is underway, and so ‘Strain and Change’ will be everywhere in 2025. Those stepping back into life’s rhythms expecting the status quo and unprepared for challenges are thus likely in for a rude awakening.

With this in mind, the Badger found himself chuckling as he read what the BBC’s Tomorrow’s World TV programme predicted in 1995 for 2025. When Professor Stephen Hawking told that programme that ‘Some of these changes are very exciting, and some are alarming. The one thing we can be sure of is that it will be very different, and probably not what we expect’, little did he (or the Badger) know that the Badger’s last post for 2024 would echo the same sentiment! The Badger started wondering what advice Professor Hawking, who produced many pearls of wisdom, might have given us at the start of a year of ‘Strain and Change’. After a little research, the Badger decided he would simply concatenate two of Hawking’s memorable pearls of wisdom to say:

‘It is very important for young people to keep their sense of wonder and keep asking why. It’s a crazy world out there. Be curious. However difficult life may seem, there is always something you can do and succeed at’.

This seems apt in many ways, but especially for today’s always-on, social-media-dominated, digital world, where Hawking’s sentiment can be expressed as ‘Don’t take anything you read, watch, or hear at face value. Be curious, ask questions, and always believe that you can take action to better your situation’. The Badger thinks that ‘Strain and Change’ is the drumbeat of 2025 technologically, nationally, geopolitically, commercially, and economically. Accordingly, whatever challenges lie ahead, they must be faced with the mindset embodied in Professor Hawking’s concatenated words above. As for the Badger? Well, he’s motivated, refreshed, and well prepared. The only status quo he’s anticipating in 2025 is the continuation of timeless, good, vintage music, of which Living on an Island is a good example…

Looking forward…

Do you know exactly what you were doing at a specific time on Christmas Eve 45 years ago? Regardless of your age, it’s unlikely that you do! But the Badger does. He and his brothers were helping their father complete deliveries so that he could get home at a reasonable time on Christmas Eve. They’d started at 3am, and on completing the last delivery in the middle of the afternoon they were exhausted! The Badger remembers the time, the location, the weather, and what they were wearing for this last delivery because one of his brothers had a 35mm camera with him and asked a passerby to take a photo of them in front of the delivery vehicle. That photo is date and time-stamped and is cherished by the Badger and his brothers.

What’s this got to do with ‘looking forward’? Well, the Badger’s father, who’s no longer with us, never made predictions about the year ahead. Being orphaned while an evacuee from London during the Second World War meant he dealt with life one day at a time. The unpredictability of the future world and his personal circumstances made not looking beyond tomorrow routine. He joined the Army as soon as he was old enough to, as he put it, ‘get an education, some semblance of structure and family, and to establish good life skills and standards’. He thrived, served in Germany and the Middle East, and only left the service to marry.

While growing up, the Badger and his brothers frequently heard mantras rooted in their father’s Army days and childhood experiences. Advice like ‘there’s no such thing as can’t, try’, ‘if you’re knocked back, pull yourself together and start again’, and ‘learn from your mistakes but don’t dwell on the past, look forward to the future’ was commonplace. His favourites were ‘remember, if it looks wrong, feels wrong, smells wrong, or sounds wrong, then it’s wrong’, ‘if something untoward happens don’t ignore it, deal with it’, and ‘look forward, because you can’t change the past’. He often said he never made predictions about the future because he’d learned that the future never turned out the way anyone expected. The Badger has thus resisted the temptation to express any opinion about what 2025 will hold. While ‘look forward, because you can’t change the past’ continues to be a key ethos, his father was right – the future will almost certainly turn out to be different to what’s anticipated!

Thank you for reading the Badger’s Blog during 2024, and best wishes for Christmas and the New Year. The Badger and his brothers will be toasting those of the pre-internet/tech generation who are no longer with us, because they provided a drumbeat of sound advice and wisdom that’s become much diluted in today’s world. Merry Christmas!

Banning social media for the under-16s…

Richard Holway, a well-known, respected, and influential analyst in the UK software and IT services markets, penned an item last week for TechMarketView entitled ‘What have we done?’. The item relates to the harm that social media and smartphones are doing to children. As a grandparent with a background in software and IT services, and with a grandchild who’s just started school, the Badger found it struck a chord and reinforced his own opinion that they have indeed caused great harm to children under 16. Holding this view doesn’t make the Badger, or anyone else with the same opinion come to that, an anti-tech dinosaur, just a human being who is pro technology that has safety, security, privacy, and human well-being as its paramount priorities. When it comes to ensuring the best for children in their formative years, it seems to be mainly the unprincipled and unscrupulous who argue against making these the dominant priorities.

History is littered with ‘products’ of one kind or another that were widely popular but were ultimately recognised over time as being a danger to human well-being. Plastics, DDT, cigarettes, fossil fuels, asbestos, and lead-based paint illustrate the point. Did you know that a century ago cigarettes were advertised as being beneficial for asthma and anxiety? Also, the incredibly popular patent medicines of the 19th and early 20th centuries had no restrictions on what they contained. Many contained cocaine, morphine, and heroin. A very popular cough mixture for children did, indeed, include heroin! Things, of course, changed once society eventually realised the scale of addiction and early deaths that resulted. It has long seemed to the Badger that aspects of our rampant tech-dominated world, especially with regard to social media, are following the same historical template, especially when it comes to use by children.

In little more than two decades, social media has evolved from being a novel way of staying connected to family and friends into a powerful global force that shapes many dimensions of daily life. Evidence that social media has harmful effects on children is growing all the time. Science suggests that social media triggers the release of dopamine in the brain, much like addictive drugs such as heroin, and even alcohol. No wonder it’s easy to get hooked!

Like Mr Holway, the Badger fully supports banning smartphones and social media apps for children under the age of 16. As you can see here, the legal age in the UK is 18 to buy alcohol, tobacco products, knives, and certain types of DVDs and games. The legal age is 16 to buy pets, petrol, and matches, and to be in full-time employment. Why, therefore, shouldn’t smartphones and social media apps be banned for children under the age of 16? As Mr Spock from Star Trek would say, ‘Isn’t it illogical, Jim, to do otherwise?’