A week without access to the online world…

Are you brave enough to survive for a week without accessing the online world from your personal smartphone, tablet, laptop, or desktop? This was the question the Badger’s wife asked shortly before the Badger and his millennial son departed for a short adventure on the North Devon coast last week. We answered affirmatively but decided to take our smartphones, which would remain switched off all week, in case they were needed in an emergency. We all saw this as common sense given our intent to walk the rugged North Devon coastal path which, at the time, was covered by a yellow weather warning for high wind and rain. So, with a little trepidation about relinquishing access to the virtual world, we departed for North Devon with no laptops or tablets and only switched-off smartphones in our pockets, wondering how long it would take before we succumbed to turning on our phones. Did we survive the week without succumbing to temptation? Of course we did.

The first evening at our destination was unsurprisingly difficult given that everyone today has become conditioned to having instant access to communication, banking, shopping, social media, and the internet through personal devices. People in the UK, for example, apparently check their smartphones every ten minutes, so imagine how you’d feel if this wasn’t possible. It took an iron will, some beers, and some proper conversation about the world that evening to keep our discipline and not succumb to switching on our smartphones.

The subsequent days were easier. Walking the coastal path in blustery, variable weather concentrated the mind on real, rather than virtual, world matters. The dormant smartphones in our pockets provided reassurance as we walked, but they stayed unused because no emergencies arose. In fact, we never turned them on all week. On the final night of our stay, we visited a bar and reflected on our week of virtual-world disconnection while watching a magnificent sunset over a choppy sea. We realised that our ‘fear of missing out’ from having no access to the virtual world had disappeared within 48 hours of arriving in Devon. We were proud to have resisted the temptation to use our smartphones, and we felt that detachment from the online world, and its pushed content, had contributed to how refreshed we felt mentally and physically.

We drove home the next morning and then ceremonially turned on our smartphones. As expected, we had missed nothing of substance during our week of detachment from the virtual world. This prompted the Badger’s son to observe that although the online world has its place in modern life, real life will always go on without it. That’s a truth. The question is, are you brave and disciplined enough to survive without access to the online world from your personal devices the next time you take a short break? If not, why not?

Work-life balance…

Work can be all-consuming. Organisations emphasise values like ‘employee well-being’ and a ‘people-first culture’, but most really operate with deliverables and timelines as their overwhelming priority. HR departments may advocate for ‘work-life balance’, but business, project, programme, and service delivery leaders routinely push staff for huge effort and heroics to meet a deadline or milestone. In the IT sector, for example, do organisations ever willingly miss a deadline or milestone because of ‘employee well-being’ or their ‘people-first culture’? No.

The Badger’s just had some downtime in Mortehoe on the UK’s North Devon coast. The apartment in which he stayed had wonderful coastal views, and it was while nibbling a scone on its balcony in the afternoon sun that thoughts turned to work-life balance. Life on the North Devon coast still provides access to all of today’s online services, but the sounds, the sea, the geology, the flora and fauna, and the local lifestyle force relaxation and put work-life balance into perspective. What did the Badger conclude about work-life balance? Simply that it matters. It isn’t just a trendy phrase. It’s a necessity for sustaining energy, protecting mental and physical health, and keeping one’s mind sharp. It matters because burnout reduces productivity and clouds judgement. Downtime helps the brain reset, improving creativity, motivation, and decision-making. It also matters because quality time away from work helps to build a broader perspective on life as a whole.

The Badger concluded years ago that there are three certainties regarding people. The first two are that a) people are not machines, and b) they are all different. Some thrive on intense work periods followed by breaks of deep rest, while others thrive on a daily structure of predictable routines, boundaries, and pressures interspersed with regular, shallower rest periods. We are all different, and so the key to a good work-life balance is simply to adopt a personal rhythm that fuels and refreshes rather than drains you. Finding the rhythm that works for you within the terms of your employment contract is important. There’s a paradox, however. Employment contracts normally include a holiday entitlement to rest and recharge, and yet many people don’t take their full entitlement. The reasons for this are numerous, but sometimes it’s because a) the work culture rewards hustle more than rest, and b) an individual misguidedly thinks that everything will collapse if they take a break. So, what’s the Badger’s third certainty about people? Simple. No one is irreplaceable.

If you accept these certainties about people and find your rhythm for work-life balance, then you will be healthier, sharper, more productive, and more resilient, and the organisation you work for will perform better too. So, use your holiday entitlement. As the Badger was reminded while nibbling scones in the North Devon sunshine, a break is good for you…

Youngsters outsourcing their mental effort to technology…

Live Aid happened on Saturday 13th July 1985. If you were a young adult then, do you remember what you were doing when the concert happened? Were you there? Did you watch it live on television? The Badger had his hands full that day doing some home renovations while having a one-year-old baby in the house. He thus only saw snippets of the televised live concert. Last weekend, however, he made up for it by watching the highlights broadcast to celebrate the concert’s 40th anniversary.

Watching the highlights brought home why the music at the concert has stood the test of time. It was delivered by talented people with great skill and showmanship, without today’s cosseting production techniques and tech wizardry. What struck a chord most, however, was the enthusiasm of the Wembley Stadium crowd, the vast majority of whom are now grandparents in, or facing, retirement! People in that crowd had none of the internet access, smartphones, or online services we take for granted today. In 1985 the UK’s first cellular telephone services were only just being introduced by Cellnet and Vodafone, and ‘home computing’ meant the likes of the Sinclair ZX Spectrum and the BBC Micro. A far cry from today! Furthermore, those in that crowd represent a generation that thought for themselves and didn’t have their minds dulled by reliance on digital technology and internet-based online services. Their grandchildren, on the other hand, only know life based around the internet, and they often seem oblivious to the likelihood that their reliance on online things like social media might be dulling their minds, nudging them towards a passivity of thought, and perhaps ultimately causing their brains to atrophy.

Concern about technology dulling human minds isn’t new. In around 370 BC, for example, Socrates worried that writing would erode a person’s memory! With AI endlessly expanding, however, the potential for today’s youngsters to completely outsource mental effort to technology seems very real. A growing body of scientific evidence shows that while the human brain is highly adaptable, digital immersion changes attentiveness, the way we process information, and decision-making. Some brain functions weaken due to digital immersion and others evolve, but the Badger thinks that when our digital world provides instant answers, the joy and effort of discovery through independent thought dwindles. Always-available digital content at our fingertips means fragmented attention spans, with contemplation and reflection taking a back seat, especially for youngsters who have no experience of life without today’s online world.

Watching the 40th anniversary highlights thus did more than provide a reminder of the great music of that day. It brought home the fact that today’s grandparents have something precious – a lived experience of independent thought and contemplation without an overreliance on our digital world. It feels, however, that their grandchildren are progressively outsourcing their mental effort to ever more advanced digital technology which, this grandfather senses, doesn’t augur well for the human race…

Have Millennials benefited from being the first real ‘digital native’ generation?

Millennials are the first ‘digital native’ generation. They’ve grown up with the internet, mobile telephony, and instant information at their fingertips. The digital world of the 1980s and 1990s, when Millennials were born, laid the foundation for today’s advanced capabilities. As Millennials have moved from childhood to adulthood and parenthood, and from education to employment and family responsibilities, they’ve embraced the relentless wave of digital advances and made them part of their life’s ecosystem. A quick recap of the tech in the 1980s and 1990s illustrates the scale of the digital revolution they have embraced.

The 1980s saw the rise of PCs like the IBM PC, Apple II, and Commodore 64, all with limited processing power by today’s standards. MS-DOS was the popular operating system, hard drives held a few tens of megabytes, and 1.44MB floppy discs were the common removable storage medium. Software had largely text-based user interfaces, and WordPerfect and Lotus 1-2-3 dominated word processing and spreadsheets, respectively. Local Area Networks (LANs) started appearing to connect computers within an organisation, and modems provided dial-up access at rates of up to 2400 bits/second.

The 1990s saw PCs become more powerful. Windows became a popular operating system, making software more user-friendly and feature-rich, and Microsoft Office gained traction. CD-ROMs arrived, providing 700MB of storage to replace floppy discs; hard drive capacities expanded to several gigabytes; and gaming and multimedia capabilities revolutionised entertainment. Ethernet became standard, computer networks expanded, the World Wide Web, email, and search engines gained traction, and mobile phones and Personal Digital Assistants (PDAs) like the PalmPilot emerged.

Today everything’s exponentially more functionally rich, powerful, and globally connected, with lightning-fast fibre-optic and 5G internet connectivity. Cloud computing provides scalable convenience, computing devices are smaller, data storage comes with capacities in the terabyte and petabyte range, and social media, global video conferencing, high-definition multimedia, and gaming are standard. Smartphones are universal, fit in your pocket, combine the functions of many devices into one, and have processing power that far exceeds that of computers that filled entire rooms in the 1980s and 90s.
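To make the scale of that change concrete, here’s a back-of-the-envelope calculation, sketched in Python. The 2400 bits/second modem rate and the 1.44MB floppy capacity are the figures quoted above; the 1 Gbit/s fibre rate is an illustrative assumption rather than a figure from this recap.

```python
# Rough data-transfer arithmetic across the eras described above.
# The 2400 bit/s modem and 1.44MB floppy figures come from the recap;
# the 1 Gbit/s fibre rate is an illustrative assumption.

FLOPPY_BITS = 1.44 * 1024 * 1024 * 8   # one 1.44MB floppy disc, in bits

links = {
    "1980s dial-up modem (2,400 bit/s)": 2_400,
    "Illustrative modern fibre (1 Gbit/s)": 1_000_000_000,  # assumed rate
}

for name, bits_per_second in links.items():
    seconds = FLOPPY_BITS / bits_per_second
    print(f"{name}: one floppy's worth of data in ~{seconds:,.2f} seconds")
```

At 2400 bits/second, shifting a single floppy’s worth of data took roughly 84 minutes; over an assumed gigabit fibre link, it takes about a hundredth of a second.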

But has the Millennial generation benefited from being the first real ‘digital native’ generation? Yes, and no. This generation has faced significant hurdles affecting their earning potential, wealth accumulation, and career opportunities. Student loan debt, rising housing costs, rising taxes, the 2008 global financial crisis and its aftermath, the COVID-19 pandemic, soaring energy costs, and now perhaps tariff wars are just some of these hurdles. When chatting recently to a Millennial group asserting that their generation’s woes were caused by technology, the Badger pointed out first that these hurdles were not the fault of technology, but of ‘people in powerful positions’, and second that they should note Forrest Gump’s line ‘Life is like a box of chocolates; you never know what you’re gonna get’. Whatever you get, you have to deal with…and that’s the same for every generation.

Banning social media for the under-16s…

Richard Holway, a well-known, respected, and influential analyst in the UK software and IT services markets, penned an item last week for TechMarketView entitled ‘What have we done?’. The item relates to the harm that social media and smartphones are doing to children. As a grandparent with a background in software and IT services, and with a grandchild who’s just started school, the Badger found that it struck a chord and reinforced his own opinion that they have indeed caused great harm to children under 16. Holding this view doesn’t make the Badger, or anyone else with the same opinion come to that, an anti-tech dinosaur, just a human being who is pro technology that has safety, security, privacy, and human well-being as its paramount priorities. When it comes to ensuring the best for children in their formative years, it seems to be mainly the unprincipled and unscrupulous who argue against having these as the dominant priorities.

History is littered with ‘products’ of one kind or another that were widely popular but were ultimately recognised over time as being a danger to human well-being. Plastics, DDT, cigarettes, fossil fuels, asbestos, and lead-based paint illustrate the point. Did you know that a century ago cigarettes were advertised as being beneficial for asthma and anxiety? Also, the incredibly popular patent medicines of the 19th and early 20th centuries had no restrictions on what they contained. Many contained cocaine, morphine, and heroin. A very popular cough mixture for children did, indeed, include heroin! Things, of course, changed once society eventually realised the scale of addiction and early deaths that occurred. It has long seemed to the Badger that aspects of our rampant tech-dominated world, particularly social media, are following this same historical template, especially when it comes to use by children.

In little more than two decades, social media has evolved from a novel way of staying connected with family and friends into a powerful global force that shapes many dimensions of daily life. Evidence that social media has harmful effects on children is growing all the time. Research suggests that social media use triggers the release of large amounts of dopamine in the human brain, much like addictive drugs such as heroin, and even alcohol. No wonder it’s easy to get hooked!

Like Mr Holway, the Badger fully supports a ban on smartphones and social media apps for children under the age of 16. In the UK, the legal age is 18 to buy alcohol, tobacco products, knives, and certain types of DVDs and games. The legal age is 16 to buy pets, petrol, and matches, and to be in full-time employment. Why, therefore, shouldn’t smartphones and social media apps be banned for children under the age of 16? As Mr Spock from Star Trek would say, ‘Isn’t it illogical, Jim, to do otherwise?’

Social media – in the doghouse again…

Social media platforms are in the doghouse again due to the spread of misinformation, falsehoods, incitement, and hate in the aftermath of the horrendous attack on innocent children in Southport. Media and political rhetoric about the role of social media in the violence and criminality that followed this incident has been predictable. It can be no surprise that social media was a factor, because it’s part of the very fabric of modern life. It’s used by 82.8% of the UK population. Most individuals, businesses, and media, community, and political organisations have a presence on, and actively use, at least one social media platform. Most normal, law-abiding social media users and organisations will thus have been exposed at some stage to the vitriol, falsehoods, and distorted content that is becoming more and more commonplace on these platforms.

Elon Musk’s war of words with the UK’s Prime Minister, a government minister’s thoughts on X, and a debate about whether we should say goodbye to Mr Musk’s platform simply illustrate, the Badger feels, that social media has become more divisive and polarising than a force for convergence and solutions. It has disrupted society in just a couple of decades, and it will continue to do so because the platforms are commercial enterprises whose business models and legal status are centred on profiting, without editorial responsibility, from the content their users post. The platforms have become too powerful, and politicians have been like plodding donkeys in dealing with their impact on society.

Social media isn’t all bad, and it isn’t going away anytime soon. Handwringing about its role in free speech, something the platforms assert as a defence against regulation, is futile. What’s needed is a lucid articulation of free speech like that given by Rowan Atkinson (Mr Bean) some years ago, followed by aligned, rapid regulation that a) society’s law-abiding majority can relate to and understand, and b) holds the platforms and their users fairly to account. At the very least, users of a platform must take responsibility for the content they post, and platforms cannot shirk accountability for distributing and making money from content that damages society. Perhaps things will change with the UK’s Online Safety Act now coming into effect? Time, as they say, will tell.

The Badger’s agnostic about social media. He’s never felt that it’s really a good use of his time, but the chances of everyone significantly reducing their addiction to it in today’s world are negligible. But what if they did? The power of the platforms would dissipate as their revenues and profits declined, and people would realise they can actually cope and adapt quickly to life without them. Perhaps the riots that followed the Southport attack would not have happened? Perhaps it’s time to fight against being addicted slaves? Oops, just remember this is a musing, not an incitement to riot…

Is social media the new tobacco?

The UK’s in the throes of a General Election and, whether we like it or not, social media is an important part of campaigning for politicians, political parties, and any person or organisation wanting to influence the outcome. Social media is the modern billboard. The Badger’s always been cautious about social media, and he engages with it in moderation. Why? Because his IT career spanned the time from its origin through to its evolution into the global, revenue- and profit-driven goliaths we have today. He’s learned that it’s a minefield for the unwary, and perilous for those vulnerable to the tsunami of memes, misinformation, disinformation, sales and marketing spin, scams, and bile that it regularly delivers. Social media is, of course, here to stay. The Badger, however, overcame any fear of missing out (FOMO) regarding its content many years ago. He thus ignores any content that is election-related.

Aside from the UK election, something relevant to social media caught the Badger’s attention this week. It was the US Surgeon General’s call for tobacco-style warnings on the hazards of using social media. This struck a chord, because the Badger has quietly thought for some time that social media is the new tobacco! The Badger hasn’t lost his mind because, as they say, ‘there’s method in the madness’.

Tobacco’s been with us for centuries. Cigarettes evolved in the 1830s, and smoking was the norm for adults across UK society by the 1920s, driven largely by cigarettes being included in First World War military rations and heavy advertising by tobacco companies. Smoking continued to grow, with the highest level for men recorded as 82% in 1948. The tobacco companies, of course, grew fast and became extremely rich and powerful. The health issues associated with tobacco were known long before the 1950s, when the evidence of the impact of smoking on public health became incontrovertible. Since then, steps have been taken to eliminate smoking. The tobacco companies have fought to protect their revenues, and tobacco-related legislation only really started changing significantly in the early 2000s.

Doesn’t this progression (a product, mass marketing, widespread public adoption as a norm, the growth of wealthy and powerful companies protecting their product at all costs, eventual public realisation of the product’s damage to society and individual health, followed by long-overdue corrective action) resonate with what’s happening with social media? The Badger thinks it does. For tobacco, the progression has taken a century or more, but for social media it’s happening over just a few decades. The Badger senses that the Surgeon General’s call for tobacco-style warnings has its place, but more needs to be done, faster, or society and individual health will be in an even bigger pickle by the end of this decade. Just a thought…

Work-life balance and an unexpected call from the CEO…

Summer beckons, and many will be looking forward to a break from work to enjoy a holiday. Modern technology, however, means that it takes an iron will not to occasionally check work email when relaxing on the beach or quaffing beer in a bar in the evening. Completely detaching from work while on holiday is really important because it benefits your mental and physical well-being, and it makes you more focused, creative, and productive on returning to work. A refreshed mind, for example, generates better ideas, is more objective, and is more creative when problem-solving.

The Badger normally took a two-week summer vacation throughout his career. One year, however, after leading a major fixed-price IT system delivery to completion, his employer approved a three-week break to enable his batteries to fully recharge! The project had been challenging for the whole team from the outset. Everyone had done a magnificent job and was exhausted. The Badger’s three-week break proved to be seminal. It was the first time that he had truly detached from every aspect of work while on holiday. The break fully revived his mental sharpness, physical energy, and motivation, and it produced a much greater awareness that work-life balance is important no matter what role you fulfil at work.

The Badger returned to work afterwards refreshed, focused, and determined to establish a better work-life balance. On his first day back, while liberally applying the delete key to his email backlog, the company Chief Executive called unexpectedly. Caught off-guard, the Badger felt an initial pang of surprise and anxiety that quickly dissipated. The CEO wanted the Badger, a delivery practitioner, to join the company’s overall leadership team to oversee all projects across the company. The CEO sensed the Badger’s hesitation and made three points. First, that it was a good career move and also what the company needed. Second, that the role would broaden the Badger’s leadership skills and his perspective of how the company operated, and that it would sharpen the overall leadership team and improve decision-making with company-wide impact. Third, that delivery actually produced the company’s profits, and so home-grown delivery leadership talent was preferable for the role to recruiting externally.

The Badger mentioned his greater appreciation of work-life balance. The CEO chuckled and noted that while every person is different, the reality was that intelligent, focused individuals who want job satisfaction and success find a balance that enables them to achieve these objectives. The Badger took on the role, never looked back, and learned over the years that the CEO was right. Successful careers are built primarily on hard work, getting the job done, and finding the work-life dynamic that suits the individual and their personal circumstances…

Contracted working hours, and achieving your potential…

The UK’s A-Level exam period is underway and runs until the end of June. Students sitting these exams receive their results in the middle of August. It’s an intense time, especially for those who’ve applied for University and need to achieve certain grades to confirm a place on their preferred course. According to UCAS, the proportion of UK 18-year-olds applying for University this year stands at 41.3%. That’s up from 38.2% in 2019, but marginally down on 41.5% for 2023. Since last year, however, applications for engineering/technology courses, and mathematical sciences/computing courses, have increased by 10% and 7%, respectively. The Badger thinks that’s a good thing. These subjects are, after all, at the heart of our lives on this planet. Whether we like it or not, it’s science, engineering, maths, and computing that make everything possible.

While chatting to a teacher recently, the Badger found their passionate focus on their pupils and desire for good exam results strongly evident. In particular, they mentioned that seeing their students attain or exceed expectations in their exams was a source of great personal reward for their teaching over the school year. The teacher had strong opinions, one being that people don’t really appreciate that the hours worked by teachers far outweigh those stipulated in their employment contract. ‘That’s actually no different to people working in commercial enterprises; at least you have a long break over the summer’, the Badger commented without thinking. If looks could kill, the Badger would be dead!

The teacher, who’s never worked in a commercial enterprise, was adamant that no one works as hard, or as far beyond the hours stipulated in their employment contract, as teachers. This rankled with the Badger, because it’s not true! An incoming call to the teacher’s smartphone, however, fortuitously stopped the conversation from taking a potentially disagreeable turn. Health professionals in the NHS often convey a view similar to the teacher’s, but the reality is that many in technical, management, and leadership positions at project, business, and executive levels in commercial operations also work beyond the hours in their employment contract without tangible reward, irrespective of today’s greater work-life balance awareness. The performance of their companies would suffer if they didn’t. In fact, research shows that it’s the setting and profile of how additional hours are worked that differs greatly between teachers, doctors, and their commercial enterprise counterparts, not the actual number of additional hours worked, which does not differ vastly.

Well, good luck to those sitting their exams and striving for a place at University. Whatever the outcome, remember one thing. To be successful and have the job satisfaction and the type of rewards you want in your chosen field, an intelligent, hard-working, flexible, can-do ethos will always be necessary. Working only the hours in an employment contract will rarely help you achieve your full potential…

The biggest challenge of 2024 and beyond…

This year, 2024, will bring many challenges of one kind or another. Every new year, of course, contains challenges, some of which already feature in our awareness, and some of which don’t because they tend to emerge from left field in due course. The online world, the traditional press, and broadcast media provide plenty of opinion on forthcoming challenges at this time of year, but they tend to highlight things that are already the larger blips on our awareness radar. To start the year off, therefore, the Badger set himself a personal challenge, namely, to decide on the world’s biggest challenge for 2024 and beyond, one that deserves to be a much bigger blip on everyone’s radar.

The following reality provided the backdrop for the Badger’s deliberations:

  • Life today is dominated by digital technology, global connectivity, the internet, automation, and an addiction to smartphones whose applications provide immediacy of information, anytime, anyplace, for ~75% of the world’s population.
  • Digital evolution continues apace, AI is advancing rapidly and cannot be ignored, international conflict is on the rise, politics is increasingly polarised, and the world order is under considerable strain.
  • Unforeseen natural, humanitarian, financial, and economic crises are an inevitability.

A front runner for the biggest challenge of 2024 and beyond emerged quickly in the Badger’s thoughts, largely because it had already been bubbling in his mind for months. He soon concluded that this front runner was indeed the world’s biggest challenge. So, what is it? Put simply, it’s to stem the rise of distrust.

Trust is a fundamental component of cooperation, relationships of all kinds, business and service interactions, and relations between social groups and different cultures. Society is on a slippery slope to failure without it. Unfortunately, research over the last decade or so shows that our levels of distrust have been progressively rising. Distrust in politicians, governments, corporates, and their leaders continues to rise. Similarly, distrust of the internet and social media continues to grow as we all become more aware of data breaches, fake and weaponised news, misinformation, disinformation, online safety, security and privacy issues, swindles, and cyber-crime. AI seems unlikely to change the trend. The Badger thus feels that stemming the rise of distrust deserves to be called the world’s greatest challenge if we want a better society for our children and grandchildren.

Addressing this challenge is not easy, but change starts when lots of people make small adjustments to their behaviour. This year the Badger has resolved to stem his rising distrust of ‘pushed’ online content that has become the norm in our 24×7 online world. He’s breaking the mould, taking back control, and engaging with it differently and more selectively in 2024. New Year resolutions, of course, have a habit of falling by the wayside. It’s early days, but so far so good…