A smartwatch for wellbeing and health?

Last week the Badger attended his uncle’s 90th birthday. He sat with a group of mostly millennial adults and found himself watching how often they checked their smartphone or smartwatch, and sometimes both. Before the Badger’s uncle blew out the candles on his birthday cake, conversation in the group was convivial and centred on catching up since the last time everyone was together. A smartwatch noisily tinkled and buzzed, and the person sitting opposite the Badger got up and announced to everyone that their watch had told them they’d been sitting for too long! They walked away and returned a few minutes later. When they took their seat, they began talking in a way that sounded like a commercial for smartwatches equipped with health and wellness tracking apps.

A discussion ensued. People in the group were asked if they had smartwatches and found their health apps useful. Most younger adults nodded. A few admitted to being addicted to the well-being and health metrics their smartwatches provided. A couple said they had a smartwatch but rarely used the health and well-being functions, and the remainder, including the Badger, did not have a smartwatch. The Badger was asked why he didn’t have a smartwatch given his IT/tech background, especially when, as the questioner put it, the health apps ‘would be beneficial at your age.’ In reply, the Badger made two curt points. The first was that his solar-powered but otherwise conventional watch and the smartphone in his pocket met all his needs to function while out and about in today’s world. The second was that smartwatches are not approved medical devices, and so their health metrics fundamentally provide the same health guidance that doctors have given for decades – walk more, don’t drink too much alcohol, and maintain a healthy weight. You don’t need an expensive device and constant checking of metrics to comply with that advice. The cutting of the birthday cake stopped further discussion.

While the well-being and health functions on smartwatches do, of course, encourage good health and lifestyle habits for those individuals who need such prompts, many who glance at their smartwatch dozens of times a day to check their metrics are doing so unnecessarily. Does this habitual attention to the likes of step count, heart rate, sleep quality, and sitting too long simply illustrate that people are becoming needlessly addicted to another digital device? Possibly. Smartwatch firms are profit-motivated businesses, not health services, and concern about profiling, advertising, and losing control of sensitive personal data would be prudent. Remember, it’s cheaper and better for privacy to simply do what doctors have ordered for decades, namely walk more, drink less alcohol, and maintain a healthy weight. Concentrate on living life rather than being a slave to the metrics provided by your smartwatch. After all, the Badger’s sprightly uncle has reached 90 years of age by doing just that…

The Future: microchipped, monitored and tracked?

The Badger sank onto the sofa after his infant grandson’s parents collected the little whirlwind following a weekend sleepover. The Badger had been reminded that Generation Alpha are the most digitally immersed cohort yet. Born into a world full of tech, they are digital natives from an early age, as was evident during the weekend’s activities. Struck by the youngster’s digital awareness, and especially his independence, curiosity, and eagerness to grasp not just what things are, but also why and how they work, the Badger found himself wondering about the digital world that his grandson might encounter in the future.

From his IT experience, the Badger knows that change is continuous and disruptive for IT professionals, organisations, and the public alike. Change in the digital landscape over the last 40 years has been phenomenal. All of the following have caused upheavals on the journey to the digital world we have today: the move from mainframes to client-server and computer networks, relational databases, the PC, spreadsheets and word processing packages, mobile networks and satellite communications, mobile computing, image processing, the internet, online businesses, social media, the cloud, microchip miniaturisation, and advances in software engineering. These have changed the way organisations function, how the general public engages with them, and how people interact with family, friends, and others globally. AI is essentially another transformative upheaval, and one that will impact Generation Alpha and future generations the most.

Data, especially personal data, is the ‘oil’ of today’s and tomorrow’s digital world, and the entities that hold and control it will use it to progress their own objectives. With AI and the automation of everything, the thirst for our data is unlikely to be quenched, which should make us worry about the digital world for Generation Alpha and beyond. Why? Because humans being in the hands of tech, rather than the other way around, increasingly seems to be the direction of travel for our world. The UK government’s announcement of a digital ID ‘to help tackle illegal migration, make accessing government services easier, and enable wider efficiencies’ has made the Badger a little uneasy about the digital world his grandson will experience. A backlash, exemplified by this petition to Parliament, illustrates the scale of worry that it’s a step towards mass surveillance and state control. Governments, after all, do not have good track records in delivering what they say they will.

As the Badger started to doze on the sofa, he envisaged a future where humans are microchipped and have their lives monitored and tracked in real time from birth to death, as happens with farm animals. He resolved to make sure his grandson learns about protecting his personal data and that he values a life with personal freedom rather than control by digital facilities. The Badger then succumbed to sleep, worn out from activities with a member of Generation Alpha…  

A week without access to the online world…

Are you brave enough to survive for a week without accessing the online world using your personal smartphone, tablet, laptop, or desktop? This was the question asked by the Badger’s wife shortly before the Badger and his millennial son departed for a short adventure on the North Devon coast last week. We answered affirmatively but decided to take our smartphones, which would remain switched off all week, in case they were needed in an emergency. We all saw this as common sense given our intent to walk the rugged North Devon coastal path which, at the time, was covered by a yellow weather warning for high wind and rain. With a little trepidation about relinquishing personal access to the virtual world by taking no laptops or tablets and only having switched-off smartphones in our pockets, we departed for North Devon wondering how long it would take before we succumbed to turning on our phones. Did we survive the week without succumbing to temptation? Of course we did.

The first evening at our destination was unsurprisingly difficult given that everyone today has become conditioned to having instant access to communication, banking, shopping, social media, and the internet through personal devices. People in the UK, for example, apparently check their smartphones every ten minutes, so imagine how you’d feel if this wasn’t possible. It took an iron will, some beers, and some proper conversation about the world that evening to keep our discipline and not succumb to switching on our smartphones.

The subsequent days were easier. Walking the coastal path in blustery, variable weather concentrated the mind on real, rather than virtual, world matters. The dormant smartphones in our pockets provided reassurance as we walked, but they stayed unused because no emergencies arose. In fact, we never turned them on all week. On the final night of our stay, we visited a bar and reflected on our week of virtual-world disconnection while watching a magnificent sunset over a choppy sea. We realised that our ‘fear of missing out’ from having no access to the virtual world had disappeared within 48 hours of arriving in Devon. We were proud to have resisted the temptation to use our smartphones, and we felt that detachment from the online world, and its pushed content, had contributed to how refreshed we felt mentally and physically.

We drove home the next morning and then ceremonially turned on our smartphones. We had, as expected, missed nothing of substance by our detachment from the virtual world for a week. This prompted the Badger’s son to state that although the online world has its place in modern life, real life will always go on if it’s not there. That’s a truth. The question is, are you brave and disciplined enough to survive without personal-device access to the online world the next time you take a short break? If not, why not?

Nuclear reactors on the moon – a geopolitical investment in future dominance of Space…

The building of software and systems for Space missions, and to control satellites and process associated data, was a fascinating area throughout the Badger’s IT career. Today it’s easy to forget that the imagery we take for granted with the weather forecast is produced by systems and software created by developers with excellent science, engineering, and computing credentials, most of whom have little interest in working outside the Space sector. The Badger observed, over the years, that developers in this area often preferred to leave for another Space sector company rather than be assigned to a project outside the sector if there was a lull in available projects.

The Badger thus had two initial thoughts when the US announced an acceleration of its plans to put a nuclear reactor on the moon. The first was ‘Great. More opportunities for developers in the Space sector if AI hasn’t taken their jobs’. The second was more philosophical and about the tension between visionary ambitions and pragmatic, grounded responsibility. The US plan, like its Russian and Chinese equivalents, is driven by a mix of strategic, technological, and geopolitical motives. However, is it sensible and in humanity’s interest for the Earth’s most powerful nations to spend huge amounts of money on Space endeavours when there’s a pressing need for that money to be spent resolving problems on our planet? Should there be investment in long-term Space infrastructure that might, a long time from now, redefine humanity’s future? The answers depend, of course, on your perspective on life and our world.

Some see Space endeavours as a driver of innovation and ultimate human survival, whereas others see them as distractions from addressing real problems here on Earth. To the Badger, all the plans for a nuclear reactor on the moon simply illustrate the shift away from an ethos of inquisitive exploration to one of establishing national strategic dominance, making Space a domain of economic leverage, diplomacy, and warfare. Regardless of who does it, putting a reactor on the moon is an outright geopolitical investment in establishing future dominance. The prospect of the geopolitical tensions we see on Earth playing out on the Moon and beyond seems, at least to the Badger, grotesque.

Investments in Space endeavours push technological boundaries, reshape thinking, and stimulate innovation, but the fact that humans are biologically unsuited to the environment beyond our planet is undeniable. So, in an age of automation, robotics, and AI, why spend huge sums sending and supporting humans on the Moon and beyond when robots can do the same job and the savings can be used to address humanity’s issues here on Earth? Is that idealism? Perhaps, but all it would take is leadership on behalf of all of humanity rather than individual nations. And there’s the rub: the likelihood of that ever happening, of course, is…er…zero.

Work-life balance…

Work can be all-consuming. Organisations emphasise values like ‘employee well-being’ and a ‘people-first culture’, but most really operate with deliverables and timelines as their overwhelming priority. HR departments may advocate for ‘work-life balance’, but business, project, programme, and service delivery leaders always push staff for huge effort and heroics to meet a deadline or milestone. In the IT sector, for example, do organisations ever willingly miss a deadline or milestone because of ‘employee well-being’ or their ‘people-first culture’? No.

The Badger’s just had some downtime in Mortehoe on the UK’s North Devon coast. The apartment in which he stayed had wonderful coastal views, and it was while nibbling a scone on its balcony in the afternoon sun that thoughts turned to work-life balance. Life on the North Devon coast still provides access to all of today’s online services, but the sounds, the sea, the geology, the flora and fauna, and the local lifestyle force relaxation and put work-life balance into perspective. What did the Badger conclude about work-life balance? Simply that it matters. It isn’t just a trendy phrase. It’s a necessity for sustaining energy, protecting mental and physical health, and keeping one’s mind sharp. It matters because burnout reduces productivity and clouds judgement. Downtime helps the brain reset, improving creativity, motivation, and decision making. It also matters because quality time away from work helps to build a broader perspective on life as a whole.

The Badger concluded years ago that there are three certainties regarding people. The first two are a) people are not machines, and b) they are all different. Some thrive on intense work periods followed by spells of deep rest, while others thrive with a daily structure of predictable routines, boundaries, and pressures interspersed with regular, shallower rest periods. We are all different, and so the key to a good work-life balance is simply to adopt a personal rhythm that fuels and refreshes rather than drains your capability. Finding the rhythm that works for you within the terms of your employment contract is important. There’s a paradox, however. Employment contracts normally include a holiday entitlement to rest and recharge, and yet many people don’t take all their entitlement. The reasons for this are numerous, but sometimes it’s because a) the work culture rewards hustle more than rest, and b) an individual misguidedly thinks that everything will collapse if they take a break. So, what’s the Badger’s third certainty about people? Simple. No one is irreplaceable.

If you accept these certainties about people and find your rhythm for work-life balance, then you will be healthier, sharper, more productive, and more resilient, and the organisation you work for will perform better too. So, use your holiday entitlement. As the Badger was reminded while nibbling scones in the North Devon sunshine, a break is good for you…

Software – The invisible infrastructure of daily life

We’re living in a world where software is the invisible infrastructure of daily life. There doesn’t appear to be any quantitative measure of what proportion of everything we use actually contains software, but it’s not outrageous to assert that almost everything we use depends on it. Whatever the real proportion is, with the advent of AI and the Internet of Things (IoT) it’s only climbing. Software has blossomed from primitive beginnings to become crucial in every facet of life in less than 75 years. In the 1950s software wasn’t a ‘product’ but something bundled with hardware, used primarily for scientific calculations, military simulations, and data processing. Most of it was written in machine code or early assembly languages tailored to specific hardware, and it was written by the same engineers who built the hardware. A large program was a few thousand lines of code loaded manually via cards or magnetic tape. FORTRAN only emerged in 1956, and the term ‘software engineering’ was only coined in the late 1960s.

How things have changed. Today’s internet, search tools, social media, cars, medical devices, satellites, aircraft, trains, weapons, smartphones and so on depend on many, many millions of lines of code (see here and here, for example). Modern health, financial, energy, government, intelligence, and defence capabilities all rely on huge amounts of software. Indeed, any item in our homes that can sync with a smartphone app contains software. In less than 75 years software has changed life, taken over the world, and become professionalised in the way it’s produced. Writing machine code for specific hardware, where ‘every byte counts’, has evolved into the professional, ever-improving discipline of software engineering, which incorporates design and development processes, methods, standards, tools, and techniques to ensure that production software meets requirements and is scalable, tested, robust, maintainable, secure, and performant. Software engineering, of which coding is today just a subset, continues to evolve, with the likes of Microsoft and Google anticipating that AI will render the hand-crafting of code redundant by the end of this decade.
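To make the contrast concrete, here’s a minimal, purely illustrative sketch (in Python, with invented names and a made-up requirement) of what even a trivial piece of modern production code tends to carry with it: type annotations, documentation of the requirement it meets, input validation, and an automated test – luxuries unimaginable when every byte of hand-written machine code counted.

```python
"""A tiny, self-contained illustration of modern software engineering habits.

Even a trivial requirement ("report how many days a device has gone without
syncing") arrives wrapped in types, validation, documentation, and a test.
The function and test names here are invented purely for illustration.
"""

from datetime import date


def days_since_last_sync(last_sync: date, today: date) -> int:
    """Return the whole number of days since the device last synced.

    Raises ValueError if the last sync is recorded as being in the future,
    because a negative answer would indicate corrupt data, not a real gap.
    """
    if last_sync > today:
        raise ValueError("last_sync cannot be later than today")
    return (today - last_sync).days


def test_days_since_last_sync() -> None:
    """A minimal automated check, runnable with pytest or by hand."""
    assert days_since_last_sync(date(2025, 1, 1), date(2025, 1, 8)) == 7
    assert days_since_last_sync(date(2025, 1, 8), date(2025, 1, 8)) == 0


if __name__ == "__main__":
    test_days_since_last_sync()
    print("all checks passed")
```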

The software woven into our lives in recent decades has brought immense convenience and transformed communication, public services, business, and conflict, as world events illustrate. Today it’s undeniably critical infrastructure, but software, unlike most infrastructure, isn’t something you can tangibly touch! Compared to what existed 75 years ago, the software in use today is a sprawling metropolis riddled with vulnerabilities that bad actors can exploit to cause disruption. Are we safer today than in the 1950s now that everything depends on software? Hmm, that’s debatable, but what the Badger senses is that, even with the advent of AI, software engineering as a discipline, and the career prospects for software engineers focused on solutions, security, quality, robustness, and testing rather than coding, are good for many, many years yet…

VE Day, Gen-Z, resilience and preparedness…

Many have family members who lived through the violence and hardships of World War 2 as civilians or combatants. Their experiences shaped not only their own lives, but also the values they instilled in their children. The Badger’s father, for example, proudly served his country in the military and then worked hard to create a better life for his family once he was demobbed. He was the epitome of that ‘Keep calm and carry on’ and ‘There’s no such word as can’t, try!’ generation, and he brought his children up to embody discipline, standards, hard work, duty, calm objectivity, preparedness, and a sense of right and wrong. These instilled values have served the Badger well over the years. The 80th anniversary of Victory in Europe (VE) Day, a day which saw spontaneous rejoicing and street parties, is being celebrated on Thursday 8th May 2025. It’s an opportunity to reflect on the sacrifices and resilience of the WW2 generation, civilians and combatants alike, who resisted tyranny. It will be poignant for the Badger because his father, sadly no longer with us, was unable to celebrate on VE Day at the time.

Life is very different today, as the Badger explained to a couple of Generation Z digital natives last weekend. Homes in the 1940s were different. The internet, social media, instant communication, music and video streaming, electronic games, smartphones, personal computers, online banking, online shopping, robots, and driverless cars were science fiction, and children played physical games that would make today’s health and safety coterie wince. The Gen-Z natives struggled to relate to how life functioned in the 1940s without digital technology. The Badger then asked them two questions: what would you do if a) the UK experienced an electricity blackout akin to that seen recently on the Iberian peninsula, or b) cyber-attacks took out online and critical infrastructure services for a prolonged period? ‘We’ll get by until someone sorts things out’ was the glib response, although they had no real idea about how they would actually get by! This made the Badger wonder about the resilience of our completely digital-native Gen-Z generation. As individuals, perhaps we’ve all become complacent about the risks associated with our dependence on digital services.

In fact, do you know how you would ‘keep calm and carry on’ if digital services suddenly disappeared for a prolonged period? Do you have any personal emergency measures or pack of essentials to fall back on if something catastrophic happened to the electricity grid? Individuals rarely consider such questions, even though our digital world is highly complex and believing ‘it’ll never happen’ just reflects naivety. Without their tech, will digital-native Gen-Z ever be as resilient, resourceful, and prepared to make sacrifices in really tough times as the generation of the 1940s? If the Badger’s conversation was anything to go by, the jury’s most definitely out…

Have Millennials benefited from being the first real ‘digital native’ generation?

Millennials are the first ‘digital native’ generation. They’ve grown up with the internet, mobile telephony, and instant information at their fingertips. The digital world of the 1980s and 1990s, when Millennials were born, laid the foundation for today’s advanced capabilities. As Millennials have moved from childhood to adulthood and parenthood, and from education to employment and family responsibilities, they’ve embraced the relentless wave of digital advances and made them part of their life’s ecosystem. A quick recap of the tech in the 1980s and 1990s illustrates the scale of the digital revolution they have embraced.

The 1980s saw the rise of PCs like the IBM PC, Apple II, and Commodore 64. All had limited processing power. MS-DOS was the popular operating system, hard drives were a few tens of megabytes, and 1.44MB floppy discs were the common removable storage medium. Software had largely text-based user interfaces, and WordPerfect and Lotus 1-2-3 dominated word processing and spreadsheets, respectively. Local Area Networks (LANs) started appearing to connect computers within an organisation, and modems provided dial-up access to online services at a maximum rate of 2400 bits/second.

The 1990s saw PCs become more powerful. Windows became a popular operating system, making software more user-friendly and feature-rich, and Microsoft Office gained traction. CD-ROMs arrived, providing 700MB of storage to replace floppy discs, hard drive capacities expanded to several gigabytes, and gaming and multimedia capabilities revolutionised entertainment. Ethernet became standard, computer networks expanded, the World Wide Web, email, and search engines gained traction, and mobile phones and Personal Digital Assistants (PDAs) like the Palm Pilot emerged.

Today everything’s exponentially more functionally rich, powerful, and globally connected, with lightning-fast fibre-optic and 5G internet connectivity. Cloud computing provides scalable convenience, computing devices are smaller, data storage comes with capacities in the terabyte and petabyte range, and social media, global video conferencing, high-definition multimedia, and gaming are standard. Smartphones are universal, fit in your pocket, combine the functions of many devices into one, and have processing power that far exceeds that of the computers which filled entire rooms in the 1980s and 90s.

But has the Millennial generation benefited from being the first real ‘digital native’ generation? Yes, and no. This generation has faced significant hurdles affecting their earning potential, wealth accumulation, and career opportunities. Student loan debt, rising housing costs, rising taxes, the 2008 global financial crisis and its aftermath, the COVID-19 pandemic, soaring energy costs, and now perhaps tariff wars are just some of these hurdles. When chatting recently with a group of Millennials who asserted that their generation’s woes were caused by technology, the Badger pointed out first that these hurdles were not the fault of technology, but of ‘people in powerful positions’, and secondly that they should note Forrest Gump’s line ‘Life is like a box of chocolates; you never know what you’re gonna get’. Whatever you get, you have to deal with it…and that’s the same for every generation.

AI – A Golden Age or a new Dark Age?

The Badger’s experimented with Microsoft’s Copilot for a while now, sometimes impressed, but often irritated when the tool ends its answer to a question by asking for the user’s opinion on the underlying topic of the question. For example, the Badger asked Copilot ‘When will self-driving cars be the majority of vehicles in the UK?’ Copilot’s answer was sensible and distilled from quoted sources, but it ended with ‘What are your thoughts on self-driving cars? Do you think they’ll revolutionize transportation?’. The Badger wanted an answer to his question, not a conversation that will capture, store, and use his opinion for the tool’s own purposes. Responding with ‘None of your business’ gets the reply ‘Got it! If you have any other questions or need assistance with something else, feel free to ask. I’m here to help’. That last phrase should be supplemented with ‘and make money!’

Overall, his experimentation has made him wonder if AI is leading to a new Golden Age for humanity, or a new Dark Age. So, what’s the answer? A new Golden Age, or a worrying Dark Age? AI and Machine Intelligence advocates, giant businesses investing huge amounts of money in the technology, and even governments with a ‘fear of missing out’, are quick to say it’s the former. The Nobel Laureate Geoffrey Hinton, the godfather of AI, isn’t so sure. He articulates the risks well, and he’s highlighted that the possibility of AI eventually wiping out humanity isn’t inconceivable. Listening to him interviewed recently on the Today programme, BBC Radio 4’s flagship news and current affairs programme, struck a chord. It made the Badger realise that such concerns are valid, and that a Dark Age is a possibility.

So where does the Badger stand on the Golden or Dark Age question? Well, the last 25 years have made us believe that tech-driven change is a good thing, but that premise should be challenged. New technology may drive change, but it doesn’t necessarily drive progress, because it’s politics that really determines whether change makes people better off overall. Politicians, however, have struggled woefully to deal with tech-driven change and the new problems it’s created for society so far this century. There’s little sign this is changing for AI. Humans are fallible and can make poor judgements, but if we become reliant on AI to make choices for us, then there’s a real danger that our confidence and capacity to make our own independent decisions will be lost.

The Badger’s answer is thus nuanced. A Golden Age will unfold in areas where AI is a tool providing a tangible benefit under direct human control, but if AI is allowed to become completely autonomous and more intelligent than humans, then a Dark Age is inevitable. Why? Because things with greater overall intelligence always control things of lower overall intelligence. Can you think of an example where the reverse is true?

Human Space travel to Mars? Just send Intelligent Machines…

‘Space, the final frontier. These are the voyages of the Starship Enterprise. Its five-year mission – to explore strange new worlds, to seek out new life and new civilizations, to boldly go where no man has gone before’.

In September 1966, these iconic words were heard for the first time when Star Trek arrived on television. They, in essence, remind us that the pursuit of knowledge and the exploration of the unknown are central to what it means to be human. They inspire us to dream big, embrace challenges, and continually seek to expand our understanding. The words were narrated before man landed on the moon, before the internet, before smartphones and laptops, and when the computing power available to astronauts was minuscule compared to that of a mid-range smartphone. Things have changed extraordinarily since 1966, but the opening words to Star Trek episodes are just as relevant to what it means to be human today.

Space travel is difficult, as US billionaires will attest (see here and here, for example). Today’s Space race is different to that of the 1960s with, for example, the likes of India and China part of the fray. Putting humans back on the Moon is a key objective, and the USA’s Artemis programme intends to do just that within the next few years, if things go to plan. Putting human feet on Mars, as reaffirmed by the USA’s President Trump during his inauguration this week, is also an objective. The Badger, however, senses that it’s unlikely to happen for decades yet, if at all.

Why the scepticism? Well, two things. The first is that putting humans on Mars and bringing them back is much more challenging than returning to the Moon. The second is more fundamental. In the ~60 years since Star Trek’s iconic words were first heard, life and our knowledge of Space have been transformed through technological advances, especially in the sphere of capturing, processing, and using information digitally. Advances in digital technology continue apace, with AI and intelligent machines fast becoming a reality. Indeed, Mr Trump has announced huge investment in Stargate, an AI infrastructure venture. The automation of everything, with machines becoming as intelligent as humans, raises a question, namely ‘Is prolonged human travel in Space really viable and economically sensible?’

The evidence implies that humans are unsuited to prolonged Space travel (e.g. see here and here). So why send humans to Mars when intelligent machines are a better option? Perhaps a rethink of putting humans on Mars will happen as AI and intelligent machines become mainstream, perhaps it won’t. Meantime the Badger wholly subscribes to the pursuit of knowledge and exploration of the unknown, but he will enjoy Star Trek for what it is, just imaginative entertainment…