A smartwatch for wellbeing and health?

Last week the Badger attended his uncle’s 90th birthday. He sat with a group of mostly millennial adults and found himself watching how often they checked their smartphone or smartwatch, and sometimes both. Before the Badger’s uncle blew out the candles on his birthday cake, conversation in the group was convivial and centred on catching up since the last time everyone was together. A smartwatch noisily tinkled and buzzed, and the person sitting opposite the Badger got up and announced to everyone that their watch had told them they’d been sitting for too long! They walked away and returned a few minutes later. When they took their seat, they began talking in a way that sounded like a commercial for smartwatches equipped with health and wellness tracking apps.

A discussion ensued. People in the group were asked if they had smartwatches and found their health apps useful. Most younger adults nodded. A few admitted to being addicted to the well-being and health metrics their smartwatches provided. A couple said they had a smartwatch but rarely used the health and well-being functions, and the remainder, including the Badger, did not have a smartwatch. The Badger was asked why he doesn’t have a smartwatch given his IT/tech background, especially when, as the questioner put it, the health apps ‘would be beneficial at your age.’ In reply, the Badger made two curt points. The first was that his solar-powered but otherwise conventional watch and the smartphone in his pocket met all his needs to function while out and about in today’s world. The second was that smartwatches are not approved medical devices, and so their health metrics fundamentally provide the same health guidance that doctors have given for decades – walk more, don’t drink too much alcohol, and maintain a healthy weight. You don’t need an expensive device and constant checking of metrics to comply with that advice. The cutting of the birthday cake stopped further discussion.

While the well-being and health functions on smartwatches do, of course, encourage good health and lifestyle habits for those individuals who need such prompts, many who glance at their smartwatch dozens of times a day to check their metrics are doing so unnecessarily. Does this habitual attention to the likes of step count, heart rate, sleep quality, and sitting too long simply illustrate that people are becoming needlessly addicted to another digital device? Possibly. Smartwatch firms are profit-motivated businesses not health services, and concern about profiling, advertising, and losing control of sensitive personal data would be prudent. Remember, it’s cheaper and better for privacy to simply do what doctors have ordered for decades, namely walk more, drink less alcohol, and maintain a healthy weight. Concentrate on living life rather than being a slave to metrics provided by your smartwatch. After all, the Badger’s sprightly uncle has reached 90 years of age by doing just that…

The Future: microchipped, monitored and tracked?

The Badger sank onto the sofa after his infant grandson’s parents collected the little whirlwind following a weekend sleepover. The Badger had been reminded that Generation Alpha are the most digital-immersed cohort yet. Born into a world full of tech, they are digital natives from an early age, as was evident during the weekend’s activities. Struck by the youngster’s digital awareness and especially their independence, curiosity, and eagerness to grasp not just what things are, but also why and how they work, the Badger found himself wondering about the digital world that his grandson might encounter in the future.

From his IT experience, the Badger knows that change is continuous and disruptive for IT professionals, organisations, and the public alike. Change in the digital landscape over the last 40 years has been phenomenal. All of the following have caused upheavals on the journey to the digital world we have today: the move from mainframes to client-server and computer networks, relational databases, the PC, spreadsheets and word processing packages, mobile networks and satellite communications, mobile computing, image processing, the internet, online businesses, social media, the cloud, microchip miniaturisation, and advances in software engineering. These have changed the way organisations function, how the general public engages with them, and how people interact with family, friends, and others globally. AI is essentially another transformative upheaval, and one that will impact Generation Alpha and future generations the most.

Data, especially personal data, is the ‘oil’ of today’s and tomorrow’s digital world, and the entities that hold and control it will use it to progress their own objectives. With AI and the automation of everything, the thirst for our data is unlikely to be quenched, which should make us worry about the digital world for Generation Alpha and beyond. Why? Because humans in the hands of tech, rather than the other way around, increasingly seems to be the direction of travel for our world. The UK government’s announcement of a digital ID ‘to help tackle illegal migration, make accessing government services easier, and enable wider efficiencies’ has made the Badger a little uneasy about the digital world his grandson will experience. A backlash, exemplified by this petition to Parliament, illustrates the scale of worry that it’s a step towards mass surveillance and state control. Governments, after all, do not have good track records in delivering what they say they will.

As the Badger started to doze on the sofa, he envisaged a future where humans are microchipped and have their lives monitored and tracked in real time from birth to death, as happens with farm animals. He resolved to make sure his grandson learns about protecting his personal data and that he values a life with personal freedom rather than control by digital facilities. The Badger then succumbed to sleep, worn out from activities with a member of Generation Alpha…  

Cyber security – a ‘Holy Grail’?

King Arthur was a legendary medieval king of Britain. His association with the search for the ‘Holy Grail’, described in various traditions as a cup, dish, or stone with miraculous healing powers and, sometimes, providing eternal youth or infinite sustenance, stems from the 12th century. Since then, the search has become an essential part of Arthurian legend, so much so that Monty Python parodied it in their 1975 film. Indeed, it’s common for people today to refer to any goal that seems impossible to reach as a ‘Holy Grail’. It’s become a powerful metaphor for a desired, ultimate achievement that’s beyond reach.

Recently, bad cyber actors – a phrase used here to refer collectively to wicked individuals, gangs, and organisations, regardless of their location, ideology, ultimate sponsorship or specific motives – have caused a plethora of highly disruptive incidents in the UK. Incidents at the Co-op, Marks & Spencer, Harrods, JLR, and Kido have been high profile due to the nature and scale of the impact on the companies themselves, their supply chains, their customers, and also potentially the economy. Behind the scenes (see here, for example) questions are, no doubt, being asked not only of the relevant IT service providers, but also more generally about how vulnerable we are to cyber security threats.

While taking in the colours of Autumn visible through the window by his desk, the Badger found himself mulling over what these incidents imply in a modern world reliant on the internet, online services, automation and underlying IT systems. As the UK government’s ‘Cyber security breaches survey – 2025’ shows, the number of bad cyber actor incidents reported is high, with many more going unreported. AI, as the National Cyber Security Centre indicates, means that bad actors will inevitably become more effective in their intrusion operations, and so we can expect an increase in the frequency and intensity of cyber threats in the coming years. The musing Badger, therefore, concluded that organisations need to be relentlessly searching for a ‘Holy Grail’ to protect their operations from being vulnerable to serious cyber security breaches. As he watched a few golden leaves flutter to the ground, the Badger also concluded that in a world underpinned by complex IT, continuous digital evolution, and AI, this ‘Holy Grail’ will never be found. But that doesn’t mean organisations should stop searching for it!

These damaging incidents highlight again that cyber security cannot be taken for granted, especially when the tech revolution of recent decades has enabled anyone with a little knowledge and internet access to be a bad cyber actor. The UK government has just announced the introduction of digital ID by 2029. Perhaps they have found a ‘Holy Grail’ that guarantees not only the security of personal data, but also that its IT programmes will deliver on time and to their original budget? Hmm, that’s very doubtful…

Security: People are always the weakest link…

The Badger tried to suppress a giggle when the accidental inclusion of a journalist in the US administration’s Signal group chat hit the media. He failed. On watching the US President on television call the journalist in question a ‘sleazebag’, the Badger laughed aloud as the proverbial idiom ‘pot calling the kettle black’ came to mind. The administration’s subsequent bluster about the journalist’s inclusion and the group’s messages has not been its finest hour. Asserting that the military attack information shared was unclassified is, for most independent observers, just ludicrous. Indeed, the whole episode raises many questions, not least being whether the administration’s senior echelons actually respect and adhere to standard security policies and protocols.

Signing the UK Official Secrets Act and being thoroughly vetted for a high level of security clearance were prerequisites for the Badger’s first IT projects. Security has thus been an embedded ethos throughout his working life. Sometimes the constraints imposed by security policy and associated processes were frustrating, but the Badger has learned that a cavalier approach to compliance is never a good idea. Rightly, clients and his employer had zero tolerance for any kind of security misdemeanour. Indeed, on the rare occasions over the years when a security mishap occurred, the situation was quickly rectified and the culprit dealt with swiftly and definitively. Something similar may be happening behind the scenes following the Signal incident, but the US administration’s public messaging doesn’t imply this to be the case.

Later in his career, the Badger was asked to oversee the operations of his employer’s security department. The head of the department expanded the Badger’s appreciation of security matters pertinent to premises, personal safety, vetting, and cyber threats. The department head emphasised the need to keep in mind just one phrase, namely ‘people are always the weakest link’, when it came to security doctrine. This has proved to be wise advice over the years, and the recent Signal incident simply reinforces the point.

Today, the use of Signal, WhatsApp, X, and social media platforms is rife in the general public and in political and governmental circles. The Signal incident is a reminder for us all that it takes just one participant to leak the substance of a group chat for there to be a problem, and that there’s a greater chance that someone will spill the beans beyond the group when it has a large number of participants. The incident is also a reminder to think carefully about what you write in a group chat. If you don’t then you only have yourself to blame if something you have written comes back to bite you in the future. Think before you write, always, but most of all remember that technology is not normally the weakest link, people are. That’s right…you and me!

The price for being a Digital Citizen…

The vast majority of people are now ‘digital citizens’. There are many definitions of what being a digital citizen means, but the Badger thinks the term simply describes anyone who regularly uses the internet, online services, and IT to engage with society, business, work, politics, and government. Becoming a digital citizen, in the Badger’s view, starts when any individual acquires an email address and then shares information online, uses e-commerce to buy merchandise, uses any other online service, or simply browses the internet. Everyone reading this is a digital citizen.

The Badger’s been a digital citizen for more years than he cares to admit to, but over the last decade he’s become circumspect and increasingly alarmed by the deterioration in responsible use of online technology and the internet by individuals and organisations. Yesterday the Badger helped an elderly neighbour carry their shopping bags the last few metres to their doorstep and was invited in for a quick cup of tea as a thank you. During the ensuing conversation, the Badger’s neighbour, a sharp 85-year-old ex-civil servant, mentioned they were a digital agnostic who strongly believed that the digital world has produced a surveillance society. They have a point, especially when you consider the following.

Supermarkets know what we purchase and when from our online transactions and use of debit and loyalty cards. They use this data for their business and to market products to us via, for example, voucher and loyalty point schemes. They don’t sell the data to others, but its theft by bad actors via security breaches can never be ruled out. The same is true for other retail companies. And then there are the online giants like Amazon, Google, and Meta, who capture so much data about our interests, behaviours, and habits that they often know more about a person than the person knows about themselves. All of this, coupled with the fact that energy, transport, banking, and central and local government functions are now also ‘online first’, just reinforces the fact that the data describing our personal lives is in the digital ether and can be used for purposes which are invisible to most people.

Put this together with the fact that the UK has one of the highest densities of CCTV cameras amongst Western countries, with approximately one camera for every 13 people, and it’s difficult to deny that the digital world has produced a surveillance society in barely 25 years. The price for being a digital citizen is thus personal acceptance of more and more surveillance. But here’s an interesting thought to end this musing with. Digital citizens are not just victims of surveillance, they are perpetrators too! Anyone who has checked out others using social media or internet searches has essentially engaged in surveillance. The digital world has thus made us all spies…

The Law of Unintended Consequences…

If you’ve a couple of minutes spare then read the item here. It was published in 2013 and what’s striking is that the exact same words could be used if it had been written today! A 2010 item, ‘Technology: The law of unintended consequences’, by the same author also stands the test of time. Reading both has caused the Badger to muse on unintended consequences, especially those that have emerged from the digital and online world over the last few decades.

The ‘Law of Unintended Consequences’ is real and is, in essence, quite simple. It declares that every action by a person, company, or government can have positive or negative consequences that are unforeseen. An amusing manifestation of the law in action happened in 2016 when a UK Government agency conducted an online poll for the public to name the agency’s latest polar research ship. The public’s choice, Boaty McBoatface, wasn’t the kind of name the agency anticipated!

One characteristic of unintended consequences is that they tend to emerge over a long period. The internet and social media illustrate this neatly. Both have changed the behaviour of people (especially the young), companies, and governments, and both have challenged safety, security, and privacy like never before. Indeed, the Australian government’s recent decision to ban those under 16 years old from social media demonstrates just how long it’s taken to address some of social media’s unintended consequences since its advent a couple of decades ago.

During his IT career, the Badger participated in delivering the many benefits of digital and online technology to society, but now, more mindful of unintended consequences, he wonders if a future dominated by virtuality, AI, and colossal tech corporations is a good thing for his grandson’s generation. After all, the online and digital world is not where real, biological, life takes place, and there’s more to life than being a slave to our devices.

The ‘Law of Unintended Consequences’ can never be ignored. Although a professional and disciplined approach to progress always reduces the scope for unintended consequences, the fact is these will happen. This means, for example, that there’ll be unintended consequences from the likes of AI, driverless vehicles, and robots at home, and that, in practice, it will take years for these unintended consequences to emerge properly. But emerge they will!

Looking back over recent decades, it’s clear that digital and online technology has delivered benefits. It’s also clear that it’s brought complication, downsides, and unintended consequences to the lives of people in all age groups. The Badger’s concluded that we need a law that captures the relationship between progress, unintended consequences, and real life. So, here’s Badger’s Law: ‘Progress always produces unintended consequences that complicate and compromise the real life of people’. Gosh, it’s astonishing where articles penned over a decade ago can take your thoughts…

Once privacy has gone in the digital world, it’s gone…

Sitting quietly under a parasol, beer in hand, observing a beach full of people enjoying the recent sunny weather, triggered fond memories of days at the same beach in the 1970s. How things have changed since then! Today, those on the beach are, let’s put it tactfully, ‘bigger’. (The average British man is around 7.6 cm taller and 10.4 kg heavier than 50 years ago). Adults with tattoos are commonplace, whereas in the 1970s tattoos featured primarily on seafarers and unruly motorcyclists. When soaking up the sun’s rays today, most beachgoers are using their smartphone or tablet for social media and surfing the internet, for taking copious photos and videos, and for streaming music or watching movies. Printed newspapers and magazines, portable transistor radios and cassette players, and cameras requiring photographic film – all commonplace at the beach in the 1970s – are a rare sight on the beach today.

As he quaffed his beer, the Badger reflected on how the digital world has changed our lives since the 1970s, a decade when pen and paper dominated, a computer was programmed with cards or paper tape, and an affordable electronic pocket calculator was a great leap forward! Way back then, what we take for granted today was science fiction. Progress, however, always comes at a price, and today’s frequent security breaches, data thefts, IT system problems causing widespread disruption and inconvenience, and misinformation, disinformation, and scurrilous content on social media, all expose the fact that part of this price has been an erosion of personal privacy.

When today’s world is typified by things like those reported here, here, here, here, and here, and AI-produced deepfake video, photos, and audio are ever more commonplace, then people who value their privacy must be wary, clear-headed, and ruthlessly objective about protecting it, much more so than in the 1970s. The Badger, observing the beachgoers liberally using their personal devices, asked himself whether they were doing so with privacy at the forefront of their minds. Were they conscious of how many online enterprises know their email address, contact details, their likes and dislikes, what they buy and when? Were they conscious that there is a reasonable probability that their personal data has been leaked in cyber-attacks? Were they aware that a deepfake of them can be produced by anyone with scurrilous intent in the digital world from a single image, ~40 seconds of speech audio, and a few cheap AI tools? The Badger’s doubtful.

The advent of the digital world since the 1970s has brought many benefits, but it’s been at the expense of eroded personal privacy. Who’s to blame? Well, blaming others misses the point because protecting our own privacy starts with our own actions and behaviours. So, if you value your privacy, then think very carefully whenever you upload content to the virtual world, because once privacy’s gone, it’s gone…  

Protecting your privacy…

The arrival of a scam email, a television programme on Banking Scams, scurrilous AI generated images of Taylor Swift, news of a fake robocall using President Biden’s voice, and the UK’s National Cyber Security Centre’s warning that the global ransomware threat will rise with AI, made the Badger think about protecting privacy this week.

The following facts underpinned his musing. LinkedIn, Facebook, X (Twitter), Instagram, Snapchat, and TikTok were launched in 2003, 2004, 2006, 2010, 2011, and 2016, respectively. Amazon was founded in 1994, Netflix in 1997, Google in 1998, Spotify in 2006, and WhatsApp in 2009. The first smartphone with internet connectivity arrived in 2000 when life was very different, as neatly illustrated here. Over barely 30 years, tech and these companies have changed the dynamics of daily life, and what constitutes personal privacy, for everyone. These companies, fledglings 25 years ago but now more powerful than many countries, harvest, hold, and use vast swathes of our personal data. What constitutes privacy for an individual has thus inevitably changed, and, the Badger feels, not for the better compared with 25 years ago. What other conclusion could you draw when huge data breaches and scandals like Cambridge Analytica expose individuals to security threats and privacy risk like never before? And along comes AI, making the risk to individuals much, much worse!

Everything done online today is tracked and used for some purpose. If you use an internet-connected personal device then the world’s plumbing knows where you are and what you’re doing. When it comes to privacy, therefore, the old saying ‘an Englishman’s home is his castle’ was much more relevant 30 years ago than it is today. With vast swathes of our personal data held online it’s hardly surprising that bad actors want to get their hands on it for nefarious purposes. As Channel 5’s ‘Banking Scams: Don’t get caught out’ programme recently highlighted, just a small amount of your personal data in the wrong hands can make your life a misery. AI just adds another dimension to the potential scale of that misery.

With online interactions a norm of modern life and AI manipulation of images, video, and speech becoming more widespread, the Badger wondered if there’s something other than good cyber security practices that anyone can do to bolster their personal privacy. Well, there is. Don’t post photos, videos, or voice recordings of yourself on social media platforms! Your face, your body, and your voice are part of your real identity, so why make them easy pickings for anyone of a wicked disposition? The Badger’s lost the plot, you may think, but his fundamental point is this. Think about your privacy the next time you post photos, video, or voice recordings on a social media platform. After all, the responsibility for protecting your privacy fundamentally rests with you…  

Your face, your voice, AI, and human rights…

In the gap between completing his undergraduate degree and starting post-graduate study, the Badger took a temporary job as an assistant in a dockyard laboratory performing marine metallurgical failure investigations and associated corrosion research. It was a great few months which enabled the application of what he learned during his undergraduate degree to real world events. Those few months are the reason why, for example, the Badger has a particular interest today in the findings of the investigation into the Titan deep-sea submersible failure. The dockyard lab staff were experts with colourful personalities and diverse opinions on a wide range of topics. Engaging in wide-ranging discussions with them, especially at lunchtime in the canteen, was enlightening, thought-provoking, and has been the source of fond memories lasting for years.

One particular memory is of one senior expert, highly respected but always cantankerous and quarrelsome, refusing to be photographed sitting at their electron microscope for a newspaper feature about the laboratory. They didn’t want their image captured and used because, they claimed, it was part of ‘who they were as an individual’ and therefore it was part of their human rights to own and control its use. The lab boss saw things differently, and for days there was a lot of philosophical discussion amongst staff about the expert’s position. The newspaper feature ultimately used a photo of the electron microscope by itself.

The current strike by Hollywood actors, due in part to proposals relating to AI and the use of an actor’s image and voice, brought the memory of the lab expert’s stance regarding their image to the fore. In those days, the law was more straightforward because the internet, social media, personal computers, smartphones, and artificial intelligence didn’t exist. In today’s world, however, images of a person and their voice are routinely captured, shared, and manipulated, often for commercial gain without an individual’s real awareness. The law has, of course, developed – albeit slowly – since the expert’s days at the lab, but the surge in AI in its various guises over the last year seems to illustrate that the gap between legal/regulatory controls and the digital world continues to widen.

Today, and with advancing AI, an image of you or snippet of your voice can be manipulated for any purpose, good or evil. Whilst there’s some teaching of online safety at school, is it enough? Does it sufficiently raise awareness about protecting ‘your image and your voice which are both key attributes that characterise who you are as a person’? Did the dockyard lab expert have a point, all those years ago, in asserting that it was part of their human rights to own and control their image? The Badger doesn’t have the answers, but he senses that AI and human rights will inevitably be a fertile ground for campaigners, legislators, and regulators for many decades to come…

Smart Warfare…

The Badger recently saw an elderly pensioner clash with an Extinction Rebellion (XR) activist at a demonstration in London. The clash triggered the Badger to think about ‘smart warfare’ and reminded him that anything prefixed with ‘smart’ might mean ‘clever’, but it doesn’t necessarily mean ‘good’ or ‘beneficial’.

‘Can you zealots stop blocking my way please’, the pensioner asked politely. ‘I’m not a zealot; I’m a climate activist engaged in smart warfare’, the activist replied with a sneering arrogance. The pensioner responded indignantly with ‘I’ve been climate and environment conscious for years, so you should be ashamed about being at war with me when it’s the big countries in other parts of the world that have the biggest impact on the planet’s climate’. With that the pensioner pushed past the activist blocking their way. With a derogatory hand gesture, the activist turned their attention to someone else. The activist believed they were engaged in ‘smart warfare’, but to the Badger they seemed to be illustrating the polarising, fixated, tribal behaviour that is prevalent in today’s world.

The phrase ‘smart warfare’ tends to trigger thoughts of ever-evolving advanced military weaponry and a future of cyber warfare, swarms of drones, and robots. The activist’s use of the phrase, however, illustrates that ‘smart warfare’ is really modern-day terminology for the centuries-old execution of power over people using whatever clever tools and techniques are available. Tools range from extreme physical violence to the most subtle psychological techniques that enable one mind to influence and control another. Yes, clever advances in technology broaden the tools available and change how wars are contested, but truly ‘smart warfare’ requires more than just technology; it requires clever, effective, and inspirational leaders, and committed and united people. Russia’s invasion of Ukraine shows that subjugating a population takes more than advanced ballistic, information, cyber, economic, and propaganda weaponry. It’s people who conduct ‘smart warfare’, not technology, and people will always find ways to resist against the odds regardless of the clever technology in use.

As the XR activist’s use of the phrase illustrates, ‘smart warfare’ has broadened beyond the military domain into the routines of normal life in our globally connected, online world full of misinformation, disinformation, processing of personal data, and location and preference tracking. When you buy a traditional newspaper from a shop, there’s no record of the articles and adverts you look at or share with other people, your opinions, other people you associate with, or whether something in the paper prompted you to make a purchase or change your behaviour. The opposite is true when we use the phones, tablets, and laptops that dominate life today. ‘Smart warfare’ is thus a routine aspect of life today because organisations are using clever tools to analyse this information to wield power over us! With this in mind, always use the apps on your phone, tablet, or laptop wisely…