The Future: microchipped, monitored and tracked?

The Badger sank onto the sofa after his infant grandson’s parents collected the little whirlwind following a weekend sleepover. The weekend had reminded the Badger that Generation Alpha are the most digitally immersed cohort yet. Born into a world full of tech, they are digital natives from an early age, as the weekend’s activities made plain. Struck by the youngster’s digital awareness, and especially his independence, curiosity, and eagerness to grasp not just what things are but also why and how they work, the Badger found himself wondering about the digital world his grandson might encounter in the future.

From his IT experience, the Badger knows that change is continuous and disruptive for IT professionals, organisations, and the public alike. Change in the digital landscape over the last 40 years has been phenomenal. All of the following have caused upheavals on the journey to the digital world we have today: the move from mainframes to client-server and computer networks, relational databases, the PC, spreadsheets and word processing packages, mobile networks and satellite communications, mobile computing, image processing, the internet, online businesses, social media, the cloud, microchip miniaturisation, and advances in software engineering. These have changed the way organisations function, how the general public engages with them, and how people interact with family, friends, and others globally. AI is essentially another transformative upheaval, and one that will impact Generation Alpha and future generations the most.

Data, especially personal data, is the ‘oil’ of today’s and tomorrow’s digital world, and the entities that hold and control it will use it to progress their own objectives. With AI and the automation of everything, the thirst for our data is unlikely to be quenched, which should make us worry about the digital world for Generation Alpha and beyond. Why? Because humans in the hands of tech, rather than the other way around, increasingly seems to be the direction of travel for our world. The UK government’s announcement of a digital ID ‘to help tackle illegal migration, make accessing government services easier, and enable wider efficiencies’ has made the Badger a little uneasy about the digital world his grandson will experience. The backlash, exemplified by a petition to Parliament, shows the scale of worry that it’s a step towards mass surveillance and state control. Governments, after all, do not have good track records in delivering what they say they will.

As the Badger started to doze on the sofa, he envisaged a future where humans are microchipped and have their lives monitored and tracked in real time from birth to death, as happens with farm animals. He resolved to make sure his grandson learns about protecting his personal data and that he values a life with personal freedom rather than control by digital facilities. The Badger then succumbed to sleep, worn out from activities with a member of Generation Alpha…  

Frustration caused by the plague of delivery vans…

Life’s full of ups and downs, and some weeks are better than others! For the Badger, Easter week was full of frustrations, all ostensibly caused by society’s addiction to online shopping with home delivery. Like many, the Badger used his car to visit family, friends, and for errands over the Easter period. Every journey was delayed at some point by the stop, start, and illegal parking activities of vehicles that were part of the ever-growing plague of multi-drop delivery vans on UK streets.

Here’s one example that caused frustration. The Badger drove an elderly neighbour to their appointment at the local health centre, a journey which normally takes ~7 minutes with a route that entails driving through the town’s High Street. Well before reaching this busy street, traffic had slowed to a snail’s pace. This was because a well-known company’s multi-drop delivery van had parked on double-yellow (no-waiting) lines in the middle of the High Street. The driver had left the van to deliver a collection of packages to nearby residences. The illegally parked van caused havoc as car drivers tried to navigate around it against the constant flow of traffic coming in the opposite direction. Just as the Badger reached the High Street, the van driver returned, collected another armful of packages, and walked off with them in a different direction, ignoring the obvious disruption their van was causing.

Just before it was the Badger’s turn to navigate past the van, the driver returned, drove off, and stopped again on double-yellow lines 50 metres further along the street. This made the disruption worse because another multi-drop delivery van had parked close by on the opposite side of the road, creating a chicane for traffic in both directions. As a result, the 7-minute drive to the health centre took nearly 25 minutes, making the Badger’s neighbour slightly late for their appointment. This, and similar experiences on other journeys over the Easter period, triggered some musing.

Online shopping with home delivery has revolutionised convenience, but one consequence is the plague of vans on our roads and the tendency of their drivers to ignore the rules of the road due to tight schedules, high delivery volumes, and the need for frequent stops. Since these van drivers seem immune to the rules of the road, the Badger thinks there’s a need for an enforcement solution. If today’s digital tech can tell you when your online purchase will arrive at your door, then it’s clearly possible to use drone, satellite, and information technologies to a) detect in real-time when multi-drop van drivers park illegally on double yellow lines and b) automatically fine them and their employer for the misdemeanour. It currently seems that no amount of ‘company policy’ or ‘driver training’ makes a difference, but hitting them in their pockets probably will…

A vintage Fortran Source code listing…

The Badger found an old paper Fortran source code listing, in good condition considering its age, at the back of a cupboard this week. It triggered memories of his programming activities early in his IT career. It also caused him to reflect on the changes in IT, and in the way we live and work, that have flowed from the tremendous advances in digital technology over the last 40 years. As illustrated below, this period has been one of continuous, rapid change.

In the 1980s, personal computers began to make their way into businesses and homes. The likes of IBM, Apple, and Microsoft introduced devices that revolutionized how people accessed information and performed tasks. The introduction of graphical user interfaces (GUIs) also made computers more user-friendly, enabling a broader audience to embrace technology. The 1990s brought the birth and expansion of the internet, drastically changing communication, commerce, and entertainment. It brought a new level of connectivity and made information accessible globally at the click of a button. E-commerce giants like Amazon and eBay emerged, transforming the retail landscape and giving rise to online shopping.

The 2000s saw the rise of the mobile revolution. With the introduction of smartphones and tablets, technology became ever more integrated into our work and personal lives. Apple’s iPhone and Google’s Android led the charge, creating app-driven ecosystems that allowed users to perform a myriad of tasks on-the-go. Mobile internet access became ubiquitous, fostering a new era of social media, instant messaging, and mobile gaming. In the 2010s, Cloud Computing with Amazon Web Services (AWS), Microsoft Azure, and Google Cloud brought scalable, on-demand computing resources. This facilitated the rise of Software as a Service (SaaS) models, which enable access to software applications via the internet and help businesses to reduce infrastructure costs and improve scalability.

In recent years, ‘Big Data’ has meant that organizations can leverage vast amounts of data to gain customer insights, optimize their operations, and make data-driven decisions. AI technologies such as machine learning, natural language processing, and computer vision are also rapidly being integrated into applications from healthcare and finance to autonomous vehicles and smart home devices. In addition, the COVID-19 pandemic accelerated the adoption of remote working and digital collaboration tools, and video conferencing platforms like Zoom and Microsoft Teams have become essential communication and productivity tools.

Anyone working in the IT world over this period has had an exciting time! The Fortran listing reminded the Badger that it was produced when programming was a very human, hand-crafted activity. Source code today is produced differently, and AI will dominate programming in the future. The Badger’s career, spanning all these changes, was challenging, exciting, and creative, and one where dynamism, innovation, teamwork, hard work, and a ‘can do’ mentality were embedded workforce traits. Is that the case today? It has to be in a future dominated by AI.

AI – A Golden Age or a new Dark Age?

The Badger’s experimented with Microsoft’s Copilot for a while now, sometimes impressed, but often irritated when the tool ends its answer to a question by asking the user’s opinion on the underlying topic. For example, the Badger asked Copilot ‘When will self-driving cars be the majority of vehicles in the UK?’ Copilot’s answer was sensible and distilled from quoted sources, but it ended with ‘What are your thoughts on self-driving cars? Do you think they’ll revolutionize transportation?’. The Badger wanted an answer to his question, not a conversation that will capture, store, and use his opinion for the tool’s own purposes. Responding with ‘None of your business’ gets the reply ‘Got it! If you have any other questions or need assistance with something else, feel free to ask. I’m here to help’. That last phrase should be supplemented with ‘and make money!’

Overall, his experimentation has made him wonder if AI is leading to a new Golden Age for humanity, or a new Dark Age. So, what’s the answer? A new Golden Age, or a worrying Dark Age? AI and Machine Intelligence advocates, giant businesses investing huge amounts of money in the technology, and even governments with a ‘fear of missing out’, are quick to say it’s the former. The Nobel Laureate Geoffrey Hinton, the godfather of AI, isn’t so sure. He articulates the risks well, and he’s highlighted that AI eventually wiping out humanity isn’t inconceivable. Listening to him interviewed recently on the Today programme, BBC Radio 4’s flagship news and current affairs programme, struck a chord. It made the Badger realise that such concerns are valid, and that a Dark Age is a possibility.

So where does the Badger stand on the Golden or Dark Age question? Well, the last 25 years have made us believe tech-driven change is a good thing, but that premise should be challenged. New technology may drive change, but it doesn’t necessarily drive progress because it’s politics that really determines whether change makes people better off overall. Politicians, however, have struggled woefully to deal with tech-driven change and the new problems it’s created for society so far this century. There’s little sign this is changing for AI. Humans are fallible and can make poor judgements, but if we become reliant on AI to make choices for us, then there’s a real danger that our confidence and capacity to make our own independent decisions will be lost.

The Badger’s answer is thus nuanced. A Golden Age will unfold in areas where AI is a tool providing a tangible benefit under direct human control, but if AI is allowed to become completely autonomous and more intelligent than humans, then a Dark Age is inevitable. Why? Because things with greater overall intelligence always control things of lower overall intelligence. Can you think of an example where the reverse is true?

Human Space travel to Mars? Just send Intelligent Machines…

‘Space, the final frontier. These are the voyages of the Starship Enterprise. Its five-year mission – to explore strange new worlds, to seek out new life and new civilizations, to boldly go where no man has gone before’.

In September 1966, these iconic words were heard for the first time when Star Trek arrived on television. They, in essence, remind us that the pursuit of knowledge and the exploration of the unknown are central to what it means to be human. They inspire us to dream big, embrace challenges, and continually seek to expand our understanding. The words were narrated before man landed on the moon, before the internet, before smartphones and laptops, and when the computing power available to astronauts was minuscule compared to that of a mid-range smartphone. Things have changed extraordinarily since 1966, but the opening words to Star Trek episodes are just as relevant to what it means to be human today.

Space travel is difficult, as US billionaires will attest (see here and here, for example). Today’s Space-race is different to that of the 1960s with, for example, the likes of India and China part of the fray. Putting humans back on the Moon is a key objective, and the USA’s Artemis programme intends to do just that within the next few years, if things go to plan. Putting human feet on Mars, as reaffirmed by the USA’s President Trump during his inauguration this week, is also an objective. The Badger, however, senses that it’s unlikely to happen for decades yet, if at all.

Why the scepticism? Well, two things. The first is that putting humans on Mars and bringing them back is much more challenging than returning to the Moon. The second thing is more fundamental. In the ~60 years since Star Trek’s iconic words were first heard, life and our knowledge of Space has been transformed through technological advances, especially in the sphere of capturing, processing, and using information digitally. Advances in digital technology continue apace with AI and intelligent machines fast becoming a reality. Indeed, Mr Trump has announced huge investment in Stargate, an AI infrastructure venture. The automation of everything, with machines becoming as intelligent as humans, raises a question: ‘Is prolonged human travel in Space really viable and economically sensible?’

The evidence implies that humans are unsuited to prolonged Space travel (e.g. see here and here). So why send humans to Mars when intelligent machines are a better option? Perhaps a rethink of putting humans on Mars will happen as AI and intelligent machines become mainstream, perhaps it won’t. Meantime the Badger wholly subscribes to the pursuit of knowledge and exploration of the unknown, but he will enjoy Star Trek for what it is, just imaginative entertainment…

Today’s cutting edge is tomorrow’s obsolescence…

As the Badger sat in traffic, a news item on the car radio grabbed his attention. It was a report that there are now no new car models in the UK that come with a CD player. The built-in CD player is joining the cassette tape player in the great scrapyard in the sky! The Badger’s reaction on hearing the report? A little sadness, but no surprise given the speedy evolution of in-car digital infotainment over the last 15 years. The march of connected, integrated, digital technology and the advent of Apple CarPlay and Android Auto have rendered CDs in vehicles obsolete. The Badger glanced at the half-dozen music CDs, a mix of factory-pressed and self-burned, resting in the cubbyhole behind the handbrake and was hit by a wave of nostalgia.

Nostalgia is a natural and common human experience that can help in navigating the present by drawing comfort and strength from the past. The Badger has a kinship with his car CDs because they’ve often been played during notable journeys full of either happiness or great sadness. There’s something personally satisfying and engaging about physically selecting a CD, taking it from its case, putting it into the car’s player, adjusting the volume, and then doing the reverse when the last track’s played. Tapping a digital screen or giving voice commands to play your music in a vehicle is a different, less engaging experience. The Badger’s CDs will continue to be played in his car until it too is beckoned by the great scrapyard in the sky.

The demise of in-car CD players is just another illustration that obsolescence is an unavoidable aspect of the rapidly advancing digital age. In the 1980s, the CD put the in-car cassette tape on the path to oblivion; the fitment of cassette players as standard in new vehicles ended completely in the first decade of this century. Now digital systems sourcing music and other entertainment from the ether have essentially done the same to the CD player. This implies, of course, that what’s replacing the CD in vehicles today will itself become obsolete in due course, especially as obsolescence is happening faster and faster in the consumer electronics, software, media and entertainment, manufacturing, and automotive industries.

Things once acclaimed as cutting edge are always eventually relegated to the sidelines by something else, so what will in-car entertainment look like in a few decades’ time? Well, if mass adoption of truly self-driving cars becomes a reality, then occupants will absorb entertainment without the distraction of actually driving. In-car entertainment will be dominated by immersive technologies, AI, well-being/mood sensors, and so on, making the driving experience akin to lazing about in a mobile digital living room. The thought makes the Badger shudder because it represents another step towards the potential obsolescence of the human race!

The digital world needs nuclear fusion!

When reading recently that the ITER experimental nuclear fusion reactor under construction in Cadarache, France, is delayed by at least a decade, the Badger sighed deeply. The delay to this huge endeavour, a collaboration involving 35 nations, inevitably puts back the holy grail of ‘limitless clean energy for the benefit of mankind’ from a tokamak reactor by many decades…again! Why the Badger’s sigh? Because his PhD and subsequent research, many years ago, related to the damage helium ions can cause in potential tokamak first wall materials. At that time, the Joint European Torus (JET) was under construction at Culham in the UK and the ultimate goal of ‘limitless clean energy for the benefit of mankind’ via fusion seemed achievable within the Badger’s lifetime. Realistically, that’s no longer the case, hence the sigh. The holy grail of ‘limitless clean energy’ from fusion reactors is still far away even though the need for it has never been greater.

ITER, however, is getting a run for its money from private firms within the Fusion Industry Association (FIA). In July 2023, the FIA said that 4 private firms believed they could deliver fusion power into the grid by 2030, and 19 firms believed they could do so by 2035. The Badger’s sceptical. However, given the speedy technological advances of recent decades, these beliefs cannot be completely dismissed if recent technological momentum continues unabated. ‘Wait a minute,’ you might say if the word ‘nuclear’ always sends uncomfortable shudders down your spine, ‘why do we need power produced by nuclear fusion at all?’ The answer’s quite simple. As this article points out, and this one reinforces, the world is 86% driven by fossil fuels and energy demand is forecast to rise by 50% from today’s level by 2050. Global energy demand is then expected to triple between 2050 and 2100! To get anywhere near meeting these forecasts, and have a decarbonised world, requires fusion to provide ‘limitless clean energy for the benefit of mankind’. Yes, wind, solar, and tidal power will play their part, but can they service the scale of this demand without blighting every picturesque part of our planet? That’s debatable.
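Those forecasts sound dramatic, but it’s worth seeing what they imply as steady yearly growth. A minimal back-of-the-envelope sketch (the milestone figures come from the forecasts quoted above; the smooth-compounding assumption, and the function name, are the sketch’s own):

```python
# Implied compound annual growth rates from the quoted energy forecasts,
# assuming smooth exponential growth between the milestones.

def cagr(multiple: float, years: int) -> float:
    """Annual growth rate implied by an overall multiple over a period."""
    return multiple ** (1 / years) - 1

# Demand rises 50% between now and 2050 (x1.5 over ~25 years)...
rate_to_2050 = cagr(1.5, 25)
# ...then triples between 2050 and 2100 (x3 over 50 years).
rate_to_2100 = cagr(3.0, 50)

print(f"to 2050: {rate_to_2050:.2%} per year")   # ~1.6% per year
print(f"2050-2100: {rate_to_2100:.2%} per year") # ~2.2% per year
```

Modest-sounding annual percentages, but sustained for 75 years they compound into the enormous absolute demand that makes the fusion case.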

So, here’s the thing. Digital transformation of the world economy continues at pace. The amount of data created, captured, copied, and consumed will be nearly three times as much in 2025 as in 2020. AI, the Internet of Things, cryptocurrency, and the digital automation of everything come with a dramatic increase in electricity usage which cannot be met by non-nuclear renewables alone. When we use our computers, tablets, and smartphones, we are contributing to the rising demand for electricity, and we are thus unconsciously making the case for why we need fusion reactors to provide ‘limitless clean energy for the benefit of mankind’. Let’s hope ITER isn’t the only game in town, because if it is then a digital future may not be quite what we currently envisage.

History suggests that a future generation will face a ‘Digital Crisis’…

Spanish philosopher George Santayana is credited with saying ‘Those who cannot remember the past are condemned to repeat it’, and Karl Marx remarked that ‘History repeats itself first as a tragedy, and then as a farce’. These came to mind while quietly musing on a future which is in the hands of younger generations who’ve grown up with global communication, the internet, social media, and online services as a norm. It’s sobering to be reminded that in just a few decades, digital technology and IT have transformed life faster than at any time in human history. AI adds to the unabated momentum of tech-driven change. But here’s the thing. History shows that many things that have a transformational impact on society have serious consequences that only become fully apparent decades later, creating a crisis for society that a future generation is forced to address. History thus implies that a future generation will have to deal with a crisis caused by the digital revolution.

Bold thinking? Maybe, but consider this. History shows that motor vehicles revolutionised transportation. It’s only in recent decades, however, that society has realised, and started addressing, the true impact of motor vehicles on public health and the planet. History also shows that the use of fossil fuels (particularly coal) during the Industrial Revolution transformed the world. Our dependence on them since, however, has impacted the climate and sustainability of life forcing society into corrective action, but only in recent decades. Similarly, plastic – a material that’s made the modern world possible – has gone from being a wonder substance a century ago to being reviled as an environmental scourge today. It therefore seems perfectly feasible that history will repeat itself with regard to the digital revolution we are living through.

Falling happiness in younger generations (see here, for example) and a tense interview with Elon Musk, who remarked that ‘moderation is a propaganda word for censorship’, illustrate that history may well repeat itself regarding social media. Social media platforms have revolutionised information sharing over two decades, but they have amplified misinformation, disinformation, bullying, and mental health issues, and eroded personal privacy in the process. They are commercial enterprises bound by the law, but they set their own rules and guidelines for content and its moderation. When a US Surgeon General says allowing young people to use social media is like giving them medicine not proven to be safe, and that it’s insane that governments have failed to adequately regulate them, then society has a problem regardless of Mr Musk’s dislike of challenging scrutiny. History has already forced society to face up to a ‘Climate Crisis’. Taking note of history is always wise, which is why it’s not outlandish to think that a future generation will face and need to address some kind of existential ‘Digital Crisis’…

Dr Who and the batteries…

The first episode of Dr Who aired on television on the 23rd November 1963. The series became part of the Badger’s childhood routine, although it almost didn’t! It aired on Saturday evenings, and initially the Badger’s parents didn’t think it suitable for their children to watch on the family’s black and white television. They capitulated following tantrums by the Badger and his siblings, however, on the understanding that if we had nightmares then the programme would be excluded from Saturday night viewing. We never had nightmares, but we often cowered behind the sofa when our parents were out of the room and an episode included the Daleks or Cybermen.

As an undergraduate at university years later, watching Dr Who with friends on a communal television in the Students Union was a weekly ritual, one which always led to discussions about the episode’s ‘whimsical science’ in the bar afterwards. One friend, a chemistry student who became an electrochemical research scientist in the battery industry, always asserted that the gadgets in Dr Who, the Daleks, and the Cybermen had one thing in common – a fundamental reliance on batteries! Dr Who’s still on television today and the Badger’s still in contact with his friend. In fact, we chatted recently after the Dr Who 60th anniversary special episodes. His friend asserted the same point about batteries that they’d made all those years ago, and they added that any of Dr Who’s gadgets, cyborgs, or robots that were more than two years old needed charging multiple times a day! Since the Badger’s two-year-old smartphone now needs more frequent charging than six months ago, we laughed and agreed that smartphones proved their point!

The physics, materials, chemistry, and design of modern batteries are complex. According to his friend, in the coming years we’ll see improvements in how fast batteries can charge and how many charging cycles they can withstand, but not a huge change in how long they last between charges. If that’s the case, then battery life, charging frequency, charging speed, and depreciation will be key criteria when buying goods requiring batteries for years to come. Depreciation is an often forgotten but particularly sobering point because after 3 years an iPhone, an Android phone, and a battery electric vehicle will have lost ~50%, ~75%, and ~50% of their initial value, respectively.
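For the curious, those three-year figures imply steep yearly losses. A rough sketch, assuming a constant percentage of remaining value is lost each year (the three-year percentages are from the paragraph above; the constant-rate assumption is the sketch’s own):

```python
# Yearly depreciation rate implied by a total value loss after 3 years,
# assuming the same fractional loss is applied each year.

def annual_rate(three_year_loss: float) -> float:
    """Constant yearly depreciation implied by the loss over 3 years."""
    retained = 1.0 - three_year_loss
    return 1.0 - retained ** (1 / 3)

for item, loss in [("iPhone", 0.50), ("Android phone", 0.75), ("battery EV", 0.50)]:
    print(f"{item}: ~{annual_rate(loss):.0%} per year")  # ~21%, ~37%, ~21%
```

On these figures, an Android phone sheds value each year at roughly twice the rate of an iPhone or a battery electric vehicle.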

Dr Who, of course, doesn’t worry about such things, but for those of us in the real world, batteries and the depreciation of the goods they power are key aspects of modern life and the cost of living. Dr Who is full of creative licence, not practical matters like batteries and depreciation, and so it should be! It’s science fiction and highly imaginative escapist entertainment. It should trigger interesting discussions about ‘whimsical science’ and batteries over a beer in a Student Union bar for years to come…

Your face, your voice, AI, and human rights…

In the gap between completing his undergraduate degree and starting post-graduate study, the Badger took a temporary job as an assistant in a dockyard laboratory performing marine metallurgical failure investigations and associated corrosion research. It was a great few months which enabled the application of what he learned during his undergraduate degree to real world events. Those few months are the reason why, for example, the Badger has a particular interest today in the findings of the investigation into the Titan deep-sea submersible failure. The dockyard lab staff were experts with colourful personalities and diverse opinions on a wide range of topics. Engaging in wide-ranging discussions with them, especially at lunchtime in the canteen, was enlightening, thought-provoking, and has been the source of fond memories lasting for years.

One particular memory is of one senior expert, highly respected but always cantankerous and quarrelsome, refusing to be photographed sitting at their electron microscope for a newspaper feature about the laboratory. They didn’t want their image captured and used because, they claimed, it was part of ‘who they were as an individual’ and therefore it was part of their human rights to own and control its use. The lab boss saw things differently, and for days there was a lot of philosophical discussion amongst staff about the expert’s position. The newspaper feature ultimately used a photo of the electron microscope by itself.

The current strike by Hollywood actors, due in part to proposals relating to AI and the use of an actor’s image and voice, brought the memory of the lab expert’s stance regarding their image to the fore. In those days, the law was more straightforward because the internet, social media, personal computers, smartphones, and artificial intelligence didn’t exist. In today’s world, however, images of a person and their voice are routinely captured, shared, and manipulated, often for commercial gain without an individual’s real awareness. The law has, of course, developed – albeit slowly – since the expert’s days at the lab, but the surge in AI in its various guises over the last year seems to illustrate that the gap between legal/regulatory controls and the digital world continues to widen.

Today, and with advancing AI, an image of you or snippet of your voice can be manipulated for any purpose, good or evil. Whilst there’s some teaching of online safety at school, is it enough? Does it sufficiently raise awareness about protecting ‘your image and your voice which are both key attributes that characterise who you are as a person’? Did the dockyard lab expert have a point, all those years ago, in asserting that it was part of their human rights to own and control their image? The Badger doesn’t have the answers, but he senses that AI and human rights will inevitably be a fertile ground for campaigners, legislators, and regulators for many decades to come…