AI – Pop goes the weasel!

The Badger’s five-year-old grandson, full of energy, innocence, and inquisitiveness, has been staying for a few days. It’s been fun, tiring, and a reminder that grandparents can be important influencers for Generation Alpha! It was also a reminder that today’s childhood is vastly different to that of previous generations. The Badger’s grandson considers being on WhatsApp video calls, watching kids’ YouTube videos, and engaging with technology like phones, tablets, and laptops in classroom and home settings to be routine. This wasn’t the case when the Badger was five, nor was it when the youngster’s Millennial parents were that age!

One evening, just before the lad’s bedtime, the Badger was on the sofa engrossed in the news feed on his smartphone. Reports of anxiety that AI is a stock market bubble about to pop had grabbed his attention. Some of these reports (like the one here), but certainly not all, stemmed from an MIT study noting that most AI investments made by companies have so far provided zero returns. This fuelled concerns, which have existed in some quarters for a while, that AI is a stock market bubble soon to crash. Many of the reports drew parallels between AI and the dot.com crash of 25 years ago. As a professional in the IT sector at the time, the Badger experienced the dot.com era and its aftermath first-hand, and so he became absorbed in his own thoughts about the parallels. Until, that is, his grandson jumped on the sofa, prodded the Badger’s ribs, and asked to watch a ‘Pop goes the weasel’ cartoon. Struck by how apt ‘Pop goes the weasel’ was as a label for his AI thoughts, the Badger found a suitable YouTube cartoon and the two of them watched it on his smartphone. (A kids’ punk-music version of the rhyme didn’t seem suitable just before bedtime.)

Once the youngster was in bed, the Badger cogitated further on the dot.com era and AI. The late 1990s saw rapid tech advances, with many investors expecting internet-based companies to succeed simply because the internet was an innovation. Companies launched on stock markets even though they had yet to generate meaningful revenue or profits and had no proprietary technology or finished products. Valuations boomed regardless of dodgy fundamentals, and the dot.com crash was thus, to those with objectivity, inevitable. To an extent, some of the same dynamics exist with AI today. It may be a transformative technology, with the likes of ChatGPT having impressive traction with people, but AI is really still in its infancy, striving to show a return on investment in a company setting. The Badger senses, therefore, that AI is likely in sizeable-correction rather than dot.com-crash territory. This should be no surprise, because the history of tech stock market valuations suggests, to quote the nursery rhyme, ‘that’s the way the money goes. Pop goes the weasel’…

Late payments to subcontractors and suppliers…

Enterprises often hold an annual leadership conference to review the highs, lows, and lessons from the year, and to align their leaders with the business objectives for the year ahead. The Badger first attended such a conference decades ago when all attendees were gathered in the same place for an intense couple of days of formality and informal networking with peers. Enterprises today are increasingly sensitive about the logistical costs and environmental issues associated with gathering people in one place. Many such leadership conferences have thus become more hybrid in nature with smaller, distributed gatherings connected using online video streaming services. This very modern, tech-based approach has many benefits in terms of cost and convenience.

Although the Badger’s first annual leadership conference was a long time ago, he still remembers vividly a particular point made by the company CEO during a presentation lamenting the difficulties of being an IT subcontractor delivering projects into clients’ major programmes. The point was: ‘Being a subcontractor is great, but being the prime contractor controlling when a subcontractor gets paid is much, much, much better!’ For some of its projects, the company had been struggling to get prime contractors to pay valid invoices for achieved milestones within contracted terms. The prime contractors had played all kinds of games to pay their subcontractors and suppliers when it suited them, rather than as written in the agreed contract terms. They knew that, apart from chasing and whining, subcontractors and suppliers were unlikely to take more forthright action because they wanted to avoid lasting damage to the client relationship in case it excluded them from potential future work opportunities.

Since then, UK legislation, the Late Payment of Commercial Debts (Interest) Act 1998, has made provision for interest on late payment under commercial contracts. However, recent information suggests that only 1 in 10 subcontractors/suppliers enforce this right by actively charging interest, claiming compensation, or seeking debt recovery. This suggests that some reluctance remains due to concern about damaging customer relations, especially for smaller businesses, which make up the majority of the UK economy and are often heavily dependent on a small number of clients. It may be decades later, but the CEO’s point noted above remains relevant.
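For context, the statutory remedy’s mechanics are simple enough to sketch. The following is a minimal illustration, assuming the statutory formula of 8% per annum above the Bank of England base rate plus the fixed compensation bands; the figures change over time and should be checked before being relied upon.

```python
# Minimal sketch of the statutory late-payment remedy under the
# Late Payment of Commercial Debts (Interest) Act 1998 (as amended).
# Assumes simple daily interest at 8% per annum above the Bank of
# England base rate, plus the fixed compensation bands; verify
# current rates and take advice before relying on this.

def late_payment_charge(debt: float, days_late: int, base_rate: float) -> dict:
    """Return statutory interest and fixed compensation for a late invoice."""
    statutory_rate = base_rate + 0.08            # 8% above the BoE base rate
    interest = debt * statutory_rate * days_late / 365

    if debt < 1_000:                             # fixed compensation bands
        compensation = 40.0
    elif debt < 10_000:
        compensation = 70.0
    else:
        compensation = 100.0

    return {"interest": round(interest, 2), "compensation": compensation}

# Example: a £25,000 invoice paid 60 days late with the base rate at 5%
print(late_payment_charge(25_000, 60, 0.05))
# {'interest': 534.25, 'compensation': 100.0}
```

A few lines of arithmetic, in other words; the barrier to enforcement is clearly commercial nerve, not calculation.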

Cash flow difficulties can cause liquidity crises, and even collapse, for any size of enterprise. So, when the Badger heard that the UK government is introducing tougher late payment legislation, his first thought was not alleluia, but why hasn’t AI and automation revolutionised payment processing in enterprises to ensure that valid invoices are always paid in full within contracted terms? After all, digital technology has been transforming everything for years, so perhaps this new legislation will add momentum to making a payment revolution happen faster. Let’s hope so. By the way, if you’re interested, you can check how well an enterprise does in paying within terms using the government tool here.

AI and copyright…

Elton John recently had some sharp words to say about the UK government’s plans to exempt AI technology firms from copyright laws. Apparently, there’s currently a game of ping-pong underway between the House of Commons and the House of Lords regarding this plan. Many writers, musicians, and artists are furious about it, and Elton’s comments caused the Badger to scratch his head and ponder. Why? Because, like that of many individuals and bloggers, his website’s content could be plundered by AI without his knowledge or permission, regardless of the copyright statement on its home page. With AI models and tools increasingly mainstream, Elton’s words made the Badger realise that he, and probably many others around the globe, should have copyright more prominent in their thoughts.

Copyright law is complex and, as far as the Badger understands, ‘fair dealing’ or ‘fair use’ allows limited use of copyright material without permission from the copyright owner under specific circumstances. Fair dealing/use is not a blanket permission, and what constitutes this depends on factors such as how much of the material is used, whether its use is justified, and whether it affects the copyright owner’s income. The Badger’s not a lawyer, but he senses that AI and copyright is a legal minefield that will keep experts with digital and legal qualifications in lucrative work for years to come.

As the Badger pondered, he scratched his head again and asked Copilot whether AI uses material held on copyrighted websites. The short response was that it (and other AI) follows strict copyright guidelines and only generates brief summaries of copyrighted material, respecting fair use principles and with pointers to official sources. To test the efficacy of this answer, the Badger asked Copilot for the lyrics of Elton John’s song ‘Candle in the Wind’. Copilot responded with ‘Can’t do that due to copyright’. Typing the same request into the Badger’s browser, however, readily produced the lyrics. Make of that what you will, but it does make you wonder why you would use AI like Copilot for this kind of interaction.

At the heart of Elton John’s point is the long-established principle that if a person or an enterprise wants to use copyrighted material in something that produces a commercial gain for themselves, then the copyright owner should give prior permission and be paid. AI is a disruptive technology, much of it controlled by the same giant US corporations that already dominate the tech world. AI cannot be ignored, but exempting tech firms from copyright law seems wrong on many different levels. The Badger’s concluded that he should improve his understanding of copyright law, and that AI tech firms must not be exempt from such laws. After all, to take a leaf out of President Trump’s playbook: if you want something, you need permission AND you must pay.

Have Millennials benefited from being the first real ‘digital native’ generation?

Millennials are the first ‘digital native’ generation. They’ve grown up with the internet, mobile telephony, and instant information at their fingertips. The digital world of the 1980s and 1990s, when Millennials were born, laid the foundation for today’s advanced capabilities. As Millennials have moved from childhood to adulthood and parenthood, and from education to employment and family responsibilities, they’ve embraced the relentless wave of digital advances and made them part of their life’s ecosystem. A quick recap of the tech in the 1980s and 1990s illustrates the scale of the digital revolution they have embraced.

The 1980s saw the rise of PCs like the IBM PC, Apple II, and Commodore 64, all with limited processing power. MS-DOS was the popular operating system, hard drives held a few tens of megabytes, and 1.44MB floppy discs were the common removable storage medium. Software had largely text-based user interfaces, and WordPerfect and Lotus 1-2-3 dominated word processing and spreadsheets respectively. Local Area Networks (LANs) started appearing to connect computers within an organisation, and modems provided dial-up connectivity at a maximum rate of 2400 bits/second.
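To put those speeds in perspective, here’s a back-of-the-envelope calculation, assuming the nominal 1.44MB floppy capacity and ignoring protocol overhead (which would make things slower still):

```python
# Back-of-the-envelope: time to transfer one floppy disc's contents
# over a 2400 bits/second modem (nominal capacity, no protocol overhead).
floppy_bytes = 1_440_000   # nominal '1.44MB' floppy capacity
modem_bps = 2_400          # a fast late-1980s dial-up modem

seconds = floppy_bytes * 8 / modem_bps
print(f"{seconds / 60:.0f} minutes")   # 80 minutes for a single floppy
```

Eighty minutes to move one floppy’s worth of data gives a sense of just how far connectivity has come.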

The 1990s saw PCs become more powerful. Windows became a popular operating system, making software more user-friendly and feature-rich, and Microsoft Office gained traction. CD-ROMs arrived, providing 700MB of storage to replace floppy discs, hard drive capacities expanded to several gigabytes, and gaming and multimedia capabilities revolutionised entertainment. Ethernet became standard, computer networks expanded, the World Wide Web, email, and search engines gained traction, and mobile phones and Personal Digital Assistants (PDAs) like the Palm Pilot emerged.

Today everything’s exponentially more functionally rich, powerful, and globally connected, with lightning-fast fibre-optic and 5G internet connectivity. Cloud computing provides scalable convenience, computing devices are smaller, data storage comes with capacities in the terabyte and petabyte range, and social media, global video conferencing, high-definition multimedia, and gaming are standard. Smartphones are universal, fit in your pocket, combine the functions of many devices into one, and have processing power that far exceeds that of the computers that filled entire rooms in the 1980s and 90s.

But has the Millennial generation benefited from being the first real ‘digital native’ generation? Yes, and no. This generation has faced significant hurdles affecting its earning potential, wealth accumulation, and career opportunities. Student loan debt, rising housing costs, rising taxes, the 2008 global financial crisis and its aftermath, the COVID-19 pandemic, soaring energy costs, and now perhaps tariff wars are just some of these hurdles. When chatting recently to a group of Millennials asserting that their generation’s woes were caused by technology, the Badger pointed out, first, that these hurdles were not the fault of technology but of ‘people in powerful positions’, and secondly, that they should note Forrest Gump’s line ‘Life is like a box of chocolates; you never know what you’re gonna get’. Whatever you get, you have to deal with it…and that’s the same for every generation.

AI – A Golden Age or a new Dark Age?

The Badger’s experimented with Microsoft’s Copilot for a while now, sometimes impressed, but often irritated when the tool ends its answer by asking for the user’s opinion on the question’s underlying topic. For example, the Badger asked Copilot ‘When will self-driving cars be the majority of vehicles in the UK?’ Copilot’s answer was sensible and distilled from quoted sources, but it ended with ‘What are your thoughts on self-driving cars? Do you think they’ll revolutionize transportation?’. The Badger wanted an answer to his question, not a conversation that will capture, store, and use his opinion for the tool’s own purposes. Responding with ‘None of your business’ gets the reply ‘Got it! If you have any other questions or need assistance with something else, feel free to ask. I’m here to help’. That last phrase should be supplemented with ‘and make money!’

Overall, his experimentation has made him wonder whether AI is leading humanity to a new Golden Age, or a new Dark Age. So, which is it? AI and Machine Intelligence advocates, giant businesses investing huge amounts of money in the technology, and even governments with a ‘fear of missing out’ are quick to say it’s the former. The Nobel Laureate Geoffrey Hinton, the godfather of AI, isn’t so sure. He articulates the risks well, and he’s highlighted that AI eventually wiping out humanity isn’t inconceivable. Listening to him interviewed recently on the Today programme, BBC Radio 4’s flagship news and current affairs programme, struck a chord. It made the Badger realise that such concerns are valid, and that a Dark Age is a possibility.

So where does the Badger stand on the Golden or Dark Age question? Well, the last 25 years have made us believe tech-driven change is a good thing, but that premise should be challenged. New technology may drive change, but it doesn’t necessarily drive progress, because it’s politics that really determines whether change makes people better off overall. Politicians, however, have struggled woefully to deal with tech-driven change and the new problems it’s created for society so far this century. There’s little sign this is changing for AI. Humans are fallible and can make poor judgements, but if we become reliant on AI to make choices for us, then there’s a real danger that our confidence and capacity to make our own independent decisions will be lost.

The Badger’s answer is thus nuanced. A Golden Age will unfold in areas where AI is a tool providing a tangible benefit under direct human control, but if AI is allowed to become completely autonomous and more intelligent than humans, then a Dark Age is inevitable. Why? Because things with greater overall intelligence always control things of lower overall intelligence. Can you think of an example where the reverse is true?

The NHS doesn’t engage or communicate with patients on waiting lists at all…

Statistics show that more than 80% of the UK population engages in online shopping, an impressive number given Amazon et al only launched in the 1990s. The Badger uses Amazon, amongst others, because the ‘customer journey’, from choosing goods, through payment, to delivery, is straightforward, reliable, and comes with informative tracking of the goods along the way. This ‘customer journey’ is founded on solid, integrated IT designed to engage and communicate with the customer throughout the whole process. Good, proactive interaction with customers is a norm in today’s online world, which means that any public-facing service without it sticks out like a sore thumb!

Last week the Badger visited a neighbour, a statistician long retired from the UK Civil Service, who’d recently had a fall in the street. His wife invited the Badger round for coffee and a chat to lift her husband’s spirits. The coffee was good, the conversation lively, and the husband’s spirits were indeed lifted! Given his Civil Service career, government and the NHS inevitably came up in the conversation. At one point the Badger laughed when the statistician asserted that ‘All governments are somewhere on the incompetency spectrum’. He was forthright about the NHS too, saying ‘Unlike Amazon with its customers, the NHS doesn’t engage and communicate with patients on waiting lists at all’.

What triggered this remark was the fact that an NHS hospital consultant had told him a year ago not only that he needed an operation, but also that its clinical priority meant it would happen within 2 to 3 months. After 3 months had elapsed with no communication from the hospital, the statistician called to enquire what was happening, only to be told he was on the waiting list and would hear something soon. After another 3 months of no contact, he enquired again and got the same response. A year has now passed and there’s been no proactive communication from the hospital at all. Understandably, his trust in the NHS has almost completely evaporated. It wasn’t surprising, therefore, that the conversation concluded that government should get Amazon to implement proper, ‘customer journey’-like, 21st century ‘patient journey’ engagement, IT, and waiting list communication practices for the NHS. Radical? Wacky? Maybe, but the status quo isn’t working. Not proactively communicating with patients who’ve been on waiting lists for months sticks out like a sore thumb as being behind the times, and is plain wrong!

The IT for the online shopping ‘customer journey’ is well established, so surely its principles and mechanisms can be adapted to proactively keep patients informed during their ‘patient journey’? The government’s consulting on NHS changes, but can it cut through NHS vested interests? It has to, because there’s a mountain of waiting-list patients who’ve already lost confidence that this complex 20th century supertanker will ever be truly fit for the 21st century…
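To make the suggestion concrete, here’s a minimal sketch of the idea: treat each patient’s journey like a tracked parcel, with explicit states, a notification on every state change, and, crucially, a ‘heartbeat’ message when too long passes with no news. The states, threshold, and notify() stub below are illustrative assumptions, not a description of any real NHS or Amazon system.

```python
# Illustrative sketch: order-tracking-style 'patient journey' updates.
# States, threshold, and the notify() stub are hypothetical; a real
# system would need clinical and data-protection input far beyond this.
from dataclasses import dataclass, field
from datetime import date, timedelta

STATES = ["Referred", "OnWaitingList", "DateOffered", "Scheduled", "Treated"]
MAX_SILENCE = timedelta(days=28)   # never leave a patient uninformed longer

@dataclass
class PatientJourney:
    patient_id: str
    state: str = "Referred"
    last_contact: date = field(default_factory=date.today)

    def advance(self, new_state: str) -> None:
        """Move to a new state and tell the patient immediately."""
        assert new_state in STATES
        self.state = new_state
        self.notify(f"Your status is now: {new_state}")

    def heartbeat(self, today: date) -> None:
        """Even with no news, confirm the patient hasn't been forgotten."""
        if today - self.last_contact > MAX_SILENCE:
            self.notify(f"No change yet; you are still: {self.state}")

    def notify(self, message: str) -> None:
        print(f"[{self.patient_id}] {message}")   # stand-in for SMS/email/app
        self.last_contact = date.today()
```

The heartbeat is the point: the statistician’s complaint was not the wait itself but the silence, and silence is exactly what a tracking system treats as a reportable event.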

The NHS; a super-sized jumbo jet flying with only one engine…

There’s one thing currently dominating the chatter of many people the Badger encounters, and that’s the UK Budget on the 30th October. ‘How is it right for me to pay more tax for politicians to fritter away, when the Prime Minister doesn’t buy his own clothes or glasses?’ one pensioner commented. The Badger tries to maintain political neutrality, but there’s little doubt that the new UK government has got off to a bumpy start. However, it’s now starting to flesh out its ‘Change’ agenda and to set expectations for the budget. On the former, for example, the government is calling on the nation to ‘help fix our NHS’. As reported in many places, e.g. here, it wants people to share their experiences and ideas, given that we are all users of this huge institution employing more than 1.34 million people. The Badger, having had some exposure to NHS IT during his career and as a patient, has thus contributed to ‘help build a health service fit for the future’ via the government’s website here.

The NHS has been a political football for decades. There’s a regular clamour to give it more money. When it gets additional money, however, it never seems to make an impact, other than to fuel clamour for even more funds – at least that’s how it seems to the Badger. The NHS’s use of modern, integrated IT is woeful, as neatly illustrated by this New Statesman article in March. By IT, the Badger means the systems that support basic operational processes within and across the NHS’s entities, not the diagnostic and robotic tools that get airtime in the media.

People often tell the Badger of their frustrating NHS experiences, most of which involve aspects where IT plays a part. For example, an NHS phlebotomist bemoaned needing 13 different logons and passwords to deal with blood tests. A relation was appalled to receive a letter confirming a hospital appointment with Audiology when it should have been with Cardiology! A neighbour was dismayed when a consultant at a post-operative outpatient appointment told them a CT scan couldn’t be found ‘on the system’, even though the scan had happened 6 weeks previously at the same hospital. A pensioner, referred from a local hospital for urgent follow-up at a regional hospital, enquired after hearing nothing for 2 months, only to be told that ‘there’s no record on our system’ of the referral. The list of similar experiences is long.

Building a ‘health service fit for the future’ is like modernising every aspect of an aging, super-sized, jumbo jet while it’s flying with only one temperamental engine. Few government transformation programmes deliver real change to time and budget, but this one must break the mould, or the jumbo will soon spectacularly crash. That’s why the Badger has not only contributed on the website here, but also urges you to do the same regardless of your political views.

The price for being a Digital Citizen…

The vast majority of people are now ‘digital citizens’. There are many definitions of what being a digital citizen means, but the Badger thinks the term simply describes anyone who regularly uses the internet, online services, and IT to engage with society, business, work, politics, and government. Becoming a digital citizen, in the Badger’s view, starts when an individual acquires an email address and then shares information online, uses e-commerce to buy merchandise, uses any other online service, or simply browses the internet. Everyone reading this is a digital citizen.

The Badger’s been a digital citizen for more years than he cares to admit, but over the last decade he’s become circumspect and increasingly alarmed by the deterioration in the responsible use of online technology and the internet by individuals and organisations. Yesterday the Badger helped an elderly neighbour carry their shopping bags the last few metres to their doorstep and was invited in for a quick cup of tea as a thank you. During the ensuing conversation, the Badger’s neighbour, a sharp 85-year-old ex-civil servant, mentioned they were a digital agnostic who strongly believed that the digital world has produced a surveillance society. They have a point, especially when you consider the following.

Supermarkets know what we purchase, and when, from our online transactions and use of debit and loyalty cards. They use this data for their business and to market products to us via, for example, voucher and loyalty point schemes. They don’t sell the data to others, but its theft by bad actors via security breaches can never be ruled out. The same is true for other retail companies. And then there are the online giants like Amazon, Google, and Meta et al, who capture so much data about our interests, behaviours, and habits that they often know more about a person than the person knows about themselves. All of this, coupled with the fact that energy, transport, banking, and central and local government functions are now also ‘online first’, reinforces the fact that the data describing our personal lives is in the digital ether and can be used for purposes invisible to most people.

Put this together with the fact that the UK has one of the highest densities of CCTV cameras amongst Western countries, approximately 1 camera for every 13 people, and it’s difficult to deny that the digital world has produced a surveillance society in barely 25 years. The price for being a digital citizen is thus personal acceptance of more and more surveillance. But here’s an interesting thought to end this musing with. Digital citizens are not just victims of surveillance, they are perpetrators too! Anyone who has checked out others using social media or internet searches has essentially engaged in surveillance. The digital world has thus made us all spies…

In a world of complication, simplicity is best…

The need to replace a broken light switch this week made the Badger think about how the march of digital technology produces a world of complication for the average person. Visiting a local electrical store for a new switch, one that simply turns the lights on or off when you press its rocker, led to a discussion with the store owner, a friend of a friend. They asked if the Badger wanted a ‘normal’ switch or a ‘smart’ one. The Badger said the former. The shop was quiet, so they chatted.

Knowing the Badger’s IT background, the owner expressed surprise that he didn’t want a ‘smart’ switch that controlled lights using a smartphone app. The owner isn’t actually a fan of ‘smart’ lighting products for the home, but they sell them because they’re a highly profitable product line, with Millennials apparently the main customers, although sales had dropped recently. The Badger said a conventional switch served his need because it was simple, performed its primary function well (turning a light on or off), and was devoid of complications like needing a smartphone, an app, or a Wi-Fi network, or worrying about data security. The owner chuckled and called the Badger a dinosaur! ‘You won’t be buying one of my ‘smart’ fridges or washing machines then?’ they asked, waving towards models in the store. They knew the answer.

A discussion on the pros and cons of ‘smart’ fridges and washing machines ensued. The owner believes that most customers for these items never use their digital and network features to the full. Most, they asserted, just use the standard functions found on more traditional, cheaper models. The two agreed that competition between manufacturers to add more ‘smart home’ capabilities to their products meant they’ve become packed with features that make the units more complicated for the average person to use. What’s wrong with a simple-to-use fridge or washing machine that just concentrates on its fundamental purpose at a sensible price? Nothing, they concluded, as the conversation drew to an end with an influx of new customers.

Since the 1980s, when the information technology landscape we have today didn’t exist, a host of technological and societal changes have occurred. Computational power, the internet, the digitisation of data, systems that interact independently, and new business models have had a massive impact, and many people still struggle with the changes and complications to their daily lives. Technology will complicate daily life for the foreseeable future, but people are beginning to shun technology for the simplicity of traditional and familiar things that work and have done so for years. Do you really need to be able to talk to your fridge and washing machine? Just because modern technology means you can, doesn’t mean you should…

The Law of Unintended Consequences…

If you’ve a couple of minutes spare then read the item here. It was published in 2013, and what’s striking is that the exact same words could be used if it had been written today! A 2010 item, ‘Technology: The law of unintended consequences’, by the same author also stands the test of time. Reading both has caused the Badger to muse on unintended consequences, especially those that have emerged from the digital and online world over the last few decades.

The ‘Law of Unintended Consequences’ is real and is, in essence, quite simple. It declares that every action by a person, company, or government can have positive or negative consequences that are unforeseen. An amusing manifestation of the law in action happened in 2016 when a UK Government agency conducted an online poll for the public to name the agency’s latest polar research ship. The public’s choice, Boaty McBoatface, wasn’t the kind of name the agency anticipated!

One characteristic of unintended consequences is that they tend to emerge over a long period. The internet and social media illustrate this neatly. Both have changed the behaviour of people (especially the young), companies, and governments, and both have challenged safety, security, and privacy like never before. Indeed, the Australian government’s recent decision to ban those under 16 years old from social media demonstrates just how long it’s taken to address some of social media’s unintended consequences since its advent a couple of decades ago.

During his IT career, the Badger participated in delivering the many benefits of digital and online technology to society, but now, more mindful of unintended consequences, he wonders if a future dominated by virtuality, AI, and colossal tech corporations is a good thing for his grandson’s generation. After all, the online and digital world is not where real, biological, life takes place, and there’s more to life than being a slave to our devices.

The ‘Law of Unintended Consequences’ can never be ignored. Although a professional and disciplined approach to progress always reduces the scope for unintended consequences, the fact is these will happen. This means, for example, that there’ll be unintended consequences from the likes of AI, driverless vehicles, and robots at home, and that, in practice, it will take years for these unintended consequences to emerge properly. But emerge they will!

Looking back over recent decades, it’s clear that digital and online technology has delivered benefits. It’s also clear that it’s brought complication, downsides, and unintended consequences to the lives of people in all age groups. The Badger’s concluded that we need a law that captures the relationship between progress, unintended consequences, and real life. So, here’s Badger’s Law: ‘Progress always produces unintended consequences that complicate and compromise the real life of people’. Gosh, it’s astonishing where articles penned over a decade ago can take your thoughts…