Software – The invisible infrastructure of daily life

We’re living in a world where software is the invisible infrastructure for daily life. There doesn’t appear to be any quantitative measure of what proportion of everything we use actually contains software, but it’s not outrageous to assert that almost everything we use depends on it. Whatever the real proportion is, with the advent of AI and the Internet of Things (IoT) it’s only climbing. Software has blossomed from primitive beginnings to become crucial in every facet of life in less than 75 years. In the 1950s software wasn’t a ‘product’ but something bundled with hardware, used primarily for scientific calculations, military simulations, and data processing. Most of it was written in machine code or early assembly languages tailored to specific hardware, and it was written by the same engineers who built the hardware. A large program was a few thousand lines of code loaded manually via cards or magnetic tape. FORTRAN only emerged in the mid-1950s, and the term ‘software engineering’ was only coined in the late 1960s.

How things have changed. Today’s internet, search tools, social media, cars, medical devices, satellites, aircraft, trains, weapons, smartphones and so on depend on many, many millions of lines of code. Modern health, financial, energy, government, intelligence, and defence capabilities all rely on huge amounts of software. Indeed, any item in our homes that can sync with a smartphone app contains software. In less than 75 years software has changed life, taken over the world, and become professionalised in the way it’s produced. Writing machine code for specific hardware in a way that ‘every byte counts’ has evolved into the professional, ever-changing and improving, discipline of software engineering, which incorporates design and development processes, methods, standards, tools and techniques to ensure that production software meets requirements and is scalable, tested, robust, maintainable, secure, and performant. Software engineering, of which coding is today just a subset, continues to evolve, with the likes of Microsoft and Google, for example, anticipating that AI will render the hand-crafting of code redundant by the end of this decade.

The software woven into our lives in recent decades has brought immense convenience and transformed communication, public services, business, and conflict, as world events illustrate. Today it’s undeniably critical infrastructure, but software, unlike most infrastructure, isn’t something you can tangibly touch! Compared to what existed 75 years ago, the software in use today is a sprawling metropolis riddled with vulnerabilities that bad actors can exploit to cause disruption. Are we safer today than in the 1950s now that everything depends on software? Hmm, that’s debatable, but what the Badger senses is that even with the advent of AI, software engineering as a discipline, and the career prospects for software engineers focused on solutions, security, quality, robustness, and testing rather than coding, are good for many, many years yet…

Once upon a time there was the Strategic Defense Initiative (Star Wars)…

There comes a time when a room at home needs a decorative refresh. That time recently came in the Badger household, and so he deployed his practical skills to refurbish the room himself. The project was planned, agreed with an important stakeholder (the wife), and fastidiously executed. The room’s now in the post-delivery phase, with the small list of defects pointed out at acceptance by the important stakeholder now corrected. Painting walls while listening to good music on the radio proved a more satisfying experience than expected. On finishing one wall, and while stepping back to admire his handiwork, the Badger found himself listening to the broadcaster’s regular news bulletin and sighing deeply on hearing that President Trump had unveiled plans for a ~$175 billion US ‘Golden Dome’ missile defence system. Memories of President Reagan’s 1983 Strategic Defense Initiative (SDI) came flooding back.

The goal of SDI was to develop a system that could intercept and destroy incoming nuclear missiles, effectively shielding the USA from a potential Soviet attack during the Cold War. Many dubbed it ‘Star Wars’ because of its proposed use of space-based technology. At the time, the Badger was working on the software design and development of a Relational Database Management System (RDBMS) product – pretty cutting edge back then. He remembers thinking that SDI would never come to fruition. Indeed, SDI itself was never fully realised, but its ideas have shaped military technology and policies in missile and space-based defence, cybersecurity strategy, and international collaboration ever since.

Rolling forward 40 years, the world is a quite different place geopolitically, technologically, economically, and militarily. Daily civilian and military life now depends on digital capabilities that didn’t exist in 1983, and continued rapid tech advances, innovation and AI are changing both domains at a rate never imagined just a few decades ago. Reagan’s SDI initiative and President Trump’s ‘Golden Dome’ share some similarities, but whilst the available tech in 1983 meant the former’s space-based missile defence was largely theoretical, President Trump’s scheme benefits from modern, real, sophisticated satellite, space, sensor, and missile technologies. ‘Golden Dome’ revives elements of SDI, but it also suffers from some of the same challenges, particularly around cost, scepticism about its effectiveness, and concern that it dramatically escalates the global arms race. It’s certain, however, that just as happened when SDI was announced in 1983, military and tech-sector commercial organisations will be relishing the prospect of picking up ‘Golden Dome’ contracts regardless of whether its stated ambitions ever fully come to fruition.

But why did the Badger sigh so deeply on hearing about ‘Golden Dome’ on the radio? It was simply an instant reaction to the feeling that it’s another step on the road to creating the Terminator film’s Skynet system for real, and that our species seems intent on a path that can lead to eventual self-inflicted extinction.

AI and copyright…

Elton John recently had some sharp words to say about the UK government’s plans to exempt AI technology firms from copyright laws. Apparently, there’s currently a game of ping-pong underway between the House of Commons and the House of Lords regarding this plan. Many writers, musicians, and artists are furious about it, and Elton’s comments caused the Badger to scratch his head and ponder. Why? Because, like many individuals and bloggers, his website’s content could be plundered by AI without his knowledge or permission, regardless of the copyright statement on its home page. With AI models and tools increasingly mainstream, Elton’s words made the Badger realise that he, and probably many others around the globe, should have copyright more prominently in their thoughts.

Copyright law is complex and, as far as the Badger understands, ‘fair dealing’ or ‘fair use’ allows limited use of copyright material without permission from the copyright owner under specific circumstances. Fair dealing/use is not a blanket permission, and what qualifies depends on factors such as how much of the material is used, whether its use is justified, and whether it affects the copyright owner’s income. The Badger’s not a lawyer, but he senses that AI and copyright is a legal minefield that will keep experts with digital and legal qualifications in lucrative work for years to come.

As the Badger pondered, he scratched his head again and then asked Copilot if AI used material held on copyrighted websites. The short response was that it (and other AI) follows strict copyright guidelines, only generates brief summaries of copyrighted material respecting fair use principles, and provides pointers to official sources. To test the efficacy of the answer, the Badger asked Copilot for the lyrics of Elton John’s song ‘Candle in the Wind’. Copilot responded with ‘Can’t do that due to copyright’. Typing the same request into the Badger’s browser, however, readily produced the lyrics. Make of that what you will, but it does make you wonder why you would need to use AI like Copilot for this kind of interaction.

At the heart of Elton John’s point is the long-established principle that if someone or an enterprise wants to use copyrighted material in something that produces a commercial gain for themselves, then the copyright owner should give prior permission and be paid. AI is a disruptive technology, much of it controlled by the same giant US corporations that already dominate the tech world. AI cannot be ignored, but exempting tech firms from copyright law seems wrong on many different levels. The Badger’s concluded that he should improve his understanding of copyright law, and that AI tech firms must not be exempt from such laws. After all, to take a leaf out of President Trump’s playbook: if you want something, you need permission AND you must pay.

A vintage Fortran source code listing…

The Badger found an old paper Fortran source code listing, in good condition considering its age, at the back of a cupboard this week. It triggered memories of his programming activities early in his IT career. It also caused him to reflect on how much IT, and the way we live and work, have changed as a result of the tremendous advances in digital technology over the last 40 years. As illustrated below, this period has been one of continuous, rapid change.

In the 1980s, personal computers began to make their way into businesses and homes. The likes of IBM, Apple, and Microsoft introduced devices that revolutionised how people accessed information and performed tasks. The introduction of graphical user interfaces (GUIs) also made computers more user-friendly, enabling a broader audience to embrace technology. The 1990s brought the birth and expansion of the internet, drastically changing communication, commerce, and entertainment. It brought a new level of connectivity and made information accessible globally at the click of a button. E-commerce giants like Amazon and eBay emerged, transforming the retail landscape and giving rise to online shopping.

The 2000s saw the rise of the mobile revolution. With the introduction of smartphones and tablets, technology became ever more integrated into our work and personal lives. Apple’s iPhone and Google’s Android led the charge, creating app-driven ecosystems that allowed users to perform a myriad of tasks on the go. Mobile internet access became ubiquitous, fostering a new era of social media, instant messaging, and mobile gaming. In the 2010s, cloud computing with Amazon Web Services (AWS), Microsoft Azure, and Google Cloud brought scalable, on-demand computing resources. This facilitated the rise of Software as a Service (SaaS) models, which enable access to software applications via the internet and help businesses to reduce infrastructure costs and improve scalability.

In recent years, ‘Big Data’ has meant that organisations can leverage vast amounts of data to gain customer insights, optimise their operations, and make data-driven decisions. AI technologies such as machine learning, natural language processing, and computer vision are also rapidly being integrated into applications from healthcare and finance to autonomous vehicles and smart home devices. In addition, the COVID-19 pandemic accelerated the adoption of remote working and digital collaboration tools, and video conferencing platforms like Zoom and Microsoft Teams have become essential communication and productivity tools.

Anyone working in the IT world over this period has had an exciting time! The Fortran listing reminded the Badger that it was produced when programming was a very human, hand-crafted activity. Source code today is produced differently, and AI will dominate programming in the future. The Badger’s career, spanning all these changes, was challenging, exciting, creative, and one where dynamism, innovation, teamwork, hard work, and a ‘can do’ mentality were embedded workforce traits. Is that the case today? It has to be in a future dominated by AI.