A vintage Fortran source code listing…

The Badger found an old paper Fortran source code listing, in good condition considering its age, at the back of a cupboard this week. It triggered memories of his programming activities early in his IT career. It also caused him to reflect on the changes in IT, and in the way we live and work, that have resulted from the tremendous advances in digital technology over the last 40 years. As illustrated below, this period has been one of continuous, rapid change.

In the 1980s, personal computers began to make their way into businesses and homes. The likes of IBM, Apple, and Microsoft introduced devices that revolutionized how people accessed information and performed tasks. The introduction of graphical user interfaces (GUIs) also made computers more user-friendly, enabling a broader audience to embrace technology. The 1990s brought the birth and expansion of the internet, drastically changing communication, commerce, and entertainment. It brought a new level of connectivity and made information accessible globally at the click of a button. E-commerce giants like Amazon and eBay emerged, transforming the retail landscape and giving rise to online shopping.

The 2000s saw the rise of the mobile revolution. With the introduction of smartphones and tablets, technology became ever more integrated into our work and personal lives. Apple’s iPhone and Google’s Android led the charge, creating app-driven ecosystems that allowed users to perform a myriad of tasks on the go. Mobile internet access became ubiquitous, fostering a new era of social media, instant messaging, and mobile gaming. In the 2010s, cloud computing from Amazon Web Services (AWS), Microsoft Azure, and Google Cloud brought scalable, on-demand computing resources. This facilitated the rise of Software as a Service (SaaS) models, which enable access to software applications via the internet and help businesses to reduce infrastructure costs and improve scalability.

In recent years, ‘Big Data’ has meant that organizations can leverage vast amounts of data to gain customer insights, optimize their operations, and make data-driven decisions. AI technologies such as machine learning, natural language processing, and computer vision are also rapidly being integrated into applications, from healthcare and finance to autonomous vehicles and smart home devices. In addition, the COVID-19 pandemic accelerated the adoption of remote working and digital collaboration tools, and video conferencing platforms like Zoom and Microsoft Teams have become essential communication and productivity tools.

Anyone working in the IT world over this period has had an exciting time! The Fortran listing reminded the Badger that it was produced when programming was a very human, hand-crafted activity. Source code today is produced differently, and AI will dominate programming in the future. The Badger’s career, spanning all these changes, was challenging, exciting, and creative, and one where dynamism, innovation, teamwork, hard work, and a ‘can do’ mentality were embedded workforce traits. Is that the case today? It has to be in a future dominated by AI.
