Landmark moments in the history of the IT industry

The IT industry is ever evolving, and it has witnessed some remarkable innovations and transformative milestones over its relatively short lifetime. From the visionary polymath Charles Babbage dreaming of the first computer to Steve Jobs’ introduction of the first iPhone, it’s clear the IT industry has transformed our society, changing the way we live, work and communicate with each other. So, what are some of the biggest moments in the IT industry’s history, and how did they impact the world as we know it? We explore below.

What do IT professionals do today?

IT professionals are indispensable in today’s technological world, responsible for the complex and foundational systems we rely on in our personal lives and at work. From software development to network administration, cybersecurity and data analysis, there are a variety of subfields that IT professionals can specialise in.
Some of their common responsibilities include:

  • Troubleshooting
  • Network administration
  • System administration
  • Software development
  • Cybersecurity
  • Database management
  • Data analysis
  • Project management

1. The invention of computers

Charles Babbage, a brilliant English mathematician and inventor, is often seen as the father of modern computers. In the 19th century, Babbage conceptualised a mechanical device that could solve complex calculations. Though it wasn’t fully built in his lifetime, this device (which he called the ‘Analytical Engine’) laid the groundwork for modern computers. 

Major advancements came later, especially during World War II, when computer development accelerated with British scientists, most famously Alan Turing, at the forefront. These scientists were desperately trying to decipher encrypted German messages that used the infamous Enigma code. To crack it, Turing and his team designed and built early versions of the computers we know today. These machines used advanced algorithms to decrypt the German messages, giving the Allies a massive advantage during the war.
While not the only innovators in the field, the combination of Babbage's ideas with Turing’s innovations largely ushered in the era of information technology.

2. The introduction of personal computers

Personal computers changed the way we live and work, offering entertainment at home and efficiency in the office. Before personal computers like those offered by Apple and Microsoft became available, computers were used almost exclusively by large companies and government agencies, the only organisations with access to them.

The introduction of the IBM Personal Computer in 1981 was a landmark moment not only for the IT industry but for society at large, because for the first time the power of computing was available to individuals and businesses. Computing technology was now widely accessible, and the IT landscape was revolutionised, with innovation fuelling its development. Individuals and organisations could integrate computers into their processes and personal lives more easily, and at a lower cost. They also had the freedom to develop the technology and find new use cases and functions.

3. The internet

The invention of the internet is one of the greatest innovations in history, and without it the IT industry would not exist as it does today. The internet began as a government tool for researchers to share information. During the Cold War, researchers needed access to computers, which were scarce and often required long-distance travel to reach. The internet solved this problem, allowing remote access and information sharing.

January 1, 1983, is considered the birthday of the internet: the day the communications protocol suite Transmission Control Protocol/Internet Protocol (TCP/IP) was adopted as the standard by ARPANET (an experimental computer network and forerunner of the internet) and the Defense Data Network. Tim Berners-Lee’s creation of the World Wide Web in 1989 then ushered in the era of the internet as we know it today, enabling global connectivity and transforming how people access and share information.
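
That same protocol suite still carries virtually all internet traffic today. As a small illustration, here is a minimal sketch in Python, using only the standard library’s socket module, that opens a TCP connection to a web server (the hostname is just a placeholder):

    # Minimal sketch: opening a TCP connection with Python's standard
    # socket library. The same TCP/IP suite standardised on ARPANET in
    # 1983 still underpins this exchange today. Hostname is a placeholder.
    import socket

    with socket.create_connection(("example.com", 80)) as conn:
        # Send a bare-bones HTTP request over the TCP connection.
        conn.sendall(b"GET / HTTP/1.1\r\nHost: example.com\r\n"
                     b"Connection: close\r\n\r\n")
        reply = conn.recv(4096)  # read the first chunk of the reply

    print(reply.decode("utf-8", errors="replace")[:200])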

4. Smartphones

Cell phones were already an innovation in their own right – and then computers were added to them! And then cameras. Smartphones combine communication, the internet, computing and photography. Smartphone development isn’t front-of-mind when picturing IT professionals, but it is still a large part of the overall industry.

The smartphone was invented only 19 years after the first cell phone (conceived in 1973), thanks to innovations made by computer hardware manufacturer IBM. The most groundbreaking smartphone, the one that made the biggest impact and that many people associate with the beginning of the smartphone era, was the iPhone. Steve Jobs himself described the iPhone as a “revolutionary and magical product that is literally five years ahead of any other mobile phone” – and he wasn’t wrong.

The first iPhone was groundbreaking for several reasons, including:

  • A touchscreen with multi-touch technology
  • Apps and an app ecosystem
  • Full internet accessibility
  • Multiple features integrated in a single device (mobile phone, music player, internet communication device, camera)
  • Regular software updates

Today, smartphones have become an essential part of life for most people. According to Statista, around 23 million Australians will have a smartphone by 2026. They’re responsible for kickstarting countless new businesses and innovations, and we rely on them for everything from banking to dating to home security.

5. Cloud computing

Cloud computing is a technology that allows individuals and organisations to access and use servers, storage, databases, networking, software and more over the internet. Instead of buying and maintaining physical hardware, users can rely on cloud service providers to deliver these resources as a service.

There are several benefits to cloud computing, including its scalability, cost-efficiency, flexibility and accessibility from anywhere with an internet connection. Businesses and individuals no longer need their own infrastructure to host websites and applications, run data analytics or handle machine learning workloads – the cloud provider supplies it on demand.
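
To make this concrete, here is a minimal sketch of what “storage as a service” looks like in practice, assuming an AWS account with credentials already configured and the boto3 library installed; the bucket and file names are hypothetical:

    # Minimal sketch: using cloud storage (AWS S3) via the boto3 library.
    # Assumes AWS credentials are configured in the environment; the
    # bucket and file names below are hypothetical.
    import boto3

    s3 = boto3.client("s3")

    # Upload a local file to the cloud - no servers or disks to maintain.
    s3.upload_file("quarterly_report.pdf", "example-company-files",
                   "reports/quarterly_report.pdf")

    # Anyone with access can retrieve it from anywhere with an internet
    # connection.
    s3.download_file("example-company-files",
                     "reports/quarterly_report.pdf", "local_copy.pdf")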

There are a few drawbacks, however. For instance, there are usually ongoing fees associated with cloud storage. There is also the risk of hackers, and of privacy breaches by the companies that operate the cloud services.

Cloud computing has become foundational in modern IT infrastructure, allowing businesses to scale, innovate and operate more efficiently in a competitive digital environment.

6. Artificial intelligence

Artificial intelligence is the latest and greatest development in the IT industry – it seems we can’t turn our heads without encountering a new AI product. Canva, Adobe, Snapchat, even Bing – all have implemented AI into their existing products. The great hope of AI is the reduction of laborious work. Before AI was as ubiquitous as it is today, it was thought that monotonous jobs would be the first to be replaced by its development. However, it came as a surprise to many when ChatGPT and AI image generators like DALL-E 3 were released and creative skills were suddenly threatened. The world changed almost overnight for writers, artists, musicians, graphic designers and other creators, who had long been assumed to be the last professionals to be threatened by machine intelligence.
