From ancient counting tools to artificial general intelligence
3000 BCE – 1450 CE — Humans develop writing, numeral systems, and the first calculating aids
Originating in Mesopotamia, the abacus is the earliest known computing device. Using beads on rods, it enabled merchants and scholars to perform arithmetic — addition, subtraction, multiplication, and division — millennia before electronic calculators.
The Sumerians developed cuneiform, one of the earliest writing systems. This enabled the first "information storage" — recording transactions, laws, and knowledge on clay tablets, making information persistent and transferable.
The Antikythera mechanism, an ancient Greek analog computer, was used to predict astronomical positions and eclipses decades in advance. With 30+ meshing bronze gears, it is the most mechanically sophisticated device known from the ancient world.
The invention of paper in Han Dynasty China revolutionised information storage. Lighter and cheaper than clay, bamboo, or parchment, paper enabled widespread record-keeping, scholarship, and eventually printing.
1450 – 1840 — Printing, early calculators, and the theoretical foundations of programmable machines
Johannes Gutenberg's movable-type printing press is often called the most important invention of the second millennium. It made mass production of books possible, slashing their cost and igniting the Renaissance and Reformation.
Blaise Pascal, aged just 19, built the Pascaline — a mechanical calculator that could add and subtract. It used interlocking gears and a carry mechanism, establishing principles still found in odometers today.
Gottfried Wilhelm Leibniz extended Pascal's work with a machine capable of all four arithmetic operations. He also pioneered the binary number system, the very foundation of all modern digital computing.
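As a minimal sketch of the positional idea Leibniz described — any whole number written with only the symbols 0 and 1 — consider this short Python illustration (the function name is ours):

```python
# Positional binary notation: repeated division by 2 yields the bits,
# lowest first.
def to_binary(n: int) -> str:
    if n == 0:
        return "0"
    bits = []
    while n:
        bits.append(str(n % 2))   # remainder gives the lowest bit
        n //= 2
    return "".join(reversed(bits))

print(to_binary(13))  # '1101' = 8 + 4 + 0 + 1
```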
Joseph-Marie Jacquard's loom used punched cards to control weaving patterns. This concept of "stored programs" on physical media directly inspired Charles Babbage and, later, early electronic computers that read punched cards.
Charles Babbage designed the Analytical Engine — the first general-purpose computer concept with an ALU, control flow, and memory. Ada Lovelace wrote the first algorithm for it, earning her the title of first computer programmer.
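Lovelace's Note G laid out a step-by-step procedure for computing Bernoulli numbers on the Engine. The sketch below is not her tabular program, but the same target computation expressed via the standard recurrence in modern Python:

```python
# Bernoulli numbers via the recurrence
#   B_m = -1/(m+1) * sum_{j=0}^{m-1} C(m+1, j) * B_j,   with B_0 = 1.
from fractions import Fraction
from math import comb

def bernoulli(n: int) -> list[Fraction]:
    B = [Fraction(1)]
    for m in range(1, n + 1):
        B.append(-Fraction(1, m + 1) *
                 sum(comb(m + 1, j) * B[j] for j in range(m)))
    return B

print([str(b) for b in bernoulli(6)])
# ['1', '-1/2', '1/6', '0', '-1/30', '0', '1/42']
```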
1840 – 1940 — Electricity transforms communication and computation
Samuel Morse's electric telegraph transmitted encoded messages over wires using Morse code. For the first time in human history, information could travel faster than a messenger could carry it — at nearly the speed of light.
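Morse code is simply a lookup from letters to dot-dash patterns. A tiny encoder makes the scheme concrete (the table here is deliberately abridged to a few letters):

```python
# Abridged Morse table; the full code covers letters, digits, and punctuation.
MORSE = {"S": "...", "O": "---", "E": ".", "T": "-", "A": ".-", "N": "-."}

def encode(message: str) -> str:
    # Letters are separated by spaces in the encoded output.
    return " ".join(MORSE[c] for c in message.upper())

print(encode("SOS"))  # ... --- ...
```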
The telephone enabled real-time voice communication over electrical wires. It transformed business, government, and personal communication, laying the groundwork for all voice networks that followed.
Herman Hollerith built an electromechanical punch-card system for the 1890 US Census, reducing processing time from 8 years to just 1. His company later merged with others to become IBM.
Guglielmo Marconi developed practical radio communication, enabling wireless transmission of information across vast distances. By 1901, he sent the first transatlantic radio signal.
Alan Turing published "On Computable Numbers," describing a theoretical machine that could compute anything calculable. The Turing Machine defined the mathematical limits of computation and became the foundation of computer science.
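A Turing machine is just a finite table of (state, symbol) → (next state, symbol to write, head move) rules acting on a tape. The toy simulator below — with a made-up machine that inverts every bit and halts at the blank — shows how little machinery the definition needs:

```python
# Rules: (state, symbol) -> (next state, symbol to write, head move).
RULES = {
    ("scan", "0"): ("scan", "1", 1),
    ("scan", "1"): ("scan", "0", 1),
    ("scan", "_"): ("halt", "_", 0),   # "_" marks blank tape
}

def run(tape: str, state: str = "scan", pos: int = 0) -> str:
    cells = list(tape)
    while state != "halt":
        symbol = cells[pos] if pos < len(cells) else "_"
        state, write, move = RULES[(state, symbol)]
        if pos < len(cells):
            cells[pos] = write
        pos += move
    return "".join(cells)

print(run("1011"))  # 0100
```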
German engineer Konrad Zuse built the Z1 in his parents' living room — the first freely programmable binary computer. Though mechanical and unreliable, it proved binary computation was practical.
1940 – 1980 — Vacuum tubes, transistors, integrated circuits, and the birth of personal computing
Built to crack Nazi Lorenz ciphers during WWII, Colossus was the first programmable electronic digital computer. It was kept secret until the 1970s. Its success demonstrated that electronic computation could solve problems impossible for humans.
The Electronic Numerical Integrator and Computer, built by Mauchly & Eckert at the University of Pennsylvania, was the first general-purpose electronic computer. It weighed 30 tons, used 18,000 vacuum tubes, and could perform 5,000 additions per second.
Shockley, Bardeen, and Brattain invented the transistor — a tiny solid-state switch replacing bulky vacuum tubes. Smaller, faster, more reliable, and energy-efficient, transistors made all modern electronics possible.
Jack Kilby (Texas Instruments) and Robert Noyce (Fairchild) independently invented the integrated circuit — multiple transistors on a single chip. This launched Moore's Law: transistor density doubling every ~2 years.
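Moore's Law is easy to sanity-check with back-of-the-envelope arithmetic. Starting from the 2,300 transistors of Intel's 4004 (1971, described below) and doubling every two years:

```python
# Back-of-the-envelope Moore's Law projection.
transistors_1971 = 2_300                   # Intel 4004
doublings = (2020 - 1971) / 2              # one doubling per ~2 years
projected = transistors_1971 * 2 ** doublings
print(f"{projected:.1e}")  # ~5.5e10 — the order of magnitude of 2020-era chips
```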
The US Department of Defense's ARPANET sent its first message between UCLA and the Stanford Research Institute (SRI) — the word "LOGIN" (the system crashed after "LO"). This packet-switching network was the direct ancestor of the modern Internet.
Intel released the 4004, the first commercial microprocessor — an entire CPU on a single chip. With 2,300 transistors, it had the same computing power as ENIAC but fit on your fingertip.
The Altair 8800 (1975) sparked the hobbyist movement. Apple II (1977), TRS-80, and Commodore PET brought computers to homes and small businesses for the first time. Bill Gates and Paul Allen founded Microsoft to write software for the Altair.
1980 – 2010 — GUIs, the World Wide Web, mobile phones, and the connected world
IBM's Personal Computer standardised the industry with an open architecture. Running MS-DOS, it established the IBM-compatible standard that still dominates. Within 2 years, PCs outsold all other computer types combined.
Apple's Macintosh popularised the Graphical User Interface (GUI) — windows, icons, menus, and a mouse. Computing went from cryptic command lines to visual point-and-click, making computers accessible to everyone.
At CERN, Tim Berners-Lee invented HTML, HTTP, and URLs — the three pillars of the World Wide Web. He published the first website in 1991. The Web transformed the Internet from an academic tool into a global information platform.
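The three pillars still work exactly as designed: a URL names a resource, HTTP fetches it as plain text, and HTML comes back. The sketch below requests the first web page, which CERN still serves at its original address (network access required):

```python
# Fetching the first web page with the raw protocol; any web server
# answers the same way.
import socket

host = "info.cern.ch"
request = (
    "GET /hypertext/WWW/TheProject.html HTTP/1.0\r\n"  # HTTP is plain text
    f"Host: {host}\r\n"
    "\r\n"
)
with socket.create_connection((host, 80)) as sock:
    sock.sendall(request.encode())
    reply = sock.recv(4096).decode(errors="replace")

print(reply.splitlines()[0])  # e.g. 'HTTP/1.1 200 OK' — then comes the HTML
```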
The Mosaic web browser (whose team went on to build Netscape) made the Web visual and user-friendly. Internet traffic exploded. Within 3 years, the number of websites grew from 130 to over 100,000.
Larry Page and Sergey Brin launched Google with its PageRank algorithm, solving the problem of finding information on the rapidly growing Web. "Google it" became a verb, and the company grew into the world's most powerful information gateway.
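At its core, PageRank models a "random surfer" who mostly follows links and occasionally jumps to a random page; a page's rank is the long-run probability of finding the surfer there. A toy power-iteration version, on a made-up four-page web:

```python
# Toy PageRank by power iteration; the link graph is invented.
links = {"A": ["B", "C"], "B": ["C"], "C": ["A"], "D": ["C"]}
damping = 0.85                                  # chance of following a link
rank = {p: 1 / len(links) for p in links}

for _ in range(50):                             # iterate to convergence
    new = {p: (1 - damping) / len(links) for p in links}
    for page, outlinks in links.items():
        for target in outlinks:
            new[target] += damping * rank[page] / len(outlinks)
    rank = new

print(sorted(rank.items(), key=lambda kv: -kv[1]))  # C first, then A
```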
Facebook (2004), YouTube (2005), and Twitter (2006) transformed the Web from read-only to read-write. User-generated content, social networks, and viral sharing created entirely new communication paradigms.
AWS launched EC2 and S3, offering computing infrastructure as a service. Companies no longer needed to buy servers — they could rent capacity on demand. This "utility computing" model revolutionised how software is built and deployed.
The iPhone combined a phone, iPod, and internet device with a revolutionary multi-touch interface. It created the modern smartphone era and the app economy, putting a powerful computer in every pocket.
Satoshi Nakamoto published the Bitcoin whitepaper (2008) and launched the network (2009). The underlying blockchain technology — a decentralised, immutable ledger — opened new paradigms for trust, finance, and decentralised applications.
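The core blockchain idea fits in a few lines: each block commits to its predecessor's hash, so rewriting any past entry invalidates every block after it. A minimal sketch of that hash chain (not Bitcoin itself, which adds proof-of-work, signatures, and consensus):

```python
import hashlib, json

def block_hash(data: str, prev: str) -> str:
    payload = json.dumps({"data": data, "prev": prev}, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

def make_block(data: str, prev: str) -> dict:
    return {"data": data, "prev": prev, "hash": block_hash(data, prev)}

chain = [make_block("genesis", "0" * 64)]
for tx in ["alice->bob: 5", "bob->carol: 2"]:
    chain.append(make_block(tx, chain[-1]["hash"]))

chain[1]["data"] = "alice->bob: 500"          # tamper with history...
ok = chain[1]["hash"] == block_hash(chain[1]["data"], chain[1]["prev"])
print(ok)                                      # False: tampering is detected
```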
2010 – Present — Machine learning, smart devices, generative AI, and the frontier of quantum computing
IBM's Watson AI system defeated human champions on Jeopardy!, demonstrating that AI could understand natural language, process vast knowledge bases, and reason in real time. It was a public watershed moment for AI.
Apple launched Siri (2011), followed by Amazon Alexa (2014) and Google Assistant (2016). Voice-activated AI assistants brought natural language processing into millions of homes, normalising human-computer conversation.
AlexNet won the ImageNet competition by a huge margin using deep neural networks and GPU acceleration. This moment triggered the modern AI revolution, as deep learning proved dramatically superior for image recognition and beyond.
By 2020, over 10 billion IoT devices were connected — smart thermostats, wearables, industrial sensors, autonomous vehicles. Edge computing emerged to process data closer to its source, cutting round-trip latency from hundreds of milliseconds to just a few.
Google researchers published the Transformer paper, introducing the attention mechanism that would power GPT, BERT, and all modern large language models. This single architecture breakthrough is the foundation of today's generative AI.
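The paper's central operation, scaled dot-product attention, is remarkably compact: each query scores every key, the scores become softmax weights, and the output is a weighted mix of the values. A NumPy sketch with toy shapes:

```python
import numpy as np

def attention(Q, K, V):
    scores = Q @ K.T / np.sqrt(K.shape[-1])          # query-key similarity
    scores -= scores.max(axis=-1, keepdims=True)     # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax per query
    return weights @ V

rng = np.random.default_rng(0)
Q, K, V = (rng.normal(size=(4, 8)) for _ in range(3))
print(attention(Q, K, V).shape)  # (4, 8): one blended value per query
```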
Google's Sycamore quantum processor performed a computation in 200 seconds that, Google estimated, would take the world's fastest supercomputer 10,000 years. The benchmark task had no practical application, but it demonstrated that a quantum processor could outpace classical machines on at least one problem.
The COVID-19 pandemic accelerated digital transformation by a decade. Remote work, telehealth, e-commerce, and video conferencing (Zoom) became essential. 5G rollouts began enabling speeds up to 10 Gbps for mobile devices.
OpenAI's ChatGPT (Nov 2022) reached 100 million users in 2 months — the fastest-growing app in history. Anthropic's Claude, Google's Gemini, and image generators like Midjourney demonstrated AI that can write, code, reason, and create visual art.
AI agents can now autonomously code, browse, and execute multi-step tasks (Claude Code, Devin), and Apple Vision Pro brings spatial computing to consumers. AI is being integrated into operating systems, search, healthcare, and scientific discovery at unprecedented scale.
Throughout 5,000 years of information technology, a handful of breakthroughs radically altered the trajectory of human civilisation. Each one solved a fundamental limitation of the previous era.
The printing press (c. 1450)
Problem solved: Books were hand-copied by monks, taking months per copy. Knowledge was locked in monasteries and courts.
Impact: Within 50 years, 20 million books were printed. Literacy rates soared. The Reformation, Scientific Revolution, and Enlightenment all followed directly. It was the first true "information revolution" — knowledge became a commodity, not a privilege.
The electric telegraph (1844)
Problem solved: Information could only travel as fast as a horse, ship, or carrier pigeon — days or weeks between continents.
Impact: For the first time, information moved at near light-speed. The telegraph shrank the world: stock markets synchronised, wars were coordinated in real time, and newspapers could report same-day events from distant cities. It was the "Victorian Internet."
The transistor (1947)
Problem solved: Vacuum tubes were large, fragile, power-hungry, and generated enormous heat. ENIAC used 18,000 tubes and filled a room.
Impact: The transistor is arguably the most important invention of the 20th century. It enabled miniaturisation, leading to integrated circuits (1958), microprocessors (1971), and eventually smartphones containing billions of transistors. Without it, virtually none of modern electronics would exist.
The World Wide Web (1989–91)
Problem solved: The Internet existed but was text-only, fragmented, and inaccessible to non-technical users. There was no standard way to link, publish, or browse information.
Impact: HTML + HTTP + URLs created a universal information platform. E-commerce ($5.8 trillion in 2023), social media (4.9 billion users), remote work, streaming, online education — all built on Berners-Lee's three inventions. He chose not to patent them, keeping the Web free and open.
The smartphone (2007)
Problem solved: Computing required being at a desk. Mobile phones could call and text but couldn't meaningfully access the Internet or run applications.
Impact: The iPhone combined GPS, camera, internet, apps, and a touch screen into a pocket device. Today, smartphone subscriptions number roughly 6.8 billion — by some estimates, more people own one than have running water at home. The app economy generates $500B+ annually. Mobile became the primary way humanity accesses information.
Generative AI and large language models (2022)
Problem solved: AI could classify and predict, but couldn't create, reason, or converse naturally. Interacting with computers still required learning their language.
Impact: Large Language Models understand and generate human language, code, and images. ChatGPT reached 100M users in 2 months. AI agents now write software, conduct research, and assist in medical diagnosis. We are witnessing the fastest technological adoption in human history, with implications still unfolding.
Time between major paradigm shifts keeps shrinking
| Era | Period | Key Breakthrough | Approx. Duration (years) |
|---|---|---|---|
| ● Pre-Mechanical | 3000 BCE – 1450 | Writing, Abacus, Paper | ~4,450 |
| ● Mechanical | 1450 – 1840 | Printing Press, Calculators | ~390 |
| ● Electromechanical | 1840 – 1940 | Telegraph, Telephone, Radio | ~100 |
| ● Electronic | 1940 – 1980 | Transistor, IC, Microprocessor | ~40 |
| ● Digital & Internet | 1980 – 2010 | WWW, Mobile, Cloud | ~30 |
| ● AI & Quantum | 2010 – Present | Deep Learning, LLMs, Quantum | ~16+ |
From 4,450 years → 390 → 100 → 40 → 30 → 16... each era has lasted only a fraction as long as the one before — about 11x shorter at the jump to the mechanical era, and roughly 1.3–4x shorter at each transition since.
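The ratios are easy to verify from the table:

```python
# The shrinkage arithmetic behind the table above.
durations = [4450, 390, 100, 40, 30, 16]   # approx. years per era
for longer, shorter in zip(durations, durations[1:]):
    print(f"{longer:>5} -> {shorter:>4} years: {longer / shorter:.1f}x shorter")
# 11.4x, 3.9x, 2.5x, 1.3x, 1.9x
```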