Our world is built on technology.
Do you want to know how it works?

Throughout history, from the abacus...

to the microprocessor…

Scroll to explore

technology is advancing faster than ever as people dedicate their lives to solving problems – pushing the boundaries of knowledge to shape our world.

Explore this journey

GREECE

The Antikythera Mechanism

Over 2,000 years ago, the first known example of an analogue computer was developed. Known as the Antikythera mechanism, after the Greek island of Antikythera near which its remains were discovered in a shipwreck in 1901, it was designed to perform astronomical calculations mechanically.

Even though it didn't come close to modern computational ability, the device is thought to have contained complex gear mechanisms that could calculate the positions of celestial bodies at a level of engineering complexity not seen again for 1,000 years.



EGYPT

The first programmable robot

Greek engineer Hero's automatic theatre and programmable cart are perhaps the first documented examples of a reprogrammable robot and a self-driving vehicle. A cord, pulled by weights and wrapped around the two wheel axles, could be looped around pegs to change a wheel's direction, determining a pre-programmed path for the cart to follow.
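To get a feel for the idea, here is a toy model in Python (entirely illustrative; the moves and names are invented): the sequence of pegs the cord wraps around acts as a stored program of moves and turns.

```python
# A toy model of Hero's peg-programmed cart: re-wrapping the cord around
# different pegs is equivalent to loading a new program.

program = ["forward", "forward", "left", "forward", "right", "forward"]

def run_cart(program):
    x, y = 0, 0
    directions = [(0, 1), (1, 0), (0, -1), (-1, 0)]   # N, E, S, W
    heading = 0                                        # start facing north
    for step in program:
        if step == "forward":
            dx, dy = directions[heading]
            x, y = x + dx, y + dy
        elif step == "left":
            heading = (heading - 1) % 4
        elif step == "right":
            heading = (heading + 1) % 4
        print(f"{step:>7}: at ({x}, {y})")

run_cart(program)
```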


UPPER MESOPOTAMIA

Al-Jazari's castle clock

Ismail Al-Jazari's castle clock is thought to be the first example of a programmable analogue computer. It combined clockwork calculation with intricate automated movement, and could be re-programmed, for example to account for the changing lengths of day and night through the year.

An inventor and clockmaker, Ismail Al-Jazari produced a number of complex robotic clocks and automata during his lifetime.


FRANCE / GERMANY

Mechanical calculator


Blaise Pascal (France) invented a mechanical calculator in the mid-17th century. Gottfried Leibniz (Germany) designed improvements to this calculator and studied a field of mathematics called “universal combinatorics”.

Perhaps the first “theoretical computer scientist”, Leibniz appears to have made the first attempt to formally symbolise logic and to describe how computation can be performed using binary.
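To illustrate the insight (a modern sketch, not Leibniz's own notation): any whole number can be written with only 0s and 1s, and arithmetic becomes a mechanical rule applied to those digits.

```python
# Numbers as strings of 0s and 1s, and addition as a purely mechanical
# digit-by-digit rule with a carry.

def to_binary(n: int) -> str:
    """Represent a non-negative integer in base 2."""
    if n == 0:
        return "0"
    bits = []
    while n > 0:
        bits.append(str(n % 2))   # remainder is the next binary digit
        n //= 2
    return "".join(reversed(bits))

def add_binary(a: str, b: str) -> str:
    """Add two binary strings digit by digit, carrying as needed."""
    a, b = a.zfill(len(b)), b.zfill(len(a))
    result, carry = [], 0
    for x, y in zip(reversed(a), reversed(b)):
        total = int(x) + int(y) + carry
        result.append(str(total % 2))
        carry = total // 2
    if carry:
        result.append("1")
    return "".join(reversed(result))

print(to_binary(6), to_binary(7))   # 110 111
print(add_binary("110", "111"))     # 1101 (6 + 7 = 13)
```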

FRANCE

THE JACQUARD MACHINE

Joseph Jacquard invented a machine that could read from a chain of punch cards to automatically control the operation of a loom, changing the woven pattern. The use of replaceable punch cards for programming, data entry and automation later became the norm in computing, remaining in use until the mid-1980s.
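A toy model of the principle (illustrative only, not a real loom specification): each card is a row of holes, and swapping the chain of cards swaps the woven pattern.

```python
# Punch-card control in miniature: a hole (1) lifts a warp thread,
# no hole (0) leaves it down; the chain of cards replays the pattern row by row.

cards = [
    [1, 0, 1, 0, 1, 0, 1, 0],
    [0, 1, 0, 1, 0, 1, 0, 1],
    [1, 1, 0, 0, 1, 1, 0, 0],
]

def weave(cards):
    for card in cards:                 # the loom reads one card per pick
        print("".join("█" if hole else "·" for hole in card))

weave(cards)   # swapping the chain of cards = loading a new pattern
```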

ENGLAND

The Analytical Engine

The Analytical Engine was a machine proposed by English mathematician Charles Babbage. He had previously designed a large mechanical calculating machine called the Difference Engine, but wanted to explore the concept of a general-purpose computer, a property we now call “Turing complete”. Had it been built, it would have been the first machine capable of performing all the same basic computations as a modern-day computer (only slower), but Babbage never finished constructing it.

UNITED KINGDOM

The first computer program

English mathematician and writer Ada Lovelace became interested in the Analytical Engine while translating a paper describing it by Italian mathematician Luigi Menabrea. She annotated her translation with a description of how to program the machine to calculate Bernoulli numbers, making her notes the first published complex computer program.
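A modern re-creation of that computation in Python, using a standard recurrence rather than Lovelace's exact steps from Note G:

```python
# Generating Bernoulli numbers, the calculation Lovelace programmed for the
# Analytical Engine. This uses the textbook recurrence, not her diagram.
from fractions import Fraction
from math import comb

def bernoulli(n: int) -> Fraction:
    """Return the n-th Bernoulli number B_n (with B_1 = -1/2)."""
    B = [Fraction(0)] * (n + 1)
    B[0] = Fraction(1)
    for m in range(1, n + 1):
        # From sum_{k=0}^{m} C(m+1, k) * B_k = 0, solve for B_m.
        acc = sum(comb(m + 1, k) * B[k] for k in range(m))
        B[m] = -acc / (m + 1)
    return B[n]

print([str(bernoulli(i)) for i in range(9)])
# ['1', '-1/2', '1/6', '0', '-1/30', '0', '1/42', '0', '-1/30']
```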

The first computer program. Image credit: Entity Magazine.

UNITED KINGDOM

The Turing Machine

English mathematician and cryptanalyst Alan Turing published perhaps the most influential paper in computer science, describing a theoretical machine that could carry out any conceivable mathematical calculation that can be represented as an algorithm.

Turing went on to prove that a mathematical problem called the “Entscheidungsproblem”, posed by David Hilbert and anticipated centuries earlier by Leibniz, had no solution. At the same time, the American mathematician Alonzo Church came to similar conclusions using a different method, and the two approaches were later shown to be equivalent. This was the first time a definition of “universal computation” was formalised.
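A minimal simulator gives a feel for Turing's idea (a sketch, not his original formulation): a tape of symbols, a read/write head, and a table of rules.

```python
# A minimal Turing machine simulator. The example machine increments a
# binary number written on the tape.

def run(tape, state, rules, blank="_"):
    """rules maps (state, symbol) -> (new_symbol, move, new_state)."""
    tape, head = dict(enumerate(tape)), 0
    while state != "halt":
        symbol = tape.get(head, blank)
        new_symbol, move, state = rules[(state, symbol)]
        tape[head] = new_symbol
        head += 1 if move == "R" else -1
    lo, hi = min(tape), max(tape)
    return "".join(tape.get(i, blank) for i in range(lo, hi + 1)).strip(blank)

# Walk to the rightmost digit, then add 1 with carry.
rules = {
    ("right", "0"): ("0", "R", "right"),
    ("right", "1"): ("1", "R", "right"),
    ("right", "_"): ("_", "L", "carry"),
    ("carry", "1"): ("0", "L", "carry"),   # 1 + carry -> 0, carry continues
    ("carry", "0"): ("1", "L", "halt"),    # 0 + carry -> 1, done
    ("carry", "_"): ("1", "L", "halt"),    # ran off the left edge: new digit
}

print(run("1011", "right", rules))  # 1100 (11 + 1 = 12)
```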

GERMANY

Z3

Z3 reconstruction in 2010 by Horst Zuse

Mathematicians worked on theoretical computer science problems throughout the 1800s, and the concept of the “logic gate” appeared around the turn of the century. This line of work led to the Z3, which used electromechanical relays to implement its logic gates.

The Z3 was a computer designed by Konrad Zuse, who began work on his machines in 1935 and completed the Z3 in Berlin in 1941. It was the world's first working, programmable, fully automatic, digital computer. Zuse also designed the first high-level programming language, “Plankalkül”, although it was never implemented on the Z3 itself. The machine was later shown to be capable, in principle, of the universal computation described by Alan Turing.
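Relay circuits behave like Boolean functions, which is easy to sketch in software. Here basic gates are combined into a half adder, the building block of binary arithmetic (an illustration of the principle, not the Z3's actual circuitry):

```python
# Logic gates as Boolean functions, composed the way relay circuits are wired.

def AND(a, b): return a and b
def OR(a, b):  return a or b
def NOT(a):    return not a

def XOR(a, b):          # built by combining the basic gates
    return AND(OR(a, b), NOT(AND(a, b)))

def half_adder(a, b):
    """Add two bits: the sum is XOR, the carry is AND."""
    return XOR(a, b), AND(a, b)

for a in (False, True):
    for b in (False, True):
        s, c = half_adder(a, b)
        print(f"{int(a)} + {int(b)} = sum {int(s)}, carry {int(c)}")
```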

UNITED KINGDOM

COLOSSUS, ENIAC, AND EDVAC


Colossus was a set of computers developed by the British between 1943 and 1945 to help break German codes during WW2. Together they are regarded as the world's first programmable, electronic, digital computer, but their existence was kept secret until the 1970s. In 1945 another electronic, digital computer called ENIAC was built for the United States Army for general-purpose use.

This was followed by the EDVAC computer, on which computer scientist John von Neumann worked to produce the first design for a “stored-program computer”. Variations of von Neumann's design are still used in modern-day computers.
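The essence of the stored-program design: instructions and data live in the same memory, and the processor loops fetch, decode, execute. A toy sketch with an invented instruction set:

```python
# A toy stored-program machine in the von Neumann style. The instruction
# set here is made up for illustration; program and data share one memory.

memory = [
    ("LOAD", 7),    # 0: acc = memory[7]
    ("ADD", 8),     # 1: acc += memory[8]
    ("STORE", 9),   # 2: memory[9] = acc
    ("PRINT", 9),   # 3: print memory[9]
    ("HALT", 0),    # 4: stop
    None, None,
    20,             # 7: data
    22,             # 8: data
    0,              # 9: result goes here
]

pc, acc = 0, 0                       # program counter and accumulator
while True:
    op, arg = memory[pc]             # fetch and decode
    pc += 1
    if op == "LOAD":    acc = memory[arg]
    elif op == "ADD":   acc += memory[arg]
    elif op == "STORE": memory[arg] = acc
    elif op == "PRINT": print(memory[arg])   # prints 42
    elif op == "HALT":  break
```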

USA

THE TRANSISTOR AND THE INTEGRATED CIRCUIT

The most significant development of the 1950s and 1960s was the adoption of the transistor in computing technology. Earlier electronic computers used vacuum tubes as one of their main components, which were large and consumed a lot of power; the resulting machines could fill an entire room.

The introduction of the transistor meant that the same functionality could be achieved with a much smaller device, using much less power. This led to significant improvements and miniaturised computer circuit boards. In the 1960s, transistors were made even smaller with the development of integrated circuits (or “chips”).

USA

Microprocessors

In 1971 Intel released the first commercially available microprocessor, the 4004: a complete computer processor contained on a single integrated circuit (or “chip”). The same year saw the Kenbak-1, regarded as the world's first “personal computer”, although it was built from discrete logic rather than a microprocessor. Microprocessor-based consumer machines followed, with the Apple II, Commodore PET 2001 and TRS-80 all released in 1977.

Three Proud Parents: Posing during induction ceremonies for the National Inventors Hall of Fame in 1996, Federico Faggin, Marcian “Ted” Hoff Jr., and Stanley Mazor [from left] show off the pioneering microprocessor they created in the early 1970s, the Intel 4004.

USA

AI Boom and beginnings of the Internet

With significant computing power now commonly available, the 1980s saw a boom in the study of artificial intelligence. Much of this effort focused on “expert systems”: computer programs that capture the knowledge of human experts as rules in order to automatically make similar decisions.
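A miniature rule-based system shows the flavour (the rules here are invented for illustration): expert knowledge is written down as if-then rules, and the program chains through them to reach conclusions.

```python
# A toy "expert system": forward chaining over if-then rules.

rules = [
    ({"has_fever", "has_cough"}, "possible_flu"),
    ({"possible_flu", "short_of_breath"}, "see_doctor"),
]

def infer(facts, rules):
    """Keep firing rules until nothing new can be derived."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

print(infer({"has_fever", "has_cough", "short_of_breath"}, rules))
# derives 'possible_flu' and then 'see_doctor'
```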

In the 1970s the Advanced Research Projects Agency (ARPA) of the U.S. Department of Defense had built a network of computers called ARPANET which let computers communicate with each other via “packet switching”.

By the 1980s the National Science Foundation had adapted this technology to connect supercomputing centres across the United States. This network, called NSFNET, formed the beginnings of the Internet.
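The core idea of packet switching, in miniature (a toy model, not a real protocol): split the message into numbered packets, let them arrive in any order, and reassemble them by sequence number.

```python
# Packet switching in miniature: packets may take different routes and
# arrive out of order, so each carries a sequence number for reassembly.
import random

def send(message, size=4):
    packets = [(i, message[i:i + size]) for i in range(0, len(message), size)]
    random.shuffle(packets)          # simulate out-of-order delivery
    return packets

def receive(packets):
    return "".join(chunk for _, chunk in sorted(packets))

print(receive(send("computers talking in packets")))
```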

A visualization of the completed NSFNET T1 backbone in September 1991.

NEW YORK

DEEP BLUE VERSUS GARRY KASPAROV

This was no ordinary game of chess. In 1997 an IBM supercomputer called Deep Blue played a six-game match against world champion Garry Kasparov. Deep Blue won, marking the first defeat of a reigning world chess champion by a computer under tournament conditions.

Kasparov argued that the computer must actually have been controlled by a real grand master. He and his supporters believed that Deep Blue’s playing was too human to be that of a machine.

It appeared that artificial intelligence had reached a stage where it could outsmart humanity, at least at a game that had long been considered too complex for a machine.
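Deep Blue's core technique was game-tree search. A minimal minimax sketch over a toy tree shows the principle (real chess engines add alpha-beta pruning, handcrafted evaluation functions and enormous opening databases):

```python
# Minimax over a toy game tree: choose the move that maximises your
# guaranteed outcome, assuming the opponent plays their best reply.

def minimax(node, maximizing):
    if isinstance(node, int):          # leaf: the score of a position
        return node
    scores = [minimax(child, not maximizing) for child in node]
    return max(scores) if maximizing else min(scores)

# Two plies: we pick a branch, then the opponent picks the worst leaf in it.
tree = [[3, 5], [2, 9], [0, 7]]
print(minimax(tree, True))   # 3: the best result we can guarantee
```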

GARRY KASPAROV

STANFORD UNIVERSITY

How did Google start?
Google Founders

The Google story begins in 1995 at Stanford University. Larry Page was considering Stanford for grad school and Sergey Brin, a student there, was assigned to show him around.

Working from their dorm rooms, they built a search engine that used links to determine the importance of individual pages on the World Wide Web.

Originally called BackRub (and thankfully later changed to Google), the search engine caught the attention of not only the academic community, but Silicon Valley investors as well. In August 1998, Sun co-founder Andy Bechtolsheim wrote Larry and Sergey a check for $100,000, and Google Inc. was officially born.
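The idea behind their ranking algorithm, PageRank, can be sketched in a few lines: a page is important if important pages link to it. This is the published idea in miniature, not Google's production system (the link graph below is invented):

```python
# A simplified PageRank via power iteration over a toy web of three pages.

links = {                        # page -> pages it links to
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
}

def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    rank = {p: 1 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {}
        for p in pages:
            # rank flowing in from every page that links to p
            incoming = sum(rank[q] / len(links[q]) for q in pages if p in links[q])
            new[p] = (1 - damping) / len(pages) + damping * incoming
        rank = new
    return rank

for page, score in sorted(pagerank(links).items(), key=lambda kv: -kv[1]):
    print(page, round(score, 3))
```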

HARVARD UNIVERSITY

Facebook was founded in 2004 by Mark Zuckerberg, Eduardo Saverin, Dustin Moskovitz, and Chris Hughes, all of whom were students at Harvard University. Facebook became the largest social network in the world, with more than one billion users as of 2012, about half of whom used the site every day.

In February 2012 Facebook filed to become a public company. Its initial public offering (IPO) in May raised $16 billion, giving it a market value of $102.4 billion.

USA

First iPhone released

After months of rumours and speculation, Steve Jobs unveiled the first iPhone on January 9, 2007. The device started at $499 for a whopping 4GB of storage, and $599 for 8GB.

It’s hard to imagine now, but the first iteration of the iPhone lacked many features we take for granted today, such as copy and paste, 3G (let alone 5G), and even third-party apps.

Ironically, critics at the time said that the phone was too expensive to do well in the market.

UNKNOWN LOCATION

₿itcoin Created

Bitcoin is a digital currency that was created in January 2009. It follows the ideas set out in a whitepaper by the mysterious and pseudonymous Satoshi Nakamoto.

The identity of the person or persons who created the technology is still a mystery. Bitcoin offers the promise of lower transaction fees than traditional online payment mechanisms and, unlike government-issued currencies, it is operated by a decentralized network rather than a central authority.

Bitcoin's history as a store of value has been turbulent; the cryptocurrency skyrocketed to roughly $20,000 per coin in 2017, but less than a year later it was trading for less than half of that.
As the earliest virtual currency to meet widespread popularity and success, bitcoin has inspired a host of other cryptocurrencies in its wake.
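At Bitcoin's heart is “proof of work”, described in the whitepaper: keep changing a nonce until the block's hash starts with enough zeros. A toy illustration (real mining uses far harder targets and binary block data):

```python
# Proof of work in miniature: finding the nonce is expensive, but anyone
# can verify the result instantly with a single hash.
import hashlib

def mine(block_data: str, difficulty: int = 4):
    prefix = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(prefix):
            return nonce, digest
        nonce += 1

nonce, digest = mine("block #1: alice pays bob 5")
print(nonce, digest)
```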

WORLDWIDE

AI RESURGENCE AND DEEP LEARNING

From 1983 to 2010, research funding ebbed and flowed, but research in AI continued to gather steam, even though “some computer scientists and software engineers would avoid the term artificial intelligence for fear of being viewed as wild-eyed dreamers”.

During the 1980s and 90s, researchers realized that many AI solutions could be improved by using techniques from mathematics and economics, such as game theory, stochastic modeling, classical numerical methods, operations research and optimisation.

Better mathematical descriptions were developed for deep neural networks, as well as for the evolutionary and genetic algorithms that matured during this period. All of this led to new sub-domains and commercial products in AI.
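A tiny neural network trained by gradient descent gives the flavour of deep learning (a generic sketch, not any particular research system): two layers learn the XOR function, which no single-layer network can represent.

```python
# A two-layer neural network learning XOR via backpropagation.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)
sigmoid = lambda z: 1 / (1 + np.exp(-z))

for _ in range(10000):
    h = sigmoid(X @ W1 + b1)             # hidden layer
    out = sigmoid(h @ W2 + b2)           # output layer
    d_out = (out - y) * out * (1 - out)  # push the error back through
    d_h = d_out @ W2.T * h * (1 - h)     # ...both layers
    W2 -= 0.5 * h.T @ d_out; b2 -= 0.5 * d_out.sum(0)
    W1 -= 0.5 * X.T @ d_h;   b1 -= 0.5 * d_h.sum(0)

print(out.round(2).ravel())              # close to [0, 1, 1, 0]
```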


SEOUL, SOUTH KOREA

LEE SEDOL VS ALPHAGO

In March 2016 a historic match was underway between Lee Sedol, one of the world's best Go players, and AlphaGo, an artificially intelligent system designed by a team of researchers at DeepMind, a London AI lab now owned by Google.

The machine claimed victory in the five-game series, winning four games and losing only one. It marked the first time a machine had beaten the very best at this ancient and enormously complex game, a feat that, until recently, experts didn't expect to see for another ten years.

WORLDWIDE

Car manufacturers start investing heavily in autonomous car technology

Beyond trendy names like Tesla chasing self-driving cars, a host of established auto brands, along with tech heavyweights such as Amazon and Apple, begin investing in autonomous car technology.

Private technology companies working on autonomous vehicles and related technologies attract record levels of funding. Along with early-stage startups and venture capital funds, large corporations are also angling to get a slice of the self-driving pie.

Strategic collaborations between automakers and nascent technology companies begin to form, while acquisition activity heats up off the back of Tesla's initial success and a growing market in the U.S. and China. The auto industry starts to evolve rapidly, built largely on investments in AI and automated manufacturing, turning many of today's legacy automakers into technology companies.

USA

GOOGLE X NASA ANNOUNCE QUANTUM SUPREMACY

Quantum chip
Google, in partnership with NASA and Oak Ridge National Laboratory, has demonstrated the ability to compute in seconds what would take even the largest and most advanced supercomputers thousands of years, achieving a milestone known as quantum supremacy.

The achievement of quantum supremacy means that the processing power and control mechanisms now exist for scientists to run their code with confidence and see what happens beyond the limits of what could be done on supercomputers. Experimentation with quantum computing is now possible in a way it never has been before.
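What makes quantum computation different can be sketched on a classical machine in its smallest possible case (pure simulation, no quantum hardware involved): a qubit is a pair of complex amplitudes, and gates are matrix multiplications.

```python
# Simulating one qubit: states are complex vectors, gates are matrices.
import numpy as np

zero = np.array([1, 0], dtype=complex)          # the |0> state
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)    # Hadamard gate

qubit = H @ zero                 # equal superposition of |0> and |1>
print(np.abs(qubit) ** 2)        # [0.5 0.5]: measuring gives 0 or 1, 50/50

# n qubits need 2**n amplitudes, which is why classical simulation
# quickly becomes intractable and real quantum hardware matters.
two_qubits = np.kron(qubit, qubit)
print(np.abs(two_qubits) ** 2)   # [0.25 0.25 0.25 0.25]
```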

USA

IBM promises 1000-qubit quantum computer by 2023

Despite hundreds of millions of dollars pouring into the field, today's quantum computers are still pretty rudimentary. IBM's largest quantum processors today are 65 qubits, a shade behind their biggest competitor, Google, which has a 72-qubit device. That's a long way from the thousands of qubits likely to be needed to do practical computations.

Current quantum computing shows promise in tackling problems out of the reach of conventional computers. But success with more sophisticated machines, like the ones IBM plans, will open up possibilities such as designing new materials for solar panels or electric vehicle batteries, making package deliveries faster or investing money more profitably.

USA

Elon Musk reveals brain-computer interface chip

Research on brain-computer interfaces began in the 1970s with a grant from the National Science Foundation to the University of California, Los Angeles, but it wasn't until 2020 that Elon Musk took a short break from tweeting about Dogecoin to unveil the progress Neuralink Corporation was making on an implantable brain-machine interface.

Little is known about Neuralink beyond Elon Musk's involvement and sporadic media events, but the company is thought to be working on an application-specific integrated circuit (ASIC) that it hopes will convert information obtained from neurons into binary code.

While tangible progress would appear to be far off, Musk has stated that his long-term goal is to achieve "symbiosis with artificial intelligence" to head off what he views as the existential threat to humanity posed by AI.
Elon Musk

USA

Computer algorithms for predicting RNA structures for vaccines (COVID-19)

What’s the secret behind the speed of COVID-19 vaccine development? Computational biology.

The virus that causes COVID-19 stores its genetic information as RNA, which encodes the spike protein and all the other proteins that allow the virus to survive. RNA structures are trickier to predict than protein structures for a few reasons: a single strand can fold back and pair with itself in many different ways, and many alternative folds have very similar energies.
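One classical approach can be sketched with dynamic programming: the Nussinov algorithm maximises the number of complementary base pairs. Real vaccine-era tools use much richer thermodynamic models; this only shows the flavour of the computation (the example sequence is invented):

```python
# The Nussinov dynamic program: for every subsequence, either the first base
# stays unpaired, or it pairs with some later base, splitting the problem.

PAIRS = {("A", "U"), ("U", "A"), ("G", "C"), ("C", "G"), ("G", "U"), ("U", "G")}

def max_pairs(rna: str) -> int:
    n = len(rna)
    best = [[0] * n for _ in range(n)]
    for span in range(1, n):                    # grow subsequence length
        for i in range(n - span):
            j = i + span
            candidates = [best[i + 1][j]]       # base i stays unpaired
            for k in range(i + 1, j + 1):       # base i pairs with base k
                if (rna[i], rna[k]) in PAIRS:
                    left = best[i + 1][k - 1] if k > i + 1 else 0
                    right = best[k + 1][j] if k < j else 0
                    candidates.append(1 + left + right)
            best[i][j] = max(candidates)
    return best[0][n - 1]

print(max_pairs("GGGAAAUCC"))   # 3: a small hairpin-like fold
```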

Within weeks of the COVID-19 genome being sequenced, researchers at Moderna and Pfizer/BioNTech were able to produce a vaccine utilising messenger RNA (mRNA). While both companies had worked on the technology for years, this was the first time a vaccine had been produced so quickly and on such a large scale.

To trigger an immune response, many vaccines put a weakened or inactivated germ into our bodies. Not mRNA vaccines. Instead, they teach our cells how to make a protein—or even just a piece of a protein—that triggers an immune response inside our bodies. That immune response, which produces antibodies, is what protects us from getting infected if the real virus enters our bodies. (CDC 2021)

The mRNA vaccine platform has the potential to transform immunology with a range of vaccines now under development for HIV, malaria and a number of rare deadly diseases.
To this day, computer science continues to evolve, shaping the decades to come…
Now it's your turn...
Get prepared for the future
Learn the skills that are driving the technological revolution
Learn Computer Science