A history of machine learning

Just fifty years ago, machine learning was still the stuff of science fiction. Today it’s an integral part of our lives, helping us do everything from finding photos to driving cars. We’ve come very far, very fast, thanks to countless philosophers, filmmakers, mathematicians, and computer scientists who fueled the dream of learning machines.

  • Laying the Groundwork

  • From Theory to Reality

  • Modern Machine Learning

Laying the Groundwork

Without these early thinkers and tinkerers, there’d be no computing “machines,” much less machine learning.

1642

French teen builds the first mechanical calculator

Photo of a 'Pascaline' circa 1642

Blaise Pascal was 19 when he made an “arithmetic machine” for his tax collector father. It could add, subtract, multiply, and divide. Three centuries later, the IRS uses machine learning to combat tax evasion.1, 2, 3

1679

The modern binary system is born

German mathematician, philosopher, and occasional poet Gottfried Wilhelm Leibniz devised the system of binary code that laid the foundation for modern computing.4, 5

1770

A chess-playing automaton debuts, then dupes Europe for decades

“The Turk.” Illustration showing the workings of the chess automaton

A moving, mechanical device designed to imitate a human, “The Turk” fooled even Napoleon into thinking it could play chess. The jig was up in 1857 when The Turk’s final owner revealed how a person hidden inside moved its arms.6, 7, 8

1834

The "father of the computer" invents punch-card programming

Englishman Charles Babbage conceived a general-purpose device that could be programmed with punched cards. His Analytical Engine was never built, yet nearly all modern computers rely on its logical structure.9, 10

1842

Ada Lovelace's algorithm makes her the world's first computer programmer

The 27-year-old mathematician described a sequence of operations for solving mathematical problems using Charles Babbage's theoretical punch-card machine. In the 1970s, the US Department of Defense paid homage, naming a new programming language Ada.11, 12, 13

1847

A mystic’s algebra makes CPUs possible more than a century before they’re invented

Philosopher and closet mystic George Boole created a form of algebra in which all values can be reduced to “true” or “false.” Essential to modern computing, Boolean logic helps a CPU decide how to process new inputs.14, 15, 16

1927

AI debuts on the silver screen

Set in 2026 Berlin, Fritz Lang’s expressionist sci-fi film “Metropolis” introduced moviegoers to the idea of a thinking machine. The film’s robot, “False Maria,” was the first ever depicted on screen.17, 18, 19

1936

Alan Turing conceives his "Universal Machine"

Inspired by how we follow specific processes to perform tasks, English logician and cryptanalyst Alan Turing theorized how a machine might decipher and execute a set of instructions. His published proof is considered the basis of computer science.20, 21

From Theory to Reality

In just half a century, what was once science fiction becomes real.

1943

A human "neural network" is modeled with electrical circuits

A neurophysiologist and a mathematician co-wrote a paper on how human neurons might work. To illustrate the theory, they modeled a neural network with electrical circuits. In the 1950s, computer scientists would begin applying the idea to their work.22, 23

1952

A computer improves its checkers game

Machine learning pioneer Arthur Samuel created a program that helped an IBM computer get better at checkers the more it played. Machine learning scientists often use board games because they are both understandable and complex.24, 25, 26, 27

1959

A neural network learns to make phone calls clearer

In computing, a “neural network” is a system modeled on the human nervous system. The first neural network applied to a real-world problem, Stanford’s MADALINE used an adaptive filter to remove echoes over phone lines. It’s still in use today.28, 29

1968

Kubrick’s 2001 sets a high bar for computer intelligence

While researching his new film, director Stanley Kubrick visited Marvin Minsky of MIT’s Artificial Intelligence Lab to ask if the intelligent computer he was imagining might actually exist by 2001. An optimistic Minsky believed it would.30, 31

1979

The Stanford Cart takes a slow but significant spin

In 1960, a grad student set out to crack the problem of controlling a moon rover from Earth by rigging a buggy with a TV camera and remote control. Two decades and five hours later, the Stanford Cart navigated a chair-filled room without human help.32, 33

1982

Movie audiences meet Blade Runner's replicants

Ridley Scott’s film, based on a Philip K. Dick novel, asks what happens if machines get smart enough to develop emotions. Though a box office flop, Blade Runner went on to become a sci-fi classic.34

1985

NETtalk teaches itself to pronounce new words

Invented by Terry Sejnowski and Charles Rosenberg, this artificial neural network taught itself how to correctly pronounce 20,000 words in one week. Early outputs sounded like gibberish, but with training its speech became clearer.35, 36

1997

Deep Blue beats a chess champion

When IBM’s Deep Blue beat chess grandmaster Garry Kasparov, it was the first time a computer had bested a reigning world chess champion in a match, and the last match Deep Blue would ever play: Kasparov demanded a rematch, but IBM declined and immediately retired the machine.37, 38

1999

Computer-aided diagnosis catches more cancers

Computers can’t cure cancer (yet), but they can help us diagnose it. The CAD Prototype Intelligent Workstation, developed at the University of Chicago, reviewed 22,000 mammograms and detected cancer 52% more accurately than radiologists did.39

Modern Machine Learning

Machine learning moves out of the lab and into our lives with applications across industries.

2006

Neural net research gets a reboot as “deep learning”

When his field fell off the academic radar, computer scientist Geoffrey Hinton rebranded neural net research as “deep learning.” Today, the internet’s heaviest hitters use his techniques to improve tools like voice recognition and image tagging.40, 41, 42

2009

BellKor's Pragmatic Chaos nets the $1M Netflix prize

In 2006, Netflix offered $1M to anyone who could beat its algorithm at predicting consumer film ratings. The BellKor team of AT&T scientists took the prize three years later, beating the second-place team by mere minutes.43, 44

2011

Watson computer wins at Jeopardy!

Though not a perfect player, IBM’s Watson did manage to outwit two Jeopardy! champions in a three-day showdown. Plans for this technology include powering a computerized doctor’s assistant.45

2012

Google Brain detects human faces in images

A neural network created by Google learned to recognize humans and cats in YouTube videos — without ever being told how to characterize either. It taught itself to detect felines with 74.8% accuracy and faces with 81.7%.46, 47

2014

Chatbot "Eugene Goostman" passes the Turing Test

Devised by cryptanalyst Alan Turing in 1950, this test requires a machine to fool a person into thinking it’s human through conversation. Sixty years to the day after Turing’s death, a chatbot convinced 33% of human judges that it was a Ukrainian teen.48

2014

Computers help improve the ER experience

Health-tech companies began using event simulation to predict ER wait times based on data like staffing levels, medical histories, and hospital layouts. These predictions help hospitals reduce the wait, a key factor in better patient outcomes.49, 50

2015

A computer wins at the world's hardest board game

Google’s AlphaGo was the first program to best a professional player at Go, considered the most difficult board game in the world. With this defeat, computers officially beat human opponents in every classical board game.51, 52

2015

Machines and humans pair up to fight fraud online

When PayPal set out to fight fraud and money laundering on its site, it took a hybrid approach: human detectives define the characteristics of criminal behavior, then a machine learning program uses those parameters to root out the bad guys.53, 54

2016

Read my lips, LipNet

Kubrick’s fictional HAL 9000 could read lips in 2001. It would take an Oxford team a little longer, but the results were no less impressive. This artificial-intelligence system identified lip-read words with an accuracy of 93.4%.55

2016

Natural language processing gives life to a digital personal shopper

The North Face became the first retailer to use IBM Watson’s natural language processing in a mobile app. The Expert Personal Shopper helps consumers find what they’re looking for through conversation, just as a human sales associate would.56, 57

2017

A machine learns how to stop online trolling

As part of its anti-harassment efforts, Alphabet’s Jigsaw team built a system that learned to identify trolling by reading millions of website comments. The underlying algorithms could be a huge help for sites with limited resources for moderation.58, 59

2017

What's next in machine learning?

How far this technology will take us remains to be seen, but applications in the works span everything from improving in-store retail experiences with IoT to boosting security with biometric data to predicting and diagnosing disease.60

Sources

1. "A Brief History of Computers," The University of Alabama in Huntsville
3. "Blaise Pascal," Britannica
10. "The Babbage Engine," Computer History Museum
13. "Ada Lovelace," Wikipedia
15. "George Boole," Britannica
16. "George Boole," Wikipedia
18. "Metropolis," Metropolis 1927
22. "Neural Networks History: The 1940's to the 1970's," Department of Engineering: Computer Science, Stanford University
23. "Artificial Neural Networks Technology," Department of Psychology, University of Toronto
26. "Arthur Samuel," Wikipedia
27. "Arthur Lee Samuel," IEEE Computer Society
28. "Neural Networks History: The 1940's to the 1970's," Department of Engineering: Computer Science, Stanford University
29. "Perceptron and Adaline," Learn Artificial Neural Networks
30. "Kubrick," The New Yorker
31. "Marvin Minsky, pioneer in artificial intelligence, dies at 88," The Tech, Massachusetts Institute of Technology
33. "Stanford Cart," Stanford University
35. "Learning, Then Talking," The New York Times
50. "Predictive Data Can Reduce Emergency Room Wait Times," Stanford Graduate School of Business