HISTORY OF COMPUTERS (additional)
19TH CENTURY
1801: Joseph Marie Jacquard, a French merchant and inventor, invents a loom that uses punched
wooden cards to weave fabric designs automatically. Early computers would use similar punch cards.
1821: English mathematician Charles Babbage conceives of a steam-driven calculating machine that
would be able to compute tables of numbers. Funded by the British government, the project, called the
"Difference Engine" fails due to the lack of technology at the time, according to the University of
Minnesota.
EARLY 20TH CENTURY
1931: At the Massachusetts Institute of Technology (MIT), Vannevar Bush invents and builds the
Differential Analyzer, the first large-scale automatic general-purpose mechanical analog computer.
1936: Alan Turing, a British scientist and mathematician, presents the principle of a universal machine,
later called the Turing machine, in a paper called "On Computable Numbers". Turing machines are
capable of computing anything that is computable. The central concept of the modern computer is
based on his ideas. Turing is later involved in the development of the Turing-Welchman Bombe, an
electro-mechanical device designed to decipher Nazi codes during World War II.
1939: David Packard and Bill Hewlett found the Hewlett-Packard Company in Palo Alto, California. The
pair decide the name of their new company by the toss of a coin, and Hewlett-Packard's first
headquarters are in Packard's garage.
1941: German inventor and engineer Konrad Zuse completes his Z3 machine, the world's earliest digital
computer, according to Gerard O'Regan's book "A Brief History of Computing". The machine was
destroyed during a bombing raid on Berlin during World War II. Zuse fled the German capital after the
defeat of Nazi Germany and later released the world's first commercial digital computer, the Z4, in 1950.
1941 – J.V. Atanasoff and graduate student Clifford Berry devise a computer capable of solving 29
equations simultaneously. It is the first computer able to store information in its main memory.
- Atanasoff, a professor of physics and mathematics at Iowa State University, had submitted a
grant proposal to build the first electric-only computer, one that used no gears, cams, belts or
shafts.
1945 – University of Pennsylvania academics John Mauchly and J. Presper Eckert create the Electronic
Numerical Integrator and Computer (ENIAC). It was Turing-complete and capable of solving “a vast class
of numerical problems” through reprogramming, earning it the title of “Grandfather of computers.”
1946 – Mauchly and Eckert begin work on the UNIVAC I (Universal Automatic Computer); delivered in
1951, it becomes the first general-purpose electronic digital computer produced in the United States
for business applications.
1949 – The Electronic Delay Storage Automatic Calculator (EDSAC), developed by a team at the
University of Cambridge, is the “first practical stored-program computer.”
1950 – The Standards Eastern Automatic Computer (SEAC) was built in Washington, DC, and it was the
first stored-program computer completed in the United States.
LATE 20TH CENTURY
1953 – Grace Hopper, a computer scientist, develops the first compiler, which translates English-like
instructions into machine code. Her work leads to FLOW-MATIC and later strongly influences COBOL
(COmmon Business-Oriented Language, introduced in 1959), which allows users to give the computer
instructions in English-like words rather than numbers.
1954 – John Backus and a team of IBM programmers begin developing the FORTRAN programming
language, an acronym for FORmula TRANslation; it is released in 1957. In the same period, IBM
introduces the 650.
1958 – The integrated circuit, sometimes known as the computer chip, is invented independently by
Jack Kilby of Texas Instruments (1958) and Robert Noyce of Fairchild Semiconductor (1959).
1962 – Atlas, the computer, makes its appearance. It was the fastest computer in the world at the time,
and it pioneered the concept of “virtual memory.”
1964 – Douglas Engelbart proposes a modern computer prototype that combines a mouse and a
graphical user interface (GUI).
1969 – Bell Labs developers, led by Ken Thompson and Dennis Ritchie, unveil UNIX, an operating
system that is later rewritten in the C programming language, making it portable across different
machines.
1970 – The Intel 1103, the first Dynamic Random Access Memory (DRAM) chip, is unveiled by Intel.
1971 – The floppy disk is invented by Alan Shugart and a team of IBM engineers. In the same year,
Xerox develops the first laser printer, which goes on to generate billions of dollars in revenue and
heralds a new age in computer printing.
1973 – Robert Metcalfe, a member of Xerox’s research staff, creates Ethernet, a technology for
connecting multiple computers and other hardware.
1974 – Personal computers reach the market, beginning with the Scelbi and the Mark-8; the Altair
(1975), the IBM 5100 (1975), and Radio Shack’s TRS-80 (1977) soon follow.
1975 – Popular Electronics magazine touts the Altair 8800 as the world’s first minicomputer kit in
January. Paul Allen and Bill Gates offer to write software for the Altair in the BASIC language.
1976 – Apple Computer is founded by Steve Jobs and Steve Wozniak, who introduce the world to the
Apple I, the first computer built on a single circuit board.
1977 – At the first West Coast Computer Faire, Jobs and Wozniak announce the Apple II. It has colour
graphics and a cassette interface for storing data.
1978 – The first computerized spreadsheet program, VisiCalc, is introduced.
1979 – WordStar, a word processing tool from MicroPro International, is released.
1981 – IBM unveils its first personal computer, code-named “Acorn” and sold as the IBM PC, with an
Intel CPU, two floppy drives, and a colour display. It runs Microsoft’s MS-DOS operating system.
1983 – The CD-ROM, which could carry 550 megabytes of pre-recorded data, hits the market. This year
also sees the release of the Gavilan SC, the first portable computer with a flip-form design and the
first to be marketed as a “laptop.”
1984 – Apple launches the Macintosh, announced in a commercial during Super Bowl XVIII. It is priced
at $2,500.
1985 – Microsoft introduces Windows, which enables multitasking via a graphical user interface. The
same year, the programming language C++ is released.
1990 – Tim Berners-Lee, an English programmer and scientist working at CERN, creates HyperText
Markup Language, widely known as HTML, and coins the term “WorldWideWeb.” His project includes the
first browser, a server, HTML, and URLs.
1993 – The Intel Pentium processor improves the handling of graphics and music on personal
computers.
1995 – Microsoft releases the Windows 95 operating system, backed by a $300 million promotional
campaign. Sun Microsystems introduces Java 1.0, followed by Netscape Communications’ JavaScript.
1996 – At Stanford University, Sergey Brin and Larry Page create the Google search engine.
1998 – Apple introduces the iMac, an all-in-one Macintosh desktop computer. It costs $1,300 and comes
with a 4GB hard drive, 32MB of RAM, a CD-ROM drive, and a 15-inch monitor.
1999 – Wi-Fi wireless networking is introduced, originally covering a range of up to 300 feet. (The
popular expansion “wireless fidelity” is a marketing backronym rather than the name’s true origin.)
21ST CENTURY
2000 – The USB flash drive is introduced. It is faster and offers more storage space than earlier
portable storage media.
2001 – Apple releases Mac OS X, later renamed OS X and eventually simply macOS, as the successor to
its conventional Mac Operating System.
2003 – Customers could purchase AMD’s Athlon 64, the first 64-bit CPU for consumer computers.
2004 – Facebook began as a social networking website.
2005 – Google acquires Android, a mobile phone OS based on Linux.
2006 – Apple releases the MacBook Pro, the company’s first dual-core, Intel-based mobile computer.
2007 – Apple releases the first iPhone, bringing many computer functions into the palm of the hand.
The same year, Amazon releases the Kindle, one of the first electronic reading devices.
2009 – Microsoft released Windows 7.
2011 – Google introduces the Chromebook, which runs Google Chrome OS.
2014 – The University of Michigan Micro Mote (M3), the world’s smallest computer, is constructed.
2015 – Apple introduces the Apple Watch, and Microsoft releases Windows 10.
2016 – The world’s first reprogrammable quantum computer is built.