EIE411
Computer Organization and
Architecture
Course Lecturers: Prof Samuel Daramola & Mr. Emmanuel Simonyan
Department of Electrical and Information Engineering
Covenant University, Nigeria
Course Outline
Module 1: Basic Concepts and Computer Evolution
History/Generations, Computer Evolution, Moore’s Law; Computer as a layered system;
Computer system: block diagram, functions, examples, dataflow, control line; Von Neumann &
Harvard architecture (principle of operation, advantages, disadvantages); CISC & RISC;
Computer Performance Metrics (Clock Speed, Instruction Execution Rate, MIPS rate, MFLOPS,
Mean etc.)
Module 2: Machine Level Representation of Data
Number Systems; Numeric data representation and number bases; Fixed- and floating-point
systems; Signed and two’s-complement representations; Representation of non-numeric data
(character codes, graphical data); Computer Arithmetic & ALU
Module 3: Assembly Level Machine Organization
Computer Organization, Instruction Cycle (Fetch-Decode-Execute); Instruction Set Architecture
(ISA) - Instruction formats, Instruction Sets, Addressing Modes
Course Outline
Module 4: Memory System Organization and Architecture
Storage systems and their technology; Memory hierarchy; Main memory organization and
operations; Cache memory (address mapping, block size, replacement etc.); Performance
Metrics (Latency, cycle time, bandwidth etc.)
Module 5: Interfacing and Communication
I/O fundamentals: handshaking, buffering, programmed I/O, interrupt-driven I/O; Interrupt
structures: vectored and prioritized, interrupt acknowledgment; External storage, physical
organization, and drives; Buses: bus protocols, arbitration, direct-memory access (DMA)
Module 6: Functional Organization
Control Unit (Hardwired vs. Microprogrammed Control); A peep into Instruction pipelining;
Subroutines, Interrupts, Multicore organization, SIMD vs. MIMD; Multiprocessing and
Alternative Architectures (RISC, Parallel, Multiprocessor), Performance Enhancements
(Superscalar architecture, Amdahl’s Law, Little’s Law etc.)
Recommended Texts
• William Stallings. Computer Organization and Architecture. 11th Edition, Pearson, 2018.
• Linda Null and Julia Lobur. The Essentials of Computer Organization and Architecture. 4th Edition, Jones & Bartlett Learning, 2015.
• Carl Hamacher, Zvonko Vranesic, Safwat Zaky, and Naraig Manjikian. Computer Organization and Embedded Systems. 6th Edition, McGraw-Hill, 2012.
Module 1:
Basic Concepts
and Computer
Evolution
Module 1 Objectives
At the end of this module, the students should be able to discuss the following concepts:
a) Overview of Computer Organization and Architecture
b) Computer Evolution
c) Moore’s Law
d) Computer as a layered system
e) Other processor-related technologies such as application-specific processors, embedded
systems, multicore computer structure, and cloud computing
Basic Concepts
Why is this course important?
Compiler Design: Understanding computer architecture is crucial for compiler developers,
who must translate high-level programming languages into machine code that interacts
effectively with the hardware.
Design better programs, including system software such as compilers, operating
systems, and device drivers.
Troubleshooting: When you face a debugging or performance problem on a system, a
solid grasp of the architecture helps you identify the root cause and ask the right
questions.
Optimize program behavior.
Evaluate (benchmark) computer system performance
Understand time, space, and price tradeoffs
Basic Concepts cont’d
Real-World Applications:
Smartphones: Optimizing performance, and power consumption.
Data Centers: Scaling, efficiency, reliability.
Embedded Systems: Real-time processing, low power consumption.
Artificial Intelligence: High-performance computing, parallel processing.
Broader application of Computer Organization and Architecture
Modularity: Break down complex problems into manageable modules.
Abstraction: Focus on essential components, ignoring non-essential details.
Scalability: Design systems to adapt to growth and change.
Efficiency: Optimize resources for maximum performance.
Interconnectedness: Understand how components interact and impact overall systems.
Adaptability: Be prepared to update and refine systems as technology evolves.
Problem-solving: Analyze and debug complex issues systematically.
What is a Computer?
What is a Computer?
A computer is a device made up of five functionally independent units: input,
memory, arithmetic and logic, output, and control units.
These units work together to store and process information.
Instructions, or machine instructions, are explicit commands that govern the input,
storage, processing, and output operations.
Data is the information that is being stored and processed by the computer.
Instructions and data are encoded in a binary format that can be understood by the
computer's hardware.
How does a Computer
Work?
How does a Computer work?
The computer receives information in the form of programs and data
through an input unit and stores it in the memory.
Information stored in the memory is fetched under program control
into an arithmetic and logic unit, where it is processed.
Processed information leaves the computer through an output unit.
All activities in the computer are directed by the control unit.
Basic functional units of a computer
Source: Carl Hamacher 2012
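The input–process–output activity under program control described above can be sketched as a toy stored-program machine. This is a hypothetical illustration, not a real instruction set: the opcodes (LOAD, ADD, OUT, HALT) and the memory layout are invented for the example.

```python
# Toy illustration of the stored-program idea: instructions and data
# share one memory; the control loop fetches, decodes, and executes.

def run(memory):
    acc = 0          # accumulator in the arithmetic and logic unit
    pc = 0           # program counter in the control unit
    output = []
    while True:
        opcode, operand = memory[pc]   # fetch the next instruction
        pc += 1
        if opcode == "LOAD":           # decode and execute
            acc = memory[operand]
        elif opcode == "ADD":
            acc += memory[operand]
        elif opcode == "OUT":
            output.append(acc)         # send result to the output unit
        elif opcode == "HALT":
            return output

# Program stored at addresses 0-3, data at addresses 4-5.
memory = {0: ("LOAD", 4), 1: ("ADD", 5), 2: ("OUT", None),
          3: ("HALT", None), 4: 7, 5: 35}
print(run(memory))  # [42]
```

The point of the sketch is that both the program and its data sit in the same memory, and the control unit's only job is the fetch-decode-execute loop.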
How does a Computer work?
The input unit accepts information from human operators or other computers.
Examples of common input devices include keyboards, touchpads, mice, joysticks,
trackballs, microphones and cameras.
Digital communication facilities, such as the Internet, can also provide input to a
computer from other computers and database servers.
The memory unit stores programs and data. There are two classes of storage:
primary and secondary.
Primary memory, also called main memory, is a fast memory that operates at
electronic speeds.
Programs must be stored in the main memory while they are being executed.
The main memory is organized so that the data can be stored or retrieved in
groups of fixed size called words. The number of bits in each word is referred to
as the word length of the computer.
Secondary memory provides backup storage.
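As a quick worked example of words and word length (the figures below are assumed for illustration, not taken from any particular machine):

```python
# How many words fit in a memory, and what one word can hold,
# for an assumed 32-bit word length and a 64 KB main memory.
word_length = 32                       # bits per word (assumption)
memory_bytes = 64 * 1024               # 64 KB of main memory (assumption)

words = memory_bytes * 8 // word_length
max_unsigned = 2 ** word_length - 1    # largest unsigned value in one word

print(words)         # 16384 words
print(max_unsigned)  # 4294967295
```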
How does a Computer work?
Arithmetic and logic unit: this unit performs arithmetic and logical operations on
the data. The arithmetic and logic unit and control unit circuits are often referred to
as the processor.
The output unit sends the processed information to the outside world. Examples of
common output devices include displays, speakers, and motors.
The control unit coordinates all of the activities of the computer. This unit directs the
flow of information between the other units. It also interprets and executes the
instructions in the programs.
Interconnection Network: Provides the communication pathway for the functional
units to exchange information and coordinate actions. One example is the system bus.
Computer Organization
and Architecture
Basic Concepts
Computer Organization refers to the physical components of a computer system
and how they are interconnected. It deals with the operational units and the
connections that implement the architectural specifications.
For example, it specifies the type of memory used, the control signals, and the bus
structure.
Computer Architecture defines the logical structure and behavior of the computer
system as seen by a programmer. It includes aspects like the instruction set, data
types, addressing modes, and memory organization.
For instance, it defines the number of registers available, the instruction formats, and
the types of operations that can be performed.
In essence, computer organization describes how the computer is physically put
together, while computer architecture defines its logical functionality.
Basic Concepts
Computer Level Hierarchy
Complex computer systems employ a “divide and conquer”
approach, where each module solves a smaller problem.
Complex computer systems are made up of layers or series of
virtual machine layers.
Each virtual machine layer is an abstraction of the level below it.
The machines at each level execute their own particular
instructions, calling upon machines at lower levels to perform
tasks as required.
Computer circuits ultimately carry out the work.
Computer Level Hierarchy
Level 6: The User Level
Program execution and user interface level.
The level with which we are most familiar.
Level 5: High-Level Language Level
The level with which we interact when we write
programs in languages such as C, Pascal, Lisp, and
Java.
Level 4: Assembly Language Level
Acts upon assembly language produced from Level
5, as well as instructions programmed directly at
this level.
Computer Level Hierarchy
Level 3: System Software Level
Controls executing processes on the system.
Protects system resources.
Assembly language instructions often pass through
Level 3 without modification.
Level 2: Machine Level
Also known as the Instruction Set Architecture
(ISA) Level.
Consists of instructions that are particular to the
architecture of the machine.
Programs written in machine language need no
compilers, interpreters, or assemblers.
Computer Level Hierarchy
Level 1: Control Level
A control unit decodes and executes instructions and moves
data through the system.
Control units can be microprogrammed or hardwired.
A microprogram is a program written in a low-
level language that is implemented by the hardware.
Hardwired control units consist of hardware that directly
executes machine instructions.
Level 0: Digital Logic Level
This level is where we find digital circuits (the chips).
Digital circuits consist of gates and wires.
These components implement the mathematical logic of all
other levels.
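The gates-and-wires idea at Level 0 can be modeled directly: each gate is a Boolean function, and wiring gates together composes functions. A minimal sketch in Python, building XOR and a half adder out of AND, OR, and NOT:

```python
# Basic gates as Boolean functions on bits (0 or 1).
def NOT(a): return 1 - a
def AND(a, b): return a & b
def OR(a, b): return a | b

def XOR(a, b):
    # XOR composed from the basic gates: (a AND NOT b) OR (NOT a AND b)
    return OR(AND(a, NOT(b)), AND(NOT(a), b))

def half_adder(a, b):
    # Adds two 1-bit inputs, returning (sum, carry).
    return XOR(a, b), AND(a, b)

print(half_adder(1, 1))  # (0, 1): 1 + 1 = binary 10
```

Composing such gates further (full adders, multiplexers, registers) is exactly how the higher levels of the hierarchy are ultimately implemented.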
Computer Types
Since their introduction in the 1940s, digital computers have evolved into
many types that vary widely in size, cost, computational power, and intended use.
Modern computers can be divided roughly into five general categories:
Embedded computers are integrated into a larger device or system
in order to automatically monitor and control a physical process or
environment. They are used for a specific purpose rather than for
general processing tasks.
Typical applications include industrial and home automation,
appliances, telecommunication products, and vehicles. Users may not
even be aware of the role that computers play in such systems.
Computer Types
Personal computers have achieved widespread use in homes, educational
institutions, and business and engineering office settings, primarily for dedicated
individual use.
They support a variety of applications such as general computation, document
preparation,
computer-aided design, audiovisual entertainment, interpersonal communication, and
Internet browsing. A number of classifications are used for personal computers.
Desktop computers serve general needs and fit within a typical personal workspace.
Workstation computers offer higher computational capacity and more powerful
graphical display capabilities for engineering and scientific work.
Finally, portable and notebook computers provide the basic features of a personal
computer in a smaller, lightweight package. They can operate on batteries to provide
mobility.
Computer Types
Servers and Enterprise systems are large computers that are meant to be shared by a potentially
large number of users who access them from some form of personal computer over a public or
private network.
Such computers may host large databases and provide information processing for a government
agency or a commercial organization.
Supercomputers and Grid computers normally offer the highest performance. They are the most
expensive and physically the largest category of computers. Supercomputers are used for the highly
demanding computations needed in weather forecasting, engineering design and simulation, and
scientific work.
Cloud computing. Personal computer users access widely distributed computing and storage
server resources for individual, independent, computing needs. The Internet provides the necessary
communication facility.
Cloud hardware and software service providers operate as a utility, charging on a pay-as-you-use
basis.
“Things as a service”: Infrastructure as a Service (IaaS), Platform as a Service (PaaS),
and Software as a Service (SaaS)
History of Computers
History of Computers:
Gen. Zero
The evolution of computing machinery has taken place
over several centuries. Computers are usually classified
into generations according to the technology of the era.
Generation Zero: Mechanical Calculating Machines
(1642 - 1945)
Examples
Calculating Clock - Wilhelm Schickard (1592 - 1635).
Pascaline - Blaise Pascal (1623 - 1662).
Difference Engine - Charles Babbage (1791 - 1871),
also designed but never built the Analytical Engine.
Punched card tabulating machines - Herman Hollerith
(1860 - 1929).
History of Computers: 1st Gen.
First Generation: Vacuum Tube Computers (1940 - 1956)
Machine language was developed for programming these computers. They
used vacuum tubes for the digital logic elements and memory.
Example – ENIAC
• Electronic Numerical Integrator and Computer (ENIAC), built by John
Mauchly and J. Presper Eckert at the University of Pennsylvania,
1946
Example – IBM 650
The IBM 650 was the first mass-produced computer (1955). It was
phased out in 1969.
History of Computers: 1st Gen.
Example - IAS computer
Fundamental design approach was the stored program
concept
Attributed to the mathematician John von Neumann
First publication of the idea was in 1945 for the EDVAC
Design began at the Princeton Institute for Advanced Study
Completed in 1952
Prototype of all subsequent general-purpose computers
Programming was done in machine language
No operating system
Programming and maintenance done by one group of people
ENIAC – THE FIRST ELECTRONIC COMPUTER (1946)
• 18,000 vacuum tubes
• 30 tons
• 170 kW
History of Computers: 2nd Gen.
Second Generation: Transistor Computers (1954 - 1965)
The transistor was invented at Bell Labs in 1947, but it was not until the late
1950s that fully transistorized computers were commercially available.
Features
Smaller and Cheaper
Dissipates less heat than a vacuum tube
Is a solid-state device made from silicon
Introduced:
More complex arithmetic and logic units and control units
The use of high-level programming languages
Provision of system software, which provided the ability to:
Load programs
Move data to peripherals
Use libraries of common computation routines
History of Computers: 2nd Gen. (1955 - 1965)
Transistor-based
Fairly reliable
Clear distinction between designers, manufacturers, users,
programmers, and support personnel.
Affordable only to governments, universities, or large companies
(millions of dollars)
Programs were first written on paper (in FORTRAN) and then
punched onto cards
Cards were then delivered to the user.
Mostly used for scientific and technical calculations
Solving differential equations
History of Computers: 2nd Gen.
Examples
a) IBM 7094 (scientific) and 1401 (business)
b) Digital Equipment Corporation (DEC) PDP-1
c) Univac 1100
d) Control Data Corporation 1604
. . . and many others.
History of Computers: 3rd Gen.
Third Generation: Integrated Circuit Computers (1965 - 1980)
The integrated circuit was invented in 1958
Discrete component
A single, self-contained transistor is called a discrete component
Throughout the 1950s and early 1960s, electronic equipment was
composed largely of discrete components such as transistors, resistors
and capacitors.
Manufactured separately, packaged in their own containers, and
soldered or wired together onto masonite-like circuit boards
The manufacturing process was expensive and cumbersome
The two most important members of the third generation were the
IBM System/360 and the DEC PDP-8. Others include the Cray-1
supercomputer.
Computer manufacturers of this era were characterized as IBM and
the BUNCH (Burroughs, UNIVAC, NCR, Control Data, and Honeywell).
ABOUT INTEGRATED-CIRCUIT COMPUTERS (3RD GEN.)
A computer consists of gates, memory cells, and
interconnections among these elements.
The gates and memory cells are constructed of simple
digital electronic components:
Data storage: provided by memory cells.
Data processing: provided by gates.
Data movement: the paths among components are used
to move data from memory to memory and from memory
through gates to memory.
Digital electronic components (such as transistors,
resistors, and conductors) can be fabricated from a
semiconductor (such as silicon)
Many transistors can be produced at the same time on a
single wafer of silicon.
Transistors can be connected with a process of
metallization to form circuits.
ABOUT INTEGRATED-CIRCUIT COMPUTERS – 3RD GEN.
ICs revolutionized electronics and started the era of
microelectronics. Microelectronics means, literally, “small
electronics.”
Since the beginnings of digital electronics and the
computer industry, there has been a persistent and
consistent trend toward the reduction in the size of
digital electronic circuits.
Only two fundamental types of components are required
for the computer: gates and memory cells.
A gate is a device that implements a simple Boolean or
logical function, such as AND gate, or OR gate.
Such devices are called gates because they control data
flow in much the same way that canal gates control the
flow of water.
The memory cell is a device that can store one bit of
data; that is, the device can be in one of two stable states
at any time.
By interconnecting large numbers of these
fundamental devices, we can construct a computer.
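The “one of two stable states” behaviour of a memory cell can be sketched with a cross-coupled NOR (SR) latch, one classic way such a cell is built from gates. This is an illustrative model, not circuit-accurate timing:

```python
# A set-reset latch built from two cross-coupled NOR gates.
def NOR(a, b):
    return 1 - (a | b)

class SRLatch:
    """One stored bit with two stable states (Q = 0 or Q = 1)."""
    def __init__(self):
        self.q, self.q_bar = 0, 1
    def step(self, s, r):
        # Iterate the feedback loop a few times until the outputs settle.
        for _ in range(4):
            q = NOR(r, self.q_bar)
            q_bar = NOR(s, q)
            self.q, self.q_bar = q, q_bar
        return self.q

latch = SRLatch()
latch.step(1, 0)         # pulse Set
print(latch.step(0, 0))  # 1 - the bit is retained with inputs idle
latch.step(0, 1)         # pulse Reset
print(latch.step(0, 0))  # 0
```

With both inputs at 0 the latch simply holds its state, which is exactly the storage property the text describes.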
HISTORY OF COMPUTERS – 4th Gen.
The Fourth Generation: VLSI Computers (1971 - present)
Very large-scale integrated circuits (VLSI) have more than 10,000
components per chip.
Enabled the creation of microprocessors; the first was the 4-bit Intel
4004.
Later versions, such as the 8080, 8086, and 8088 spawned the idea
of “personal computing.”
Significantly cheaper
User-friendly software
Two dominant operating systems:
MS DOS: IBM PC (8088, 80286, 80386, 80486)
UNIX: RISC workstations
Each machine runs its own operating system
Users don’t care where their programs are being executed
Moore’s Law (1965) on Integration
Gordon Moore, co-founder of Intel:
“The density of transistors in an integrated circuit will double every
year.”
Contemporary version: “The density of silicon chips doubles every 18
months.”
COMPUTER GENERATIONS - SUMMARY
It has become widely accepted to classify computers into generations
based on the fundamental hardware technology employed.
Each new generation is characterized by greater processing
performance, larger memory capacity, and smaller size than the
previous one.
MOORE’S LAW
• Moore’s law refers to the observation made by Gordon Moore in 1965
that the number of transistors in a dense integrated circuit (IC) doubles
about every two years.
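The doubling observation compounds quickly. A small sketch projecting transistor counts from the Intel 4004’s roughly 2,300 transistors (1971), assuming doubling every two years; the projection is illustrative, not a record of actual chip counts:

```python
# Project transistor counts under Moore's law (doubling every 2 years),
# starting from the Intel 4004: ~2,300 transistors in 1971.
def projected_transistors(year, base=2300, base_year=1971, period=2):
    doublings = (year - base_year) // period
    return base * 2 ** doublings

for year in (1971, 1981, 1991, 2001):
    print(year, projected_transistors(year))
```

After 30 years the projection is already in the tens of millions, which is the right order of magnitude for processors of the early 2000s.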
LATER COMPUTER GENERATIONS
VLSI technology improved semiconductor memories and microprocessors
SEMICONDUCTOR MEMORY
In 1970 Fairchild produced the first relatively capacious semiconductor memory:
• The chip was about the size of a single core
• It could hold 256 bits of memory
• Non-destructive readout
• Much faster than core
In 1974 the price per bit of semiconductor memory dropped below the price per bit of core memory:
• There has been a continuing and rapid decline in memory cost, accompanied by a corresponding increase in physical memory density
• Developments in memory and processor technologies changed the nature of computers in less than a decade
Since 1970, semiconductor memory has been through at least 13 generations:
• Each generation has provided 4x the storage density of the previous generation, accompanied by declining cost per bit and declining access time
• Generations include 1K, 4K, 16K, 64K, 256K, 1M, 4M, 16M, 64M, 256M, 1G, 4G, 8G
MICROPROCESSORS
The density of elements on processor chips continued to rise
More and more elements were placed on each chip so that fewer and fewer
chips were needed to construct a single computer processor
1971: Intel developed the 4004
First chip to contain all of the components of a CPU on a single chip
Birth of the microprocessor
1972: Intel developed the 8008
First 8-bit microprocessor
1974: Intel developed the 8080
First general-purpose microprocessor
Faster, with a richer instruction set and a larger addressing capability
MICROPROCESSOR FAMILY CHARACTERISTICS: THE CONCEPT OF COMPATIBLE COMPUTERS
Similar or identical instruction set
Similar or identical operating system
Increasing speed
Increasing number of I/O ports
Increasing memory size
Increasing cost
THE EVOLUTION OF
PROCESSOR ARCHITECTURE
Two processor families are the Intel x86 and the ARM
architectures
Current x86 offerings represent the results of decades of
design effort on complex instruction set computers (CISCs)
An alternative approach to processor design is the reduced
instruction set computer (RISC)
ARM architecture is used in a wide variety of embedded
systems and is one of the most powerful and best-designed
RISC-based systems on the market
What is their current market share?
EVOLUTION OF INTEL MICROPROCESSORS
[Table (a): 1970s Processors]
[Table (b): 1980s Processors]
[Table (c): 1990s Processors]
[Table (d): Recent Processors]
HIGHLIGHTS OF THE EVOLUTION OF THE
INTEL PRODUCT LINE (I)
8080
World’s first general-purpose microprocessor
8-bit machine, 8-bit data path to memory
Was used in the first personal computer (Altair)
8086
A more powerful 16-bit machine
Has an instruction cache, or queue, that prefetches a few instructions
before they are executed
The first appearance of the x86 architecture
The 8088 was a variant of this processor and used in IBM’s first personal
computer (secured the success of Intel)
80286
Extension of the 8086, enabling addressing of a 16-MB memory instead of
just 1 MB
HIGHLIGHTS OF THE EVOLUTION OF THE
INTEL PRODUCT LINE (II)
80386
Intel’s first 32-bit machine
First Intel processor to support multitasking
80486
Introduced the use of much more sophisticated and powerful cache
technology and sophisticated instruction pipelining
Also offered a built-in math coprocessor
Pentium
Intel introduced the use of superscalar techniques, which allow multiple
instructions to execute in parallel
Pentium Pro
Continued the move into superscalar organization with aggressive use of
register renaming, branch prediction, data flow analysis, and speculative
execution
HIGHLIGHTS OF THE EVOLUTION OF THE INTEL PRODUCT LINE (III)
Pentium II
Incorporated Intel MMX technology, which is designed specifically to
process video, audio, and graphics data efficiently
Pentium III
Incorporated additional floating-point instructions
Streaming SIMD Extensions (SSE)
Pentium 4
Includes additional floating-point and other enhancements for
multimedia
Core
First Intel x86 micro-core
Core 2
Extends the Core architecture to 64 bits
Core 2 Quad provides four cores on a single chip
More recent Core offerings have up to 10 cores per chip
An important instruction set addition to the architecture was the Advanced
Vector Extensions (AVX)
HOW MANY
GENERATIONS OF
INTEL PROCESSORS ARE
THERE?
WHAT ARE THEIR NAMES?
EVOLUTION OF ARM PROCESSORS
ARM PROCESSORS
ARM stands for Advanced RISC Machine or Acorn RISC Machine.
It is one of the most licensed and extensive processor cores in the world.
It refers to a processor architecture that has evolved from RISC design
principles and is used in embedded systems.
Acorn Computers, founded in Cambridge in 1978, produced the first ARM
processor in 1985.
ARM Ltd was founded in 1990, and the architecture became very popular.
In 2007, ARM processors were used in more than 98% of mobile phones,
and approximately 10 billion ARM processors had been shipped by 2008.
ARM Chips are high-speed processors that are known for their small die
size and low power requirements.
In general, ARM is a 16-bit/32-bit Processor or Controller that acts
as a heart in advanced digital products.
It is probably the most widely used embedded processor architecture and,
indeed, the most widely used processor architecture of any kind in the
world.
[Tables: ARM processor evolution. Source: ARM]
ARM PRODUCTS
ARM Holdings licenses a number of specialized microprocessors and
related technologies.
The bulk of their product line is the Cortex family of microprocessor
architectures.
There are three Cortex architectures, conveniently labeled with the
initials A, R, & M.
The Cortex-A and Cortex-A50 are application processors, intended for mobile
devices such as smartphones and eBook readers, as well as consumer devices
such as digital TV and home gateways.
The two architectures use both the ARM and Thumb-2 instruction sets
The principal difference is that the Cortex-A is a 32-bit machine, and the
Cortex-A50 is a 64-bit machine.
The Cortex-R is designed to support real-time applications, in which the
timing of events needs to be controlled with rapid response to events.
Cortex-M series processors have been developed primarily for the
microcontroller domain, where the need for fast, highly deterministic
interrupt management is coupled with the desire for extremely low gate
count and the lowest possible power consumption.
[Figure 1.16: Typical Microcontroller Chip Based on Cortex-M3 (source: ARM) — the diagram shows peripherals (timers and triggers, parallel I/O ports, serial interfaces such as USART/UART/USB, analog interfaces with A/D and D/A converters, hardware AES security), energy and clock management, 64 kB flash memory, 64 kB SRAM, a DMA controller, debug logic, and the Cortex-M3 core with NVIC, ETM, memory protection unit, 32-bit ALU, hardware divider, 32-bit multiplier, and Thumb decode logic.]
MODULE SUMMARY
SUMMARY - BASIC CONCEPTS AND COMPUTER EVOLUTION
Organization and architecture
Semiconductor Technology
  Integration
  Transistors & Moore’s Law
Computer Evolution
  The First Generation: Vacuum tubes
  The Second Generation: Transistors
  The Third Generation: Integrated Circuits
  Later generations
Computer System
  Computer Structure
  Computer Function
General-purpose Processors
  INTEL processors (+evolution)
  ARM processors (+evolution)
Application-specific Processors
  Digital Signal processors
  Multimedia processors
  Neural processors
Assembler/Compiler Technology
  High-Level Language (HLL)
Embedded Systems
Cloud Computing
Assignment 1
Read and Research on the following topics
a) Harvard vs. Von Neumann Architecture
b) Components of Von Neumann architecture and the function of basic computer components
c) CISC vs. RISC
Recommended Resources
• [Link]
• [Link]
Thank you!