Unit 04 Memory Updated
Memory Organization
What is Cache Memory
• Cache memory is crucial for speeding up data access in computer
systems.
• Cache memory is a high-speed buffer between RAM and the CPU.
• It is an essential component for enhancing the overall efficiency of computer systems.
Secondary/Auxiliary Memory
• Definition: Secondary/Auxiliary Memory is a type of non-volatile, long-term
storage that complements primary memory.
• Importance: Essential for storing large
amounts of data permanently, even when the
computer is powered off.
• Overview: Secondary memory acts as a
bridge between the CPU and long-term
storage, enabling efficient data management.
Characteristics of Secondary Memory
• Optical Drives
• CD, DVD, and Blu-ray drives.
• Read and Write Capabilities: Used for data distribution and storage.
• USB Drives (Flash Drives)
• Portable and Convenient: Easily transportable for data transfer.
• Applications: Backup, data sharing.
Magnetic Tape
• The resistance state can then be measured. This device is called a ‘memristor’
and relies on the principle of hysteresis.
Phase Change Memory (PCM):
Key Features:
• Fast Access Times: Enables quick read and write operations, improving overall
system performance.
SSDs use three main types of memory: single-, multi- and triple-level
cells. Single-level cells can hold one bit of data at a time -- a one or
zero. Single-level cells (SLCs) are the most expensive form of SSD, but
are also the fastest and most durable. Multi-level cells (MLCs) can hold
two bits of data per cell and have a larger amount of storage space in
the same amount of physical space as an SLC. However, MLCs have
slower write speeds. Triple-level cells (TLCs) can hold three bits of data
in a cell. Although TLCs are cheaper, they also have slower write speeds
and are less durable than other memory types.
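As a rough illustration of the trade-off described above, the following C sketch computes the raw capacity a hypothetical flash package would offer with SLC, MLC, and TLC cells; the cell count is an assumed figure chosen only for illustration.

#include <stdio.h>

/* Raw capacity in bits = number of cells x bits stored per cell.
 * The cell count is hypothetical; more bits per cell means more capacity
 * from the same silicon, at the cost of write speed and endurance. */
int main(void) {
    const unsigned long long cells = 8ULL * 1000 * 1000 * 1000; /* assumed: 8 billion cells */
    const struct { const char *name; int bits_per_cell; } types[] = {
        { "SLC", 1 }, { "MLC", 2 }, { "TLC", 3 }
    };

    for (int i = 0; i < 3; i++) {
        unsigned long long bits  = cells * types[i].bits_per_cell;
        unsigned long long bytes = bits / 8;
        printf("%s: %d bit(s)/cell -> %llu GB raw capacity\n",
               types[i].name, types[i].bits_per_cell,
               bytes / (1000ULL * 1000 * 1000));
    }
    return 0;
}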
A Memory Address Map is a pictorial representation of the assigned address
space for each chip in the system.
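The idea can be sketched in a few lines of C that print the start and end addresses for a hypothetical configuration of chips (four 1 KB RAM chips followed by a 512-byte ROM); the chip names and sizes are assumptions, not values from these notes.

#include <stdio.h>

/* Each chip occupies a contiguous block of the address space,
 * starting where the previous chip ends. Chip names and sizes
 * below are hypothetical. */
int main(void) {
    const struct { const char *chip; unsigned size; } map[] = {
        { "RAM 1", 1024 }, { "RAM 2", 1024 },
        { "RAM 3", 1024 }, { "RAM 4", 1024 },
        { "ROM",    512 }
    };
    unsigned base = 0;

    for (int i = 0; i < 5; i++) {
        printf("%-6s  0x%04X - 0x%04X\n",
               map[i].chip, base, base + map[i].size - 1);
        base += map[i].size;
    }
    return 0;
}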
Application Requirements:
• Understand the specific needs of the application or system.
• Consider capacity, speed, and type of memory based on
application demands.
Capacity and Size:
• Determine required storage capacity.
• Consider physical size constraints of the memory.
Speed and Performance:
• Assess the speed requirements for data access and transfer.
• Balance speed with power consumption.
Volatility and Persistence:
• Decide on volatile or non-volatile memory based on data retention needs.
Power Consumption:
• Evaluate power requirements and energy efficiency.
Reliability and Endurance:
• Assess data integrity, reliability, and the number of read/write cycles.
Principle of Locality
• The Principle of Locality is a fundamental concept in
computer science and computer architecture. It refers to the
tendency of a program to access a relatively small portion of
its address space at any given time. There are two main
aspects of the Principle of Locality:
Temporal Locality:
• Definition: Temporal locality, also known as the "recently
used" principle, states that if a particular memory location is
accessed, it is likely to be accessed again in the near future.
• Example: In a loop, if a variable is accessed in one iteration,
it's highly probable that it will be accessed in subsequent
iterations.
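A minimal C sketch of this behaviour is shown below: the accumulator sum and loop counter i are reused on every iteration, so repeated accesses to them exhibit temporal locality (they are typically served from the cache or even a register). The array size is an arbitrary illustrative value.

#include <stdio.h>

#define N 1000

int main(void) {
    int data[N];
    for (int i = 0; i < N; i++)
        data[i] = i;

    /* Temporal locality: 'sum' and 'i' are accessed on every iteration,
     * so after the first access they are found in the cache (or a register). */
    long sum = 0;
    for (int i = 0; i < N; i++)
        sum += data[i];

    printf("sum = %ld\n", sum);
    return 0;
}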
Spatial Locality:
• Definition: Spatial locality, also
known as the "closeness" principle,
states that if a particular memory
location is accessed, nearby memory
locations are also likely to be
accessed in the near future.
• Example: When iterating through an
array, accessing one element makes
it likely that neighboring elements
will be accessed soon.
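One common way to see spatial locality (and its absence) in C is the traversal order of a two-dimensional array: C stores arrays in row-major order, so the row-wise loop below touches consecutive addresses, while the column-wise loop jumps a full row apart on every access. The matrix dimensions are assumed for illustration.

#include <stdio.h>

#define ROWS 512
#define COLS 512

static int m[ROWS][COLS];

int main(void) {
    long sum = 0;

    /* Row-wise traversal: consecutive elements of a row are adjacent in
     * memory (row-major order), so each fetched cache line is fully used. */
    for (int r = 0; r < ROWS; r++)
        for (int c = 0; c < COLS; c++)
            sum += m[r][c];

    /* Column-wise traversal: successive accesses are COLS * sizeof(int)
     * bytes apart, so spatial locality is poor and more misses occur. */
    for (int c = 0; c < COLS; c++)
        for (int r = 0; r < ROWS; r++)
            sum += m[r][c];

    printf("sum = %ld\n", sum);
    return 0;
}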
The transformation of data from main memory to cache memory is referred to as a
mapping process. There are three cache mapping techniques:
• Associative mapping
• Direct mapping
• Set-associative mapping
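To make one of these techniques concrete, the sketch below shows how direct mapping might split a memory address into tag, line (index), and word (offset) fields; the line size, number of lines, and sample address are assumed values, not parameters taken from these notes.

#include <stdio.h>

/* Direct mapping: every main-memory block maps to exactly one cache line.
 * Assumed geometry (illustrative only):
 *   16-byte lines -> 4 offset bits
 *   256 lines     -> 8 index bits
 *   remaining high-order bits form the tag. */
#define OFFSET_BITS 4
#define INDEX_BITS  8

int main(void) {
    unsigned addr = 0x12345; /* sample address, chosen arbitrarily */

    unsigned offset = addr & ((1u << OFFSET_BITS) - 1);
    unsigned index  = (addr >> OFFSET_BITS) & ((1u << INDEX_BITS) - 1);
    unsigned tag    = addr >> (OFFSET_BITS + INDEX_BITS);

    printf("address 0x%X -> tag 0x%X, line %u, word offset %u\n",
           addr, tag, index, offset);
    return 0;
}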
The fastest and most flexible cache organization uses an associative memory.
• Effective (average) access time: Te = Tc + (1 - h) Tm, where Tc is the cache
access time, Tm is the main-memory access time, and h is the hit ratio.
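• For example, with assumed values Tc = 20 ns, Tm = 100 ns, and hit ratio h = 0.9,
Te = 20 + (1 - 0.9) × 100 = 30 ns; the higher the hit ratio, the closer Te gets
to the cache access time.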