Amortized analysis for increment in counter
Last Updated: 27 Jan, 2023
Amortized analysis determines the time-averaged running time of a sequence of operations rather than of an individual operation. It differs from average-case analysis: here we do not assume the data is arranged in a typical (not very bad) fashion, as we do in the average-case analysis of quicksort. In other words, amortized analysis is worst-case analysis, but over a sequence of operations rather than a single one. It applies to methods consisting of a sequence of operations in which the vast majority of operations are cheap but a few are expensive. This can be visualized with the binary counter implemented below.
Let's see this by implementing a counter increment. First, let's see how a counter increment works.
Let a variable i contain the value 0, and suppose we perform i++ many times. On hardware, every operation is performed in binary. Suppose the value is stored in 8 bits, so it starts as 00000000. Incrementing it repeatedly produces the following pattern:
00000000, 00000001, 00000010, 00000011, 00000100, 00000101, 00000110, 00000111, 00001000 and so on .....
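As a quick illustrative check (not part of the article's original implementations), this 8-bit pattern can be reproduced with Python's binary string formatting:

```python
# Generate the first nine values of an 8-bit counter, matching
# the increment pattern listed above.
pattern = [format(i, '08b') for i in range(9)]
print(', '.join(pattern))
# 00000000, 00000001, 00000010, ..., 00001000
```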
Steps:
1. Iterate from the rightmost bit and turn every 1 into 0 until the first 0 is found.
2. After the iteration, if the index is greater than or equal to zero, turn the 0 at that position into 1.
C++
#include <cstdio>
#include <cstring>
using namespace std;

int main()
{
    // Binary counter stored as a character string
    char str[] = "10010111";
    int length = strlen(str);
    int i = length - 1;

    // Turn trailing 1s into 0s until the first 0 is found
    // (the i >= 0 guard handles an all-ones counter)
    while (i >= 0 && str[i] == '1') {
        str[i] = '0';
        i--;
    }

    // Turn that 0 into 1 (if the counter did not overflow)
    if (i >= 0)
        str[i] = '1';

    printf("%s", str);
    return 0;
}
Java
import java.util.*;

class GFG {
    public static void main(String args[])
    {
        String st = "10010111";
        char[] str = st.toCharArray();
        int i = st.length() - 1;

        // Turn trailing 1s into 0s until the first 0 is found
        // (the i >= 0 guard handles an all-ones counter)
        while (i >= 0 && str[i] == '1') {
            str[i] = '0';
            i--;
        }

        // Turn that 0 into 1 (if the counter did not overflow)
        if (i >= 0)
            str[i] = '1';

        System.out.print(str);
    }
}
// This code is contributed by sanjoy_62
Python3
# Python code for the above approach
# (named s to avoid shadowing the built-in str)
s = "10010111"
i = len(s) - 1

# Turn trailing 1s into 0s until the first 0 is found
while i >= 0 and s[i] == '1':
    s = s[:i] + '0' + s[i + 1:]
    i -= 1

# Turn that 0 into 1 (if the counter did not overflow)
if i >= 0:
    s = s[:i] + '1' + s[i + 1:]

print(s)
# This code is contributed by Pushpesh Raj.
C#
using System;

public class GFG {
    public static void Main(String[] args)
    {
        String st = "10010111";
        char[] str = st.ToCharArray();
        int i = st.Length - 1;

        // Turn trailing 1s into 0s until the first 0 is found
        // (the i >= 0 guard handles an all-ones counter)
        while (i >= 0 && str[i] == '1') {
            str[i] = '0';
            i--;
        }

        // Turn that 0 into 1 (if the counter did not overflow)
        if (i >= 0)
            str[i] = '1';

        Console.Write(str);
    }
}
// This code is contributed by Rajput-Ji
JavaScript
// JavaScript code for the above approach
let str = "10010111";
let i = str.length - 1;

// Turn trailing 1s into 0s until the first 0 is found
// (the i >= 0 guard handles an all-ones counter)
while (i >= 0 && str[i] == '1') {
    str = str.slice(0, i) + '0' + str.slice(i + 1);
    i--;
}

// Turn that 0 into 1 (if the counter did not overflow)
if (i >= 0)
    str = str.slice(0, i) + '1' + str.slice(i + 1);

console.log(str);
// This code is contributed by Aman Kumar
Output:
10011000
Time complexity: O(n) where n is the length of the string.
Auxiliary Space: O(1)
At a first look at the program, its running cost seems proportional to the number of bits, but averaged over a sequence of increments it is not. Let's see why!
Assume the increment operation is performed k times. On every increment, the rightmost bit is flipped, so the number of flips for the LSB is k. The second rightmost bit is flipped only once every 2 increments, the 3rd rightmost once every 4 increments, the 4th rightmost once every 8 increments, and so on. So the number of flips is k/2 for the 2nd rightmost bit, k/4 for the 3rd rightmost bit, k/8 for the 4th rightmost bit, and so on...
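This per-bit flip pattern can be checked empirically. The sketch below (an illustrative Python check, not one of the article's implementations) counts how often each bit position flips over k increments, using the fact that the XOR of consecutive values marks exactly the bits that change:

```python
# Count how often each bit position flips over k increments of an
# integer counter, verifying the k, k/2, k/4, ... pattern.
k = 64
flips = [0] * 8
for i in range(1, k + 1):
    changed = (i - 1) ^ i  # bits that differ between i-1 and i
    for j in range(8):
        if (changed >> j) & 1:
            flips[j] += 1

print(flips)  # bit 0 flips k times, bit 1 flips k/2 times, ...
```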
The total cost is the total number of flips, that is,
C(k) = k + k/2 + k/4 + k/8 + k/16 + ... , which is a geometric progression, and so
C(k) < k + k/2 + k/4 + k/8 + k/16 + k/32 + ... (carried to infinity)
So, C(k) < k/(1 - 1/2)
and so, C(k) < 2k
So, C(k)/k < 2
Hence, the average cost of incrementing the counter once is constant and does not depend on the number of bits. We conclude that incrementing a counter is a constant amortized-cost operation.
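The bound C(k) < 2k can also be confirmed numerically. The sketch below (an illustrative Python check under the same setup as above) totals all bit flips across k increments starting from 0 and shows the average stays below 2:

```python
# Total number of bit flips over k increments of a counter starting
# at 0: each increment flips exactly the bits where i-1 and i differ.
def total_flips(k):
    return sum(bin((i - 1) ^ i).count('1') for i in range(1, k + 1))

k = 1000
cost = total_flips(k)
print(cost, cost / k)  # total cost stays below 2k, average below 2
```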